Maximizing Token Allotment with Effective Conversation Management
Start new conversations for unrelated topics to reduce token usage, as the AI references previous messages in a thread to provide context. Separating topics into different threads helps maximize your token allotment and ensures more efficient responses.
We’ve recently improved our AI model to better understand context by looking at previous messages sent within the same chat thread. This enhancement has led to more accurate responses, especially when the conversation is ongoing and related. However, this also means that the model more closely references previous messages in the thread, which can increase token usage.
For users who keep multiple unrelated items in the same thread, this can cause higher token consumption, as the AI has to process all the previous information every time, even when it's not relevant to the current inquiry.
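As a rough illustration of why this happens, the sketch below shows how re-reading the whole thread on every turn adds up. The token counts and helper names are invented for the example and do not reflect actual TaxBuzz AI limits or tokenization; the point is only that one long mixed thread costs more than several short, focused ones.

```python
# Illustrative sketch only: token counts are made up for this example and do
# not reflect actual TaxBuzz AI tokenization, limits, or pricing.

def tokens(text: str) -> int:
    # Very rough stand-in for a real tokenizer: about one token per word.
    return len(text.split())

def thread_cost(messages: list[str]) -> int:
    # Each new message is processed together with everything before it,
    # so a thread's cost is the running total of the growing history.
    total = 0
    history = ""
    for message in messages:
        history += " " + message
        total += tokens(history)  # the entire history is read every turn
    return total

one_long_thread = thread_cost([
    "Question about a client's 1099 filing ...",
    "Unrelated question about drafting a newsletter ...",
    "Another unrelated question about appointment reminders ...",
])

separate_threads = (
    thread_cost(["Question about a client's 1099 filing ..."])
    + thread_cost(["Unrelated question about drafting a newsletter ..."])
    + thread_cost(["Another unrelated question about appointment reminders ..."])
)

print(one_long_thread, separate_threads)  # the single mixed thread costs more
```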
How to Minimize Token Usage
To help you maximize your token allotment, we recommend starting a new chat for unrelated topics. This way, the AI doesn't have to consider previous irrelevant messages, reducing the number of tokens used for each response.
By creating separate threads for each topic, you’ll:
- Keep the context more focused.
- Reduce the number of tokens used.
- Ensure that the AI provides more accurate and efficient responses.