I would like to suggest a potential enhancement that could improve the monitoring of user activity.
Currently, the system saves each conversation in Azure Cosmos DB. This is a great feature, but I believe it could be further improved by also storing the number of tokens used in each conversation.
The GPT model sends the token count with each response in the following format:
```json
"usage": {
    "prompt_tokens": 23,
    "completion_tokens": 9,
    "total_tokens": 32
}
```
By storing the total_tokens value in Azure Cosmos DB, we could gain a better understanding of how each user is utilizing the chat. This would be particularly useful for monitoring, as it would let us track per-user usage more effectively.
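As a rough sketch of the idea, the token counts from the `usage` block could be copied into the conversation record before it is written to Cosmos DB. The function and field names below are illustrative assumptions, not the project's actual schema:

```python
# Hypothetical sketch: extend the stored conversation record with the
# token counts from a (non-streaming) chat completion response.
# `response` mimics the JSON "usage" shape shown above; the item field
# names (userId, totalTokens, ...) are assumptions for illustration.

def build_conversation_item(user_id, messages, response):
    usage = response.get("usage", {})
    return {
        "userId": user_id,
        "messages": messages,
        "promptTokens": usage.get("prompt_tokens", 0),
        "completionTokens": usage.get("completion_tokens", 0),
        "totalTokens": usage.get("total_tokens", 0),
    }

# Example using the usage block from the format above:
response = {"usage": {"prompt_tokens": 23, "completion_tokens": 9, "total_tokens": 32}}
item = build_conversation_item("user-1", [{"role": "user", "content": "hi"}], response)
print(item["totalTokens"])  # 32
```

The resulting item could then be passed to the existing Cosmos DB write path (e.g. an upsert), so no extra round trip is needed.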
I hope you find this suggestion useful. Thanks!
This one is tricky to implement for streaming chat completions, since the `usage` data is not passed back from the API. We'd need to compute it approximately using a tokenizer.