These two warnings are showing up; has anyone seen them before, @CharlieFRuan? I tried `ChatCompletion.usage` instead of `runtimeStatsText()` but could not get it working.
My demo, which normally works, is at https://hpssjellis.github.io/my-examples-of-ai-agents/public/web-llm/web-llm00.html
llm_chat.ts:165 Cannot find `tokenizer_info` or `token_table_postproc_method` in `mlc-chat-config.json`, using default token_postproc_method `raw`. This field is only used for json mode.
engine.ts:1274 WARNING: `runtimeStatsText()` will soon be deprecated. Please use `ChatCompletion.usage` for non-streaming requests, or `ChatCompletionChunk.usage` for streaming requests, enabled by `stream_options`. The only flow that expects to use `runtimeStatsText()` as of now is `forwardTokensAndSample()`.
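
For reference, here is roughly what the deprecation notice seems to be asking for. A minimal sketch only, assuming a recent `@mlc-ai/web-llm` release with the OpenAI-style `chat.completions` API; the model id and the DOM call below are placeholders, not taken from my demo:

```typescript
// Sketch only: assumes a recent @mlc-ai/web-llm release with the
// OpenAI-style chat.completions API; the model id is a placeholder.
import { CreateMLCEngine } from "@mlc-ai/web-llm";

async function main() {
  const engine = await CreateMLCEngine("Llama-3.1-8B-Instruct-q4f32_1-MLC");

  // Non-streaming: token counts come back on the ChatCompletion itself.
  const reply = await engine.chat.completions.create({
    messages: [{ role: "user", content: "Hello!" }],
  });
  console.log(reply.choices[0].message.content);
  console.log(reply.usage); // prompt_tokens, completion_tokens, total_tokens

  // Streaming: request usage via stream_options; it arrives on the final chunk.
  const chunks = await engine.chat.completions.create({
    messages: [{ role: "user", content: "Hello again!" }],
    stream: true,
    stream_options: { include_usage: true },
  });
  for await (const chunk of chunks) {
    const delta = chunk.choices[0]?.delta?.content ?? "";
    document.body.append(delta); // placeholder: write wherever the demo expects
    if (chunk.usage) {
      console.log(chunk.usage); // only present on the last chunk
    }
  }
}

main();
```

If I am reading the warning correctly, `runtimeStatsText()` would then only be needed for the `forwardTokensAndSample()` flow.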