ChatAnthropic streaming with tools not working and instead forces non-stream response #6045
Comments
Hey @Moe03! I'm here to help you with any bugs, questions, or contributions. Let's tackle this together while we wait for a human maintainer. To enable proper streaming of events, here is an example of how you can use the ChatAnthropic model with an AgentExecutor and streamEvents:

import { ChatAnthropic } from "@langchain/anthropic";
import { AgentExecutor, createToolCallingAgent } from "langchain/agents";
import { ChatPromptTemplate, MessagesPlaceholder } from "@langchain/core/prompts";
import { HumanMessage, SystemMessage } from "@langchain/core/messages";
import { DynamicStructuredTool } from "@langchain/core/tools";
import { z } from "zod";

async function main() {
  // A simple tool the agent can call.
  const defaultTools = [
    new DynamicStructuredTool({
      name: "GetWeather",
      description: "Return the weather.",
      schema: z.object({
        query: z.string().describe("The city to return weather for."),
      }),
      func: async (input) => {
        console.log("STARTING TOOL: GetWeather");
        return "sunny";
      },
    }),
  ];

  const chatPrompt = ChatPromptTemplate.fromMessages([
    new MessagesPlaceholder("messages"),
    new MessagesPlaceholder("agent_scratchpad"),
  ]);

  const messagesHistory = [
    new SystemMessage("you are a helpful assistant!"),
    new HumanMessage("Whats the weather like in Cairo?"),
  ];

  const llm = new ChatAnthropic({
    apiKey: process.env.ANTHROPIC_API_KEY, // expects the key in the environment
    model: "claude-3-5-sonnet-20240620",
    streaming: true, // Enable streaming
  });

  const agent = await createToolCallingAgent({
    llm,
    tools: defaultTools,
    prompt: chatPrompt,
  });

  const agentExec = new AgentExecutor({
    agent,
    tools: defaultTools,
  }).withConfig({ runName: "Agent" });

  // Stream intermediate events (LLM tokens, tool starts/ends) from the agent run.
  const llmStream = await agentExec.streamEvents(
    {
      messages: messagesHistory,
    },
    {
      version: "v1",
    }
  );

  for await (const event of llmStream) {
    const eventType = event.event;
    if (eventType === "on_chain_start") {
      if (event.name === "Agent") {
        console.log("\n-----");
        console.log(`Starting agent: ${event.name} with input: ${JSON.stringify(event.data.input)}`);
      }
    } else if (eventType === "on_chain_end") {
      if (event?.name === "Agent") {
        console.log("\n-----");
        console.log(`Finished agent: ${event?.name}\n`);
        console.log(`Agent output was: ${event?.data?.output}`);
        console.log("\n-----");
      }
    } else if (
      eventType === "on_llm_stream" ||
      eventType === "on_chat_model_stream" ||
      eventType === "on_chain_stream"
    ) {
      console.log("\n-----");
      const content =
        event?.data?.chunk?.output ||
        event.data?.chunk?.message?.content ||
        event.data?.chunk?.content;
      if (content !== undefined && content !== "") {
        console.log(`| ${content}`);
      }
    } else if (eventType === "on_tool_start") {
      console.log("\n-----");
      console.log(`Starting tool: ${event.name} with inputs: ${event.data.input}`);
    } else if (eventType === "on_tool_end") {
      console.log("\n-----");
      console.log(`Finished tool: ${event.name}\n`);
      console.log(`Tool output was: ${event.data.output}`);
    }
  }
}

main();

This example ensures that chunks of data are yielded and events are sent in real-time, improving the performance of real-time applications [1][2][3].
Tracking in #6056
This should now be fixed!
I ran into the same problem and after upgrading to @langchain/anthropic v0.2.6,
@bracesproul can you have a look?
@bracesproul Thanks in advance! Basically, here's the step-by-step:
When the model streams a tool call, the streamed tool chunk content is set as an object rather than a plain text token.
The next step is where the issue lies: the token handed to the streaming callback is taken from that chunk.
Because the token is undefined, the callback is never triggered.
In reality, though, the token could still be derived from the tool chunk's content so that it keeps streaming. For example:
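A minimal sketch of the failure mode being described; the chunk shape and helper names below are illustrative assumptions, not LangChain or Anthropic internals.

```typescript
// Hypothetical shape of a streamed tool-call chunk's content: an array of
// partial tool-input deltas instead of a plain string.
type ToolUseDelta = { type: "tool_use_delta"; partialInput: string };
type StreamedChunkContent = string | ToolUseDelta[];

// A naive token extraction only handles plain string content...
function extractTextToken(content: StreamedChunkContent): string | undefined {
  return typeof content === "string" ? content : undefined;
}

// ...so for a tool-call chunk the token is undefined and the streaming
// callback is never given anything to emit.
const toolChunkContent: StreamedChunkContent = [
  { type: "tool_use_delta", partialInput: '{"query":"Cai' },
];
console.log(extractTextToken(toolChunkContent)); // undefined

// The partial tool input could still be surfaced as a token instead:
function extractAnyToken(content: StreamedChunkContent): string {
  return typeof content === "string"
    ? content
    : content.map((delta) => delta.partialInput).join("");
}
console.log(extractAnyToken(toolChunkContent)); // '{"query":"Cai'
```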
Is this helpful in terms of tracking it down?
@bracesproul that looks great! Looks like it accounts for it 🙏 (added a comment on the PR itself re: making sure it's handling the needed type)
@bracesproul Ah, looks like the new implementation introduces its own bug 😅. Opened a 1-liner to fix at #6183
Can confirm it works now - thanks again!
Checked other resources
Example Code
The following is a more advanced example, but it showcases the full problem. I've also implemented a fix for it in a PR.
We basically needed tools to stream properly in the @langchain/anthropic package to enable streaming events across any other LangChain method, like the AgentExecutor I'm using here.
Error Message and Stack Trace (if applicable)
There are no errors but it doesn't stream when it should.
Description
We basically needed tools to stream properly in the @langchain/anthropic package to enable streaming events across any other LangChain method, like the AgentExecutor.
Currently it doesn't stream; instead it waits for the whole tool call and response to finish and then outputs the response, which takes a really long time and is bad UX in realtime apps.
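A minimal sketch of the kind of call where this shows up, assuming the same GetWeather tool and model as in the example above; it is an illustration of the reported behavior, not the original reproduction.

```typescript
import { ChatAnthropic } from "@langchain/anthropic";
import { DynamicStructuredTool } from "@langchain/core/tools";
import { z } from "zod";

const weatherTool = new DynamicStructuredTool({
  name: "GetWeather",
  description: "Return the weather.",
  schema: z.object({
    query: z.string().describe("The city to return weather for."),
  }),
  func: async () => "sunny",
});

const llm = new ChatAnthropic({
  apiKey: process.env.ANTHROPIC_API_KEY, // assumes the key is set in the environment
  model: "claude-3-5-sonnet-20240620",
  streaming: true,
}).bindTools([weatherTool]);

// With the bug, nothing arrives until the whole tool call has been generated;
// once fixed, partial tool-call chunks should come through incrementally.
const stream = await llm.stream("Whats the weather like in Cairo?");
for await (const chunk of stream) {
  console.log(new Date().toISOString(), chunk.content, chunk.tool_call_chunks);
}
```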
System Info
Node version 18.17.1
yarn version 1.22.22
platform windows
yarn info langchain:
(result was too big)