HTTPStatusError("Client error '422 Unprocessable Entity' ... #5
I've been trying to run the template using `langgraph up` and the debugger, but it seems the memory isn't updating with this method. It runs perfectly from the desktop CLI, but when I use the debugger and the `langgraph up` command I get this error, and the memories are not updated.
I've solved this problem by modifying the `langgraph.json` file. I had initially assigned the graph names differently, which I think somehow hindered `get_client`'s ability to fetch the memory graph, but when I changed the graph names to "chatBot" and "memoryGraph", it worked as intended. Can somebody explain why this solution worked?
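For reference, a minimal `langgraph.json` along the lines of what ended up working, with the graphs registered under the names "chatBot" and "memoryGraph"; the module paths below are assumptions and need to point at your own compiled graph objects:

```json
{
  "dependencies": ["."],
  "graphs": {
    "chatBot": "./src/chatbot/graph.py:graph",
    "memoryGraph": "./src/memory_graph/graph.py:graph"
  },
  "env": ".env"
}
```

Whatever keys appear under `graphs` are the names the server registers, so any SDK call that schedules a run has to reference exactly those keys (or an assistant created from them).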
It's because the assistant ID configuration value can either be a graph name or an assistant ID.
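In other words, whatever the chatbot passes as `assistant_id` when it schedules the background memory run has to resolve to something the server knows about. A minimal sketch, assuming a local server started with `langgraph up` on its default port 8123 and a graph registered as "memoryGraph" in `langgraph.json`:

```python
import asyncio

from langgraph_sdk import get_client


async def main() -> None:
    # Assumes `langgraph up` is serving the LangGraph API on its default port.
    client = get_client(url="http://localhost:8123")

    # assistant_id accepts either a graph name (a key under "graphs" in
    # langgraph.json) or the UUID of an assistant built on top of that graph.
    # Passing a name the server has never registered is consistent with the
    # 422 Unprocessable Entity response reported above.
    thread = await client.threads.create()
    await client.runs.create(
        thread["thread_id"],
        "memoryGraph",  # graph name used directly as the assistant_id
        input={"messages": []},
    )


asyncio.run(main())
```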
Hi, there are some issues when trying to use this template in the project I am working on. When I import this project straight into my project without any modification, it returns this error. There is no problem running the memory graph separately, and I've figured out that `get_client()` is not able to run in the background. I've tried passing the `url` parameter as http://localhost:8123, but the error only changes to `ConnectError('All connection attempts failed')` and the problem is not fixed.

What could be happening here? All I did was copy-paste the code into my repository, change the imports to absolute paths, and switch all the models to OpenAI. The same code works perfectly in the template.
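For what it's worth, `ConnectError('All connection attempts failed')` generally means nothing is listening at the URL `get_client()` targets: the SDK talks to a running LangGraph API server over HTTP, so a graph executed directly in-process (rather than through `langgraph up` or `langgraph dev`) has no server to reach. A sketch of making the target explicit; the `LANGGRAPH_API_URL` variable name is an assumption, not something the template defines:

```python
import os

from langgraph_sdk import get_client

# Point the SDK at an explicit server. http://localhost:8123 is the
# `langgraph up` default and is only reachable while that server is running.
api_url = os.environ.get("LANGGRAPH_API_URL", "http://localhost:8123")
client = get_client(url=api_url)


async def schedule_memory_run(thread_id: str, messages: list) -> None:
    """Enqueue a background run on the memory graph (hypothetical helper)."""
    await client.runs.create(
        thread_id,
        "memoryGraph",  # must match a graph name registered in langgraph.json
        input={"messages": messages},
    )
```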