I opened an issue on the main repo first (langchain-ai/langchain#29515) since I wasn't aware the experimental utilities had their own repo.
I won't bother reproducing the text of the issue but my problem boiled down to this line:
langchain-experimental/libs/experimental/langchain_experimental/graph_transformers/llm.py, line 827 (commit b3172d8)
The default "method" parameter is "json_schema" for ChatOpenAI, and this results in success. However, for ChatOllama it is "function_calling", which results in the parsing errors mentioned in the issue.

To get ChatOllama working with constrained nodes, I needed to change the indicated line.

I'm not sure what the general correct fix is. Perhaps the method should be passed into the LLMGraphTransformer constructor?
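A minimal sketch of what that constructor-based fix could look like. This is not the real langchain_experimental code: the model class is mocked, the schema is elided, and the parameter name structured_output_method is made up for illustration; only the method="json_schema" / method="function_calling" values come from the issue above.

```python
# Sketch of the suggested fix: let callers choose the structured-output
# method instead of hard-coding it in the transformer. The real
# LLMGraphTransformer lives in langchain_experimental; this stand-alone
# mock only illustrates the plumbing.

class FakeChatModel:
    """Stands in for ChatOpenAI / ChatOllama; records the kwargs it receives."""

    def __init__(self):
        self.last_kwargs = None

    def with_structured_output(self, schema, **kwargs):
        self.last_kwargs = kwargs
        return self  # a real model would return a structured-output runnable


class GraphTransformerSketch:
    """Mock of an LLMGraphTransformer-like class that threads `method` through."""

    def __init__(self, llm, structured_output_method="function_calling"):
        # Accepting the method here means ChatOllama users could request
        # "json_schema" without patching library code.
        self.structured_llm = llm.with_structured_output(
            schema=None,  # schema elided; the real class builds one from node/edge types
            include_raw=True,
            method=structured_output_method,
        )


llm = FakeChatModel()
GraphTransformerSketch(llm, structured_output_method="json_schema")
print(llm.last_kwargs["method"])  # json_schema
```

With the default left at "function_calling", existing ChatOpenAI callers would be unaffected, while ChatOllama callers could opt in to "json_schema" explicitly.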