
Gemini models - ValueError: Unrecognized tool choice format #697

Open
pedropcamellon opened this issue Jan 16, 2025 · 1 comment
Labels
enhancement New feature or request

Comments

@pedropcamellon

I'm getting a `ValueError: Unrecognized tool choice format` error when using Gemini models.
The project is based on langchain-ai/executive-ai-assistant.

Similar issue to this one: #330

Error log:

```
File "D:\Documents\GitHub\executive-ai-assistant\.venv\Lib\site-packages\langchain_google_vertexai\chat_models.py", line 1337, in _prepare_request_gemini
    tool_config = _tool_choice_to_tool_config(tool_choice, all_names)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Documents\GitHub\executive-ai-assistant\.venv\Lib\site-packages\langchain_google_vertexai\functions_utils.py", line 448, in _tool_choice_to_tool_config
    raise ValueError(
ValueError: Unrecognized tool choice format:

tool_choice={'type': 'function', 'function': {'name': 'RespondTo'}}

Should match VertexAI ToolConfig or FunctionCallingConfig format.
```
The code that triggers it:

```python
async def triage_input(state: State, config: RunnableConfig, store: BaseStore):
    """Agent responsible for triaging the email; can either ignore it, try to respond, or notify the user."""

    chat_model = ChatModel()
    llm = chat_model.get_model()

    examples = await get_few_shot_examples(state["email"], store, config)

    prompt_config = get_config(config)

    input_message = triage_prompt.format(
        email_thread=state["email"]["page_content"],
        author=state["email"]["from_email"],
        to=state["email"].get("to_email", ""),
        subject=state["email"]["subject"],
        fewshotexamples=examples,
        name=prompt_config["name"],
        full_name=prompt_config["full_name"],
        background=prompt_config["background"],
        triage_no=prompt_config["triage_no"],
        triage_email=prompt_config["triage_email"],
        triage_notify=prompt_config["triage_notify"],
    )

    model = llm.with_structured_output(RespondTo).bind(
        tool_choice={"type": "function", "function": {"name": "RespondTo"}}
    )

    response = await model.ainvoke(input_message)

    if len(state["messages"]) > 0:
        delete_messages = [RemoveMessage(id=m.id) for m in state["messages"]]
        return {"triage": response, "messages": delete_messages}
    else:
        return {"triage": response}
```
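A possible workaround, sketched under assumptions rather than a confirmed fix: build the `tool_choice` value in the VertexAI ToolConfig shape the error message asks for, instead of the OpenAI-style dict. `build_gemini_tool_choice` below is a hypothetical helper name, not part of langchain-google-vertexai.

```python
# Hypothetical helper (not part of langchain-google-vertexai): build a
# tool_choice value in the VertexAI ToolConfig shape the error asks for.
def build_gemini_tool_choice(function_name: str) -> dict:
    # Forcing a single function on Gemini is expressed as mode ANY plus
    # allowed_function_names, not as the OpenAI-style
    # {"type": "function", "function": {"name": ...}} dict.
    return {
        "function_calling_config": {
            "mode": "ANY",
            "allowed_function_names": [function_name],
        }
    }

# The failing bind could then (assumption) become:
# model = llm.with_structured_output(RespondTo).bind(
#     tool_choice=build_gemini_tool_choice("RespondTo")
# )
```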

Google docs reference:

| Model | Version | Function calling launch stage | Support for parallel function calling | Support for forced function calling |
| --- | --- | --- | --- | --- |
| Gemini 1.0 Pro | all versions | General Availability | No | No |
| Gemini 1.5 Flash | all versions | General Availability | Yes | Yes |
| Gemini 1.5 Pro | all versions | General Availability | Yes | Yes |
| Gemini 2.0 Flash | all versions | Preview | Yes | Yes |

from:
https://cloud.google.com/vertex-ai/generative-ai/docs/multimodal/function-calling#supported_models

@lkuligin
Collaborator

We don't support this type of tool_choice yet; only ANY | NONE | AUTO is supported. We should parse the dict and pass it to _tool_choice_to_tool_config appropriately (as the tool_choice mode, plus the allowed function names if provided).
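The translation described above could be sketched roughly as follows. This is an illustration of the mapping, not the library's actual code; `openai_tool_choice_to_gemini` is a made-up name.

```python
# Sketch (an assumption, not langchain-google-vertexai's implementation):
# translate an OpenAI-style tool_choice into the VertexAI
# FunctionCallingConfig shape.
def openai_tool_choice_to_gemini(tool_choice) -> dict:
    if isinstance(tool_choice, str):
        # Mode strings pass through: "any" | "none" | "auto".
        return {"function_calling_config": {"mode": tool_choice.upper()}}
    if isinstance(tool_choice, dict) and tool_choice.get("type") == "function":
        # Forcing one named function maps to mode ANY restricted to that name.
        name = tool_choice["function"]["name"]
        return {
            "function_calling_config": {
                "mode": "ANY",
                "allowed_function_names": [name],
            }
        }
    raise ValueError(f"Unrecognized tool choice format: {tool_choice}")
```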

@lkuligin lkuligin added the enhancement New feature or request label Jan 27, 2025