(Community): Adding Structured Support for ChatPerplexity #29361

Open: wants to merge 14 commits into base: master
Conversation

keenborder786 (Contributor) commented Jan 23, 2025

vercel bot commented Jan 23, 2025: 1 Skipped Deployment. langchain ⬜️ Ignored, updated Feb 2, 2025 2:49pm (UTC)

@keenborder786 keenborder786 marked this pull request as ready for review January 25, 2025 19:21
@dosubot dosubot bot added size:L This PR changes 100-499 lines, ignoring generated files. community Related to langchain-community labels Jan 25, 2025
keenborder786 (Contributor, Author)
@ccurme

from langchain_core.messages import AIMessage
from langchain_core.runnables import chain
from pydantic import BaseModel as PydanticBaseModel


@chain
def _oai_structured_outputs_parser(ai_msg: AIMessage) -> PydanticBaseModel:
    # Return the parsed Pydantic object if the provider populated it.
    if ai_msg.additional_kwargs.get("parsed"):
        return ai_msg.additional_kwargs["parsed"]
    raise ValueError("No parsed structured output found in additional_kwargs.")
Collaborator

Is a BaseModel instance getting populated under "parsed" in .additional_kwargs?

keenborder786 (Contributor, Author)

@ccurme please take another look. I have double-checked and tested against the Perplexity docs as well.

keenborder786 (Contributor, Author)

@ccurme everything is looking good now; please review.

keenborder786 (Contributor, Author)

@ccurme


ccurme (Collaborator) left a comment

I enabled standard tests for perplexity to pick up tests for structured output. It's currently failing; we expect to handle TypedDict, Pydantic, and JSON schema inputs.

More importantly, this doesn't appear to work for any input type. Let me know if I'm doing something wrong.

from langchain_community.chat_models import ChatPerplexity
from pydantic import BaseModel, Field

class Joke(BaseModel):
    """Joke to tell user."""

    setup: str = Field(description="question to set up a joke")
    punchline: str = Field(description="answer to resolve the joke")

llm = ChatPerplexity(model="sonar").with_structured_output(Joke)
result = llm.invoke("Tell me a joke about cats.")

BadRequestError: Error code: 400 - {'error': {'message': '["At body -> response_format -> ResponseFormatText -> type: Input should be 'text'", "At body -> response_format -> ResponseFormatJSONSchema -> type: Input should be 'json_schema'", "At body -> response_format -> ResponseFormatJSONSchema -> json_schema: Field required", "At body -> response_format -> ResponseFormatRegex -> type: Input should be 'regex'", "At body -> response_format -> ResponseFormatRegex -> regex: Field required"]', 'type': 'bad_request', 'code': 400}}
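The 400 error above suggests that Perplexity's API accepts `response_format` objects of type `text`, `json_schema`, or `regex`, and that the `json_schema` variant requires a nested `json_schema` field carrying the schema itself. A minimal sketch of the wrapper shape the error message asks for, assuming that reading is correct (the `build_response_format` helper and the `joke_schema` dict are illustrative, not the PR's actual code):

```python
import json

# Illustrative JSON schema matching the Joke model from the repro above.
joke_schema = {
    "type": "object",
    "properties": {
        "setup": {"type": "string", "description": "question to set up a joke"},
        "punchline": {"type": "string", "description": "answer to resolve the joke"},
    },
    "required": ["setup", "punchline"],
}


def build_response_format(schema: dict) -> dict:
    """Wrap a raw JSON schema in the shape the 400 error asks for."""
    return {"type": "json_schema", "json_schema": {"schema": schema}}


print(json.dumps(build_response_format(joke_schema), indent=2))
```

Passing a bare schema (or an OpenAI-style response_format) would trip exactly the validation errors shown, since the top-level `type` would match none of `text`, `json_schema`, or `regex`.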

keenborder786 (Contributor, Author)

okay @ccurme

keenborder786 (Contributor, Author)

@ccurme I have ensured that we are handling TypedDict, Pydantic, and JSON Schema. To clarify: Perplexity currently supports only JSON Schema for structured output. Additionally, I have accounted for both Pydantic V1 and Pydantic V2 when converting schemas to JSON.
