Llama-3.2-3B-Instruct fails with HuggingFacePipeline because a non-string value is set as the pad_token #29431
Labels
🤖:bug (Related to a bug, vulnerability, unexpected error with an existing feature)
Checked other resources
Example Code
The following code:
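A minimal sketch along these lines, assuming the standard `HuggingFacePipeline.from_model_id` path and access to the gated meta-llama repository (the exact arguments of the original run may differ):

```python
from langchain_community.llms.huggingface_pipeline import HuggingFacePipeline

# Loading the model through from_model_id triggers the pad_token handling in
# libs/community/langchain_community/llms/huggingface_pipeline.py.
llm = HuggingFacePipeline.from_model_id(
    model_id="meta-llama/Llama-3.2-3B-Instruct",  # gated repo; access is assumed
    task="text-generation",
    pipeline_kwargs={"max_new_tokens": 64},
)

print(llm.invoke("Hello"))
```

The error is raised while `from_model_id` constructs the pipeline, so execution never reaches `invoke`.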
Error Message and Stack Trace (if applicable)
Description
I tried to use HuggingFacePipeline from langchain_community with Llama-3.2-3B-Instruct, but an error occurred.
I think that it's the same bug as transformers issue 34869.
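For Llama-3.2, `config.eos_token_id` is a list of ids rather than a single int, so assigning it to the tokenizer's pad token is rejected. A small sketch of that failure mode (assuming access to the gated meta-llama repository):

```python
from transformers import AutoConfig, AutoTokenizer

model_id = "meta-llama/Llama-3.2-3B-Instruct"  # gated repo; access is assumed
config = AutoConfig.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

print(config.eos_token_id)  # a list of token ids for Llama-3.2, not a single int

# Roughly what huggingface_pipeline.py does when the tokenizer has no pad token:
# a list of ids is converted to a list of tokens, and the pad_token setter
# rejects anything that is not a single string.
tokenizer.pad_token_id = config.eos_token_id  # raises "Cannot set a non-string value ..."
```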
When I changed /libs/community/langchain_community/llms/huggingface_pipeline.py L172 as follows, the error no longer occurred.
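A sketch of the kind of change, assuming that line assigns `model.config.eos_token_id` directly to `tokenizer.pad_token_id` when no pad token is set; the exact patch in the referenced pull request may differ:

```python
from transformers import AutoConfig, AutoTokenizer

model_id = "meta-llama/Llama-3.2-3B-Instruct"  # gated repo; access is assumed
config = AutoConfig.from_pretrained(model_id)   # stands in for model.config
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Patched pad_token handling: pick a single id when eos_token_id is a list,
# instead of assigning the whole list (which the pad_token setter rejects).
if tokenizer.pad_token is None:
    if config.pad_token_id is not None:
        tokenizer.pad_token_id = config.pad_token_id
    elif isinstance(config.eos_token_id, int):
        tokenizer.pad_token_id = config.eos_token_id
    elif isinstance(config.eos_token_id, list) and config.eos_token_id:
        tokenizer.pad_token_id = config.eos_token_id[0]
    elif tokenizer.eos_token_id is not None:
        tokenizer.pad_token_id = tokenizer.eos_token_id
```

Selecting a single id (the configured pad token if present, otherwise the first eos id) satisfies the pad_token setter while preserving the previous behaviour for models whose eos_token_id is a plain int.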
It's the same procedure as this pull request.
System Info
System Information
OS: Ubuntu 24.04
Kernel Version: 6.8.0-51-generic
Python Version: 3.12.7
Model: Llama-3.2-3B-Instruct
langchain: 0.3.15
langchain-community: 0.3.15
langchain-core: 0.3.31
transformers: 4.47.1