'str' object has no attribute 'parameters' #136
Comments
Which workflow does the start_with_LLM JSON use, and how is the local model loaded: directly or through Ollama? Is there a screenshot of the workflow?
I checked, and the model you provided is only compatible with vLLM loading. I would really like vLLM to become one of Party's loading methods, but vLLM itself depends on too many packages, so Party cannot add that library. Party mainly relies on transformers and llama.cpp to load models, and only models compatible with those two backends can be loaded directly by Party.
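For context, here is a minimal sketch (not Party's actual code; model paths are placeholders) of the two loading paths mentioned above, which is what a checkpoint must support to be loaded directly:

```python
# Sketch of the two backends Party relies on for local models.
# Paths below are placeholders, not real checkpoints.

# 1) Hugging Face transformers: standard HF checkpoints.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "path/to/your/hf_model"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(model_path)

# 2) llama.cpp (via llama-cpp-python): GGUF-converted models.
from llama_cpp import Llama

llm = Llama(model_path="path/to/your/model.gguf")  # placeholder
```

A checkpoint that only ships vLLM-specific loading code will not work through either of these paths, which is the incompatibility described above.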
To Reproduce
Steps to reproduce the behavior:
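No reproduction steps were provided. As an illustration only (this is not the project's actual code path), this kind of AttributeError typically appears when a plain string, such as a model path or name, is used where a loaded model object is expected:

```python
# Hypothetical minimal example of the error in the issue title:
# calling .parameters() on a str instead of a loaded model object.
model = "path/to/model"   # a str, not a torch.nn.Module
model.parameters()        # AttributeError: 'str' object has no attribute 'parameters'
```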