
[Feature]: Support for serving two LLMs at the same time #11950

Open
1 task done
czg1225 opened this issue Jan 11, 2025 · 0 comments

Comments


czg1225 commented Jan 11, 2025

🚀 The feature, motivation and pitch

I want to implement interaction between two LLMs, but initializing two LLM instances as shown below results in an error or a deadlock. Could you add support for this use case? It would be very useful!

from vllm import LLM

first_model_name = "Qwen/Qwen2.5-1.5B-Instruct"
second_model_name = "Qwen/Qwen2.5-7B-Instruct"
first_llm = LLM(model=first_model_name, tensor_parallel_size=2)
# Initializing a second engine in the same process errors out or hangs:
second_llm = LLM(model=second_model_name, tensor_parallel_size=2)
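
Until vLLM supports multiple engines in one process, a common workaround is to give each LLM its own OS process so their tensor-parallel/distributed states cannot collide. Below is a minimal sketch of that pattern, not an official vLLM API: the GPU assignment ("0,1" / "2,3", assuming four GPUs are available), the queue-based protocol, and the serve_model helper are all illustrative assumptions.

import multiprocessing as mp
import os

def serve_model(model_name, gpus, prompt_q, result_q):
    # Pin this worker to its own GPUs before vLLM initializes CUDA.
    os.environ["CUDA_VISIBLE_DEVICES"] = gpus
    from vllm import LLM, SamplingParams  # import inside the child process

    llm = LLM(model=model_name, tensor_parallel_size=2)
    params = SamplingParams(max_tokens=128)
    while True:
        prompt = prompt_q.get()
        if prompt is None:  # sentinel: shut down
            break
        outputs = llm.generate([prompt], params)
        result_q.put(outputs[0].outputs[0].text)

if __name__ == "__main__":
    mp.set_start_method("spawn")  # safest start method with CUDA
    specs = [
        ("Qwen/Qwen2.5-1.5B-Instruct", "0,1"),
        ("Qwen/Qwen2.5-7B-Instruct", "2,3"),
    ]
    channels, procs = [], []
    for name, gpus in specs:
        pq, rq = mp.Queue(), mp.Queue()
        p = mp.Process(target=serve_model, args=(name, gpus, pq, rq))
        p.start()
        channels.append((pq, rq))
        procs.append(p)

    # Example interaction: the small model drafts, the large model refines.
    (small_pq, small_rq), (large_pq, large_rq) = channels
    small_pq.put("Write one sentence about unit testing.")
    draft = small_rq.get()
    large_pq.put("Improve this sentence:\n" + draft)
    print(large_rq.get())

    for pq, _ in channels:
        pq.put(None)  # tell both workers to exit
    for p in procs:
        p.join()

This only sidesteps the limitation through process isolation; serving two models inside a single process (or a single OpenAI-compatible server) would still require changes in vLLM itself.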

Alternatives

No response

Additional context

No response

Before submitting a new issue...

  • Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.