
Can it support 910Pro B #8

Closed
rickywu opened this issue Feb 6, 2025 · 2 comments

Comments

@rickywu

rickywu commented Feb 6, 2025

I used MindIE for inference and it was slow. Can vLLM be faster than it?

@Yikun
Collaborator

Yikun commented Feb 6, 2025

@rickywu Thanks for raising the issue.

Currently, only Atlas A2 series devices are supported, as shown in:
https://github.com/vllm-project/vllm-ascend?tab=readme-ov-file#support-devices

The 910ProB is not supported yet. From a technical point of view, vLLM Ascend support would be possible if torch-npu supports the device. We also welcome your contributions.
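
As a starting point, a minimal sketch for probing whether torch-npu detects the device at all (assuming torch and torch_npu are installed, and that torch_npu registers the usual `torch.npu` namespace mirroring `torch.cuda`):

```python
# Minimal sketch: check whether torch-npu can see an NPU device.
# If the device is not supported by torch-npu, is_available() returns
# False (or the torch_npu import itself may fail).
import torch
import torch_npu  # registers the torch.npu namespace

if torch.npu.is_available():
    print(f"NPU devices visible: {torch.npu.device_count()}")
    print(f"Device 0 name: {torch.npu.get_device_name(0)}")
else:
    print("torch-npu does not detect a supported NPU on this machine.")
```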

@Yikun
Collaborator

Yikun commented Feb 15, 2025

I will close this issue. If you have any other questions, feel free to open a new one.

@rickywu rickywu closed this as completed Feb 17, 2025