
Any plan for supporting Mistral 7B model? #13

Open

leocnj opened this issue Feb 28, 2024 · 1 comment

Comments
leocnj commented Feb 28, 2024

I saw that the Mixtral 8x7B model is already supported. Just wondering whether you plan to support Mistral 7B in the near future, given its excellent performance? Thanks.

AniZpZ (Owner) commented Mar 1, 2024

Hi! We are currently working on improving quantized model performance, so we cannot promise Mistral 7B support very soon. We will look into it once we have time.
And you are more than welcome to open a PR adding Mistral support!
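For anyone interested in such a PR, here is a minimal sketch of loading Mistral 7B with Hugging Face Transformers as a possible starting point. This is not this repository's API; the model id, dtype, and generation settings are assumptions for illustration only.

```python
# Minimal sketch (illustrative, not this repo's API): load Mistral 7B via
# Hugging Face Transformers and run a quick generation to sanity-check the
# weights before wiring up quantization support.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-v0.1"  # assumed Hugging Face model id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to fit on a single GPU
    device_map="auto",          # requires `accelerate` to be installed
)

inputs = tokenizer("Hello, Mistral!", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```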
