Feature: Local Ollama instead of OpenAI #77

Open
2 tasks done
eugeis opened this issue Dec 11, 2023 · 0 comments

eugeis commented Dec 11, 2023

Type of feature

🍕 Feature

Current behavior

In the current solution, the OpenAI API is used.

Suggested solution

I would like to use the solution with a local LLM model, specifically via the Ollama REST API:
https://github.com/jmorganca/ollama/blob/main/docs/api.md.

Would this be possible, and which parts of the solution would need to change?
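
For reference, a minimal sketch of what a non-streaming call to Ollama's `/api/generate` endpoint (documented in the linked api.md) could look like. It assumes a TypeScript/Node 18+ environment with a global `fetch`, Ollama running locally on its default port 11434, and a locally pulled model such as `llama2`; the function and type names are only illustrative, not part of this project:

```typescript
// Minimal sketch of a non-streaming call to Ollama's /api/generate endpoint.
// Assumptions: Ollama is running locally on its default port 11434 and a
// model such as "llama2" has already been pulled. Names are illustrative.

interface OllamaGenerateResponse {
  model: string;
  response: string;
  done: boolean;
}

async function generateWithOllama(prompt: string, model = "llama2"): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,          // any locally available Ollama model
      prompt,
      stream: false,  // return a single JSON object instead of a token stream
    }),
  });

  if (!res.ok) {
    throw new Error(`Ollama request failed: ${res.status} ${res.statusText}`);
  }

  const data = (await res.json()) as OllamaGenerateResponse;
  return data.response;
}

// Example usage (Node 18+ or any runtime with a global fetch):
// const text = await generateWithOllama("Explain what this change does.");
```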

Additional context

No response

Code of Conduct

  • I agree to follow this project's Code of Conduct

Contributing Docs

  • I agree to follow this project's Contribution Docs