
Multiple ollama config with different base-url #431

Open
ArthurSaintDenis2 opened this issue Feb 3, 2025 · 0 comments
Labels
enhancement New feature or request

Comments


I currently have a setup where I can run ollama both on my laptop (for small models) and on my desktop (for bigger models).

I tried using a config like this one:

```yaml
ollama-desktop:
  base-url: http://192.168.1.15:11434/api
  models:
    "deepseek-r1:14b":
      aliases: ["deepseek"]
      max-input-chars: 650000
    "deepseek-r1:32b":
      aliases: ["deepseek-big"]
      max-input-chars: 650000
ollama-local:
  base-url: http://localhost:11434/api
  models:
    "deepseek-r1:1.5b":
      aliases: ["deepseek"]
      max-input-chars: 650000
    "llama3.2:1b":
      aliases: ["llama"]
      max-input-chars: 650000
```

But I always get ERROR OpenAI authentication failed.
It seems like ollama only works if the API entry is named "ollama". I can work around this by renaming the entries so that only one is called "ollama", but that means I have to edit the config every time I want to switch models.

Is there a way to achieve this?
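For reference, a sketch of the workaround described above, assuming the tool only recognizes a single API entry named "ollama": keep one entry and hand-edit its base-url (and model list) whenever switching machines. Values are copied from the two configs above.

```yaml
# Single-entry workaround: only an API named "ollama" is accepted,
# so the base-url must be edited by hand when switching machines.
ollama:
  # base-url: http://localhost:11434/api    # laptop (small models)
  base-url: http://192.168.1.15:11434/api   # desktop (bigger models)
  models:
    "deepseek-r1:14b":
      aliases: ["deepseek"]
      max-input-chars: 650000
```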

@ArthurSaintDenis2 ArthurSaintDenis2 added the enhancement New feature or request label Feb 3, 2025