Installing Ollama

Flatpak

Alpaca used to include Ollama in its Flatpak package; this changed to make Ollama optional.

Flathub

Go to Alpaca's store page in your system's app store, look for the extension called Ollama Instance, and install it. Then reopen Alpaca and enjoy running local models!

You can also install the extension from the command line:

# Check which installation type you have
flatpak list --columns=app,installation | grep Alpaca

# If you have a system installation
flatpak install com.jeffser.Alpaca.Plugins.Ollama

# If you have a user installation
flatpak install --user com.jeffser.Alpaca.Plugins.Ollama

AMD GPU Support

AMD GPUs require ROCm to work with AI tools. Alpaca packages ROCm as an extension too, so in addition to com.jeffser.Alpaca.Plugins.Ollama you will also need to install com.jeffser.Alpaca.Plugins.AMD, listed in your system's app store as Alpaca AMD Support.
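
As with the Ollama extension, the AMD plugin can be installed from the command line; the commands below simply mirror the ones shown above with the other extension ID.

# If you have a system installation
flatpak install com.jeffser.Alpaca.Plugins.AMD

# If you have a user installation
flatpak install --user com.jeffser.Alpaca.Plugins.AMD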

Snap

Alpaca's Snap packages are distributed on the releases page of the repository. Please make sure you install the package that doesn't include -no-ollama in its name, following the instructions in the installation wiki page.
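
For reference, snapd refuses to install a local file that wasn't signed by the store unless you pass --dangerous. This is a minimal sketch; the filename is a placeholder for whatever you actually downloaded from the releases page.

# Install a Snap downloaded from the releases page
# (placeholder filename; use the file you downloaded)
sudo snap install --dangerous ./alpaca_x.y.z_amd64.snap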

AMD GPU Support

Alpaca doesn't officially support AMD GPUs on Snap because of a lack of testing. It should work on its own; if it doesn't, try installing ROCm on your system.
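
If you want to check whether ROCm can actually see your GPU, the rocminfo tool that ships with ROCm lists the detected agents; this is just a sanity check, not something Alpaca itself requires.

# Your GPU should show up as a gfx* agent
rocminfo | grep -i gfx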

macOS

Alpaca's macOS package includes Ollama by default.

Arch Linux

Important

Alpaca doesn't support Arch Linux officially

On Arch Linux, Ollama is installed the same way as any other package: the base ollama package is available in the stable repositories, while alternative builds are available in the AUR.
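
A minimal sketch with pacman; enabling the service assumes the Arch package ships a systemd unit named ollama.service.

# Install Ollama from the official repositories
sudo pacman -S ollama

# Start it now and on every boot
sudo systemctl enable --now ollama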

Nixpkgs

Important

Alpaca doesn't support Nix officially

The Nix package is maintained by @Aleksanaa; any issues with the package should be reported in the Nixpkgs issue tracker.

Please read the installation instructions, where you can learn how to select between ollama-cuda and ollama-rocm.
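
As a sketch, on a flakes-enabled Nix installation you can install one of the variants into your profile directly; which one you want depends on your GPU.

# Pick the variant that matches your hardware
nix profile install nixpkgs#ollama        # CPU only
nix profile install nixpkgs#ollama-rocm   # AMD GPUs
nix profile install nixpkgs#ollama-cuda   # NVIDIA GPUs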
