Ollama applet for COSMIC Desktop


(Screenshots: chat view and settings view)

Before using this applet, you must have Ollama installed on your system. To do this, run this in your terminal:

curl -fsSL https://ollama.com/install.sh | sh

Source: Ollama GitHub
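Once installed, the Ollama server listens on localhost port 11434 by default. Before adding the applet, you can check that the server is reachable; a minimal sketch, assuming the default port:

```shell
# Check whether the Ollama server is reachable (default port is 11434).
# Falls back gracefully if the server is not running or curl fails.
if curl -fsS http://localhost:11434/api/tags >/dev/null 2>&1; then
  status="running"
else
  status="not running (try: ollama serve)"
fi
echo "Ollama server is $status"
```

If the server is not running, `ollama serve` starts it in the foreground; on most installs it is also set up as a systemd service.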

After installing Ollama, pull the models you would like to chat with, for example:

ollama pull llama3

You can find more models in the library: https://ollama.com/library
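To confirm a model was pulled successfully, you can list the models available locally; a small sketch that degrades gracefully when ollama is not on your PATH:

```shell
# List locally pulled models (name, id, size, modified).
# Guard against ollama not being installed yet.
if command -v ollama >/dev/null 2>&1; then
  models=$(ollama list)
else
  models="ollama is not installed"
fi
echo "$models"
```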

Installing this applet

Clone the repository and build it with just.

If you don't have just installed, it is available in the Pop!_OS repositories, so you can install it with apt:

sudo apt install just

Now you can clone the repo and install the applet:

git clone https://github.com/elevenhsoft/cosmic-ext-applet-ollama.git
cd cosmic-ext-applet-ollama

Building

Run just:

just

Installing

sudo just install

Done

From now on, you will be able to add the applet to your desktop panel/dock and chat with different models in real time :)

Cheers!

Known wgpu issue

There are currently some rendering issues with the wgpu libcosmic feature on some (older?) GPUs. This doesn't affect Ollama itself, only the applet. If you are affected, you can build and install the applet with this feature disabled:

just build-no-wgpu
sudo just install
