Welcome to the Ollama + Open WebUI Docker Setup repository! This project provides a seamless way to deploy Ollama and Open WebUI using Docker Compose. Whether you're a developer, researcher, or just curious, this setup will get you up and running in no time!
This setup includes two main services:
- Ollama
  - Image: `ollama/ollama:rocm`
  - Port: `11434`
  - Description: Ollama is a powerful service for running large language models locally.
- Open WebUI
  - Image: `ghcr.io/open-webui/open-webui:cuda`
  - Port: `3000` (mapped to `8080` inside the container)
  - Description: Open WebUI provides a user-friendly interface for interacting with Ollama's models.
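For reference, a Compose file for these two services might look roughly like the sketch below. This is illustrative only (the repository's own `docker-compose.yml` is authoritative): the volume name, ROCm device mappings, and `OLLAMA_BASE_URL` wiring are assumptions based on common Ollama/Open WebUI setups.

```yaml
# Illustrative sketch -- see the repository's docker-compose.yml for the real file.
services:
  ollama:
    image: ollama/ollama:rocm
    ports:
      - "11434:11434"
    volumes:
      - ollama:/root/.ollama   # persist downloaded models (volume name assumed)
    devices:
      - /dev/kfd               # ROCm device nodes needed for AMD GPU access
      - /dev/dri

  open-webui:
    image: ghcr.io/open-webui/open-webui:cuda
    ports:
      - "3000:8080"            # host port 3000 -> container port 8080
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    depends_on:
      - ollama

volumes:
  ollama:
```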
Before you begin, ensure you have the following installed:
- Docker
- Docker Compose
- AMD GPU Drivers (for ROCm support)
- Clone this repository:

  ```shell
  git clone https://github.com/kabir0st/romc-ollama-docker/
  cd romc-ollama-docker
  ```
- Start the services:

  ```shell
  docker compose up
  ```
- Access the services:
  - Ollama API: http://10.1.0.2:11434
  - Open WebUI: http://10.1.0.2:3000
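Beyond the web interface, you can talk to the Ollama API directly over HTTP. A minimal Python sketch (standard library only) for building a request to Ollama's `/api/generate` endpoint is shown below; the host address and the `llama3` model name are assumptions, so substitute your own host and whichever model you have pulled.

```python
import json
import urllib.request

# Adjust to your host (e.g. http://10.1.0.2:11434 for this setup).
OLLAMA_URL = "http://localhost:11434"


def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint."""
    payload = json.dumps({
        "model": model,       # model name assumed; use any model you've pulled
        "prompt": prompt,
        "stream": False,      # request one JSON response instead of a stream
    }).encode("utf-8")
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )


req = build_generate_request("llama3", "Why is the sky blue?")
# To actually send it (requires the container to be running):
#   with urllib.request.urlopen(req) as resp:
#       print(json.loads(resp.read())["response"])
```

The request builder is kept separate from the network call so you can point it at whatever host and model your deployment uses.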