πŸš€ Ollama + Open WebUI Docker Setup πŸ–₯️

Welcome to the Ollama + Open WebUI Docker Setup repository! This project provides a seamless way to deploy Ollama and Open WebUI on systems powered by AMD GPUs (via ROCm) using Docker Compose. Whether you're a developer, researcher, or just curious, this setup will get you up and running in no time! πŸŽ‰


πŸ“¦ Services Overview

This setup includes two main services:

  1. Ollama πŸ¦™

    • Image: ollama/ollama:rocm
    • Port: 11434
    • Description: Ollama runs large language models locally; the rocm image targets AMD GPUs via ROCm.
  2. Open WebUI 🌐

    • Image: ghcr.io/open-webui/open-webui:cuda
    • Port: 3000 on the host (mapped to 8080 inside the container)
    • Description: Open WebUI provides a user-friendly web interface for chatting with Ollama's models. A compose sketch wiring the two services together follows below.
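For reference, a minimal docker-compose.yml along these lines would wire the two services together. This is a sketch, not necessarily the repository's exact file: mapping /dev/kfd and /dev/dri is the standard way to expose an AMD GPU to the ROCm container, and OLLAMA_BASE_URL points Open WebUI at the Ollama service over the compose network. (The open-webui:cuda tag bundles NVIDIA CUDA libraries for optional local embedding/speech features; on an AMD-only host the :main tag is a lighter alternative, since the UI itself runs on CPU.)

    services:
      ollama:
        image: ollama/ollama:rocm
        ports:
          - "11434:11434"
        devices:
          - /dev/kfd                # ROCm compute interface
          - /dev/dri                # GPU render nodes
        volumes:
          - ollama:/root/.ollama    # persist downloaded models

      open-webui:
        image: ghcr.io/open-webui/open-webui:cuda
        ports:
          - "3000:8080"             # host 3000 -> container 8080
        environment:
          - OLLAMA_BASE_URL=http://ollama:11434   # reach Ollama by its service name
        depends_on:
          - ollama
        volumes:
          - open-webui:/app/backend/data          # persist WebUI settings and chats

    volumes:
      ollama:
      open-webui: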

πŸ› οΈ Prerequisites

Before you begin, ensure you have the following installed:

  • Docker 🐳
  • Docker Compose πŸ™
  • AMD GPU Drivers (for ROCm support)
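To sanity-check these prerequisites before starting, you can confirm the Docker tooling is installed and the ROCm device nodes the Ollama container needs are present. A minimal check, assuming a Linux host with the amdgpu driver loaded:

    # Confirm Docker and the Compose plugin are installed
    docker --version
    docker compose version

    # ROCm containers need the kernel's KFD and DRI device nodes;
    # these should exist if the AMD GPU driver is loaded
    ls -l /dev/kfd /dev/dri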

πŸš€ Quick Start

  1. Clone this repository:

    git clone https://github.com/kabir0st/romc-ollama-docker/
    cd romc-ollama-docker
  2. Start the services (add -d to run them in the background):

    docker compose up
  3. Access the services:

    • Open WebUI: http://localhost:3000
    • Ollama API: http://localhost:11434
