
Install Ollama Using Docker Compose

By Vladimir Mikhalev · Solutions Architect · Docker Captain · IBM Champion

This article is for those looking for a detailed and straightforward guide on installing Ollama using Docker Compose.

Ollama is a streamlined, modular framework designed for developing and operating language models locally.

TIP

Architecture Context

Choose self-hosted Ollama when you need local LLM inference with full data privacy — no prompts or responses leave your infrastructure. The OpenAI API or AWS Bedrock provide managed alternatives with larger model selection and zero GPU maintenance. Self-hosting is justified when data sensitivity prohibits external API calls or when predictable inference costs outweigh the capital expense of GPU hardware.

💾 You can find the repository used in this guide on GitHub.

heyvaldemar/ollama-traefik-letsencrypt-docker-compose
NOTE

We’ll use Traefik as our reverse proxy. It will obtain TLS certificates from Let’s Encrypt for your domain names and route incoming requests to the corresponding services based on those domains.

CAUTION

To obtain TLS certificates, you need A records in your external DNS zone pointing to the IP address of the server where Traefik is installed. If you created these records recently, wait before starting the installation of the services: full propagation across DNS servers can take anywhere from a few minutes to 48 hours, or even longer in rare cases.
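You can verify propagation yourself before proceeding. The sketch below assumes the `dig` utility (from the dnsutils package) is available; the domain and IP address are placeholders to replace with your own values.

```shell
# Verify DNS propagation before starting the stack.
# "ollama.example.com" and the IP below are placeholders; use your own values.
DOMAIN="ollama.example.com"
SERVER_IP="203.0.113.10"

# Query a public resolver; falls back to an empty result if dig is unavailable.
RESOLVED=$(dig +short A "$DOMAIN" @1.1.1.1 2>/dev/null | head -n1 || true)

if [ "$RESOLVED" = "$SERVER_IP" ]; then
  echo "DNS ready: $DOMAIN -> $RESOLVED"
else
  echo "DNS not ready yet: $DOMAIN currently resolves to '${RESOLVED:-nothing}'"
fi
```

If the record has not propagated yet, re-run the check periodically before launching the services.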

IMPORTANT

Docker Engine and Docker Compose must be installed on the server.

For a step-by-step guide on installing Docker Engine on Ubuntu Server, see Install Docker Engine and Docker Compose on Ubuntu Server

IMPORTANT

OpenSSH must be installed on the server, and TCP port 22 must be open so that you can connect to it over SSH.

To install OpenSSH on the server you can use the command:

Terminal window
sudo apt install openssh-server
NOTE

To connect to the server from a Windows system, you can use tools like PuTTY or MobaXterm.

NOTE

This guide walks you through connecting to a server with the iTerm2 terminal emulator on macOS.

CAUTION

You will need to open the following TCP ports for access to the services:

  • TCP port 80 - used by the Let’s Encrypt certificate authority to issue a free TLS certificate.
  • TCP port 443 - used to access the Ollama web interface.
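As a sketch, the ports above can be opened with ufw, the firewall bundled with Ubuntu. The helper function and the non-interactive `sudo -n` usage are illustrative; adapt the commands to whatever firewall you actually run.

```shell
# Sketch, assuming the ufw firewall bundled with Ubuntu; adapt for others.
# Guarded so the script is safe to run on machines without ufw.
open_port() {
  echo "allowing TCP port $1: $2"
  command -v ufw >/dev/null 2>&1 && sudo -n ufw allow "$1/tcp" || true
}

open_port 22  "SSH access"
open_port 80  "ACME challenge for Let's Encrypt"
open_port 443 "Ollama web interface over HTTPS"
```

Keeping port 22 open is deliberate: enabling a firewall without an SSH rule is an easy way to lock yourself out of a remote server.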

Connect to the server where you plan to install Ollama.

Next, create the networks your services will use.

Create a network for Traefik with the command:

Terminal window
docker network create traefik-network

Install Ollama Using Docker Compose - Step 1

Create a network for Ollama with the command:

Terminal window
docker network create ollama-network

Install Ollama Using Docker Compose - Step 2
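The two networks map onto the compose file roughly as follows. This fragment is illustrative only: the service names, images, and comments are assumptions, and the repository’s compose file is authoritative.

```yaml
# Illustrative fragment; the repository's compose file is authoritative.
services:
  traefik:
    image: traefik:latest
    networks:
      - traefik-network        # reachable from the internet on ports 80/443
  ollama:
    image: ollama/ollama:latest
    networks:
      - ollama-network         # internal traffic between Ollama and its UI
      - traefik-network        # lets Traefik route HTTPS requests to Ollama
networks:
  traefik-network:
    external: true             # created earlier with `docker network create`
  ollama-network:
    external: true
```

Marking both networks as `external` tells Compose to attach to the networks you just created instead of creating project-scoped ones.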

Next, clone the repository containing the configuration files Ollama needs to run.

You can clone the repository using the command:

Terminal window
git clone https://github.com/heyvaldemar/ollama-traefik-letsencrypt-docker-compose.git

Install Ollama Using Docker Compose - Step 3

Navigate to the directory with the repository using the command:

Terminal window
cd ollama-traefik-letsencrypt-docker-compose

Install Ollama Using Docker Compose - Step 4

Next, you need to change the variables in the .env file according to your requirements.

IMPORTANT

The .env file should be in the same directory as ollama-traefik-letsencrypt-docker-compose.yml.
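For orientation, a .env file for this kind of stack typically looks like the sketch below. The variable names and values here are illustrative assumptions; check the repository’s .env template for the authoritative list.

```
# Illustrative values only; variable names are assumptions,
# so check the repository's .env template for the real ones.
OLLAMA_HOSTNAME=ollama.example.com           # domain Traefik will serve
TRAEFIK_HOSTNAME=traefik.ollama.example.com  # Traefik dashboard domain
DEFAULT_MODELS=llama3                        # models to pull on first start
TRAEFIK_ACME_EMAIL=admin@example.com         # Let's Encrypt expiry notices
TRAEFIK_BASIC_AUTH=admin:...                 # htpasswd hash for the dashboard
```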

Now let’s start Ollama with the command:

Terminal window
docker compose -f ollama-traefik-letsencrypt-docker-compose.yml -p ollama up -d

Install Ollama Using Docker Compose - Step 5
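Once the stack is up, you can confirm that the containers started and inspect recent logs. This check is a sketch, guarded so it degrades gracefully on machines without Docker; it is harmless to run repeatedly.

```shell
# Verify that the stack started; safe to run repeatedly.
COMPOSE_FILE="ollama-traefik-letsencrypt-docker-compose.yml"

if command -v docker >/dev/null 2>&1; then
  docker compose -f "$COMPOSE_FILE" -p ollama ps || true            # container status
  docker compose -f "$COMPOSE_FILE" -p ollama logs --tail=20 || true  # recent logs
else
  echo "docker not found; run this on the deployment server"
fi
```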

To access the Ollama management panel, go to https://ollama.heyvaldemar.net from your workstation, where ollama.heyvaldemar.net is the domain name of my service. Replace it with your own domain name, pointed at the IP address of the server running Traefik, which will route the request to Ollama.

NOTE

You need to specify the domain name of the service, previously defined in the .env file.

Click on the “Sign up” button.

Install Ollama Using Docker Compose - Step 6

Next, provide your full name, an email address, and a password to create an Ollama administrator account.

Click on the “Create Account” button.

Install Ollama Using Docker Compose - Step 7

Welcome to the Ollama control panel.

Wait for the models listed in your .env file to download; how long this takes depends on your internet speed. Once they have downloaded, select any model from the upper-left corner of the interface to start using it.

Install Ollama Using Docker Compose - Step 8
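If you prefer not to wait for the .env list, you can also pull a model manually inside the Ollama container. The container name and model below are assumptions; confirm the actual container name with `docker ps`.

```shell
# Optionally pull a model by hand instead of waiting for the .env list.
# The container name "ollama" is an assumption; confirm it with `docker ps`.
MODEL="llama3"

if command -v docker >/dev/null 2>&1; then
  docker exec ollama ollama pull "$MODEL" || true   # download the model
  docker exec ollama ollama list || true            # show installed models
else
  echo "docker not found; run this on the deployment server"
fi
```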

To access the Traefik control panel, go to https://traefik.ollama.heyvaldemar.net from your workstation, where traefik.ollama.heyvaldemar.net is the domain name of my service. Replace it with your own domain name, pointed at the IP address of the server running Traefik.

NOTE

You need to specify the domain name of the service, previously defined in the .env file.

Enter the username and password previously set in the .env file, and click the “OK” button.

Install Ollama Using Docker Compose - Step 9

Welcome to the Traefik control panel.

Install Ollama Using Docker Compose - Step 10


Vladimir Mikhalev

Docker Captain  ·  IBM Champion  ·  AWS Community Builder

The Verdict — production-tested analysis on YouTube.

https://heyvaldemar.com/install-ollama-using-docker-compose/
Author: Vladimir Mikhalev
Published: 2024-09-27
License: CC BY-NC-SA 4.0