Overview

This guide walks you through deploying an AI assistant powered by Ollama using Docker Deployment. You'll configure the provider, set up the deployment environment, and get your assistant running.

Why Ollama?

  • Completely local
  • No API costs
  • Privacy-focused

Why Docker Deployment?

  • Portable
  • Reproducible
  • Isolated

Requirements

  • Docker: Docker Engine 20.10+
  • Docker Compose: For multi-container orchestration
  • Ollama: a local Ollama installation (no API key or credentials required)

Step 1: Configure Ollama

No authentication is required for a local Ollama instance.

  1. Install Ollama from ollama.ai
  2. Run 'ollama pull llama3.3'
  3. Start Ollama service
  4. Configure baseUrl in OpenClaw
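The steps above can be run from a terminal. A minimal sketch; the install script URL and the service port are Ollama's documented defaults:

```shell
# Install Ollama (Linux/macOS install script)
curl -fsSL https://ollama.com/install.sh | sh

# Pull the model referenced in Step 3
ollama pull llama3.3

# Start the Ollama service (listens on http://localhost:11434 by default)
ollama serve &

# Verify the API is up and the model is listed
curl -s http://localhost:11434/api/tags
```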

Step 2: Prepare Docker Deployment Environment

  1. Create Dockerfile: Create a Dockerfile for OpenClaw
  2. Create docker-compose.yml: Set up Docker Compose configuration
  3. Build Image: Build the Docker image
    docker compose build
  4. Start Container: Start the container
    docker compose up -d
  5. Run Onboarding: Complete setup inside container
    docker compose exec openclaw openclaw onboard
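Step 2 asks you to create a Dockerfile but does not show one. A minimal sketch, assuming OpenClaw ships as the npm package `openclaw` (the same assumption the compose file in Step 4 makes); adjust the package name and port if yours differ:

```dockerfile
# Hypothetical Dockerfile for OpenClaw; package name and port are assumptions
FROM node:22-slim

# Install the OpenClaw CLI globally
RUN npm install -g openclaw@latest

# Persist state under /root/.openclaw (mounted as a volume in docker-compose.yml)
VOLUME /root/.openclaw

# Gateway port published in docker-compose.yml
EXPOSE 18789

CMD ["openclaw", "gateway"]
```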

Step 3: Configuration

Create your openclaw.json configuration:

{
  "agents": {
    "defaults": {
      "model": {
        "primary": "ollama/llama3.3"
      }
    }
  },
  "models": {
    "providers": {
      "ollama": {
        "baseUrl": "http://localhost:11434"
      }
    }
  }
}

Note: when OpenClaw runs inside a Docker container, localhost resolves to the container itself, not your machine. To reach an Ollama instance running on the host, point baseUrl at the host instead (for example http://host.docker.internal:11434).
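To sanity-check the baseUrl independently of OpenClaw, you can call Ollama's REST API directly. A small sketch; the helper names are mine, but the /api/generate endpoint and payload shape come from Ollama's API:

```python
import json
from urllib import request


def build_generate_request(base_url: str, model: str, prompt: str):
    """Build the URL and JSON body for Ollama's /api/generate endpoint."""
    url = base_url.rstrip("/") + "/api/generate"
    payload = {"model": model, "prompt": prompt, "stream": False}
    return url, payload


def generate(base_url: str, model: str, prompt: str) -> str:
    """Send a prompt to a running Ollama instance and return the response text."""
    url, payload = build_generate_request(base_url, model, prompt)
    req = request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Example (requires a running Ollama instance):
# print(generate("http://localhost:11434", "llama3.3", "Hello, are you working?"))
```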

Step 4: Deploy

# docker-compose.yml
services:
  openclaw:
    image: node:22-slim
    command: ["sh", "-c", "npm install -g openclaw@latest && openclaw gateway"]
    ports:
      - "18789:18789"
    volumes:
      - openclaw_data:/root/.openclaw
    extra_hosts:
      # Lets the container reach the host's Ollama at host.docker.internal:11434
      # (needed on Linux; Docker Desktop provides this hostname automatically)
      - "host.docker.internal:host-gateway"
    restart: unless-stopped

volumes:
  openclaw_data:

Step 5: Verify

# Check deployment status
openclaw status

# View logs
openclaw logs --follow

# Test with a message
openclaw test "Hello, are you working?"
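If the test fails, first confirm that OpenClaw can reach Ollama at all. A quick check, assuming the compose service name `openclaw` and a baseUrl pointing at host.docker.internal:

```shell
# From the host: is Ollama up?
curl -s http://localhost:11434/api/tags

# From inside the container (node:22-slim ships without curl, so use Node's built-in fetch):
docker compose exec openclaw node -e \
  "fetch('http://host.docker.internal:11434/api/tags').then(r => r.text()).then(console.log)"
```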

Connect to Channels

Now connect your deployed Ollama assistant to messaging channels: