Available Models

Model                                           Context   Images   Tools
Llama 3.3  (Meta's latest open-source model)    128K      —        ✓
Mistral    (Efficient model from Mistral AI)    32K       —        ✓
Code Llama (Specialized for code generation)    16K       —        ✓

Authentication

None

No authentication is required for a local Ollama instance.

Setup Steps:

  1. Install Ollama from ollama.ai
  2. Run 'ollama pull llama3.3'
  3. Start Ollama service
  4. Configure baseUrl in OpenClaw
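Once the service is running (step 3), you can verify it is reachable before pointing OpenClaw at it. A minimal sketch in Python, assuming the default port; `/api/tags` is Ollama's model-listing endpoint, and the helper name here is ours:

```python
import urllib.request
import urllib.error

def ollama_is_up(base_url="http://localhost:11434", timeout=2):
    """Return True if an Ollama server answers at base_url."""
    try:
        # /api/tags lists locally pulled models; any 200 means the server is up
        with urllib.request.urlopen(base_url + "/api/tags", timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

print("Ollama reachable:", ollama_is_up())
```

If this prints False, check that the Ollama service is running and that nothing else is bound to port 11434.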

Configuration Example

{
  "models": {
    "providers": {
      "ollama": {
        "baseUrl": "http://localhost:11434"
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "ollama/llama3.3"
      }
    }
  }
}
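A quick way to sanity-check a config like the example above is to load it and confirm that the agent's primary model points at a provider that is actually configured. The field names follow the snippet; the check itself is just a sketch:

```python
import json

config = json.loads("""
{
  "models": {
    "providers": {
      "ollama": { "baseUrl": "http://localhost:11434" }
    }
  },
  "agents": {
    "defaults": {
      "model": { "primary": "ollama/llama3.3" }
    }
  }
}
""")

# The primary model is written as "provider/model"; split on the first slash
provider, model = config["agents"]["defaults"]["model"]["primary"].split("/", 1)
assert provider in config["models"]["providers"], f"unknown provider: {provider}"
base_url = config["models"]["providers"][provider]["baseUrl"]
print(provider, model, base_url)  # → ollama llama3.3 http://localhost:11434
```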

Use Ollama with Messaging Channels

OpenClaw can route conversations from your messaging platforms to an Ollama-backed agent, so the same local model answers wherever you chat.

Frequently Asked Questions

How much does Ollama cost with OpenClaw?
Ollama is free: it runs models locally on your own hardware, so there are no per-token API charges. OpenClaw itself is also free and open source. Your only cost is the compute used to run the models.
How do I authenticate Ollama with OpenClaw?
Ollama requires no authentication for a local instance. There is no API key to configure; just point OpenClaw's baseUrl at your running Ollama server (http://localhost:11434 by default).
Which Ollama models work with OpenClaw?
OpenClaw supports all Ollama models, including Llama 3.3, Mistral, and Code Llama. You can configure your preferred model in openclaw.json or use model aliases for convenience.
Can I use Ollama as a fallback provider?
Yes, OpenClaw supports automatic failover. You can configure Ollama as a fallback in the model configuration. If your primary provider fails, OpenClaw will automatically switch to Ollama.
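A sketch of what such a failover setup could look like, extending the configuration example above. The `fallback` key name and the placeholder primary model are assumptions; consult your OpenClaw version's schema for the exact field:

```json
{
  "agents": {
    "defaults": {
      "model": {
        "primary": "your-primary-provider/model",
        "fallback": "ollama/llama3.3"
      }
    }
  }
}
```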