## Available Models
| Model | Description | Context | Images | Tools |
|---|---|---|---|---|
| Llama 3.3 | Meta's latest open-source model | 128K | — | ✓ |
| Mistral | Efficient model from Mistral AI | 32K | — | ✓ |
| Code Llama | Specialized for code generation | 16K | — | ✓ |
## Authentication

None. No authentication is required for a local Ollama instance.
## Setup Steps

1. Install Ollama from ollama.ai
2. Run `ollama pull llama3.3`
3. Start the Ollama service
4. Configure `baseUrl` in OpenClaw
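Before wiring up OpenClaw, you can confirm the service from step 3 is running by querying Ollama's `/api/tags` endpoint, which lists locally pulled models. A minimal sketch using only the Python standard library; the function name here is ours, not part of either tool:

```python
import json
import urllib.error
import urllib.request


def ollama_models(base_url="http://localhost:11434"):
    """Return names of locally pulled models, or None if Ollama is unreachable."""
    try:
        # Ollama's /api/tags endpoint lists every model pulled on this machine.
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=2) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        return None
```

If this returns `None`, the Ollama service is not running; if `llama3.3` is missing from the returned list, rerun step 2.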
## Configuration Example

```json
{
  "models": {
    "providers": {
      "ollama": {
        "baseUrl": "http://localhost:11434"
      }
    }
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "ollama/llama3.3"
      }
    }
  }
}
```

## Use Ollama with Messaging Channels
Connect Ollama to your favorite messaging platforms:

- **WhatsApp**: Connect AI to WhatsApp via the Baileys protocol
- **Telegram**: Full-featured Telegram bot with commands and reactions
- **Discord**: Discord bot with slash commands and threads
- **Slack**: Slack app with Block Kit and threads
- **Signal**: Encrypted messaging via the signal-cli daemon
- **iMessage**: Apple iMessage integration (macOS only)
- **Matrix**: Decentralized Matrix protocol support
- **Microsoft Teams**: Enterprise Teams integration via the Microsoft SDK
- **Nostr**: Decentralized Nostr protocol support

## Frequently Asked Questions
### How much does Ollama cost with OpenClaw?

Ollama is free and open source and runs models locally on your own hardware, so there are no per-token API charges. OpenClaw itself is also free and open source. Your only costs are the hardware and electricity needed to run the models.
### How do I authenticate Ollama with OpenClaw?

No authentication is needed. Ollama runs locally and does not use API keys; simply point `baseUrl` in your OpenClaw configuration at your running Ollama server (by default `http://localhost:11434`).
### Which Ollama models work with OpenClaw?

OpenClaw supports all Ollama models, including Llama 3.3, Mistral, and Code Llama. You can configure your preferred model in `openclaw.json` or use model aliases for convenience.
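As a sketch of what an alias might look like, the fragment below maps short names to provider-qualified model ids. The `aliases` key and its placement are assumptions on our part, so check your OpenClaw version's configuration schema:

```json
{
  "models": {
    "aliases": {
      "llama": "ollama/llama3.3",
      "coder": "ollama/codellama"
    }
  }
}
```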
### Can I use Ollama as a fallback provider?

Yes. OpenClaw supports automatic failover: configure Ollama as a fallback in the model configuration, and if your primary provider fails, OpenClaw automatically switches to Ollama.
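Building on the configuration example above, a fallback setup might look like the following. Both the `fallback` key name and the `anthropic/claude-sonnet` primary model id are hypothetical placeholders, so verify them against your OpenClaw schema and provider setup:

```json
{
  "agents": {
    "defaults": {
      "model": {
        "primary": "anthropic/claude-sonnet",
        "fallback": "ollama/llama3.3"
      }
    }
  }
}
```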