Overview
This guide walks you through deploying an AI assistant powered by Google Gemini using Cloud Deployment. You'll configure the provider, set up the deployment environment, and get your assistant running.
Why Google Gemini?
- 1M token context window
- Multimodal understanding
- Fast inference
Why Cloud Deployment?
- Managed infrastructure
- Global edge locations
- Automatic TLS
Requirements
- Fly.io Account: an account on Fly.io (or a similar PaaS)
- flyctl CLI: the Fly.io command-line tool
- Persistent Volume: stores OpenClaw state across deploys and restarts
- Google Gemini credentials: an API key or other supported authentication method
Step 1: Configure Google Gemini
Google AI API key
- Go to ai.google.dev
- Create or select a project
- Enable the Gemini API
- Generate an API key
- Set GOOGLE_API_KEY environment variable
Environment Variable:
GOOGLE_API_KEY
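Once the key is exported, a quick request against the public generateContent endpoint confirms it is valid before you wire it into the deployment. A minimal smoke test, assuming the standard Google AI endpoint and the model named in Step 3:
# Make the key available to your shell (replace with your real key)
export GOOGLE_API_KEY="your-api-key"
# Optional smoke test: a JSON response (rather than an auth error) means the key works
curl -s "https://generativelanguage.googleapis.com/v1beta/models/gemini-2.0-flash-exp:generateContent?key=${GOOGLE_API_KEY}" \
  -H "Content-Type: application/json" \
  -d '{"contents":[{"parts":[{"text":"Hello"}]}]}'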
Step 2: Prepare Cloud Deployment Environment
- Install flyctl: Install the Fly.io CLI
curl -L https://fly.io/install.sh | sh
- Login to Fly: Authenticate with Fly.io
fly auth login
- Create App: Create a new Fly.io app
fly apps create openclaw-gateway
- Create Volume: Create a persistent volume for state
fly volumes create openclaw_data --size 1
- Configure fly.toml: Set up the Fly.io configuration file (see Step 4)
- Set Secrets: Configure environment variables
fly secrets set GOOGLE_API_KEY=...
- Deploy: Deploy to Fly.io
fly deploy
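If a later step fails, flyctl can show what the app actually has configured. A couple of optional checks, assuming the app and volume names used above:
# List configured secrets (names only; values are never shown)
fly secrets list --app openclaw-gateway
# Confirm the persistent volume exists
fly volumes list --app openclaw-gateway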
Step 3: Configuration
Create your openclaw.json configuration:
{
  "agents": {
    "defaults": {
      "model": {
        "primary": "google-gemini/gemini-2.0-flash-exp"
      }
    }
  },
  "models": {
    "providers": {
      "google-gemini": {
        "apiKey": "${GOOGLE_API_KEY}"
      }
    }
  }
}
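A quick local syntax check catches typos before they reach the deployment; this sketch assumes Python 3 is available, but any JSON validator works:
# Prints the parsed config, or a parse error with a non-zero exit code
python3 -m json.tool openclaw.json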
Step 4: Deploy
# fly.toml
app = "openclaw-gateway"
primary_region = "iad"
[build]
image = "node:22-slim"
[env]
NODE_ENV = "production"
OPENCLAW_STATE_DIR = "/data"
[mounts]
source = "openclaw_data"
destination = "/data"
[[services]]
internal_port = 18789
protocol = "tcp"
[[services.ports]]
port = 443
handlers = ["tls", "http"]
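With fly.toml in place, set the Gemini key as a secret if you have not already, then deploy and watch the app come up (app name as created in Step 2):
# Set the Gemini key as a Fly secret, then deploy
fly secrets set GOOGLE_API_KEY=... --app openclaw-gateway
fly deploy
# Check that the new machines are running
fly status --app openclaw-gateway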
Step 5: Verify
# Check deployment status
openclaw status
# View logs
openclaw logs --follow
# Test with a message
openclaw test "Hello, are you working?"
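You can also check reachability from outside: Fly apps are served at <app>.fly.dev by default, so any HTTPS response (even an error page) confirms routing and TLS termination are working. Assuming the default hostname for the app created above:
# Fetch response headers from the public endpoint
curl -sI https://openclaw-gateway.fly.dev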
Connect to Channels
Now connect your deployed Google Gemini assistant to messaging channels:
WhatsApp
Connect AI to WhatsApp via the Baileys protocol
Telegram
Full-featured Telegram bot with commands and reactions
Discord
Discord bot with slash commands and threads
Slack
Slack app with Block Kit and threads
Signal
Encrypted messaging via signal-cli daemon
iMessage
Apple iMessage integration (macOS only)