Overview
This guide walks you through connecting OpenAI Codex to Matrix using OpenClaw. You'll learn how to configure authentication, set up the channel, and deploy your AI assistant.
OpenAI Codex Features
- 128K context window
- Image generation and understanding
- Tool/function calling
- Fine-tuning available
Matrix Capabilities
- Direct messages
- Group chats
- Media support
- Reactions
- Thread support
Step 1: Configure OpenAI Codex
Standard API key authentication
- Create an account at platform.openai.com
- Navigate to API Keys section
- Generate a new API key
- Set OPENAI_API_KEY environment variable
Environment variable: OPENAI_API_KEY
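For local testing, you can export the key in your current shell before launching the gateway. This is a minimal sketch; the key value is a placeholder you replace with your own:
# Export the API key for the current shell session (placeholder value)
export OPENAI_API_KEY="sk-your-key-here"
Add the same line to your shell profile if you want it to persist across sessions.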
Step 2: Configure Matrix
- Create a Matrix account on your preferred homeserver
- Generate an access token (see the example after this list)
- Configure matrix in openclaw.json
- Start the gateway
- Invite the bot to rooms
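If your homeserver does not expose access tokens through its UI, one common way to obtain one is the standard Matrix client-server login endpoint. The sketch below uses curl against matrix.org; the bot username and password are placeholders:
# Log in as the bot account to obtain an access token (username and password are placeholders)
curl -s -X POST https://matrix.org/_matrix/client/v3/login \
  -H "Content-Type: application/json" \
  -d '{"type": "m.login.password", "identifier": {"type": "m.id.user", "user": "mybot"}, "password": "your-bot-password"}'
# Copy the access_token value from the JSON response into the environment (placeholder value)
export MATRIX_ACCESS_TOKEN="syt_your_token_here"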
Step 3: Combined Configuration
Add both configurations to your openclaw.json:
{
  "agents": {
    "defaults": {
      "model": {
        "primary": "openai-codex/gpt-5.2"
      }
    }
  },
  "models": {
    "providers": {
      "openai-codex": {
        "apiKey": "${OPENAI_API_KEY}"
      }
    }
  },
  "channels": {
    "matrix": {
      "homeserver": "https://matrix.org",
      "accessToken": "${MATRIX_ACCESS_TOKEN}",
      "userId": "@mybot:matrix.org"
    }
  }
}
Step 4: Start the Gateway
# Start the gateway
openclaw gateway start
# Check status
openclaw status
# View logs
openclaw logs --follow
Access Control
Matrix supports the following access control policies:
DM Policies
| Policy | Description |
|---|---|
| allowlist | Only senders in the allowFrom list are processed |
| pairing | Unknown senders receive a pairing code; an admin must approve |
| open | All DMs are processed (requires allowFrom: ["*"]) |
Group Policies
| Policy | Description |
|---|---|
| allowlist | Only groups in groupAllowFrom are processed |
| open | All groups are processed |
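As a rough illustration, these policies would sit alongside the rest of the matrix channel block in openclaw.json. The dmPolicy and groupPolicy key names below are assumptions for illustration only; allowFrom and groupAllowFrom are the lists referenced in the tables, and the user and room IDs are placeholders:
"channels": {
  "matrix": {
    "homeserver": "https://matrix.org",
    "accessToken": "${MATRIX_ACCESS_TOKEN}",
    "userId": "@mybot:matrix.org",
    "dmPolicy": "pairing",
    "allowFrom": ["@alice:matrix.org"],
    "groupPolicy": "allowlist",
    "groupAllowFrom": ["!your-room-id:matrix.org"]
  }
}
Check the OpenClaw channel reference for the exact key names; this sketch only shows where the policy settings plug into the configuration.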
Deploy Options
Choose how to deploy your OpenAI Codex + Matrix setup:
Local Deployment
Run on your personal machine with local-only access
VPS Deployment
Always-on deployment on a Linux VPS
Cloud Deployment
Managed container deployment on Fly.io or similar
Docker Deployment
Containerized deployment with Docker
Frequently Asked Questions
How do I set up OpenAI Codex with Matrix?
Configure OpenAI Codex as your AI provider and enable Matrix as a channel in openclaw.json. The gateway then automatically routes incoming Matrix messages to OpenAI Codex for processing.
Is OpenAI Codex a good choice for Matrix bots?
OpenAI Codex works well with Matrix. Its extensive ecosystem and tooling and solid performance across tasks make it well suited for privacy-focused teams and open source communities.