Configuring OpenRouter as a Model Provider in OpenClaw
OpenRouter is a powerful model routing service that allows OpenClaw to access numerous AI models through a unified API interface. By configuring OpenRouter as a model provider, you can leverage its extensive catalog of models while maintaining OpenClaw's familiar workflow and tooling.
Setting Up OpenRouter
To configure OpenRouter as your model provider, you'll need an API key from OpenRouter and a few configuration changes in your OpenClaw setup.
Prerequisites
- An OpenRouter account
- An API key from OpenRouter (available at openrouter.ai/keys)
- OpenClaw installed and running
CLI Configuration
The easiest way to set up OpenRouter is through the OpenClaw onboarding wizard:
openclaw onboard --auth-choice apiKey --token-provider openrouter --token "$OPENROUTER_API_KEY"
This command will guide you through the authentication process and automatically configure your OpenClaw instance to use OpenRouter.
Manual Configuration
If you prefer to configure OpenRouter manually, add the following to your OpenClaw configuration file (~/.openclaw/openclaw.json):
{
  "env": { "OPENROUTER_API_KEY": "sk-or-v1-your-api-key-here" },
  "agents": {
    "defaults": {
      "model": { "primary": "openrouter/anthropic/claude-sonnet-4-5" }
    }
  }
}
Replace sk-or-v1-your-api-key-here with your actual OpenRouter API key, and adjust the model reference as needed for your use case.
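If you generate or update this file from a script, the structure above can be built programmatically. The helper below is purely illustrative (openrouter_config is not an OpenClaw API; it simply mirrors the documented JSON shape):

```python
import json

def openrouter_config(api_key: str, model: str) -> dict:
    """Build the documented ~/.openclaw/openclaw.json structure as a dict."""
    return {
        "env": {"OPENROUTER_API_KEY": api_key},
        "agents": {
            "defaults": {
                "model": {"primary": model},
            }
        },
    }

config = openrouter_config(
    "sk-or-v1-your-api-key-here",
    "openrouter/anthropic/claude-sonnet-4-5",
)
# Serialize with indentation so the file stays human-readable.
print(json.dumps(config, indent=2))
```

Writing the file through json.dumps also guarantees the output is valid JSON, which avoids a common class of hand-editing mistakes (trailing commas, unquoted keys).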
Understanding Model References
OpenRouter uses a specific naming convention for models: openrouter/<provider>/<model>. For example:
- openrouter/anthropic/claude-sonnet-4-5 - Anthropic's Claude Sonnet 4.5 model
- openrouter/meta-llama/llama-3.3-70b-instruct - Meta's Llama 3.3 70B instruct model
- openrouter/openai/gpt-4o - OpenAI's GPT-4o model
You can browse the complete list of available models at openrouter.ai/models.
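If you script against these references, note that only the first two slashes are structural: the gateway prefix, the provider, and then the model name, which may itself contain further slashes. A small illustrative helper (not part of OpenClaw) splits them accordingly:

```python
def parse_model_ref(ref: str) -> tuple[str, str, str]:
    """Split 'openrouter/<provider>/<model>' into its three parts.

    maxsplit=2 keeps any additional slashes inside the model segment intact.
    """
    gateway, provider, model = ref.split("/", 2)
    return gateway, provider, model

print(parse_model_ref("openrouter/anthropic/claude-sonnet-4-5"))
# ('openrouter', 'anthropic', 'claude-sonnet-4-5')
```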
Advanced Configuration
Model Fallbacks
To ensure reliability, configure fallback models in case your primary model is unavailable:
{
  "agents": {
    "defaults": {
      "model": {
        "primary": "openrouter/anthropic/claude-sonnet-4-5",
        "fallbacks": [
          "openrouter/anthropic/claude-opus-4-6",
          "openrouter/meta-llama/llama-3.3-70b-instruct"
        ]
      }
    }
  }
}
This configuration will attempt to use Sonnet 4.5 first, falling back to Opus 4.6, and finally to Llama 3.3 70B if needed.
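Conceptually, the fallback chain behaves like the loop below. This is a simplified sketch of the idea, not OpenClaw's actual retry logic, and the send callable stands in for whatever client issues the request:

```python
def call_with_fallbacks(models, send):
    """Try each model in order; return the first successful response."""
    last_error = None
    for model in models:
        try:
            return send(model)
        except Exception as exc:  # a real client would catch specific errors
            last_error = exc
    # Every model failed; surface the last error.
    raise last_error

chain = [
    "openrouter/anthropic/claude-sonnet-4-5",
    "openrouter/anthropic/claude-opus-4-6",
    "openrouter/meta-llama/llama-3.3-70b-instruct",
]
```

The ordering of the list is the whole policy: put your preferred model first and progressively cheaper or more available models after it.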
Image Model Configuration
If your workflow involves image processing, you may want to specify a dedicated image model:
{
  "agents": {
    "defaults": {
      "imageModel": {
        "primary": "openrouter/mistralai/pixtral-12b",
        "fallbacks": [
          "openrouter/black-forest-labs/flux-pro-1.1"
        ]
      }
    }
  }
}
Environment Variables
Instead of hardcoding your API key in the configuration file, you can use environment variables for better security:
export OPENROUTER_API_KEY="sk-or-v1-your-api-key-here"
Then reference it in your configuration:
{
  "env": { "OPENROUTER_API_KEY": "${OPENROUTER_API_KEY}" },
  "agents": {
    "defaults": {
      "model": { "primary": "openrouter/anthropic/claude-sonnet-4-5" }
    }
  }
}
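The ${OPENROUTER_API_KEY} placeholder is resolved from the process environment at load time. Standard shell-style expansion works like the Python illustration below (using the standard library's os.path.expandvars; OpenClaw's exact substitution rules may differ):

```python
import os
from os.path import expandvars

# Simulate the variable being exported in the shell before OpenClaw starts.
os.environ["OPENROUTER_API_KEY"] = "sk-or-v1-example"

# A ${VAR} placeholder is replaced with the variable's value.
resolved = expandvars("${OPENROUTER_API_KEY}")
print(resolved)  # sk-or-v1-example
```

The practical consequence: the variable must be exported in the environment that launches OpenClaw (e.g. your shell profile or service unit), or the placeholder will not resolve.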
Testing Your Configuration
After configuration, verify that everything is working correctly:
- Check your model status:
openclaw models status
- Test a simple request:
openclaw agent "Hello, world!"
- Verify you can switch models interactively in chat:
/model list
/model 1
Common Issues and Troubleshooting
Authentication Errors
If you receive authentication errors, verify that:
- Your API key is correct
- The environment variable OPENROUTER_API_KEY is properly set
- Your key has not expired
Model Not Found
If a model cannot be found, ensure that:
- The model reference format is correct (openrouter/provider/model)
- The model exists in OpenRouter's catalog
- You have access to the model (some models may require registration)
Rate Limiting
OpenRouter enforces rate limits based on your account type. If you encounter rate limiting:
- Check your usage at openrouter.ai/usage
- Consider upgrading your account for higher limits
- Implement request batching or caching in your workflows
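A common client-side mitigation is retrying with exponential backoff and jitter. The sketch below assumes a hypothetical RateLimitError standing in for whatever 429 error your HTTP client raises; it is not an OpenClaw or OpenRouter API:

```python
import random
import time

class RateLimitError(Exception):
    """Placeholder for an HTTP 429 Too Many Requests error."""

def with_backoff(request, max_retries=5, sleep=time.sleep):
    """Retry `request` on rate limits, doubling the delay each attempt."""
    for attempt in range(max_retries):
        try:
            return request()
        except RateLimitError:
            # Exponential delay plus jitter, capped at 30 seconds.
            delay = min(2 ** attempt + random.random(), 30)
            sleep(delay)
    # Final attempt: let any error propagate to the caller.
    return request()
```

The jitter term spreads out retries from concurrent clients so they do not all hammer the API at the same instant after a shared rate-limit window resets.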
Best Practices
Security
- Never commit API keys to version control
- Use environment variables or secure credential storage
- Rotate keys periodically
Performance
- Choose models appropriate for your use case (e.g., faster models for simple tasks)
- Use fallbacks to maintain availability
- Monitor latency and adjust your model selection as needed
Cost Management
- Monitor your usage through OpenRouter's dashboard
- Set budget alerts if available
- Consider using free models for development and testing
By following these guidelines, you can effectively configure and use OpenRouter as a model provider in OpenClaw, giving you access to a wide range of AI models with a simple, consistent interface.