# onlinellm-claw
OpenClaw plugin that registers **OpenRouter** as a model provider, giving you access to 300+ LLMs (Grok, Claude, Gemini, GPT, DeepSeek, …) through a single API key.
## Install
```bash
openclaw plugins install onlinellm-claw
```
Then add your API key:
```bash
openclaw onboard --auth-choice openrouter-api-key
```
Or set the environment variable:
```bash
export OPENROUTER_API_KEY=sk-or-...
```
## Usage
After installation, every catalog entry is available through the `/model` command as `openrouter/<model-id>`:
```
/model openrouter/x-ai/grok-4.1-fast:online
/model openrouter/anthropic/claude-sonnet-4-5
/model openrouter/google/gemini-2.5-pro-preview
/model openrouter/openai/gpt-4.1
/model openrouter/deepseek/deepseek-r2
```
## Plugin Config (openclaw.json)
```json5
{
  "plugins": {
    "entries": {
      "onlinellm-claw": {
        "enabled": true,
        "config": {
          // Auto-redirect ALL LLM calls to OpenRouter (replaces the built-in LLM):
          "autoRoute": true,
          "model": "x-ai/grok-4.1-fast:online",
          // Inline API key (prefer the onboarding wizard instead):
          // "apiKey": "sk-or-...",
          // Extra OpenRouter model IDs beyond the built-in catalog:
          // "extraModels": ["meta-llama/llama-4-maverick", "cohere/command-r-plus"]
        }
      }
    }
  }
}
```
### autoRoute
When `autoRoute: true`, the plugin intercepts **every** agent LLM call via the `before_model_resolve` hook and redirects it to the OpenRouter provider using the model you specify in `model`. This effectively replaces OpenClaw's built-in LLM backend without changing per-agent model settings.
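The redirect logic can be sketched as follows. This is a minimal illustration of the behavior described above, not the plugin's actual source; the type names and the hook signature are assumptions, since OpenClaw's real `before_model_resolve` interface may differ.

```typescript
// Hypothetical sketch of the autoRoute redirect. Names and shapes here are
// illustrative assumptions, not OpenClaw's actual hook API.
type ModelRef = { provider: string; model: string };

interface PluginConfig {
  autoRoute?: boolean;
  model?: string;
}

// Runs before OpenClaw resolves which provider/model serves a request.
function beforeModelResolve(requested: ModelRef, cfg: PluginConfig): ModelRef {
  if (cfg.autoRoute && cfg.model) {
    // Redirect every call to OpenRouter with the configured model,
    // ignoring whatever model the agent originally asked for.
    return { provider: "openrouter", model: cfg.model };
  }
  // autoRoute off (or no model set): leave the request untouched.
  return requested;
}
```

Per-agent model settings stay intact in config; they are simply overridden at resolve time for as long as `autoRoute` is enabled.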
## Built-in Model Catalog
| Model ID | Name |
|---|---|
| `x-ai/grok-4.1-fast:online` | Grok 4.1 Fast (online, web search) |
| `x-ai/grok-4.1` | Grok 4.1 |
| `x-ai/grok-3` | Grok 3 |
| `anthropic/claude-sonnet-4-5` | Claude Sonnet 4.5 |
| `anthropic/claude-opus-4-5` | Claude Opus 4.5 |
| `google/gemini-2.5-pro-preview` | Gemini 2.5 Pro Preview |
| `google/gemini-2.5-flash-preview` | Gemini 2.5 Flash Preview |
| `openai/gpt-4.1` | GPT-4.1 |
| `openai/o4-mini` | o4-mini (reasoning) |
| `deepseek/deepseek-r2` | DeepSeek R2 (reasoning) |
| `deepseek/deepseek-chat-v3-5` | DeepSeek Chat V3.5 |
Add more via `extraModels` in plugin config.
## API Key Resolution Order
1. Plugin config field `apiKey` (not recommended for security)
2. Auth profile `openrouter:default` (set via `openclaw onboard`)
3. Environment variable `OPENROUTER_API_KEY`
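The precedence above amounts to a simple first-match lookup. A minimal sketch, assuming a `KeySources` shape that gathers the three sources (the names are illustrative, not the plugin's internals):

```typescript
// Illustrative sketch of the documented key-resolution order.
// Field names are assumptions for the example, not actual plugin internals.
interface KeySources {
  pluginConfigKey?: string;                 // 1. "apiKey" in plugin config
  authProfileKey?: string;                  // 2. openrouter:default auth profile
  env: Record<string, string | undefined>;  // 3. process environment
}

function resolveApiKey(src: KeySources): string | undefined {
  // First non-null source wins, in the documented order.
  return (
    src.pluginConfigKey ??
    src.authProfileKey ??
    src.env["OPENROUTER_API_KEY"]
  );
}
```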
## Publish to npm
```bash
npm publish --access public
```
Then anyone can install with:
```bash
openclaw plugins install onlinellm-claw
```