Channels

AlexaClaw Client

By SufiSR

OpenClaw Alexa channel plugin and proxy server for the AlexaClaw Alexa skill

GitHub

Install

pip install -r requirements.txt

Configuration Example

```jsonc
"channels": {
  "alexa": {
    "enabled": true,
    "dmPolicy": "open",
    "allowFrom": ["*"],
    "secret": "<ALEXA_PROXY_SECRET>",   // same value as in .env
    "webhookPort": 18790
    // "systemPrompt": "Keep responses under 3 sentences, optimized for voice."
  }
}
```

README

# AlexaClaw Client

Connects an [AlexaClaw](https://github.com/SufiSR/AlexaClaw) Alexa skill to an [OpenClaw](https://openclaw.dev) gateway. Two components work together:

| Component | Path | Role |
|-----------|------|------|
| **alexa-server** | `alexa-server/` | FastAPI proxy — receives HTTPS calls from the Alexa skill, forwards them to the OpenClaw plugin webhook |
| **alexa-channel plugin** | `openclaw/extensions/alexa-channel/` | OpenClaw channel plugin — runs inside the gateway, routes Alexa traffic to the `alexa-llm` agent |

```
Alexa Device
  -> AWS Lambda (AlexaClaw skill)
  -> Caddy (HTTPS at <your-domain>)
  -> alexa-server  (FastAPI, 127.0.0.1:8000)
  -> OpenClaw alexa-channel plugin  (127.0.0.1:18790)
      -> alexa-llm agent
  <- spoken response
```

---

## Prerequisites

- OpenClaw installed and running
- Python 3.11+ with `python3-venv`
- Caddy (for HTTPS / TLS)
- An AlexaClaw Alexa skill deployed on AWS Lambda — see [AlexaClaw on GitHub](https://github.com/SufiSR/AlexaClaw)

---

## 1. OpenClaw plugin

Copy the plugin into the OpenClaw extensions directory and restart the gateway:

```bash
cp -r openclaw/extensions/alexa-channel ~/.openclaw/extensions/alexa-channel
openclaw gateway restart
openclaw plugins list    # should show alexa-channel
```

Then apply the required changes to `~/.openclaw/openclaw.json` — see [openclaw.json changes](#openclaw-json-changes) below.

---

## 2. Alexa proxy server

```bash
cd alexa-server
python3 -m venv .venv
.venv/bin/pip install -r requirements.txt
cp .env.example .env
# edit .env — see variable reference below
```

**Key environment variables** (`.env`):

| Variable | Required | Description |
|----------|----------|-------------|
| `ALEXA_PROXY_SECRET` | yes | Bearer token shared with the Alexa skill. Set the same value in the AlexaClaw Lambda env. |
| `OPENCLAW_GATEWAY_TOKEN` | yes | OpenClaw gateway auth token. |
| `OPENCLAW_ALEXA_WEBHOOK_URL` | no | Plugin webhook URL. Default: `http://127.0.0.1:18790` |
| `OPENCLAW_GATEWAY_URL` | no | Gateway URL (fallback path). Default: `http://127.0.0.1:18789` |
| `OPENCLAW_MODEL_DEFAULT` | no | Model for the agent. Default: `openai-codex/gpt-5.1-codex-mini` |
| `OPENCLAW_ALEXA_TIMEOUT` | no | Seconds before returning a "still thinking" reply. Default: `45` |
| `OPENCLAW_ALEXA_THINKING` | no | Message spoken to the user on timeout. |
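
Putting the table together, a complete `.env` might look like this (all values are illustrative placeholders; the optional lines show the documented defaults):

```bash
# Required
ALEXA_PROXY_SECRET=change-me-to-a-long-random-string
OPENCLAW_GATEWAY_TOKEN=your-openclaw-gateway-token

# Optional (defaults shown)
OPENCLAW_ALEXA_WEBHOOK_URL=http://127.0.0.1:18790
OPENCLAW_GATEWAY_URL=http://127.0.0.1:18789
OPENCLAW_MODEL_DEFAULT=openai-codex/gpt-5.1-codex-mini
OPENCLAW_ALEXA_TIMEOUT=45
```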

### HTTPS with Caddy

Point your public domain at this server and place the included `Caddyfile` in `/etc/caddy/`:

```bash
sudo cp Caddyfile /etc/caddy/Caddyfile
# edit Caddyfile: replace the placeholder domain with your own
sudo systemctl reload caddy
```

Caddy obtains and renews TLS certificates automatically. Ensure ports 80 and 443 are open.
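
If you want to know what the `Caddyfile` needs to do before editing it, a minimal equivalent looks roughly like the sketch below (the repo's included file may differ; `alexa.example.com` is a placeholder for your domain):

```
alexa.example.com {
    # Terminate TLS here and forward plain HTTP to the proxy
    reverse_proxy 127.0.0.1:8000
}
```

With a site block like this, Caddy provisions the certificate for the named domain on its own.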

### Run as a systemd service (production)

```bash
sudo cp alexa_proxy.service /etc/systemd/system/
sudo systemctl daemon-reload
sudo systemctl enable --now alexa_proxy.service
```

For development, run in the foreground:

```bash
.venv/bin/uvicorn alexa_proxy:app --host 127.0.0.1 --port 8000 --log-level info
```
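
The timeout behavior configured by `OPENCLAW_ALEXA_TIMEOUT` and `OPENCLAW_ALEXA_THINKING` follows a simple pattern: wait for the gateway up to the budget, then fall back to a holding phrase. This is a minimal sketch of that pattern, not the proxy's actual code; `answer_with_timeout` and `slow_gateway` are hypothetical names:

```python
import asyncio

# Stand-in for OPENCLAW_ALEXA_THINKING
THINKING_REPLY = "I'm still thinking about that, ask me again in a moment."

async def answer_with_timeout(ask_gateway, timeout: float) -> str:
    """Return the gateway's reply, or the holding message if it is too slow.

    `ask_gateway` is any coroutine function resolving to reply text; in the
    real proxy this would be the HTTP call to the OpenClaw webhook.
    """
    try:
        return await asyncio.wait_for(ask_gateway(), timeout=timeout)
    except asyncio.TimeoutError:
        return THINKING_REPLY

async def slow_gateway() -> str:
    """Simulated gateway that takes a full second to answer."""
    await asyncio.sleep(1.0)
    return "Hello from the agent"

# A 0.1 s budget forces the fallback path:
# asyncio.run(answer_with_timeout(slow_gateway, 0.1)) -> THINKING_REPLY
```

Alexa skills must respond within a few seconds, so returning a spoken holding message beats letting the request die at the skill endpoint.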

---

## openclaw.json changes

The `openclaw.json` file at the root of this repo shows only the plugin-relevant additions. Merge these sections into your existing `~/.openclaw/openclaw.json`.

See [`openclaw.json`](./openclaw.json) for the full snippet, or copy the blocks below.

### Channel

```jsonc
"channels": {
  "alexa": {
    "enabled": true,
    "dmPolicy": "open",
    "allowFrom": ["*"],
    "secret": "<ALEXA_PROXY_SECRET>",   // same value as in .env
    "webhookPort": 18790
    // "systemPrompt": "Keep responses under 3 sentences, optimized for voice."
  }
}
```

### Bindings

```jsonc
"bindings": [
  { "match": { "channel": "alexa" }, "agentId": "alexa-llm" }
]
```

### Plugin

```jsonc
"plugins": {
  "allow": ["alexa-channel"],
  "entries": {
    "alexa-channel": { "enabled": true }
  }
}
```

### Agent

Under `agents.list`, add the `alexa-llm` agent (adjust the model to match your setup):

```jsonc
{
  "id": "alexa-llm",
  "name": "alexa-llm",
  "workspace": "~/.openclaw/workspace/agents/alexa-llm",
  "agentDir": "~/.openclaw/agents/alexa-llm/agent",
  "model": { "primary": "openai-codex/gpt-5.1-codex-mini" }
}
```

### Chat Completions fallback (optional)

```jsonc
"gateway": {
  "http": {
    "endpoints": {
      "chatCompletions": { "enabled": true }
    }
  }
}
```

---

## Verification

```bash
openclaw plugins doctor          # No plugin issues detected
openclaw channels list           # alexa: configured, enabled
ss -tlnp | grep 18790            # webhook port listening

# Smoke test the plugin directly
curl -sS -X POST http://127.0.0.1:18790 \
  -H 'Authorization: Bearer <ALEXA_PROXY_SECRET>' \
  -H 'Content-Type: application/json' \
  -d '{"messages":[{"role":"user","content":"Hello!"}]}'

# Smoke test the proxy
curl -sS http://127.0.0.1:8000/v1/responses \
  -H 'Authorization: Bearer <ALEXA_PROXY_SECRET>' \
  -H 'Content-Type: application/json' \
  -d '{"messages":[{"role":"user","content":"Hello!"}]}'
```
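
The same smoke tests can be driven from Python's standard library if `curl` is unavailable. `smoke_test` is an illustrative helper, not part of the repo; it mirrors the request shape shown in the curl commands above:

```python
import json
import urllib.request

def smoke_test(url: str, secret: str, text: str = "Hello!") -> str:
    """POST a minimal chat payload with the shared bearer token."""
    body = json.dumps({"messages": [{"role": "user", "content": text}]}).encode()
    req = urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {secret}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode()

# e.g. smoke_test("http://127.0.0.1:18790", "<ALEXA_PROXY_SECRET>")
#      smoke_test("http://127.0.0.1:8000/v1/responses", "<ALEXA_PROXY_SECRET>")
```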

---

## AlexaClaw Alexa Skill

The AWS Lambda function that drives the Alexa skill is a separate project. Install and configure it first:

**[github.com/SufiSR/AlexaClaw](https://github.com/SufiSR/AlexaClaw)**

Set the skill's endpoint URL to `https://<your-domain>/v1/responses` and configure `ALEXA_PROXY_SECRET` to the same value used in `.env`.