Lightclaw

By OthmaneBlial


<div align="center">

  <h1>🦞 LightClaw</h1>

  <h3>The Featherweight Core of OpenClaw — Your AI Agent in a Tiny Codebase</h3>

  <p><strong>OpenClaw-inspired Python AI agent</strong> for Telegram, long-term memory, and multi-provider LLM support.</p>

  <p>
    <img src="https://img.shields.io/badge/Python-3.10+-3776AB?style=flat&logo=python&logoColor=white" alt="Python">
    <img src="https://img.shields.io/badge/Core-lean-brightgreen" alt="Core">
    <img src="https://img.shields.io/badge/Repo-lightweight_core-blue" alt="Repo">
    <img src="https://img.shields.io/badge/LLM_Providers-6-purple" alt="Providers">
    <img src="https://img.shields.io/badge/RAM-<50MB-orange" alt="RAM">
    <img src="https://img.shields.io/badge/license-MIT-green" alt="License">
  </p>

  <p><i>Fork it. Hack it. Ship it. No framework tax.</i></p>

</div>

---

## Why LightClaw Exists

**OpenClaw** is a powerful, full-featured AI agent platform — but it's also *big*. Dozens of packages, multiple channels, tool registries, message buses, plugin systems. It's built for scale and enterprise use.

**LightClaw** is the opposite. It's the *distilled essence* of the OpenClaw idea, stripped down to the atomic minimum.

If you are searching for an **OpenClaw alternative**, **OpenClaw in Python**, or a **self-hosted Telegram AI assistant with memory**, this repository is built for that exact use case.

```
OpenClaw:     Large multi-app monorepo │ TypeScript-first │ many channels + platform apps
LightClaw:    Focused Python core      │ Telegram-first   │ 6 providers │ lightweight runtime
```

As of February 2026, the official OpenClaw repository shows 12k+ commits and 200k+ GitHub stars.

Think of LightClaw as **the starter engine** — the part of a rocket that ignites first. It contains the core DNA of OpenClaw (LLM routing, memory, conversational agent) but removes everything else. No message bus. No plugin registry. No tool orchestration. Just a direct pipeline:

```
📱 Telegram Message → 🧠 Memory Recall → 🤖 LLM → 💡 HTML Format → 💬 Reply
```
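The pipeline is small enough to sketch in a few lines of Python. The function and parameter names below are illustrative only, not the actual `main.py` API:

```python
# Illustrative sketch of the LightClaw pipeline; names here are
# hypothetical, not the real main.py API.

def handle_telegram_message(text, recall, ask_llm, to_html):
    """One pass: memory recall -> LLM call -> HTML formatting -> reply."""
    memories = recall(text)                         # semantic memory recall
    prompt = f"Relevant memories:\n{memories}\n\nUser: {text}"
    return to_html(ask_llm(prompt))                 # LLM answer as Telegram HTML

# Wire it up with stub dependencies just to see the flow:
reply = handle_telegram_message(
    "hello",
    recall=lambda q: "(none)",
    ask_llm=lambda p: "Hi there!",
    to_html=lambda s: f"<b>{s}</b>",
)
print(reply)  # <b>Hi there!</b>
```

Because each stage is just a callable, swapping the memory strategy or provider is a matter of passing a different function.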

## Looking for OpenClaw?

- OpenClaw GitHub: https://github.com/openclaw/openclaw
- OpenClaw docs: https://docs.openclaw.ai/
- LightClaw focuses on the lightweight Python path: Telegram-first, memory-enabled, and easy to fork.

## Who Is This For?

<table>
  <tr>
    <td>🧑‍💻 <b>Builders</b></td>
    <td>You want to build <i>your own</i> AI assistant without inheriting a massive codebase. Fork LightClaw, add what you need, nothing more.</td>
  </tr>
  <tr>
    <td>🎓 <b>Learners</b></td>
    <td>You want to understand how AI agents work — memory, RAG, LLM routing — in code you can read in 30 minutes.</td>
  </tr>
  <tr>
    <td>⚡ <b>Minimalists</b></td>
    <td>You need a personal AI bot on a $5/month VPS. No Docker. No build steps. Just <code>./lightclaw run</code>.</td>
  </tr>
  <tr>
    <td>🔬 <b>Tinkerers</b></td>
    <td>You want to experiment with different LLM providers, memory strategies, or prompt engineering without fighting a framework.</td>
  </tr>
</table>

## The Core Idea

> **OpenClaw is the Industrial Complex. LightClaw is the Precision Workbench.**
>
> You don't need an entire industrial complex to build a custom tool. You need a workbench with the right instruments. LightClaw gives you exactly that — a clean, readable, forkable foundation that does one thing well: **connect you to an AI through Telegram, with infinite memory.**
>
> Add Discord support? Drop in a file. Need tool calling? Add a function. Want vector search with FAISS? Swap out 20 lines in `memory.py`. The codebase is small enough that *you own it completely*.

## Features

🧠 **Infinite Memory** — Every conversation is persisted in SQLite with TF-IDF vector embeddings. The bot recalls relevant context from days, weeks, or months ago via semantic search (RAG).

🔌 **6 LLM Providers** — OpenAI (ChatGPT), xAI (Grok), Anthropic (Claude), Google (Gemini), DeepSeek, Z-AI (GLM). Switch providers by changing one line in `.env`.

📱 **Telegram Native** — Polling-based bot with "Thinking… 💭" placeholders, HTML-formatted responses, typing indicators, and rich commands.

🎭 **Customizable Personality** — Edit `.lightclaw/workspace/SOUL.md`, `IDENTITY.md`, and `USER.md` to shape your bot's character, identity, and personal context.

🧩 **Skill System (ClawHub + Local)** — Install skills from `clawhub.ai`, activate them per chat with `/skills`, and create your own custom skills locally.

🤖 **Local Agent Delegation** — Delegate large build tasks to installed local coding agents (`codex`, `claude`, `opencode`) with `/agent`, while LightClaw reports workspace change summaries back in Telegram.

🛠️ **Workspace File Operations + Diff Summaries** — Large code is written directly to `.lightclaw/workspace` (not dumped in chat). LightClaw applies create/edit operations, then returns concise operation and diff-line summaries.

🧱 **Truncation Recovery for Large Files** — If an LLM response is cut mid-file, LightClaw attempts continuation/repair passes (including HTML completion) before finalizing the saved file.

🎙️ **Voice Messages** — Automatic voice transcription via Groq Whisper (optional). Send a voice note and the bot transcribes and responds.

📸 **Photo & Document Support** — Send images and files; the bot acknowledges them and processes captions through the agent loop.

🧹 **Smart Context Management** — Auto-summarization when conversations grow too long, plus emergency context-window compression with retry on overflow.

📦 **Small Core** — `main.py` + `memory.py` + `providers.py` + `config.py` + the `lightclaw` CLI. No hidden complexity. No abstractions for the sake of abstractions.

🚀 **Instant Startup** — No compilation, no Docker, no build pipeline. `./lightclaw run` and you're running.
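To make the memory feature concrete, here is a minimal, dependency-free sketch of TF-IDF recall. The real `memory.py` persists vectors in SQLite and its exact weighting may differ; this is only the idea:

```python
# Minimal TF-IDF recall sketch; the real memory.py stores vectors in
# SQLite and may weight terms differently.
import math
from collections import Counter

def tfidf_vectors(docs):
    """Smoothed TF-IDF vectors for a list of whitespace-tokenized docs."""
    tokenized = [d.lower().split() for d in docs]
    df = Counter(t for doc in tokenized for t in set(doc))
    n = len(docs)
    return [
        {t: tf * (math.log((1 + n) / (1 + df[t])) + 1) for t, tf in Counter(doc).items()}
        for doc in tokenized
    ]

def cosine(a, b):
    """Cosine similarity between two sparse term-weight dicts."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def recall(query, memories, top_k=1):
    """Return the top_k stored messages most similar to the query."""
    vecs = tfidf_vectors(memories + [query])
    query_vec = vecs[-1]
    ranked = sorted(zip(memories, vecs[:-1]),
                    key=lambda pair: cosine(query_vec, pair[1]), reverse=True)
    return [m for m, _ in ranked[:top_k]]

memories = ["my dog is named rex", "i work on rust compilers", "favorite food is ramen"]
print(recall("what is my dog called", memories))  # ['my dog is named rex']
```

Swapping this for FAISS or real embeddings means replacing `tfidf_vectors` and `cosine` while keeping `recall`'s shape the same.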

## Architecture

```
┌──────────────────────────────────────────────────────────────────┐
│                      main.py                                     │
│                                                                  │
│  ┌─────────────────────────────────────────┐                     │
│  │ markdown_to_telegram_html()             │  MD → HTML converter│
│  │ load_personality()                      │  .lightclaw/workspace/*.md │
│  │ build_system_prompt()                   │  Dynamic prompts    │
│  │ transcribe_voice()                      │  Groq Whisper       │
│  └─────────────────────────────────────────┘                     │
│                                                                  │
│  LightClawBot                                                    │
│  ├── handle_message()    ← text messages                         │
│  ├── handle_voice()      ← voice transcription                   │
│  ├── handle_photo()      ← image handling                        │
│  ├── handle_document()   ← file handling                         │
│  ├── _process_user_message()                                     │
│  │     │                                                         │
│  │     ├─ 1. Send "Thinking… 💭" placeholder                     │
│  │     ├─ 2. Recall memories      ◄── memory.py                  │
│  │     ├─ 3. Build prompt              SQLite + TF-IDF RAG       │
│  │     ├─ 4. Call LLM + retry     ◄── providers.py               │
│  │     ├─ 5. Edit placeholder          6 providers unified       │
│  │     └─ 6. Summarize if needed                                 │
│  │                                                               │
│  └── cmd_start/help/clear/wipe_memory/memory/recall/skills/agent/show │
│                                                                  │
│  config.py ◄── .env file                                         │
└──────────────────────────────────────────────────────────────────┘
```
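Several of the supported providers expose OpenAI-compatible chat endpoints, so provider routing can be as simple as a lookup keyed by `LLM_PROVIDER`. The sketch below is hypothetical (the real `providers.py` interface may differ, and Anthropic and Gemini use their own API shapes):

```python
# Hypothetical provider-routing sketch; providers.py's real interface may
# differ, and Anthropic/Gemini use their own non-OpenAI-shaped APIs.
import os

# Base URLs for the providers with OpenAI-compatible chat endpoints.
OPENAI_COMPATIBLE = {
    "openai":   "https://api.openai.com/v1",
    "xai":      "https://api.x.ai/v1",
    "deepseek": "https://api.deepseek.com",
}

def pick_provider():
    """Resolve LLM_PROVIDER from the environment, as config.py might."""
    name = os.environ.get("LLM_PROVIDER", "openai")
    if name not in OPENAI_COMPATIBLE:
        raise ValueError(f"unsupported provider in this sketch: {name}")
    return name, OPENAI_COMPATIBLE[name]

os.environ["LLM_PROVIDER"] = "deepseek"
print(pick_provider())  # ('deepseek', 'https://api.deepseek.com')
```

This is why switching providers is a one-line `.env` change: only the lookup key moves.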

## Quick Start

### ⚡ One-Command Setup (Recommended)

```bash
git clone https://github.com/OthmaneBlial/lightclaw.git && cd lightclaw && bash setup.sh
```

The interactive setup wizard will walk you through:
1. Choosing your AI provider (OpenAI, xAI, Claude, Gemini, DeepSeek, Z-AI)
2. Entering your API key
3. Creating a Telegram bot via @BotFather (step-by-step guide)
4. Optional voice transcription setup
5. Auto-starting your bot 🚀

### 🔧 Manual Setup

**1. Clone and install**

```bash
git clone https://github.com/OthmaneBlial/lightclaw.git
cd lightclaw
pip install -r requirements.txt
```

**2. Onboard (recommended)**

```bash
./lightclaw onboard
```

This creates:
- `.env` (if missing)
- `.lightclaw/workspace/` (runtime personality files)
- `.lightclaw/lightclaw.db` (runtime DB path)

Then edit `.env` with your API key and Telegram bot token:

```env
# Choose your provider: openai | xai | claude | gemini | deepseek | zai
LLM_PROVIDER=openai
LLM_MODEL=latest
OPENAI_API_KEY=sk-...
DEEPSEEK_API_KEY=

# Get a token from @BotFather on Telegram
TELEGRAM_BOT_TOKEN=123456:ABC...

# Optional: restrict to your user ID (get it from @userinfobot)
TELEGRAM_ALLOWED_USERS=123456789

# Optional tuning for large code/file generation
MAX_OUTPUT_TOKENS=12000
LOCAL_AGENT_TIMEOUT_SEC=1800

# Optional delegated local-agent safety policy
LOCAL_AGENT_SAFETY_MODE=off
LOCAL_AGENT_DENY_PATTERNS=

# Skills (default registry)
SKILLS_HUB_BASE_URL=https://clawhub.ai
SKILLS_STATE_PATH=.lightclaw/skills_state.json
```
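The `.env` file above is plain `KEY=VALUE` lines. A minimal stdlib parser illustrates what `config.py` has to do when it loads this file; the real implementation may use python-dotenv or handle quoting differently:

```python
# Minimal .env parser sketch; the real config.py may behave differently
# (e.g. quoting rules, or using python-dotenv).
import os

def load_env(path=".env"):
    """Parse KEY=VALUE lines, skipping blanks and # comments."""
    values = {}
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip()
    os.environ.update(values)
    return values

# Example: write a tiny .env-style file and load it.
with open("demo.env", "w", encoding="utf-8") as fh:
    fh.write("# comment\nLLM_PROVIDER=openai\nLLM_MODEL=latest\n")
print(load_env("demo.env"))  # {'LLM_PROVIDER': 'openai', 'LLM_MODEL': 'latest'}
```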

**3. Customize (Optional)**

Edit the personality files in `.lightclaw/workspace/`:

```
.lightclaw/workspace/
├── IDENTITY.md   # Bot's name, purpose, philosophy
├── SOUL.md       # Personality traits and values
└── USER.md       # Your preferences and personal context
```
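Conceptually, these three files are just folded into the system prompt. A sketch of that idea, assuming the actual `load_personality()` may assemble things differently:

```python
# Sketch of folding the workspace personality files into one system
# prompt; the actual load_personality() may differ.
from pathlib import Path

def load_personality(workspace):
    """Concatenate IDENTITY.md, SOUL.md, and USER.md, skipping missing files."""
    sections = []
    for name in ("IDENTITY.md", "SOUL.md", "USER.md"):
        path = Path(workspace) / name
        if path.exists():
            sections.append(f"## {name}\n{path.read_text(encoding='utf-8').strip()}")
    return "\n\n".join(sections)

# Example against a throwaway workspace directory:
ws = Path("demo_workspace")
ws.mkdir(exist_ok=True)
(ws / "SOUL.md").write_text("Helpful and concise.", encoding="utf-8")
print(load_personality(ws))  # prints "## SOUL.md" then "Helpful and concise."
```

Because missing files are skipped, you can start with just one of the three and add the others later.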

**4. Run**

```bash
./lightclaw run
```

That's it. Open Telegram, find your bot, say hello. 🦞

> Development mode still works with `python main.py` (it now defaults to `.lightclaw/workspace`).

## CLI Commands

```bash
lightclaw onboard   # initialize .env + .lightclaw/workspace in current directory
lightclaw onboard --reset-env  # reset existing .env from latest template
lightclaw onboard --configure  # guided provider/model/key setup on current .env
lightclaw run       # run the bot
```
