llm-mem
LLM-agnostic persistent memory plugin for OpenClaude. Captures session observations, compresses with AI (Claude, Gemini, Ollama, OpenRouter), injects context into future sessions. Supports Claude Code CLI/Desktop/Web + OpenClaw Gateway.
Install
npm install -g @mevlutcanturaci/llm-mem
README
<h1 align="center">
<br>
<a href="https://github.com/mevlutcanturaci/llm-mem">
llm-mem
</a>
<br>
</h1>
<p align="center">
<a href="docs/i18n/README.zh.md">🇨🇳 中文</a> •
<a href="docs/i18n/README.zh-tw.md">🇹🇼 繁體中文</a> •
<a href="docs/i18n/README.ja.md">🇯🇵 日本語</a> •
<a href="docs/i18n/README.pt.md">🇵🇹 Português</a> •
<a href="docs/i18n/README.pt-br.md">🇧🇷 Português</a> •
<a href="docs/i18n/README.ko.md">🇰🇷 한국어</a> •
<a href="docs/i18n/README.es.md">🇪🇸 Español</a> •
<a href="docs/i18n/README.de.md">🇩🇪 Deutsch</a> •
<a href="docs/i18n/README.fr.md">🇫🇷 Français</a> •
<a href="docs/i18n/README.he.md">🇮🇱 עברית</a> •
<a href="docs/i18n/README.ar.md">🇸🇦 العربية</a> •
<a href="docs/i18n/README.ru.md">🇷🇺 Русский</a> •
<a href="docs/i18n/README.pl.md">🇵🇱 Polski</a> •
<a href="docs/i18n/README.cs.md">🇨🇿 Čeština</a> •
<a href="docs/i18n/README.nl.md">🇳🇱 Nederlands</a> •
<a href="docs/i18n/README.tr.md">🇹🇷 Türkçe</a> •
<a href="docs/i18n/README.uk.md">🇺🇦 Українська</a> •
<a href="docs/i18n/README.vi.md">🇻🇳 Tiếng Việt</a> •
<a href="docs/i18n/README.tl.md">🇵🇭 Tagalog</a> •
<a href="docs/i18n/README.id.md">🇮🇩 Indonesia</a> •
<a href="docs/i18n/README.th.md">🇹🇭 ไทย</a> •
<a href="docs/i18n/README.hi.md">🇮🇳 हिन्दी</a> •
<a href="docs/i18n/README.bn.md">🇧🇩 বাংলা</a> •
<a href="docs/i18n/README.ur.md">🇵🇰 اردو</a> •
<a href="docs/i18n/README.ro.md">🇷🇴 Română</a> •
<a href="docs/i18n/README.sv.md">🇸🇪 Svenska</a> •
<a href="docs/i18n/README.it.md">🇮🇹 Italiano</a> •
<a href="docs/i18n/README.el.md">🇬🇷 Ελληνικά</a> •
<a href="docs/i18n/README.hu.md">🇭🇺 Magyar</a> •
<a href="docs/i18n/README.fi.md">🇫🇮 Suomi</a> •
<a href="docs/i18n/README.da.md">🇩🇰 Dansk</a> •
<a href="docs/i18n/README.no.md">🇳🇴 Norsk</a>
</p>
<h4 align="center">Persistent memory compression system built for <a href="https://claude.com/claude-code" target="_blank">Claude Code</a>.</h4>
<p align="center">
<a href="LICENSE">
<img src="https://img.shields.io/badge/License-AGPL%203.0-blue.svg" alt="License">
</a>
<a href="package.json">
<img src="https://img.shields.io/badge/version-0.1.2-green.svg" alt="Version">
</a>
<a href="package.json">
<img src="https://img.shields.io/badge/node-%3E%3D18.0.0-brightgreen.svg" alt="Node">
</a>
<a href="https://github.com/mevlutcanturaci/awesome-claude-code">
<img src="https://awesome.re/mentioned-badge.svg" alt="Mentioned in Awesome Claude Code">
</a>
</p>
<p align="center">
<a href="https://trendshift.io/repositories/15496" target="_blank">
<picture>
<source media="(prefers-color-scheme: dark)" srcset="https://raw.githubusercontent.com/mevlutcanturaci/llm-mem/main/docs/public/trendshift-badge-dark.svg">
<source media="(prefers-color-scheme: light)" srcset="https://raw.githubusercontent.com/mevlutcanturaci/llm-mem/main/docs/public/trendshift-badge.svg">
<img src="https://raw.githubusercontent.com/mevlutcanturaci/llm-mem/main/docs/public/trendshift-badge.svg" alt="mevlutcanturaci/llm-mem | Trendshift" width="250" height="55"/>
</picture>
</a>
</p>
<br>
<table align="center">
<tr>
<td align="center">
<a href="https://github.com/mevlutcanturaci/llm-mem">
<picture>
<img
src="https://raw.githubusercontent.com/mevlutcanturaci/llm-mem/main/docs/public/cm-preview.gif"
alt="LLM-Mem Preview"
width="500"
>
</picture>
</a>
</td>
<td align="center">
<a href="https://www.star-history.com/#mevlutcanturaci/llm-mem&Date">
<picture>
<source
media="(prefers-color-scheme: dark)"
srcset="https://api.star-history.com/image?repos=mevlutcanturaci/llm-mem&type=date&theme=dark&legend=top-left"
/>
<source
media="(prefers-color-scheme: light)"
srcset="https://api.star-history.com/image?repos=mevlutcanturaci/llm-mem&type=date&legend=top-left"
/>
<img
alt="Star History Chart"
src="https://api.star-history.com/image?repos=mevlutcanturaci/llm-mem&type=date&legend=top-left"
width="500"
/>
</picture>
</a>
</td>
</tr>
</table>
<p align="center">
<a href="#quick-start">Quick Start</a> •
<a href="#how-it-works">How It Works</a> •
<a href="#mcp-search-tools">Search Tools</a> •
<a href="#documentation">Documentation</a> •
<a href="#configuration">Configuration</a> •
<a href="#troubleshooting">Troubleshooting</a> •
<a href="#license">License</a>
</p>
<p align="center">
llm-mem preserves context across sessions by automatically capturing tool-usage observations, compressing them into semantic summaries, and injecting those summaries into future sessions. Claude keeps continuity of project knowledge even after a session ends or reconnects.
</p>
---
## Quick Start
Install llm-mem globally from npm:
```bash
# Global install from npm
npm install -g @mevlutcanturaci/llm-mem
```
Or install it as a Claude Code plugin by entering these commands in a Claude Code session:
```
/plugin marketplace add mevlutcanturaci/llm-mem
/plugin install llm-mem
```
Restart Claude Code. Context from previous sessions will automatically appear in new sessions.
### 🦞 OpenClaw Gateway
Install llm-mem as a persistent memory plugin on [OpenClaw](https://openclaw.ai) gateways with a single command:
```bash
curl -fsSL https://raw.githubusercontent.com/mevlutcanturaci/llm-mem/main/install/openclaw.sh | bash
```
The installer handles dependencies, plugin setup, AI provider configuration, worker startup, and optional real-time observation feeds to Telegram, Discord, Slack, and more. See the [OpenClaw Integration Guide](https://github.com/mevlutcanturaci/llm-mem/tree/main/docs/openclaw-integration) for details.
**Key Features:**
- 🧠 **Persistent Memory** - Context survives across sessions
- 📊 **Progressive Disclosure** - Layered memory retrieval with token cost visibility
- 🔍 **Skill-Based Search** - Query your project history with mem-search skill
- 🖥️ **Web Viewer UI** - Real-time memory stream at http://localhost:37777
- 💻 **Claude Desktop Skill** - Search memory from Claude Desktop conversations
- 🔒 **Privacy Control** - Use `<private>` tags to exclude sensitive content from storage
- ⚙️ **Context Configuration** - Fine-grained control over what context gets injected
- 🤖 **Automatic Operation** - No manual intervention required
- 🔗 **Citations** - Reference past observations by ID; fetch one at http://localhost:37777/api/observation/{id} or browse them all in the web viewer
- 🧪 **Beta Channel** - Try experimental features like Endless Mode via version switching
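For example, a citation ID can be resolved against the worker's local HTTP API (the `/api/observation/{id}` endpoint and port 37777 are from the feature list above). A minimal TypeScript sketch; the response shape is not documented here, so it is typed loosely:

```typescript
// Default base URL of the llm-mem worker (port 37777, per the feature list).
const WORKER_BASE = "http://localhost:37777";

// Build the citation URL for an observation ID.
function observationUrl(id: string | number): string {
  return `${WORKER_BASE}/api/observation/${id}`;
}

// Fetch an observation. The JSON shape is undocumented here, so keep it loose.
async function fetchObservation(id: string | number): Promise<unknown> {
  const res = await fetch(observationUrl(id));
  if (!res.ok) throw new Error(`worker returned ${res.status}`);
  return res.json();
}
```

This uses the global `fetch` available in Node.js ≥ 18 (the package's minimum), so no extra dependencies are needed.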
---
## Documentation
📚 **[View Full Documentation](https://github.com/mevlutcanturaci/llm-mem/tree/main/docs/)** - Browse the full docs on GitHub
### Getting Started
- **[Installation Guide](https://github.com/mevlutcanturaci/llm-mem/tree/main/docs/installation)** - Quick start & advanced installation
- **[Usage Guide](https://github.com/mevlutcanturaci/llm-mem/tree/main/docs/usage/getting-started)** - How llm-mem works automatically
- **[Search Tools](https://github.com/mevlutcanturaci/llm-mem/tree/main/docs/usage/search-tools)** - Query your project history with natural language
- **[Beta Features](https://github.com/mevlutcanturaci/llm-mem/tree/main/docs/beta-features)** - Try experimental features like Endless Mode
### Best Practices
- **[Context Engineering](https://github.com/mevlutcanturaci/llm-mem/tree/main/docs/context-engineering)** - AI agent context optimization principles
- **[Progressive Disclosure](https://github.com/mevlutcanturaci/llm-mem/tree/main/docs/progressive-disclosure)** - Philosophy behind llm-mem's context priming strategy
### Architecture
- **[Overview](https://github.com/mevlutcanturaci/llm-mem/tree/main/docs/architecture/overview)** - System components & data flow
- **[Architecture Evolution](https://github.com/mevlutcanturaci/llm-mem/tree/main/docs/architecture-evolution)** - The journey from v3 to v5
- **[Hooks Architecture](https://github.com/mevlutcanturaci/llm-mem/tree/main/docs/hooks-architecture)** - How llm-mem uses lifecycle hooks
- **[Hooks Reference](https://github.com/mevlutcanturaci/llm-mem/tree/main/docs/architecture/hooks)** - 7 hook scripts explained
- **[Worker Service](https://github.com/mevlutcanturaci/llm-mem/tree/main/docs/architecture/worker-service)** - HTTP API & Bun management
- **[Database](https://github.com/mevlutcanturaci/llm-mem/tree/main/docs/architecture/database)** - SQLite schema & FTS5 search
- **[Search Architecture](https://github.com/mevlutcanturaci/llm-mem/tree/main/docs/architecture/search-architecture)** - Hybrid search with Chroma vector database
### Configuration & Development
- **[Configuration](https://github.com/mevlutcanturaci/llm-mem/tree/main/docs/configuration)** - Environment variables & settings
- **[Development](https://github.com/mevlutcanturaci/llm-mem/tree/main/docs/development)** - Building, testing, contributing
- **[Troubleshooting](https://github.com/mevlutcanturaci/llm-mem/tree/main/docs/troubleshooting)** - Common issues & solutions
---
## How It Works
**Core Components:**
1. **5 Lifecycle Hooks** - SessionStart, UserPromptSubmit, PostToolUse, Stop, SessionEnd (6 hook scripts)
2. **Smart Install** - Cached dependency checker (pre-hook script, not a lifecycle hook)
3. **Worker Service** - HTTP API on port 37777 with web viewer UI and 10 search endpoints, managed by Bun
4. **SQLite Database** - Stores sessions, observations, summaries
5. **mem-search Skill** - Natural language queries with progressive disclosure
6. **Chroma Vector Database** - Hybrid semantic + keyword search for intelligent context retrieval
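The "progressive disclosure" idea behind the mem-search skill can be illustrated with a small sketch: memory is exposed in layers (e.g. summary → observations → transcript), each with an estimated token cost, and layers are pulled in only while a token budget allows. Everything below (the layer names, the ~4-characters-per-token heuristic) is an illustrative assumption, not llm-mem's actual implementation:

```typescript
// Illustrative sketch of layered memory retrieval under a token budget.
// Layer names and the chars/4 heuristic are assumptions, not llm-mem's
// real data model.
interface MemoryLayer {
  name: string;
  content: string;
}

// Rough token estimate: ~4 characters per token.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Take layers in order (cheapest context first) until the budget runs out,
// so callers see the token cost of each disclosure step.
function selectLayers(layers: MemoryLayer[], budget: number): MemoryLayer[] {
  const selected: MemoryLayer[] = [];
  let used = 0;
  for (const layer of layers) {
    const cost = estimateTokens(layer.content);
    if (used + cost > budget) break;
    selected.push(layer);
    used += cost;
  }
  return selected;
}
```

The point of the sketch is the ordering: cheap summaries are always injected, while expensive raw detail is fetched only on demand.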
See the [Architecture Overview](https://github.com/mevlutcanturaci/llm-mem/tree/main/docs/architecture/overview) for details.