
Clawdreamer

By EESIZ

Give your AI agent the ability to dream. Nightly memory consolidation modeled after sleep neuroscience.

GitHub

Install

pip install -r requirements.txt

Configuration Example

{
  "plugins": {
    "slots": {
      "memory": "memory-lancedb"
    },
    "entries": {
      "memory-lancedb": {
        "enabled": true,
        "config": {
          "embedding": {
            "apiKey": "${OPENAI_API_KEY}",
            "model": "text-embedding-3-small"
          },
          "autoCapture": true,
          "autoRecall": true
        }
      }
    }
  },
  "hooks": {
    "internal": {
      "enabled": true,
      "entries": {
        "session-memory": {
          "enabled": true
        }
      }
    }
  }
}

README

# Dreamer

> I was trying to integrate [memento-mcp](https://github.com/JinHo-von-Choi/memento-mcp) into OpenClaw. It didn't work. I went to bed frustrated.
> Then I had a dream where I was riding a bus to work with Jang Wonyoung. Woke up, and suddenly the idea was just... there.

> AI agents never sleep. They never dream. → That's actually their biggest problem.

> Dreaming is supposedly what happens when your consciousness accidentally wakes up while your brain is busy compressing and organizing memories during sleep.
> So what if we gave AI agents the same process?
>   - Instead of just dumping memories into an ever-growing database, what if we mimicked the way biological brains -- refined over millions of years of evolution -- actually handle memory?
> Maybe we could escape the endless token consumption that comes with memory systems,
> while creating a "memory" that's imperfect but human-like -- something people can actually relate to?

TL;DR: Dreams = the brain's memory compression process (hypothesis) → so let's give AI agents something similar.

**Dreamer gives your AI agent the ability to dream.**

Built for [OpenClaw](https://openclaw.ai), but works with any system that produces daily markdown files. -- "Probably."

## How It Works

With Claude's help, I referenced sleep neuroscience papers and modeled Dreamer after them. Every night, it goes through the same three phases your brain does.

### Phase 1: NREM -- "What happened today?"

During NREM sleep, the hippocampus replays the day's events and transfers important patterns to the neocortex. Dreamer does the same:

- Loads episode files generated by OpenClaw (`YYYY-MM-DD.md`)
- Chunks text into semantic units
- Clusters similar chunks via embedding similarity
- LLM distills each cluster into key facts
- Deduplicates against existing memories
- Stores new semantic memories in LanceDB

Raw experience in, compressed knowledge out.
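The chunk-to-cluster step above can be sketched roughly like this. This is a minimal illustration, not Dreamer's actual code: `cluster_chunks` and the toy 2-dim embeddings are made up here, and greedy threshold clustering is just one simple way to group by embedding similarity:

```python
import math

CLUSTER_SIMILARITY = 0.75  # same default as the tunable parameter documented below

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def cluster_chunks(chunks, embeddings, threshold=CLUSTER_SIMILARITY):
    """Greedy single-pass clustering: each chunk joins the first cluster
    whose representative is similar enough, else it starts a new cluster."""
    clusters = []  # list of (representative_embedding, [chunk, ...])
    for chunk, emb in zip(chunks, embeddings):
        for rep, members in clusters:
            if cosine(rep, emb) >= threshold:
                members.append(chunk)
                break
        else:
            clusters.append((emb, [chunk]))
    return [members for _, members in clusters]

# Toy embeddings: two near-parallel vectors and one orthogonal one.
chunks = ["talked about Docker", "discussed docker-compose", "payment API rate limit"]
embs = [[1.0, 0.0], [0.98, 0.2], [0.0, 1.0]]
print(cluster_chunks(chunks, embs))
# → [['talked about Docker', 'discussed docker-compose'], ['payment API rate limit']]
```

In the real pipeline the embeddings come from `text-embedding-3-small` (1536 dims), and each resulting cluster is handed to the LLM for distillation.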

### Phase 2: REM -- "Does this fit with what I already know?"

REM sleep is when the brain integrates new memories with existing ones -- resolving contradictions and strengthening connections. Dreamer's REM phase:

- Detects conflicts between **new** and **existing** memories (expected complexity: O(N*M)) ← I feel like there's room for optimization with some clever module in the middle, but this is about as far as I could get
- Classifies each conflict: `state_change` / `different_aspects` / `unrelated`
- **State changes**: merges into one memory with historical context ("model changed to Claude" + prev: "model was Gemini")
- **Different aspects**: consolidates into a comprehensive memory
- Applies importance decay -- memories not recalled gradually fade
- Soft-deletes memories that fall below the threshold
- Archives processed episodes

No more "I told you I changed that setting last week."
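The decay and pruning steps reduce to a little arithmetic. A sketch, assuming decay compounds daily against days since last recall (the function names and the `days_since_recall` field are made up for illustration):

```python
IMPORTANCE_DECAY_RATE = 0.05   # daily decay rate (tunable, see Configuration)
SOFT_DELETE_THRESHOLD = 0.15   # below this, the memory is soft-deleted

def decayed_importance(importance, days_since_recall):
    # Exponential decay: each day without a recall removes 5% of importance.
    return importance * (1 - IMPORTANCE_DECAY_RATE) ** days_since_recall

def should_soft_delete(importance, days_since_recall):
    return decayed_importance(importance, days_since_recall) < SOFT_DELETE_THRESHOLD

# Under these defaults, a memory at full importance (1.0) survives
# roughly 36 days without being recalled:
print(decayed_importance(1.0, 36))  # ≈ 0.158
print(should_soft_delete(1.0, 37))  # True
```

Recalling a memory resets the clock, so frequently used knowledge never fades while one-off trivia quietly disappears.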

### Phase 3: Dream Log -- "What did I dream about?"

Every cycle produces a markdown report: what was created, what was merged, what was forgotten. A transparent audit trail of your agent's memory maintenance.

## Quick Start

```bash
# 1. Install dependencies
pip install -r requirements.txt

# 2. Set up environment
cp .env.example .env
# Edit .env with your OpenAI API key

# 3. Initialize data directory + LanceDB table
python setup.py --example

# 4. Run
python dreamer.py --verbose
```

The setup script creates the directory structure, initializes the LanceDB `memories` table (1536-dim vectors), and optionally generates an example episode file.
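The directory-creation part of setup can be approximated like this (a sketch only; the real `setup.py` also initializes the LanceDB `memories` table, which is omitted here, and `init_dirs` is a hypothetical name):

```python
import os
from pathlib import Path

def init_dirs(home=None):
    # Mirrors the directory structure documented under Configuration.
    home = Path(home or os.environ.get("DREAMER_HOME", "~/.dreamer")).expanduser()
    for sub in ("episodes", "episodes/archive", "lancedb",
                "dream-log", "memory-archive", "workspace/docs", "workspace/skills"):
        (home / sub).mkdir(parents=True, exist_ok=True)
    return home

# Example: initialize under a throwaway location.
import tempfile
home = init_dirs(tempfile.mkdtemp())
print(sorted(p.name for p in home.iterdir()))
# → ['dream-log', 'episodes', 'lancedb', 'memory-archive', 'workspace']
```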

## OpenClaw Integration

Dreamer is designed to work with [OpenClaw](https://openclaw.ai)'s memory system. Here's how the pieces fit together:

### Prerequisites

1. **OpenClaw Gateway** running with the `memory-lancedb` plugin enabled
2. **LanceDB** as the vector store for semantic memories
3. **Episode files** generated by OpenClaw's `session-memory` hook

### OpenClaw Configuration

In your `openclaw.json`, enable the memory plugin:

```json
{
  "plugins": {
    "slots": {
      "memory": "memory-lancedb"
    },
    "entries": {
      "memory-lancedb": {
        "enabled": true,
        "config": {
          "embedding": {
            "apiKey": "${OPENAI_API_KEY}",
            "model": "text-embedding-3-small"
          },
          "autoCapture": true,
          "autoRecall": true
        }
      }
    }
  },
  "hooks": {
    "internal": {
      "enabled": true,
      "entries": {
        "session-memory": {
          "enabled": true
        }
      }
    }
  }
}
```

This configures:
- **memory-lancedb**: Stores semantic memories as 1536-dim vectors in LanceDB. The gateway reads/writes to the same LanceDB that Dreamer consolidates.
- **session-memory**: The gateway's internal hook that flushes conversation context to episode files (`YYYY-MM-DD.md`) during session compaction.

### Data Flow

```
User <-> OpenClaw Gateway
              │
              ├── autoCapture ──> LanceDB (semantic memories)
              │                      ↑
              ├── session-memory ──> episodes/YYYY-MM-DD.md
              │                      │
              │                   Dreamer (nightly)
              │                      │
              └── autoRecall <──── LanceDB (consolidated)
```

1. **During conversation**: Gateway auto-captures important facts to LanceDB and auto-recalls relevant memories
2. **Session compaction**: When context window fills up, the `session-memory` hook flushes a summary to `episodes/YYYY-MM-DD.md`
3. **Nightly (3 AM)**: Dreamer reads episodes, creates new semantic memories, resolves conflicts with existing ones, and prunes stale memories
4. **Next conversation**: Gateway recalls consolidated memories from LanceDB

### Standalone Usage (Without OpenClaw)

Dreamer works with any system that produces markdown episode files. Just write daily files to the episodes directory:

```
$DREAMER_HOME/episodes/2024-03-15.md
$DREAMER_HOME/episodes/2024-03-16.md
```

And point `DREAMER_HOME` to a directory with a LanceDB store. Run `python setup.py` to initialize the table.
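For example, a minimal standalone producer might just append notes to today's file (hypothetical helper, not part of Dreamer):

```python
import datetime
import os
from pathlib import Path

def append_episode(text, home=None):
    # Append to today's episode file: $DREAMER_HOME/episodes/YYYY-MM-DD.md
    home = Path(home or os.environ.get("DREAMER_HOME", "~/.dreamer")).expanduser()
    episodes = home / "episodes"
    episodes.mkdir(parents=True, exist_ok=True)
    path = episodes / f"{datetime.date.today():%Y-%m-%d}.md"
    with path.open("a", encoding="utf-8") as f:
        f.write(text.rstrip() + "\n\n")
    return path

import tempfile
p = append_episode("## Deployed staging\nSwitched nginx config.", tempfile.mkdtemp())
print(p.name)  # e.g. 2024-03-15.md, depending on today's date
```

As long as the files land in `episodes/` with the `YYYY-MM-DD.md` naming, Dreamer will pick them up on its next run.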

## Configuration

All settings are in `config.py` and can be overridden via environment variables:

| Variable | Default | Description |
|----------|---------|-------------|
| `DREAMER_HOME` | `~/.dreamer` | Root data directory |
| `OPENAI_API_KEY` | (required) | For embeddings |
| `MINIMAX_API_KEY` | (optional) | If using MiniMax LLM |
| `DREAMER_LLM_PROVIDER` | `openai` | `openai` or `minimax` |

### Tunable Parameters

| Parameter | Default | Description |
|-----------|---------|-------------|
| `CLUSTER_SIMILARITY` | 0.75 | Threshold for grouping chunks |
| `DEDUP_SIMILARITY` | 0.90 | Skip if existing memory is this similar |
| `CONTRADICTION_SIMILARITY` | 0.70 | Conflict detection threshold |
| `IMPORTANCE_DECAY_RATE` | 0.05 | Daily decay rate |
| `SOFT_DELETE_THRESHOLD` | 0.15 | Below this, the memory is soft-deleted |
| `MAX_EPISODES_PER_RUN` | 7 | Max days processed per cycle |
| `MAX_NEW_MEMORIES` | 20 | Cap on new memories per cycle |
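The override mechanism follows the usual environment-variable-wins pattern. A sketch of that pattern (`env_float` is illustrative, not necessarily the helper `config.py` actually uses):

```python
import os

def env_float(name, default):
    # Environment variable wins; otherwise fall back to the documented default.
    raw = os.environ.get(name)
    return float(raw) if raw is not None else default

CLUSTER_SIMILARITY = env_float("CLUSTER_SIMILARITY", 0.75)
DEDUP_SIMILARITY = env_float("DEDUP_SIMILARITY", 0.90)
IMPORTANCE_DECAY_RATE = env_float("IMPORTANCE_DECAY_RATE", 0.05)

# Override for one run without touching config.py:
os.environ["DEDUP_SIMILARITY"] = "0.95"
print(env_float("DEDUP_SIMILARITY", 0.90))  # 0.95
```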

### Directory Structure

```
$DREAMER_HOME/
  episodes/           # input: daily markdown files (YYYY-MM-DD.md)
  episodes/archive/   # processed episodes moved here
  lancedb/            # LanceDB vector database (shared with gateway)
  dream-log/          # output: nightly consolidation reports
  memory-archive/     # backup: pre-merge memory snapshots
  workspace/          # optional: reference docs for context linking
    docs/             # auto-generated reference documents
    skills/           # skill definitions (SKILL.md)
```

## Episode File Format

Episodes are markdown files named `YYYY-MM-DD.md`. Content is free-form text representing the AI agent's daily experiences:

```markdown
# Session Notes - 2024-03-15

## User asked about deployment
Discussed Docker setup. User prefers docker-compose over raw Docker commands.
Decided to use nginx as reverse proxy.

## API Integration
Connected to the payment API. Key endpoint: POST /v1/charges
Rate limit: 100 req/min. Auth via Bearer token.
```
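Dreamer's NREM phase splits such files into semantic chunks. One plausible first cut is a heading-based split (illustrative only; the actual chunker may work differently):

```python
import re

def chunk_episode(markdown):
    # Split on level-2 headings; each section becomes one candidate chunk.
    parts = re.split(r"(?m)^## ", markdown)
    return ["## " + p.strip() for p in parts[1:]]

episode = """# Session Notes - 2024-03-15

## User asked about deployment
Discussed Docker setup.

## API Integration
Rate limit: 100 req/min.
"""
chunks = chunk_episode(episode)
print(len(chunks))  # 2
```

Because content is free-form, anything that yields coherent, self-contained chunks works; the clustering step downstream tolerates imperfect boundaries.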

## Running as a Cron Job

```bash
# Example: run daily at 3 AM
0 3 * * * cd /path/to/dreamer && python3 dreamer.py --verbose >> dream-log/cron.log 2>&1
```

Or use the provided systemd timer (see `examples/`).

## Architecture

```
Episode Files (YYYY-MM-DD.md)
        |
        v
   +---------+
   |  NREM   |  Chunk -> Embed -> Cluster -> Summarize -> Store
   +----+----+
        | created_ids
        v
   +---------+
   |   REM   |  Conflict Detection -> Merge/Consolidate -> Decay -> Prune
   +----+----+
        |
        v
   +---------+
   |Dream Log|  Generate report
   +---------+
```

## Requirements

- Python 3.10+
- OpenAI API key (for embeddings)
- LLM API key (OpenAI or MiniMax, etc.)

## License

MIT
