Quick Start
Architecture
Directory Structure
How It Works
Export creates a new chunk
Each engram sync creates a new chunk — never modifies old ones. The chunk contains:
- Sessions created after the last chunk
- Observations created after the last chunk
- Prompts created after the last chunk
Chunk is compressed and hashed
- Chunk content is serialized to JSON
- Compressed with gzip (typically ~2KB for 8 sessions + 10 observations)
- SHA-256 hash is computed from content
- First 8 characters of hash become the chunk ID
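The hash/ID step above can be sketched in Go. This is a minimal illustration, assuming the SHA-256 is computed over the serialized JSON content; the function name is illustrative, not Engram's actual API:

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

// ChunkID returns the first 8 hex characters of the SHA-256 hash
// of the serialized chunk content (illustrative, not Engram's code).
func ChunkID(content []byte) string {
	sum := sha256.Sum256(content)
	return hex.EncodeToString(sum[:])[:8]
}

func main() {
	content := []byte(`{"sessions":[],"observations":[],"prompts":[]}`)
	fmt.Println(ChunkID(content)) // an 8-character hex ID
}
```

Because the ID is derived purely from content, two identical chunks always get the same ID, which is what makes hash-based deduplication possible.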
Commands
Export Memories
Chunks are written to .engram/chunks/.
Project Detection: By default, engram sync uses the current directory name as the project filter. Only memories from sessions matching that project are exported.
Example:
Export All Projects
Override Project
Import Chunks
- Reads manifest.json
- Gets the list of chunks already imported from the local DB (sync_chunks table)
- For each chunk in the manifest not yet imported:
  - Reads and decompresses the chunk file
  - Imports sessions, observations, and prompts into the local DB
  - Records the chunk ID as imported
- Sessions use INSERT OR IGNORE (skipped if the session ID already exists)
- Observations and prompts are imported as-is (Engram's deduplication logic handles duplicates at save time)
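The INSERT OR IGNORE semantics for sessions can be sketched as follows, modeling the local DB as a map instead of SQLite; the function name is an assumption for illustration:

```go
package main

import "fmt"

// importSessions mimics SQLite's INSERT OR IGNORE: a session is only
// added if its ID is not already present (illustrative; Engram stores
// sessions in a local database, not a map).
func importSessions(db map[string]string, incoming map[string]string) int {
	added := 0
	for id, data := range incoming {
		if _, exists := db[id]; !exists {
			db[id] = data
			added++
		}
	}
	return added
}

func main() {
	db := map[string]string{"s1": "existing"}
	added := importSessions(db, map[string]string{"s1": "dup", "s2": "new"})
	fmt.Println(added) // 1 — "s1" already exists and is left untouched
}
```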
Auto-import: The OpenCode plugin automatically runs engram sync --import when it detects .engram/manifest.json in the project directory. Clone a repo → open OpenCode → team memories are loaded automatically.
Check Status
- How many chunks exist locally (in your DB)
- How many chunks exist remotely (in the manifest)
- How many chunks are pending import
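The three counts above reduce to a simple comparison of the manifest against the local sync_chunks table. A sketch, with struct and function names that are assumptions for illustration:

```go
package main

import "fmt"

// SyncStatus mirrors the three counts the status check reports
// (field names are illustrative, not Engram's actual schema).
type SyncStatus struct {
	Local, Remote, Pending int
}

func status(manifestIDs []string, localIDs map[string]bool) SyncStatus {
	s := SyncStatus{Local: len(localIDs), Remote: len(manifestIDs)}
	for _, id := range manifestIDs {
		if !localIDs[id] {
			s.Pending++
		}
	}
	return s
}

func main() {
	manifest := []string{"a3f8c1d2", "b7d2e4f1", "c9e0f3a4"}
	local := map[string]bool{"a3f8c1d2": true}
	fmt.Println(status(manifest, local)) // {1 3 2}
}
```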
Why Chunks?
Engram uses a chunked architecture instead of a single large JSON file. Here's why:
No Merge Conflicts
Each engram sync creates a new chunk — old chunks are never modified. When multiple developers sync independently:
- Alan creates a3f8c1d2.jsonl.gz
- Juan creates b7d2e4f1.jsonl.gz
- Git just adds both files — no conflicts
Content-Hashed Chunks
Each chunk is identified by the first 8 characters of its SHA-256 content hash. This means:
- Each chunk is imported only once (tracked in the local sync_chunks table)
- If two devs create identical chunks, the hash deduplicates them
- No risk of double-importing the same data
Compressed and Small
Chunks are gzipped JSONL. Typical sizes:
- 8 sessions + 10 observations + 5 prompts = ~2KB compressed
- 50 sessions + 100 observations = ~10KB compressed
Append-Only Manifest
The manifest is the only file git needs to diff/merge. It's small, human-readable, and append-only (new chunks are added, old entries never change).
Workflow Examples
Solo Developer (Multiple Machines)
Team Collaboration
Shared Knowledge Base
Create a separate repo for team-wide memories across multiple projects:
Data Model
Manifest Entry
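The exact entry schema isn't reproduced here; a plausible shape, with field names that are assumptions rather than Engram's actual schema, might look like:

```json
{
  "id": "a3f8c1d2",
  "file": "chunks/a3f8c1d2.jsonl.gz",
  "created_at": "2025-01-15T10:30:00Z",
  "sessions": 8,
  "observations": 10,
  "prompts": 5
}
```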
Chunk Data
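Since chunks are gzipped JSONL, the decompressed content is one record per line. A hypothetical sketch (record shapes and field names are assumptions):

```json
{"type": "session", "id": "ses_001", "project": "myapp", "created_at": "2025-01-15T10:30:00Z"}
{"type": "observation", "session_id": "ses_001", "title": "Chose SQLite for local storage", "created_at": "2025-01-15T10:35:00Z"}
{"type": "prompt", "session_id": "ses_001", "created_at": "2025-01-15T10:40:00Z"}
```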
Implementation Details
Time-Based Filtering
When exporting, Engram filters data based on the timestamp of the last chunk:
- Read the manifest
- Find the most recent chunk's created_at timestamp
- Export only sessions/observations/prompts created after that timestamp
Project Filtering
When --project is specified (or auto-detected from the directory name):
- Filter sessions by the project field
- Include only observations/prompts from those sessions
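These two steps can be sketched as follows — filter sessions first, then keep only records whose session survived. Type and function names are assumptions for illustration:

```go
package main

import "fmt"

// Session and Observation are simplified stand-ins for Engram's
// actual records (illustrative only).
type Session struct {
	ID, Project string
}
type Observation struct {
	SessionID, Title string
}

func filterByProject(sessions []Session, obs []Observation, project string) ([]Session, []Observation) {
	keep := map[string]bool{}
	var outS []Session
	for _, s := range sessions {
		if s.Project == project {
			keep[s.ID] = true
			outS = append(outS, s)
		}
	}
	var outO []Observation
	for _, o := range obs {
		if keep[o.SessionID] {
			outO = append(outO, o)
		}
	}
	return outS, outO
}

func main() {
	s := []Session{{"s1", "myapp"}, {"s2", "other"}}
	o := []Observation{{"s1", "fix"}, {"s2", "unrelated"}}
	fs, fo := filterByProject(s, o, "myapp")
	fmt.Println(len(fs), len(fo)) // 1 1
}
```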
Compression
Chunks are gzipped using Go's compress/gzip:
Deduplication
At export time: Engram checks if a chunk with the same content hash already exists. If yes, the export is skipped.
At import time: Engram checks the sync_chunks table to see if the chunk ID has been imported. If yes, the import is skipped.
At save time: Engram’s normal deduplication logic (normalized hash + project + scope + type + title) prevents duplicate observations even if the same data is imported multiple times.
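The save-time key described above can be sketched as follows. The fields come from the text (normalized hash + project + scope + type + title), but the exact normalization and key format are assumptions:

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"strings"
)

// dedupKey sketches a save-time deduplication key. The normalization
// (trim + lowercase) and the joined-string format are illustrative
// guesses, not Engram's actual implementation.
func dedupKey(content, project, scope, typ, title string) string {
	norm := strings.ToLower(strings.TrimSpace(content))
	sum := sha256.Sum256([]byte(norm))
	return strings.Join([]string{
		hex.EncodeToString(sum[:8]), project, scope, typ, title,
	}, "|")
}

func main() {
	// Whitespace and case differences normalize to the same key,
	// so re-importing the same observation is a no-op.
	a := dedupKey("  Fixed the bug ", "myapp", "project", "decision", "bugfix")
	b := dedupKey("fixed the bug", "myapp", "project", "decision", "bugfix")
	fmt.Println(a == b) // true
}
```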
Related
MCP Tools
All 14 MCP tools for agents
Privacy
Redact sensitive data before syncing
CLI Reference
Full command-line reference
Export/Import
JSON export/import (alternative to git sync)