EchoVault is designed with privacy as a core principle. This page explains how your data is stored, what network calls are made, and how to ensure complete local operation.

Core Privacy Principles

Local Storage

All memories are stored on your machine by default. No cloud database, no remote storage.

You Own Your Data

Memories are stored as human-readable Markdown files. No proprietary formats, no lock-in.

Secret Redaction

3-layer redaction pipeline prevents API keys, passwords, and credentials from being stored.

Optional Cloud

Cloud embeddings are opt-in. Use Ollama for fully local operation with zero network calls.

Data Storage Location

By default, EchoVault stores all data in ~/.memory/ on your local machine.

Directory Structure

~/.memory/
├── vault/                  # Markdown files (human-readable)
│   └── my-project/
│       └── 2026-03-03-session.md
├── index.db                # SQLite database (FTS5 + vectors)
├── config.yaml             # Configuration (embedding provider)
└── .memoryignore           # Custom redaction patterns

What’s Stored Where

Location: ~/.memory/vault/<project>/<date>-session.md
Contents:
  • Memory title, category, tags
  • What/why/impact descriptions
  • Full decision details
  • Timestamps and metadata
Format: Standard Markdown with YAML frontmatter
Privacy: Stored locally, never transmitted unless you sync them yourself
Obsidian-compatible: Yes
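As a rough sketch of what such a file looks like, the snippet below renders a memory as Markdown with YAML frontmatter. The exact field names and layout are illustrative, not EchoVault's actual on-disk format:

```python
from datetime import datetime, timezone

def render_memory_markdown(title, category, tags, what, why, impact):
    """Render a memory as Markdown with YAML frontmatter.

    Field names and section layout are illustrative -- EchoVault's
    real format may differ in detail.
    """
    timestamp = datetime.now(timezone.utc).isoformat()
    frontmatter = "\n".join([
        "---",
        f"title: {title}",
        f"category: {category}",
        f"tags: [{', '.join(tags)}]",
        f"created: {timestamp}",
        "---",
    ])
    body = f"\n## What\n{what}\n\n## Why\n{why}\n\n## Impact\n{impact}\n"
    return frontmatter + body

print(render_memory_markdown(
    "Switch to SQLite FTS5", "decision", ["search", "db"],
    "Replaced LIKE queries with FTS5.",
    "Full-text search was too slow.",
    "Search latency dropped significantly.",
))
```

Because the format is plain Markdown, any editor (including Obsidian) can open the files directly.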
Location: ~/.memory/index.db
Contents:
  • FTS5 full-text search index
  • Vector embeddings (if configured)
  • Memory metadata (IDs, timestamps, projects)
  • Pointers to markdown files
Privacy: Stored locally, contains processed/indexed versions of your memories
Can be deleted: Yes, you can rebuild it from markdown files with memory reindex
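To see the mechanism the index relies on, here is a minimal SQLite FTS5 example. The table and column names are illustrative, not EchoVault's actual schema:

```python
import sqlite3

# Minimal sketch of the FTS5 indexing that index.db relies on.
# Table and column names are illustrative, not EchoVault's schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE memories USING fts5(title, body)")
conn.executemany(
    "INSERT INTO memories (title, body) VALUES (?, ?)",
    [
        ("Switch to FTS5", "Replaced LIKE queries with full-text search"),
        ("Add redaction", "Secrets are stripped before storage"),
    ],
)
# MATCH runs a full-text query against all indexed columns.
rows = conn.execute(
    "SELECT title FROM memories WHERE memories MATCH ?", ("redaction",)
).fetchall()
print(rows)  # [('Add redaction',)]
```

Because everything in the index is derived from the markdown files, deleting and rebuilding it loses nothing.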
Location: ~/.memory/config.yaml
Contents:
  • Embedding provider settings
  • API keys (for OpenAI, if configured)
  • Context retrieval preferences
Privacy: Stored locally. When displayed via memory config, API keys are redacted.
Example:
embedding:
  provider: ollama
  model: nomic-embed-text
  base_url: http://localhost:11434

context:
  semantic: auto
  topup_recent: true

Network Calls and Cloud Services

EchoVault can operate completely offline or with optional cloud embeddings.

Fully Local Operation (Default)

If you use Ollama for embeddings (or no embeddings at all), EchoVault makes zero network calls.
embedding:
  provider: ollama              # Local
  model: nomic-embed-text       # Runs on your machine
  base_url: http://localhost:11434
What happens:
  • Memories are saved to local disk
  • Embeddings are generated by Ollama locally
  • Search queries run against local SQLite database
  • No data leaves your machine
To verify Ollama is running locally, check http://localhost:11434 in your browser. You should see “Ollama is running”.

Optional Cloud Embeddings (OpenAI)

If you configure OpenAI for embeddings, EchoVault will make API calls to OpenAI’s servers.
embedding:
  provider: openai
  model: text-embedding-3-small
  api_key: sk-...
  base_url: https://api.openai.com/v1
What gets sent to OpenAI:
  • Text to embed: <title> <what> <why> <impact> <tags>
  • This is sent when saving memories or running memory reindex
What does NOT get sent:
  • Full memory details (only the summary fields)
  • Markdown files
  • Database contents
  • Your configuration
OpenAI’s data usage policy:
  • API data is not used to train models (as of their current policy)
  • Data is retained for 30 days for abuse monitoring
  • Review OpenAI’s privacy policy for details
If you work with sensitive codebases or proprietary information, use Ollama instead of OpenAI to ensure no data leaves your machine.
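To make concrete how small the transmitted payload is, here is a sketch of assembling the documented text-to-embed. The helper name is hypothetical and mirrors the field list above, not EchoVault's actual source:

```python
def embedding_text(title, what, why, impact, tags):
    """Assemble the only text sent to the embedding provider:
    title, what, why, impact, and tags.
    Hypothetical helper -- mirrors the documented payload,
    not EchoVault's actual implementation."""
    return " ".join([title, what, why, impact, " ".join(tags)])

payload = embedding_text(
    "Switch to FTS5",
    "Replaced LIKE queries",
    "Search was slow",
    "Latency dropped",
    ["search", "db"],
)
# Full memory details, markdown files, and config never enter this string.
print(payload)
```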

On-Premises Embeddings (Custom)

You can use your own embedding service via OpenAI-compatible APIs:
embedding:
  provider: openai
  model: your-model
  base_url: http://vllm.internal:8000/v1
  api_key: optional
Privacy: Your data stays on your network. No third-party services involved.

Secret Redaction System

EchoVault includes a 3-layer redaction pipeline to prevent secrets from being stored.

Layer 1: Explicit Redaction Tags

You can manually mark sensitive content:
memory save --title "API Setup" \
  --what "Configured API key <redacted>sk_live_abc123</redacted>"
Result: Configured API key [REDACTED]
The content between <redacted> tags is replaced before storage.
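Conceptually, Layer 1 is a simple tag substitution applied before anything is written to disk. This is an illustrative sketch, not EchoVault's actual code:

```python
import re

# Sketch of Layer 1: replace anything inside <redacted>...</redacted>
# before the memory is written to disk. Illustrative only.
REDACTED_TAG = re.compile(r"<redacted>.*?</redacted>", re.DOTALL)

def strip_redacted(text: str) -> str:
    return REDACTED_TAG.sub("[REDACTED]", text)

print(strip_redacted("Configured API key <redacted>sk_live_abc123</redacted>"))
# Configured API key [REDACTED]
```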

Layer 2: Automatic Pattern Detection

EchoVault automatically detects and redacts known secret formats:
| Pattern | Example | Redacted |
| --- | --- | --- |
| Stripe keys | sk_live_abc123 | [REDACTED] |
| GitHub tokens | ghp_abc123 | [REDACTED] |
| AWS access keys | AKIAIOSFODNN7EXAMPLE | [REDACTED] |
| Slack tokens | xoxb-abc-123 | [REDACTED] |
| JWT tokens | eyJhbGc... | [REDACTED] |
| Private keys | -----BEGIN PRIVATE KEY----- | [REDACTED] |
| Password fields | password: secret123 | [REDACTED] |
| API key fields | api_key: abc123 | [REDACTED] |
Implemented in: src/memory/redaction.py
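Layer 2 amounts to running a list of compiled regexes over the text. The patterns below are an illustrative subset; the shipped list in src/memory/redaction.py is more thorough:

```python
import re

# Illustrative subset of Layer 2 patterns -- not the complete list
# shipped in src/memory/redaction.py.
SECRET_PATTERNS = [
    re.compile(r"sk_live_[A-Za-z0-9]+"),   # Stripe keys
    re.compile(r"ghp_[A-Za-z0-9]+"),       # GitHub tokens
    re.compile(r"password:\s*\S+"),        # password fields (whole field)
]

def auto_redact(text: str) -> str:
    for pattern in SECRET_PATTERNS:
        text = pattern.sub("[REDACTED]", text)
    return text

print(auto_redact("key sk_live_abc123 and password: hunter2"))
# key [REDACTED] and [REDACTED]
```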

Layer 3: Custom Patterns (.memoryignore)

Add project-specific patterns to ~/.memory/.memoryignore:
# Social Security Numbers
\d{3}-\d{2}-\d{4}

# Internal IP addresses
10\.0\.\d+\.\d+

# Company-specific secret prefix
COMPANY_SECRET_[A-Z0-9]+
Each line is a regex pattern. Lines starting with # are comments.
Redaction happens before storage. Once a memory is saved, redacted content is permanently replaced with [REDACTED]. The original secret is never written to disk.
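The .memoryignore semantics described above (one regex per line, # starts a comment) can be sketched as a small parser. This is an assumption-labeled illustration; the real loader is load_memoryignore in src/memory/redaction.py:

```python
import re

def parse_memoryignore(text: str):
    """Parse .memoryignore content into compiled regexes.

    Sketch of the documented semantics (one regex per line, '#'
    starts a comment, blank lines ignored); the real loader lives
    in src/memory/redaction.py.
    """
    patterns = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        patterns.append(re.compile(line))
    return patterns

rules = parse_memoryignore(r"""
# Social Security Numbers
\d{3}-\d{2}-\d{4}
COMPANY_SECRET_[A-Z0-9]+
""")
print(len(rules))  # 2
```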

Testing Redaction

You can test what would be redacted:
from memory.redaction import redact, load_memoryignore

patterns = load_memoryignore("/home/user/.memory/.memoryignore")
text = "API key: sk_live_abc123, password: secret123"
print(redact(text, patterns))
# Output: API key: [REDACTED], [REDACTED]

What Agents Can Access

EchoVault exposes three MCP tools to agents:
Purpose: Load recent memories at session start
What the agent can access:
  • Memory IDs, titles, categories, tags
  • Project names and creation timestamps
  • Summary fields (what/why/impact)
What the agent cannot access:
  • Full memory details (unless they call memory_search with the ID)
  • Configuration file
  • Database internals
Purpose: Persist decisions and learnings
What the agent can do:
  • Save new memories to the current project
  • Provide title, what, why, impact, tags, category, details
What the agent cannot do:
  • Save to arbitrary file paths
  • Bypass redaction
  • Delete existing memories (unless you’ve granted file system access)
Agents have access to all memories they can search. If you have sensitive memories in your vault, be mindful of what you share with agents or consider using separate memory homes for different projects.

Syncing Across Machines

EchoVault doesn’t include built-in sync. To share memories across machines:

Option 1: Cloud Storage

Point MEMORY_HOME to a synced folder:
# On all machines
memory config set-home ~/Dropbox/memory
Privacy implications:
  • Your memories are now stored in Dropbox/iCloud/Google Drive
  • They’re encrypted in transit but readable by the cloud provider
  • Consider this if you have sensitive codebases

Option 2: Git Repository

Version control your memory vault:
cd ~/.memory
git init
git add vault/
git commit -m "Initial memories"
git remote add origin [email protected]:you/private-memory.git
git push
Privacy implications:
  • Memories are stored in Git history (even if deleted later)
  • Use a private repository
  • Consider encrypting the repo with git-crypt or similar

Option 3: Manual Sync

Use rsync or scp:
rsync -av ~/.memory/ user@remote:~/.memory/
Privacy implications:
  • Full control over when and where data is synced
  • No third-party involvement
If you sync across machines, avoid running agents on more than one machine at the same time. SQLite doesn't handle concurrent writers well, and sync conflicts on index.db can leave the index inconsistent (you can always rebuild it with memory reindex).
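For concurrent processes on a single machine, a standard SQLite mitigation is WAL mode plus a busy timeout, so a writer waits for a lock instead of failing immediately. This is a general SQLite technique, not an EchoVault setting, and it does not help with conflicting copies produced by file sync:

```python
import os
import sqlite3
import tempfile

# General SQLite technique (not an EchoVault setting): WAL mode lets
# readers proceed while one process writes, and busy_timeout makes a
# second writer wait for the lock instead of erroring immediately.
db_path = os.path.join(tempfile.mkdtemp(), "index.db")
conn = sqlite3.connect(db_path)
conn.execute("PRAGMA journal_mode=WAL")    # readers don't block the writer
conn.execute("PRAGMA busy_timeout=5000")   # wait up to 5s for a lock
mode = conn.execute("PRAGMA journal_mode").fetchone()[0]
print(mode)  # wal
```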

Compliance and Data Residency

GDPR (European Union)

EchoVault is compliant-friendly:
  • Data minimization: Only stores what you explicitly save
  • Right to erasure: Delete memories with memory delete <id> or rm -rf ~/.memory
  • Data portability: Memories are stored as Markdown (open format)
  • Local processing: No third-party data processors (unless you use OpenAI)

CCPA (California)

Similar to GDPR:
  • You control all data
  • No sale of personal information (EchoVault doesn’t collect any)
  • Delete your data anytime

SOC 2, ISO 27001, etc.

If you work at a company with compliance requirements:
  • Use Ollama for fully local operation
  • Store MEMORY_HOME on encrypted volumes
  • Exclude ~/.memory/ from backup systems if required
  • Use .memoryignore to enforce redaction policies

Security Best Practices

Use Ollama

For fully local operation with zero network calls. Ideal for sensitive work.

Encrypt Your Disk

Use FileVault (macOS), BitLocker (Windows), or LUKS (Linux) to encrypt ~/.memory/.

Review .memoryignore

Add patterns for project-specific secrets (internal IPs, custom tokens, etc.).

Don't Commit Memories

Never commit ~/.memory/ into your project repositories; add it to .gitignore. If you version the vault itself (Option 2 above), use a private repository only.

File Permissions

EchoVault creates files with default permissions. You can restrict access:
chmod 700 ~/.memory
chmod 600 ~/.memory/config.yaml
chmod 600 ~/.memory/index.db
This ensures only your user can read the memory vault.
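The same check can be scripted. This is a generic POSIX permission audit (illustrative; point it at your real MEMORY_HOME), flagging any vault file readable by group or others:

```python
import os
import stat
import tempfile

# Generic POSIX permission audit -- illustrative, not an EchoVault
# command. Point it at your real MEMORY_HOME in practice.
def world_readable(path: str) -> bool:
    """True if group or others can read the file."""
    mode = os.stat(path).st_mode
    return bool(mode & (stat.S_IRGRP | stat.S_IROTH))

tmp = tempfile.NamedTemporaryFile(delete=False)
os.chmod(tmp.name, 0o600)       # owner read/write only
print(world_readable(tmp.name))  # False
```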

API Key Storage

If you use OpenAI, your API key is stored in ~/.memory/config.yaml. Protection:
  • Restrict the file to your user (chmod 600 ~/.memory/config.yaml); default permissions may allow group or world reads depending on your umask
  • API keys are redacted in memory config output
  • Keys are never logged or transmitted anywhere except to the configured API
Best practice:
  • Use environment variables for extra security:
    export OPENAI_API_KEY=sk-...
    
    Then reference it in config.yaml:
    embedding:
      provider: openai
      api_key: ${OPENAI_API_KEY}
    
    (Note: ${VAR} expansion is not currently supported by EchoVault; it is shown here as a common pattern.)
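Since EchoVault does not currently expand ${VAR} references itself, here is a minimal sketch of what such expansion looks like, applied to a config value after loading. The regex and helper are illustrative:

```python
import os
import re

# Sketch of ${VAR} expansion -- EchoVault does not currently do this;
# you would apply it to config values yourself after loading the YAML.
VAR = re.compile(r"\$\{([A-Z_][A-Z0-9_]*)\}")

def expand_env(value: str) -> str:
    """Replace ${VAR} with the environment value (empty if unset)."""
    return VAR.sub(lambda m: os.environ.get(m.group(1), ""), value)

os.environ["OPENAI_API_KEY"] = "sk-test"
print(expand_env("${OPENAI_API_KEY}"))  # sk-test
```

The benefit is that config.yaml then never contains the key itself, only a reference to it.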

Deleting All Data

To completely remove EchoVault and all memories:
# Uninstall from agents
memory uninstall claude-code
memory uninstall cursor
memory uninstall opencode
memory uninstall codex

# Uninstall CLI
pip uninstall echovault

# Delete all memories
rm -rf ~/.memory/
This removes:
  • All markdown files
  • The SQLite database
  • Configuration file
  • MCP server configs
This is irreversible. Back up ~/.memory/vault/ if you want to preserve your memories.

Questions About Privacy?

If you have specific privacy concerns:
  1. Review the source code (MIT licensed, fully transparent)
  2. Check src/memory/redaction.py for redaction implementation
  3. Check src/memory/core.py for data storage logic
  4. Open an issue on GitHub for clarification
EchoVault is open source. You can audit exactly what data is stored and where it goes.
