
Documentation Index

Fetch the complete documentation index at: https://mintlify.com/openagen/zeroclaw/llms.txt

Use this file to discover all available pages before exploring further.

ZeroClaw is a fast, small, fully autonomous AI assistant infrastructure runtime written in Rust. It runs as a single binary, uses less than 5MB of RAM at runtime, and starts in under 10ms — making it deployable on anything from a $10 ARM board to a cloud VM. Every core subsystem (AI providers, messaging channels, memory backends, tools, tunnels) is defined as a Rust trait, so you can swap any component with a one-line config change and zero code modifications.
The official repository is github.com/zeroclaw-labs/zeroclaw. Do not trust binaries, announcements, or fundraising from openagen/zeroclaw, zeroclaw.org, or zeroclaw.net — those domains are impersonating the official project.

Why ZeroClaw exists

Most AI agent runtimes are built on interpreted languages with heavy dependency trees. They start slowly, consume hundreds of megabytes of memory, and are difficult to deploy on constrained hardware. ZeroClaw is the opposite: a Rust binary with no mandatory runtime dependencies, a sub-10ms cold start, and a design that makes provider, channel, and tool lock-in impossible by construction. The four principles that guide every design decision:
  • Lean by default — small Rust binary, fast startup, low memory footprint
  • Secure by design — gateway pairing, strict sandboxing, explicit allowlists, workspace scoping
  • Fully swappable — all core systems are traits (providers, channels, tools, memory, tunnels)
  • No lock-in — OpenAI-compatible provider support plus pluggable custom endpoints
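The "fully swappable" principle can be made concrete with a config sketch. The key and value names below are illustrative assumptions, not the documented schema of ~/.zeroclaw/config.toml:

```toml
# Hypothetical config sketch — section and key names are assumptions,
# not ZeroClaw's real schema.
[provider]
kind = "openai"                          # swap to any catalog entry...
# kind = "custom:https://your-api.com"   # ...or an OpenAI-compatible endpoint

[memory]
kind = "sqlite"                          # or "postgres", "markdown", "none"

[tunnel]
kind = "none"                            # or "cloudflare", "tailscale", "ngrok"
```

Under this design, changing one `kind` value swaps the implementation behind the corresponding trait without recompiling.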

Benchmark comparison

The table below compares ZeroClaw against three other runtimes on a macOS arm64 machine, normalized for 0.8 GHz edge hardware (measured February 2026).
|                       | OpenClaw       | NanoBot        | PicoClaw       | ZeroClaw         |
|-----------------------|----------------|----------------|----------------|------------------|
| Language              | TypeScript     | Python         | Go             | Rust             |
| RAM                   | > 1 GB         | > 100 MB       | < 10 MB        | < 5 MB           |
| Startup (0.8 GHz core)| > 500 s        | > 30 s         | < 1 s          | < 10 ms          |
| Binary size           | ~28 MB (dist)  | N/A (scripts)  | ~8 MB          | ~8.8 MB          |
| Minimum hardware cost | Mac Mini $599  | Linux SBC ~$50 | Linux board $10| Any hardware $10 |
OpenClaw requires the Node.js runtime (~390 MB of additional memory overhead). NanoBot requires a Python runtime. PicoClaw and ZeroClaw are static binaries. RAM figures are runtime memory; build-time compilation requires more. You can reproduce these numbers on your own build (`/usr/bin/time -l` is the BSD form used on macOS):

```shell
cargo build --release
ls -lh target/release/zeroclaw
/usr/bin/time -l target/release/zeroclaw --help
/usr/bin/time -l target/release/zeroclaw status
```

Architecture overview

Every ZeroClaw subsystem is a Rust trait. Swapping an implementation means changing one line in ~/.zeroclaw/config.toml — no recompilation, no code changes.
| Subsystem     | Trait          | Ships with | Extend by |
|---------------|----------------|------------|-----------|
| AI models     | Provider       | Provider catalog via zeroclaw providers (built-ins + aliases + custom endpoints) | custom:https://your-api.com (OpenAI-compatible) or anthropic-custom:https://your-api.com |
| Channels      | Channel        | CLI, Telegram, Discord, Slack, Mattermost, iMessage, Matrix, Signal, WhatsApp, Linq, Email, IRC, Lark, DingTalk, QQ, Nostr, Webhook | Any messaging API |
| Memory        | Memory         | SQLite hybrid search, PostgreSQL, Lucid bridge, Markdown files, explicit none backend | Any persistence backend |
| Tools         | Tool           | shell/file/memory, cron/schedule, git, pushover, browser, http_request, screenshot/image_info, composio (opt-in), delegate, hardware tools | Any capability |
| Observability | Observer       | Noop, Log, Multi | Prometheus, OTel |
| Runtime       | RuntimeAdapter | Native, Docker (sandboxed) | Additional runtimes via adapter |
| Security      | SecurityPolicy | Gateway pairing, sandbox, allowlists, rate limits, filesystem scoping, encrypted secrets | |
| Identity      | IdentityConfig | OpenClaw (markdown), AIEOS v1.1 (JSON) | Any identity format |
| Tunnel        | Tunnel         | None, Cloudflare, Tailscale, ngrok, Custom | Any tunnel binary |
| Heartbeat     | Engine         | HEARTBEAT.md periodic tasks | |
| Skills        | Loader         | TOML manifests + SKILL.md instructions | Community skill packs |
| Integrations  | Registry       | 70+ integrations across 9 categories | Plugin system |
Currently supported runtime kinds are native and docker. WASM and edge runtimes are planned but not yet implemented — ZeroClaw exits with a clear error if you configure an unsupported kind.
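The trait-per-subsystem pattern described above can be sketched in ordinary Rust. This is an illustrative reconstruction, not ZeroClaw's actual source; the trait, type, and kind names here are assumptions:

```rust
// Illustrative sketch of trait-per-subsystem dispatch — names are
// assumptions, not ZeroClaw's real API.
trait Provider {
    fn name(&self) -> &'static str;
    fn complete(&self, prompt: &str) -> String;
}

struct OpenAiCompatible;
struct NoopProvider;

impl Provider for OpenAiCompatible {
    fn name(&self) -> &'static str { "openai-compatible" }
    fn complete(&self, prompt: &str) -> String {
        // A real implementation would call an HTTP endpoint here.
        format!("[openai-compatible] {prompt}")
    }
}

impl Provider for NoopProvider {
    fn name(&self) -> &'static str { "noop" }
    fn complete(&self, _prompt: &str) -> String { String::new() }
}

// Config-driven swap: one string from config.toml selects the implementation.
fn provider_from_config(kind: &str) -> Option<Box<dyn Provider>> {
    match kind {
        "openai-compatible" => Some(Box::new(OpenAiCompatible)),
        "noop" => Some(Box::new(NoopProvider)),
        _ => None, // unknown kinds fail loudly, like unsupported runtime kinds
    }
}

fn main() {
    let p = provider_from_config("noop").expect("known provider kind");
    assert_eq!(p.name(), "noop");
    assert!(provider_from_config("wasm").is_none());
    println!("selected provider: {}", p.name());
}
```

Because callers only see `Box<dyn Provider>`, swapping the implementation is a data change, not a code change — the same mechanism generalizes to channels, memory, tools, and tunnels.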

Memory system

ZeroClaw ships a full-stack hybrid search engine with no external dependencies — no Pinecone, no Elasticsearch, no LangChain:
| Layer          | Implementation |
|----------------|----------------|
| Vector DB      | Embeddings stored as BLOB in SQLite, cosine similarity search |
| Keyword search | FTS5 virtual tables with BM25 scoring |
| Hybrid merge   | Custom weighted merge function (vector.rs) |
| Embeddings     | EmbeddingProvider trait — OpenAI, custom URL, or noop |
| Chunking       | Line-based markdown chunker with heading preservation |
| Caching        | SQLite embedding_cache table with LRU eviction |
| Safe reindex   | Rebuild FTS5 + re-embed missing vectors atomically |
The agent automatically recalls, saves, and manages memory via tools — no manual wiring required.
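The hybrid-merge layer can be illustrated with a small weighted-merge function. This is a sketch under stated assumptions — the actual logic in vector.rs may weight, normalize, and tie-break differently:

```rust
use std::collections::HashMap;

// Sketch of a weighted hybrid merge — assumes both retrievers return
// scores pre-normalized to [0, 1]. Not the real vector.rs implementation.
fn hybrid_merge(
    vector_scores: &HashMap<String, f64>,  // cosine-similarity hits
    keyword_scores: &HashMap<String, f64>, // normalized BM25 hits
    vector_weight: f64,
) -> Vec<(String, f64)> {
    let kw_weight = 1.0 - vector_weight;
    let mut merged: HashMap<String, f64> = HashMap::new();
    for (id, s) in vector_scores {
        *merged.entry(id.clone()).or_insert(0.0) += vector_weight * s;
    }
    for (id, s) in keyword_scores {
        *merged.entry(id.clone()).or_insert(0.0) += kw_weight * s;
    }
    // Rank by combined score, highest first.
    let mut ranked: Vec<(String, f64)> = merged.into_iter().collect();
    ranked.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap());
    ranked
}

fn main() {
    let vec_hits = HashMap::from([("a".to_string(), 0.9), ("b".to_string(), 0.2)]);
    let kw_hits = HashMap::from([("b".to_string(), 1.0), ("c".to_string(), 0.5)]);
    // With vector_weight = 0.7: a = 0.63, b = 0.14 + 0.30 = 0.44, c = 0.15
    let ranked = hybrid_merge(&vec_hits, &kw_hits, 0.7);
    assert_eq!(ranked[0].0, "a");
    println!("{ranked:?}");
}
```

Documents found by both retrievers accumulate score from each, which is why hybrid search surfaces results that neither pure vector nor pure keyword search would rank first on its own.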

Where to go next

  • Quickstart: Install ZeroClaw, onboard, and send your first message in under five minutes
  • Installation: Detailed install options for Linux, macOS, Windows, and ARM devices
  • Configuration: Configure providers, channels, memory backends, and autonomy settings
  • CLI reference: Complete reference for every ZeroClaw command and flag
