Operator OS consumes less than 10MB of RAM at runtime — 99% smaller than comparable Node.js or Python-based agent frameworks. It cold-starts in under 1 second, even on single-core 0.6GHz processors.
Core capabilities
Ultra-lightweight engine
Runs in under 10MB of RAM with sub-second cold starts. Deployable on hardware costing as little as $10, including Raspberry Pi Zero and RISC-V boards.
True portability
Ships as a single, statically compiled binary with no external dependencies. Runs on Linux, macOS, Windows, and FreeBSD across x86_64, ARM64, ARMv7, and RISC-V architectures.
Persistent memory
Structural long-term memory carries context seamlessly across sessions and reboots. Agents remember what they’ve done without requiring an external database.
Multi-channel messaging
Natively connects to Slack, Discord, Telegram, WhatsApp, DingTalk, Feishu, LINE, WeCom, and more. One agent, every channel.
Universal LLM support
Zero-code model switching between OpenAI, Anthropic, Google Gemini, DeepSeek, Groq, Ollama, and 10+ other providers using a simple protocol-prefix system.
Built-in tools
Web search, cron scheduling, shell execution, and MCP (Model Context Protocol) server support are included out of the box — no plugins to install.
Architecture overview
Operator OS is built around four composable primitives:
Gateway daemon — The operator gateway process is the long-running service that connects your agent to the outside world. It manages channel connections (Slack, Discord, Telegram, etc.), routes inbound messages to the agent, and streams responses back. Run it on a server or a Raspberry Pi and leave it running indefinitely.
Agents — The reasoning core. Each agent is configured with a model, a set of tools, a workspace, and memory. Agents receive messages, invoke tools, and produce responses. You interact with an agent directly via operator agent -m "..." or through any connected channel.
Tools — Agents act through tools: file read/write, shell execution, web search, cron scheduling, and MCP server calls. The sandbox restricts tool access to the configured workspace directory by default, with an explicit opt-out for trusted environments.
Channels — Channels are the interfaces through which users communicate with the agent. A channel can be a messaging platform (Slack, Telegram), a chat protocol (WhatsApp, DingTalk), or a hardware interface (MaixCam). Multiple channels can be active simultaneously.
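The four primitives above compose into a single agent definition that the gateway loads at startup. As a purely illustrative sketch — the file format and every key name below are assumptions, not the documented schema — such a definition might look like:

```yaml
# Hypothetical agent definition. Keys and structure are illustrative
# assumptions, not Operator OS's actual configuration schema.
agent:
  name: ops-assistant
  model: anthropic/claude-sonnet-4.6      # protocol-prefix model selection
  workspace: /var/lib/operator/workspace  # sandbox root for tool access

tools:          # built-in tools granted to this agent
  - web_search
  - shell
  - cron

channels:       # multiple channels can be active simultaneously
  - type: slack
  - type: telegram
```

Under this sketch, the gateway daemon would keep both channels connected and route messages from either one to the same agent, whose tool access stays confined to the configured workspace directory by default.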
Supported AI providers
Operator uses a protocol-prefix system in model_list — no code changes required to switch models.
| Provider | Protocol prefix | Example model |
|---|---|---|
| Anthropic | anthropic/ | anthropic/claude-sonnet-4.6 |
| OpenAI | openai/ | openai/gpt-5.2 |
| Google Gemini | gemini/ | gemini/gemini-3.1-pro |
| DeepSeek | deepseek/ | deepseek/deepseek-chat |
| Groq | groq/ | groq/llama3-8b-8192 |
| Ollama (local) | ollama/ | ollama/llama3 |
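To make the protocol-prefix idea concrete — noting that the exact structure and location of model_list are assumptions, not documented here — switching providers could be a one-line config edit:

```yaml
# Hypothetical model_list entries; the schema is an illustrative assumption.
model_list:
  default: anthropic/claude-sonnet-4.6
  # Swapping to a local model would be a one-line change, no code required:
  # default: ollama/llama3
```

The provider is inferred from the prefix before the slash, so no per-provider client code or SDK wiring is needed.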
Next steps
Quick Start
Get a running agent in 5 minutes. Download the binary, run operator onboard, and send your first message.
Installation
Full installation guide covering precompiled binaries, building from source, and Docker deployment.