

This guide walks you through installing IronClaw, running the setup wizard, and having your first conversation.

Before You Begin

Make sure you have:

IronClaw Installed

Follow the Installation Guide to install IronClaw on your system.

Database Ready

PostgreSQL 15+ with pgvector, or libSQL (embedded, no setup needed).
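If you plan to use PostgreSQL, a quick sanity check confirms pgvector is available before you start the wizard. This is a sketch: it assumes psql is on your PATH and a database named ironclaw, matching the query in the Troubleshooting section below.

```shell
# Sketch: verify PostgreSQL is reachable and the pgvector extension is available.
# Assumes psql is installed and your database is named "ironclaw".
pgvector_check=skipped
if command -v psql >/dev/null 2>&1; then
  psql -c "SELECT * FROM pg_available_extensions WHERE name = 'vector';" ironclaw \
    && pgvector_check=ok || pgvector_check=failed
fi
echo "pgvector check: $pgvector_check"
```

If the check reports failed, make sure PostgreSQL is running and pgvector is installed; if psql is missing entirely, the embedded libSQL backend needs no setup at all.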

Setup Wizard

IronClaw includes an interactive setup wizard that guides you through configuration.

1. Run the onboarding wizard

ironclaw onboard
The wizard saves progress after each step. If you exit early, re-running ironclaw onboard will resume where you left off.

2. Configure database

Choose your database backend. The embedded libSQL backend is recommended for local development: zero dependencies, no server required. The wizard will prompt:

Database file path (default: ~/.ironclaw/ironclaw.db):

Press Enter to use the default, or specify a custom path. The wizard then asks:

Enable Turso cloud sync (remote replica)? [y/N]:
  • No (default) — Local file only
  • Yes — Enter your Turso URL and auth token for cloud sync
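If you give the wizard a custom database file path, make sure its parent directory exists first (the Troubleshooting section below notes this for the default location too):

```shell
# Create the directory that will hold the libSQL database file.
mkdir -p ~/.ironclaw
ls -d ~/.ironclaw
```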

3. Configure security

IronClaw encrypts secrets (API keys, tokens) with a master key. The wizard will prompt:
The secrets master key encrypts sensitive data like API tokens.
Choose where to store it:

1. OS Keychain (recommended for local installs)
2. Environment variable (for CI/Docker)
3. Skip (disable secrets features)

4. Select inference provider

Choose your LLM provider:
Select your inference provider:

1. NEAR AI          - multi-model access via NEAR account
2. Anthropic        - Claude models (direct API key)
3. OpenAI           - GPT models (direct API key)
4. Ollama           - local models, no API key needed
5. OpenRouter       - 200+ models via single API key
6. OpenAI-compatible - custom endpoint (vLLM, LiteLLM, etc.)
Select 1 for NEAR AI. The wizard will open your browser to authenticate via GitHub or Google OAuth. After authentication, your session token is saved automatically.
NEAR AI provides access to multiple models (GLM, Claude, GPT) with a single account.

5. Select model

The wizard will fetch available models from your provider and display a list:
Available models:

1. GLM Latest (default, fast)
2. Claude Sonnet 4 (best quality)
3. GPT-5.3 Codex (flagship)
4. Custom model ID
Select your preferred model, or choose Custom model ID to enter a specific model identifier.
You can change the model later by re-running ironclaw onboard or editing ~/.ironclaw/.env.
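As a sketch, the relevant line in ~/.ironclaw/.env might look like this. NEARAI_MODEL is the variable used in the docker-compose example later in this guide; the model ID shown is illustrative.

```shell
# ~/.ironclaw/.env (excerpt) - change the active model here, then restart IronClaw.
NEARAI_MODEL=zai-org/GLM-latest
```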

6. Configure embeddings (optional)

Embeddings enable semantic search across your workspace memory.
Enable semantic search? [Y/n]:
  • Yes (default) — The wizard will configure embeddings using your LLM provider or OpenAI
  • No — Workspace will use keyword search only
Semantic search lets you find notes and context by meaning, not just keywords. Highly recommended.

7. Configure channels (optional)

IronClaw supports multiple communication channels:
Select channels to enable (space to toggle, enter to confirm):

[ ] HTTP Webhook Server
[ ] Telegram Bot
[ ] Signal Messaging
[ ] Cloudflare Tunnel (ngrok alternative)
HTTP Webhook Server: exposes a REST API for triggering tasks via HTTP. The wizard will prompt for:
  • Host (default: 0.0.0.0)
  • Port (default: 8080)
  • Webhook secret (for request authentication)
Telegram Bot: interact with IronClaw via Telegram DMs. The wizard will prompt for your bot's details. After setup, send /start to your bot in Telegram to pair.
Signal Messaging: interact with IronClaw via Signal. Requires a running signal-cli daemon; see Signal Setup for details.
Cloudflare Tunnel: expose your local IronClaw instance to the internet. Requires cloudflared to be installed.
You can skip this step and configure channels later with ironclaw onboard --channels-only.
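When the HTTP webhook setup prompts for a secret, any high-entropy string works. One convenient way to generate one, assuming openssl is installed (the WEBHOOK_SECRET variable name here is just a local convenience, not something IronClaw reads):

```shell
# Generate a 64-character hex string to paste into the webhook-secret prompt.
WEBHOOK_SECRET=$(openssl rand -hex 32)
echo "$WEBHOOK_SECRET"
```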

8. Install extensions (optional)

The wizard can install pre-built WASM tools from the registry:
Install extensions from registry? [y/N]:
Select Yes to browse and install tools like GitHub, Gmail, Google Calendar, etc.
You can install extensions later via the CLI or web gateway.

9. Configure Docker sandbox (optional)

The Docker sandbox runs code in isolated containers.
Enable Docker sandbox for code execution? [y/N]:
  • Yes — Requires Docker to be installed and running
  • No (default) — Code execution disabled
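Before answering Yes, it's worth confirming Docker is installed and the daemon is actually running; a quick check:

```shell
# Confirm Docker is installed and the daemon responds before enabling the sandbox.
if command -v docker >/dev/null 2>&1 && docker info >/dev/null 2>&1; then
  docker_ready=yes
else
  docker_ready=no
fi
echo "Docker ready: $docker_ready"
```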

10. Configure heartbeat (optional)

The heartbeat system runs background tasks on a schedule (e.g., monitoring, maintenance).
Enable background tasks (heartbeat)? [y/N]:
  • Yes — The wizard will prompt for interval (default: 30 minutes)
  • No (default) — No background tasks

Your First Conversation

Start the interactive REPL:
ironclaw
You’ll see the IronClaw prompt:
╭─ IronClaw v0.13.1
╰─ Type 'help' for commands, 'exit' to quit

>

Try these commands:

1. Ask a question

> What can you do?
IronClaw will explain its capabilities, including tool use, memory management, and channel support.

2. Use a tool

> Create a note in my workspace called "ideas.md" with the content "Build a Rust CLI tool"
IronClaw will use the workspace_write tool to create the file.

3. Search your workspace

> Search my workspace for notes about Rust
If you enabled embeddings, IronClaw will perform a semantic search across your notes.

4. Exit the REPL

> exit
Or press Ctrl+D.

Running as a Service

To keep IronClaw running in the background:
Create a docker-compose.yml:
version: '3.8'
services:
  ironclaw:
    image: ghcr.io/nearai/ironclaw:latest
    environment:
      - DATABASE_URL=postgres://postgres:postgres@db:5432/ironclaw
      - NEARAI_MODEL=zai-org/GLM-latest
    volumes:
      - ~/.ironclaw:/root/.ironclaw
    depends_on:
      - db

  db:
    image: pgvector/pgvector:pg15
    environment:
      - POSTGRES_PASSWORD=postgres
      - POSTGRES_DB=ironclaw
    volumes:
      - postgres-data:/var/lib/postgresql/data

volumes:
  postgres-data:
Start the services:
docker-compose up -d

Next Steps

Configuration

Learn how to configure LLM providers, channels, and secrets

Tools & Extensions

Explore built-in tools and install extensions

Channels

Set up Telegram, HTTP webhooks, and web gateway

CLI Reference

Explore all available commands

Troubleshooting

Problem: Failed to connect to database
Solution:
  • Ensure PostgreSQL is running: brew services list (macOS) or systemctl status postgresql (Linux)
  • Verify pgvector is installed: psql -c "SELECT * FROM pg_available_extensions WHERE name = 'vector';" ironclaw
  • For libSQL, ensure parent directory exists: mkdir -p ~/.ironclaw
Problem: Browser doesn’t open or authentication times out
Solution:
  • Check your internet connection
  • Manually visit the auth URL printed by the wizard
  • Try API key mode instead: Set NEARAI_API_KEY in your environment
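For the API-key fallback, a minimal sketch (NEARAI_API_KEY is the variable named above; the value shown is a placeholder):

```shell
# Set the NEAR AI API key for API-key mode instead of browser OAuth.
export NEARAI_API_KEY="your-api-key"   # placeholder - substitute your real key
echo "NEARAI_API_KEY is set"
# Then re-run the wizard with the key in the environment:
# ironclaw onboard
```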
Problem: No models found. Pull one first: ollama pull llama3
Solution:
# Pull a model
ollama pull llama3.2

# Verify it's available
ollama list

# Re-run wizard
ironclaw onboard
Problem: ironclaw command hangs or crashes
Solution:
  • Check logs: RUST_LOG=ironclaw=debug ironclaw
  • Verify database connection: psql $DATABASE_URL
  • Re-run setup: ironclaw onboard
