

Overview

The workspace provides persistent memory for agents with a flexible filesystem-like structure. Agents can create arbitrary markdown file hierarchies that get indexed for full-text and semantic search. Inspired by OpenClaw, the workspace gives agents long-term memory across sessions.

Filesystem-like Structure

workspace/
├── README.md              <- Root runbook/index
├── MEMORY.md              <- Long-term curated memory
├── HEARTBEAT.md           <- Periodic checklist
├── IDENTITY.md            <- Agent name, vibe, personality
├── SOUL.md                <- Core values and principles
├── AGENTS.md              <- Behavior instructions
├── USER.md                <- User context and preferences
├── context/               <- Identity and context
│   ├── vision.md
│   └── priorities.md
├── daily/                 <- Daily logs
│   ├── 2024-01-15.md
│   └── 2024-01-16.md
└── projects/              <- Arbitrary structure
    └── alpha/
        ├── README.md
        └── notes.md
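
Nothing enforces this layout; an agent builds it with ordinary write calls, and parent directories are created implicitly. A sketch, assuming an already-constructed `workspace` handle inside an async context:

```rust
// Hypothetical: `workspace` is an already-constructed Workspace handle.
workspace.write("README.md", "# Workspace\n\nIndex of everything below.").await?;
workspace.write("context/vision.md", "# Vision\n\nWhat we are building.").await?;
workspace.write("projects/alpha/README.md", "# Project Alpha").await?;

// daily/ entries are keyed by date and are usually created via the
// convenience helpers rather than by hand:
workspace.append_daily_log("Bootstrapped the workspace layout.").await?;
```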

Core Types

Workspace

Database-backed memory storage scoped to a user and optionally an agent.
user_id (String) - User identifier from the channel
agent_id (Option<Uuid>) - Optional agent ID for multi-agent isolation
embeddings (Option<Arc<dyn EmbeddingProvider>>) - Embedding provider for semantic search

Constructors

new
fn(user_id: impl Into<String>, pool: Pool) -> Self
Create a new workspace backed by PostgreSQL (requires the postgres feature)
new_with_db
fn(user_id: impl Into<String>, db: Arc<dyn Database>) -> Self
Create a workspace with any Database implementation (libSQL, etc.)
with_agent
fn(self, agent_id: Uuid) -> Self
Set a specific agent ID for multi-agent isolation
with_embeddings
fn(self, provider: Arc<dyn EmbeddingProvider>) -> Self
Set the embedding provider for semantic search

Example

use std::sync::Arc;

use ironclaw::workspace::{OpenAiEmbeddings, Workspace};

// Create a workspace with embeddings for semantic search
let embeddings = Arc::new(OpenAiEmbeddings::new(api_key));
let workspace = Workspace::new_with_db("user_123", db)
    .with_embeddings(embeddings);
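
When several agents share one database, with_agent keeps each agent's files isolated. A sketch, assuming a hypothetical Postgres pool and agent_id are already in hand (requires the postgres feature):

```rust
use std::sync::Arc;
use uuid::Uuid;

use ironclaw::workspace::Workspace;

// Hypothetical values: `pool` and `agent_id` come from application setup.
let workspace = Workspace::new("user_123", pool)
    .with_agent(agent_id);
```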

File Operations

Workspace provides a simple filesystem-like API for document management.

read

read
async fn(&self, path: &str) -> Result<MemoryDocument>
Read a file by path. Returns an error if the file doesn’t exist.
let doc = workspace.read("context/vision.md").await?;
println!("{}", doc.content);

write

write
async fn(&self, path: &str, content: &str) -> Result<MemoryDocument>
Create or update a file. Creates parent directories implicitly. Re-indexes for search.
workspace.write(
    "projects/alpha/README.md",
    "# Project Alpha\n\nDescription here."
).await?;

append

append
async fn(&self, path: &str, content: &str) -> Result<()>
Append content to a file. Creates the file if it doesn’t exist. Adds newline separator.
workspace.append("MEMORY.md", "Learned about Rust async patterns today.").await?;

exists

exists
async fn(&self, path: &str) -> Result<bool>
Check if a file exists at the given path.
if !workspace.exists("TODO.md").await? {
    workspace.write("TODO.md", "# Tasks\n").await?;
}

delete

delete
async fn(&self, path: &str) -> Result<()>
Delete a file and its associated search chunks.
workspace.delete("outdated/notes.md").await?;

list

list
async fn(&self, directory: &str) -> Result<Vec<WorkspaceEntry>>
List files and directories at a path. Returns immediate children (not recursive). Use an empty string or "/" for the root.
let entries = workspace.list("projects/").await?;
for entry in entries {
    if entry.is_directory {
        println!("📁 {}/", entry.name());
    } else {
        println!("📄 {}", entry.name());
    }
}

list_all

list_all
async fn(&self) -> Result<Vec<String>>
List all files recursively as a flat list of paths.
let all_files = workspace.list_all().await?;
println!("Total files: {}", all_files.len());

Convenience Methods

memory

memory
async fn(&self) -> Result<MemoryDocument>
Get the main MEMORY.md document (long-term curated memory). Creates it if it doesn’t exist.
let memory = workspace.memory().await?;
println!("Current memory: {}", memory.content);

today_log

today_log
async fn(&self) -> Result<MemoryDocument>
Get today’s daily log (append-only, keyed by date).
let today = workspace.today_log().await?;

daily_log

daily_log
async fn(&self, date: NaiveDate) -> Result<MemoryDocument>
Get a daily log for a specific date.
use chrono::NaiveDate;

let date = NaiveDate::from_ymd_opt(2024, 1, 15).unwrap();
let log = workspace.daily_log(date).await?;

append_memory

append_memory
async fn(&self, entry: &str) -> Result<()>
Append an entry to MEMORY.md with double newline separation. For important facts and decisions.
workspace.append_memory("User prefers tabs over spaces.").await?;

append_daily_log

append_daily_log
async fn(&self, entry: &str) -> Result<()>
Append a timestamped entry to today’s daily log.
workspace.append_daily_log("Completed initial project setup.").await?;
// Stored as: [14:35:22] Completed initial project setup.

heartbeat_checklist

heartbeat_checklist
async fn(&self) -> Result<Option<String>>
Get the HEARTBEAT.md checklist for periodic background tasks. Returns a seed template if the file has not yet been created.
if let Some(checklist) = workspace.heartbeat_checklist().await? {
    println!("Heartbeat tasks:\n{}", checklist);
}

System Prompt

system_prompt

system_prompt
async fn(&self) -> Result<String>
Build the system prompt from identity files (AGENTS.md, SOUL.md, USER.md, IDENTITY.md, MEMORY.md). Includes the last two days of daily logs.
let prompt = workspace.system_prompt().await?;

system_prompt_for_context

system_prompt_for_context
async fn(&self, is_group_chat: bool) -> Result<String>
Build system prompt with option to exclude MEMORY.md for group chats (privacy protection).
// Exclude personal memory in group contexts
let prompt = workspace.system_prompt_for_context(true).await?;

search

search
async fn(&self, query: &str, limit: usize) -> Result<Vec<SearchResult>>
Hybrid search combining full-text (BM25) and semantic (vector) search using Reciprocal Rank Fusion.
let results = workspace.search("project deadlines", 5).await?;
for result in results {
    println!("{}: {} (score: {})", result.path, result.preview, result.score);
}

search_with_config

search_with_config
async fn(&self, query: &str, config: SearchConfig) -> Result<Vec<SearchResult>>
Search with custom configuration for ranking weights and limits.
use ironclaw::workspace::SearchConfig;

let config = SearchConfig::default()
    .with_limit(10)
    .with_semantic_weight(0.7)  // Prefer semantic over full-text
    .with_bm25_weight(0.3);

let results = workspace.search_with_config("async patterns", config).await?;

Seeding

seed_if_empty

seed_if_empty
async fn(&self) -> Result<usize>
Seed missing core identity files (README, MEMORY, IDENTITY, SOUL, AGENTS, USER, HEARTBEAT). Only creates files that don’t exist; never overwrites. Returns the number of files created.
let created = workspace.seed_if_empty().await?;
if created > 0 {
    println!("Created {} workspace files", created);
}

backfill_embeddings

backfill_embeddings
async fn(&self) -> Result<usize>
Generate embeddings for chunks that don’t have them yet. Useful after enabling an embedding provider. Returns the number of chunks processed.
let count = workspace.backfill_embeddings().await?;
println!("Generated embeddings for {} chunks", count);

Data Types

MemoryDocument

A document stored in the workspace.
id (Uuid) - Unique document identifier
user_id (String) - Owner user identifier
agent_id (Option<Uuid>) - Optional agent ID for isolation
path (String) - File path (e.g., "context/vision.md")
content (String) - Full document content
created_at (DateTime<Utc>) - Creation timestamp
updated_at (DateTime<Utc>) - Last update timestamp

WorkspaceEntry

An entry in a directory listing.
path (String) - Relative path from the listing directory
is_directory (bool) - True if this entry has children
updated_at (Option<DateTime<Utc>>) - Last update time (latest among children for directories)
content_preview (Option<String>) - First ~200 characters (None for directories)

SearchResult

A search result with ranking score.
path (String) - Document path
chunk_content (String) - Matched chunk content
score (f32) - Combined relevance score (0.0 to 1.0)
preview (String) - Preview text with context

SearchConfig

Configuration for hybrid search ranking.
limit (usize, default 10) - Maximum results to return
semantic_weight (f32, default 0.5) - Weight for semantic similarity (0.0 to 1.0)
bm25_weight (f32, default 0.5) - Weight for BM25 full-text score (0.0 to 1.0)
k (usize, default 60) - RRF constant (higher = less aggressive fusion)
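
The k constant governs Reciprocal Rank Fusion, which merges the BM25 and semantic rankings by rank position rather than raw score. A minimal self-contained sketch of weighted RRF under the defaults above (illustrative only; ironclaw's internal fusion may differ in detail):

```rust
use std::collections::HashMap;

/// Weighted Reciprocal Rank Fusion: each ranking contributes
/// weight / (k + rank) for every document it returns (rank starts at 1).
fn rrf_fuse(
    bm25_ranked: &[&str],
    semantic_ranked: &[&str],
    bm25_weight: f32,
    semantic_weight: f32,
    k: f32,
) -> Vec<(String, f32)> {
    let mut scores: HashMap<String, f32> = HashMap::new();
    for (rank, path) in bm25_ranked.iter().enumerate() {
        *scores.entry((*path).to_string()).or_default() += bm25_weight / (k + rank as f32 + 1.0);
    }
    for (rank, path) in semantic_ranked.iter().enumerate() {
        *scores.entry((*path).to_string()).or_default() += semantic_weight / (k + rank as f32 + 1.0);
    }
    let mut fused: Vec<(String, f32)> = scores.into_iter().collect();
    fused.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap());
    fused
}
```

With the defaults, a document ranked first by both lists scores 0.5/61 + 0.5/61 ≈ 0.0164; raising k shrinks the gap between adjacent ranks, which is why a higher k makes fusion less aggressive.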

Embedding Providers

Workspace supports multiple embedding providers for semantic search.

OpenAiEmbeddings

use ironclaw::workspace::OpenAiEmbeddings;

let embeddings = Arc::new(OpenAiEmbeddings::new(api_key));

OllamaEmbeddings

use ironclaw::workspace::OllamaEmbeddings;

let embeddings = Arc::new(OllamaEmbeddings::new(
    "http://localhost:11434",
    "nomic-embed-text"
));

NearAiEmbeddings

use ironclaw::workspace::NearAiEmbeddings;

let embeddings = Arc::new(NearAiEmbeddings::new(base_url, api_key));

MockEmbeddings

For testing without external dependencies.
use ironclaw::workspace::MockEmbeddings;

let embeddings = Arc::new(MockEmbeddings::new());

Key Patterns

  1. Memory is persistence: If you want to remember something, write it to the workspace
  2. Flexible structure: Create any directory/file hierarchy you need
  3. Self-documenting: Use README.md files to describe directory structure
  4. Hybrid search: Vector similarity + BM25 full-text via RRF for best results
  5. Privacy boundaries: Use system_prompt_for_context(true) to exclude personal memory in group chats
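
Put together, a typical session touches most of the surface above. A sketch (not a prescribed flow), assuming an already-constructed `workspace` handle inside an async context:

```rust
// Seed identity files on first run; this never overwrites existing files.
workspace.seed_if_empty().await?;

// Record a durable fact and a timestamped daily-log entry.
workspace.append_memory("User prefers concise answers.").await?;
workspace.append_daily_log("Reviewed project alpha notes.").await?;

// Later, recall it with hybrid search.
let results = workspace.search("user preferences", 5).await?;

// Build the system prompt from identity files plus recent logs.
let prompt = workspace.system_prompt().await?;
```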

