
Prerequisites

Before installing Loom, ensure you have the following:

  • Elixir 1.18+ (download from elixir-lang.org)
  • An LLM provider API key: at least one of Anthropic, OpenAI, Google, Groq, xAI, or another req_llm-supported provider

Loom uses SQLite for persistence, which is automatically included via ecto_sqlite3. No separate database installation required.

Installation from Source

1. Clone the repository

git clone https://github.com/yourusername/loom.git
cd loom

2. Install dependencies and set up the database

mix setup
This command runs:
  • mix deps.get — Downloads all Elixir dependencies
  • mix ecto.create — Creates the SQLite database
  • mix ecto.migrate — Runs database migrations
The database is created at _build/dev.db by default. In production, it defaults to ~/.loom/loom.db.

3. Build the CLI escript (optional)

If you want to use Loom as a standalone CLI tool:
mix escript.build
This creates an executable loom escript in your project directory; copy it somewhere on your PATH to run it from anywhere.

4. Verify installation

Start the Phoenix web UI to verify everything works:
mix phx.server
You should see output like:
[info] Running LoomWeb.Endpoint with Bandit 1.6.0 at 127.0.0.1:4200 (http)
[info] Access LoomWeb.Endpoint at http://localhost:4200
Open http://localhost:4200 in your browser to access the web UI.

Configure Your LLM Provider

1. Set your API key

Export your LLM provider’s API key as an environment variable:
export ANTHROPIC_API_KEY="sk-ant-..."
Add the export statement to your ~/.bashrc, ~/.zshrc, or equivalent shell configuration file to persist the key across terminal sessions.
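For a quick sanity check that the key is exported and visible to child processes, you can run something like the following (the key value below is a placeholder, not a real credential):

```shell
# Placeholder value; substitute your real key.
export ANTHROPIC_API_KEY="sk-ant-placeholder"
# printenv only shows exported variables, so this confirms that
# child processes (like mix and the loom escript) will see the key.
printenv ANTHROPIC_API_KEY
```

The same pattern applies to any of the other supported providers' key variables.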

2. Create a .loom.toml configuration file (optional)

Create a .loom.toml file in your project root to customize Loom’s behavior:
.loom.toml
[model]
# Default model for main reasoning tasks
default = "anthropic:claude-sonnet-4-6"
# Weak model for sub-agent search and parallel exploration
weak = "anthropic:claude-haiku-4-5"

[permissions]
# Tools that don't require approval
auto_approve = [
  "file_read",
  "file_search",
  "content_search",
  "directory_list",
  "decision_query"
]

[context]
# Maximum tokens for repo map in context
max_repo_map_tokens = 2048
# Maximum tokens for decision context
max_decision_context_tokens = 1024
# Tokens reserved for model output
reserved_output_tokens = 4096
All configuration values are optional. Loom uses sensible defaults if not specified.

3. Test your configuration

Run a quick test to verify your API key and configuration:
./loom "What files are in this directory?"
You should see Loom respond with a list of files in your current directory.

Environment Variables

Loom recognizes the following environment variables:
Variable           Description               Default
ANTHROPIC_API_KEY  Anthropic API key         None
OPENAI_API_KEY     OpenAI API key            None
GOOGLE_API_KEY     Google AI API key         None
GROQ_API_KEY       Groq API key              None
XAI_API_KEY        xAI API key               None
LOOM_MODEL         Override default model    From .loom.toml or config
LOOM_DB_PATH       SQLite database location  ~/.loom/loom.db (prod)
PORT               Web UI port               4200
PHX_HOST           Web UI host               localhost
SECRET_KEY_BASE    Phoenix secret key        Auto-generated
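Loom reads these variables at startup. The fallback behavior in the Default column can be previewed in the shell with standard parameter expansion; a small sketch (the echo line is purely illustrative):

```shell
# Unset first so the documented fallbacks are what you actually see.
unset PORT PHX_HOST
# Mirror the defaults from the table above with parameter expansion:
# use the variable if set, otherwise fall back to the default.
PORT="${PORT:-4200}"
PHX_HOST="${PHX_HOST:-localhost}"
echo "Web UI: http://${PHX_HOST}:${PORT}"
```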

Building a Standalone Binary

Loom can be packaged as a single self-contained binary using Burrito. The binary bundles the BEAM runtime, so users don’t need Elixir or Erlang installed.

1. Build for your current platform

MIX_ENV=prod mix release loom
The binary is created in the burrito_out/ directory:
# macOS Apple Silicon
./burrito_out/loom_macos_aarch64

# macOS Intel
./burrito_out/loom_macos_x86_64

# Linux x86_64
./burrito_out/loom_linux_x86_64

# Linux ARM64
./burrito_out/loom_linux_aarch64

2. Configure release behavior

The release uses the following defaults:
  • Database: ~/.loom/loom.db (override with LOOM_DB_PATH)
  • Web UI port: 4200 (override with PORT)
  • Migrations run automatically on startup
Standalone binaries are configured for the targets in mix.exs. Cross-compilation may require additional setup depending on your platform.
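Putting the overrides above together, one way to prepare the environment before launching a release binary might look like this (the binary path is platform-specific, so it is left commented out here):

```shell
# Override the release defaults before launching the binary.
export LOOM_DB_PATH="$HOME/.loom/loom.db"
export PORT=4200
# Then start the binary for your platform, e.g.:
# ./burrito_out/loom_linux_x86_64
echo "db=$LOOM_DB_PATH port=$PORT"
```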

Next Steps

  • Quickstart: start your first coding session with Loom
  • Project Rules: create a LOOM.md file to give project-specific instructions
  • Configuration: learn about all configuration options
  • Tools: explore the 11 built-in tools
