oMLX ships with first-class support for the most popular AI coding tools. Once your server is running, you can connect Claude Code, Codex, OpenClaw, Pi, or OpenCode without editing config files manually — either through the admin dashboard or a single CLI command. Each integration configures authentication, model selection, and endpoint routing automatically so your tool of choice talks directly to your local models.
Documentation Index
Fetch the complete documentation index at: https://mintlify.com/jundot/omlx/llms.txt
Use this file to discover all available pages before exploring further.
Two ways to connect
Admin dashboard
Open the admin dashboard at http://localhost:8000/admin and navigate to the Integrations tab. Each supported tool has a one-click setup button that writes the required config and launches the tool. This is the quickest path and works without a terminal.
omlx launch
The omlx launch command configures and starts an external tool in a single step. It fetches the list of available models from the running server and either selects the only model automatically or presents an interactive picker.
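A typical session looks like the following. The tool name passed to omlx launch is illustrative (check omlx launch --help for the identifiers your installation accepts):

```shell
# In one terminal: serve your local models.
omlx serve --model-dir ~/models

# In another terminal: configure and start a tool in one step.
# The "claude" argument here is illustrative; run `omlx launch --help`
# to list the tool names your version supports.
omlx launch claude
```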
The server must be running before you call omlx launch. Start it with omlx serve --model-dir ~/models if it is not already running.
Supported integrations
Claude Code
Connect Anthropic’s Claude Code CLI to oMLX for local agentic coding. Includes context scaling and SSE keep-alive optimizations.
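The integration handles endpoint routing for you; as a rough manual sketch, Claude Code can be pointed at an alternative endpoint through its ANTHROPIC_BASE_URL environment variable. This assumes oMLX exposes an Anthropic-compatible API at the default address used elsewhere in this guide:

```shell
# Manual equivalent of what the integration configures: point Claude
# Code at the local server. Assumes an Anthropic-compatible endpoint
# on oMLX's default port; `omlx launch` sets this up automatically.
export ANTHROPIC_BASE_URL="http://localhost:8000"
claude
```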
Codex
Use OpenAI’s Codex CLI with any model loaded in oMLX. Config is written to ~/.codex/config.toml automatically.
OpenClaw
Launch OpenClaw’s TUI with your choice of tools profile: minimal, coding, messaging, or full.
Pi
Connect the Pi coding agent to oMLX. Supports both LLM and VLM models with automatic context window configuration.
OpenCode
Wire OpenCode to oMLX by writing a provider entry to ~/.config/opencode/opencode.json.
MCP Tools
Give any loaded model access to filesystem, databases, web search, and other tools via the Model Context Protocol.
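As a general illustration of the protocol (not necessarily oMLX's exact schema), many MCP hosts declare tool servers with a JSON entry like the one below. The filesystem server shown is the reference @modelcontextprotocol/server-filesystem package, and the path is a placeholder:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/dir"]
    }
  }
}
```

Each entry names a command the host spawns; the model then calls the tools that server advertises.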