The `model_list` array
`model_list` is an array of model entries in `~/.operator/config.json`. The agent references entries by their `model_name` alias, not by the raw API model identifier.
Entry fields
Each entry in `model_list` supports the following fields:
- `model_name` — The alias used throughout the rest of the config to refer to this model. Set `agents.defaults.model_name` to this value to make the model the default. Multiple entries with the same `model_name` enable load balancing: Operator selects between them in round-robin order.
- `model` — The fully qualified model identifier including its protocol prefix, for example `openai/gpt-5.2` or `anthropic/claude-sonnet-4.6`. The protocol prefix determines which provider implementation handles the request. If no prefix is present, `openai` is assumed.
- `api_key` — The API key sent as a Bearer token to the provider. Required for all HTTP-based providers. Not used for OAuth providers such as `antigravity` and `github-copilot`.
- `api_base` — The base URL for the provider's API. Each protocol has a built-in default (see the provider table below), so this field is optional when using the official endpoint. Set it to route traffic through a proxy, a self-hosted deployment, or an OpenAI-compatible endpoint.
- Authentication mechanism — Use `"oauth"` for providers that require OAuth 2.0 login (Antigravity / GitHub Copilot). For Anthropic and OpenAI you can also specify `"token"` to load credentials from the local auth store (`~/.operator/auth.json`) instead of an `api_key`.
- Proxy — Optional HTTP/HTTPS/SOCKS5 proxy URL applied to all requests for this entry. Example: `"http://proxy.example.com:3128"`.
- Rate limit — Optional requests-per-minute cap. Operator respects this limit when selecting entries during load balancing.
- Timeout — Per-request timeout in seconds. Overrides the global default when set.
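A minimal entry using only the fields documented above might look like the following sketch (the alias and key values are placeholders):

```json
{
  "model_list": [
    {
      "model_name": "main",
      "model": "anthropic/claude-sonnet-4.6",
      "api_key": "sk-ant-..."
    }
  ]
}
```

Since `api_base` is omitted, the request goes to the protocol's built-in default endpoint, `https://api.anthropic.com/v1`.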
Protocol prefix system
The `model` field uses a `protocol/model-identifier` format. The protocol prefix routes the request to the correct provider implementation.
| Protocol | Provider | Default api_base |
|---|---|---|
| `openai` | OpenAI | `https://api.openai.com/v1` |
| `anthropic` | Anthropic | `https://api.anthropic.com/v1` |
| `gemini` | Google Gemini (API key) | `https://generativelanguage.googleapis.com/v1beta` |
| `antigravity` | Google Cloud Code Assist (OAuth) | — (OAuth, no base URL) |
| `groq` | Groq | `https://api.groq.com/openai/v1` |
| `deepseek` | DeepSeek | `https://api.deepseek.com/v1` |
| `ollama` | Ollama (local) | `http://localhost:11434/v1` |
| `openrouter` | OpenRouter | `https://openrouter.ai/api/v1` |
| `mistral` | Mistral AI | `https://api.mistral.ai/v1` |
| `qwen` | Qwen / Alibaba DashScope | `https://dashscope.aliyuncs.com/compatible-mode/v1` |
| `zhipu` | Zhipu AI (GLM) | `https://open.bigmodel.cn/api/paas/v4` |
| `moonshot` | Moonshot (Kimi) | `https://api.moonshot.cn/v1` |
| `nvidia` | NVIDIA | `https://integrate.api.nvidia.com/v1` |
| `cerebras` | Cerebras | `https://api.cerebras.ai/v1` |
| `volcengine` | Volcengine (Doubao) | `https://ark.cn-beijing.volces.com/api/v3` |
| `shengsuanyun` | ShengsuanYun | `https://router.shengsuanyun.com/api/v1` |
| `vllm` | vLLM (local) | `http://localhost:8000/v1` |
| `litellm` | LiteLLM proxy | `http://localhost:4000/v1` |
| `github-copilot` | GitHub Copilot | `localhost:4321` (gRPC) |
| `claude-cli` | Claude CLI (subprocess) | — (local process) |
| `codex-cli` | Codex CLI (subprocess) | — (local process) |
All protocols except `anthropic`, `antigravity`, `github-copilot`, `claude-cli`, and `codex-cli` use an OpenAI-compatible HTTP API. Any service that exposes an OpenAI-compatible endpoint works with the `openai` protocol prefix and a custom `api_base`.
Provider configuration examples
- OpenAI
- Anthropic
- Gemini
- Groq
- DeepSeek
- Ollama (local)
- OpenRouter
- Antigravity (OAuth)
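As a sketch of the pattern these examples follow, here are three entries: a hosted OpenAI model, a local Ollama model, and a self-hosted OpenAI-compatible server reached via the `openai` prefix with a custom `api_base`. The model identifiers, hostnames, and keys are illustrative placeholders, not values from this document:

```json
{
  "model_list": [
    {
      "model_name": "gpt",
      "model": "openai/gpt-5.2",
      "api_key": "sk-..."
    },
    {
      "model_name": "local",
      "model": "ollama/llama3",
      "api_base": "http://localhost:11434/v1"
    },
    {
      "model_name": "self-hosted",
      "model": "openai/my-finetune",
      "api_base": "http://inference.internal:8000/v1",
      "api_key": "sk-..."
    }
  ]
}
```

The third entry shows why the `openai` prefix is the catch-all: any OpenAI-compatible server only needs an `api_base` override.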
Setting the default model
`agents.defaults.model_name` names the model entry the agent uses by default when no model is specified for a task. It must match a `model_name` value in `model_list`.
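For example, assuming a `model_list` entry with the alias `main` (a placeholder name), the default is set like this:

```json
{
  "agents": {
    "defaults": {
      "model_name": "main"
    }
  },
  "model_list": [
    {
      "model_name": "main",
      "model": "anthropic/claude-sonnet-4.6",
      "api_key": "sk-ant-..."
    }
  ]
}
```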
Multi-model load balancing
Adding multiple entries that share the same `model_name` enables automatic load balancing. Operator selects between them using round-robin so that requests are spread across different API keys or endpoints.
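A sketch of this pattern, with two placeholder API keys behind one alias — Operator alternates between the entries in round-robin order:

```json
{
  "model_list": [
    {
      "model_name": "main",
      "model": "openai/gpt-5.2",
      "api_key": "sk-key-one-..."
    },
    {
      "model_name": "main",
      "model": "openai/gpt-5.2",
      "api_key": "sk-key-two-..."
    }
  ]
}
```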
Model fallbacks
You can configure ordered fallback models on `agents.defaults` for resilience. If the primary model fails, Operator retries with each fallback in sequence:
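The exact key is not shown in this excerpt; as a hypothetical sketch, assuming the fallback list lives under `agents.defaults` as an ordered array of `model_name` aliases (the key name `fallbacks` and the aliases are placeholders):

```json
{
  "agents": {
    "defaults": {
      "model_name": "main",
      "fallbacks": ["backup-1", "backup-2"]
    }
  }
}
```

Under this shape, a failed request to `main` would be retried against `backup-1`, then `backup-2`, each of which must match an entry in `model_list`.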