Pi supports 20+ LLM providers. For each built-in provider, Pi maintains a curated list of tool-capable models, updated with every release. Authenticate via an OAuth subscription (/login) or an API key, then select any model from that provider via /model or Ctrl+L.
## Subscription providers

Three providers support OAuth-based subscription login. Run /login inside Pi, then select a provider:
| Provider | Requirement |
|---|---|
| Claude Pro/Max | Anthropic Claude Pro or Max subscription |
| ChatGPT Plus/Pro (Codex) | OpenAI ChatGPT Plus or Pro subscription |
| GitHub Copilot | Active GitHub Copilot subscription |
Anthropic subscription auth draws from extra usage and is billed per token, not against your Claude plan limits.
Run /logout to clear credentials. Tokens are stored in ~/.pi/agent/auth.json and refresh automatically when expired.
## API key providers

### Environment variables

Set an API key before launching Pi, and Pi will detect it automatically:

| Provider | Environment Variable | auth.json key |
|---|---|---|
| Anthropic | ANTHROPIC_API_KEY | anthropic |
| OpenAI | OPENAI_API_KEY | openai |
| Azure OpenAI Responses | AZURE_OPENAI_API_KEY | azure-openai-responses |
| DeepSeek | DEEPSEEK_API_KEY | deepseek |
| Google Gemini | GEMINI_API_KEY | google |
| Mistral | MISTRAL_API_KEY | mistral |
| Groq | GROQ_API_KEY | groq |
| Cerebras | CEREBRAS_API_KEY | cerebras |
| Cloudflare AI Gateway | CLOUDFLARE_API_KEY (+ CLOUDFLARE_ACCOUNT_ID, CLOUDFLARE_GATEWAY_ID) | cloudflare-ai-gateway |
| Cloudflare Workers AI | CLOUDFLARE_API_KEY (+ CLOUDFLARE_ACCOUNT_ID) | cloudflare-workers-ai |
| xAI | XAI_API_KEY | xai |
| OpenRouter | OPENROUTER_API_KEY | openrouter |
| Vercel AI Gateway | AI_GATEWAY_API_KEY | vercel-ai-gateway |
| ZAI | ZAI_API_KEY | zai |
| OpenCode Zen | OPENCODE_API_KEY | opencode |
| OpenCode Go | OPENCODE_API_KEY | opencode-go |
| Hugging Face | HF_TOKEN | huggingface |
| Fireworks | FIREWORKS_API_KEY | fireworks |
| Kimi For Coding | KIMI_API_KEY | kimi-coding |
| MiniMax | MINIMAX_API_KEY | minimax |
| MiniMax (China) | MINIMAX_CN_API_KEY | minimax-cn |
| Xiaomi MiMo | XIAOMI_API_KEY | xiaomi |
| Xiaomi MiMo Token Plan (China) | XIAOMI_TOKEN_PLAN_CN_API_KEY | xiaomi-token-plan-cn |
| Xiaomi MiMo Token Plan (Amsterdam) | XIAOMI_TOKEN_PLAN_AMS_API_KEY | xiaomi-token-plan-ams |
| Xiaomi MiMo Token Plan (Singapore) | XIAOMI_TOKEN_PLAN_SGP_API_KEY | xiaomi-token-plan-sgp |
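For rows that list extra variables in parentheses, every variable must be set before launch. As a sketch with placeholder values:

```shell
# Cloudflare AI Gateway needs the API key plus account and gateway IDs
export CLOUDFLARE_API_KEY="cf-example-token"
export CLOUDFLARE_ACCOUNT_ID="0123456789abcdef"
export CLOUDFLARE_GATEWAY_ID="my-gateway"
# Pi launched from this shell will detect the cloudflare-ai-gateway provider
```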
### Auth file

You can also store credentials permanently in ~/.pi/agent/auth.json. The file is created with 0600 permissions (user read/write only). Credentials in auth.json take priority over environment variables.

Run /login in interactive mode and select an API key provider to store the key in auth.json without editing the file manually.
### Key resolution

The key field in auth.json supports three formats:
- Shell command: prefix the value with ! to run a shell command and use its stdout. The result is cached for the process lifetime.
- Environment variable name: provide the name of an environment variable (no $ prefix). Pi reads the variable at runtime.
- Literal value: any other string is used directly as the API key.
Credentials saved through /login are written to auth.json and managed automatically by Pi.
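To show the three key formats side by side, here is a hypothetical auth.json; the exact entry shape is an assumption, only the key values matter for illustration:

```json
{
  "anthropic": { "key": "!op read op://dev/anthropic/api-key" },
  "openai":    { "key": "OPENAI_API_KEY" },
  "groq":      { "key": "gsk-1234-example-literal" }
}
```

The first entry runs a shell command (the 1Password CLI here) and uses its stdout, the second is read from the named environment variable at runtime, and the third is used verbatim.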
## Cloud providers

### Azure OpenAI

Both https://your-resource.openai.azure.com and https://your-resource.cognitiveservices.azure.com root endpoints are supported; either is auto-normalized to /openai/v1.
### Amazon Bedrock

Bedrock supports three credential sources:

- AWS profile
- IAM keys
- Bearer token
Set the AWS region (for example us-east-1). Container credentials (AWS_CONTAINER_CREDENTIALS_*) and IRSA (AWS_WEB_IDENTITY_TOKEN_FILE) are also supported.
To use a specific Bedrock model, select it via /model.
### Google Vertex AI

Uses Application Default Credentials (ADC). Set GOOGLE_APPLICATION_CREDENTIALS to a service account key file path.
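One common way to set up ADC locally, using standard Google Cloud tooling rather than anything Pi-specific:

```shell
# Option 1: authenticate interactively with the gcloud CLI:
#   gcloud auth application-default login
# Option 2: point ADC at a service account key file (path is a placeholder):
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/keys/vertex-sa.json"
```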
## Custom providers

### Via models.json

Add custom providers and local models (Ollama, LM Studio, vLLM, or any OpenAI/Anthropic/Google-compatible API) via ~/.pi/agent/models.json. No restart required — the file reloads each time you open /model.

Supported API types: openai-completions, openai-responses, anthropic-messages, google-generative-ai.
You can also use models.json to override a built-in provider’s base URL (for proxies) or merge additional models into an existing provider.
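As a sketch, a models.json entry for a local Ollama server might look like the following. The field names (providers, baseUrl, api, models) are assumptions about the schema, not confirmed; only the api value is drawn from the supported API types listed above:

```json
{
  "providers": {
    "ollama": {
      "baseUrl": "http://localhost:11434/v1",
      "api": "openai-completions",
      "models": [
        { "id": "llama3.1", "name": "Llama 3.1 8B" }
      ]
    }
  }
}
```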
### Via extensions

For providers that need custom API implementations or OAuth flows (enterprise SSO, corporate proxies), create an extension using pi.registerProvider(). See extensions for how to build one.
## Resolution order

When Pi resolves credentials for a provider, it checks in this order:

- CLI --api-key flag
- auth.json entry (API key or OAuth token)
- Environment variable
- Custom provider keys from models.json
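The precedence above can be sketched as a first-non-empty-wins lookup. This is an illustrative shell function, not Pi's actual implementation:

```shell
# Pass each source's value in priority order (empty string if absent);
# the first non-empty candidate wins.
resolve_key() {
  for candidate in "$@"; do
    if [ -n "$candidate" ]; then
      printf '%s\n' "$candidate"
      return 0
    fi
  done
  return 1
}

# usage: resolve_key "<cli --api-key>" "<auth.json entry>" "<env var>" "<models.json key>"
resolve_key "" "key-from-auth-json" "key-from-env" ""   # prints key-from-auth-json
```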