
Pi supports 20+ LLM providers. For each built-in provider, Pi maintains a curated list of tool-capable models, updated with every release. Authenticate via an OAuth subscription (/login) or an API key, then select any model from that provider via /model or Ctrl+L.

Subscription providers

Three providers support OAuth-based subscription login. Run /login inside Pi, then select a provider:
/login
| Provider | Requirement |
| --- | --- |
| Claude Pro/Max | Anthropic Claude Pro or Max subscription |
| ChatGPT Plus/Pro (Codex) | OpenAI ChatGPT Plus or Pro subscription |
| GitHub Copilot | Active GitHub Copilot subscription |
Anthropic subscription auth draws from extra usage and is billed per token, not against your Claude plan limits.
Use /logout to clear credentials. Tokens are stored in ~/.pi/agent/auth.json and refresh automatically when expired.

API key providers

Environment variables

Set an API key before launching Pi, and Pi will detect it automatically:
export ANTHROPIC_API_KEY=sk-ant-...
pi
The full list of supported providers and their environment variables:
| Provider | Environment Variable | auth.json key |
| --- | --- | --- |
| Anthropic | ANTHROPIC_API_KEY | anthropic |
| OpenAI | OPENAI_API_KEY | openai |
| Azure OpenAI Responses | AZURE_OPENAI_API_KEY | azure-openai-responses |
| DeepSeek | DEEPSEEK_API_KEY | deepseek |
| Google Gemini | GEMINI_API_KEY | google |
| Mistral | MISTRAL_API_KEY | mistral |
| Groq | GROQ_API_KEY | groq |
| Cerebras | CEREBRAS_API_KEY | cerebras |
| Cloudflare AI Gateway | CLOUDFLARE_API_KEY (+ CLOUDFLARE_ACCOUNT_ID, CLOUDFLARE_GATEWAY_ID) | cloudflare-ai-gateway |
| Cloudflare Workers AI | CLOUDFLARE_API_KEY (+ CLOUDFLARE_ACCOUNT_ID) | cloudflare-workers-ai |
| xAI | XAI_API_KEY | xai |
| OpenRouter | OPENROUTER_API_KEY | openrouter |
| Vercel AI Gateway | AI_GATEWAY_API_KEY | vercel-ai-gateway |
| ZAI | ZAI_API_KEY | zai |
| OpenCode Zen | OPENCODE_API_KEY | opencode |
| OpenCode Go | OPENCODE_API_KEY | opencode-go |
| Hugging Face | HF_TOKEN | huggingface |
| Fireworks | FIREWORKS_API_KEY | fireworks |
| Kimi For Coding | KIMI_API_KEY | kimi-coding |
| MiniMax | MINIMAX_API_KEY | minimax |
| MiniMax (China) | MINIMAX_CN_API_KEY | minimax-cn |
| Xiaomi MiMo | XIAOMI_API_KEY | xiaomi |
| Xiaomi MiMo Token Plan (China) | XIAOMI_TOKEN_PLAN_CN_API_KEY | xiaomi-token-plan-cn |
| Xiaomi MiMo Token Plan (Amsterdam) | XIAOMI_TOKEN_PLAN_AMS_API_KEY | xiaomi-token-plan-ams |
| Xiaomi MiMo Token Plan (Singapore) | XIAOMI_TOKEN_PLAN_SGP_API_KEY | xiaomi-token-plan-sgp |

Auth file

You can also store credentials permanently in ~/.pi/agent/auth.json. The file is created with 0600 permissions (user read/write only). Credentials in auth.json take priority over environment variables.
{
  "anthropic": { "type": "api_key", "key": "sk-ant-..." },
  "openai": { "type": "api_key", "key": "sk-..." },
  "deepseek": { "type": "api_key", "key": "sk-..." },
  "google": { "type": "api_key", "key": "..." },
  "opencode": { "type": "api_key", "key": "..." }
}
Alternatively, run /login in interactive mode and select an API key provider; Pi stores the key in auth.json without any manual editing.

Key resolution

The key field in auth.json supports three formats:
  1. Shell command. Prefix the value with ! to run a shell command and use its stdout. The result is cached for the process lifetime.
{ "type": "api_key", "key": "!security find-generic-password -ws 'anthropic'" }
{ "type": "api_key", "key": "!op read 'op://vault/item/credential'" }
  2. Environment variable name. Provide the name of an environment variable (no $ prefix); Pi reads the variable at runtime.
{ "type": "api_key", "key": "MY_ANTHROPIC_KEY" }
  3. Literal value. Any other string is used directly as the API key.
{ "type": "api_key", "key": "sk-ant-..." }
OAuth credentials are also stored in auth.json after /login and managed automatically by Pi.
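The three key formats can be modeled with a short helper. This is an illustrative sketch of the rules described above, not Pi's actual implementation, and `resolve_key` is a hypothetical name:

```python
import os
import subprocess

def resolve_key(key: str) -> str:
    """Resolve an auth.json "key" value per the three supported formats (sketch)."""
    if key.startswith("!"):
        # Format 1: run the rest as a shell command and use its trimmed stdout.
        # (Pi additionally caches this result for the process lifetime.)
        out = subprocess.run(key[1:], shell=True, capture_output=True,
                             text=True, check=True)
        return out.stdout.strip()
    if key in os.environ:
        # Format 2: the value names an environment variable, read at runtime.
        return os.environ[key]
    # Format 3: any other string is used directly as the API key.
    return key
```

For example, with MY_ANTHROPIC_KEY exported, `resolve_key("MY_ANTHROPIC_KEY")` returns that variable's value, while `resolve_key("sk-ant-...")` returns the string unchanged.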

Cloud providers

Azure OpenAI

export AZURE_OPENAI_API_KEY=...
export AZURE_OPENAI_BASE_URL=https://your-resource.openai.azure.com
# Or use the resource name instead of the full URL
export AZURE_OPENAI_RESOURCE_NAME=your-resource

# Optional
export AZURE_OPENAI_API_VERSION=2024-02-01
export AZURE_OPENAI_DEPLOYMENT_NAME_MAP=gpt-4=my-gpt4,gpt-4o=my-gpt4o
Both https://your-resource.openai.azure.com and https://your-resource.cognitiveservices.azure.com root endpoints are supported and auto-normalized to /openai/v1.
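The normalization amounts to appending the /openai/v1 path to the root endpoint if it is not already there. A simplified sketch, assuming that behavior (`normalize_azure_endpoint` is a hypothetical helper, not Pi's actual code):

```python
def normalize_azure_endpoint(url: str) -> str:
    """Append /openai/v1 to a root Azure endpoint unless already present (sketch)."""
    base = url.rstrip("/")  # tolerate a trailing slash
    if not base.endswith("/openai/v1"):
        base += "/openai/v1"
    return base
```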

Amazon Bedrock

export AWS_PROFILE=your-profile
Set the region (defaults to us-east-1):
export AWS_REGION=us-west-2
Pi also supports ECS task roles (AWS_CONTAINER_CREDENTIALS_*) and IRSA (AWS_WEB_IDENTITY_TOKEN_FILE). To use a specific Bedrock model:
pi --provider amazon-bedrock --model us.anthropic.claude-sonnet-4-20250514-v1:0
Prompt caching is enabled automatically for Claude models whose ID contains a recognizable model name. For application inference profiles (whose ARNs don’t contain the model name), force caching manually:
export AWS_BEDROCK_FORCE_CACHE=1
pi --provider amazon-bedrock --model arn:aws:bedrock:us-east-1:123456789012:application-inference-profile/abc123

Google Vertex AI

Uses Application Default Credentials (ADC):
gcloud auth application-default login
export GOOGLE_CLOUD_PROJECT=your-project
export GOOGLE_CLOUD_LOCATION=us-central1
Or set GOOGLE_APPLICATION_CREDENTIALS to a service account key file path.

Custom providers

Via models.json

Add custom providers and local models (Ollama, LM Studio, vLLM, or any OpenAI/Anthropic/Google-compatible API) via ~/.pi/agent/models.json. No restart required — the file reloads each time you open /model.
{
  "providers": {
    "ollama": {
      "baseUrl": "http://localhost:11434/v1",
      "api": "openai-completions",
      "apiKey": "ollama",
      "models": [
        { "id": "llama3.1:8b" },
        { "id": "qwen2.5-coder:7b" }
      ]
    }
  }
}
Supported API types: openai-completions, openai-responses, anthropic-messages, google-generative-ai. You can also use models.json to override a built-in provider’s base URL (for proxies) or merge additional models into an existing provider.
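As an example of overriding a built-in provider's base URL, a models.json along these lines would route OpenAI traffic through a local proxy (the proxy URL and model id here are hypothetical):

```json
{
  "providers": {
    "openai": {
      "baseUrl": "http://localhost:8080/v1",
      "models": [
        { "id": "gpt-4o-proxy" }
      ]
    }
  }
}
```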

Via extensions

For providers that need custom API implementations or OAuth flows (enterprise SSO, corporate proxies), create an extension using pi.registerProvider(). See extensions for how to build one.

Resolution order

When Pi resolves credentials for a provider, it checks in this order:
  1. CLI --api-key flag
  2. auth.json entry (API key or OAuth token)
  3. Environment variable
  4. Custom provider keys from models.json
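That precedence can be sketched as follows. This is illustrative only: the env-var mapping is abbreviated to two providers, and `resolve_credential` is a hypothetical name, not Pi's actual code:

```python
ENV_VARS = {"anthropic": "ANTHROPIC_API_KEY", "openai": "OPENAI_API_KEY"}  # abbreviated

def resolve_credential(provider, cli_api_key=None, auth_json=None, env=None, models_json=None):
    """Return the first credential found, checking sources in Pi's documented order (sketch)."""
    auth_json, env, models_json = auth_json or {}, env or {}, models_json or {}
    # 1. CLI --api-key flag wins outright.
    if cli_api_key:
        return cli_api_key
    # 2. auth.json entry (API key or OAuth token).
    if provider in auth_json:
        return auth_json[provider]["key"]
    # 3. Provider's environment variable.
    env_var = ENV_VARS.get(provider)
    if env_var and env_var in env:
        return env[env_var]
    # 4. Custom provider keys from models.json.
    return models_json.get("providers", {}).get(provider, {}).get("apiKey")
```

So a key in auth.json shadows an exported environment variable for the same provider, and both are shadowed by --api-key.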