
Documentation Index

Fetch the complete documentation index at: https://mintlify.com/earendil-works/pi/llms.txt

Use this file to discover all available pages before exploring further.

Anthropic (Claude Pro/Max), OpenAI Codex (ChatGPT Plus/Pro), and GitHub Copilot require OAuth authentication rather than static API keys. The OAuth functions are available from the @earendil-works/pi-ai/oauth entry point, which is Node.js only: OAuth login flows are not supported in browser environments. For web apps, authenticate server-side and proxy requests.

Imports

import {
  // Login functions (return credentials, do not store them)
  loginAnthropic,
  loginOpenAICodex,
  loginGitHubCopilot,
  loginGeminiCli,

  // Token management
  refreshOAuthToken,   // (provider, credentials) => new credentials
  getOAuthApiKey,      // (provider, credentialsMap) => { newCredentials, apiKey } | null

  // Types
  type OAuthProvider,
  type OAuthCredentials,
} from '@earendil-works/pi-ai/oauth';

CLI login (quickest start)

npx @earendil-works/pi-ai login              # interactive provider selection
npx @earendil-works/pi-ai login anthropic    # login to specific provider
npx @earendil-works/pi-ai login openai-codex
npx @earendil-works/pi-ai login github-copilot
npx @earendil-works/pi-ai list               # list available providers
Credentials are saved to auth.json in the current directory.
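Based on the programmatic examples later on this page, each provider gets one entry in auth.json keyed by provider name, with type: 'oauth' plus the credential fields. A rough sketch of the stored shape (the token fields inside the entry are illustrative, not confirmed by this page):

```json
{
  "github-copilot": {
    "type": "oauth",
    "access": "<access-token>",
    "refresh": "<refresh-token>",
    "expires": 1735689600000
  }
}
```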

Programmatic login flow

Use the login functions when you need to control credential storage or integrate with your own auth infrastructure. The callbacks let you drive the interactive device-flow UI:
import { loginGitHubCopilot } from '@earendil-works/pi-ai/oauth';
import { writeFileSync } from 'fs';

const credentials = await loginGitHubCopilot({
  onAuth: (url, instructions) => {
    // Show the user the URL to open in their browser
    console.log(`Open: ${url}`);
    if (instructions) console.log(instructions);
  },
  onPrompt: async (prompt) => {
    // Optionally ask the user to confirm or enter a code
    // (getUserInput is your own input helper, e.g. built on readline)
    return await getUserInput(prompt.message);
  },
  onProgress: (message) => console.log(message)
});

// Credential storage is your responsibility
const auth = { 'github-copilot': { type: 'oauth', ...credentials } };
writeFileSync('auth.json', JSON.stringify(auth, null, 2));
The same callback pattern works for loginAnthropic, loginOpenAICodex, and loginGeminiCli.
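Since credential storage is your responsibility, note that the example above overwrites auth.json wholesale. A small helper (a sketch, not part of the library) can merge one provider's credentials into an existing file so logins for other providers survive; the { type: 'oauth', ...credentials } entry shape comes from the example above:

```typescript
import { readFileSync, writeFileSync, existsSync } from 'fs';

// Minimal stand-in for the library's credential shape (illustrative).
type StoredCredentials = { type: 'oauth' } & Record<string, unknown>;

// Merge one provider's credentials into auth.json without
// discarding entries saved for other providers.
function saveCredentials(
  file: string,
  provider: string,
  credentials: Record<string, unknown>
): void {
  const auth: Record<string, StoredCredentials> = existsSync(file)
    ? JSON.parse(readFileSync(file, 'utf-8'))
    : {};
  auth[provider] = { type: 'oauth', ...credentials };
  writeFileSync(file, JSON.stringify(auth, null, 2));
}
```

For example, call saveCredentials('auth.json', 'github-copilot', credentials) after loginGitHubCopilot resolves.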

Using OAuth tokens

Use getOAuthApiKey() to extract a current API key from stored credentials, automatically refreshing if the token has expired. Save the returned credentials to persist the refresh:
import { getModel, complete } from '@earendil-works/pi-ai';
import { getOAuthApiKey } from '@earendil-works/pi-ai/oauth';
import { readFileSync, writeFileSync } from 'fs';

// Load stored credentials
const auth = JSON.parse(readFileSync('auth.json', 'utf-8'));

// Get API key, refreshing automatically if expired
const result = await getOAuthApiKey('github-copilot', auth);
if (!result) throw new Error('Not logged in to GitHub Copilot');

// Persist refreshed credentials
auth['github-copilot'] = { type: 'oauth', ...result.newCredentials };
writeFileSync('auth.json', JSON.stringify(auth, null, 2));

// Use the API key for the request
const model = getModel('github-copilot', 'gpt-4o');
const response = await complete(model, {
  messages: [{ role: 'user', content: 'Hello!' }]
}, { apiKey: result.apiKey });
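The load → refresh → persist cycle above can be wrapped in one helper. This is a sketch, not a library API; the key-fetching step is injected so you can pass getOAuthApiKey (or any function with a compatible signature):

```typescript
import { readFileSync, writeFileSync } from 'fs';

// Result shape returned by getOAuthApiKey, per the example above.
type KeyResult = { apiKey: string; newCredentials: Record<string, unknown> };

// Load auth.json, fetch a fresh API key via the injected function,
// persist any refreshed credentials, and return the key.
async function getFreshApiKey(
  file: string,
  provider: string,
  fetchKey: (provider: string, auth: any) => Promise<KeyResult | null>
): Promise<string> {
  const auth = JSON.parse(readFileSync(file, 'utf-8'));
  const result = await fetchKey(provider, auth);
  if (!result) throw new Error(`Not logged in to ${provider}`);
  auth[provider] = { type: 'oauth', ...result.newCredentials };
  writeFileSync(file, JSON.stringify(auth, null, 2));
  return result.apiKey;
}
```

Usage would then be: const apiKey = await getFreshApiKey('auth.json', 'github-copilot', getOAuthApiKey);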

Vertex AI

Vertex AI supports two authentication modes. For API-key authentication, set GOOGLE_CLOUD_API_KEY or pass apiKey in call options; project and location are not required in this mode.
import { getModel, complete } from '@earendil-works/pi-ai';

const model = getModel('google-vertex', 'gemini-2.5-flash');
const response = await complete(model, {
  messages: [{ role: 'user', content: 'Hello from Vertex AI' }]
}, {
  apiKey: process.env.GOOGLE_CLOUD_API_KEY,
});

Provider notes

OpenAI Codex (ChatGPT Plus/Pro)

Requires a ChatGPT Plus or Pro subscription. Provides access to GPT-5.x Codex models with extended context windows and reasoning capabilities. The library automatically handles session-based prompt caching when sessionId is provided in stream options. Set transport in stream options to "sse", "websocket", or "auto" to control transport selection. When using WebSocket with a sessionId, connections are reused per session and expire after 5 minutes of inactivity.
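The caching and transport behavior described above is driven by stream options. A sketch of the relevant fields — only sessionId and transport are documented here; the surrounding stream call and any other fields are assumptions:

```typescript
// sessionId enables session-based prompt caching; transport selects
// the wire protocol ("auto" lets the library choose).
const streamOptions = {
  sessionId: 'chat-1234',
  transport: 'auto' as 'sse' | 'websocket' | 'auto',
};
```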
Azure OpenAI

Uses the Responses API only. Set AZURE_OPENAI_API_KEY and either AZURE_OPENAI_BASE_URL or AZURE_OPENAI_RESOURCE_NAME.
  • AZURE_OPENAI_BASE_URL supports both https://<resource>.openai.azure.com and https://<resource>.cognitiveservices.azure.com; root endpoints are normalized to .../openai/v1 automatically
  • Use AZURE_OPENAI_API_VERSION (defaults to v1) to override the API version
  • Deployment names are treated as model IDs by default; override with azureDeploymentName or AZURE_OPENAI_DEPLOYMENT_NAME_MAP using comma-separated model-id=deployment pairs (e.g. gpt-4o-mini=my-deployment,gpt-4o=prod)
  • Legacy deployment-based URLs are intentionally unsupported
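The AZURE_OPENAI_DEPLOYMENT_NAME_MAP format above can be parsed straightforwardly. This is a sketch mirroring the documented comma-separated model-id=deployment format, not the library's actual parser:

```typescript
// Parse "gpt-4o-mini=my-deployment,gpt-4o=prod" into a lookup map
// from model ID to Azure deployment name.
function parseDeploymentMap(raw: string): Record<string, string> {
  const map: Record<string, string> = {};
  for (const pair of raw.split(',')) {
    const trimmed = pair.trim();
    if (!trimmed) continue;
    const eq = trimmed.indexOf('=');
    if (eq === -1) continue; // skip malformed entries
    map[trimmed.slice(0, eq).trim()] = trimmed.slice(eq + 1).trim();
  }
  return map;
}
```

For example, parseDeploymentMap('gpt-4o-mini=my-deployment,gpt-4o=prod') yields { 'gpt-4o-mini': 'my-deployment', 'gpt-4o': 'prod' }.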
GitHub Copilot

If you receive a “The requested model is not supported” error, enable the model manually in VS Code:
  1. Open Copilot Chat
  2. Click the model selector
  3. Select the model (it may show a warning icon)
  4. Click “Enable”
