@earendil-works/pi-ai provides a unified streaming API across 20+ LLM providers. It handles model discovery, provider configuration, token and cost tracking, and context persistence — including seamless handoffs between models mid-conversation.
Only models that support tool calling (function calling) are included in the registry, as tool use is essential for agentic workflows.
```typescript
import { getProviders, getModels, getModel } from '@earendil-works/pi-ai';

// Get all available providers
const providers = getProviders();
console.log(providers); // ['openai', 'anthropic', 'google', 'xai', 'groq', ...]

// Get all models from a provider (fully typed)
const anthropicModels = getModels('anthropic');
for (const model of anthropicModels) {
  console.log(`${model.id}: ${model.name}`);
  console.log(`  API: ${model.api}`); // 'anthropic-messages'
  console.log(`  Context: ${model.contextWindow} tokens`);
  console.log(`  Vision: ${model.input.includes('image')}`);
  console.log(`  Reasoning: ${model.reasoning}`);
}

// Get a specific model (provider and model ID are auto-completed in IDEs)
const model = getModel('openai', 'gpt-4o-mini');
console.log(`Using ${model.name} via ${model.api} API`);
```
Use `Model<'openai-completions'>` to define a custom model for any OpenAI-compatible endpoint. The `api` field selects which API implementation the library uses.
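As a sketch, a custom model for a self-hosted OpenAI-compatible server might look like the following. The endpoint URL and model ID are made up, and the exact field names (`baseUrl`, `contextWindow`, `input`, `reasoning`) are assumptions here; check the library's `Model` type for the authoritative shape.

```typescript
import { Model } from '@earendil-works/pi-ai';

// Hypothetical local endpoint; field names are illustrative and should be
// verified against the library's Model type before use.
const localModel = {
  id: 'llama-3.1-8b-instruct',
  name: 'Llama 3.1 8B Instruct (local)',
  api: 'openai-completions', // selects the OpenAI-compatible implementation
  baseUrl: 'http://localhost:8080/v1',
  contextWindow: 131072,
  input: ['text'],
  reasoning: false,
} as Model<'openai-completions'>; // cast: the real type may require more fields
```

Because the `api` field is `'openai-completions'`, the library routes requests for this model through its OpenAI-compatible code path, so the same `complete` call works against the custom endpoint.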
The Context object is fully serializable with JSON.stringify and JSON.parse, making it easy to persist conversations, implement chat history, or hand off a session to a different model.
```typescript
import { Context, getModel, complete } from '@earendil-works/pi-ai';

const context: Context = {
  systemPrompt: 'You are a helpful assistant.',
  messages: [{ role: 'user', content: 'What is TypeScript?' }]
};

const model = getModel('openai', 'gpt-4o-mini');
const response = await complete(model, context);
context.messages.push(response);

// Serialize and save
const serialized = JSON.stringify(context);
localStorage.setItem('conversation', serialized);

// Later: restore and continue with any model
const restored: Context = JSON.parse(localStorage.getItem('conversation')!);
restored.messages.push({ role: 'user', content: 'Tell me more about its type system' });
const newModel = getModel('anthropic', 'claude-3-5-haiku-20241022');
const continuation = await complete(newModel, restored);
```
If the context contains images (base64-encoded), they are included in the serialized output. Be mindful of storage size.
The library works in browsers, but you must pass the API key explicitly — environment variables are not available. Exposing API keys in frontend code is dangerous. Only do this for internal tools or demos; use a backend proxy for production.
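Assuming `complete` accepts an options argument with an `apiKey` field (an assumption to verify against the function's signature), a minimal browser-side sketch might look like this:

```typescript
import { Context, getModel, complete } from '@earendil-works/pi-ai';

const model = getModel('anthropic', 'claude-3-5-haiku-20241022');
const context: Context = {
  systemPrompt: 'You are a helpful assistant.',
  messages: [{ role: 'user', content: 'Hello!' }]
};

// Pass the key explicitly: process.env does not exist in the browser.
// The `apiKey` option name is an assumption; check the library's docs.
const response = await complete(model, context, { apiKey: 'sk-...' });
```

For production, the same call would live behind a backend proxy that holds the key server-side and forwards only the conversation payload.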