Prism’s flexible configuration system allows you to easily set up and switch between different AI providers. This guide covers everything you need to know about configuring Prism.

Configuration file

After installation, you’ll find the Prism configuration file at config/prism.php. If you haven’t published it yet, run:
php artisan vendor:publish --tag=prism-config
Here’s the structure of the configuration file:
return [
    'prism_server' => [
        'middleware' => [],
        'enabled' => env('PRISM_SERVER_ENABLED', false),
    ],
    'request_timeout' => env('PRISM_REQUEST_TIMEOUT', 30),
    'providers' => [
        // Provider configurations
    ],
];

Request timeout

Prism includes a global request timeout that applies to all provider HTTP requests. By default, requests time out after 30 seconds:
'request_timeout' => env('PRISM_REQUEST_TIMEOUT', 30),
You can adjust this in your .env file:
PRISM_REQUEST_TIMEOUT=60

Per-request timeout

You can also override the timeout for individual requests using withClientOptions():
use Prism\Prism\Facades\Prism;
use Prism\Prism\Enums\Provider;

$response = Prism::text()
    ->using(Provider::OpenAI, 'gpt-4')
    ->withClientOptions(['timeout' => 120]) // 2 minutes
    ->withPrompt('Complex analysis task...')
    ->asText();
The timeout applies to both the connection and the overall request duration. Increase this value for complex operations like large context windows or detailed analysis.

Provider configuration

Each AI provider has its own configuration section. Let’s look at the available providers and their settings.

OpenAI

'openai' => [
    'url' => env('OPENAI_URL', 'https://api.openai.com/v1'),
    'api_key' => env('OPENAI_API_KEY', ''),
    'organization' => env('OPENAI_ORGANIZATION', null),
    'project' => env('OPENAI_PROJECT', null),
],
Environment variables:
OPENAI_API_KEY=sk-...
OPENAI_URL=https://api.openai.com/v1
OPENAI_ORGANIZATION=org-...
OPENAI_PROJECT=proj_...

Anthropic

'anthropic' => [
    'api_key' => env('ANTHROPIC_API_KEY', ''),
    'version' => env('ANTHROPIC_API_VERSION', '2023-06-01'),
    'url' => env('ANTHROPIC_URL', 'https://api.anthropic.com/v1'),
    'default_thinking_budget' => env('ANTHROPIC_DEFAULT_THINKING_BUDGET', 1024),
    'anthropic_beta' => env('ANTHROPIC_BETA', null),
],
Environment variables:
ANTHROPIC_API_KEY=sk-ant-...
ANTHROPIC_API_VERSION=2023-06-01
ANTHROPIC_URL=https://api.anthropic.com/v1
ANTHROPIC_DEFAULT_THINKING_BUDGET=1024
ANTHROPIC_BETA=prompt-caching-2024-07-31
The anthropic_beta setting accepts a comma-separated list of beta features.
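For example, you can enable several beta features at once (the second feature name below is illustrative; check Anthropic's documentation for the beta headers currently available):

```
ANTHROPIC_BETA=prompt-caching-2024-07-31,token-counting-2024-11-01
```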

Ollama

'ollama' => [
    'url' => env('OLLAMA_URL', 'http://localhost:11434'),
],
Environment variables:
OLLAMA_URL=http://localhost:11434

Mistral

'mistral' => [
    'api_key' => env('MISTRAL_API_KEY', ''),
    'url' => env('MISTRAL_URL', 'https://api.mistral.ai/v1'),
],
Environment variables:
MISTRAL_API_KEY=...
MISTRAL_URL=https://api.mistral.ai/v1

Groq

'groq' => [
    'api_key' => env('GROQ_API_KEY', ''),
    'url' => env('GROQ_URL', 'https://api.groq.com/openai/v1'),
],
Environment variables:
GROQ_API_KEY=gsk_...
GROQ_URL=https://api.groq.com/openai/v1

xAI

'xai' => [
    'api_key' => env('XAI_API_KEY', ''),
    'url' => env('XAI_URL', 'https://api.x.ai/v1'),
],
Environment variables:
XAI_API_KEY=xai-...
XAI_URL=https://api.x.ai/v1

Gemini

'gemini' => [
    'api_key' => env('GEMINI_API_KEY', ''),
    'url' => env('GEMINI_URL', 'https://generativelanguage.googleapis.com/v1beta/models'),
],
Environment variables:
GEMINI_API_KEY=...
GEMINI_URL=https://generativelanguage.googleapis.com/v1beta/models

DeepSeek

'deepseek' => [
    'api_key' => env('DEEPSEEK_API_KEY', ''),
    'url' => env('DEEPSEEK_URL', 'https://api.deepseek.com/v1'),
],
Environment variables:
DEEPSEEK_API_KEY=sk-...
DEEPSEEK_URL=https://api.deepseek.com/v1

ElevenLabs

'elevenlabs' => [
    'api_key' => env('ELEVENLABS_API_KEY', ''),
    'url' => env('ELEVENLABS_URL', 'https://api.elevenlabs.io/v1/'),
],
Environment variables:
ELEVENLABS_API_KEY=...
ELEVENLABS_URL=https://api.elevenlabs.io/v1/

VoyageAI

'voyageai' => [
    'api_key' => env('VOYAGEAI_API_KEY', ''),
    'url' => env('VOYAGEAI_URL', 'https://api.voyageai.com/v1'),
],
Environment variables:
VOYAGEAI_API_KEY=pa-...
VOYAGEAI_URL=https://api.voyageai.com/v1

OpenRouter

'openrouter' => [
    'api_key' => env('OPENROUTER_API_KEY', ''),
    'url' => env('OPENROUTER_URL', 'https://openrouter.ai/api/v1'),
    'site' => [
        'http_referer' => env('OPENROUTER_SITE_HTTP_REFERER', null),
        'x_title' => env('OPENROUTER_SITE_X_TITLE', null),
    ],
],
Environment variables:
OPENROUTER_API_KEY=sk-or-...
OPENROUTER_URL=https://openrouter.ai/api/v1
OPENROUTER_SITE_HTTP_REFERER=https://yourapp.com
OPENROUTER_SITE_X_TITLE=Your App Name

Environment variables

Prism follows Laravel’s best practices for environment-specific configuration:
  1. All sensitive values (like API keys) are stored in your .env file
  2. Default values are provided as fallbacks in the config file
  3. Environment variables follow a predictable naming pattern:
    • API keys: PROVIDER_API_KEY
    • URLs: PROVIDER_URL
    • Other settings: PROVIDER_SETTING_NAME
Never commit your .env file to version control. API keys and secrets should remain private.
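Because provider settings live in config/prism.php, you can read the resolved values anywhere in your application with Laravel's config() helper. A quick sketch (the key paths follow the structure shown above):

```php
// Read resolved Prism settings at runtime via Laravel's config() helper
$apiKey = config('prism.providers.openai.api_key');
$timeout = config('prism.request_timeout'); // falls back to 30 if unset

// Useful for asserting required keys are present during boot:
if (blank($apiKey)) {
    throw new RuntimeException('OPENAI_API_KEY is not configured.');
}
```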

Overriding configuration at runtime

You can override provider configuration in your code without modifying the config file. This is useful for multi-tenant applications where users provide their own API keys.

Method 1: Via the using() method

Pass configuration as the third parameter:
use Prism\Prism\Facades\Prism;
use Prism\Prism\Enums\Provider;

$response = Prism::text()
    ->using(Provider::OpenAI, 'gpt-4o', [
        'api_key' => 'user-specific-key',
        'url' => 'https://custom-endpoint.com'
    ])
    ->withPrompt('Generate text')
    ->asText();

Method 2: Via usingProviderConfig()

Call usingProviderConfig() to override configuration:
use Prism\Prism\Facades\Prism;
use Prism\Prism\Enums\Provider;

$response = Prism::text()
    ->using(Provider::OpenAI, 'gpt-4o')
    ->usingProviderConfig([
        'api_key' => 'user-specific-key',
        'url' => 'https://custom-endpoint.com'
    ])
    ->withPrompt('Generate text')
    ->asText();
Runtime configuration is merged with the original configuration, allowing for partial or complete overrides.
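Because the override is merged with the published configuration, a partial override is enough when only one value differs. A sketch for the multi-tenant case ($tenant->openai_api_key is a hypothetical per-tenant value; the url and other settings still come from config/prism.php):

```php
use Prism\Prism\Facades\Prism;
use Prism\Prism\Enums\Provider;

$response = Prism::text()
    ->using(Provider::OpenAI, 'gpt-4o')
    // Override only the API key; all other provider settings
    // are inherited from config/prism.php.
    ->usingProviderConfig([
        'api_key' => $tenant->openai_api_key,
    ])
    ->withPrompt('Generate text')
    ->asText();
```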

Prism server configuration

Prism includes an optional server that exposes your Prism-powered models through a standardized API. It's disabled by default:
'prism_server' => [
    'middleware' => [],
    'enabled' => env('PRISM_SERVER_ENABLED', false),
],
To enable the Prism server:
PRISM_SERVER_ENABLED=true
You can also specify middleware to apply to the Prism server routes:
'prism_server' => [
    'middleware' => ['auth', 'throttle:api'],
    'enabled' => env('PRISM_SERVER_ENABLED', false),
],
Learn more about Prism Server in the Prism Server documentation.

Custom endpoints and self-hosted services

Many providers support custom endpoints, which is useful for:
  • Self-hosted AI services (like Ollama)
  • Proxy services
  • Regional endpoints
  • Development/testing environments
Simply override the url configuration:
$response = Prism::text()
    ->using(Provider::OpenAI, 'gpt-4', [
        'url' => 'https://my-custom-proxy.com/v1'
    ])
    ->withPrompt('Generate text')
    ->asText();
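The same pattern works for self-hosted services; for example, pointing Ollama at a machine other than localhost (the hostname and model name below are illustrative):

```php
use Prism\Prism\Facades\Prism;
use Prism\Prism\Enums\Provider;

$response = Prism::text()
    ->using(Provider::Ollama, 'llama3.2', [
        // Point at a remote Ollama instance instead of localhost
        'url' => 'http://gpu-server.internal:11434',
    ])
    ->withPrompt('Generate text')
    ->asText();
```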

Next steps

Quickstart

Build your first AI-powered feature with Prism

Text generation

Learn about text generation capabilities

Provider documentation

Explore detailed provider-specific documentation

Advanced configuration

Configure Prism Server and advanced features
