
Installation

Get Prism Vertex up and running in your Laravel application.

Requirements

Before you begin, make sure you have:
  • PHP 8.2 or higher
  • Laravel 11 or 12
  • Composer installed

Install the package

Step 1: Install via Composer

Run the following command in your Laravel project:
composer require rmh/vertex
This will also install the required dependencies:
  • prism-php/prism (>= 0.99.16)
  • google/auth (^1.0)
Step 2: Choose your configuration mode

Prism Vertex supports two modes:

Standard mode - Full access to all models and providers
  • Requires Google Cloud project ID and location
  • Supports service account or API key authentication
  • Works with all 11 supported providers
Express mode - Quick setup with API key only
  • Only requires an API key
  • Limited to Google Gemini models
  • No project or location configuration needed
Step 3: Configure Standard mode (recommended)

Add the Vertex configuration to your config/prism.php file:
'providers' => [
    'vertex' => [
        'project_id'  => env('VERTEX_PROJECT_ID'),
        'location'    => env('VERTEX_LOCATION', 'us-central1'),
        'credentials' => env('VERTEX_CREDENTIALS'),
    ],
],
Then add these values to your .env file:
VERTEX_PROJECT_ID=your-project-id
VERTEX_LOCATION=us-central1
VERTEX_CREDENTIALS=/path/to/service-account.json
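If you don't yet have a key file, one way to create one is with the gcloud CLI (assuming the service account already exists; the account name and project ID below are placeholders for your own values):

```shell
# Create a JSON key for an existing service account (placeholder names)
gcloud iam service-accounts keys create service-account.json \
  --iam-account=my-vertex-sa@your-project-id.iam.gserviceaccount.com
```

Point VERTEX_CREDENTIALS at wherever you store the resulting file, and keep it out of version control.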
You can use api_key instead of credentials if you prefer API key authentication:
'vertex' => [
    'project_id'  => env('VERTEX_PROJECT_ID'),
    'location'    => env('VERTEX_LOCATION', 'us-central1'),
    'api_key'     => env('VERTEX_API_KEY'),
],
Step 4: Or configure Express mode

For a simpler setup with only Google Gemini models, add this to config/prism.php:
'providers' => [
    'vertex' => [
        'api_key' => env('VERTEX_API_KEY'),
    ],
],
Then add to your .env file:
VERTEX_API_KEY=your-api-key
Express mode only supports Google Gemini models. If you try to use partner models (Anthropic, Meta, Mistral, etc.) in Express mode, you’ll get an exception.

Authentication options

You have two ways to authenticate with Vertex AI: a service account or an API key.

Service account

  1. Create a service account in your Google Cloud project
  2. Download the JSON key file
  3. Set the credentials config option to the path of your JSON file:
'vertex' => [
    'project_id'  => env('VERTEX_PROJECT_ID'),
    'location'    => env('VERTEX_LOCATION', 'us-central1'),
    'credentials' => env('VERTEX_CREDENTIALS'),
],
The package uses the google/auth library to automatically obtain a Bearer token.
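You won't normally call google/auth yourself, but the token fetch works roughly like this sketch (the scope and key path shown are illustrative, not package internals):

```php
<?php

use Google\Auth\Credentials\ServiceAccountCredentials;

// The Cloud Platform scope covers Vertex AI endpoints
$credentials = new ServiceAccountCredentials(
    'https://www.googleapis.com/auth/cloud-platform',
    '/path/to/service-account.json' // same file as VERTEX_CREDENTIALS
);

// Returns an array whose access_token is used as the Bearer token
$token = $credentials->fetchAuthToken();
```

Tokens are short-lived; the library handles refreshing them, so you never store a Bearer token in config.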

API key

Set the api_key config option. The key is sent as a query parameter on every request:
'vertex' => [
    'project_id'  => env('VERTEX_PROJECT_ID'),
    'location'    => env('VERTEX_LOCATION', 'us-central1'),
    'api_key'     => env('VERTEX_API_KEY'),
],
This works for both Standard and Express modes.

Per-provider configuration

All providers read from the shared vertex config by default. If you need different settings for a specific provider (for example, a different region for Anthropic), you can add a per-provider config block:
'providers' => [
    'vertex' => [
        'project_id'  => env('VERTEX_PROJECT_ID'),
        'location'    => env('VERTEX_LOCATION', 'us-central1'),
        'credentials' => env('VERTEX_CREDENTIALS'),
    ],
    
    // Override for Anthropic only
    'vertex-anthropic' => [
        'location' => 'europe-west1',
    ],
],
The provider-specific config merges with and overrides the shared config.
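As a plain illustration of the merge semantics (a sketch, not the package's internal code), the per-provider block wins key by key while unset keys fall back to the shared values:

```php
<?php

$shared = [
    'project_id'  => 'your-project-id',
    'location'    => 'us-central1',
    'credentials' => '/path/to/service-account.json',
];

$override = ['location' => 'europe-west1'];

// Later keys win: 'location' becomes europe-west1, the rest is kept
$effective = array_merge($shared, $override);
// $effective['location'] === 'europe-west1'
// $effective['project_id'] === 'your-project-id'
```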

Available locations

Vertex AI is available in multiple Google Cloud regions. Common locations include:
  • us-central1 (Iowa)
  • us-east4 (Northern Virginia)
  • europe-west1 (Belgium)
  • europe-west4 (Netherlands)
  • asia-northeast1 (Tokyo)
  • asia-southeast1 (Singapore)
Not all models are available in all regions. Check the Vertex AI documentation for model availability by region.

Verify your installation

Test your configuration with a simple text generation request:
use Prism\Prism\Prism;
use Prism\Vertex\Enums\Vertex;

$response = Prism::text()
    ->using(Vertex::Gemini, 'gemini-2.5-flash')
    ->withPrompt('Say hello')
    ->asText();

echo $response->text;
If you see a response, you’re ready to go!

Next steps

Quick start guide

Follow the quickstart to build your first integration.
