Prism Vertex gives you access to 11 different model providers through a single unified interface. All providers share the same vertex configuration block—the provider constant you use determines which API schema and publisher endpoint is selected.

Available providers

The following table shows all supported providers with their publishers and API schemas:
| Provider | Constant | Publisher | Schema | Example Models |
|---|---|---|---|---|
| Google Gemini | `Vertex::Gemini` | `google` | Gemini | `gemini-2.5-flash`, `text-embedding-005` |
| Anthropic | `Vertex::Anthropic` | `anthropic` | Anthropic | `claude-3-5-sonnet@20241022`, `claude-3-5-haiku@20241022` |
| Mistral AI | `Vertex::Mistral` | `mistralai` | OpenAI | `mistral-small-2503`, `codestral-2501` |
| Meta | `Vertex::Meta` | `meta` | OpenAI | `llama-4-scout-17b-16e-instruct-maas` |
| DeepSeek | `Vertex::DeepSeek` | `deepseek` | OpenAI | `deepseek-v3-0324-maas` |
| AI21 Labs | `Vertex::AI21` | `ai21` | OpenAI | `jamba-1.5-mini@001`, `jamba-1.5-large@001` |
| Kimi | `Vertex::Kimi` | `kimi` | OpenAI | `kimi-k2-0711-maas` |
| MiniMax | `Vertex::MiniMax` | `minimax` | OpenAI | `minimax-m1-40k-0709-maas` |
| OpenAI OSS | `Vertex::OpenAI` | `openai` | OpenAI | `gpt-oss-4o-mini-maas` |
| Qwen | `Vertex::Qwen` | `qwen` | OpenAI | `qwen2.5-72b-instruct-maas` |
| ZAI | `Vertex::ZAI` | `zaiorg` | OpenAI | `glm-4-plus-maas` |

API schemas

Prism Vertex uses three API schemas to handle the distinct request and response formats used by Vertex AI publishers:

Gemini

Google's native `generateContent` and `predict` endpoints. Supports text, structured output, and embeddings.

Anthropic

Uses `:rawPredict` with the Anthropic Messages API format. Supports text and structured output.

OpenAI

Uses `:rawPredict` or `:chatCompletions` with an OpenAI-compatible format. Supports text and structured output.

Schema capabilities

| Schema | Text Generation | Structured Output | Embeddings |
|---|---|---|---|
| Gemini | ✓ | ✓ | ✓ |
| Anthropic | ✓ | ✓ | – |
| OpenAI | ✓ | ✓ | – |
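Because only the Gemini schema supports embeddings, embedding requests must go through the `Vertex::Gemini` provider. A minimal sketch, assuming Prism's standard embeddings builder (`Prism::embeddings()` with `fromInput()` and `asEmbeddings()`):

```php
<?php

use Prism\Prism\Prism;
use Prism\Vertex\Enums\Vertex;

// Embeddings are only available through the Gemini schema,
// so the provider constant must be Vertex::Gemini.
$response = Prism::embeddings()
    ->using(Vertex::Gemini, 'text-embedding-005')
    ->fromInput('Explain quantum computing')
    ->asEmbeddings();

// The response carries the resulting embedding vectors.
```

Requesting embeddings through an Anthropic- or OpenAI-schema provider would fail, since those schemas expose no embeddings endpoint.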

How it works

When you use a provider constant, Prism Vertex automatically:
  1. Selects the correct API schema for that provider
  2. Routes to the appropriate publisher endpoint
  3. Formats requests and responses according to the schema

```php
use Prism\Prism\Prism;
use Prism\Vertex\Enums\Vertex;

// Automatically uses Gemini schema + google publisher
$response = Prism::text()
    ->using(Vertex::Gemini, 'gemini-2.5-flash')
    ->withPrompt('Explain quantum computing')
    ->asText();

// Automatically uses Anthropic schema + anthropic publisher
$response = Prism::text()
    ->using(Vertex::Anthropic, 'claude-3-5-sonnet@20241022')
    ->withPrompt('Explain quantum computing')
    ->asText();

// Automatically uses OpenAI schema + meta publisher
$response = Prism::text()
    ->using(Vertex::Meta, 'llama-4-scout-17b-16e-instruct-maas')
    ->withPrompt('Explain quantum computing')
    ->asText();
```

You can override the schema if needed using `withProviderOptions(['apiSchema' => VertexSchema::OpenAI])`, though this is rarely necessary.
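A sketch of such an override, forcing the OpenAI-compatible schema for a provider that defaults to a different one. The import path for the `VertexSchema` enum is an assumption (this section shows only the `VertexSchema::OpenAI` constant, not its namespace):

```php
<?php

use Prism\Prism\Prism;
use Prism\Vertex\Enums\Vertex;
use Prism\Vertex\Enums\VertexSchema; // assumed namespace

// Force the OpenAI-compatible schema instead of the provider's
// default schema selection.
$response = Prism::text()
    ->using(Vertex::Gemini, 'gemini-2.5-flash')
    ->withProviderOptions(['apiSchema' => VertexSchema::OpenAI])
    ->withPrompt('Explain quantum computing')
    ->asText();
```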

Express mode limitations

When using Express mode (API key only, no project ID or location), only Google Gemini models are supported. Using partner providers in Express mode will throw an exception.
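In practice that means a Gemini call works unchanged in Express mode, while a partner-provider call fails at request time. A minimal sketch; the exact exception class is not specified in this section, so catching a generic `\Exception` is an assumption:

```php
<?php

use Prism\Prism\Prism;
use Prism\Vertex\Enums\Vertex;

// Express mode (API key only): Google Gemini models work as usual.
$response = Prism::text()
    ->using(Vertex::Gemini, 'gemini-2.5-flash')
    ->withPrompt('Explain quantum computing')
    ->asText();

// Partner providers are rejected in Express mode.
try {
    Prism::text()
        ->using(Vertex::Anthropic, 'claude-3-5-sonnet@20241022')
        ->withPrompt('Explain quantum computing')
        ->asText();
} catch (\Exception $e) {
    // Thrown because partner models require a project ID and location.
}
```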

Next steps

Gemini

Learn about Google Gemini models

Anthropic

Learn about Claude models on Vertex

Mistral

Learn about Mistral AI models

Meta

Learn about Llama models
