The Portkey AI Gateway can be deployed in several ways depending on your infrastructure requirements, latency needs, and operational preferences.

Portkey Cloud

Fully managed deployment. No infrastructure to run. Portkey processes billions of tokens daily for production workloads.

Node.js

Run the gateway locally or on any Node.js server with a single npx command or by building from source.
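For example, a local run can be as simple as one command (this assumes the npm package is published as `@portkey-ai/gateway` and that the gateway listens on its default port, 8787; check the project README if either differs):

```shell
# Start the gateway locally via npx; by default it listens on port 8787
npx @portkey-ai/gateway
```
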

Docker

Pull the official Docker Hub image and run with a single command. Docker Compose supported.
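A minimal sketch of the single-command run, assuming the Docker Hub image is named `portkeyai/gateway` and the gateway's default port is 8787 (verify both against the image's Docker Hub page):

```shell
# Run the gateway in the background and publish its default port to the host
docker run -d --name portkey-gateway -p 8787:8787 portkeyai/gateway:latest
```
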

Cloudflare Workers

Deploy to Cloudflare’s global edge network for low-latency routing close to your users.
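A typical Workers deployment builds from the gateway source and pushes it with Wrangler; the repository URL below is the project's public GitHub repo, and the exact steps may differ from the README:

```shell
# Clone the gateway source and deploy it to your Cloudflare account
git clone https://github.com/Portkey-AI/gateway.git
cd gateway
npm install
# Reads wrangler.toml for account, name, and route settings
npx wrangler deploy
```
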

AWS EC2

Deploy on EC2 using the provided CloudFormation template for repeatable, infrastructure-as-code provisioning.
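With the AWS CLI, launching the stack might look like the following; the template file name and parameter names here are hypothetical, so substitute those from the actual CloudFormation template Portkey provides:

```shell
# Provision the gateway stack from the provided CloudFormation template
# (template file name and InstanceType parameter are illustrative)
aws cloudformation create-stack \
  --stack-name portkey-gateway \
  --template-body file://portkey-gateway.yaml \
  --parameters ParameterKey=InstanceType,ParameterValue=t3.small \
  --capabilities CAPABILITY_IAM
```
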

Replit

One-click deploy on Replit for quick experimentation.

Zeabur

One-click deploy on Zeabur using the official template.

Supabase Functions

Deploy the gateway as a Supabase Edge Function alongside your existing Supabase project.
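Deployment then goes through the standard Supabase CLI; the function name `portkey-gateway` below is illustrative, and `<your-project-ref>` is your own project's reference ID:

```shell
# Deploy the gateway as an edge function in an existing Supabase project
supabase functions deploy portkey-gateway --project-ref <your-project-ref>
```
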

Comparison

| Option | Infrastructure | Cold start | Edge network | Custom config |
| --- | --- | --- | --- | --- |
| Portkey Cloud | Managed | None | Yes | Via dashboard |
| Node.js | Self-hosted | Fast | No | conf.json |
| Docker | Self-hosted | Fast | No | conf.json + env vars |
| Cloudflare Workers | Serverless | Sub-ms | Yes | wrangler.toml |
| AWS EC2 | Self-hosted | Fast | No | conf.json + env vars |
| Replit | Managed | Moderate | No | Limited |
| Zeabur | Managed | Moderate | No | Via dashboard |
| Supabase Functions | Serverless | Moderate | Yes | Limited |

Choosing a deployment

  • No infrastructure management — Use Portkey Cloud. It runs the same gateway code and is battle-tested at scale.
  • Full control over data and configuration — Use Docker or Node.js on your own infrastructure.
  • Global edge distribution — Use Cloudflare Workers for sub-millisecond overhead close to your users.
  • AWS-native infrastructure — Use AWS EC2 with the bundled CloudFormation template.
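Whichever option you choose, clients talk to the gateway over the same OpenAI-compatible HTTP API. A minimal request against a local deployment might look like this (the `/v1/chat/completions` route and `x-portkey-provider` header follow the gateway's conventions; the port and model are assumptions for illustration):

```shell
# Route an OpenAI request through a gateway running locally on port 8787;
# $OPENAI_API_KEY is your own provider key
curl http://localhost:8787/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "x-portkey-provider: openai" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{"model": "gpt-4o-mini", "messages": [{"role": "user", "content": "Hello"}]}'
```
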

Configuration

All self-hosted deployments share the same configuration format. See the configuration reference for a full description of conf.json fields and environment variables.
