The gateway is configured via a conf.json file placed in the project root. The file is loaded at startup. All self-hosted deployments — Node.js, Docker, and EC2 — use this same format.

Example

conf.json
{
  "plugins_enabled": [
    "default",
    "portkey",
    "aporia",
    "sydelabs",
    "pillar",
    "patronus",
    "pangea",
    "promptsecurity",
    "panw-prisma-airs",
    "walledai"
  ],
  "credentials": {
    "portkey": {
      "apiKey": "..."
    }
  },
  "cache": false,
  "integrations": [
    {
      "provider": "anthropic",
      "slug": "dev_team_anthropic",
      "credentials": {
        "apiKey": "sk-ant-"
      },
      "rate_limits": [
        {
          "type": "requests",
          "unit": "rph",
          "value": 3
        },
        {
          "type": "tokens",
          "unit": "rph",
          "value": 3000
        }
      ],
      "models": [
        {
          "slug": "claude-3-7-sonnet-20250219",
          "status": "active",
          "pricing_config": null
        }
      ]
    }
  ]
}

Top-level fields

body.plugins_enabled
string[]
List of guardrail and integration plugin identifiers to enable. The default plugin is always required. Additional plugins correspond to third-party guardrail providers.
Available values: default, portkey, qualifire, aporia, sydelabs, pillar, patronus, pangea, promptsecurity, panw-prisma-airs, walledai
body.credentials
object
Top-level credentials for enabled plugins that require API keys (e.g. Portkey itself). Each key is a plugin identifier and its value is a credentials object.
{
  "credentials": {
    "portkey": {
      "apiKey": "pk-..."
    }
  }
}
body.cache
boolean
default: false
Enable in-memory response caching. When true, the gateway intercepts repeated identical requests and returns cached responses without forwarding to the upstream provider. For persistent caching across restarts or multiple instances, use the REDIS_CONNECTION_STRING environment variable instead.
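For instance, a minimal fragment enabling the in-memory cache (merged into the rest of your conf.json):

```json
{
  "cache": true
}
```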
body.integrations
object[]
Array of provider integration configurations. Each entry registers a named provider integration with its own credentials, rate limits, and model list. See the integrations fields section below for full details.

Integrations fields

Each object in the integrations array supports the following fields:
body.provider
string
required
The AI provider identifier. Examples: anthropic, openai, azure, cohere, mistral.
body.slug
string
required
A unique human-readable name for this integration. Used to reference this integration in routing configs and logs.
body.credentials
object
required
Provider-specific credentials. For most providers this is an apiKey string:
{
  "credentials": {
    "apiKey": "sk-..."
  }
}
body.rate_limits
object[]
Optional array of rate limit rules applied to this integration. Each rule has three fields:
| Field | Type | Description |
| --- | --- | --- |
| type | "requests" \| "tokens" | Whether to limit by request count or token count |
| unit | "rph" | Rate limit window. Currently rph (requests/tokens per hour) |
| value | number | Maximum allowed requests or tokens within the window |
{
  "rate_limits": [
    { "type": "requests", "unit": "rph", "value": 100 },
    { "type": "tokens",   "unit": "rph", "value": 50000 }
  ]
}
body.models
object[]
Optional list of models available through this integration. Each entry has the following fields:
| Field | Type | Description |
| --- | --- | --- |
| slug | string | Model identifier as expected by the provider API (e.g. claude-3-7-sonnet-20250219) |
| status | "active" \| "inactive" | Whether the model is available for routing |
| pricing_config | object \| null | Optional custom pricing configuration |
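As an illustration, an integration's models array with one active and one inactive model might look like the fragment below (the second model slug is an example of the format, not a statement about which models the gateway supports):

```json
{
  "models": [
    {
      "slug": "claude-3-7-sonnet-20250219",
      "status": "active",
      "pricing_config": null
    },
    {
      "slug": "claude-3-5-haiku-20241022",
      "status": "inactive",
      "pricing_config": null
    }
  ]
}
```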

Environment variables

REDIS_CONNECTION_STRING
string
Redis connection string for persistent, distributed response caching. When set (Node.js runtime only), the gateway connects to Redis on startup and uses it as the cache backend instead of in-memory storage.
REDIS_CONNECTION_STRING=redis://your-redis-host:6379
Redis caching is only supported on the Node.js runtime. It is not available on Cloudflare Workers.

Mounting conf.json in Docker

When running via Docker, mount your local conf.json into the container at /app/conf.json:
docker run --rm -p 8787:8787 \
  -v $(pwd)/conf.json:/app/conf.json \
  -e REDIS_CONNECTION_STRING=redis://your-redis-host:6379 \
  portkeyai/gateway:latest
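For longer-running deployments, the same mount and environment variable translate to a Docker Compose sketch. The service layout below is illustrative: the redis service and service names are assumptions, not part of the gateway image.

```yaml
services:
  gateway:
    image: portkeyai/gateway:latest
    ports:
      - "8787:8787"
    volumes:
      # Mount the local conf.json at the path the gateway reads on startup
      - ./conf.json:/app/conf.json
    environment:
      # Point the cache at the redis service defined below
      - REDIS_CONNECTION_STRING=redis://redis:6379
    depends_on:
      - redis
  redis:
    image: redis:7
```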
