Helicone’s prompt management system lets you version, deploy, and iterate on prompts without changing your code. Create prompts in the UI, deploy them to different environments, and track how they perform in production.

Why Use Prompt Management?

Version Control

Track every prompt change with major and minor versions. Roll back to previous versions when needed.

Environment Deployment

Deploy different prompt versions to production, staging, or custom environments.

No Code Changes

Update prompts without redeploying your application. Change text, parameters, and tools on the fly.

A/B Testing

Run experiments to compare prompt variations and measure performance improvements.

How It Works

Prompt management in Helicone follows a simple workflow:
1. Create a prompt in the UI

Build your prompt template in the Helicone dashboard with variables, messages, tools, and parameters.

2. Deploy using the SDK

Reference your prompt by ID in your code using the @helicone/prompts SDK.

3. Update without redeploying

Make changes in the UI and deploy new versions to different environments instantly.

Core Concepts

Prompts

A prompt is a template that defines:
  • Messages: The conversation structure (system, user, assistant messages)
  • Variables: Placeholders like {{hc:name:string}} that get replaced at runtime
  • Model Parameters: Temperature, max tokens, top_p, etc.
  • Tools: Function calling definitions
  • Response Format: Structured output schemas
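Concretely, a prompt template compiles down to an OpenAI-style request body. The object below is an illustrative sketch of what those five pieces look like once compiled (field names follow the OpenAI chat completions format; the model, tool, and parameter values are invented for the example, and the exact shape Helicone produces may differ):

```typescript
// Illustrative sketch of a compiled prompt body. Values are examples,
// not a real Helicone prompt.
const compiledPromptBody = {
  model: "gpt-4o-mini",
  // Messages: the conversation structure
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    // A variable like {{hc:userName:string}} has already been replaced here
    { role: "user", content: "Hi, my name is Alice." },
  ],
  // Model parameters
  temperature: 0.7,
  max_tokens: 512,
  top_p: 1,
  // Tools: function calling definitions
  tools: [
    {
      type: "function",
      function: {
        name: "get_weather",
        description: "Look up the current weather for a city.",
        parameters: {
          type: "object",
          properties: { city: { type: "string" } },
          required: ["city"],
        },
      },
    },
  ],
  // Response format: structured output schema
  response_format: { type: "json_object" },
};
```

Because the compiled body is a complete request, it can be passed straight to any OpenAI-compatible client, as the Quick Start below shows.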

Versions

Every prompt has multiple versions with semantic versioning:
  • Major versions (1.0, 2.0): Breaking changes or significant rewrites
  • Minor versions (1.1, 1.2): Incremental improvements
Each version is immutable and stored with a commit message for tracking.
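The major.minor ordering works the way you would expect: minor versions sort within a major version, and a new major version supersedes all minors before it. A small sketch (this helper is illustrative, not part of the SDK):

```typescript
// Sketch: ordering "major.minor" version labels as described above.
// Not part of @helicone/prompts; purely illustrative.
function compareVersions(a: string, b: string): number {
  const [aMajor, aMinor] = a.split(".").map(Number);
  const [bMajor, bMinor] = b.split(".").map(Number);
  // Major version wins; minor breaks ties within a major version.
  if (aMajor !== bMajor) return aMajor - bMajor;
  return aMinor - bMinor;
}

// "1.2" is newer than "1.1", and "2.0" is newer than both.
const sorted = ["2.0", "1.1", "1.2"].sort(compareVersions);
// sorted is ["1.1", "1.2", "2.0"]
```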

Environments

Deploy specific versions to named environments:
  • production: The live version users interact with
  • staging: Testing before production release
  • Custom environments: dev, qa, canary, etc.
One version can be deployed to multiple environments simultaneously.

Variables

Prompts support typed variables using the syntax {{hc:variableName:type}}:
{{hc:userName:string}}
{{hc:temperature:number}}
{{hc:enableFeature:boolean}}
Variables are validated at runtime and can appear in:
  • Message content
  • Tool descriptions
  • Response format schemas
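To make the substitution concrete, here is a minimal sketch of how a {{hc:name:type}} placeholder could be filled and type-checked at runtime. This is illustrative only; the real substitution and validation happen inside the @helicone/prompts SDK:

```typescript
// Illustrative sketch of {{hc:name:type}} substitution.
// The real logic lives in the @helicone/prompts SDK.
function fillVariables(
  template: string,
  inputs: Record<string, string | number | boolean>
): string {
  // Match placeholders of the form {{hc:variableName:type}}
  return template.replace(/\{\{hc:(\w+):(\w+)\}\}/g, (_match, name, type) => {
    const value = inputs[name];
    if (value === undefined) {
      throw new Error(`Missing input: ${name}`);
    }
    // Validate the runtime value against the declared type
    if (typeof value !== type) {
      throw new Error(`Expected ${type} for ${name}, got ${typeof value}`);
    }
    return String(value);
  });
}

const greeting = fillVariables(
  "Hello {{hc:userName:string}}, temperature is {{hc:temperature:number}}.",
  { userName: "Alice", temperature: 0.7 }
);
// greeting === "Hello Alice, temperature is 0.7."
```

Passing a value of the wrong type (say, a number for {{hc:userName:string}}) fails validation rather than silently producing a malformed prompt.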

Quick Start

1. Install the SDK

npm install @helicone/prompts

2. Create a prompt in the UI

Navigate to Prompts in your Helicone dashboard and create a new prompt. Add messages, variables, and parameters.

3. Use in your code
import { HeliconePromptManager } from "@helicone/prompts";
import { OpenAI } from "openai";

const promptManager = new HeliconePromptManager({
  apiKey: process.env.HELICONE_API_KEY!,
});

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY!,
});

// Fetch and compile the production version
const { body } = await promptManager.getPromptBody({
  prompt_id: "abc123",
  environment: "production",
  inputs: {
    userName: "Alice",
    topic: "AI safety",
  },
});

// Use with any OpenAI-compatible API
const response = await openai.chat.completions.create(body);

Next Steps

Versioning

Learn how to create and manage prompt versions

Deployment

Deploy prompts to different environments

Experiments

Run A/B tests to compare prompt variations

Gateway Integration

Use prompts directly through the AI Gateway

Benefits

Faster Iteration

Change prompts instantly without code deployments. Test new approaches in production with confidence.

Audit Trail

Every change is tracked with versions and commit messages. Know exactly what changed and when.

Collaboration

Non-technical team members can iterate on prompts in the UI while engineers focus on application logic.

Safety

Roll back to previous versions if new prompts underperform. Deploy cautiously to staging before production.
