Batch processing is provider-dependent. Check your provider’s documentation for supported endpoints and file formats.
Create a batch
POST /v1/batches
Creates a new batch job from a previously uploaded .jsonl file.
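Each line of the input `.jsonl` file is one standalone request. A minimal sketch of building such a file, assuming the OpenAI-style batch input line shape (`custom_id`/`method`/`url`/`body`), which your provider may vary:

```python
import json

# One request per line of the input .jsonl file. The custom_id/method/url/body
# shape follows the OpenAI batch input format; other providers may differ.
# Model name and prompts are placeholders.
requests = [
    {
        "custom_id": f"req-{i}",          # echoed back in the output for matching
        "method": "POST",
        "url": "/v1/chat/completions",    # must match the batch's endpoint
        "body": {
            "model": "gpt-4o-mini",
            "messages": [{"role": "user", "content": prompt}],
        },
    }
    for i, prompt in enumerate(["Hello", "Summarize batch processing in one line"])
]

with open("batch_input.jsonl", "w") as f:
    for req in requests:
        f.write(json.dumps(req) + "\n")
```

Upload the resulting file with `purpose=batch` to obtain the `input_file_id` used below.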
Request
The provider to route the request to (e.g. `openai`).
Your Portkey API key or provider API key.
The ID of the uploaded file containing the batch requests. Use `POST /v1/files` to upload this file with `purpose=batch`.
The API endpoint to run the batch against. For example, `/v1/chat/completions`.
The time window for completion. Currently `24h` is the most commonly supported value.
Optional key-value metadata to attach to the batch object.
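Assembled into an HTTP call, a create-batch request might look like the sketch below. The base URL and the `x-portkey-*` header names follow Portkey's usual gateway conventions but are assumptions here; the field values are placeholders:

```python
import json
from urllib import request

# Request body for POST /v1/batches; values are hypothetical.
body = {
    "input_file_id": "file-abc123",           # from POST /v1/files (purpose=batch)
    "endpoint": "/v1/chat/completions",       # endpoint each line runs against
    "completion_window": "24h",
    "metadata": {"project": "nightly-eval"},  # optional
}

# Base URL and x-portkey-* header names are assumptions; adjust for your gateway.
req = request.Request(
    "https://api.portkey.ai/v1/batches",
    data=json.dumps(body).encode(),
    headers={
        "Content-Type": "application/json",
        "x-portkey-api-key": "PORTKEY_API_KEY",
        "x-portkey-provider": "openai",
    },
    method="POST",
)
# response = request.urlopen(req)  # returns the batch object described below
```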
Response
Unique identifier for the batch.
Always `batch`.
The API endpoint the batch runs against.
Current status of the batch. One of `validating`, `in_progress`, `completed`, `failed`, or `cancelled`.
The ID of the input file.
The ID of the output file once the batch has completed.
Unix timestamp when the batch was created.
Unix timestamp when the batch completed, or `null` if still in progress.
List batches
GET /v1/batches
Returns a list of batch jobs for the configured provider.
Request
The provider to route the request to.
Your Portkey API key or provider API key.
Maximum number of batches to return.
Cursor for pagination. Pass the `id` of the last batch from the previous page.
Response
Always `list`.
Array of batch objects.
Whether additional batches are available beyond this page.
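The cursor pagination above can be driven by a small loop. In this sketch, `fetch_page` is a stand-in for the actual `GET /v1/batches` call, assumed to return a dict shaped like the list response:

```python
def list_all_batches(fetch_page, limit=100):
    """Collect every batch by following the `after` cursor until has_more is false.

    `fetch_page(limit, after)` stands in for GET /v1/batches and must return
    a dict shaped like the list response:
    {"object": "list", "data": [...], "has_more": bool}.
    """
    batches, after = [], None
    while True:
        page = fetch_page(limit=limit, after=after)
        batches.extend(page["data"])
        if not page["has_more"]:
            return batches
        after = page["data"][-1]["id"]  # cursor: id of the last batch on this page
```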
Retrieve a batch
GET /v1/batches/:id
Retrieves a single batch job by its ID. Use this to poll the status of an in-progress batch.
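A polling loop built on this endpoint might look like the sketch below, where `get_batch` stands in for the HTTP call and the terminal statuses are taken from the status list above:

```python
import time

TERMINAL_STATUSES = {"completed", "failed", "cancelled"}

def wait_for_batch(get_batch, batch_id, interval=30.0, timeout=86400.0, sleep=time.sleep):
    """Poll GET /v1/batches/:id until the batch reaches a terminal status.

    `get_batch(batch_id)` stands in for the HTTP call and must return the
    batch object (a dict with at least a `status` field).
    """
    deadline = time.monotonic() + timeout
    while True:
        batch = get_batch(batch_id)
        if batch["status"] in TERMINAL_STATUSES:
            return batch
        if time.monotonic() >= deadline:
            raise TimeoutError(f"batch {batch_id} still {batch['status']} after {timeout}s")
        sleep(interval)
```

The `sleep` parameter is injectable so the loop can be tested without real waiting.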
Request
The ID of the batch to retrieve.
The provider to route the request to.
Your Portkey API key or provider API key.
Response
Returns a single batch object. See Create a batch for the full field list.
Get batch output
GET /v1/batches/*/output
Retrieves the output file content for a completed batch. The wildcard path segment accepts the batch ID.
The batch must have `status: completed` before output is available. Retrieve the batch first to check its `output_file_id`, then use `GET /v1/files/:id/content` to download the output directly.
Request
The batch ID. For example, `/v1/batches/batch-xyz789/output`.
The provider to route the request to.
Your Portkey API key or provider API key.
Response
Returns the raw `.jsonl` output content. Each line corresponds to the response for a single request in the batch.
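Each output line can be matched back to its input request via `custom_id`. A minimal parser, assuming the OpenAI-style output line shape (`custom_id`/`response`/`error`), which your provider may vary:

```python
import json

def parse_batch_output(raw: str) -> dict:
    """Map custom_id -> response body (or error) for each output .jsonl line.

    Assumes the OpenAI-style output line shape
    {"custom_id": ..., "response": {"status_code": ..., "body": ...}, "error": ...};
    other providers may differ.
    """
    results = {}
    for line in raw.splitlines():
        if not line.strip():
            continue  # skip blank lines
        record = json.loads(line)
        if record.get("error"):
            results[record["custom_id"]] = {"error": record["error"]}
        else:
            results[record["custom_id"]] = record["response"]["body"]
    return results
```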
Cancel a batch
POST /v1/batches/:id/cancel
Cancels an in-progress batch. Requests that have already been processed are not reversed.
Request
The ID of the batch to cancel.
The provider to route the request to.
Your Portkey API key or provider API key.
Response
Returns the updated batch object with `status: cancelling`.