

Dispatcher is the abstract base class that every undici transport builds on top of. You never instantiate Dispatcher directly — instead you work with concrete implementations like Client, Pool, or Agent. Understanding Dispatcher matters because all methods (request, stream, pipeline, dispatch, connect, upgrade) and all events (connect, disconnect, drain, connectionError) are defined here and inherited by every dispatcher in the library.
Requests are not guaranteed to be dispatched in order of invocation. Non-idempotent requests will not be pipelined, but idempotent requests may be automatically retried if they fail due to head-of-line blocking.

Methods

dispatch(options, handler)

The low-level API that all higher-level methods delegate to. Use this when you need complete control over the request lifecycle via callbacks. Returns false when the dispatcher is busy — wait for the drain event before dispatching again.
Basic dispatch example
import { Client } from 'undici'

const client = new Client('http://localhost:3000')
const chunks = []

client.dispatch(
  { path: '/', method: 'GET' },
  {
    onRequestStart: () => console.log('connected'),
    onResponseStart: (_ctrl, statusCode, headers) => {
      console.log(statusCode, headers)
    },
    onResponseData: (_ctrl, chunk) => chunks.push(chunk),
    onResponseEnd: (_ctrl, trailers) => {
      console.log(Buffer.concat(chunks).toString('utf8'))
    },
    onResponseError: (_ctrl, err) => console.error(err),
  }
)

DispatchOptions

origin
string | URL
The origin to dispatch the request against. Required by Agent; inferred from the constructor URL for Client and Pool.
path
string
required
The request path, including query string if not using query.
method
string
required
HTTP method (e.g. 'GET', 'POST', 'PUT').
body
string | Buffer | Uint8Array | stream.Readable | Iterable | AsyncIterable | null
default: null
Request body.
headers
UndiciHeaders
default: null
Request headers. Can be an object, a flat string array (even-length), or any iterable of [name, value] pairs (e.g. Headers, Map).
query
Record<string, any> | null
default: null
Query string parameters. Keys and values are encoded with encodeURIComponent. For unencoded query strings, embed them directly in path.
reset
boolean
default: false
When true, sends connection: close and closes the socket after the response. Defaults to keep-alive behavior.
idempotent
boolean
default: true for GET and HEAD
Whether the request can be safely retried. Non-idempotent requests are not pipelined.
blocking
boolean
default: true except for HEAD
Prevents further pipelining on the same connection until response headers arrive. Set to true for long-running requests.
upgrade
string | null
default: null
Upgrade token (e.g. 'websocket'). When set, the handler must implement onRequestUpgrade.
headersTimeout
number | null
default: 300000
Milliseconds to wait for complete response headers before timing out. Set to 0 to disable.
bodyTimeout
number | null
default: 300000
Milliseconds of inactivity on body data before timing out. Set to 0 to disable.
signal
AbortSignal | EventEmitter | null
default: null
Abort signal or an EventEmitter that emits 'abort'.
expectContinue
boolean
default: false
HTTP/2 only. Appends expect: 100-continue and holds the body until the server acknowledges.

DispatchHandler

The handler object passed to dispatch(). All methods receive a DispatchController as their first argument.
onRequestStart
(controller, context) => void
required
Called before the request is dispatched. May be called multiple times on retry.
onResponseStart
(controller, statusCode, headers, statusMessage?) => void
required
Called when the status code and headers are received. May fire multiple times for 1xx informational responses.
onResponseData
(controller, chunk: Buffer) => void
required
Called for each chunk of response body data.
onResponseEnd
(controller, trailers) => void
required
Called when the response is fully received, including trailers.
onResponseError
(controller, error: Error) => void
required
Called on error. Must not throw.
onRequestUpgrade
(controller, statusCode, headers, socket) => void
Required when upgrade is set or method is CONNECT.

Controller API

The DispatchController passed to every handler method exposes:
  • controller.pause() — back-pressure signal to pause data flow
  • controller.resume() — resume after a pause()
  • controller.abort(reason?) — abort the request
  • controller.rawHeaders — raw response header array
  • controller.rawTrailers — raw trailer array
If you are migrating from the legacy handler API (onConnect, onHeaders, onData, onComplete, onError), switch to the new callbacks. Use Dispatcher1Wrapper to expose a new dispatcher to legacy v1 handler consumers.
Legacy compatibility wrapper
import { Agent, Dispatcher1Wrapper } from 'undici'

const legacyDispatcher = new Dispatcher1Wrapper(new Agent())

request(options[, callback])

Performs an HTTP request and returns a ResponseData object. Idempotent requests are automatically retried on head-of-line failures (unless the body is a stream). All response bodies must be consumed or destroyed.
GET request
import { Client } from 'undici'

const client = new Client('http://localhost:3000')

const { statusCode, headers, body, trailers } = await client.request({
  path: '/users/1',
  method: 'GET',
})

console.log(statusCode) // 200
const data = await body.json()
POST with JSON body
const { statusCode, body } = await client.request({
  path: '/users',
  method: 'POST',
  headers: { 'content-type': 'application/json' },
  body: JSON.stringify({ name: 'Alice' }),
})
Aborting a request
const controller = new AbortController()

client.request({
  path: '/',
  method: 'GET',
  signal: controller.signal,
}).catch(console.error)

controller.abort()

RequestOptions

Extends DispatchOptions. Additional fields:
opaque
unknown
default: null
Passed through to ResponseData and StreamFactoryData. Use to avoid closures in factory functions.
onInfo
({ statusCode, headers }) => void | null
default: null
Called for each 1xx informational response before the final response.

ResponseData

statusCode
number
required
HTTP response status code.
statusText
string
required
Status message (e.g. "OK", "Not Found").
headers
Record<string, string | string[]>
required
Response headers. All keys are lower-cased.
body
stream.Readable
required
Response body stream. Also implements the Fetch body mixin: .json(), .text(), .arrayBuffer(), .blob(), .bytes(). Cannot be consumed twice. Call body.dump() to discard without killing the socket.
trailers
Record<string, string>
HTTP trailers, populated after body emits 'end'.
opaque
unknown
The value passed as options.opaque.
Always fully consume or destroy the response body, even when you don’t need it. Leaving it unconsumed will prevent the connection from being reused.
Discarding the body safely
const { body, statusCode } = await client.request({ path: '/', method: 'GET' })

if (statusCode === 200) {
  return await body.json()
}

await body.dump() // discard up to 128KB without killing the socket

stream(options, factory[, callback])

A faster alternative to request that writes the response body directly to a stream.Writable returned by factory. Avoids creating an intermediate Readable.
Stream response to a Writable
import { Writable } from 'stream'
import { Client } from 'undici'

const client = new Client('http://localhost:3000')
const chunks = []

await client.stream(
  { path: '/', method: 'GET', opaque: { chunks } },
  ({ statusCode, opaque: { chunks } }) =>
    new Writable({
      write(chunk, _, cb) {
        chunks.push(chunk)
        cb()
      },
    })
)

console.log(Buffer.concat(chunks).toString('utf8'))

pipeline(options, handler)

For use with Node.js stream.pipeline. The handler receives { statusCode, headers, body, opaque } and must return a Readable. Returns a Duplex that writes the request body and reads the response.

connect(options[, callback])

Opens a raw TCP tunnel using HTTP CONNECT. Returns { statusCode, headers, socket, opaque }.
path
string
required
Target host and port in the form host:port.
headers
UndiciHeaders
default: null
Additional request headers.
signal
AbortSignal | EventEmitter | null
default: null
Abort signal.

upgrade(options[, callback])

Upgrades to a different protocol (e.g. WebSocket). Returns { headers, socket, opaque }.
path
string
required
Request path.
method
string
default: 'GET'
HTTP method.
protocol
string
default: 'Websocket'
Comma-separated protocol list in descending preference order.
headers
UndiciHeaders
default: null
Additional request headers.

close([callback])

Gracefully closes the dispatcher. Waits for in-flight requests to complete before resolving.
await dispatcher.close()

destroy([error][, callback])

Abruptly destroys the dispatcher. All pending and running requests are asynchronously aborted with the provided error (or a generic error if omitted).
await dispatcher.destroy(new Error('shutting down'))

compose(interceptors)

Wraps the dispatcher with one or more interceptors, returning a new Dispatcher. Interceptors are applied in reverse order — the last interceptor in the array is the first to process each request.
Adding redirect and retry interceptors
import { Client, interceptors } from 'undici'

const client = new Client('http://api.example.com')
  .compose(interceptors.redirect({ maxRedirections: 3 }))
  .compose(interceptors.retry({ maxRetries: 3, minTimeout: 500 }))

await client.request({ path: '/', method: 'GET' })

Built-in interceptors

redirect
Handles HTTP redirects automatically.
const { Client, interceptors } = require('undici')

const client = new Client('http://api.example.com').compose(
  interceptors.redirect({ maxRedirections: 5, throwOnMaxRedirect: true })
)
retry
Retries failed requests with exponential backoff. Accepts the same options as RetryHandler.
const client = new Client('http://api.example.com').compose(
  interceptors.retry({ maxRetries: 3, minTimeout: 1000, timeoutFactor: 2 })
)
cache
Client-side HTTP caching per RFC 9111. Accepts a CacheStore (default: MemoryCacheStore).
import { Agent, cacheStores, interceptors, setGlobalDispatcher } from 'undici'

setGlobalDispatcher(
  new Agent().compose(
    interceptors.cache({
      store: new cacheStores.MemoryCacheStore({ maxSize: 100 * 1024 * 1024 })
    })
  )
)
dns
Caches DNS lookups per origin to reduce resolution overhead. Options: maxTTL (default 10000 ms), maxItems, dualStack (default true), affinity (4 or 6), lookup, pick, storage.
dump
Dumps (discards) response bodies up to maxSize bytes (default 1 MB) without killing the connection.
decompress
Automatically decompresses gzip, deflate, brotli, and zstd response bodies per RFC 9110. Options: skipErrorResponses (default true), skipStatusCodes (default [204, 304]).
responseError
Throws a ResponseError for any response with a status code >= 400.
dedupe
Deduplicates concurrent identical requests so only one reaches the origin. Options: methods (default ['GET']), skipHeaderNames, excludeHeaderNames, maxBufferSize.

Events

connect
event
Emitted when a socket connects to the origin. Parameters: origin: URL, targets: Dispatcher[].
disconnect
event
Emitted when a socket disconnects. For HTTP/2, also fired on GOAWAY frames. Parameters: origin: URL, targets: Dispatcher[], error: Error.
connectionError
event
Emitted when the dispatcher fails to connect to the origin. Parameters: origin: URL, targets: Dispatcher[], error: Error.
drain
event
Emitted when the dispatcher is no longer busy. Safe to call dispatch() again after this event. Parameters: origin: URL.

Header formats (UndiciHeaders)

Headers can be passed in three forms:
As a plain object:
{
  'content-type': 'application/json',
  'x-request-id': 'abc123',
}
As a flat string array of even length, alternating names and values:
['content-type', 'application/json', 'x-request-id', 'abc123']
As any iterable of [name, value] pairs, such as a Headers or Map instance.
Undici validates header syntax but does not sanitize values. Always validate and sanitize user-provided header names and values before passing them to undici to prevent header injection vulnerabilities.
