

Undici exposes a set of top-level dispatch functions that use the global dispatcher under the hood. Each function accepts a URL as the first argument and an options object, so you don’t need to instantiate a Client or Pool manually. The global dispatcher is an Agent instance by default, but you can replace it with setGlobalDispatcher.
All response bodies returned by request must be fully consumed or explicitly destroyed. Failing to do so will exhaust the connection pool and stall subsequent requests.

undici.request(url, options)

Performs an HTTP request and returns a promise that resolves with the full response. Non-idempotent requests are never pipelined. Idempotent requests are retried automatically when they fail indirectly through a pipelined connection, except when the request body is a stream, which cannot be replayed.
import { request } from 'undici'

const { statusCode, headers, body } = await request('https://example.com/api/data')
console.log(statusCode) // 200
const json = await body.json()

Parameters

url
string | URL | UrlObject
required
The request URL. Accepts a plain string, a WHATWG URL object, or a Node.js UrlObject (the shape returned by url.parse()). When a UrlObject is used, origin is derived from protocol, auth, hostname, and port; path is derived from pathname and search.
options
object
Request configuration. All fields are optional unless noted.
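The UrlObject derivation described under url can be sketched as plain string composition. This is an illustration of the documented rules, not undici's internal code; auth, when present, is placed between protocol and hostname:

```javascript
// A url.parse()-style UrlObject and the origin/path derived from it,
// following the rules described above.
const urlObject = {
  protocol: 'https:',
  hostname: 'example.com',
  port: '8443',
  pathname: '/api/data',
  search: '?limit=10'
}

const origin = `${urlObject.protocol}//${urlObject.hostname}:${urlObject.port}`
const path = `${urlObject.pathname}${urlObject.search}`

console.log(origin) // https://example.com:8443
console.log(path)   // /api/data?limit=10
```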

Return value

Returns Promise<ResponseData>.
statusCode
number
required
HTTP response status code.
headers
Record<string, string | string[]>
required
Response headers. All keys are lower-cased, e.g. 'content-type'. Header values are strings; headers that appear more than once become string arrays.
body
stream.Readable
required
Response body as a Node.js Readable that also implements the Fetch body mixin (text(), json(), arrayBuffer(), blob()). The body cannot be consumed twice: calling json() after text() throws a TypeError.
trailers
Record<string, string>
required
Starts as an empty object and is populated with HTTP trailers after the body 'end' event fires.
opaque
unknown
The value passed in via options.opaque.
context
object
Internal dispatcher context forwarded from the connection.

Examples

GET with JSON response
import { request } from 'undici'

const { statusCode, body } = await request('https://api.example.com/users')
if (statusCode === 200) {
  const users = await body.json()
  console.log(users)
} else {
  await body.dump() // must still consume the body
}
Always consume or destroy the body, even when you don’t need its content. Use body.dump() to drain it without closing the socket, or body.destroy() to close the socket immediately.

undici.stream(url, options, factory)

A faster alternative to request that writes the response body directly into a Writable stream returned by the factory function. This avoids creating an intermediate Readable and is well-suited for proxying responses (e.g. into a Fastify or Express response object).
Basic stream example
import { stream } from 'undici'
import { Writable } from 'node:stream'

const chunks = []

await stream(
  'https://example.com/large-file',
  { method: 'GET', opaque: { chunks } },
  ({ statusCode, headers, opaque: { chunks } }) => {
    console.log('Status:', statusCode)
    return new Writable({
      write (chunk, _enc, cb) {
        chunks.push(chunk)
        cb()
      }
    })
  }
)

console.log(Buffer.concat(chunks).toString())

Parameters

url
string | URL | UrlObject
required
Same as request. See the URL parameter description above.
options
object
required
Same options as request. The method field is required for stream.
factory
(data: StreamFactoryData) => stream.Writable
required
Called once response headers are received. Must return a stream.Writable that the response body is piped into.

Return value

Returns Promise<StreamData>.
opaque
unknown
The value passed via options.opaque.
trailers
Record<string, string>
HTTP trailers received after the body ends.
context
object
Internal dispatcher context.
Pass context through options.opaque instead of using a closure inside the factory. This avoids allocating a new function per request and works well with frameworks like Fastify.
Proxying to a Fastify response via opaque
import { Client } from 'undici'

const client = new Client('https://upstream.example.com') // hypothetical upstream

fastify.get('/proxy', (req, reply) => {
  return client.stream(
    { path: '/upstream', method: 'GET', opaque: reply },
    ({ opaque }) => opaque.raw // write directly into Fastify's raw response
  )
})

undici.pipeline(url, options, handler)

Designed for use with Node.js stream.pipeline. Returns a Duplex stream: writes to it become the request body, and reads from it produce the response body. The handler function receives response metadata and must return a Readable.
Pipeline echo example
import { Readable, Writable, pipeline } from 'node:stream'
import { pipeline as undiciPipeline } from 'undici'

pipeline(
  Readable.from(['hello world']),
  undiciPipeline('https://echo.example.com', { method: 'POST' }, ({ statusCode, body }) => {
    if (statusCode !== 200) throw new Error(`Unexpected status: ${statusCode}`)
    return body
  }),
  new Writable({
    write (chunk, _, cb) {
      process.stdout.write(chunk)
      cb()
    }
  }),
  (err) => { if (err) console.error(err) }
)

Parameters

url
string | URL | UrlObject
required
Same as request.
options
object
required
Extends RequestOptions with one additional field: objectMode (boolean, defaults to false). Set it to true when the handler returns a Readable in object mode.
handler
(data: PipelineHandlerData) => stream.Readable
required
Called when headers are received. Must validate the response, throw on error, and return a Readable that the caller reads from.

Return value

Returns stream.Duplex. The duplex is both the request body writer and the response body reader. Wire it up with stream.pipeline to propagate errors automatically.

undici.connect(url, options)

Opens an HTTP CONNECT tunnel to the target. Used for proxying, TLS passthrough, or any other two-way byte-stream protocol. Returns a raw stream.Duplex socket.
CONNECT tunnel
import { connect } from 'undici'

const { socket } = await connect('http://proxy.example.com', {
  path: '/tunnel'
})

socket.write('PING')
socket.on('data', (data) => console.log(data.toString()))
socket.end()

Parameters

url
string | URL | UrlObject
required
URL of the HTTP server that handles CONNECT. The path in options becomes the tunnel target address sent in the CONNECT request line.
options
object

Return value

Returns Promise<ConnectData>.
statusCode
number
Should be 200 (Connection Established) on success.
headers
Record<string, string | string[]>
Response headers from the proxy.
socket
stream.Duplex
The raw bidirectional socket to the tunnel target.
opaque
unknown
Value from options.opaque.

undici.upgrade(url, options)

Sends an HTTP upgrade request to switch protocols (e.g. WebSocket, HTTP/2 cleartext). Returns the upgraded raw socket along with the server’s 101 Switching Protocols headers.
WebSocket upgrade handshake
import { upgrade } from 'undici'

const { headers, socket } = await upgrade('http://localhost:3000', {
  path: '/chat',
  protocol: 'websocket'
})

console.log('Upgraded to:', headers.upgrade)
socket.end()

Parameters

url
string | URL | UrlObject
required
Base URL of the HTTP server.
options
object

Return value

Returns Promise<UpgradeData>.
headers
http.IncomingHttpHeaders
Response headers including upgrade and connection.
socket
stream.Duplex
The raw upgraded socket. The protocol handshake is complete; you can begin the wire protocol immediately.
opaque
unknown
Value from options.opaque.
