

Undici ships a cache interceptor that stores HTTP responses and serves them from local storage on subsequent requests, eliminating unnecessary round-trips. The interceptor is fully RFC 9111-compliant: it respects Cache-Control directives, validates stale responses via ETag and Last-Modified, and implements stale-while-revalidate for background refreshes. Two pre-built stores are available — an in-memory store suitable for most applications and a SQLite-backed store for durable, persistent caching across process restarts.

Enabling the cache interceptor

Attach the cache interceptor to an Agent (or any dispatcher) using the compose helper:
Cache interceptor with default MemoryCacheStore
import { Agent, interceptors } from 'undici'

const agent = new Agent().compose(
  interceptors.cache()
)

// Use the agent for all requests
const response = await agent.request({
  origin: 'https://api.example.com',
  path: '/data',
  method: 'GET'
})
Set it globally so that all request() and fetch() calls benefit from caching:
Global caching via setGlobalDispatcher
import { Agent, interceptors, cacheStores, setGlobalDispatcher, request } from 'undici'

const agent = new Agent().compose(
  interceptors.cache({
    store: new cacheStores.MemoryCacheStore({
      maxSize: 50 * 1024 * 1024, // 50 MB
      maxCount: 512,
      maxEntrySize: 1 * 1024 * 1024 // 1 MB per entry
    })
  })
)

setGlobalDispatcher(agent)

// First request — cache miss, fetches from origin
const { statusCode, body } = await request('https://api.example.com/data')

// Second request — cache hit, served instantly from memory
const { statusCode: cached } = await request('https://api.example.com/data')

Cache interceptor options

| Option | Type | Default | Description |
| --- | --- | --- | --- |
| store | CacheStore | new MemoryCacheStore() | The storage backend to use |
| methods | string[] | ['GET'] | HTTP methods to cache |
| type | 'shared' \| 'private' | 'shared' | Whether this is a shared or private cache |
| cacheByDefault | number | undefined | Default TTL in seconds for responses without Cache-Control |
| origins | (string \| RegExp)[] | undefined | Allowlist of origins to cache; all others are bypassed |

MemoryCacheStore

The default store keeps all cached responses in a JavaScript Map in the current process. It is zero-dependency and suitable for applications where cache persistence across restarts is not required.

Options

| Option | Type | Default | Description |
| --- | --- | --- | --- |
| maxSize | number | 104857600 (100 MB) | Maximum total size in bytes of all cached responses |
| maxCount | number | 1024 | Maximum number of cached responses |
| maxEntrySize | number | 5242880 (5 MB) | Maximum body size in bytes for a single entry; larger bodies are not cached |
Configuring MemoryCacheStore
import { cacheStores } from 'undici'

const store = new cacheStores.MemoryCacheStore({
  maxSize: 50 * 1024 * 1024,     // 50 MB total
  maxCount: 512,                  // at most 512 entries
  maxEntrySize: 1 * 1024 * 1024  // skip bodies larger than 1 MB
})
When the cache exceeds maxSize or maxCount, the store evicts approximately half of the oldest entries and emits a 'maxSizeExceeded' event:
Listening to cache eviction
import { cacheStores } from 'undici'

const store = new cacheStores.MemoryCacheStore({ maxCount: 100 })

store.on('maxSizeExceeded', ({ size, maxSize, count, maxCount }) => {
  console.warn(`Cache full — evicting entries (size=${size}, count=${count})`)
})

SqliteCacheStore

The SQLite store persists responses to a database file, so the cache survives process restarts. It uses Node.js's built-in node:sqlite module.
SqliteCacheStore is only available when node:sqlite is present. Run Node.js with --experimental-sqlite if you are on a version that still gates the module behind a flag.
Running with experimental SQLite
node --experimental-sqlite app.js
Configuring SqliteCacheStore
import { Agent, interceptors, cacheStores, setGlobalDispatcher } from 'undici'

const agent = new Agent().compose(
  interceptors.cache({
    store: new cacheStores.SqliteCacheStore({
      location: './cache.db', // omit for in-memory SQLite
      maxCount: 10000,
      maxEntrySize: 2 * 1024 * 1024 // 2 MB
    })
  })
)

setGlobalDispatcher(agent)
| Option | Type | Default | Description |
| --- | --- | --- | --- |
| location | string | ':memory:' | Path to the SQLite database file |
| maxCount | number | Infinity | Maximum number of entries |
| maxEntrySize | number | Infinity | Maximum body size in bytes per entry |

When to use memory vs SQLite

MemoryCacheStore

Best for ephemeral caching in short-lived processes, CI environments, or when you want zero dependencies and maximum speed. Cache is lost when the process exits.

SqliteCacheStore

Best for long-running servers that benefit from warm caches across restarts, or for CLI tools that make repeated requests to the same endpoints.

Cache-Control header support

The interceptor fully respects Cache-Control directives on both requests and responses.

Response directives

| Directive | Behaviour |
| --- | --- |
| max-age=N | Cache the response for N seconds |
| no-cache | Always revalidate before serving the cached response |
| no-store | Never cache this response |
| stale-while-revalidate=N | Serve the stale response immediately and revalidate in the background, for up to N seconds after it becomes stale |
| stale-if-error=N | Serve a stale response if the origin returns an error, for up to N seconds after it becomes stale |

Request directives

| Directive | Behaviour |
| --- | --- |
| no-cache | Force revalidation regardless of freshness |
| no-store | Bypass the cache entirely for this request |
| max-age=N | Refuse cached responses older than N seconds |
| max-stale=N | Accept stale responses up to N seconds past expiry |
| min-fresh=N | Only accept responses with at least N seconds of freshness remaining |
| only-if-cached | Only return a cached response; return 504 Gateway Timeout if none exists |

ETag and Last-Modified validation

When a cached response becomes stale, the interceptor automatically sends a conditional request to revalidate it:
  • ETag: adds an If-None-Match header with the stored ETag value
  • Last-Modified: adds an If-Modified-Since header with the cached date
If the origin responds with 304 Not Modified, the cached entry is refreshed in-place without transferring the body again.
Server response that triggers revalidation
// Server sets these headers; undici caches and later revalidates
HTTP/1.1 200 OK
Cache-Control: max-age=60
ETag: "abc123"
Last-Modified: Mon, 01 Jan 2024 00:00:00 GMT

Configuring which methods to cache

By default only GET requests are cached. To also cache HEAD requests:
Caching GET and HEAD
import { Agent, interceptors, setGlobalDispatcher } from 'undici'

const agent = new Agent().compose(
  interceptors.cache({ methods: ['GET', 'HEAD'] })
)

setGlobalDispatcher(agent)
Only safe HTTP methods (those without side effects) should be cached. Do not add POST, PUT, PATCH, or DELETE to the methods array.

Restricting caching to specific origins

Use the origins option to limit caching to a set of trusted origins:
Origin allowlist
import { Agent, interceptors, setGlobalDispatcher } from 'undici'

const agent = new Agent().compose(
  interceptors.cache({
    origins: [
      'https://api.example.com',
      /^https:\/\/cdn\./  // RegExp allowed
    ]
  })
)

setGlobalDispatcher(agent)
Requests to any origin not in the list are forwarded to the network without caching.

Caching responses without Cache-Control

Some APIs do not set Cache-Control. Use cacheByDefault to assign a default TTL in seconds for these responses:
Default TTL for uncacheable responses
import { Agent, interceptors, setGlobalDispatcher } from 'undici'

const agent = new Agent().compose(
  interceptors.cache({
    cacheByDefault: 300 // cache for 5 minutes if no Cache-Control
  })
)

setGlobalDispatcher(agent)

Writing a custom cache store

Implement the CacheStore interface if you need a different backend (Redis, Memcached, etc.):
Custom cache store interface
class RedisCacheStore {
  // `redis` (a connected client) and `buildKey` (maps the request key
  // to a Redis key) are assumed to exist elsewhere

  // Optional: return true when the store is full so writes are skipped
  isFull () {
    return false
  }

  // Return the cached value for this request, or undefined on a miss
  async get (key) {
    const raw = await redis.get(buildKey(key))
    return raw ? JSON.parse(raw) : undefined
  }

  // Return a Writable stream that persists the response body,
  // or undefined to skip caching this response
  createWriteStream (key, value) {
    // Write `value` (status, headers, timing metadata) plus the
    // streamed body to Redis, then return the Writable
  }

  // Remove every cached response for this key
  delete (key) {
    return redis.del(buildKey(key))
  }
}
Pass the store to the cache interceptor just like the built-in stores:
const agent = new Agent().compose(
  interceptors.cache({ store: new RedisCacheStore() })
)
