Streams let you process data incrementally without loading it all into memory at once. Bun implements both the Web Streams API (ReadableStream, WritableStream, TransformStream) and the Node.js node:stream module, including Readable, Writable, Duplex, and Transform. For complete Node.js stream docs, refer to the Node.js documentation.

ReadableStream

Creating a ReadableStream

const stream = new ReadableStream({
  start(controller) {
    controller.enqueue("hello");
    controller.enqueue("world");
    controller.close();
  },
});

Consuming a ReadableStream

Use for await...of to consume a stream chunk by chunk:
for await (const chunk of stream) {
  console.log(chunk);
}
// "hello"
// "world"
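When you need finer control than `for await...of` — for example, to stop reading early — you can pull chunks manually with a reader. A minimal sketch using only standard Web Streams APIs:

```typescript
// Pull chunks one at a time with a reader instead of for await...of.
const stream = new ReadableStream<string>({
  start(controller) {
    controller.enqueue("hello");
    controller.enqueue("world");
    controller.close();
  },
});

const reader = stream.getReader();
const chunks: string[] = [];
while (true) {
  const { done, value } = await reader.read();
  if (done) break; // the stream is exhausted
  chunks.push(value);
}
// chunks: ["hello", "world"]
```

Calling `getReader()` locks the stream; call `reader.releaseLock()` if you want to hand it off to another consumer afterwards.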

Direct ReadableStream (Bun-specific)

Bun implements an optimized "direct" stream type that avoids copying chunks into an internal queue. Use it when you want to write directly to the stream’s underlying sink:
const stream = new ReadableStream({
  type: "direct",
  pull(controller) {
    controller.write("hello");
    controller.write("world");
    // no controller.close() needed; stream ends when pull() returns
  },
});
The standard enqueue() copies data into a queue. write() on a direct stream skips the queue entirely, which reduces memory pressure and latency for large or frequent writes.

Async generator streams

Pass an async generator function as the body of a Response or Request to create a streaming response without constructing a ReadableStream manually:
const response = new Response(
  (async function* () {
    yield "data: first chunk\n\n";
    await new Promise(r => setTimeout(r, 100));
    yield "data: second chunk\n\n";
  })(),
  { headers: { "Content-Type": "text/event-stream" } },
);
You can also use Symbol.asyncIterator:
const response = new Response({
  [Symbol.asyncIterator]: async function* () {
    yield "hello";
    yield " world";
  },
});

await response.text(); // "hello world"
For more control, yield returns the direct stream controller:
const response = new Response({
  [Symbol.asyncIterator]: async function* () {
    const controller = yield "first";
    await controller.end();
  },
});

WritableStream

const writable = new WritableStream({
  write(chunk) {
    console.log("received:", chunk);
  },
  close() {
    console.log("stream closed");
  },
  abort(reason) {
    console.error("stream aborted:", reason);
  },
});

const writer = writable.getWriter();
await writer.write("hello");
await writer.write("world");
await writer.close();
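A producer can also respect backpressure by awaiting `writer.ready` before each write, so it pauses whenever the internal queue is full. A sketch using a `CountQueuingStrategy` with a high water mark of one chunk (the strategy and loop here are illustrative, not part of the example above):

```typescript
// Await writer.ready before each write so the producer slows down
// whenever the queue is full (highWaterMark: 1 chunk).
const received: string[] = [];
const writable = new WritableStream<string>(
  {
    write(chunk) {
      received.push(chunk);
    },
  },
  new CountQueuingStrategy({ highWaterMark: 1 }),
);

const writer = writable.getWriter();
for (const chunk of ["a", "b", "c"]) {
  await writer.ready; // resolves once the sink has capacity again
  await writer.write(chunk);
}
await writer.close();
// received: ["a", "b", "c"]
```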

TransformStream

TransformStream sits between a readable and writable stream, transforming data as it passes through:
const uppercase = new TransformStream({
  transform(chunk, controller) {
    controller.enqueue(chunk.toUpperCase());
  },
});

const writer = uppercase.writable.getWriter();

writer.write("hello");
writer.close();

for await (const chunk of uppercase.readable) {
  console.log(chunk); // "HELLO"
}
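A common real-world transform re-chunks arbitrary text into complete lines, since network chunks rarely align with line boundaries. A sketch (the `lineSplitter` name and buffering scheme are illustrative):

```typescript
// A TransformStream that re-chunks arbitrary text into whole lines.
let buffer = "";
const lineSplitter = new TransformStream<string, string>({
  transform(chunk, controller) {
    buffer += chunk;
    const lines = buffer.split("\n");
    buffer = lines.pop()!; // keep any trailing partial line for the next chunk
    for (const line of lines) controller.enqueue(line);
  },
  flush(controller) {
    if (buffer) controller.enqueue(buffer); // emit a final unterminated line
  },
});

const input = new ReadableStream<string>({
  start(controller) {
    controller.enqueue("first\nsec");
    controller.enqueue("ond\nthird");
    controller.close();
  },
});

const lines: string[] = [];
for await (const line of input.pipeThrough(lineSplitter)) {
  lines.push(line);
}
// lines: ["first", "second", "third"]
```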

Piping streams

Use .pipeTo() to pipe a ReadableStream directly into a WritableStream:
const response = await fetch("https://example.com/large-file.bin");
const file = Bun.file("output.bin");

await response.body!.pipeTo(
  new WritableStream({
    write(chunk) {
      // process each chunk
    },
  }),
);
Use .pipeThrough() to pass a stream through a TransformStream:
const compressed = response.body!.pipeThrough(new CompressionStream("gzip"));
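Transforms compose: chaining a `DecompressionStream` after a `CompressionStream` round-trips the data, which is a handy sanity check. A minimal sketch using standard Web APIs (no network fetch involved):

```typescript
// Round-trip: gzip a body, then gunzip it and confirm the text survives.
const source = new Response("hello world").body!;
const compressed = source.pipeThrough(new CompressionStream("gzip"));
const restored = compressed.pipeThrough(new DecompressionStream("gzip"));
const text = await new Response(restored).text();
// text === "hello world"
```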

Streaming HTTP responses

Return a ReadableStream as a Response body to stream data to the client:
export default {
  fetch() {
    const stream = new ReadableStream({
      async start(controller) {
        for (let i = 0; i < 5; i++) {
          controller.enqueue(`chunk ${i}\n`);
          await new Promise(r => setTimeout(r, 100));
        }
        controller.close();
      },
    });

    return new Response(stream, {
      headers: { "Content-Type": "text/plain" },
    });
  },
};
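You can exercise the same pattern without a running server by reading the streamed body back with `Response.text()`. In this sketch the chunks are encoded to `Uint8Array`, the chunk type the Fetch spec expects for Response bodies:

```typescript
// Build a streaming body, then read it back in full.
const encoder = new TextEncoder();
const stream = new ReadableStream<Uint8Array>({
  start(controller) {
    for (let i = 0; i < 3; i++) {
      controller.enqueue(encoder.encode(`chunk ${i}\n`));
    }
    controller.close();
  },
});

const body = await new Response(stream).text();
// body === "chunk 0\nchunk 1\nchunk 2\n"
```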

Bun.ArrayBufferSink

Bun.ArrayBufferSink is a fast, incremental writer for collecting stream chunks into an ArrayBuffer (or Uint8Array).

Basic usage

const sink = new Bun.ArrayBufferSink();

sink.write("h");
sink.write("ello");
sink.write(new Uint8Array([32, 119, 111, 114, 108, 100])); // " world"

const result = sink.end();
// ArrayBuffer containing "hello world"

new TextDecoder().decode(result); // "hello world"

Get a Uint8Array instead of ArrayBuffer

const sink = new Bun.ArrayBufferSink();
sink.start({ asUint8Array: true });

sink.write("hello");
const bytes = sink.end(); // Uint8Array

Stream mode (flush repeatedly)

Use stream: true to flush the sink in chunks rather than collecting everything until .end():
const sink = new Bun.ArrayBufferSink();
sink.start({ stream: true, asUint8Array: true });

sink.write("first ");
const chunk1 = sink.flush(); // Uint8Array("first ")

sink.write("second");
const chunk2 = sink.flush(); // Uint8Array("second")
Each call to .flush() returns the buffered data and resets the internal buffer.

Pre-allocating the buffer

Set highWaterMark to pre-allocate an internal buffer of a known size, which improves performance when writing many small chunks:
const sink = new Bun.ArrayBufferSink();
sink.start({ highWaterMark: 1024 * 1024 }); // pre-allocate 1 MB

Converting between formats

ReadableStream to other formats

Bun provides optimized helper functions that avoid the overhead of wrapping in Response:
// To ArrayBuffer
const buf = await Bun.readableStreamToArrayBuffer(stream);

// To Uint8Array
const bytes = await Bun.readableStreamToBytes(stream);

// To string (UTF-8)
const text = await Bun.readableStreamToText(stream);

// To an array of chunks (each chunk may be string, TypedArray, or ArrayBuffer)
const chunks = await Bun.readableStreamToArray(stream);

// Using Response as an intermediary (also works)
const text2 = await new Response(stream).text();
const buf2 = await new Response(stream).arrayBuffer();

Other types to ReadableStream

// From ArrayBuffer
new ReadableStream({
  start(controller) {
    controller.enqueue(arrayBuffer);
    controller.close();
  },
});

// From Blob or BunFile
blob.stream();

// From string
new ReadableStream({
  start(controller) {
    controller.enqueue(new TextEncoder().encode("hello"));
    controller.close();
  },
});

Split a stream with .tee()

const [streamA, streamB] = originalStream.tee();
// streamA and streamB are independent copies
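Each branch receives every chunk the original produces. A small sketch (the `collect()` helper is just for illustration):

```typescript
// Drain a stream into an array of chunks.
async function collect(stream: ReadableStream<string>): Promise<string[]> {
  const out: string[] = [];
  const reader = stream.getReader();
  while (true) {
    const { done, value } = await reader.read();
    if (done) return out;
    out.push(value);
  }
}

const original = new ReadableStream<string>({
  start(controller) {
    controller.enqueue("a");
    controller.enqueue("b");
    controller.close();
  },
});

const [streamA, streamB] = original.tee();
const a = await collect(streamA);
const b = await collect(streamB);
// a and b are both ["a", "b"]
```

Note that tee() buffers internally: if one branch is consumed much faster than the other, the unread chunks accumulate in the slower branch's queue.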

Node.js streams compatibility

Bun fully implements node:stream. You can use Readable, Writable, Duplex, and Transform exactly as you would in Node.js:
import { Readable, Writable, Transform } from "node:stream";
import { pipeline } from "node:stream/promises";

const readable = Readable.from(["hello", " ", "world"]);

const upper = new Transform({
  transform(chunk, _enc, callback) {
    callback(null, chunk.toString().toUpperCase());
  },
});

const writable = new Writable({
  write(chunk, _enc, callback) {
    console.log(chunk.toString()); // "HELLO WORLD"
    callback();
  },
});

await pipeline(readable, upper, writable);

Converting between Web streams and Node.js streams

import { Readable, Writable } from "node:stream";

// Web ReadableStream → Node.js Readable
const nodeReadable = Readable.fromWeb(webReadableStream);

// Node.js Readable → Web ReadableStream
const webStream = Readable.toWeb(nodeReadable);

// Web WritableStream → Node.js Writable
const nodeWritable = Writable.fromWeb(webWritableStream);

// Node.js Writable → Web WritableStream
const webWritable = Writable.toWeb(nodeWritable);
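A quick round-trip shows the conversions compose losslessly; this sketch goes Node.js Readable → Web ReadableStream → Node.js Readable and collects the chunks back into a string:

```typescript
import { Readable } from "node:stream";

// Node.js Readable → Web ReadableStream → Node.js Readable.
const nodeReadable = Readable.from([Buffer.from("hello "), Buffer.from("world")]);
const webStream = Readable.toWeb(nodeReadable);
const roundTripped = Readable.fromWeb(webStream);

let text = "";
for await (const chunk of roundTripped) {
  text += chunk.toString();
}
// text === "hello world"
```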

ArrayBufferSink API reference

class ArrayBufferSink {
  constructor();

  start(options?: {
    asUint8Array?: boolean;
    highWaterMark?: number;
    stream?: boolean;
  }): void;

  write(
    chunk: string | ArrayBufferView | ArrayBuffer | SharedArrayBuffer,
  ): number;

  flush(): ArrayBuffer | Uint8Array | number;
  end(): ArrayBuffer | Uint8Array;
}
