The Renderer class provides local video rendering capabilities for Helios compositions. It captures frames from a browser instance and encodes them to video using FFmpeg.

Basic usage

Create a renderer instance and call the render method:
import { Renderer } from '@helios-project/renderer';

const renderer = new Renderer({
  width: 1920,
  height: 1080,
  fps: 60,
  durationInSeconds: 5,
  mode: 'canvas'
});

await renderer.render(
  'http://localhost:3000/composition.html',
  './output.mp4'
);

Renderer options

The RendererOptions interface defines the configuration for rendering:

Required options

width
number
required
Output video width in pixels
height
number
required
Output video height in pixels
fps
number
required
Frames per second for the output video
durationInSeconds
number
required
Duration of the composition in seconds
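Together, fps and durationInSeconds determine how many frames the renderer captures. A minimal sketch of that arithmetic (a hypothetical helper, assuming the count is rounded to a whole frame; not part of the library):

```typescript
// Frames captured for a composition, assuming simple rounding.
function totalFrames(fps: number, durationInSeconds: number): number {
  return Math.round(fps * durationInSeconds);
}
```

For example, a 5-second composition at 60 fps yields 300 frames.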

Render mode

mode
'canvas' | 'dom'
default:"'canvas'"
The rendering mode:
  • 'canvas': Captures frames directly from the first <canvas> element, encoding them with WebCodecs when available. Best for canvas-based animations.
  • 'dom': Captures frames by taking a screenshot of the entire viewport. Best for CSS/DOM-based animations.
canvasSelector
string
default:"'canvas'"
CSS selector to find the canvas element in 'canvas' mode. Supports Shadow DOM.
targetSelector
string
CSS selector to find the target element in 'dom' mode. If provided, the screenshot will be limited to this element. Supports deep selection across Shadow DOM boundaries.

Video encoding

videoCodec
string
default:"'libx264'"
The video codec to use for final output (e.g., 'libx264', 'libx265', 'libvpx-vp9', 'copy')
pixelFormat
string
default:"'yuv420p'"
The pixel format for the output video
crf
number
Constant Rate Factor for quality control. Lower values mean better quality. Range varies by codec (typically 0-51 for H.264).
preset
string
default:"'fast'"
Encoding preset (e.g., 'ultrafast', 'fast', 'medium', 'slow', 'veryslow')
videoBitrate
string
Video bitrate (e.g., '5M', '1000k'). If provided, overrides CRF for some codecs.

Intermediate capture

intermediateVideoCodec
string
default:"'vp8'"
The codec to use for intermediate capture in 'canvas' mode with WebCodecs:
  • 'vp8': Widely supported, good performance
  • 'vp9': Better compression, higher quality
  • 'av1': Best compression, requires newer hardware/browsers
  • 'avc1.4d002a': H.264 High Profile
intermediateImageFormat
'png' | 'jpeg'
default:"'png'"
The image format to use for intermediate capture in 'dom' mode, or as a fallback in 'canvas' mode if WebCodecs is not available
intermediateImageQuality
number
The quality of the intermediate image (0-100). Only applicable if intermediateImageFormat is 'jpeg'.
keyFrameInterval
number
The number of frames between keyframes (GOP size) when using WebCodecs in 'canvas' mode. Defaults to 2 seconds worth of frames (fps * 2).
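The documented default can be sketched as a small resolver (a hypothetical helper, not library code):

```typescript
// Resolve the effective GOP size: the explicit option if set,
// otherwise the documented default of two seconds of frames.
function effectiveKeyFrameInterval(
  fps: number,
  keyFrameInterval?: number
): number {
  return keyFrameInterval ?? fps * 2;
}
```

At 60 fps with no override, a keyframe is emitted every 120 frames.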

Audio configuration

audioFilePath
string
Path to an audio file to include in the output video
audioTracks
(string | AudioTrackConfig)[]
Array of audio tracks to mix with the video. Can be simple file paths or configuration objects.
audioCodec
string
The audio codec to use. Defaults to 'aac' (or 'libvorbis' for WebM)
audioBitrate
string
Audio bitrate (e.g., '128k', '192k')

Advanced options

startFrame
number
default:"0"
The frame to start rendering from. Useful for rendering a range of frames (distributed rendering).
frameCount
number
The exact number of frames to render. If provided, this overrides durationInSeconds for calculating loop limits.
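startFrame and frameCount together enable distributed rendering: each worker renders a contiguous slice of the composition. An illustrative splitter (not part of the library) that produces one { startFrame, frameCount } pair per worker:

```typescript
interface FrameRange {
  startFrame: number;
  frameCount: number;
}

// Split totalFrames into contiguous per-worker ranges. Early workers
// absorb the remainder so every frame is rendered exactly once.
function splitFrames(totalFrames: number, workers: number): FrameRange[] {
  const base = Math.floor(totalFrames / workers);
  const remainder = totalFrames % workers;
  const ranges: FrameRange[] = [];
  let start = 0;
  for (let i = 0; i < workers; i++) {
    const count = base + (i < remainder ? 1 : 0);
    ranges.push({ startFrame: start, frameCount: count });
    start += count;
  }
  return ranges;
}
```

Each range can then be passed to a separate Renderer instance, and the partial outputs concatenated afterwards.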
subtitles
string
Path to an SRT file to burn into the video. Note: This requires video transcoding (videoCodec cannot be 'copy').
inputProps
Record<string, any>
Optional props to inject into the composition. These will be available as window.__HELIOS_PROPS__.
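Inside the composition, the injected props can be read back with a defaults fallback, since window.__HELIOS_PROPS__ is only defined when inputProps was passed. A composition-side sketch (the helper name is hypothetical):

```typescript
// Merge injected props over local defaults. When no props were
// injected, the defaults are returned unchanged.
function getProps<T extends Record<string, unknown>>(defaults: T): T {
  const injected = (globalThis as any).__HELIOS_PROPS__ as
    | Partial<T>
    | undefined;
  return { ...defaults, ...injected };
}
```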
randomSeed
number
default:"0x12345678"
Seed for the deterministic random number generator
stabilityTimeout
number
default:"30000"
Timeout in milliseconds to wait for the frame to stabilize (e.g., loading fonts, images, custom hooks)
ffmpegPath
string
Path to the FFmpeg binary. Defaults to the binary provided by @ffmpeg-installer/ffmpeg.
hwAccel
string
Hardware acceleration method to use for FFmpeg (e.g., 'cuda', 'vaapi', 'qsv', 'videotoolbox', 'auto')
webCodecsPreference
'hardware' | 'software' | 'disabled'
default:"'hardware'"
Preference for using WebCodecs hardware acceleration in 'canvas' mode:
  • 'hardware': Prioritize hardware-accelerated codecs
  • 'software': Prioritize software-based codecs (useful for deterministic testing)
  • 'disabled': Disable WebCodecs entirely and fall back to image capture
mixInputAudio
boolean
default:"false"
Whether to mix the audio from the input video (stream 0:a) into the output. Useful for multi-pass rendering.

Browser configuration

browserConfig
BrowserConfig
Configuration for the Playwright browser instance:
interface BrowserConfig {
  headless?: boolean;        // Default: true
  executablePath?: string;   // Path to custom browser
  args?: string[];           // Additional browser arguments
}

Audio track configuration

For advanced audio mixing, use the AudioTrackConfig interface:
interface AudioTrackConfig {
  path: string;              // Path to audio file
  buffer?: Buffer;           // Optional buffer instead of file
  volume?: number;           // Volume multiplier (0.0 to 1.0)
  offset?: number;           // Start time in composition (seconds)
  seek?: number;             // Seek into source file (seconds)
  fadeInDuration?: number;   // Fade in duration (seconds)
  fadeOutDuration?: number;  // Fade out duration (seconds)
  loop?: boolean;            // Loop indefinitely
  playbackRate?: number;     // Playback speed multiplier
  duration?: number;         // Source audio duration (seconds)
}
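The timing fields interact: offset positions the track in the composition, seek skips into the source, and playbackRate compresses or stretches what remains. An illustrative calculation of when a non-looping track stops contributing to the mix (this arithmetic is an assumption about the fields' semantics, not library code):

```typescript
// Composition time at which a non-looping track ends, assuming
// playbackRate > 1 shortens playback proportionally.
function trackEndTime(t: {
  offset?: number;
  seek?: number;
  duration?: number;
  playbackRate?: number;
}): number | undefined {
  if (t.duration === undefined) return undefined; // source length unknown
  const playable = t.duration - (t.seek ?? 0);
  return (t.offset ?? 0) + playable / (t.playbackRate ?? 1);
}
```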

Example with audio tracks

const renderer = new Renderer({
  width: 1920,
  height: 1080,
  fps: 60,
  durationInSeconds: 10,
  audioTracks: [
    {
      path: './music.mp3',
      volume: 0.8,
      fadeInDuration: 1,
      fadeOutDuration: 2
    },
    {
      path: './voiceover.mp3',
      offset: 2,
      volume: 1.0
    }
  ]
});

Job options

The render method accepts optional job configuration:
interface RenderJobOptions {
  onProgress?: (progress: number) => void;
  signal?: AbortSignal;
  tracePath?: string;
}
onProgress
(progress: number) => void
Callback for progress updates. Receives a number between 0 and 1.
signal
AbortSignal
An AbortSignal to cancel the rendering process
tracePath
string
Path to save the Playwright trace file (zip). If provided, Playwright tracing will be enabled for the session.

Example with progress tracking

const controller = new AbortController();

await renderer.render(
  'http://localhost:3000',
  './output.mp4',
  {
    onProgress: (progress) => {
      console.log(`Rendering: ${(progress * 100).toFixed(1)}%`);
    },
    signal: controller.signal,
    tracePath: './trace.zip'
  }
);
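On long renders, onProgress can fire for every frame. A small wrapper (illustrative, not part of the library) that only forwards updates when progress advances by a given step keeps logging manageable:

```typescript
// Wrap a progress callback so it fires at most once per `step` of
// progress, always letting the final (p >= 1) update through.
function throttleProgress(
  onProgress: (p: number) => void,
  step = 0.05
): (p: number) => void {
  let last = -Infinity;
  return (p) => {
    if (p - last >= step || p >= 1) {
      last = p;
      onProgress(p);
    }
  };
}
```

Pass the wrapped function as the onProgress job option instead of the raw callback.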

Rendering workflow

1. Initialize browser

The Renderer launches a Chromium instance with GPU acceleration flags:
const DEFAULT_BROWSER_ARGS = [
  '--use-gl=egl',
  '--ignore-gpu-blocklist',
  '--enable-gpu-rasterization',
  '--enable-zero-copy',
  '--disable-web-security',
  '--allow-file-access-from-files',
];
2. Navigate to composition

The browser navigates to the composition URL and injects inputProps if provided via window.__HELIOS_PROPS__.
3. Initialize time driver

Based on the mode, either CdpTimeDriver (canvas) or SeekTimeDriver (DOM) is initialized to control time.
4. Prepare render strategy

The selected strategy (Canvas or DOM) prepares for capture:
  • Preloads fonts and images
  • Validates target element exists
  • Initializes WebCodecs encoder (canvas mode)
  • Scans for audio tracks in the DOM
5. Spawn FFmpeg process

FFmpeg is spawned with arguments generated by FFmpegBuilder, which handles:
  • Video input format (H.264, IVF, or image2pipe)
  • Audio track mixing and filtering
  • Output encoding parameters
6. Capture loop

For each frame:
  1. Set the composition time using the time driver
  2. Capture the frame using the render strategy
  3. Write the frame data to FFmpeg’s stdin
  4. Report progress via callback
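The per-frame sequence above can be sketched as a loop over stand-in interfaces (the interface and function names here are hypothetical; the real driver and strategy classes live inside the renderer):

```typescript
interface TimeDriver {
  setTime(seconds: number): Promise<void>;
}
interface RenderStrategy {
  captureFrame(): Promise<Uint8Array>;
}

// One pass of the capture loop: advance time, capture, pipe to the
// encoder (sink stands in for FFmpeg's stdin), report progress.
async function captureLoop(
  driver: TimeDriver,
  strategy: RenderStrategy,
  sink: (chunk: Uint8Array) => void,
  fps: number,
  frameCount: number,
  onProgress?: (p: number) => void
): Promise<void> {
  for (let frame = 0; frame < frameCount; frame++) {
    await driver.setTime(frame / fps); // 1. set composition time
    const data = await strategy.captureFrame(); // 2. capture the frame
    sink(data); // 3. write to FFmpeg stdin
    onProgress?.((frame + 1) / frameCount); // 4. report progress
  }
}
```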
7. Finalize encoding

After all frames are captured:
  1. Call strategy.finish() to flush any remaining data
  2. Close FFmpeg stdin
  3. Wait for FFmpeg to complete encoding
  4. Clean up browser and temporary resources

GPU acceleration

Helios automatically attempts to use GPU acceleration at multiple levels:

Browser GPU acceleration

The renderer enables GPU acceleration in Chromium with these flags:
  • --use-gl=egl: Use EGL for OpenGL
  • --ignore-gpu-blocklist: Override GPU blocklists
  • --enable-gpu-rasterization: Enable GPU rasterization
  • --enable-zero-copy: Enable zero-copy texture uploads

WebCodecs hardware encoding

In canvas mode, Helios uses the WebCodecs API for hardware-accelerated encoding:
const renderer = new Renderer({
  mode: 'canvas',
  intermediateVideoCodec: 'avc1.4d002a', // H.264 hardware codec
  webCodecsPreference: 'hardware'
});
The renderer automatically detects available hardware codecs and selects the best option. Priority order:
  1. H.264 (AVC) - Best hardware support
  2. VP9 - Good quality/compression
  3. AV1 - Best compression (requires newer hardware)
  4. VP8 - Fallback option
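The priority order above amounts to a first-match scan over the detected support map (a sketch with assumed names, shaped like the codecs object returned by diagnose()):

```typescript
type CodecSupport = Record<string, { supported: boolean } | undefined>;

// Documented priority: H.264, then VP9, then AV1, then VP8.
const CODEC_PRIORITY = ['h264', 'vp9', 'av1', 'vp8'];

// Return the first codec the browser reports as supported.
function pickCodec(support: CodecSupport): string | undefined {
  return CODEC_PRIORITY.find((c) => support[c]?.supported);
}
```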

FFmpeg hardware acceleration

For the final encoding step, you can specify FFmpeg hardware acceleration:
const renderer = new Renderer({
  hwAccel: 'cuda',  // For NVIDIA GPUs
  videoCodec: 'h264_nvenc'  // Use NVIDIA encoder
});
Supported hwAccel values:
  • 'cuda': NVIDIA CUDA
  • 'vaapi': Video Acceleration API (Linux)
  • 'qsv': Intel Quick Sync Video
  • 'videotoolbox': Apple VideoToolbox (macOS)
  • 'auto': Let FFmpeg choose
The renderer validates that the requested hardware acceleration method is available in your FFmpeg build. If not available, a warning is logged.

Diagnostics

Use the diagnose() method to check browser and FFmpeg capabilities:
const renderer = new Renderer({ /* options */ });
const diagnostics = await renderer.diagnose();

console.log(diagnostics);
// {
//   browser: {
//     videoEncoder: true,
//     offscreenCanvas: true,
//     userAgent: "...",
//     codecs: {
//       h264: { supported: true, hardware: true, type: 'hardware' },
//       vp8: { supported: true, hardware: false, type: 'software' },
//       vp9: { supported: true, hardware: true, type: 'hardware' },
//       av1: { supported: false }
//     }
//   },
//   ffmpeg: {
//     version: "5.1.2",
//     hwaccels: ["cuda", "vaapi"]
//   }
// }

Error handling

The renderer captures and propagates errors from:
  • Page crashes
  • Console errors
  • FFmpeg failures
  • Abort signals
try {
  await renderer.render(url, output);
} catch (error) {
  if (error.message === 'Aborted') {
    console.log('Render was cancelled');
  } else {
    console.error('Render failed:', error);
  }
}
Page errors are captured during rendering:
// Renderer.ts:104-116
page.on('console', (msg) => console.log(`PAGE LOG: ${msg.text()}`));
page.on('pageerror', (err) => {
  console.error(`PAGE ERROR: ${err.message}`);
  capturedErrors.push(err);
});
page.on('crash', () => {
  const err = new Error('Page crashed!');
  capturedErrors.push(err);
});
