The Renderer class is the primary interface for rendering browser-based animations to video files. It orchestrates the entire rendering pipeline, including browser automation, frame capture, and FFmpeg encoding.

Constructor

new Renderer(options: RendererOptions)
Creates a new Renderer instance with the specified configuration.
options
RendererOptions
required
Configuration object for the renderer. See render options for all available fields.

Basic usage

import { Renderer } from '@helios/renderer';

const renderer = new Renderer({
  width: 1920,
  height: 1080,
  fps: 30,
  durationInSeconds: 10,
  mode: 'canvas'
});

Methods

render

async render(
  compositionUrl: string,
  outputPath: string,
  jobOptions?: RenderJobOptions
): Promise<void>
Renders a composition to a video file.
compositionUrl
string
required
URL of the HTML page containing the animation composition. It can be a local file path (file://) or an HTTP URL.
outputPath
string
required
File path where the rendered video will be saved. The file extension determines the container format (e.g., .mp4, .webm, .mov).
jobOptions
RenderJobOptions
Optional job-specific configuration including progress callbacks, abort signals, and tracing. See render options for details.

Example

const abortController = new AbortController();

await renderer.render(
  'file:///path/to/composition.html',
  './output/animation.mp4',
  {
    onProgress: (progress) => {
      console.log(`Progress: ${(progress * 100).toFixed(1)}%`);
    },
    signal: abortController.signal
  }
);

diagnose

async diagnose(): Promise<any>
Runs diagnostic checks to verify browser and FFmpeg capabilities. Returns an object containing:
  • browser: Browser diagnostics including GPU info, WebCodecs support, and rendering mode capabilities
  • ffmpeg: FFmpeg diagnostics including version, available hardware acceleration methods, and encoders

Example

const diagnostics = await renderer.diagnose();
console.log('FFmpeg version:', diagnostics.ffmpeg.version);
console.log('Hardware acceleration:', diagnostics.ffmpeg.hwaccels);
console.log('WebCodecs supported:', diagnostics.browser.webCodecsSupported);

Rendering modes

The Renderer supports two capture strategies:

Canvas mode

Best for: Canvas-based animations, WebGL content, game engines
const renderer = new Renderer({
  mode: 'canvas',
  canvasSelector: 'canvas', // Default: first canvas element
  intermediateVideoCodec: 'vp8', // vp8, vp9, or av1
  // ... other options
});
Canvas mode uses the WebCodecs API to encode video efficiently, directly from canvas frames, and falls back to image capture if WebCodecs is unavailable.
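The fallback decision can be pictured as a simple feature probe. This is an illustrative sketch, not the library's internal code; `chooseCaptureStrategy` is a hypothetical helper:

```typescript
// Illustrative sketch of the canvas-mode fallback decision.
// In the real pipeline this check runs inside the page context,
// where the WebCodecs globals (VideoEncoder, VideoFrame) may exist.
type CaptureStrategy = 'webcodecs' | 'image-capture';

function chooseCaptureStrategy(globalLike: Record<string, unknown>): CaptureStrategy {
  // WebCodecs exposes VideoEncoder and VideoFrame as global constructors
  // in browsers that support it.
  const hasWebCodecs =
    typeof globalLike['VideoEncoder'] === 'function' &&
    typeof globalLike['VideoFrame'] === 'function';
  return hasWebCodecs ? 'webcodecs' : 'image-capture';
}
```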

DOM mode

Best for: CSS animations, DOM-based content, text animations
const renderer = new Renderer({
  mode: 'dom',
  targetSelector: '#animation-container', // Optional: limit to specific element
  intermediateImageFormat: 'png', // png or jpeg
  // ... other options
});
DOM mode captures viewport screenshots using Playwright, providing pixel-perfect rendering of CSS and DOM elements.

FFmpeg integration

The Renderer automatically manages FFmpeg for video encoding:

Default configuration

const renderer = new Renderer({
  width: 1920,
  height: 1080,
  fps: 30,
  durationInSeconds: 10,
  // FFmpeg uses these defaults:
  // videoCodec: 'libx264'
  // pixelFormat: 'yuv420p'
  // preset: 'fast'
  // audioCodec: 'aac'
});

Custom FFmpeg settings

const renderer = new Renderer({
  width: 3840,
  height: 2160,
  fps: 60,
  durationInSeconds: 30,
  videoCodec: 'libx265',
  crf: 18, // Lower = better quality
  preset: 'slow', // slower = better compression
  pixelFormat: 'yuv420p10le', // 10-bit color
  videoBitrate: '10M',
  ffmpegPath: '/usr/local/bin/ffmpeg' // Custom FFmpeg binary
});

Hardware acceleration

const renderer = new Renderer({
  width: 1920,
  height: 1080,
  fps: 30,
  durationInSeconds: 10,
  hwAccel: 'cuda', // cuda, vaapi, qsv, videotoolbox, or auto
  videoCodec: 'h264_nvenc' // Hardware-accelerated encoder
});
The renderer validates that the specified hwAccel method is available in your FFmpeg build. Check the diagnose() output for the available acceleration methods.
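Since diagnose() reports which acceleration methods the local FFmpeg build supports (diagnostics.ffmpeg.hwaccels), one safe pattern is to select hwAccel from that list before constructing the Renderer. The helper below is a sketch of our own, not part of the API:

```typescript
// Pick the first preferred hardware acceleration method that the local
// FFmpeg build actually supports; undefined means "fall back to software".
function pickHwAccel(
  available: string[],
  preferred: string[] = ['cuda', 'vaapi', 'qsv', 'videotoolbox'],
): string | undefined {
  return preferred.find((method) => available.includes(method));
}
```

For example, `pickHwAccel(diagnostics.ffmpeg.hwaccels)` could feed the hwAccel option, leaving it unset when no supported method is found.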

Audio integration

Single audio file

const renderer = new Renderer({
  width: 1920,
  height: 1080,
  fps: 30,
  durationInSeconds: 10,
  audioFilePath: './audio/music.mp3'
});

Multiple audio tracks

const renderer = new Renderer({
  width: 1920,
  height: 1080,
  fps: 30,
  durationInSeconds: 10,
  audioTracks: [
    {
      path: './audio/music.mp3',
      volume: 0.8,
      fadeInDuration: 2,
      fadeOutDuration: 3
    },
    {
      path: './audio/voiceover.wav',
      offset: 5, // Start at 5 seconds
      volume: 1.0
    }
  ],
  audioCodec: 'aac',
  audioBitrate: '192k'
});

Subtitles

const renderer = new Renderer({
  width: 1920,
  height: 1080,
  fps: 30,
  durationInSeconds: 10,
  subtitles: './subtitles.srt' // Burns subtitles into video
});
Subtitles require video transcoding. You cannot use videoCodec: 'copy' when subtitles are enabled.
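Because burned-in subtitles force a transcode, the constraint can be expressed as a small validation step. This is a sketch of our own, not the library's actual check:

```typescript
// Burning subtitles rewrites every frame, so stream copy is impossible:
// reject the combination of subtitles with videoCodec 'copy' up front.
function validateSubtitleOptions(videoCodec: string, subtitles?: string): void {
  if (subtitles && videoCodec === 'copy') {
    throw new Error("subtitles require transcoding; videoCodec 'copy' is not allowed");
  }
}
```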

Input props

Pass data to your composition at runtime:
const renderer = new Renderer({
  width: 1920,
  height: 1080,
  fps: 30,
  durationInSeconds: 10,
  inputProps: {
    title: 'My Video',
    username: 'johndoe',
    theme: 'dark'
  }
});
Access props in your composition:
// In your HTML composition
const props = window.__HELIOS_PROPS__;
console.log(props.title); // 'My Video'

Deterministic rendering

Ensure reproducible renders across different machines:
const renderer = new Renderer({
  width: 1920,
  height: 1080,
  fps: 30,
  durationInSeconds: 10,
  randomSeed: 0x12345678, // Seed for Math.random()
  webCodecsPreference: 'software' // Avoid hardware variance
});
The renderer automatically:
  • Overrides Math.random() with a seeded PRNG
  • Controls time progression deterministically
  • Prevents race conditions in async operations
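The Math.random() override can be pictured as a small seeded PRNG. The sketch below uses mulberry32 purely as an illustration of the idea; the renderer's actual generator may differ:

```typescript
// mulberry32: a tiny seeded PRNG. Given the same seed, the sequence of
// "random" numbers in [0, 1) is identical on every machine and every run.
function mulberry32(seed: number): () => number {
  let state = seed >>> 0;
  return () => {
    state = (state + 0x6d2b79f5) >>> 0;
    let t = state;
    t = Math.imul(t ^ (t >>> 15), t | 1);
    t ^= t + Math.imul(t ^ (t >>> 7), t | 61);
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

// Conceptually, the renderer then replaces the global source of randomness:
// Math.random = mulberry32(0x12345678);
```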

Browser configuration

const renderer = new Renderer({
  width: 1920,
  height: 1080,
  fps: 30,
  durationInSeconds: 10,
  browserConfig: {
    headless: false, // Show browser window
    executablePath: '/usr/bin/google-chrome', // Custom browser
    args: [
      '--disable-gpu',
      '--no-sandbox'
    ]
  }
});
Default browser arguments (always applied):
[
  '--use-gl=egl',
  '--ignore-gpu-blocklist',
  '--enable-gpu-rasterization',
  '--enable-zero-copy',
  '--disable-web-security',
  '--allow-file-access-from-files'
]

Error handling

try {
  await renderer.render(
    'file:///composition.html',
    './output.mp4',
    {
      onProgress: (p) => console.log(`${(p * 100).toFixed(0)}%`)
    }
  );
} catch (error) {
  if (error.message === 'Aborted') {
    console.log('Render was cancelled');
  } else {
    console.error('Render failed:', error);
  }
}
The renderer captures and propagates:
  • Page JavaScript errors
  • Browser crashes
  • FFmpeg encoding failures
  • Abort signal cancellations
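A common way to wire up the abort signal is a deadline helper around any signal-aware task. The helper below is a sketch of our own, not part of the API:

```typescript
// Run any signal-aware async task with a cancellation deadline.
// The task receives an AbortSignal that fires after `ms` milliseconds.
async function withDeadline<T>(
  task: (signal: AbortSignal) => Promise<T>,
  ms: number,
): Promise<T> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), ms);
  try {
    return await task(controller.signal);
  } finally {
    clearTimeout(timer); // avoid a stray abort after the task settles
  }
}
```

With it, `withDeadline((signal) => renderer.render(url, out, { signal }), 60_000)` would cancel a render that runs longer than a minute, surfacing the 'Aborted' error handled above.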

Console output

The Renderer logs detailed information during rendering:
Starting render for composition: file:///composition.html (Mode: canvas)
[Helios Diagnostics] FFmpeg Version: 4.4.2
[Helios Diagnostics] FFmpeg HW Accel: cuda, vaapi
Navigating to file:///composition.html...
Page loaded.
Running diagnostics...
[Helios Diagnostics] {"webCodecsSupported": true, "gpuInfo": {...}}
Preparing render strategy...
Strategy prepared.
Starting capture for 300 frames...
Progress: Rendered 30 / 300 frames
Progress: Rendered 60 / 300 frames
...
Finishing render strategy...
Finished sending frames. Closing FFmpeg stdin.
FFmpeg has finished processing.
Browser closed.
Render complete! Output saved to: ./output.mp4
