Helios provides comprehensive audio support with precise synchronization, multi-track mixing, and per-track volume control. Audio elements are automatically discovered and synchronized with the timeline.

Audio synchronization

Audio elements in your composition are automatically synchronized with Helios playback when using the DomDriver.
import { Helios } from '@helios-project/core';

const helios = new Helios({
  duration: 10,
  fps: 30,
  autoSyncAnimations: true // Enables DomDriver for audio sync
});

How it works

  1. Automatic discovery: The DomDriver scans the DOM for <audio> and <video> elements with data-helios-track attributes
  2. Synchronization: Media playback is kept in sync with the Helios timeline
  3. Precise seeking: Audio seeks to the exact position when you seek the Helios timeline
  4. Rate control: Playback rate changes are synchronized
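The steps above can be sketched as a simple drift-correction loop. This is an illustrative model only, not the actual DomDriver internals: each frame, the media element's clock is compared to the timeline, and a hard seek is issued only when drift exceeds a small threshold (to avoid constant micro-seeking).

```typescript
// Illustrative sketch of timeline-to-media sync (assumed model, not DomDriver source).
interface MediaLike {
  currentTime: number;   // seconds
  playbackRate: number;
}

const DRIFT_THRESHOLD = 0.05; // seconds of tolerated drift before a hard seek

function syncMedia(media: MediaLike, frame: number, fps: number, rate: number): void {
  const targetTime = frame / fps;
  // Only seek when the media clock has drifted noticeably from the timeline
  if (Math.abs(media.currentTime - targetTime) > DRIFT_THRESHOLD) {
    media.currentTime = targetTime;
  }
  // Keep playback-rate changes in sync
  media.playbackRate = rate;
}
```

The threshold value and correction strategy here are assumptions for illustration; the real driver may use different tolerances.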

Audio tracks

Identify audio elements with the data-helios-track attribute to enable automatic discovery and control.
function Composition() {
  return (
    <>
      <audio
        data-helios-track="background-music"
        src="/audio/background.mp3"
      />
      <audio
        data-helios-track="voiceover"
        src="/audio/narration.mp3"
      />
      <div>Visual content</div>
    </>
  );
}

Track metadata

Discovered audio tracks are exposed through the availableAudioTracks signal:
const tracks = helios.availableAudioTracks.value;
// [
//   { id: 'background-music', label: 'background-music' },
//   { id: 'voiceover', label: 'voiceover' }
// ]
id (string): Track identifier from the data-helios-track attribute
label (string): Human-readable track name
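A minimal TypeScript shape for these entries, inferred from the fields above (the library's own exported types may differ), along with a lookup helper:

```typescript
// Inferred shape of a discovered track entry (an assumption, not the exported type).
interface AudioTrackMetadata {
  id: string;    // track identifier from data-helios-track
  label: string; // human-readable track name
}

// Example: look up a discovered track by id.
function findTrack(
  tracks: AudioTrackMetadata[],
  id: string
): AudioTrackMetadata | undefined {
  return tracks.find((t) => t.id === id);
}
```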

Volume control

Control audio volume at both the global and per-track level.

Global volume

// Set master volume (0.0 to 1.0)
helios.setAudioVolume(0.8);

// Get current volume
const volume = helios.volume.value;

// Mute/unmute all audio
helios.setAudioMuted(true);
const isMuted = helios.muted.value;

Per-track volume

// Set volume for specific track
helios.setAudioTrackVolume('background-music', 0.5);

// Mute/unmute specific track
helios.setAudioTrackMuted('voiceover', true);

// Get track state
const trackState = helios.audioTracks.value['background-music'];
// { volume: 0.5, muted: false }

Audio state

The audio state is exposed through Helios signals and state:
helios.subscribe((state) => {
  console.log('Master volume:', state.volume);
  console.log('Master muted:', state.muted);
  console.log('Track states:', state.audioTracks);
  console.log('Available tracks:', state.availableAudioTracks);
});
volume (number): Master volume level (0.0 to 1.0)
muted (boolean): Master mute state
audioTracks (Record<string, AudioTrackState>): Per-track volume and mute states
availableAudioTracks (AudioTrackMetadata[]): List of discovered audio tracks
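Master and per-track settings combine when audio is rendered. A plausible mixing rule, shown here as an assumption for illustration (Helios may combine them differently), multiplies the two volumes and treats either mute flag as silencing the track:

```typescript
// Illustrative gain model (assumed: volumes multiply, any mute wins).
interface AudioTrackState {
  volume: number; // 0.0 to 1.0
  muted: boolean;
}

function effectiveGain(
  masterVolume: number,
  masterMuted: boolean,
  track: AudioTrackState
): number {
  // Either mute flag silences the track entirely
  if (masterMuted || track.muted) return 0;
  // Otherwise master and track volumes attenuate multiplicatively
  return masterVolume * track.volume;
}
```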

Audio visualization

Access the Web Audio API for audio analysis and visualization.

Getting the audio context

const audioContext = await helios.getAudioContext();

if (audioContext) {
  // Use Web Audio API for visualization
}

Getting track source nodes

const audioContext = await helios.getAudioContext();
const sourceNode = await helios.getAudioSourceNode('background-music');

if (audioContext && sourceNode) {
  // Connect to an analyser or other audio nodes
  const analyser = audioContext.createAnalyser();
  sourceNode.connect(analyser);
}

Example: Waveform visualization

import { useEffect, useRef, useState } from 'react';
import { useVideoFrame } from './hooks/useVideoFrame';

function AudioVisualizer({ helios, trackId }) {
  const frame = useVideoFrame(helios);
  const canvasRef = useRef(null);
  const [analyser, setAnalyser] = useState(null);
  
  useEffect(() => {
    async function setup() {
      const context = await helios.getAudioContext();
      const sourceNode = await helios.getAudioSourceNode(trackId);
      
      if (context && sourceNode) {
        const analyserNode = context.createAnalyser();
        analyserNode.fftSize = 2048;
        sourceNode.connect(analyserNode);
        setAnalyser(analyserNode);
      }
    }
    setup();
  }, [helios, trackId]);
  
  useEffect(() => {
    if (!analyser || !canvasRef.current) return;
    
    const canvas = canvasRef.current;
    const ctx = canvas.getContext('2d');
    const bufferLength = analyser.frequencyBinCount;
    const dataArray = new Uint8Array(bufferLength);
    
    analyser.getByteTimeDomainData(dataArray);
    
    // Clear canvas
    ctx.fillStyle = 'rgb(0, 0, 0)';
    ctx.fillRect(0, 0, canvas.width, canvas.height);
    
    // Draw waveform
    ctx.lineWidth = 2;
    ctx.strokeStyle = 'rgb(0, 255, 0)';
    ctx.beginPath();
    
    const sliceWidth = canvas.width / bufferLength;
    let x = 0;
    
    for (let i = 0; i < bufferLength; i++) {
      const v = dataArray[i] / 128.0;
      const y = (v * canvas.height) / 2;
      
      if (i === 0) {
        ctx.moveTo(x, y);
      } else {
        ctx.lineTo(x, y);
      }
      
      x += sliceWidth;
    }
    
    ctx.lineTo(canvas.width, canvas.height / 2);
    ctx.stroke();
  }, [analyser, frame]);
  
  return <canvas ref={canvasRef} width={800} height={200} />;
}

Manual track management

In headless or server environments where DOM scanning isn’t available, manually specify available tracks:
const helios = new Helios({
  duration: 10,
  fps: 30,
  availableAudioTracks: [
    { id: 'track-1', label: 'Background Music' },
    { id: 'track-2', label: 'Voiceover' }
  ],
  audioTracks: {
    'track-1': { volume: 0.7, muted: false },
    'track-2': { volume: 1.0, muted: false }
  }
});

// Or update at runtime
helios.setAvailableAudioTracks([
  { id: 'new-track', label: 'Sound Effects' }
]);

Common patterns

Audio mixer UI

import { useState, useEffect } from 'react';

function AudioMixer({ helios }) {
  const [tracks, setTracks] = useState([]);
  const [trackStates, setTrackStates] = useState({});
  
  useEffect(() => {
    return helios.subscribe((state) => {
      setTracks(state.availableAudioTracks);
      setTrackStates(state.audioTracks);
    });
  }, [helios]);
  
  return (
    <div>
      <h3>Audio Mixer</h3>
      {tracks.map((track) => {
        const state = trackStates[track.id] || { volume: 1, muted: false };
        
        return (
          <div key={track.id}>
            <label>{track.label}</label>
            <input
              type="range"
              min="0"
              max="1"
              step="0.01"
              value={state.volume}
              onChange={(e) => {
                helios.setAudioTrackVolume(track.id, parseFloat(e.target.value));
              }}
            />
            <button
              onClick={() => {
                helios.setAudioTrackMuted(track.id, !state.muted);
              }}
            >
              {state.muted ? 'Unmute' : 'Mute'}
            </button>
          </div>
        );
      })}
    </div>
  );
}

Dynamic audio ducking

import { useEffect } from 'react';

function DynamicAudioDucking({ helios, frame }) {
  const scenes = [
    { from: 0, hasVoiceover: false },
    { from: 90, hasVoiceover: true },  // Duck music during voiceover
    { from: 180, hasVoiceover: false },
    { from: 270, hasVoiceover: true }
  ];
  
  const currentScene = scenes.findLast(s => frame >= s.from);
  
  useEffect(() => {
    if (currentScene?.hasVoiceover) {
      helios.setAudioTrackVolume('background-music', 0.3); // Duck to 30%
    } else {
      helios.setAudioTrackVolume('background-music', 1.0); // Full volume
    }
  }, [currentScene?.hasVoiceover, helios]); // scenes is rebuilt each render, so depend on the flag
  
  return null;
}

Audio-reactive animations

import { useState, useEffect } from 'react';

function AudioReactiveCircle({ helios, trackId }) {
  const [volume, setVolume] = useState(0);
  
  useEffect(() => {
    let rafId;
    
    async function setup() {
      const context = await helios.getAudioContext();
      const sourceNode = await helios.getAudioSourceNode(trackId);
      
      if (!context || !sourceNode) return;
      
      const analyser = context.createAnalyser();
      analyser.fftSize = 256;
      sourceNode.connect(analyser);
      
      const dataArray = new Uint8Array(analyser.frequencyBinCount);
      
      function update() {
        analyser.getByteFrequencyData(dataArray);
        const average = dataArray.reduce((a, b) => a + b) / dataArray.length;
        setVolume(average / 255);
        rafId = requestAnimationFrame(update);
      }
      
      update();
    }
    
    setup();
    return () => cancelAnimationFrame(rafId);
  }, [helios, trackId]);
  
  const scale = 1 + volume * 2;
  
  return (
    <div style={{
      width: 100,
      height: 100,
      borderRadius: '50%',
      background: 'cyan',
      transform: `scale(${scale})`
    }} />
  );
}

Programmatic audio control

import { useEffect } from 'react';

function AudioController({ helios, frame }) {
  useEffect(() => {
    const endFrame = helios.duration.value * helios.fps.value;
    let volume = 1;

    // Fade in over the first 30 frames
    if (frame < 30) {
      volume = frame / 30;
    }

    // Fade out over the last 30 frames
    if (frame > endFrame - 30) {
      volume = Math.max(0, (endFrame - frame) / 30);
    }

    helios.setAudioTrackVolume('background-music', volume);
  }, [frame, helios]);

  return null;
}
