This example demonstrates how to create synchronized audio visualizations by analyzing audio data and rendering waveforms and visual effects that respond to the audio content.

Overview

The audio visualization example shows:
  • Synchronous audio buffer creation for deterministic rendering
  • Real-time waveform visualization from audio samples
  • Volume-based visual effects using RMS analysis
  • Frame-perfect audio-visual synchronization

Complete implementation

composition.html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <title>Audio Visualization</title>
  <style>
    body, html { margin: 0; padding: 0; width: 100%; height: 100%; overflow: hidden; background: #000; }
    canvas { width: 100%; height: 100%; display: block; }
  </style>
</head>
<body>
  <canvas id="canvas"></canvas>
  <script type="module" src="./src/main.ts"></script>
</body>
</html>
main.ts
import { Helios } from '@helios-project/core';

// 1. Setup Audio Buffer synchronously
const sampleRate = 44100;
const duration = 10;
const AudioContextClass = (window.AudioContext || (window as any).webkitAudioContext) as typeof AudioContext;
const ctx = new AudioContextClass({ sampleRate });
const buffer = ctx.createBuffer(1, sampleRate * duration, sampleRate);
const data = buffer.getChannelData(0);

// Fill data: Sine sweep + Beats
for (let i = 0; i < data.length; i++) {
    const t = i / sampleRate;
    // Frequency sweep from 100Hz to 1000Hz
    const freq = 100 + (900 * t / duration);
    const sine = Math.sin(2 * Math.PI * freq * t);

    // Beat every 0.5s
    const beatFreq = 2; // Hz
    const beatEnv = Math.exp(-10 * (t * beatFreq % 1)); // Decay envelope
    const kick = Math.sin(2 * Math.PI * 60 * t) * beatEnv;

    data[i] = (sine * 0.5) + (kick * 0.5);
}

// 2. Setup Canvas
const canvas = document.getElementById('canvas') as HTMLCanvasElement;
const canvasCtx = canvas.getContext('2d')!;

function resize() {
    canvas.width = window.innerWidth;
    canvas.height = window.innerHeight;
}
window.addEventListener('resize', resize);
resize();

// 3. Setup Helios
const helios = new Helios({
    fps: 30,
    duration: duration
});

helios.bindToDocumentTimeline();
(window as any).helios = helios;

// 4. Draw Loop
function draw(frame: number) {
    const time = frame / helios.fps.value;
    const { width, height } = canvas;

    // Clear
    canvasCtx.fillStyle = '#111';
    canvasCtx.fillRect(0, 0, width, height);

    // Calculate Sample Window
    const centerSample = Math.floor(time * sampleRate);
    const windowSize = 1024; // Samples to visualize
    const startSample = Math.max(0, centerSample - windowSize / 2);
    const endSample = Math.min(data.length, centerSample + windowSize / 2);

    // Analyze: RMS
    let sumSquares = 0;
    for (let i = startSample; i < endSample; i++) {
        sumSquares += data[i] * data[i];
    }
    const rms = Math.sqrt(sumSquares / (endSample - startSample || 1));

    // Draw Pulsating Circle (Volume)
    const radius = 50 + (rms * 300);
    canvasCtx.beginPath();
    canvasCtx.arc(width / 2, height / 2, radius, 0, Math.PI * 2);
    canvasCtx.fillStyle = `rgba(255, 50, 50, ${0.5 + rms})`;
    canvasCtx.fill();

    // Draw Waveform
    canvasCtx.beginPath();
    canvasCtx.strokeStyle = '#00ffcc';
    canvasCtx.lineWidth = 2;

    for (let i = 0; i < windowSize; i++) {
        const idx = startSample + i;
        if (idx >= data.length) break;

        const sample = data[idx];
        const y = (height / 2) + (sample * (height / 4));
        const x = (i / windowSize) * width;

        if (i === 0) canvasCtx.moveTo(x, y);
        else canvasCtx.lineTo(x, y);
    }
    canvasCtx.stroke();

    // Draw Time Info
    canvasCtx.fillStyle = '#fff';
    canvasCtx.font = '20px monospace';
    canvasCtx.fillText(`Time: ${time.toFixed(2)}s`, 20, 30);
}

helios.subscribe((state: { currentFrame: number }) => draw(state.currentFrame));

Key patterns

Synchronous audio buffer creation

Create audio buffers synchronously before Helios initialization to ensure deterministic rendering:
const sampleRate = 44100;
const duration = 10;
const ctx = new AudioContext({ sampleRate });
const buffer = ctx.createBuffer(1, sampleRate * duration, sampleRate);
const data = buffer.getChannelData(0);
This approach ensures that audio data is available immediately during frame rendering without async delays.

Sample window calculation

Calculate which audio samples to visualize based on current time:
const time = frame / helios.fps.value;
const centerSample = Math.floor(time * sampleRate);
const windowSize = 1024;
const startSample = Math.max(0, centerSample - windowSize / 2);
const endSample = Math.min(data.length, centerSample + windowSize / 2);
This creates a moving window of audio samples that stays synchronized with playback.
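For concreteness, the same calculation can be factored into a pure helper (the `computeWindow` name is illustrative, not part of Helios). At 30 fps, frame 30 maps to t = 1 s, so the window is centered on sample 44100:

```typescript
// Pure helper mapping a frame number to a sample window (illustrative name).
function computeWindow(
    frame: number,
    fps: number,
    sampleRate: number,
    windowSize: number,
    totalSamples: number
): { start: number; end: number } {
    const time = frame / fps;
    const centerSample = Math.floor(time * sampleRate);
    return {
        start: Math.max(0, centerSample - windowSize / 2),
        end: Math.min(totalSamples, centerSample + windowSize / 2),
    };
}

// Frame 30 at 30 fps = 1s into a 10s, 44.1kHz buffer:
console.log(computeWindow(30, 30, 44100, 1024, 441000)); // { start: 43588, end: 44612 }
```

The `Math.max`/`Math.min` clamps keep the window valid at the very start and end of the buffer, where a centered window would otherwise run out of bounds.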

RMS volume analysis

Calculate root mean square (RMS) for volume-based effects:
let sumSquares = 0;
for (let i = startSample; i < endSample; i++) {
    sumSquares += data[i] * data[i];
}
const rms = Math.sqrt(sumSquares / (endSample - startSample || 1));
RMS tracks perceived loudness much better than peak amplitude does, which makes it a good driver for volume-based effects.
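As a sanity check, the RMS of a full-scale sine wave is 1/√2 ≈ 0.707. The following standalone sketch wraps the same loop in a function and verifies that:

```typescript
// Compute RMS over a Float32Array window (same formula as above).
function computeRms(data: Float32Array, start: number, end: number): number {
    let sumSquares = 0;
    for (let i = start; i < end; i++) {
        sumSquares += data[i] * data[i];
    }
    return Math.sqrt(sumSquares / (end - start || 1));
}

// Sanity check: one full period of a unit-amplitude sine has RMS = 1/sqrt(2).
const n = 44100;
const sine = new Float32Array(n);
for (let i = 0; i < n; i++) sine[i] = Math.sin((2 * Math.PI * i) / n);
console.log(computeRms(sine, 0, n).toFixed(3)); // "0.707"
```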

Waveform rendering

Draw audio waveforms by mapping samples to canvas coordinates:
for (let i = 0; i < windowSize; i++) {
    const idx = startSample + i;
    if (idx >= data.length) break;

    const sample = data[idx];
    const y = (height / 2) + (sample * (height / 4));
    const x = (i / windowSize) * width;

    if (i === 0) canvasCtx.moveTo(x, y);
    else canvasCtx.lineTo(x, y);
}
canvasCtx.stroke();

Performance tips

Optimize sample window size

Balance visual detail with performance by choosing appropriate window sizes:
  • Small windows (256-512 samples): Fast rendering, less detail
  • Medium windows (1024-2048 samples): Good balance for most use cases
  • Large windows (4096+ samples): High detail, may impact performance
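Window size maps directly to how much audio time each frame shows, so a quick helper makes the trade-off concrete (at 44.1 kHz, a 1024-sample window covers about 23 ms of audio):

```typescript
// How many milliseconds of audio a window covers at a given sample rate.
function windowDurationMs(windowSize: number, sampleRate: number): number {
    return (windowSize / sampleRate) * 1000;
}

console.log(windowDurationMs(256, 44100).toFixed(1));  // "5.8"
console.log(windowDurationMs(1024, 44100).toFixed(1)); // "23.2"
console.log(windowDurationMs(4096, 44100).toFixed(1)); // "92.9"
```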

Use typed arrays efficiently

Audio sample data is stored in Float32Array. Access it directly without creating intermediate arrays:
// Good - direct access
const sample = data[idx];

// Avoid - creates intermediate array
const samples = Array.from(data.slice(startSample, endSample));

Cache audio analysis results

For complex visualizations, cache analysis results per frame:
const analysisCache = new Map();

function getAnalysis(frame: number) {
    if (!analysisCache.has(frame)) {
        // Perform expensive analysis
        const result = analyzeFrame(frame);
        analysisCache.set(frame, result);
    }
    return analysisCache.get(frame);
}

Limit canvas redraws

Only redraw elements that change between frames. Use layered canvases for static backgrounds:
// Static background canvas (drawn once)
const bgCanvas = document.createElement('canvas');
const bgCtx = bgCanvas.getContext('2d')!;
// Draw background once

// Animation canvas (redrawn each frame)
function draw(frame: number) {
    // Copy static background
    canvasCtx.drawImage(bgCanvas, 0, 0);
    // Draw dynamic waveform
    drawWaveform();
}

Advanced techniques

Frequency analysis with FFT

For frequency-based visualizations, implement FFT analysis:
import FFT from 'fft.js'; // fft.js uses a default export

const fftSize = 2048;
const fft = new FFT(fftSize);
const samples = new Array(fftSize);
const spectrum = fft.createComplexArray();

function analyzeFrequency(startSample: number) {
    // Fill samples array
    for (let i = 0; i < fftSize; i++) {
        samples[i] = data[startSample + i] || 0;
    }
    
    // Transform to frequency domain
    fft.realTransform(spectrum, samples);
    
    // Use spectrum data for visualization
    return spectrum;
}
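`realTransform` writes an interleaved complex array `[re0, im0, re1, im1, ...]`, but for visualization you usually want per-bin magnitudes. Here is a standalone sketch of that conversion (pure math, so it does not need fft.js; the `toMagnitudes` name is illustrative):

```typescript
// Convert an interleaved complex array [re0, im0, re1, im1, ...] into
// per-bin magnitudes sqrt(re^2 + im^2). For real input, only the first
// fftSize / 2 bins carry unique information.
function toMagnitudes(spectrum: number[], bins: number): number[] {
    const magnitudes = new Array<number>(bins);
    for (let i = 0; i < bins; i++) {
        const re = spectrum[2 * i];
        const im = spectrum[2 * i + 1];
        magnitudes[i] = Math.sqrt(re * re + im * im);
    }
    return magnitudes;
}

console.log(toMagnitudes([3, 4, 0, 1], 2)); // [5, 1]
```

Each magnitude can then be mapped to a bar height or color intensity per frequency bin.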

Multi-channel audio

Visualize stereo or multi-channel audio:
const buffer = ctx.createBuffer(2, sampleRate * duration, sampleRate);
const leftChannel = buffer.getChannelData(0);
const rightChannel = buffer.getChannelData(1);

function draw(frame: number) {
    // drawWaveform is the waveform loop from above, parameterized by
    // channel data and a vertical center line
    drawWaveform(leftChannel, height / 4);
    drawWaveform(rightChannel, (height * 3) / 4);
}

Beat detection

Implement simple beat detection for rhythm-based effects:
function detectBeat(startSample: number, windowSize: number, averageEnergy: number): boolean {
    let energy = 0;
    for (let i = 0; i < windowSize; i++) {
        const sample = data[startSample + i] || 0;
        energy += sample * sample;
    }

    // A beat occurs when instantaneous energy exceeds the historical
    // average (a running mean of recent window energies, maintained by the caller)
    const threshold = averageEnergy * 1.5;
    return energy > threshold;
}
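One simple way to maintain the historical average is a small buffer of recent window energies. This is a minimal sketch, and the `EnergyHistory` name is illustrative rather than part of Helios:

```typescript
// Running-average tracker for window energies (illustrative helper).
// Keeps the last `capacity` energies and exposes their mean.
class EnergyHistory {
    private energies: number[] = [];

    constructor(private capacity: number = 43) {} // ~1s of history at ~43 windows/s

    push(energy: number): void {
        this.energies.push(energy);
        if (this.energies.length > this.capacity) this.energies.shift();
    }

    average(): number {
        if (this.energies.length === 0) return 0;
        return this.energies.reduce((a, b) => a + b, 0) / this.energies.length;
    }
}

// Usage: push each window's energy, then compare new energies to the mean.
const history = new EnergyHistory(4);
[1, 1, 1, 9].forEach((e) => history.push(e));
console.log(history.average()); // 3
```

The capacity controls how quickly the baseline adapts: a shorter history reacts faster to volume changes but produces more false positives on sustained loud passages.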
