The FFmpeg muxer service provides WebAssembly-based media muxing capabilities to combine separate video, audio, and subtitle streams into a single container file.

Overview

This service wraps @ffmpeg/ffmpeg to:
  • Mux video and audio streams into MP4 containers
  • Embed WebVTT subtitles into MKV containers
  • Handle stream synchronization and codec configuration
  • Perform automatic cleanup of temporary files

Types

MuxRequest

Configuration object for the muxing operation.
  • ffmpeg (FFmpeg, required): FFmpeg instance to use for muxing operations
  • outputFileName (string, required): Name of the output file (e.g., “output.mp4” or “output.mkv”)
  • videoBlob (Blob, optional): Video stream data as a Blob. If omitted, only audio will be muxed
  • audioBlob (Blob, optional): Audio stream data as a Blob. If omitted, only video will be muxed
  • subtitleText (string, optional): WebVTT subtitle content as a string. When provided, output will be MKV format
  • subtitleLanguage (string, optional): ISO 639 language code for the subtitle track (e.g., “en”, “es”). Defaults to “und” (undetermined)
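The request shape above can be sketched as a TypeScript interface. This is an illustration of the documented fields, not the service's actual declaration; in the real service the `ffmpeg` field is typed as the `FFmpeg` class from @ffmpeg/ffmpeg, which `unknown` stands in for here to keep the sketch self-contained.

```typescript
// Sketch of the MuxRequest shape (assumed, based on the field list above).
interface MuxRequest {
  /** FFmpeg instance to run the mux with (required). */
  ffmpeg: unknown;
  /** Output filename, e.g. "output.mp4" or "output.mkv" (required). */
  outputFileName: string;
  /** Video stream; omit to mux audio only. */
  videoBlob?: Blob;
  /** Audio stream; omit to mux video only. */
  audioBlob?: Blob;
  /** WebVTT subtitle text; providing it forces MKV output. */
  subtitleText?: string;
  /** ISO 639 code for the subtitle track; defaults to "und". */
  subtitleLanguage?: string;
}
```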

MuxResult

Result of the muxing operation.
  • blob (Blob): The muxed media file as a Blob
  • mime (string): MIME type of the output file, either “video/mp4” or “video/x-matroska” (MKV)
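The result shape can likewise be sketched in TypeScript. The `MuxMime` union name is an assumption introduced here for clarity; the documented contract is only that `mime` is one of the two strings below.

```typescript
// Sketch of the MuxResult shape (assumed, based on the field list above).
type MuxMime = "video/mp4" | "video/x-matroska";

interface MuxResult {
  /** The muxed container file. */
  blob: Blob;
  /** "video/mp4" for MP4 output, "video/x-matroska" for MKV. */
  mime: MuxMime;
}
```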

Methods

muxStreams

Muxes video, audio, and subtitle streams into a single container file.
async function muxStreams({
  ffmpeg,
  outputFileName,
  videoBlob,
  audioBlob,
  subtitleText,
  subtitleLanguage,
}: MuxRequest): Promise<MuxResult>

Parameters

  • request (MuxRequest, required): Mux configuration object

Returns

Promise<MuxResult>: Resolves to the muxed file blob and its MIME type

Behavior

File handling:
  • Writes input streams to FFmpeg’s virtual filesystem as video.ts, audio.ts, and subtitles.vtt
  • Reads the output file after muxing completes
  • Automatically cleans up temporary files on both success and failure
Stream mapping:
  • Video + Audio: Maps both streams with -map 0:v:0 -map 1:a:0, copies codecs, applies AAC ADTS to ASC bitstream filter
  • Video only: Maps video stream with -map 0:v:0, copies video codec
  • Audio only: Maps audio stream, re-encodes to AAC at 192kbps with resampling to fix timing
Subtitle handling:
  • Subtitles are only supported with MKV containers (video/x-matroska)
  • WebVTT format is used for subtitle encoding
  • Language metadata is added to the subtitle track
Error handling:
  • Throws "No media to mux" if neither video nor audio blob is provided
  • Performs best-effort cleanup of temporary files even on failure
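The three stream-mapping cases above can be illustrated as an argument-list builder. This is a hypothetical sketch, not the service's actual internals: the function name is invented, and subtitle mapping is omitted for brevity. The input filenames match the virtual-filesystem names the service writes (video.ts, audio.ts), and the resulting arrays mirror the command examples shown further below.

```typescript
// Hypothetical sketch of how the documented stream-mapping cases translate
// into an FFmpeg argument list (subtitle handling omitted for brevity).
function buildMuxArgs(opts: {
  hasVideo: boolean;
  hasAudio: boolean;
  outputFileName: string;
}): string[] {
  const { hasVideo, hasAudio, outputFileName } = opts;
  if (!hasVideo && !hasAudio) throw new Error("No media to mux");

  const args = ["-y"];
  if (hasVideo) args.push("-i", "video.ts");
  if (hasAudio) args.push("-i", "audio.ts");

  if (hasVideo && hasAudio) {
    // Copy both codecs; convert AAC ADTS framing to ASC for the container.
    args.push("-map", "0:v:0", "-map", "1:a:0");
    args.push("-c:v", "copy", "-c:a", "copy", "-bsf:a", "aac_adtstoasc");
  } else if (hasVideo) {
    // Video only: copy the video codec.
    args.push("-map", "0:v:0", "-c:v", "copy");
  } else {
    // Audio only: re-encode to AAC at 192 kbps, resampling to fix timing.
    args.push(
      "-map", "0:a:0",
      "-c:a", "aac", "-b:a", "192k",
      "-af", "aresample=async=1:first_pts=0",
    );
  }

  args.push("-shortest", outputFileName);
  return args;
}
```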

Example

import { FFmpeg } from "@ffmpeg/ffmpeg";
import { muxStreams } from "./ffmpeg-muxer";

const ffmpeg = new FFmpeg();
await ffmpeg.load();

const result = await muxStreams({
  ffmpeg,
  outputFileName: "output.mp4",
  videoBlob: new Blob([videoData]),
  audioBlob: new Blob([audioData]),
});

const url = URL.createObjectURL(result.blob);
The -shortest flag is always applied to prevent the output from extending beyond the shortest input stream.
When including subtitles, the output format must be MKV. MP4 containers do not support embedded WebVTT subtitles.
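The container rule above can be captured in a small helper. This is a hypothetical convenience function, not part of the service's API: subtitles force an .mkv filename and the MKV MIME type, otherwise MP4 is used.

```typescript
// Hypothetical helper encoding the documented container rule:
// subtitles require MKV; otherwise MP4 is used.
function pickOutput(
  baseName: string,
  hasSubtitles: boolean,
): { fileName: string; mime: string } {
  return hasSubtitles
    ? { fileName: `${baseName}.mkv`, mime: "video/x-matroska" }
    : { fileName: `${baseName}.mp4`, mime: "video/mp4" };
}
```

A caller could use `pickOutput("output", Boolean(subtitleText))` to derive a consistent `outputFileName` before invoking `muxStreams`.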

FFmpeg command examples

Video + audio muxing

ffmpeg -y \
  -i video.ts \
  -i audio.ts \
  -map 0:v:0 -map 1:a:0 \
  -c:v copy -c:a copy \
  -bsf:a aac_adtstoasc \
  -shortest output.mp4

Audio-only with resampling

ffmpeg -y \
  -i audio.ts \
  -map 0:a:0 \
  -c:a aac -b:a 192k \
  -af aresample=async=1:first_pts=0 \
  -shortest output.mp4

Video + audio + subtitles

ffmpeg -y \
  -i video.ts \
  -i audio.ts \
  -i subtitles.vtt \
  -map 0:v:0 -map 1:a:0 -map 2:s:0 \
  -c:v copy -c:a copy -c:s webvtt \
  -bsf:a aac_adtstoasc \
  -metadata:s:s:0 language=en \
  -shortest output.mkv

Source location

~/workspace/source/src/background/src/services/ffmpeg-muxer.ts:26
