

Visual-inertial sensing fuses two complementary modalities: cameras provide dense external measurements of the environment, while IMUs measure the platform’s internal ego-motion. The IMU is crucial for estimator robustness and for resolving scale in monocular setups. This fusion, however, is only as good as the calibration between the two sensors: small errors in the camera-IMU extrinsics, time offset, or IMU noise parameters can rapidly degrade performance on dynamic trajectories.
OpenVINS can refine the camera extrinsics, intrinsics, and camera-IMU time offset online via the calib_cam_extrinsics, calib_cam_intrinsics, and calib_cam_timeoffset flags in estimator_config.yaml. Even so, starting from a well-calibrated offline result leads to faster convergence and more accurate trajectories.
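The three flags named above sit in estimator_config.yaml; a minimal sketch of how they might appear (surrounding keys omitted, values chosen for illustration):

```yaml
# estimator_config.yaml (fragment) -- enable online refinement of the
# calibration on top of a good offline initial guess
calib_cam_extrinsics: true    # refine the camera-IMU rigid transform
calib_cam_intrinsics: true    # refine focal length, principal point, distortion
calib_cam_timeoffset: true    # refine the camera-IMU time offset
```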

Camera Intrinsic Calibration

Camera intrinsic calibration recovers the focal length, principal point, and lens distortion coefficients. OpenVINS supports both the pinhole-radtan (radial-tangential) and pinhole-equi (equidistant/fisheye) camera models. The Kalibr toolbox is the recommended tool for this step.
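To make the pinhole-radtan model concrete, here is a minimal sketch of its forward distortion applied to a normalized image point. The coefficient names (k1, k2, p1, p2) follow the usual radial-tangential convention; all numeric values below are made up for illustration:

```python
def distort_radtan(x, y, k1, k2, p1, p2):
    """Apply radial-tangential distortion to a normalized point (x, y)."""
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 * r2                       # radial term
    x_d = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    y_d = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return x_d, y_d

# Distort, then project with made-up intrinsics: u = fx * x_d + cx, v = fy * y_d + cy
x_d, y_d = distort_radtan(0.1, -0.05, k1=-0.28, k2=0.07, p1=1e-4, p2=-2e-4)
u = 458.0 * x_d + 367.0
v = 457.0 * y_d + 248.0
```

The pinhole-equi model replaces the polynomial in r² with a polynomial in the incidence angle, which behaves better for wide-angle and fisheye lenses.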
Step 1: Install Kalibr

Clone and build the Kalibr toolbox in your ROS workspace:
cd ~/catkin_ws/src
git clone https://github.com/ethz-asl/kalibr.git
cd ~/catkin_ws && catkin build kalibr
Step 2: Print a calibration target

Print the 6×6 Aprilgrid target (roughly 0.8 × 0.8 m) on an A0 sheet. Flat, rigid mounting is essential — any warping in the target introduces calibration error.
Step 3: Prepare your sensor

Before recording, configure the camera for sharp images:
  • Lower exposure time to minimize motion blur (high framerate helps)
  • Verify the camera is in focus using kalibr_camera_focus or manual inspection
  • Ensure the calibration target fits fully inside the image at multiple distances and orientations
Step 4: Record a calibration bag

Move either the board or the camera so the target appears at different positions, orientations, and distances across the full image plane. A variety of viewing angles is critical for recovering distortion coefficients.
rosbag record -O static_calib.bag /camera/image_raw
Step 5: Run the camera calibration

Choose the camera model based on your lens type:
kalibr_calibrate_cameras \
  --bag static_calib.bag \
  --topics /camera/image_raw \
  --models pinhole-radtan \
  --target aprilgrid_6x6.yaml \
  --bag-freq 10.0
The --bag-freq 10.0 flag subsamples the bag to reduce processing time, which can otherwise take several hours on large datasets.
Step 6: Inspect the results

Review the output PDF report. A well-calibrated camera should show reprojection errors below 0.2–0.5 pixels. If online calibration is planned, values up to 1 pixel may be acceptable, but a tighter offline result always leads to better online convergence.
  • Move slowly and smoothly to avoid motion blur, especially at low framerates or with long exposure times.
  • Cover all corners of the image — distortion is largest at the periphery and cannot be estimated if the target never appears there.
  • Vary the distance between the target and the camera so that the focal length is well-constrained.
  • Do not move just one axis — tilt the board in multiple directions to ensure good angular coverage.
  • An example calibration script, dataset, and config can be found in the RPNG ar_table_dataset repository: calibrate_camera_static.sh.

IMU Noise Calibration

The IMU noise model used by OpenVINS (and Kalibr for dynamic calibration) requires four parameters characterizing the stochastic behavior of the gyroscope and accelerometer. These values enter directly into the filter’s covariance propagation.
| Parameter | YAML key | Symbol | Units |
|---|---|---|---|
| Gyroscope white noise | gyroscope_noise_density | σ_g | rad/s/√Hz |
| Accelerometer white noise | accelerometer_noise_density | σ_a | m/s²/√Hz |
| Gyroscope random walk | gyroscope_random_walk | σ_bg | rad/s²/√Hz |
| Accelerometer random walk | accelerometer_random_walk | σ_ba | m/s³/√Hz |
These four values are recovered from an Allan variance analysis. Plotting the Allan deviation against averaging time τ reveals two characteristic slopes: a line with slope −1/2 on the left (read at τ=1 s for white noise) and a line with slope +1/2 on the right (read at τ=3 s for random walk). The allan_variance_ros project automates this process.
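The slope-reading procedure can be sketched numerically. Below is a minimal non-overlapping Allan deviation applied to synthetic white noise; the 200 Hz rate and 0.005 rad/s/√Hz density are made-up values, and for pure white noise the Allan deviation read at τ = 1 s recovers the density directly:

```python
import math
import random

def allan_deviation(samples, fs, tau):
    """Non-overlapping Allan deviation of a sample stream at averaging time tau."""
    m = int(round(tau * fs))                     # samples per cluster
    k = len(samples) // m                        # number of clusters
    means = [sum(samples[i * m:(i + 1) * m]) / m for i in range(k)]
    avar = sum((means[i + 1] - means[i]) ** 2 for i in range(k - 1)) / (2.0 * (k - 1))
    return math.sqrt(avar)

# Synthetic 200 Hz gyro stream that is pure white noise with a known density
random.seed(0)
fs = 200.0
noise_density = 0.005                            # rad/s/sqrt(Hz), made up
white = [random.gauss(0.0, noise_density * math.sqrt(fs)) for _ in range(60000)]

sigma_g = allan_deviation(white, fs, tau=1.0)
print(sigma_g)                                   # lands near noise_density
```

A real analysis sweeps τ over decades and fits the −1/2 and +1/2 slope regions, which is exactly what allan_variance_ros automates.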
Step 1: Install allan_variance_ros

cd ~/catkin_ws/src
git clone https://github.com/ori-drs/allan_variance_ros.git
cd ~/catkin_ws && catkin build allan_variance_ros
Step 2: Collect a stationary dataset

Place the sensor completely still on a stable surface and record at least 2–3 hours of IMU data. Longer is better — the authors recommend up to 20 hours for a reliable random walk estimate.
rosbag record -O imu_static.bag /imu0
Step 3: Create the config file

Create a YAML config specifying your IMU topic, rate, and dataset duration:
imu_topic: /imu0
imu_rate: 200
measure_rate: 200
sequence_time: 7200  # seconds (2 hours)
Step 4: Run the analysis

# Generate the Allan variance data files
# (allan_variance expects the path to the folder containing the bag, not the bag itself)
rosrun allan_variance_ros allan_variance . config.yaml

# Compute and plot the noise parameters
rosrun allan_variance_ros analysis.py --data allan_variance.csv
The script outputs the four noise density values directly.
Step 5: Add to your IMU config

Write the recovered values into your kalibr_imu_chain.yaml. The EuRoC MAV ADIS16448 IMU uses:
imu0:
  accelerometer_noise_density: 2.0000e-3   # m/s^2/sqrt(Hz)
  accelerometer_random_walk:   3.0000e-3   # m/s^3/sqrt(Hz)
  gyroscope_noise_density:     1.6968e-04  # rad/s/sqrt(Hz)
  gyroscope_random_walk:       1.9393e-05  # rad/s^2/sqrt(Hz)
  update_rate: 200.0
  model: "kalibr"
Inflate the recovered noise values by 10–20× in practice. This accounts for unmodelled errors (vibration, temperature drift, quantization) and keeps the filter from being overconfident. Test different inflation levels against ground truth to find the best tradeoff for your hardware.
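The inflation itself is simple arithmetic; a sketch using made-up Allan-variance outputs (real values come from your own analysis) and a 10× factor:

```python
# Made-up Allan-variance results; replace with your own analysis output
measured = {
    "gyroscope_noise_density":     1.7e-05,   # rad/s/sqrt(Hz)
    "gyroscope_random_walk":       1.9e-06,   # rad/s^2/sqrt(Hz)
    "accelerometer_noise_density": 2.0e-04,   # m/s^2/sqrt(Hz)
    "accelerometer_random_walk":   3.0e-04,   # m/s^3/sqrt(Hz)
}
inflation = 10.0                              # try values in the 10-20x range
inflated = {key: value * inflation for key, value in measured.items()}
for key, value in sorted(inflated.items()):
    print(f"{key}: {value:.4e}")              # paste into kalibr_imu_chain.yaml
```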
An example script is available at calibrate_imu.sh in the RPNG ar_table_dataset repository.

Camera-IMU Extrinsic Calibration

With intrinsics and IMU noise in hand, the final step is to estimate the rigid transform between the camera and the IMU — the rotation R_ItoC and translation p_CinI. Kalibr’s kalibr_calibrate_imu_camera tool solves this by fitting an SE(3) B-spline to the synchronized camera and IMU data.
Camera-IMU extrinsic calibration requires exciting all axes of the IMU. Per the degenerate motion analysis of Yang et al. (2019), the trajectory needs at least one axis of translational motion and orientation change about at least two axes for these parameters to be observable. Pure translation or rotation-only motion results in an unobservable (or poorly conditioned) calibration.
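Frame conventions for this transform differ between tools, and you will often need the inverse direction of whatever your calibration reports. Inverting a 4×4 homogeneous transform [[R, p], [0, 1]] gives [[Rᵀ, −Rᵀp], [0, 1]]; a sketch with a made-up example matrix (the variable names assume a transform mapping IMU-frame points into the camera frame):

```python
def se3_inverse(T):
    """Invert a 4x4 homogeneous transform [[R, p], [0, 1]] as [[R^T, -R^T p], [0, 1]]."""
    R = [row[:3] for row in T[:3]]
    p = [row[3] for row in T[:3]]
    Rt = [[R[j][i] for j in range(3)] for i in range(3)]                  # transpose
    p_inv = [-sum(Rt[i][j] * p[j] for j in range(3)) for i in range(3)]   # -R^T p
    return [Rt[i] + [p_inv[i]] for i in range(3)] + [[0.0, 0.0, 0.0, 1.0]]

# Made-up example: 90-degree rotation about z plus a small lever arm
T_cam_imu = [[0.0, -1.0, 0.0,  0.05],
             [1.0,  0.0, 0.0,  0.01],
             [0.0,  0.0, 1.0, -0.02],
             [0.0,  0.0, 0.0,  1.00]]
T_imu_cam = se3_inverse(T_cam_imu)
```

Composing the two should recover the identity, which is a quick sanity check on any hand-converted calibration.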
Step 1: Prepare your calibration target and sensor

Use the same Aprilgrid target from the intrinsic step. Configure the camera for:
  • Low exposure time to minimize motion blur during dynamic motion
  • Image rate of 20–30 Hz (high enough for good B-spline fitting)
  • IMU rate of 200–500 Hz (the higher the better for spline accuracy)
Step 2: Create an imu.yaml file

Kalibr needs an IMU description file with your measured noise parameters:
imu0:
  accelerometer_noise_density: 2.0e-3
  accelerometer_random_walk: 3.0e-3
  gyroscope_noise_density: 1.6968e-4
  gyroscope_random_walk: 1.9393e-5
  rostopic: /imu0
  update_rate: 200.0
A template imu.yaml is available in the Kalibr documentation.
Step 3: Record a dynamic calibration bag

Move the sensor smoothly and continuously while keeping the Aprilgrid in view. Aim for:
  • 30–60 seconds of data
  • Rotations about all three axes (pitch, roll, yaw)
  • Some translational movement as well
  • Smooth, non-jerky motions — abrupt movements degrade the spline fit
rosbag record -O dynamic_calib.bag /camera/image_raw /imu0
Step 4: Run the camera-IMU calibration

kalibr_calibrate_imu_camera \
  --bag dynamic_calib.bag \
  --cam camchain.yaml \
  --imu imu.yaml \
  --target aprilgrid_6x6.yaml
This can take several hours depending on dataset length and machine speed.
Step 5: Validate the output

Inspect the output PDF carefully:
  • The B-spline fit to the IMU data should look smooth and track the raw measurements closely
  • Accelerometer and gyroscope residuals should stay within their 3σ bounds
  • Estimated biases should not drift outside the 3σ envelope (if they do, your trajectory was too aggressive or your noise values are wrong)
  • Cross-check the estimated rotation and translation with hand-measured values
  • Inspect the IMU time dt plot in the PDF report carefully. Irregular IMU timestamps cause spline fitting artifacts.
  • If the accelerometer or gyroscope residuals exceed 3σ, try recollecting with gentler motion or increasing your noise parameter estimates.
  • More orientation excitation is always better — practice a figure-eight or random rotation sequence before recording.
  • An example script is available at calibrate_camera_dynamic.sh.
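The IMU timestamp check in the list above can be scripted once the stamps are exported from the bag; a sketch, assuming a plain list of timestamps in seconds:

```python
def check_imu_timestamps(stamps, expected_rate_hz, tol=0.2):
    """Return (index, dt) pairs where the gap deviates from the nominal
    period by more than tol (as a fraction of the period)."""
    nominal_dt = 1.0 / expected_rate_hz
    dts = [b - a for a, b in zip(stamps, stamps[1:])]
    return [(i, dt) for i, dt in enumerate(dts)
            if abs(dt - nominal_dt) > tol * nominal_dt]

# Made-up 200 Hz stream with one dropped message (double-length gap)
stamps = [i * 0.005 for i in range(1000)]
del stamps[500]
gaps = check_imu_timestamps(stamps, expected_rate_hz=200.0)
print(gaps)   # one gap of ~0.010 s
```

Any flagged gaps are exactly the irregularities that show up as artifacts in Kalibr's spline fit.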
