Visual-inertial sensing fuses two complementary modalities: cameras provide high-density external measurements of the environment, while IMUs measure the platform’s internal ego-motion. The IMU is crucial for providing estimator robustness and for resolving scale in monocular setups. However, this fusion is only as good as the calibration between the two sensors. Small errors in camera-IMU extrinsics, time offsets, or IMU noise parameters can rapidly degrade performance on dynamic trajectories. OpenVINS addresses this by supporting online estimation of camera-IMU extrinsic transforms and time offsets during operation, but a high-quality offline calibration remains the best starting point.
OpenVINS supports online calibration of camera extrinsics, intrinsics, and time offset via the `calib_cam_extrinsics`, `calib_cam_intrinsics`, and `calib_cam_timeoffset` flags in `estimator_config.yaml`. Even so, starting from a well-calibrated offline result leads to faster convergence and more accurate trajectories.
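As a sketch, the relevant fragment of `estimator_config.yaml` might look like the following (the three key names are from OpenVINS; the comments and surrounding values are illustrative):

```yaml
# estimator_config.yaml (fragment, illustrative)
# enable online refinement of the camera-IMU calibration
calib_cam_extrinsics: true   # refine R_ItoC and p_CinI online
calib_cam_intrinsics: true   # refine focal length, principal point, distortion
calib_cam_timeoffset: true   # refine the camera-to-IMU time offset
```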
## Camera Intrinsic Calibration

Camera intrinsic calibration recovers the focal length, principal point, and lens distortion coefficients. OpenVINS supports both the `pinhole-radtan` (radial-tangential) and `pinhole-equi` (equidistant/fisheye) camera models. The Kalibr toolbox is the recommended tool for this step.

### Print a calibration target
Print the Aprilgrid 6x6 0.8×0.8 m target on an A0 page. Flat, rigid mounting is essential — any warping in the target introduces calibration error.
### Prepare your sensor
Before recording, configure the camera for sharp images:
- Lower exposure time to minimize motion blur (high framerate helps)
- Verify the camera is in focus using `kalibr_camera_focus` or by manual inspection
- Ensure the calibration target fits fully inside the image at multiple distances and orientations
### Record a calibration bag
Move either the board or the camera so the target appears at different positions, orientations, and distances across the full image plane. A variety of viewing angles is critical for recovering distortion coefficients.
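With a ROS 1 sensor driver running, such a bag can be recorded with `rosbag`; the topic and file names below are assumptions for illustration:

```bash
# Record only the camera stream to keep the bag small
rosbag record /cam0/image_raw -O camera_calib.bag
```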
### Run the camera calibration
Choose the camera model based on your lens type: `pinhole-radtan` for standard lenses, `pinhole-equi` for fisheye lenses. The `--bag-freq 10.0` flag subsamples the bag to reduce processing time, which can otherwise take several hours on large datasets.
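A typical invocation of Kalibr's `kalibr_calibrate_cameras` might look like the following sketch (the bag, topic, and target file names are assumptions):

```bash
# radial-tangential model for a standard lens
rosrun kalibr kalibr_calibrate_cameras \
  --bag camera_calib.bag --bag-freq 10.0 \
  --topics /cam0/image_raw \
  --models pinhole-radtan \
  --target target.yaml

# for a fisheye lens, use: --models pinhole-equi
```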
### Tips for a good calibration dataset
- Move slowly and smoothly to avoid motion blur, especially with global shutter cameras at low framerates.
- Cover all corners of the image — distortion is largest at the periphery and cannot be estimated if the target never appears there.
- Vary the distance between the target and the camera so that the focal length is well-constrained.
- Do not move just one axis — tilt the board in multiple directions to ensure good angular coverage.
- An example calibration script, dataset, and config can be found in the RPNG ar_table_dataset repository: `calibrate_camera_static.sh`.
## IMU Noise Calibration
The IMU noise model used by OpenVINS (and Kalibr for dynamic calibration) requires four parameters characterizing the stochastic behavior of the gyroscope and accelerometer. These values enter directly into the filter’s covariance propagation.

| Parameter | YAML key | Symbol | Units |
|---|---|---|---|
| Gyroscope white noise | `gyroscope_noise_density` | σ_g | rad/s/√Hz |
| Accelerometer white noise | `accelerometer_noise_density` | σ_a | m/s²/√Hz |
| Gyroscope random walk | `gyroscope_random_walk` | σ_bg | rad/s²/√Hz |
| Accelerometer random walk | `accelerometer_random_walk` | σ_ba | m/s³/√Hz |
### Collect a stationary dataset
Place the sensor completely still on a stable surface and record at least 2–3 hours of IMU data. Longer is better — the authors recommend up to 20 hours for a reliable random walk estimate.
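Assuming a ROS 1 driver publishing on `/imu0` (the topic and file names are illustrative), the recording step can be sketched as:

```bash
# Leave the platform untouched on a solid surface for the whole recording
rosbag record /imu0 -O imu_static.bag --duration=3h
```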
An example script can be found in the RPNG ar_table_dataset repository: `calibrate_imu.sh`.
## Camera-IMU Extrinsic Calibration
With intrinsics and IMU noise in hand, the final step is to estimate the rigid transform between the camera and the IMU — the rotation `R_ItoC` and translation `p_CinI`. Kalibr’s `kalibr_calibrate_imu_camera` tool solves this by fitting an SE(3) B-spline to the synchronized camera and IMU data.
### Prepare your calibration target and sensor
Use the same Aprilgrid target from the intrinsic step. Configure the camera for:
- Low exposure time to minimize motion blur during dynamic motion
- Image rate of 20–30 Hz (high enough for good B-spline fitting)
- IMU rate of 200–500 Hz (the higher the better for spline accuracy)
### Create an `imu.yaml` file
Kalibr needs an IMU description file with your measured noise parameters. A template is available here.
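The file follows Kalibr's standard IMU description format; a sketch with an assumed topic and update rate, and placeholder noise values of the kind shown in the Kalibr documentation (replace them with your own Allan-variance results):

```yaml
# imu.yaml -- Kalibr IMU description (topic, rate, and values illustrative)
rostopic: /imu0
update_rate: 200.0                       # [Hz]
accelerometer_noise_density: 2.0000e-3   # [m/s^2/sqrt(Hz)]
accelerometer_random_walk: 3.0000e-3     # [m/s^3/sqrt(Hz)]
gyroscope_noise_density: 1.6968e-04      # [rad/s/sqrt(Hz)]
gyroscope_random_walk: 1.9393e-05        # [rad/s^2/sqrt(Hz)]
```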
### Record a dynamic calibration bag
Move the sensor smoothly and continuously while keeping the Aprilgrid in view. Aim for:
- 30–60 seconds of data
- Rotations about all three axes (pitch, roll, yaw)
- Some translational movement as well
- Smooth, non-jerky motions — abrupt movements degrade the spline fit
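Recording the dynamic sequence follows the same `rosbag` pattern with both sensors included (topic and file names are assumptions):

```bash
# Record camera and IMU together for the dynamic calibration
rosbag record /cam0/image_raw /imu0 -O dynamic_calib.bag
```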
### Run the camera-IMU calibration
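A sketch of the `kalibr_calibrate_imu_camera` invocation, assuming the camchain YAML produced by the intrinsic calibration step and an `imu.yaml` with your noise parameters (all file names illustrative):

```bash
rosrun kalibr kalibr_calibrate_imu_camera \
  --bag dynamic_calib.bag \
  --cam camchain-camera_calib.yaml \
  --imu imu.yaml \
  --target target.yaml
```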
### Validate the output
Inspect the output PDF carefully:
- The B-spline fit to the IMU data should look smooth and track the raw measurements closely
- Accelerometer and gyroscope residuals should stay within their 3σ bounds
- Estimated biases should not drift outside the 3σ envelope (if they do, your trajectory was too aggressive or your noise values are wrong)
- Cross-check the estimated rotation and translation with hand-measured values
### Tips for a successful dynamic calibration
- Inspect the IMU time `dt` plot in the PDF report carefully; irregular IMU timestamps cause spline fitting artifacts.
- If the accelerometer or gyroscope residuals exceed 3σ, try recollecting with gentler motion or increasing your noise parameter estimates.
- More orientation excitation is always better — practice a figure-eight or random rotation sequence before recording.
- An example script is available in the RPNG ar_table_dataset repository: `calibrate_camera_dynamic.sh`.