LimelightLib gives your FRC robot seamless access to every Limelight feature through a single LimelightHelpers.java file. Drop it into your WPILib project and start reading pose estimates, tracking AprilTags, running neural networks, and controlling your camera in minutes.

Installation

Add LimelightHelpers.java to your WPILib project in one step

Quickstart

Read your first target and get a pose estimate running fast

AprilTag Localization

Use fiducial tags to determine your robot’s field position

API Reference

Full reference for every method and data class

What you can do with LimelightLib

MegaTag2 Localization

Gyro-fused pose estimation for highly accurate robot localization

Neural Networks

Run object detection and classification pipelines on the camera

Retroreflective Tracking

Track retroreflective tape and color targets with tx/ty offsets

Multi-Camera

Use multiple Limelights simultaneously by name

How it works

1. Copy LimelightHelpers.java into your project

Download the latest release from GitHub and place LimelightHelpers.java in your src/main/java/frc/robot/ directory.
2. Configure your camera in the Limelight web UI

Set up your pipeline (AprilTag, neural detector, retroreflective) using the Limelight web interface at http://limelight.local:5801.
3. Call LimelightHelpers from your robot code

Use the static methods in LimelightHelpers to read target data, pose estimates, and control camera settings over NetworkTables.
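As a minimal sketch of this step, the snippet below reads the horizontal target offset and feeds it to a WPILib PIDController to aim at a target. It assumes the camera uses the default name "limelight"; the class and method names (VisionAim, aimOutput) are illustrative, not part of LimelightLib.

```java
import edu.wpi.first.math.controller.PIDController;

public class VisionAim {
  // "limelight" is the default camera name; multi-camera setups
  // pass each camera's configured name instead.
  private static final String LL = "limelight";

  /** Returns a turn command that centers the primary target, or 0 if none is visible. */
  public static double aimOutput(PIDController turnController) {
    if (!LimelightHelpers.getTV(LL)) {
      return 0.0; // no valid target this frame
    }
    double tx = LimelightHelpers.getTX(LL); // horizontal offset in degrees
    return turnController.calculate(tx, 0.0); // drive tx toward zero
  }
}
```

Gating on getTV before using tx avoids acting on stale offsets from frames with no valid target.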
4. Fuse vision with your pose estimator

Pass PoseEstimate results directly into WPILib’s SwerveDrivePoseEstimator or DifferentialDrivePoseEstimator via addVisionMeasurement.
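A sketch of this step using the MegaTag2 accessors, assuming a swerve drivetrain and the default camera name "limelight"; the wrapper class and the standard-deviation values shown are illustrative choices, not prescribed by the library.

```java
import edu.wpi.first.math.VecBuilder;
import edu.wpi.first.math.estimator.SwerveDrivePoseEstimator;

public class VisionFusion {
  private static final String LL = "limelight";

  /** Call once per loop to fuse a MegaTag2 estimate into the drivetrain pose estimator. */
  public static void updateVision(SwerveDrivePoseEstimator estimator, double yawDegrees) {
    // MegaTag2 is gyro-fused: publish the robot's yaw before reading the estimate.
    LimelightHelpers.SetRobotOrientation(LL, yawDegrees, 0, 0, 0, 0, 0);

    LimelightHelpers.PoseEstimate est =
        LimelightHelpers.getBotPoseEstimate_wpiBlue_MegaTag2(LL);
    if (est != null && est.tagCount > 0) {
      // Example std devs: trust vision translation, ignore vision yaw
      // (the gyro already supplies heading).
      estimator.setVisionMeasurementStdDevs(VecBuilder.fill(0.7, 0.7, 9999999));
      estimator.addVisionMeasurement(est.pose, est.timestampSeconds);
    }
  }
}
```

Skipping frames with tagCount == 0 keeps empty estimates from corrupting the filter.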
LimelightLib v1.14 requires LLOS 2026.0 or later on your Limelight camera. Check your firmware version in the Limelight web interface before use.
