This guide walks through the four things most FRC teams need first: checking for a valid target, reading angular offsets, getting a MegaTag2 pose estimate, and passing that estimate to WPILib’s pose estimator.
Passing an empty string "" as the camera name is equivalent to passing "limelight" — both refer to the default camera hostname. When using multiple Limelights, pass each camera’s configured hostname (e.g., "limelight-front", "limelight-back").
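The fallback behavior can be illustrated with a small sketch. This is a hypothetical helper, not the library's actual implementation:

```java
public class CameraName {
    // Hypothetical helper mirroring the fallback described above:
    // an empty or null name resolves to the default "limelight" hostname.
    public static String resolve(String name) {
        return (name == null || name.isEmpty()) ? "limelight" : name;
    }

    public static void main(String[] args) {
        System.out.println(resolve(""));                // limelight
        System.out.println(resolve("limelight-front")); // limelight-front
    }
}
```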
1. Check for a valid target

Before reading any target data, confirm the camera has a valid detection. getTV returns true when at least one target is present in the active pipeline.
Robot.java
boolean hasTarget = LimelightHelpers.getTV("limelight");

if (hasTarget) {
    // safe to read target data
}
2. Read horizontal and vertical offsets

getTX returns the horizontal angle from the crosshair to the primary target in degrees. getTY returns the vertical angle. Both values are relative to the configured crosshair position.
Robot.java
double tx = LimelightHelpers.getTX("limelight"); // degrees left/right
double ty = LimelightHelpers.getTY("limelight"); // degrees up/down
Use tx to steer toward a target and ty to estimate distance with trigonometry.
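Both ideas can be sketched in plain Java. The camera height, target height, camera pitch, and kP gain below are placeholder values, not measurements from any real robot; you must measure and tune them for your own hardware:

```java
public class TargetMath {
    // Placeholder geometry -- measure these on your robot.
    static final double CAMERA_HEIGHT_METERS = 0.50;  // lens height above the floor
    static final double TARGET_HEIGHT_METERS = 1.45;  // target center height
    static final double CAMERA_PITCH_DEGREES = 25.0;  // upward camera tilt from horizontal

    // Proportional steering: a turn command derived from the horizontal offset.
    public static double steeringOutput(double txDegrees) {
        double kP = 0.03; // placeholder gain -- tune on your drivetrain
        return kP * txDegrees;
    }

    // Distance via trigonometry: d = (h_target - h_camera) / tan(pitch + ty)
    public static double distanceMeters(double tyDegrees) {
        double angleRad = Math.toRadians(CAMERA_PITCH_DEGREES + tyDegrees);
        return (TARGET_HEIGHT_METERS - CAMERA_HEIGHT_METERS) / Math.tan(angleRad);
    }

    public static void main(String[] args) {
        System.out.printf("distance at ty=0: %.2f m%n", distanceMeters(0.0));
    }
}
```

Feed steeringOutput(tx) into your drivetrain's rotation input. The distance formula only holds when the camera pitch and both heights are measured accurately and the target sits at a fixed height.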
3. Get a MegaTag2 pose estimate

MegaTag2 fuses your robot’s gyroscope heading with AprilTag observations to produce accurate field-relative pose estimates. You must call SetRobotOrientation with the current heading every loop before calling the MegaTag2 method.
Call SetRobotOrientation at the top of your periodic loop on every cycle, not just when you plan to use the pose estimate. The Limelight uses the orientation data internally to run the MegaTag2 fusion algorithm.
Robot.java
// currentYawDegrees comes from your gyroscope (e.g., navX or Pigeon2)
LimelightHelpers.SetRobotOrientation(
    "limelight",
    currentYawDegrees,  // robot yaw — required
    0,                  // yaw rate — not required, pass 0
    0,                  // pitch   — not required, pass 0
    0,                  // pitch rate — not required, pass 0
    0,                  // roll    — not required, pass 0
    0                   // roll rate  — not required, pass 0
);

LimelightHelpers.PoseEstimate mt2 =
    LimelightHelpers.getBotPoseEstimate_wpiBlue_MegaTag2("limelight");
getBotPoseEstimate_wpiBlue_MegaTag2 returns a PoseEstimate in the WPILib Blue alliance coordinate system, which is the standard for WPILib odometry. Use getBotPoseEstimate_wpiRed_MegaTag2 if your match logic requires red-alliance coordinates.
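As a rough sketch of how the two frames relate (this is not the library's code): if the red origin is the blue origin rotated 180 degrees about the field center, a blue-origin pose converts as shown below. The field dimensions are placeholders; verify them against your season's official field drawings:

```java
public class FieldCoords {
    // Placeholder field dimensions in meters -- verify against the
    // official field drawings for your season.
    static final double FIELD_LENGTH_M = 16.54;
    static final double FIELD_WIDTH_M = 8.21;

    // Sketch: map a blue-origin pose (x, y, headingDeg) into the
    // red-origin frame by rotating 180 degrees about the field center.
    public static double[] blueToRed(double x, double y, double headingDeg) {
        double redHeading = headingDeg + 180.0;
        if (redHeading > 180.0) redHeading -= 360.0; // wrap to (-180, 180]
        return new double[] { FIELD_LENGTH_M - x, FIELD_WIDTH_M - y, redHeading };
    }
}
```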
4. Add the measurement to your pose estimator

Pass the PoseEstimate to WPILib’s SwerveDrivePoseEstimator or DifferentialDrivePoseEstimator via addVisionMeasurement. Always check that the result is non-null and that at least one tag was seen.
Robot.java
if (mt2 != null && mt2.tagCount > 0) {
    poseEstimator.addVisionMeasurement(mt2.pose, mt2.timestampSeconds);
}
mt2.timestampSeconds is already latency-corrected — the library adjusts the NetworkTables server timestamp by the pipeline and capture latency before returning it.
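The correction amounts to subtracting the camera's reported latencies (in milliseconds) from the NetworkTables timestamp (in seconds). A simplified sketch of that arithmetic, for illustration only since the helper already does this for you:

```java
public class LatencyMath {
    // Illustrative only: the library performs this adjustment internally.
    // Latencies arrive in milliseconds; timestamps are in seconds.
    public static double correctedTimestampSeconds(double ntTimestampSeconds,
                                                   double captureLatencyMs,
                                                   double pipelineLatencyMs) {
        return ntTimestampSeconds - (captureLatencyMs + pipelineLatencyMs) / 1000.0;
    }
}
```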

Complete robotPeriodic() example

Here is a full robotPeriodic() method combining all four steps:
Robot.java
@Override
public void robotPeriodic() {
    double currentYawDegrees = gyro.getAngle(); // replace with your gyro source; yaw must be CCW-positive

    // Step 1: check for a valid target
    boolean hasTarget = LimelightHelpers.getTV("limelight");

    if (hasTarget) {
        // Step 2: read angular offsets
        double tx = LimelightHelpers.getTX("limelight");
        double ty = LimelightHelpers.getTY("limelight");

        System.out.println("Target offset — tx: " + tx + "°, ty: " + ty + "°");
    }

    // Step 3: provide robot orientation to MegaTag2 every loop
    LimelightHelpers.SetRobotOrientation("limelight", currentYawDegrees, 0, 0, 0, 0, 0);

    LimelightHelpers.PoseEstimate mt2 =
        LimelightHelpers.getBotPoseEstimate_wpiBlue_MegaTag2("limelight");

    // Step 4: fuse vision measurement into pose estimator
    if (mt2 != null && mt2.tagCount > 0) {
        poseEstimator.addVisionMeasurement(mt2.pose, mt2.timestampSeconds);
    }
}
