

The QRB Robot Base AMR Mini is a smaller variant of the QRB Robot Base AMR. It shares the same differential-drive architecture and sensor capabilities — LiDAR, IMU, and odometry — but uses a different LiDAR configuration tuned for its compact form factor. Like the standard AMR, it launches with a single command and starts the simulation automatically.

Launch the simulation

ros2 launch qrb_ros_sim_gazebo gazebo_robot_base_mini.launch.py

Gazebo opens, loads the default warehouse world, and spawns the AMR Mini. The ros-gz bridge starts immediately and begins publishing sensor data to ROS 2 topics.

The AMR Mini starts the simulation automatically. You do not need to press the Play button in Gazebo before the sensors and drive topics are active.

Differences from the standard AMR

The AMR Mini uses a dedicated laser configuration file — qrb_robot_base_mini_laser_params.yaml — instead of the standard qrb_robot_base_laser_params.yaml. The key difference is the LiDAR horizontal scan coverage:
| Parameter | QRB Robot Base AMR | QRB Robot Base AMR Mini |
|---|---|---|
| Horizontal samples | 1150 | 1800 |
| Min angle (rad) | −2.005 | −3.124 |
| Max angle (rad) | 2.005 | 3.142 |
| Range min/max (m) | 0.15 / 25.0 | 0.15 / 25.0 |
| Update rate (Hz) | 15 | 15 |
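The scan coverage implied by these angle limits can be checked directly. A quick sketch, using only the min/max angles quoted above:

```python
import math

def fov_deg(min_angle_rad: float, max_angle_rad: float) -> float:
    """Horizontal field of view in degrees from LaserScan angle limits."""
    return math.degrees(max_angle_rad - min_angle_rad)

standard = fov_deg(-2.005, 2.005)  # QRB Robot Base AMR
mini = fov_deg(-3.124, 3.142)      # QRB Robot Base AMR Mini

print(f"Standard AMR: {standard:.1f} deg")  # ~229.8 deg
print(f"AMR Mini:     {mini:.1f} deg")      # ~359.0 deg
```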
The AMR Mini’s LiDAR provides a near-360° field of view, compared to the approximately 229° coverage of the standard AMR. The AMR Mini also enables the RGB camera and depth camera by default, whereas the standard AMR does not include cameras. This adds the following topics:
| Topic | Type | Description |
|---|---|---|
| /camera/color/image_raw | sensor_msgs/msg/Image | RGB camera image stream |
| /camera/color/camera_info | sensor_msgs/msg/CameraInfo | RGB camera intrinsic parameters |
| /camera/depth/image_raw | sensor_msgs/msg/Image | Depth image stream |
| /camera/depth/camera_info | sensor_msgs/msg/CameraInfo | Depth camera intrinsic parameters |
| /camera/depth/points | sensor_msgs/msg/PointCloud2 | 3D point cloud from depth camera |

Sending velocity commands

Control the robot by publishing to /cmd_vel:
ros2 topic pub /cmd_vel geometry_msgs/msg/Twist "{linear: {x: 0.5}, angular: {z: 0.3}}"
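On a differential-drive base, a Twist like the one above is resolved into left and right wheel speeds. A minimal sketch of the standard kinematics — the wheel separation and radius below are illustrative placeholders, not values taken from the robot model:

```python
def wheel_speeds(v: float, omega: float,
                 wheel_separation: float, wheel_radius: float):
    """Convert body velocity (m/s, rad/s) to wheel angular speeds (rad/s)."""
    v_left = v - omega * wheel_separation / 2.0
    v_right = v + omega * wheel_separation / 2.0
    return v_left / wheel_radius, v_right / wheel_radius

# Illustrative geometry only; see the robot's URDF for the real values.
w_l, w_r = wheel_speeds(v=0.5, omega=0.3, wheel_separation=0.4, wheel_radius=0.1)
print(w_l, w_r)  # right wheel spins faster than left for a left (positive-z) turn
```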

ROS topics

| Topic | Type | Description |
|---|---|---|
| /scan | sensor_msgs/msg/LaserScan | LiDAR scan data (~360°, 1800 samples, 15 Hz) |
| /imu | sensor_msgs/msg/Imu | Acceleration and angular velocity at 50 Hz |
| /odom | nav_msgs/msg/Odometry | Wheel odometry: position and velocity |
| /cmd_vel | geometry_msgs/msg/Twist | Velocity command input (subscribed) |
| /joint_states | sensor_msgs/msg/JointState | Wheel joint positions and velocities |
| /tf | tf2_msgs/msg/TFMessage | Real-time coordinate frame transforms |
| /tf_static | tf2_msgs/msg/TFMessage | Static coordinate frame transforms |
| /robot_description | std_msgs/msg/String | URDF model published by robot_state_publisher |
| /clock | rosgraph_msgs/msg/Clock | Simulation time |
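When consuming /scan in code, each ranges[] index maps to a bearing via the sensor_msgs/LaserScan convention, angle = angle_min + i × angle_increment. A sketch using the Mini's parameters from this page; the increment here is derived from the angle limits and sample count rather than read from a live message, so treat it as an approximation:

```python
ANGLE_MIN = -3.124   # rad, from the Mini laser configuration
ANGLE_MAX = 3.142    # rad
NUM_SAMPLES = 1800

# Derived increment (a live LaserScan message carries the exact value).
ANGLE_INCREMENT = (ANGLE_MAX - ANGLE_MIN) / (NUM_SAMPLES - 1)

def index_for_bearing(bearing_rad: float) -> int:
    """Nearest ranges[] index for a bearing (rad) in the LiDAR frame."""
    i = round((bearing_rad - ANGLE_MIN) / ANGLE_INCREMENT)
    return min(max(i, 0), NUM_SAMPLES - 1)

print(index_for_bearing(0.0))  # sample closest to straight ahead
```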
You can supply a custom laser configuration file at launch time using the laser_config_file argument. For a full list of configurable sensors and parameters, see Sensor configuration.
