The QRB Mobile Manipulator combines the QRB Robot Base AMR with the RML-63 6-DOF robotic arm into a single mobile manipulation platform. It inherits the full sensor suite of the AMR Mini — LiDAR, IMU, odometry, RGB camera, and depth camera — alongside the arm and gripper controllers from the RML-63. Launching it follows the same two-step process as the standalone arm: start Gazebo first, press Play, then load the controllers in a separate terminal.

Documentation Index
Fetch the complete documentation index at: https://mintlify.com/qualcomm-qrb-ros/qrb_ros_simulation/llms.txt
Use this file to discover all available pages before exploring further.
Launch the simulation
Launch Gazebo with the Mobile Manipulator
In your first terminal, source your install overlay and launch the mobile manipulator.

Gazebo opens and spawns the combined robot in the default warehouse world. The robot state publisher starts immediately and the AMR sensor bridges come up automatically.
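The exact package and launch-file names depend on your workspace; the names below are assumptions, so substitute the ones from your install. As a sketch:

```shell
# Source the workspace overlay, then launch Gazebo with the combined robot.
# Package and launch-file names are assumptions; check your own workspace.
source install/setup.bash
ros2 launch qrb_ros_sim_gazebo mobile_manipulator.launch.py
```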
Press Play in Gazebo
Click the Play button in the Gazebo toolbar. The simulation clock starts and the ros2_control controller manager becomes reachable.
Do not proceed to the next step until you have pressed Play. The controller manager is not active until the simulation is running, and the loader launch will time out if you skip this step.
Load the arm and gripper controllers
Open a new terminal, source the install overlay, and run the controller loader.

This spawns the same three controllers used by the standalone RML-63:
- `joint_state_broadcaster` — broadcasts all joint states to `/joint_states`.
- `rm_group_controller` — `JointTrajectoryController` for the six arm joints.
- `hand_controller` — `JointTrajectoryController` for the four gripper joints.
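Once the loader finishes, you can confirm the controllers came up using the `ros2 control` CLI (assuming the `ros2controlcli` tools from ros2_control are installed):

```shell
# List loaded controllers; all three should report an "active" state.
ros2 control list_controllers
```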
Controlling the robot
Base movement
Send velocity commands to move the AMR base on the `/cmd_vel` topic.

Arm movement
Send joint trajectory goals to the arm using the `FollowJointTrajectory` action (`control_msgs/action/FollowJointTrajectory`). See RML-63 Robotic Arm for detailed arm control information, including joint names and gripper joints.
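Both motions can be driven from the ROS 2 CLI. A sketch, assuming the `/cmd_vel` topic and `rm_group_controller` listed on this page; the arm joint names below are assumptions, so read the real ones from `/joint_states` first:

```shell
# Drive the base: publish a forward velocity on /cmd_vel at 10 Hz.
ros2 topic pub --rate 10 /cmd_vel geometry_msgs/msg/Twist \
  "{linear: {x: 0.2}, angular: {z: 0.0}}"

# Move the arm: send a single-point trajectory goal to rm_group_controller.
# Joint names here are placeholders; check /joint_states for the real ones.
ros2 action send_goal /rm_group_controller/follow_joint_trajectory \
  control_msgs/action/FollowJointTrajectory \
  "{trajectory: {joint_names: [joint1, joint2, joint3, joint4, joint5, joint6],
    points: [{positions: [0.0, -0.5, 0.5, 0.0, 0.5, 0.0],
              time_from_start: {sec: 2}}]}}"
```

The `follow_joint_trajectory` action name follows the standard ros2_control convention of namespacing the action under the controller's name.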
ROS topics
Base and arm state
| Topic | Type | Description |
|---|---|---|
| `/cmd_vel` | `geometry_msgs/msg/Twist` | Velocity command for the AMR base (subscribed) |
| `/joint_states` | `sensor_msgs/msg/JointState` | Positions and velocities for all joints |
| `/odom` | `nav_msgs/msg/Odometry` | Wheel odometry: position and velocity |
| `/tf` | `tf2_msgs/msg/TFMessage` | Real-time coordinate frame transforms |
| `/tf_static` | `tf2_msgs/msg/TFMessage` | Static coordinate frame transforms |
| `/robot_description` | `std_msgs/msg/String` | URDF model published by robot_state_publisher |
| `/clock` | `rosgraph_msgs/msg/Clock` | Simulation time |
Sensors
| Topic | Type | Description |
|---|---|---|
| `/scan` | `sensor_msgs/msg/LaserScan` | LiDAR scan data |
| `/imu` | `sensor_msgs/msg/Imu` | Acceleration and angular velocity |
| `/camera/color/image_raw` | `sensor_msgs/msg/Image` | RGB camera image stream |
| `/camera/color/camera_info` | `sensor_msgs/msg/CameraInfo` | RGB camera intrinsic parameters |
| `/camera/depth/image_raw` | `sensor_msgs/msg/Image` | Depth image stream |
| `/camera/depth/camera_info` | `sensor_msgs/msg/CameraInfo` | Depth camera intrinsic parameters |
| `/camera/depth/points` | `sensor_msgs/msg/PointCloud2` | 3D point cloud from depth camera |
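As a quick check that the sensor bridges are publishing, the topics above can be inspected from the CLI:

```shell
# Confirm the sensor topics exist, then print a single LiDAR scan.
ros2 topic list | grep -E "scan|imu|camera"
ros2 topic echo /scan --once
```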