Official manual
TurtleBot3 e-Manual
Official TurtleBot3 guide from ROBOTIS covering all models and features
Platform overview
The TurtleBot3 is a small, affordable, programmable, ROS-based mobile robot designed for education, research, and product prototyping.
Available models
This dev container supports all TurtleBot3 models:
- Burger: Compact two-wheeled robot, ideal for learning and indoor navigation
- Waffle: Larger platform with additional sensors and payload capacity
- Waffle Pi: Extended version with Raspberry Pi camera and LiDAR
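The model in use is selected through the TURTLEBOT3_MODEL environment variable, as described below. A hypothetical devcontainer.json fragment setting it might look like this (the "name" value and overall layout are illustrative assumptions, not the actual file; containerEnv is the standard Dev Containers property for container environment variables):

```json
{
  "name": "turtlebot3-dev",
  "containerEnv": {
    "TURTLEBOT3_MODEL": "burger"
  }
}
```

The values conventionally recognized by TurtleBot3 tooling are burger, waffle, and waffle_pi.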
The default configuration uses the Burger model. You can change this by modifying the TURTLEBOT3_MODEL environment variable in .devcontainer/devcontainer.json.
Key features
Hardware specifications
- Differential drive: Two independently driven wheels enabling in-place turns and precise maneuvering
- LiDAR sensor: 360-degree laser range finder for mapping and obstacle detection
- IMU: Inertial measurement unit for orientation tracking
- Odometry: Wheel encoders for position estimation
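To make the odometry item concrete, here is a minimal dead-reckoning sketch, not the actual firmware; the 0.160 m wheel separation is the published Burger figure, used here as an assumption:

```python
import math

def update_pose(x, y, theta, d_left, d_right, wheel_sep):
    """One dead-reckoning step from per-wheel travel distances (meters).

    d_left / d_right would come from wheel-encoder tick counts
    multiplied by meters-per-tick.
    """
    d_center = (d_left + d_right) / 2.0       # forward travel of the base
    d_theta = (d_right - d_left) / wheel_sep  # heading change (radians)
    # Integrate at the midpoint heading for a better small-arc approximation.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta

# 0.1 m straight ahead: pose advances along x, heading unchanged.
print(update_pose(0.0, 0.0, 0.0, 0.1, 0.1, 0.160))  # → (0.1, 0.0, 0.0)
```

Summing such steps over successive encoder readings yields the position estimate; in practice it drifts, which is why SLAM fuses odometry with LiDAR and IMU data.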
Software capabilities
SLAM
Simultaneous localization and mapping using Cartographer or SLAM Toolbox
Navigation
Autonomous navigation with obstacle avoidance using Navigation2 stack
Manipulation
Optional manipulator arm for pick-and-place tasks (Waffle/Waffle Pi)
Vision
Camera-based object detection and tracking (Waffle Pi)
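As a toy illustration of the obstacle-detection step behind these capabilities (a sketch under simplifying assumptions, not the Navigation2 implementation), checking a LiDAR scan for nearby returns might look like:

```python
def too_close(ranges, threshold=0.3):
    """True if any valid LiDAR return is nearer than threshold meters.

    Many drivers report invalid/no-return beams as 0.0 (or inf),
    so those are filtered out before taking the minimum.
    """
    valid = [r for r in ranges if 0.0 < r < float("inf")]
    return bool(valid) and min(valid) < threshold

print(too_close([1.2, 0.25, 3.0]))  # → True: one return inside 0.3 m
```

A real controller would react per angular sector rather than to the global minimum, but the same filter-then-threshold pattern applies.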
Simulation environments
The dev container includes pre-configured Gazebo worlds:
Empty world
Minimal environment for testing basic movement and control algorithms.
TurtleBot3 world
Indoor environment with obstacles, perfect for navigation testing.
House environment
Complex multi-room layout for advanced SLAM and navigation scenarios.
GitHub repositories
Report bugs or contribute to TurtleBot3 development through the official GitHub repositories.
TurtleBot3 Issues
Bug reports and feature requests for TurtleBot3 packages
Common operations
Refer to the TurtleBot3 manual for detailed instructions on:
- Teleoperation: Manual robot control via keyboard or joystick
- SLAM operation: Building maps of unknown environments
- Navigation setup: Configuring autonomous navigation with existing maps
- Sensor calibration: Fine-tuning LiDAR and IMU performance
- Custom applications: Developing your own robotics software
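For orientation, the typical ROS 2 invocations for these operations look like the following sketch. Package and launch-file names assume the standard turtlebot3 ROS 2 packages and may differ between ROS distributions; consult the e-Manual for your release:

```shell
# Select the model first (burger, waffle, or waffle_pi)
export TURTLEBOT3_MODEL=burger

# Keyboard teleoperation (the terminal must keep focus)
ros2 run turtlebot3_teleop teleop_keyboard

# SLAM with Cartographer; save the map when finished
ros2 launch turtlebot3_cartographer cartographer.launch.py
ros2 run nav2_map_server map_saver_cli -f ~/map

# Autonomous navigation with an existing map
ros2 launch turtlebot3_navigation2 navigation2.launch.py map:=$HOME/map.yaml

# Launch the pre-configured Gazebo simulation worlds
ros2 launch turtlebot3_gazebo empty_world.launch.py
ros2 launch turtlebot3_gazebo turtlebot3_world.launch.py
ros2 launch turtlebot3_gazebo turtlebot3_house.launch.py
```

When simulating, append use_sim_time:=True to the SLAM and navigation launch commands so nodes follow the simulated clock.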