Prerequisites
- uv v0.8.4+ is required to parse `[tool.uv.extra-build-dependencies]` in `pyproject.toml` (needed for building `flash-attn`)
- NVIDIA GPU with compatible drivers
- CUDA 12.4 is recommended and officially tested (CUDA 11.8 also verified to work)
- For RTX 5090: tested with CUDA 12.8, `flash-attn==2.8.0.post2`, and `pytorch-cu128`
If using CUDA 11.8, you must install a compatible version of `flash-attn` manually (e.g., `flash-attn==2.8.2` is confirmed to work with CUDA 11.8).
Installation with uv
Clone the repository
GR00T relies on submodules for certain dependencies. Include them when cloning.
If you've already cloned without submodules, initialize them separately.
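Both steps can be sketched as shell commands (the NVIDIA/Isaac-GR00T repository URL is an assumption; substitute your fork or mirror as needed):

```shell
# Clone with submodules included (repository URL is an assumption)
git clone --recurse-submodules https://github.com/NVIDIA/Isaac-GR00T.git
cd Isaac-GR00T

# If the repository was already cloned without submodules, initialize them now
git submodule update --init --recursive
```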
Install uv
If you don’t have uv installed, follow the uv installation guide.
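For Linux and macOS, the uv documentation provides a standalone installer script; a sketch:

```shell
# Standalone installer from the uv documentation (Linux/macOS)
curl -LsSf https://astral.sh/uv/install.sh | sh

# Confirm the installed version meets the v0.8.4+ requirement
uv --version
```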
Create environment and install dependencies
After installing uv, create the environment and install GR00T. This installs the core dependencies including:
- PyTorch 2.7.1
- Transformers 4.51.3
- Diffusers 0.35.1
- Flash Attention 2.7.4
- And other required packages
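Once the environment is created and activated, a quick way to confirm the core packages resolve is sketched below (package names are taken from the list above; `flash-attn` imports as `flash_attn`):

```shell
# Report which core dependencies are importable in the current environment
python3 - <<'EOF'
import importlib.util

for pkg in ("torch", "transformers", "diffusers", "flash_attn"):
    found = importlib.util.find_spec(pkg) is not None
    print(f"{pkg}: {'installed' if found else 'missing'}")
EOF
```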
Installation with Docker
For a containerized setup that avoids system-level dependency conflicts, use the Docker setup.
Prerequisites
- Docker (version 20.10+) with post-installation setup to run without sudo
- NVIDIA Container Toolkit (installation guide)
- NVIDIA GPU with compatible drivers
- Bash shell
- Sufficient disk space (several GB)
Build and run
Build the Docker image
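A plausible invocation, assuming the repository's `build.sh` wrapper (the script name appears in the troubleshooting section below; the image tag is an assumption):

```shell
# Build the image via the repository's wrapper script
sudo bash build.sh

# Run the container with GPU access (image tag is an assumption)
sudo docker run -it --rm --gpus all gr00t:latest
```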
The build process uses `nvcr.io/nvidia/pytorch:25.04-py3` as the base image. This installs all dependencies and sets up the GR00T codebase at `/workspace/gr00t/`.
Docker troubleshooting
GPU not detected
- Verify NVIDIA Container Toolkit: `nvidia-container-toolkit --version`
- Restart Docker: `sudo systemctl restart docker`
- Test GPU access: `docker run --rm --gpus all nvidia/cuda:12.0.0-base-ubuntu22.04 nvidia-smi`
Permission errors
Use `sudo` with Docker commands, or add your user to the `docker` group. Log out and back in for the change to take effect.
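The group change can be sketched as follows (standard Docker post-install steps; requires sudo):

```shell
# Add the current user to the docker group, then log out and back in
sudo usermod -aG docker "$USER"
```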
Build failures
- Check disk space: `df -h`
- Clean Docker: `docker system prune -a`
- Rebuild without cache: `sudo bash build.sh --no-cache`
Verify installation
After installation, verify that GR00T is installed correctly.
Next steps
Quick start
Get started with a quick inference example
Hardware requirements
Check hardware recommendations for training and deployment