GR00T uses uv for fast, reproducible dependency management. You can also use Docker for a containerized setup.

Prerequisites

  • uv v0.8.4+ is required to parse [tool.uv.extra-build-dependencies] in pyproject.toml (needed for building flash-attn)
  • NVIDIA GPU with compatible drivers
  • CUDA 12.4 is recommended and officially tested (CUDA 11.8 also verified to work)
  • For RTX 5090 GPUs: tested with CUDA 12.8, flash-attn==2.8.0.post2, and the cu128 PyTorch build
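The uv version floor exists because flash-attn's build imports torch, so uv must be told to install torch into flash-attn's build environment. A minimal sketch of what the relevant pyproject.toml section looks like (the repository's actual file is authoritative):

```toml
# Sketch only — check Isaac-GR00T's pyproject.toml for the real entry.
# This tells uv to make torch available when building flash-attn,
# since flash-attn's setup.py imports torch at build time.
[tool.uv.extra-build-dependencies]
flash-attn = ["torch"]
```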
If using CUDA 11.8, you must install a compatible version of flash-attn manually (e.g., flash-attn==2.8.2 confirmed working with CUDA 11.8).
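The manual step for CUDA 11.8 might look like the following. The version comes from the note above; `--no-build-isolation` is a common requirement for flash-attn source builds, but treat it as an assumption and check the flash-attn README for your setup:

```shell
# Sketch: install a CUDA 11.8-compatible flash-attn into the uv environment.
# --no-build-isolation lets the build see the torch already installed in the
# environment (a common flash-attn requirement; verify against flash-attn docs).
uv pip install flash-attn==2.8.2 --no-build-isolation
```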

Installation with uv

1. Clone the repository

GR00T relies on submodules for certain dependencies. Include them when cloning:
git clone --recurse-submodules https://github.com/NVIDIA/Isaac-GR00T
cd Isaac-GR00T
If you’ve already cloned without submodules, initialize them separately:
git submodule update --init --recursive
2. Install uv

If you don’t have uv installed, follow the uv installation guide.
3. Create environment and install dependencies

After installing uv, create the environment and install GR00T:
uv sync --python 3.10
uv pip install -e .
This installs the core dependencies, including:
  • PyTorch 2.7.1
  • Transformers 4.51.3
  • Diffusers 0.35.1
  • Flash Attention 2.7.4
  • And other required packages
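To confirm these pins resolved in your environment, a quick check (run it via `uv run python ...`; the names below are distribution names, which can differ from import names):

```python
# Print the installed versions of the core pins listed above.
# Distribution names ("flash-attn", not "flash_attn") are assumptions;
# adjust if your environment resolves different names.
from importlib.metadata import version, PackageNotFoundError

for dist in ("torch", "transformers", "diffusers", "flash-attn"):
    try:
        print(f"{dist}=={version(dist)}")
    except PackageNotFoundError:
        print(f"{dist}: not installed")
```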
4. Optional: Install TensorRT for faster inference

For 2x faster inference with TensorRT acceleration:
uv sync --extra tensorrt
This includes ONNX and TensorRT dependencies for optimized deployment.

Installation with Docker

For a containerized setup that avoids system-level dependency conflicts, use the Docker setup.

Prerequisites

  • Docker installed and running
  • NVIDIA Container Toolkit, so containers can access the GPU

Build and run

1. Navigate to the docker directory

cd docker
2. Build the Docker image

The build process uses nvcr.io/nvidia/pytorch:25.04-py3 as the base image:
bash build.sh
This installs all dependencies and sets up the GR00T codebase at /workspace/gr00t/.
3. Run the container

docker run -it --rm --gpus all gr00t-dev /bin/bash
Run this from the docker/ directory. In development mode, your local codebase is mounted into the container for live editing, so changes to your local GR00T code are immediately reflected inside the container.
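The mount itself isn't shown above; one plausible sketch using a standard Docker bind mount (the host path and the `/workspace/gr00t` target are assumptions based on the build step, so adjust to your checkout):

```shell
# Sketch: run from docker/, bind-mounting the repository root over the
# baked-in /workspace/gr00t so local edits appear live inside the container.
# Paths are assumptions; verify against the repository's docker scripts.
docker run -it --rm --gpus all \
  -v "$(pwd)/..:/workspace/gr00t" \
  gr00t-dev /bin/bash
```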

Docker troubleshooting

  • Verify NVIDIA Container Toolkit: nvidia-container-toolkit --version
  • Restart Docker: sudo systemctl restart docker
  • Test GPU access: docker run --rm --gpus all nvidia/cuda:12.0.0-base-ubuntu22.04 nvidia-smi
  • Fix permission errors: use sudo with Docker commands, or add your user to the docker group:
sudo usermod -aG docker $USER
Log out and back in for the change to take effect.
  • Check disk space: df -h
  • Clean Docker: docker system prune -a
  • Rebuild without cache: sudo bash build.sh --no-cache

Verify installation

After installation, verify that GR00T is installed correctly:
uv run python -c "import gr00t; print('GR00T successfully installed!')"
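For a slightly more informative check, the one-liner above can be extended to also confirm the GPU stack is discoverable (a sketch; the `gr00t` import name is taken from the command above):

```python
# Report whether the key packages are discoverable, without fully importing
# them (find_spec avoids paying torch's import cost just to check presence).
import importlib.util

for mod in ("gr00t", "torch"):
    found = importlib.util.find_spec(mod) is not None
    print(f"{mod}: {'found' if found else 'MISSING'}")
```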

Next steps

  • Quick start: get started with a quick inference example
  • Hardware requirements: check hardware recommendations for training and deployment
