# GR00T-WholeBodyControl
This is the codebase for the GR00T Whole-Body Control (WBC) projects. It hosts model checkpoints and scripts for training, evaluating, and deploying advanced whole-body controllers for humanoid robots. We currently support:
- Decoupled WBC: the decoupled controller (RL for the lower body, IK for the upper body) used in NVIDIA GR00T N1.5 and N1.6 models;
- GEAR-SONIC Series: our latest iteration of generalist humanoid whole-body controllers (see our whitepaper).
## News
- [2026-04-14] 🌐 Live web demo — try SONIC interactively in your browser. Features Kimodo text-to-motion generation.
- [2026-04-10] 🚀 Released SONIC training code and checkpoint on HuggingFace. Train from scratch or finetune. Additional embodiment support and VLA data collection pipeline. See Training Guide.
- [2026-03-24] 🔧 C++ inference stack update: motor error monitoring, TTS alerts, ZMQ protocol v4, idle-mode readaptation. ZMQ header size changed to 1280 bytes.
- [2026-03-16] 📦 BONES-SEED open-sourced — 142K+ human motions (~288 hours) with G1 MuJoCo trajectories.
- [2026-02-19] 🎉 Released GEAR-SONIC: pretrained checkpoints, C++ inference, VR teleoperation, and documentation.
- [2025-11-12] 🏁 Initial release with Decoupled WBC for GR00T N1.5 and N1.6.
## Table of Contents
- News
- GEAR-SONIC
- VR Whole-Body Teleoperation
- Kinematic Planner
- SONIC Training
- TODOs
- What's Included
- Setup
- Documentation
- Citation
- License
- Support
- Decoupled WBC
- Acknowledgments
## GEAR-SONIC
Website | Model | Paper | Docs
SONIC is a humanoid behavior foundation model that gives robots a core set of motor skills learned from large-scale human motion data. Rather than building separate controllers for predefined motions, SONIC uses motion tracking as a scalable training task, enabling a single unified policy to produce natural, whole-body movement and support a wide range of behaviors — from walking and crawling to teleoperation and multi-modal control. It is designed to generalize beyond the motions it has seen during training and to serve as a foundation for higher-level planning and interaction.
In this repo, we release SONIC's training code, deployment framework, model checkpoints, and teleoperation stack for data collection.
### VR Whole-Body Teleoperation
SONIC supports real-time whole-body teleoperation via a PICO VR headset, enabling natural human-to-robot motion transfer for data collection and interactive control.
*(Demo clips: Walking, Running, Sideways Movement, Kneeling, Getting Up, Jumping, Bimanual Manipulation, Object Hand-off.)*
### Kinematic Planner
SONIC includes a kinematic planner for real-time locomotion generation — choose a movement style, steer with keyboard/gamepad, and adjust speed and height on the fly.
*(Demo clips: In-the-Wild Navigation; styles: Run, Happy, Stealth, Injured, Kneeling, Hand Crawling, Elbow Crawling, Boxing.)*
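The planner interface described above (a movement style plus steering, speed, and height adjusted on the fly) can be pictured as a small command structure. This is an illustrative sketch only; `PlannerCommand` and its field names are hypothetical, not the repo's actual API.

```python
from dataclasses import dataclass

@dataclass
class PlannerCommand:
    style: str          # movement style, e.g. "run", "stealth", "injured", "boxing"
    heading_rad: float  # steering direction from keyboard/gamepad input
    speed_mps: float    # forward speed, adjustable on the fly
    height_m: float     # base height, e.g. lowered for kneeling or crawling

# Example: steer a fast run straight ahead at normal standing height.
cmd = PlannerCommand(style="run", heading_rad=0.0, speed_mps=2.0, height_m=0.72)
print(cmd.style, cmd.speed_mps)
```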
### SONIC Training
SONIC can be trained from scratch on the BONES-SEED motion capture dataset (142K+ motions, ~288 hours, retargeted to the Unitree G1), or finetuned from the checkpoint released on Hugging Face.
#### Quick start
```shell
# Install training dependencies (Isaac Lab must be installed separately — see docs)
pip install -e "gear_sonic/[training]"

# Download checkpoint + SMPL data from Hugging Face
pip install huggingface_hub
python download_from_hf.py --training

# Download BONES-SEED G1 CSVs from bones-studio.ai/seed, then convert and filter
python gear_sonic/data_process/convert_soma_csv_to_motion_lib.py \
    --input /path/to/bones_seed/g1/csv/ \
    --output data/motion_lib_bones_seed/robot \
    --fps 30 --fps_source 120 --individual --num_workers 16
python gear_sonic/data_process/filter_and_copy_bones_data.py \
    --source data/motion_lib_bones_seed/robot \
    --dest data/motion_lib_bones_seed/robot_filtered

# Finetune from released checkpoint (64+ GPUs recommended)
accelerate launch --num_processes=8 gear_sonic/train_agent_trl.py \
    +exp=manager/universal_token/all_modes/sonic_release \
    +checkpoint=sonic_release/last.pt \
    num_envs=4096 headless=True \
    ++manager_env.commands.motion.motion_lib_cfg.motion_file=data/motion_lib_bones_seed/robot_filtered \
    ++manager_env.commands.motion.motion_lib_cfg.smpl_motion_file=data/smpl_filtered
```
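The `--fps 30 --fps_source 120` flags in the conversion step imply a 4:1 temporal downsample of the source motion. A minimal sketch of that resampling on a hypothetical joint-position array (the repo's converter handles full robot state, not raw arrays):

```python
import numpy as np

fps_source, fps_target = 120, 30
stride = fps_source // fps_target  # keep every 4th frame

# 4 seconds of hypothetical 29-DoF joint positions at 120 Hz
motion_120hz = np.random.rand(480, 29)
motion_30hz = motion_120hz[::stride]
print(motion_30hz.shape)  # (120, 29)
```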
For the full guide including multi-node training, evaluation, ONNX export, and SOMA encoder setup: 📖 Installation (Training) | Training Guide
## TODOs
- Release pretrained SONIC policy checkpoints
- Open source C++ inference stack
- Setup documentation
- Open source teleoperation stack and demonstration scripts
- Release training scripts and recipes for motion imitation and fine-tuning
- Open source large-scale data collection workflows and fine-tuning VLA scripts
- Publish additional preprocessed large-scale human motion datasets
## What's Included

This release includes:

- `gear_sonic_deploy`: C++ inference stack for deploying SONIC policies on real hardware
- `gear_sonic`: full SONIC training stack — PPO training, data processing pipeline, and configuration system for training on BONES-SEED and custom motion datasets
## Setup
Git LFS required. This repo contains large binary assets (meshes, ONNX models). Without Git LFS, you will get small pointer files instead of actual data, causing silent failures. Install Git LFS first if you don't have it:
```shell
sudo apt install git-lfs && git lfs install
```

```shell
git clone https://github.com/NVlabs/GR00T-WholeBodyControl.git
cd GR00T-WholeBodyControl
git lfs pull

# Verify your environment
python check_environment.py
```
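If you are unsure whether the large assets actually downloaded, note that a Git LFS pointer is a tiny text stub beginning with the LFS spec URL rather than real binary data. A quick check along these lines can catch the silent-failure case (`model.onnx` is a placeholder; point it at any large asset in the repo):

```shell
# A Git LFS pointer file starts with "version https://git-lfs.github.com/spec/v1".
# "model.onnx" is a placeholder path, not a file shipped at the repo root.
f=model.onnx
if head -c 100 "$f" 2>/dev/null | grep -q "git-lfs.github.com"; then
  echo "$f is an LFS pointer stub - run 'git lfs pull'"
fi
```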
### Which environment do I need?
| I want to... | Environment | How to install |
|---|---|---|
| Train / finetune SONIC | Isaac Lab's Python env | Install Isaac Lab, then pip install -e "gear_sonic/[training]" |
| Run MuJoCo simulation | .venv_sim (auto-created) | bash install_scripts/install_mujoco_sim.sh |
| VR teleoperation | .venv_teleop (auto-created) | bash install_scripts/install_pico.sh |
| Collect data | .venv_data_collection (auto-created) | bash install_scripts/install_data_collection.sh |
| Deploy on real robot | C++ build | See deployment docs |
Each use case has its own lightweight environment. The install scripts use uv and create isolated venvs automatically — you don't need to manage them manually. Training is the only use case that requires Isaac Lab (installed separately).
## Documentation

- Getting Started
- Tutorials
- Training
- Best Practices
## Citation
If you use GEAR-SONIC in your research, please cite:
```bibtex
@article{luo2025sonic,
  title={SONIC: Supersizing Motion Tracking for Natural Humanoid Whole-Body Control},
  author={Luo, Zhengyi and Yuan, Ye and Wang, Tingwu and Li, Chenran and Chen, Sirui and Casta\~neda, Fernando and Cao, Zi-Ang and Li, Jiefeng and Minor, David and Ben, Qingwei and Da, Xingye and Ding, Runyu and Hogg, Cyrus and Song, Lina and Lim, Edy and Jeong, Eugene and He, Tairan and Xue, Haoru and Xiao, Wenli and Wang, Zi and Yuen, Simon and Kautz, Jan and Chang, Yan and Iqbal, Umar and Fan, Linxi and Zhu, Yuke},
  journal={arXiv preprint arXiv:2511.07820},
  year={2025}
}
```
## License
This project uses dual licensing:
- Source Code: Licensed under Apache License 2.0 - applies to all code, scripts, and software components in this repository
- Model Weights: Licensed under NVIDIA Open Model License - applies to all trained model checkpoints and weights
See LICENSE for the complete dual-license text.
Please review both licenses before using this project. The NVIDIA Open Model License permits commercial use with attribution and requires compliance with NVIDIA's Trustworthy AI terms.
All required legal documents, including the Apache 2.0 license, 3rd-party attributions, and DCO language, are consolidated in the /legal folder of this repository.
## Support

For questions, issues, and feedback, please contact the GEAR WBC team at gear-wbc@nvidia.com.
## Decoupled WBC
For the Decoupled WBC used in GR00T N1.5 and N1.6 models, please refer to the Decoupled WBC documentation.
## Acknowledgments

We would like to acknowledge the following projects, from which parts of the code in this repo are derived: