


GR00T-WholeBodyControl

This is the codebase for the GR00T Whole-Body Control (WBC) projects. It hosts model checkpoints and scripts for training, evaluating, and deploying advanced whole-body controllers for humanoid robots. We currently support:

  • Decoupled WBC: the decoupled controller (RL for the lower body, IK for the upper body) used in the NVIDIA GR00T N1.5 and N1.6 models;
  • GEAR-SONIC Series: our latest iteration of generalist humanoid whole-body controllers (see our whitepaper).

GEAR-SONIC

Website | Model | Paper | Docs

SONIC is a humanoid behavior foundation model that gives robots a core set of motor skills learned from large-scale human motion data. Rather than building separate controllers for predefined motions, SONIC uses motion tracking as a scalable training task, enabling a single unified policy to produce natural, whole-body movement and support a wide range of behaviors — from walking and crawling to teleoperation and multi-modal control. It is designed to generalize beyond the motions it has seen during training and to serve as a foundation for higher-level planning and interaction.

In this repo, we will release SONIC's training code, deployment framework, model checkpoints, and teleoperation stack for data collection.

VR Whole-Body Teleoperation

SONIC supports real-time whole-body teleoperation via PICO VR headset, enabling natural human-to-robot motion transfer for data collection and interactive control.

Demo videos: Walking, Running, Sideways Movement, Kneeling, Getting Up, Jumping, Bimanual Manipulation, Object Hand-off

Kinematic Planner

SONIC includes a kinematic planner for real-time locomotion generation — choose a movement style, steer with keyboard/gamepad, and adjust speed and height on the fly.
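The controls described above (a movement style plus steering and on-the-fly speed/height adjustment) suggest a simple command structure. A minimal sketch of what such an interface might look like; the `PlannerCommand` class and all field names are hypothetical illustrations, not the repo's actual API:

```python
# Hypothetical command structure for a kinematic locomotion planner:
# a named style plus continuously adjustable velocity and height targets.
# Names and defaults are illustrative, not taken from this repository.
from dataclasses import dataclass


@dataclass
class PlannerCommand:
    style: str             # movement style, e.g. "run", "stealth", "elbow_crawl"
    vx: float = 0.0        # forward velocity (m/s)
    vy: float = 0.0        # lateral velocity (m/s)
    yaw_rate: float = 0.0  # turning rate (rad/s)
    height: float = 0.74   # target root height (m)

    def clamped(self, max_speed: float = 1.5) -> "PlannerCommand":
        """Clamp the planar speed to max_speed while preserving direction."""
        speed = (self.vx ** 2 + self.vy ** 2) ** 0.5
        if speed <= max_speed:
            return self
        scale = max_speed / speed
        return PlannerCommand(self.style, self.vx * scale, self.vy * scale,
                              self.yaw_rate, self.height)
```

A gamepad or keyboard handler would update `vx`, `vy`, `yaw_rate`, and `height` each tick and hand the clamped command to the policy.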

Demo videos: In-the-Wild Navigation, Run, Happy, Stealth, Injured, Kneeling, Hand Crawling, Elbow Crawling, Boxing

TODOs

  • Release pretrained SONIC policy checkpoints
  • Open source C++ inference stack
  • Setup documentation
  • Open source teleoperation stack and demonstration scripts
  • Release training scripts and recipes for motion imitation and fine-tuning
  • Open source large-scale data collection workflows and VLA fine-tuning scripts
  • Publish additional preprocessed large-scale human motion datasets

What's Included

This release includes:

  • gear_sonic_deploy: C++ inference stack for deploying SONIC policies on real hardware
  • gear_sonic: Teleoperation stack for collecting demonstration data (training code not yet included)

Setup

Clone the repository with Git LFS:

git clone https://github.com/NVlabs/GR00T-WholeBodyControl.git
cd GR00T-WholeBodyControl
git lfs pull
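Since the model checkpoints are stored with Git LFS, cloning without it installed leaves small pointer files in place of the actual weights. A quick pre-flight check in POSIX shell; the `check_tool` helper is illustrative, not part of this repo:

```shell
# Verify that git and git-lfs are on PATH before cloning.
# check_tool is an illustrative helper, not provided by this repository.
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1: found"
  else
    # Install git-lfs via your package manager, e.g.
    # 'apt install git-lfs' or 'brew install git-lfs'.
    echo "$1: missing"
  fi
}

check_tool git
check_tool git-lfs
```

If `git-lfs` was installed after cloning, running `git lfs pull` inside the repo fetches the real files.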

Documentation

📚 Full Documentation

Getting Started

Tutorials

Best Practices


Citation

If you use GEAR-SONIC in your research, please cite:

@article{luo2025sonic,
    title={SONIC: Supersizing Motion Tracking for Natural Humanoid Whole-Body Control},
    author={Luo, Zhengyi and Yuan, Ye and Wang, Tingwu and Li, Chenran and Chen, Sirui and Casta{\~n}eda, Fernando and Cao, Zi-Ang and Li, Jiefeng and Minor, David and Ben, Qingwei and Da, Xingye and Ding, Runyu and Hogg, Cyrus and Song, Lina and Lim, Edy and Jeong, Eugene and He, Tairan and Xue, Haoru and Xiao, Wenli and Wang, Zi and Yuen, Simon and Kautz, Jan and Chang, Yan and Iqbal, Umar and Fan, Linxi and Zhu, Yuke},
    journal={arXiv preprint arXiv:2511.07820},
    year={2025}
}

License

This project uses dual licensing:

  • Source Code: Licensed under Apache License 2.0 - applies to all code, scripts, and software components in this repository
  • Model Weights: Licensed under NVIDIA Open Model License - applies to all trained model checkpoints and weights

See LICENSE for the complete dual-license text.

Please review both licenses before using this project. The NVIDIA Open Model License permits commercial use with attribution and requires compliance with NVIDIA's Trustworthy AI terms.

All required legal documents, including the Apache 2.0 license, 3rd-party attributions, and DCO language, are consolidated in the /legal folder of this repository.


Support

For questions, issues, and feedback, please contact the GEAR WBC team at gear-wbc@nvidia.com.

Decoupled WBC

For the Decoupled WBC used in GR00T N1.5 and N1.6 models, please refer to the Decoupled WBC documentation.

Acknowledgments

We would like to acknowledge the following projects, from which parts of the code in this repo are derived: