| id | title | status | source_sections | related_topics | key_equations | key_terms | images | examples | open_questions |
|---|---|---|---|---|---|---|---|---|---|
| manipulation | Manipulation & Grasping | established | reference/sources/official-product-page.md, reference/sources/official-dex-hand.md, reference/sources/github-xr-teleoperate.md | [joint-configuration sensors-perception sdk-programming learning-and-ai] | [] | [dex3_1 inspire_hand force_position_hybrid tactile_sensor teleoperation xr_teleoperate] | [] | [] | [Arm workspace envelope and reachability maps; Dex3-1 per-finger force limits; Dex3-1 maximum grasp payload (500g claimed — verify); INSPIRE hand DOF and control details] |
Manipulation & Grasping
Arm control, hand dexterity, grasping strategies, and manipulation capabilities.
1. Arm Configuration
Arm DOF varies by G1 variant (see joint-configuration): [T0]
| Variant | Joints per Arm | Wrist Articulation | Notes |
|---|---|---|---|
| 23-DOF | 5 | 1-axis (yaw only) | Basic arm |
| 29/43-DOF | 7 | 3-axis (yaw+pitch+roll) | Articulated wrist |
Arm Payload
| Variant | Payload per Arm | Source | Tier |
|---|---|---|---|
| Standard | 2 kg | Official spec sheet | T0 |
| EDU | 3 kg | Official spec sheet | T0 |
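The two tables above can be folded into a single lookup. This is an illustrative sketch only: the variant keys and field names are my own shorthand (pairing the Standard spec with the 23-DOF arm and the EDU spec with the 29-DOF arm), not SDK identifiers.

```python
# Illustrative G1 variant lookup combining the arm-configuration and
# payload tables. Keys and field names are shorthand, not SDK names.
G1_VARIANTS = {
    "standard-23dof": {"arm_joints": 5, "wrist_axes": 1, "arm_payload_kg": 2.0},
    "edu-29dof":      {"arm_joints": 7, "wrist_axes": 3, "arm_payload_kg": 3.0},
}

def within_arm_payload(variant: str, load_kg: float) -> bool:
    """Check a single-arm load against the spec-sheet payload limit."""
    return load_kg <= G1_VARIANTS[variant]["arm_payload_kg"]
```

A planner can use such a check to reject grasp targets that exceed the per-arm rating before attempting a lift.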
2. End Effectors
Three hand options are available depending on variant: [T0]
End Prosthetic Hand (Base Model)
- Type: Simplified gripper
- DOF: Minimal (open/close)
- Use case: Basic object handling
- Available on: G1 Standard
Dex3-1 Three-Fingered Dexterous Hand (EDU A/B)
| Property | Value |
|---|---|
| Fingers | 3 (thumb, index, middle) |
| Total DOF per hand | 7 |
| Thumb DOF | 3 active |
| Index finger DOF | 2 active |
| Middle finger DOF | 2 active |
| Actuators | 6 micro brushless direct-drive + 1 gear-drive |
| Tactile sensors | 33 per hand |
| Control | Force-position hybrid |
| Grasp payload | Up to 500g (reported) [T2] |
| DDS command topic | rt/dex3/(left|right)/cmd |
| DDS state topic | rt/dex3/(left|right)/state |
| Message protocol | unitree_hg |
Capabilities: Grasp bottles, tools, boxes; door handle manipulation; basic tool use; object sorting [T1]
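The DOF breakdown and DDS topic pattern from the table can be sketched in code. The topic strings follow the `rt/dex3/(left|right)/(cmd|state)` pattern stated above; the per-finger DOF split mirrors the table. Consult the `unitree_hg` message definitions for the authoritative joint ordering inside the 7-DOF command vector, which is not specified here.

```python
# Per-finger active DOF of the Dex3-1 hand, as listed in the table
# above (3 + 2 + 2 = 7 active DOF per hand).
DEX3_DOF = {"thumb": 3, "index": 2, "middle": 2}

def dex3_topic(side: str, kind: str) -> str:
    """Build a Dex3-1 DDS topic string: rt/dex3/<left|right>/<cmd|state>."""
    if side not in ("left", "right") or kind not in ("cmd", "state"):
        raise ValueError("side must be left/right, kind must be cmd/state")
    return f"rt/dex3/{side}/{kind}"
```

For example, `dex3_topic("left", "cmd")` yields the command topic for the left hand.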
INSPIRE DFX Dexterous Hand (EDU C / Flagship Version C)
- Type: Full 5-finger advanced dexterous hand
- DOF: Higher than Dex3-1 (exact count per finger not yet confirmed)
- Features: Enhanced precision manipulation
- Compatibility: ROS2 teleoperation systems, multi-hand configurations
- Documentation: https://support.unitree.com/home/en/G1_developer/inspire_dfx_dexterous_hand [T0]
3. Grasping & Object Manipulation
Demonstrated capabilities with Dex3-1 hand: [T1 — Videos and demos]
- Common objects: Bottles, cups, tools, small boxes
- Door manipulation: Handle turning and door opening
- Tool use: Basic tool grasping and manipulation
- Object sorting: Pick-and-place operations
Grasping is typically performed using:
- Visual servoing via D435i depth camera
- Force-controlled approach using tactile feedback
- Compliant grasping via force-position hybrid control
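The force-position hybrid idea can be sketched for a single finger joint: track a position target, add a feed-forward force term, and saturate the output so fingertip force stays bounded in contact. All gains and limits below are illustrative placeholders, not Dex3-1 specifications.

```python
def hybrid_joint_torque(q, dq, q_des, tau_ff, kp=1.5, kd=0.05, tau_max=0.8):
    """One-DOF force-position hybrid sketch: PD position tracking plus a
    feed-forward torque, clamped to a torque limit so the commanded
    fingertip force stays bounded during contact. Gains and the limit
    are made-up values for illustration."""
    tau = kp * (q_des - q) - kd * dq + tau_ff
    return max(-tau_max, min(tau_max, tau))
```

In free space the PD term dominates and the joint tracks position; once the finger contacts an object and the error grows, the clamp caps the applied torque, giving compliant grasping behavior.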
4. Whole-Body Manipulation (Loco-Manipulation)
Coordinating locomotion with arm manipulation is an active research area: [T1 — Research papers]
- GR00T-WBC (NVIDIA): Purpose-built whole-body control framework for G1 that separates locomotion (balance) from upper-body (manipulation) control; it is the recommended path for loco-manipulation. See whole-body-control for full details.
- SoFTA framework (arXiv: SoFTA paper): Slow-Fast Two-Agent RL decoupling upper and lower body with different execution frequencies for precise manipulation during locomotion
- Safe control (arXiv:2502.02858): Projected Safe Set Algorithm enforces limb-level geometric constraints for collision prevention during manipulation in cluttered environments
The G1-D platform (wheeled variant) is specifically designed for manipulation tasks, with a mobile base providing stable platform support. [T0]
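The slow-fast decoupling behind SoFTA can be illustrated with a toy two-rate scheduler: two agents share one timeline, one acting every control step and the other acting less often, with its last command held (zero-order hold) in between. Which body half gets which rate follows the paper; this sketch only shows the mechanics, and the rates are arbitrary.

```python
def run_two_rate(steps, slow_every=5):
    """Toy two-agent scheduler: the fast agent emits a command every
    step; the slow agent emits one every `slow_every` steps, and its
    most recent command is held between updates (zero-order hold)."""
    slow_cmd = None
    log = []
    for t in range(steps):
        if t % slow_every == 0:
            slow_cmd = f"slow@{t}"        # slow agent acts
        log.append((f"fast@{t}", slow_cmd))  # fast agent acts every step
    return log
```

Running `run_two_rate(10)` shows the slow command updating only at steps 0 and 5 while the fast command changes every step.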
Mocap-Based Manipulation
Motion capture data can drive arm manipulation trajectories through the retargeting pipeline (see motion-retargeting). When combined with a whole-body controller (see whole-body-control), the robot can track human arm motions while maintaining balance — enabling demonstration-driven manipulation. [T1]
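The simplest step of such a retargeting pipeline is scaling shoulder-relative human wrist positions into the robot's smaller reach; real pipelines additionally solve inverse kinematics and respect joint limits. The link lengths below are illustrative, not G1 measurements.

```python
def retarget_wrist(p_human, human_arm_len=0.7, robot_arm_len=0.45):
    """Scale a human wrist position (shoulder-relative, metres) into the
    robot's workspace by the ratio of arm lengths. This is the most
    naive retargeting step; IK and joint limits are omitted."""
    s = robot_arm_len / human_arm_len
    return tuple(s * c for c in p_human)
```

A fully extended human arm, e.g. `(0.7, 0.0, 0.0)`, maps to a fully extended robot arm at `(0.45, 0.0, 0.0)`.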
5. Teleoperation for Data Collection
Multiple teleoperation systems enable human demonstration collection: [T0 — GitHub repos]
XR Teleoperation (xr_teleoperate)
| Property | Value |
|---|---|
| Repository | https://github.com/unitreerobotics/xr_teleoperate |
| Supported devices | Apple Vision Pro, PICO 4 Ultra, Meta Quest 3 |
| Control modes | Hand tracking, controller tracking |
| Display modes | Immersive, pass-through |
| Data recording | Built-in episode capture |
| G1 configurations | 29-DOF and 23-DOF |
Supported end-effectors: Dex1-1, Dex3-1, INSPIRE hand, BrainCo hands
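The built-in episode capture can be pictured as timestamped frames of commanded joint targets appended to an episode buffer. The field names below are my own sketch of what such a recorder captures (camera frames omitted), not the xr_teleoperate schema.

```python
from dataclasses import dataclass, field

@dataclass
class TeleopFrame:
    """One recorded teleoperation frame. Field names are illustrative,
    not the xr_teleoperate data format."""
    t: float          # timestamp in seconds
    arm_q: list       # arm joint targets sent this frame
    hand_q: list      # hand joint targets sent this frame

@dataclass
class Episode:
    """A demonstration episode: an append-only list of frames."""
    frames: list = field(default_factory=list)

    def record(self, frame: TeleopFrame) -> None:
        self.frames.append(frame)
```

Episodes collected this way feed directly into the imitation-learning pipeline noted under Key Relationships.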
Kinect Teleoperation (kinect_teleoperate)
- Azure Kinect DK camera for body tracking
- MuJoCo 3.1.5 for visualization
- Safety "wake-up action" detection prevents accidental activation
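A wake-up gesture detector of this kind can be sketched as a debounced pose check: trigger only after both wrists stay above the head for a run of consecutive frames, so a transient pose cannot start teleoperation. The keypoint layout (y measured upward) and the 30-frame hold are assumptions, not the kinect_teleoperate implementation.

```python
def wake_up_detected(frames, hold=30):
    """Return True once both wrists have been above the head for `hold`
    consecutive frames. Each frame is a dict of keypoint heights with
    y increasing upward; layout and hold length are illustrative."""
    streak = 0
    for f in frames:
        above = (f["l_wrist_y"] > f["head_y"]
                 and f["r_wrist_y"] > f["head_y"])
        streak = streak + 1 if above else 0
        if streak >= hold:
            return True
    return False
```

Requiring a sustained, deliberate pose rather than a single-frame match is what prevents accidental activation.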
Key Relationships
- Uses: joint-configuration (arm + hand joints, DOF varies by variant)
- Uses: sensors-perception (D435i for visual servoing, tactile for contact)
- Controlled via: sdk-programming (DDS topics for hand control)
- May use: learning-and-ai (learned manipulation and loco-manipulation policies)
- Data collection: learning-and-ai (teleoperation → imitation learning)