---
id: gb10-offboard-compute
title: "Dell Pro Max GB10 — Offboard AI Compute"
status: proposed
source_sections: "Cross-referenced from git/spark context system"
related_topics: [hardware-specs, networking-comms, learning-and-ai, sensors-perception, deployment-operations]
key_equations: []
key_terms: [dell-pro-max-gb10, dgx-spark, offboard-compute, llm, vlm, isaac-lab]
images: []
examples: []
open_questions:
  - "DDS latency over Wi-Fi between GB10 and G1 under realistic conditions"
  - "Optimal LLM size for real-time task planning (latency vs. capability tradeoff)"
---

# Dell Pro Max GB10 — Offboard AI Compute

The Dell Pro Max GB10 (NVIDIA Grace Blackwell) can serve as an offboard AI brain for the G1, handling large-model inference, training, and simulation workloads that exceed the Jetson Orin NX's capabilities.

**Full integration document:** See `git/spark/context/g1-integration.md` in the Dell Pro Max GB10 knowledge base for the complete architecture, code examples, and setup instructions.
## 1. Capability Comparison

| Capability | G1 Orin NX | Dell Pro Max GB10 |
|---|---|---|
| AI compute | 100 TOPS | 1,000 TFLOPS (FP4) |
| Memory | 16 GB | 128 GB unified LPDDR5X |
| Max LLM | ~7B (quantized) | ~200B (FP4) |
| CUDA arch | sm_87 | sm_121 (Blackwell) |
| CPU | ARM (Orin) | ARM (Cortex-X925/A725) |
| Price | Included in G1 EDU | $3,699-$3,999 |

## 2. Connection

- **Wi-Fi:** G1 Wi-Fi 6 ↔ GB10 Wi-Fi 7 (backward compatible). ~1 Gbps, 5-50 ms latency.
- **10GbE:** GB10 RJ45 to G1 Ethernet. 10 Gbps, <1 ms latency. Best for lab use.
- **Subnet:** GB10 joins 192.168.123.0/24 (e.g., 192.168.123.100) or uses a router bridge.
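
Before committing to one of the links above, the round trip can be measured directly from the Orin NX. A minimal sketch in Python (stdlib only); the address and port are the ones used in the API example in this note, and `tcp_rtt_ms` is an illustrative helper, not part of any G1 or GB10 SDK:

```python
import socket
import time

def tcp_rtt_ms(host: str, port: int, samples: int = 5) -> float:
    """Median TCP connect time in milliseconds to host:port."""
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        try:
            with socket.create_connection((host, port), timeout=2.0):
                pass  # handshake completed; close immediately
        except OSError:
            return float("inf")  # unreachable or refused on this link
        times.append((time.perf_counter() - start) * 1000.0)
    times.sort()
    return times[len(times) // 2]

# Example (requires a reachable GB10; address from this note):
# print(f"RTT to GB10: {tcp_rtt_ms('192.168.123.100', 30000):.1f} ms")
```

Running this over both Wi-Fi and 10GbE gives a concrete number to weigh against the latency budgets in the use-case table below.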

## 3. Key Use Cases

| Use Case | G1 Role | GB10 Role | Latency OK? |
|---|---|---|---|
| LLM task planning | Sends command, executes plan | Runs 70B+ LLM, returns plan | Yes (1-5 s) |
| Vision-language | Streams D435i frames | Runs large VLM | Yes (0.5-2 s) |
| RL policy training | Deploys trained policy | Runs Isaac Lab simulation | Offline |
| Imitation learning | Collects demo data | Trains LeRobot policies | Offline |
| Speech interaction | STT/TTS on Orin | LLM reasoning on GB10 | Yes (1-5 s) |

## 4. What Stays On-Robot

- 500 Hz locomotion control loop (RK3588)
- Balance and stability (real-time, cannot tolerate network latency)
- Emergency stop
- Basic perception (on Orin NX)

The GB10 handles only high-level reasoning with relaxed latency requirements.
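
Because balance and the e-stop never leave the robot, whatever consumes GB10 plans on the Orin NX should degrade safely when the link drops. A minimal timeout sketch; the class and method names are hypothetical (the real bridge would live in the DDS/ROS 2 layer), and only the watchdog logic is shown:

```python
import time

class OffboardPlannerWatchdog:
    """Fall back to a safe on-robot behavior if GB10 plans stop arriving.

    Illustrative sketch only: locomotion and balance continue on the
    RK3588 regardless; this guards the high-level command stream.
    """

    def __init__(self, timeout_s: float = 2.0):
        self.timeout_s = timeout_s
        self.last_plan_time = time.monotonic()

    def on_plan_received(self, plan):
        # Called whenever a fresh plan arrives from the GB10.
        self.last_plan_time = time.monotonic()
        return plan

    def current_command(self, plan_step, safe_idle):
        # If the GB10 has gone quiet past the timeout, hold a safe
        # behavior instead of executing a stale plan step.
        if time.monotonic() - self.last_plan_time > self.timeout_s:
            return safe_idle
        return plan_step
```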

## 5. LLM API Access

The GB10 runs an OpenAI-compatible API:

```bash
# From the G1 Orin NX
curl http://192.168.123.100:30000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model":"llama","messages":[{"role":"user","content":"Walk to the table and pick up the red cup"}]}'
```
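
The same call can be made from Python on the Orin NX with only the standard library. Endpoint and model name are taken from the curl example above; `ask_gb10` is an illustrative helper, not an SDK function:

```python
import json
import urllib.request

GB10_URL = "http://192.168.123.100:30000/v1/chat/completions"  # from the curl example

def build_chat_request(prompt: str, model: str = "llama") -> dict:
    """Payload for an OpenAI-compatible /v1/chat/completions endpoint."""
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}

def ask_gb10(prompt: str, timeout_s: float = 30.0) -> str:
    """Send a task-planning prompt to the GB10 and return the reply text."""
    req = urllib.request.Request(
        GB10_URL,
        data=json.dumps(build_chat_request(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=timeout_s) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# Example (requires a reachable GB10):
# plan = ask_gb10("Walk to the table and pick up the red cup")
```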

## 6. ARM Compatibility

Both systems are ARM64-native. Model files (`.pt`, `.onnx`, `.gguf`) trained on the GB10 deploy directly to the Orin NX without architecture conversion. Container images are interoperable (both aarch64).
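
A deployment script can still sanity-check that both ends report an ARM64 machine string before copying models or containers. A small illustrative helper (not part of any toolchain); `remote_machine` would come from running `uname -m` on the other host:

```python
import platform

# `uname -m` / platform.machine() values that count as ARM64
COMPATIBLE_MACHINES = {"aarch64", "arm64"}

def same_arch(machine_a: str, machine_b: str) -> bool:
    """True if both machine strings are ARM64, i.e. no cross-compile needed."""
    return machine_a in COMPATIBLE_MACHINES and machine_b in COMPATIBLE_MACHINES

# On either host: platform.machine() returns "aarch64" on both GB10 and Orin NX
# same_arch(platform.machine(), remote_machine)  # remote_machine is hypothetical
```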

## Key Relationships

- Computes for: [[learning-and-ai]] (training server)
- Connects via: [[networking-comms]] (Wi-Fi or Ethernet)
- Enhances: [[sensors-perception]] (large VLM inference)
- Deployed from: [[deployment-operations]] (trained models → real robot)
- Full reference: `git/spark/context/g1-integration.md`