
[bump] teleimager televuer version, update README, fit main

main · silencht · 4 months ago · parent commit 62e197c1e4

Changed files:
1. CHANGELOG_zh-CN.md (6 lines changed)
2. README_zh-CN.md (45 lines changed)
3. teleop/teleimager (2 lines changed)
4. teleop/teleop_hand_and_arm.py (36 lines changed)
5. teleop/televuer (2 lines changed)

CHANGELOG_zh-CN.md (6 lines changed)

@@ -7,17 +7,19 @@
- Upgraded [televuer](https://github.com/silencht/televuer); see that repository's README for details.
> The new [teleimager](https://github.com/silencht/teleimager/commit/ab5018691943433c24af4c9a7f3ea0c9a6fbaf3c) + [televuer](https://github.com/silencht/televuer/releases/tag/v3.0) support streaming the head-camera image via **WebRTC**
>
> Three modes are supported: pass-through, ego, and immersive. Pass-through shows the real world directly through the VR cameras so you can observe the robot; ego adds a small robot-view window on top of pass-through; immersive is a fully immersive robot first-person view.
- Enriched the task-info parameters passed in **record mode**; fixed and improved EpisodeWriter.
- Improved the system's **state-machine information** and IPC mode.
- Added a **pass-through mode** that shows the outside environment directly through the VR headset's cameras (without using the robot's head camera)
- Added an **affinity (CPU affinity) mode**; if you are not familiar with it, simply ignore it.
- Added a **motion-switcher feature** that enters and exits debug mode automatically, without needing the remote controller.
- Supported the **inspire_FTP** dexterous hand
## 🏷️ v1.3
- Added [![Unitree LOGO](https://camo.githubusercontent.com/ff307b29fe96a9b115434a450bb921c2a17d4aa108460008a88c58a67d68df4e/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f4769744875622d57696b692d3138313731373f6c6f676f3d676974687562)](https://github.com/unitreerobotics/xr_teleoperate/wiki) [![Unitree LOGO](https://camo.githubusercontent.com/6f5253a8776090a1f89fa7815e7543488a9ec200d153827b4bc7c3cb5e1c1555/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f2d446973636f72642d3538363546323f7374796c653d666c6174266c6f676f3d446973636f7264266c6f676f436f6c6f723d7768697465)](https://discord.gg/ZwcVwxv5rq)

README_zh-CN.md (45 lines changed)

@@ -43,10 +43,12 @@
- Upgraded [televuer](https://github.com/silencht/televuer); see that repository's README for details.
> The new [teleimager](https://github.com/silencht/teleimager/commit/ab5018691943433c24af4c9a7f3ea0c9a6fbaf3c) + [televuer](https://github.com/silencht/televuer/releases/tag/v3.0) support streaming the head-camera image via webrtc
>
> Three modes are supported: pass-through, ego, immersive
- Improved the system's **state machine** information and IPC mode.
- Added a **pass-through mode** that shows the outside environment directly through the VR headset's cameras (without using the robot's head camera)
- Supported the **inspire_FTP** dexterous hand.
- ···
@@ -142,10 +144,11 @@
# Install the televuer module
(tv) unitree@Host:~/xr_teleoperate$ cd teleop/televuer
(tv) unitree@Host:~/xr_teleoperate/teleop/televuer$ pip install -e .
# Generate the certificate files required by the televuer module
# 1. If you use a Pico / Quest or similar XR device
# Configure SSL certificates for the televuer module so that XR devices (e.g. Pico / Quest / Apple Vision Pro) can connect securely over HTTPS / WebRTC
# 1. Generate the certificate files
# 1.1 If you use a Pico / Quest or similar XR device
(tv) unitree@Host:~/xr_teleoperate/teleop/televuer$ openssl req -x509 -nodes -days 365 -newkey rsa:2048 -keyout key.pem -out cert.pem
# 2. If you use an Apple Vision Pro device
# 1.2 If you use an Apple Vision Pro device
(tv) unitree@Host:~/xr_teleoperate/teleop/televuer$ openssl genrsa -out rootCA.key 2048
(tv) unitree@Host:~/xr_teleoperate/teleop/televuer$ openssl req -x509 -new -nodes -key rootCA.key -sha256 -days 365 -out rootCA.pem -subj "/CN=xr-teleoperate"
(tv) unitree@Host:~/xr_teleoperate/teleop/televuer$ openssl genrsa -out key.pem 2048
@@ -163,6 +166,14 @@ build cert.pem key.pem LICENSE pyproject.toml README.md rootCA.key rootCA
# Allow the port through the firewall
(tv) unitree@Host:~/xr_teleoperate/teleop/televuer$ sudo ufw allow 8012
# Copy rootCA.pem to the Apple Vision Pro via AirDrop and install it there
# 2. Configure the certificate paths; choose either of the following options
# 2.1 Via environment variables (optional)
(tv) unitree@Host:~/xr_teleoperate/teleop/televuer$ echo 'export XR_TELEOP_CERT="$HOME/xr_teleoperate/teleop/televuer/cert.pem"' >> ~/.bashrc
(tv) unitree@Host:~/xr_teleoperate/teleop/televuer$ echo 'export XR_TELEOP_KEY="$HOME/xr_teleoperate/teleop/televuer/key.pem"' >> ~/.bashrc
(tv) unitree@Host:~/xr_teleoperate/teleop/televuer$ source ~/.bashrc
# 2.2 Via the user config directory (optional)
(tv) unitree@Host:~/xr_teleoperate/teleop/televuer$ mkdir -p ~/.config/xr_teleoperate/
(tv) unitree@Host:~/xr_teleoperate/teleop/televuer$ cp cert.pem key.pem ~/.config/xr_teleoperate/
```
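The two certificate-path options above (environment variables first, then the user config directory) can be sketched as a small resolver. This is a minimal illustration of the documented lookup order, not the project's actual loading code; the function name `resolve_cert_paths` is hypothetical, while `XR_TELEOP_CERT`, `XR_TELEOP_KEY`, and `~/.config/xr_teleoperate/` come from the instructions above.

```python
import os
from pathlib import Path

def resolve_cert_paths(env=os.environ):
    """Hypothetical resolver mirroring the two documented options:
    1. XR_TELEOP_CERT / XR_TELEOP_KEY environment variables
    2. cert.pem / key.pem in ~/.config/xr_teleoperate/
    Returns a (cert_path, key_path) tuple."""
    cert = env.get("XR_TELEOP_CERT")
    key = env.get("XR_TELEOP_KEY")
    if cert and key:
        # Option 2.1: both environment variables are set
        return Path(cert), Path(key)
    # Option 2.2: fall back to the user config directory
    cfg = Path.home() / ".config" / "xr_teleoperate"
    return cfg / "cert.pem", cfg / "key.pem"
```

Putting both files in `~/.config/xr_teleoperate/` avoids editing `~/.bashrc`, while the environment variables let different shells point at different certificates.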
```bash
@@ -249,22 +260,22 @@ build cert.pem key.pem LICENSE pyproject.toml README.md rootCA.key rootCA
- Basic control parameters
| ⚙️ Parameter | 📜 Description | 🔘 Available values | 📌 Default |
| :-----------: | :----------------------------------------------: | :------------------------------------------------------: | :------: |
| :---------------: | :----------------------------------------------------------: | :----------------------------------------------------------: | :---------------: |
| `--frequency` | Sets the recording and control FPS | any float in a normal range | 30.0 |
| `--xr-mode` | Selects the XR input mode (how the robot is controlled) | `hand` (**hand tracking**)<br />`controller` (**controller tracking**) | `hand` |
| `--input-mode` | Selects the XR input mode (how the robot is controlled) | `hand` (**hand tracking**)<br />`controller` (**controller tracking**) | `hand` |
| `--display-mode` | Selects the XR display mode (how the robot's view is displayed) | `immersive` (immersive)<br />`ego` (pass-through + first-person window)<br />`pass-through` (pass-through) | `immersive` |
| `--arm` | Selects the robot type (see 0. 📖 Introduction) | `G1_29`<br />`G1_23`<br />`H1_2`<br />`H1` | `G1_29` |
| `--ee` | Selects the arm's end-effector type (see 0. 📖 Introduction) | `dex1`<br />`dex3`<br />`inspire1`<br />`brainco` | no default |
| `--ee` | Selects the arm's end-effector type (see 0. 📖 Introduction) | `dex1`<br />`dex3`<br />`inspire_ftp`<br />`inspire_dfx`<br />`brainco` | no default |
| `--img-server-ip` | Sets the image server's IP address, used to receive the image stream and to configure the WebRTC signaling address | an `IPv4` address | `192.168.123.164` |
- Mode-flag parameters
| ⚙️ Parameter | 📜 Description |
| :---------------: | :----------------------------------------------------------: |
| :----------: | :----------------------------------------------------------: |
| `--motion` | [Enables **motion control** mode]<br />With this mode on, the teleoperation program can run while the robot's locomotion controller is running.<br />In **hand tracking** mode, the [R3 remote controller](https://www.unitree.com/cn/R3) can be used to make the robot walk normally; in **controller tracking** mode, the [controller thumbsticks can also drive the robot](https://github.com/unitreerobotics/xr_teleoperate/blob/375cdc27605de377c698e2b89cad0e5885724ca6/teleop/teleop_hand_and_arm.py#L247-L257). |
| `--headless` | [Enables **headless** mode]<br />For deployments without a display, e.g. on the development computing unit (PC2) |
| `--sim` | [Enables [**simulation mode**](https://github.com/unitreerobotics/unitree_sim_isaaclab)] |
| `--ipc` | [Inter-process communication mode]<br />Lets another process drive xr_teleoperate's state transitions via IPC; suited to interacting with an agent program |
| `--pass-through` | [Pass-through mode]<br />View the outside environment directly through the VR headset's pass-through (instead of the robot head-camera video stream) |
| `--img-server-ip` | Sets the image server's IP address, used to receive the image stream and to configure the WebRTC signaling address |
| `--affinity` | [CPU affinity mode]<br />Sets CPU core affinity. If you do not know what this is, do not set it. |
| `--record` | [Enables **data recording** mode]<br />After pressing **r** to enter teleoperation, press **s** to start recording; press **s** again to stop recording and save the current episode.<br />Press **s** again to repeat the process. |
| `--task-*` | These parameters configure the recording save path and the task goal, description, steps, and other info |
@@ -383,19 +394,25 @@ build cert.pem key.pem LICENSE pyproject.toml README.md rootCA.key rootCA
# Copy the key.pem and cert.pem files configured in section 1.1 from the local host's xr_teleoperate/teleop/televuer directory to the corresponding path on PC2
# These two files are required for teleimager to start its WebRTC service
(tv) unitree@Host:~$ scp ~/xr_teleoperate/teleop/televuer/key.pem ~/xr_teleoperate/teleop/televuer/cert.pem unitree@192.168.123.164:~/teleimager
# Following the teleimager docs at https://github.com/silencht/teleimager/blob/main/README.md, configure the certificate paths on PC2, for example
(teleimager) unitree@PC2:~$ cd teleimager
(teleimager) unitree@PC2:~$ mkdir -p ~/.config/xr_teleoperate/
(teleimager) unitree@PC2:~/teleimager$ cp cert.pem key.pem ~/.config/xr_teleoperate/
```
3. On the **development computing unit PC2**, configure cam_config_server.yaml per the teleimager docs and start the image server program
```bash
(teleimager) unitree@PC2:~/image_server$ sudo $(which python) -m teleimager.image_server
(teleimager) unitree@PC2:~/image_server$ python -m teleimager.image_server
# The command below has the same effect
(teleimager) unitree@PC2:~/image_server$ teleimager-server
```
4. On the **local host**, run the following to subscribe to the images:
```bash
(tv) unitree@Host:~$ cd ~/xr_teleoperate/teleop/teleimager/src
(tv) unitree@Host:~/xr_teleoperate/teleop/teleimager/src$ python -m teleimager.image_client
(tv) unitree@Host:~/xr_teleoperate/teleop/teleimager/src$ python -m teleimager.image_client --host 192.168.123.164
# If a WebRTC image stream is configured, you can open https://192.168.123.164:60001 in a browser and click the Start button to test it
```
@@ -407,7 +424,7 @@ build cert.pem key.pem LICENSE pyproject.toml README.md rootCA.key rootCA
>
> Note 2: If a G1 robot configuration is selected and the [Inspire DFX dexterous hand](https://support.unitree.com/home/zh/G1_developer/inspire_dfx_dexterous_hand) is used, see the related issue [#46](https://github.com/unitreerobotics/xr_teleoperate/issues/46).
>
> Note 3: If the selected robot configuration uses the [Inspire FTP dexterous hand](https://support.unitree.com/home/zh/G1_developer/inspire_ftp_dexterity_hand), see the related issue [#48](https://github.com/unitreerobotics/xr_teleoperate/issues/48).
> Note 3: If the selected robot configuration uses the [Inspire FTP dexterous hand](https://support.unitree.com/home/zh/G1_developer/inspire_ftp_dexterity_hand), see the related issue [#48](https://github.com/unitreerobotics/xr_teleoperate/issues/48). The FTP dexterous hand is now supported; see the `--ee` parameter.
First, clone the dexterous-hand control interface program from [this link: DFX_inspire_service](https://github.com/unitreerobotics/DFX_inspire_service), then copy it to the Unitree robot's **PC2**.
@@ -480,8 +497,6 @@ xr_teleoperate/
├── assets [robot URDF-related files]
├── hardware [3D-printed modules]
├── teleop
│ ├── teleimager [brand-new image service library with support for many features]
│ │

teleop/teleimager (2 lines changed)

@@ -1 +1 @@
Subproject commit fb87149bc1c34723c2d5a7f6636236d73797d612
Subproject commit 04741e1635f8d364ee340656113159766055e66f

teleop/teleop_hand_and_arm.py (36 lines changed)

@@ -76,17 +76,17 @@ if __name__ == '__main__':
parser = argparse.ArgumentParser()
# basic control parameters
parser.add_argument('--frequency', type = float, default = 30.0, help = 'control and record \'s frequency')
parser.add_argument('--xr-mode', type=str, choices=['hand', 'controller'], default='hand', help='Select XR device tracking source')
parser.add_argument('--input-mode', type=str, choices=['hand', 'controller'], default='hand', help='Select XR device input tracking source')
parser.add_argument('--display-mode', type=str, choices=['immersive', 'ego', 'pass-through'], default='immersive', help='Select XR device display mode')
parser.add_argument('--arm', type=str, choices=['G1_29', 'G1_23', 'H1_2', 'H1'], default='G1_29', help='Select arm controller')
parser.add_argument('--ee', type=str, choices=['dex1', 'dex3', 'inspire_ftp', 'inspire_dfx', 'brainco'], help='Select end effector controller')
parser.add_argument('--img-server-ip', type=str, default='10.0.7.96', help='IP address of image server, used by teleimager and televuer')
# mode flags
parser.add_argument('--motion', action = 'store_true', help = 'Enable motion control mode')
parser.add_argument('--headless', action='store_true', help='Enable headless mode (no display)')
parser.add_argument('--sim', action = 'store_true', help = 'Enable isaac simulation mode')
parser.add_argument('--ipc', action = 'store_true', help = 'Enable IPC server to handle input; otherwise enable sshkeyboard')
parser.add_argument('--pass-through', action='store_true', help='Enable passthrough mode (use real-world view in XR device)')
parser.add_argument('--affinity', action = 'store_true', help = 'Enable high priority and set CPU affinity mode')
parser.add_argument('--img-server-ip', type=str, default='192.168.123.164', help='IP address of image server')
# record mode and task info
parser.add_argument('--record', action = 'store_true', help = 'Enable data recording mode')
parser.add_argument('--task-dir', type = str, default = './utils/data/', help = 'path to save data')
@@ -114,20 +114,24 @@ if __name__ == '__main__':
img_client = ImageClient(host=args.img_server_ip)
camera_config = img_client.get_cam_config()
logger_mp.debug(f"Camera config: {camera_config}")
xr_need_local_img = not (args.pass_through or camera_config['head_camera']['enable_webrtc'])
xr_need_local_img = not (args.display_mode == 'pass-through' or camera_config['head_camera']['enable_webrtc'])
# televuer_wrapper: obtain hand pose data from the XR device and transmit the robot's head camera image to the XR device.
tv_wrapper = TeleVuerWrapper(use_hand_tracking=args.xr_mode == "hand",
pass_through=args.pass_through,
tv_wrapper = TeleVuerWrapper(use_hand_tracking=args.input_mode == "hand",
binocular=camera_config['head_camera']['binocular'],
img_shape=camera_config['head_camera']['image_shape'],
# maybe should decrease fps for better performance?
# https://github.com/unitreerobotics/xr_teleoperate/issues/172
# display_fps=camera_config['head_camera']['fps'] ? args.frequency? 30.0?
display_mode=args.display_mode,
zmq=camera_config['head_camera']['enable_zmq'],
webrtc=camera_config['head_camera']['enable_webrtc'],
webrtc_url=f"https://{args.img_server_ip}:{camera_config['head_camera']['webrtc_port']}/offer",
display_fps=args.frequency)
)
# motion mode (G1: Regular mode R1+X, not Running mode R2+A)
if args.motion:
if args.xr_mode == "controller":
if args.input_mode == "controller":
loco_wrapper = LocoClientWrapper()
else:
motion_switcher = MotionSwitcher()
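The updated `xr_need_local_img` condition above can be sketched in isolation: a local image only needs to be forwarded to the XR device when the display mode is not pass-through and WebRTC is not already streaming the head camera directly. The helper name below is hypothetical; the logic is taken verbatim from the diff.

```python
def need_local_image(display_mode: str, enable_webrtc: bool) -> bool:
    """Hypothetical helper mirroring the xr_need_local_img expression:
    no local image is needed in pass-through mode (the headset shows
    the real world) or when WebRTC delivers the camera stream itself."""
    return not (display_mode == "pass-through" or enable_webrtc)
```

So only `immersive` or `ego` without WebRTC requires the local image path.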
@@ -267,17 +271,17 @@ if __name__ == '__main__':
# get xr's tele data
tele_data = tv_wrapper.get_tele_data()
if (args.ee == "dex3" or args.ee == "inspire1" or args.ee == "brainco") and args.xr_mode == "hand":
if (args.ee == "dex3" or args.ee == "inspire1" or args.ee == "brainco") and args.input_mode == "hand":
with left_hand_pos_array.get_lock():
left_hand_pos_array[:] = tele_data.left_hand_pos.flatten()
with right_hand_pos_array.get_lock():
right_hand_pos_array[:] = tele_data.right_hand_pos.flatten()
elif args.ee == "dex1" and args.xr_mode == "controller":
elif args.ee == "dex1" and args.input_mode == "controller":
with left_gripper_value.get_lock():
left_gripper_value.value = tele_data.left_ctrl_triggerValue
with right_gripper_value.get_lock():
right_gripper_value.value = tele_data.right_ctrl_triggerValue
elif args.ee == "dex1" and args.xr_mode == "hand":
elif args.ee == "dex1" and args.input_mode == "hand":
with left_gripper_value.get_lock():
left_gripper_value.value = tele_data.left_hand_pinchValue
with right_gripper_value.get_lock():
@@ -286,7 +290,7 @@ if __name__ == '__main__':
pass
# high level control
if args.xr_mode == "controller" and args.motion:
if args.input_mode == "controller" and args.motion:
# quit teleoperate
if tele_data.right_ctrl_aButton:
START = False
@@ -314,7 +318,7 @@ if __name__ == '__main__':
if args.record:
READY = recorder.is_ready() # now ready to (2) enter RECORD_RUNNING state
# dex hand or gripper
if args.ee == "dex3" and args.xr_mode == "hand":
if args.ee == "dex3" and args.input_mode == "hand":
with dual_hand_data_lock:
left_ee_state = dual_hand_state_array[:7]
right_ee_state = dual_hand_state_array[-7:]
@@ -322,7 +326,7 @@ if __name__ == '__main__':
right_hand_action = dual_hand_action_array[-7:]
current_body_state = []
current_body_action = []
elif args.ee == "dex1" and args.xr_mode == "hand":
elif args.ee == "dex1" and args.input_mode == "hand":
with dual_gripper_data_lock:
left_ee_state = [dual_gripper_state_array[0]]
right_ee_state = [dual_gripper_state_array[1]]
@@ -330,7 +334,7 @@ if __name__ == '__main__':
right_hand_action = [dual_gripper_action_array[1]]
current_body_state = []
current_body_action = []
elif args.ee == "dex1" and args.xr_mode == "controller":
elif args.ee == "dex1" and args.input_mode == "controller":
with dual_gripper_data_lock:
left_ee_state = [dual_gripper_state_array[0]]
right_ee_state = [dual_gripper_state_array[1]]
@@ -340,7 +344,7 @@ if __name__ == '__main__':
current_body_action = [-tele_data.left_ctrl_thumbstickValue[1] * 0.3,
-tele_data.left_ctrl_thumbstickValue[0] * 0.3,
-tele_data.right_ctrl_thumbstickValue[0] * 0.3]
elif (args.ee == "inspire_dfx" or args.ee == "inspire_ftp" or args.ee == "brainco") and args.xr_mode == "hand":
elif (args.ee == "inspire_dfx" or args.ee == "inspire_ftp" or args.ee == "brainco") and args.input_mode == "hand":
with dual_hand_data_lock:
left_ee_state = dual_hand_state_array[:6]
right_ee_state = dual_hand_state_array[-6:]
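The `if/elif` chain above keys its shared-memory slicing on the `(ee, input_mode)` pair: dex3 hands use 7-element slices, dex1 grippers a single value, and inspire/brainco hands 6-element slices. As a design note, that dispatch could equivalently be written as a lookup table; the sketch below is a hypothetical restructuring for illustration, not code from the commit.

```python
# Hypothetical lookup table equivalent to the if/elif chain:
# (ee, input_mode) -> number of per-hand state values sliced
# from the shared dual_hand_state_array.
HAND_STATE_WIDTH = {
    ("dex3", "hand"): 7,
    ("dex1", "hand"): 1,
    ("dex1", "controller"): 1,
    ("inspire_dfx", "hand"): 6,
    ("inspire_ftp", "hand"): 6,
    ("brainco", "hand"): 6,
}

def hand_state_width(ee: str, input_mode: str) -> int:
    # 0 means the combination is not handled (falls through to `pass`)
    return HAND_STATE_WIDTH.get((ee, input_mode), 0)
```

A table like this makes it obvious which combinations are supported and keeps the slicing widths in one place when a new end effector (such as inspire_ftp in this commit) is added.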

teleop/televuer (2 lines changed)

@@ -1 +1 @@
Subproject commit d6a20e483af8f5d0a77fdc307448808e018dd074
Subproject commit 6b9aafd0b357d31de5e9e8edd2ee322575bd8b8d