
Tian-Nian/control_your_robot


δΈ­ζ–‡
English

Chinese WIKI

A new open-source project has been released, building upon this project.

X-One-Pipeline currently only provides samples based on the Y1-IMETA arm, because that is the only brand of robotic arm we have. Other brands of robotic arms are also supported; you can find them under src.robot.controller, the same as in this project. To add a new robot, create a config like config/x-one.yml and a ${robot}.py under src/robot/robot, register it in src/robot/robot/__init__.py, and then you can use your robot in the new project.
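For illustration, a minimal sketch of that wiring; the MyRobot class name and its contents are hypothetical, only the file locations come from the convention above:

# src/robot/robot/my_robot.py (hypothetical new robot for X-One-Pipeline)
class MyRobot:
    def __init__(self):
        # build controllers/sensors from the values in config/my_robot.yml
        self.controllers = {}
        self.sensors = {}

# then register it in src/robot/robot/__init__.py:
# from .my_robot import MyRobot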

WeChat

[QR code images: wechat_group, my_wechat]

If the WeChat group QR code has expired, you can add me on WeChat to join.

Robot Node support!

Now you can use example/collect/collect_node.py to easily collect robot data in parallel; this pipeline wraps the Robot class in a node. Give it a try.

Visualize data with rerun!

Refer to scripts/visual_hdf5_rerun.sh to visualize your data. Supported formats:

  1. control_your_robot raw data
  2. act / rdt hdf5 data
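For reference, logging an image to rerun directly looks roughly like this (the entity path "camera/rgb" is illustrative, not necessarily what the script uses):

import numpy as np
import rerun as rr

rr.init("hdf5_viewer", spawn=True)               # start/attach the rerun viewer
frame = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in for a camera frame
rr.log("camera/rgb", rr.Image(frame))            # log one RGB frame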

The data collection pipeline can now save data in the format you want!

You can try it by switching to the newest branch.
A function called _add_data_transform_pipeline() has been added to CollectAny, and we provide two choices under utils/data_transofrm_pipeline.py (see the sketch after this list):

  1. image_rgb_encode_pipeline
    this will JPEG-encode every RGB image in your data before saving, which lowers the storage cost.
  2. general_hdf5_rdt_format_pipeline
    this will save the data in the rdt_hdf5 format, without instructions.
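A hedged usage sketch: only _add_data_transform_pipeline() and the pipeline names come from the list above; the import paths and constructor arguments are assumptions, so check the repo for the real ones:

# assumed import locations
from data.collect_any import CollectAny
from utils.data_transofrm_pipeline import image_rgb_encode_pipeline

collector = CollectAny(...)  # construct as in example/collect/collect.py
# JPEG-encode every RGB image before it is written to disk
collector._add_data_transform_pipeline(image_rgb_encode_pipeline)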

RoboTwin deploy pipeline support!

We now support the RoboTwin pipeline! You can follow the steps below for your Sim2Real experiments:

  1. Link the RoboTwin policy to a control_your_robot policy
# example
ln -s path/to/RoboTwin/policy/pi0 path/to/control_your_robot/policy/
  2. Modify deploy.sh. When using the RoboTwin pipeline, you should set --robotwin first, then set the extra info like:
# pi0 eval.sh
python script/eval_policy.py --config policy/$policy_name/deploy_policy.yml \
    --overrides \
    --task_name ${task_name} \
    --task_config ${task_config} \
    --train_config_name ${train_config_name} \
    --model_name ${model_name} \
    --ckpt_setting ${model_name} \
    --seed ${seed} \
    --policy_name ${policy_name} 

# your deploy.sh
python example/deploy/deploy.py \
    --base_model_name "openpi"\
    --base_model_class "None"\
    --base_model_path "None"\
    --base_task_name "test"\
    --base_robot_name "test_robot"\
    --base_robot_class "TestRobot"\
    --robotwin \
    --overrides \
    --task_name ${task_name} \
    --task_config ${task_config} \
    --train_config_name ${train_config_name} \
    --model_name ${model_name} \
    --ckpt_setting ${model_name} \
    --seed ${seed} \
    --policy_name ${policy_name} 

With the RoboTwin pipeline, some of the base_* arguments take no effect; use None for those. Likewise, some RoboTwin settings take no effect and can be ignored.

For other deployment details, refer to the RoboTwin documentation; for example, some models require modifying deploy_policy.yml.

Control Your Robot!

This project aims to provide a comprehensive and ready-to-use pipeline for embodied intelligence research, covering everything from robotic arm control, data collection, to Vision-Language-Action (VLA) model training and deployment.

Quick Start!

Since this project includes several test examples, such as robotic arm tests, visual simulation, and full robot simulation, it is possible to quickly understand the overall framework without requiring any physical hardware.
Because no hardware is needed, you can install the environment simply by running:

pip install -r requirements.txt
# or
GIT_LFS_SKIP_SMUDGE=1 uv sync
GIT_LFS_SKIP_SMUDGE=1 uv pip install -e .

This project provides three debug levels: "DEBUG", "INFO", and "ERROR". To fully observe the data flow, set it to "DEBUG":

export INFO_LEVEL="DEBUG"

Alternatively, you can set it in the main function:

import os
os.environ["INFO_LEVEL"] = "DEBUG"  # DEBUG, INFO, ERROR
  1. Data Collection Tests

To collect data from your own robot, just change the robot class in the collect scripts below (a sketch of the swap follows the commands).

# Node collect
python example/collect/collect_node.py
# Single-threaded (may have accumulated delays due to function execution)
python example/collect/collect.py
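The swap itself is one import; a sketch (only TestRobot, my_robot.test_robot, and set_up appear elsewhere in this README; the rest is illustrative):

# inside example/collect/collect.py (sketch)
from my_robot.test_robot import TestRobot  # replace with your robot class

robot = TestRobot()
robot.set_up()  # the collection loop then reads its controllers/sensors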
  2. Model Deployment Tests

To deploy a model on your robot, you just need to set up a model class like src/robot/policy/test_policy/inference_model.py, following the examples under src/robot/policy/. Then set up your robot in scripts/deploy.sh (a sketch of such a class follows the commands below).

# General deployment script
bash deploy.sh
# Offline data replay consistency test
# you should first collect at least 3 episodes by running: python example/collect/collect.py
bash eval_offline.sh
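A minimal sketch of such a model class; the method names here are assumptions, so mirror whatever src/robot/policy/test_policy/inference_model.py actually defines:

import numpy as np

class MyInferenceModel:
    """Sketch only; see src/robot/policy/test_policy/inference_model.py."""

    def __init__(self, model_path: str):
        self.model_path = model_path  # load your checkpoint here
        self.obs = None

    def update_observation(self, obs: dict):
        # map robot observations (images, qpos, ...) to model inputs
        self.obs = obs

    def get_action(self) -> np.ndarray:
        # run one inference step; a zero action stands in for a real model
        return np.zeros(7)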
  3. Remote Deployment and Data Transfer

Just change the robot/model class in the scripts and set the input/output transforms correctly (a round-trip sketch follows the commands below).

# Start the server first, simulating the inference side (allows multiple connections, listens on a port)
python scripts/server.py
# On the client side, collect data and execute commands (example only executes 10 times)
python scripts/client.py
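Under the hood this is a request/response loop over a socket; a generic sketch only (the host, port, and pickle framing are assumptions, and the real protocol lives in scripts/server.py and scripts/client.py):

import pickle
import socket

def query_server(obs, host="127.0.0.1", port=9000):
    """Send one observation, receive one action (illustrative framing)."""
    with socket.create_connection((host, port)) as conn:
        conn.sendall(pickle.dumps(obs))  # client -> server: observation
        data = conn.recv(1 << 20)        # server -> client: action
    return pickle.loads(data)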
  4. Interesting Scripts
# Collect keypoints and perform trajectory replay
python scripts/collect_moving_ckpt.py 
# SAPIEN simulation, see planner/README.md for details
  5. Debug Scripts
# Because controller and sensor packages have __init__.py, execute with -m
python -m src.robot.controller.TestArm_controller
python -m src.robot.sensor.TestVision_sensor
python -m my_robot.test_robot
  6. Data Conversion Scripts
# After running python example/collect/collect.py and obtaining trajectories
python scripts/convert2rdt_hdf5.py save/test_robot/ save/rdt/

# To an openpi dataset (lerobot 2.1)
python scripts/convert2act_hdf5.py save/test_robot/ save/openpi_hdf5/
# you should copy an instruction from task_instructions/ to save/openpi_hdf5/
python scripts/convert2openpi.py --raw_dir save/openpi_hdf5/ --repo_id my_repo_id
  7. Upload Data (Important!)

sensor.vision_sensor already sets encode_rgb=True by DEFAULT, so you do not need to encode and decode manually.

# In the raw dataset, image files occupy a large amount of storage space,
# which is unfavorable for data transmission, so a compression/decompression
# script is provided. It JPEG-encodes the images to enable faster transfer.
# The script defaults to a dual-arm, three-view setup, but it can be adjusted
# to your needs.
# compress: creates a new folder, path/to/folder/_zip/
python scripts/upload_zip.py path/to/folder --encode

# decompress
python scripts/upload_zip.py path/to/folder
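The compression itself is plain JPEG encoding; a minimal OpenCV sketch of what happens per image (the quality value is illustrative, not necessarily what the script uses):

import cv2
import numpy as np

img = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in for one camera frame
ok, buf = cv2.imencode(".jpg", img, [cv2.IMWRITE_JPEG_QUALITY, 90])  # compress
decoded = cv2.imdecode(buf, cv2.IMREAD_COLOR)  # restore on the receiving side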
  8. Teleop by joint/EEF
# Two general architectures for teleoperation-based data collection are
# provided. The first is simpler: the teleop control frequency is aligned
# with the data collection frequency. The second is slightly more complex,
# allowing the teleop frequency to far exceed the recording frequency. Both
# architectures have been validated on real robots.
# same frequency
python example/teleop/master_slave_arm_teleop.py
# different frequencies
python example/teleop/master_slave_arm_teleop_fs.py
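The different-frequency variant decouples the two loops; a sketch of the idea with stub functions (the frequencies and names are illustrative):

import time

TELEOP_HZ, RECORD_HZ = 200, 30  # teleop runs far faster than recording

def teleop_step():  # stub: read the master arm, drive the slave arm
    pass

def record_frame():  # stub: save one synchronized data frame
    pass

last_record = 0.0
for _ in range(1000):
    teleop_step()
    now = time.monotonic()
    if now - last_record >= 1.0 / RECORD_HZ:
        record_frame()  # recording ticks at its own, lower rate
        last_record = now
    time.sleep(1.0 / TELEOP_HZ)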

On a real robot!

  1. Implement Your Robot Configuration (a skeleton is sketched below)
Required:
__init__(self): Initialize all controllers and sensors
set_up(self): Apply the set_up parameters to each controller and sensor

Optional:
is_start(self): Check whether the robot arm is currently moving
reset(self): Reset the robot arm to its initial position
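Put together, a skeleton might look like this; only the four method names come from the list above, everything else is a stand-in:

class MyRobot:
    """Skeleton; wire in real classes from src/robot/controller and src/robot/sensor."""

    def __init__(self):
        # instantiate every controller and sensor the robot uses
        self.controllers = {}  # e.g. {"arm": <arm controller>}
        self.sensors = {}      # e.g. {"cam_head": <vision sensor>}

    def set_up(self):
        # apply each device's set_up parameters
        for device in (*self.controllers.values(), *self.sensors.values()):
            device.set_up()

    def is_start(self):
        # optional: report whether the arm is currently moving
        return False

    def reset(self):
        # optional: drive the arm back to its initial position
        pass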
  2. Some Additional Functions
replay(self, hdf5_path, key_banned, is_collect, episode_id):
    Replay previously collected trajectories.
    Note: key_banned must be set. If you are replaying based on joints, ban 'qpos'; otherwise, ban the other key.
    is_collect and episode_id are linked: they indicate that the replayed trajectory data is being collected and saved under the given episode_id.
    If episode_id is not set, you might overwrite the originally collected trajectories.

show_pic(self, data_path, pic_name):
    Replay video from a specified camera.
    Can be used directly and will automatically decode compressed images in the raw data.
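For example, a hedged call (the path, camera name, and episode number are illustrative, and whether key_banned takes a list is an assumption):

# replay a joint-based trajectory and re-record it as episode 100
robot.replay("save/test_robot/0.hdf5", key_banned=["qpos"],
             is_collect=True, episode_id=100)

# play back one camera stream from the same raw data
robot.show_pic("save/test_robot/0.hdf5", pic_name="cam_head")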

πŸ€– Supported Devices

πŸŽ›οΈ Controllers

βœ… Implemented

| Robotic Arm | Mobile Base | Dexterous Hand | Others |
| --- | --- | --- | --- |
| Agilex Piper | Agilex Tracer2.0 | 🚧 In Development | πŸ“¦ To Be Added |
| RealMan 65B | Agilex Bunker | πŸ“¦ To Be Added | πŸ“¦ To Be Added |
| Daran ALOHA | πŸ“¦ To Be Added | πŸ“¦ To Be Added | πŸ“¦ To Be Added |
| Y1 ALOHA | πŸ“¦ To Be Added | πŸ“¦ To Be Added | πŸ“¦ To Be Added |

🚧 Planned Support

| Robotic Arm | Mobile Base | Dexterous Hand | Others |
| --- | --- | --- | --- |
| JAKA | πŸ“¦ To Be Added | πŸ“¦ To Be Added | πŸ“¦ To Be Added |
| Franka | πŸ“¦ To Be Added | πŸ“¦ To Be Added | πŸ“¦ To Be Added |
| UR5e | πŸ“¦ To Be Added | πŸ“¦ To Be Added | πŸ“¦ To Be Added |

πŸ“‘ Sensors

βœ… Implemented

| Vision Sensors | Tactile Sensors | Other Sensors |
| --- | --- | --- |
| RealSense Series | Vitac3D | πŸ“¦ To Be Added |

🚧 Planned Support

For new sensor support requests, please open an issue or submit a PR with your sensor configuration!

Directory Overview

| Directory | Description | Main Content |
| --- | --- | --- |
| πŸ“‚ controller | Robot controller wrappers | Classes for controlling arms, mobile bases, etc. |
| πŸ“‚ sensor | Sensor wrappers | Currently only RealSense cameras |
| πŸ“‚ utils | Utility functions | Math, logging, and other helper functions |
| πŸ“‚ data | Data collection module | Classes for data recording and processing |
| πŸ“‚ my_robot | Robot integration wrappers | Full robot system composition classes |
| πŸ“‚ policy | VLA model policies | Vision-Language-Action model implementations |
| πŸ“‚ scripts | Example scripts | Main entry points and test scripts |
| πŸ“‚ third_party | Third-party dependencies | External libraries requiring compilation |
| πŸ“‚ planner | Motion planning module | curobo planner wrappers + simulated robot code |
| πŸ“‚ example | Example workflows | Data collection, model deployment examples |

About

This project provides a variety of code to help you collect data from your robotic arm. Have fun!
