Search Results for author: Binghao Huang

Found 9 papers, 2 papers with code

3D-ViTac: Learning Fine-Grained Manipulation with Visuo-Tactile Sensing

no code implementations • 31 Oct 2024 • Binghao Huang, YiXuan Wang, Xinyi Yang, Yiyue Luo, Yunzhu Li

Tactile and visual perception are both crucial for humans to perform fine-grained interactions with their environment.

Imitation Learning

GenDP: 3D Semantic Fields for Category-Level Generalizable Diffusion Policy

no code implementations • 23 Oct 2024 • YiXuan Wang, Guang Yin, Binghao Huang, Tarik Kelestemur, Jiuguang Wang, Yunzhu Li

Diffusion-based policies have shown remarkable capability in executing complex robotic manipulation tasks but lack explicit characterization of geometry and semantics, which often limits their ability to generalize to unseen objects and layouts.

RoboEXP: Action-Conditioned Scene Graph via Interactive Exploration for Robotic Manipulation

1 code implementation • 23 Feb 2024 • Hanxiao Jiang, Binghao Huang, Ruihai Wu, Zhuoran Li, Shubham Garg, Hooshang Nayyeri, Shenlong Wang, Yunzhu Li

We introduce the novel task of interactive scene exploration, wherein robots autonomously explore environments and produce an action-conditioned scene graph (ACSG) that captures the structure of the underlying environment.

Robot Synesthesia: In-Hand Manipulation with Visuotactile Sensing

no code implementations • 4 Dec 2023 • Ying Yuan, Haichuan Che, Yuzhe Qin, Binghao Huang, Zhao-Heng Yin, Kang-Won Lee, Yi Wu, Soo-Chul Lim, Xiaolong Wang

In this paper, we introduce a system that leverages visual and tactile sensory inputs to enable dexterous in-hand manipulation.

AnyTeleop: A General Vision-Based Dexterous Robot Arm-Hand Teleoperation System

no code implementations • 10 Jul 2023 • Yuzhe Qin, Wei Yang, Binghao Huang, Karl Van Wyk, Hao Su, Xiaolong Wang, Yu-Wei Chao, Dieter Fox

In real-world experiments, AnyTeleop achieves a higher success rate than a previous system designed for specific robot hardware, using the same robot.

Imitation Learning

Rotating without Seeing: Towards In-hand Dexterity through Touch

no code implementations • 20 Mar 2023 • Zhao-Heng Yin, Binghao Huang, Yuzhe Qin, Qifeng Chen, Xiaolong Wang

Relying on touch-only sensing, we can directly deploy the policy on a real robot hand and rotate novel objects that were not present during training.

Object

Learning Continuous Grasping Function with a Dexterous Hand from Human Demonstrations

1 code implementation • 11 Jul 2022 • Jianglong Ye, Jiashun Wang, Binghao Huang, Yuzhe Qin, Xiaolong Wang

We first convert large-scale human-object interaction trajectories into robot demonstrations via motion retargeting, and then use these demonstrations to train CGF.

Human-Object Interaction Detection • Motion Retargeting
