Search Results for author: Jialiang Zhao

Found 9 papers, 1 paper with code

Scaling Proprioceptive-Visual Learning with Heterogeneous Pre-trained Transformers

2 code implementations • 30 Sep 2024 • Lirui Wang, Xinlei Chen, Jialiang Zhao, Kaiming He

Previous robot learning methods often collect data to train with one specific embodiment for one task, which is expensive and prone to overfitting.

Transferable Tactile Transformers for Representation Learning Across Diverse Sensors and Tasks

no code implementations • 19 Jun 2024 • Jialiang Zhao, Yuxiang Ma, Lirui Wang, Edward H. Adelson

FoTa is the largest and most diverse dataset in tactile sensing to date, and it is made publicly available in a unified format.

Representation Learning

PoCo: Policy Composition from and for Heterogeneous Robot Learning

no code implementations • 4 Feb 2024 • Lirui Wang, Jialiang Zhao, Yilun Du, Edward H. Adelson, Russ Tedrake

Training general robotic policies from heterogeneous data for different tasks is a significant challenge.

GelSight Svelte: A Human Finger-shaped Single-camera Tactile Robot Finger with Large Sensing Coverage and Proprioceptive Sensing

no code implementations • 19 Sep 2023 • Jialiang Zhao, Edward H. Adelson

Moreover, existing methods for estimating proprioceptive information from camera-based tactile sensors, such as the total forces and torques applied to the finger, are not effective when the contact geometry is complex.

FingerSLAM: Closed-loop Unknown Object Localization and Reconstruction from Visuo-tactile Feedback

no code implementations • 14 Mar 2023 • Jialiang Zhao, Maria Bauza, Edward H. Adelson

FingerSLAM is constructed with two constituent pose estimators: a multi-pass refined tactile-based pose estimator that captures movements from detailed local textures, and a single-pass vision-based pose estimator that predicts from a global view of the object.

3D Reconstruction • Object Localization +1
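
The FingerSLAM entry above describes a two-estimator architecture: a tactile-based pose estimator for local texture and a vision-based one for the global view. The snippet below is a minimal, illustrative sketch of how two 6-DoF pose estimates with uncertainties could be fused; the function name, the 6-vector pose parameterization, and the covariance-weighted fusion rule are assumptions for illustration, not the authors' implementation (which closes the loop over the full trajectory).

```python
import numpy as np

# Illustrative only: fuses two pose estimates by inverse-covariance weighting.
# Assumes a small-rotation 6-vector parameterization so poses average linearly.
def fuse_poses(tactile_pose, tactile_cov, visual_pose, visual_cov):
    """Covariance-weighted fusion of tactile and visual 6-DoF pose estimates."""
    info_t = np.linalg.inv(tactile_cov)   # information matrix of tactile estimate
    info_v = np.linalg.inv(visual_cov)    # information matrix of visual estimate
    fused_cov = np.linalg.inv(info_t + info_v)
    fused_pose = fused_cov @ (info_t @ tactile_pose + info_v @ visual_pose)
    return fused_pose, fused_cov
```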

Learning to Compose Hierarchical Object-Centric Controllers for Robotic Manipulation

no code implementations • 9 Nov 2020 • Mohit Sharma, Jacky Liang, Jialiang Zhao, Alex LaGrassa, Oliver Kroemer

Manipulation tasks can often be decomposed into multiple subtasks performed in parallel, e.g., sliding an object to a goal pose while maintaining contact with a table.

Object • reinforcement-learning +2
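
The sliding-while-maintaining-contact example above is the kind of parallel subtask composition the paper studies. Below is a generic sketch of composing two Cartesian commands in orthogonal subspaces (hybrid motion/force style); the function, variable names, and projection rule are illustrative assumptions, not the paper's hierarchical controller formulation.

```python
import numpy as np

def compose_commands(slide_cmd, contact_cmd, table_normal):
    """Combine a sliding (tangential) command with a contact-maintaining
    (normal) command so the two subtasks run in parallel without conflict."""
    n = table_normal / np.linalg.norm(table_normal)
    P_normal = np.outer(n, n)            # projects onto the table normal
    P_tangent = np.eye(3) - P_normal     # projects onto the sliding plane
    # Each controller acts only in its own subspace.
    return P_tangent @ slide_cmd + P_normal @ contact_cmd

# Example: slide toward a goal in x while pressing down to keep contact.
cmd = compose_commands(slide_cmd=np.array([0.05, 0.0, 0.0]),
                       contact_cmd=np.array([0.0, 0.0, -0.01]),
                       table_normal=np.array([0.0, 0.0, 1.0]))
```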

Towards Robotic Assembly by Predicting Robust, Precise and Task-oriented Grasps

no code implementations • 4 Nov 2020 • Jialiang Zhao, Daniel Troniak, Oliver Kroemer

Robust task-oriented grasp planning is vital for autonomous robotic precision assembly tasks.

Object

Towards Precise Robotic Grasping by Probabilistic Post-grasp Displacement Estimation

no code implementations • 4 Sep 2019 • Jialiang Zhao, Jacky Liang, Oliver Kroemer

Precise robotic grasping is important for many industrial applications, such as assembly and palletizing, where the location of the object needs to be controlled and known.

Object • Robotic Grasping

Annotation and Detection of Emotion in Text-based Dialogue Systems with CNN

no code implementations • 3 Oct 2017 • Jialiang Zhao, Qi Gao

Knowledge of users' emotion states helps improve human-computer interaction.
