Search Results for author: Ruolin Ye

Found 6 papers, 2 papers with code

Visual-Tactile Sensing for In-Hand Object Reconstruction

no code implementations • CVPR 2023 • Wenqiang Xu, Zhenjun Yu, Han Xue, Ruolin Ye, Siqiong Yao, Cewu Lu

We propose a simulation environment, VT-Sim, which supports generating hand-object interaction for both rigid and deformable objects.

Object, Object Reconstruction

GarmentTracking: Category-Level Garment Pose Tracking

1 code implementation • CVPR 2023 • Han Xue, Wenqiang Xu, Jieyi Zhang, Tutian Tang, Yutong Li, Wenxin Du, Ruolin Ye, Cewu Lu

In this work, we present a complete package to address the category-level garment pose tracking task: (1) A recording system VR-Garment, with which users can manipulate virtual garment models in simulation through a VR interface.

Pose Tracking

ContourRender: Detecting Arbitrary Contour Shape For Instance Segmentation In One Pass

no code implementations • 7 Jun 2021 • Tutian Tang, Wenqiang Xu, Ruolin Ye, Yan-Feng Wang, Cewu Lu

In addition, we specifically select a subset from COCO val2017 named COCO ContourHard-val to further demonstrate the contour quality improvements.

Instance Segmentation, Semantic Segmentation

H2O: A Benchmark for Visual Human-human Object Handover Analysis

no code implementations • ICCV 2021 • Ruolin Ye, Wenqiang Xu, Zhendong Xue, Tutian Tang, Yanfeng Wang, Cewu Lu

In addition, we report the hand and object pose errors of existing baselines and show that the dataset can serve as video demonstrations for robot imitation learning on the handover task (a rough pose-error sketch follows this entry).

Imitation Learning, Object
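The pose errors mentioned above are typically rotation and translation discrepancies between predicted and ground-truth poses. The snippet below is a minimal, generic sketch of such metrics in NumPy; it is not the benchmark's official evaluation code, and the function names and example values are chosen purely for illustration.

```python
# Generic 6D pose error metrics, sketched with NumPy.
# Illustrative only; the H2O benchmark's official metrics may differ.
import numpy as np

def rotation_error_deg(R_pred, R_gt):
    # Geodesic distance between two 3x3 rotation matrices, in degrees.
    cos_angle = (np.trace(R_pred.T @ R_gt) - 1.0) / 2.0
    return float(np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0))))

def translation_error(t_pred, t_gt):
    # Euclidean distance between predicted and ground-truth translations.
    return float(np.linalg.norm(np.asarray(t_pred) - np.asarray(t_gt)))

# Example: identity vs. a 10-degree rotation about the z-axis.
theta = np.radians(10.0)
R_gt = np.eye(3)
R_pred = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
print(rotation_error_deg(R_pred, R_gt))                        # ~10.0
print(translation_error([0.0, 0.0, 0.05], [0.0, 0.0, 0.0]))    # 0.05
```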

Learning Universal Shape Dictionary for Realtime Instance Segmentation

1 code implementation2 Dec 2020 Tutian Tang, Wenqiang Xu, Ruolin Ye, Lixin Yang, Cewu Lu

First, it learns a dictionary from a large collection of shape datasets, so that any shape can be decomposed into a linear combination of atoms from the dictionary (a minimal sketch of this idea follows this entry).

Explainable Models, Instance Segmentation +3
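As a rough illustration of the dictionary idea described above (not the authors' implementation), the sketch below learns a dictionary over flattened shape vectors with scikit-learn and reconstructs each shape as a linear combination of the learned atoms. The vector length, dictionary size, sparsity settings, and the random placeholder data are all assumptions for the sake of a runnable example.

```python
# Minimal sketch of shape dictionary learning, assuming each shape is
# encoded as a fixed-length vector (e.g. flattened contour coordinates).
# Random placeholder data stands in for a real shape collection.
import numpy as np
from sklearn.decomposition import DictionaryLearning

rng = np.random.default_rng(0)
shapes = rng.standard_normal((200, 64))          # 200 shapes, 64-dim vectors (assumed sizes)

learner = DictionaryLearning(n_components=32,    # number of shape atoms (assumed)
                             transform_algorithm="lasso_lars",
                             transform_alpha=0.1,
                             random_state=0)
codes = learner.fit_transform(shapes)            # sparse coefficients for each shape
atoms = learner.components_                      # learned dictionary atoms

# Any shape is then (approximately) a linear combination through the dictionary.
reconstructed = codes @ atoms
print("mean reconstruction error:", float(np.mean((shapes - reconstructed) ** 2)))
```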
