Grasp Contact Prediction
4 papers with code • 2 benchmarks • 5 datasets
Predict the contact regions between an object and a hand (human or robotic) during grasping.
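A common way to frame this task is per-point binary classification over an object's surface: each sampled surface point is assigned a probability of being touched by the hand. The following is a minimal sketch of that framing; the model name `ContactNet` and its architecture are illustrative assumptions, not the method of any paper listed below.

```python
# Minimal sketch: grasp contact prediction as per-point binary classification
# over an object point cloud. ContactNet and its layer sizes are hypothetical.
import torch
import torch.nn as nn

class ContactNet(nn.Module):
    def __init__(self, in_dim=3, hidden=128):
        super().__init__()
        # Shared per-point MLP producing one contact logit per point.
        self.mlp = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, points):
        # points: (batch, num_points, 3) object surface samples.
        logits = self.mlp(points).squeeze(-1)   # (batch, num_points)
        return torch.sigmoid(logits)            # contact probability per point

model = ContactNet()
points = torch.rand(2, 1024, 3)                 # two objects, 1024 points each
contact_prob = model(points)                    # (2, 1024) values in [0, 1]
print(contact_prob.shape)
```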
Most implemented papers
ContactDB: Analyzing and Predicting Grasp Contact via Thermal Imaging
We present ContactDB, a novel dataset of contact maps for household objects that captures the rich hand-object contact that occurs during grasping, enabled by use of a thermal camera.
ContactPose: A Dataset of Grasps with Object Contact and Hand Pose
We introduce ContactPose, the first dataset of hand-object contact paired with hand pose, object pose, and RGB-D images.
GRAB: A Dataset of Whole-Body Human Grasping of Objects
Training computers to understand, model, and synthesize human grasping requires a rich dataset containing complex 3D object shapes, detailed contact information, hand pose and shape, and the 3D body motion over time.
CaTGrasp: Learning Category-Level Task-Relevant Grasping in Clutter from Simulation
This work proposes a framework to learn task-relevant grasping for industrial objects without the need for time-consuming real-world data collection or manual annotation.
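Datasets such as ContactDB and ContactPose represent ground-truth contact as a scalar value per vertex of the object mesh (e.g. a normalized thermal or contact intensity). Below is a minimal sketch, under that assumption, of turning such per-vertex intensities into a binary contact labeling; the threshold value and array shapes are hypothetical, not the datasets' actual format.

```python
# Minimal sketch: binarizing a per-vertex contact map. The 0.4 threshold and
# the 5000-vertex mesh are illustrative assumptions, not dataset specifics.
import numpy as np

def binarize_contact_map(intensities: np.ndarray, threshold: float = 0.4) -> np.ndarray:
    """Normalize per-vertex intensities to [0, 1] and threshold into contact / no-contact."""
    lo, hi = intensities.min(), intensities.max()
    normalized = (intensities - lo) / max(hi - lo, 1e-8)
    return (normalized >= threshold).astype(np.uint8)

# Example: random per-vertex intensities standing in for a real contact map.
contact = binarize_contact_map(np.random.rand(5000))
print(contact.mean())  # fraction of vertices labeled as contacted
```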