Hand Pose Estimation
87 papers with code • 10 benchmarks • 22 datasets
Hand pose estimation is the task of finding the joints of the hand from an image or set of video frames.
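The output of such a system is typically a small set of joint coordinates per hand. As a minimal sketch (assuming the common 21-joint convention used by e.g. MediaPipe Hands, not any specific paper's layout), the skeleton and a simple bone-length check look like this:

```python
import math
import random

# A common convention (e.g. MediaPipe Hands) represents a hand pose as
# 21 joints: the wrist plus four joints per finger, each an (x, y, z) point.
NUM_JOINTS = 21

# Hypothetical estimator output: one 3D coordinate per joint.
random.seed(0)
joints = [(random.random(), random.random(), random.random())
          for _ in range(NUM_JOINTS)]

# Kinematic tree: the wrist (index 0) connects to each finger base, and
# each finger (thumb, index, middle, ring, pinky) is a chain of four joints.
FINGERS = [(1, 2, 3, 4), (5, 6, 7, 8), (9, 10, 11, 12),
           (13, 14, 15, 16), (17, 18, 19, 20)]
bones = [(0, f[0]) for f in FINGERS]
for f in FINGERS:
    bones.extend(zip(f[:-1], f[1:]))

# Bone lengths give a simple anatomy-aware sanity check on a predicted pose.
lengths = {b: math.dist(joints[b[0]], joints[b[1]]) for b in bones}
print(len(bones))  # 20 bones in the 21-joint skeleton
```

Most benchmarks in this task score predictions by per-joint error against such a skeleton, in 2D pixels or 3D millimeters.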
(Image credit: Pose-REN)
Libraries
Use these libraries to find Hand Pose Estimation models and implementations
Latest papers with no code
1st Place Solution of Egocentric 3D Hand Pose Estimation Challenge 2023 Technical Report: A Concise Pipeline for Egocentric Hand Pose Reconstruction
This report introduces our work for the Egocentric 3D Hand Pose Estimation workshop.
CLIP-Hand3D: Exploiting 3D Hand Pose Estimation via Context-Aware Prompting
In particular, the distribution order of hand joints in various 3D space directions is derived from pose labels, forming corresponding text prompts that are subsequently encoded into text representations.
Video-Based Hand Pose Estimation for Remote Assessment of Bradykinesia in Parkinson's Disease
Three of the seven models demonstrated good accuracy for on-device recordings, and the accuracy decreased significantly for streaming recordings.
Spectral Graphormer: Spectral Graph-based Transformer for Egocentric Two-Hand Reconstruction using Multi-View Color Images
We propose a novel transformer-based framework that reconstructs two high-fidelity hands from multi-view RGB images.
Denoising Diffusion for 3D Hand Pose Estimation from Images
Hand pose estimation from a single image has many applications.
Self-supervised Optimization of Hand Pose Estimation using Anatomical Features and Iterative Learning
The pipeline consists of a general machine learning model for hand pose estimation trained on a generalized dataset, spatial and temporal filtering to account for anatomical constraints of the hand, and a retraining step to improve the model.
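The temporal-filtering stage of such a pipeline can be illustrated with a minimal sketch (an assumed exponential moving average, not the paper's actual filter) that damps frame-to-frame jitter in per-frame joint predictions:

```python
def ema_smooth(frames, alpha=0.5):
    """Exponentially smooth a joint trajectory across video frames.

    frames: list of per-frame joint lists [(x, y, z), ...]
    alpha:  weight of the current frame, in (0, 1]; smaller = smoother.
    """
    smoothed = [list(frames[0])]
    for frame in frames[1:]:
        prev = smoothed[-1]
        smoothed.append([
            tuple(alpha * c + (1 - alpha) * p for c, p in zip(cur, prv))
            for cur, prv in zip(frame, prev)
        ])
    return smoothed

# A noisy single-joint trajectory is pulled toward its recent history.
frames = [[(0.0, 0.0, 0.0)], [(1.0, 0.0, 0.0)], [(1.0, 0.0, 0.0)]]
out = ema_smooth(frames, alpha=0.5)
print(out[-1][0][0])  # 0.75
```

A real pipeline would additionally project the smoothed joints back onto anatomically plausible bone lengths, as the abstract describes.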
UltraGlove: Hand Pose Estimation with Mems-Ultrasonic Sensors
Hand tracking is an important aspect of human-computer interaction and has a wide range of applications in extended reality devices.
Neural Voting Field for Camera-Space 3D Hand Pose Estimation
We present a unified framework for camera-space 3D hand pose estimation from a single RGB image based on 3D implicit representation.
ContactArt: Learning 3D Interaction Priors for Category-level Articulated Object and Hand Poses Estimation
We propose a new dataset and a novel approach to learning hand-object interaction priors for hand and articulated object pose estimation.
AssemblyHands: Towards Egocentric Activity Understanding via 3D Hand Pose Estimation
To obtain high-quality 3D hand pose annotations for the egocentric images, we develop an efficient pipeline, where we use an initial set of manual annotations to train a model to automatically annotate a much larger dataset.