Gesture Recognition
118 papers with code • 13 benchmarks • 14 datasets
Gesture Recognition is an active field of research with applications such as automatic sign language recognition, human-robot interaction, and new ways of controlling video games.
Source: Gesture Recognition in RGB Videos Using Human Body Keypoints and Dynamic Time Warping
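Dynamic time warping (DTW), mentioned in the source above, compares two gesture trajectories by finding the cheapest monotone alignment between them, so the same gesture performed at different speeds still matches well. A minimal sketch (the function name and the toy 1-D sequences are illustrative, not taken from the cited paper):

```python
def dtw_distance(a, b):
    """Return the DTW alignment cost between 1-D sequences a and b."""
    n, m = len(a), len(b)
    INF = float("inf")
    # cost[i][j] = best cost of aligning a[:i] with b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])          # local distance
            cost[i][j] = d + min(cost[i - 1][j],       # step in a
                                 cost[i][j - 1],       # step in b
                                 cost[i - 1][j - 1])   # step in both
    return cost[n][m]

# A time-stretched copy of a gesture aligns more cheaply than a
# genuinely different gesture.
wave      = [0, 1, 2, 3, 2, 1, 0]
wave_slow = [0, 0, 1, 2, 3, 3, 2, 1, 0]
push      = [0, 3, 0, 3, 0, 3, 0]
```

Here `dtw_distance(wave, wave_slow)` is 0 despite the differing lengths, while `dtw_distance(wave, push)` is large, which is exactly the invariance to execution speed that makes DTW useful for gesture templates.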
Libraries
Use these libraries to find Gesture Recognition models and implementations.
Latest papers with no code
Towards Open-World Gesture Recognition
We propose leveraging continual learning to make machine learning models adaptive to new tasks without degrading performance on previously learned tasks.
Simultaneous Gesture Classification and Localization with an Automatic Gesture Annotation Model
Training a real-time gesture recognition model heavily relies on annotated data.
Resource-Efficient Gesture Recognition using Low-Resolution Thermal Camera via Spiking Neural Networks and Sparse Segmentation
This work proposes a novel approach for hand gesture recognition using an inexpensive, low-resolution (24 x 32) thermal sensor processed by a Spiking Neural Network (SNN) followed by Sparse Segmentation and feature-based gesture classification via Robust Principal Component Analysis (R-PCA).
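The basic unit of a spiking neural network like the one in the paper above is the leaky integrate-and-fire (LIF) neuron: it accumulates input with leakage and emits a binary spike when its membrane potential crosses a threshold. A toy sketch (parameters and function name are illustrative assumptions, unrelated to the paper's actual SNN architecture):

```python
def lif_spikes(inputs, leak=0.9, threshold=1.0):
    """Simulate one LIF neuron over a list of input currents.

    Returns the binary spike train (one 0/1 entry per time step).
    """
    v = 0.0          # membrane potential
    spikes = []
    for x in inputs:
        v = leak * v + x          # leaky integration of input current
        if v >= threshold:        # fire and reset on threshold crossing
            spikes.append(1)
            v = 0.0
        else:
            spikes.append(0)
    return spikes
```

For a constant sub-threshold input such as `[0.5] * 5`, the potential builds up over several steps before the neuron fires once and resets, yielding the sparse, event-driven activity that makes SNNs attractive for low-power sensors.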
A multimodal gesture recognition dataset for desktop human-computer interaction
Gesture recognition is an indispensable component of natural and efficient human-computer interaction technology, particularly in desktop-level applications, where it can significantly enhance people's productivity.
Tackling Electrode Shift In Gesture Recognition with HD-EMG Electrode Subsets
sEMG pattern recognition algorithms have been explored extensively for decoding movement intent, yet they are known to be vulnerable to changing recording conditions, exhibiting significant drops in performance across subjects and even across sessions.
ICI-Free Channel Estimation and Wireless Gesture Recognition Based on Cellular Signals
Device-free wireless sensing has attracted enormous attention because it senses the environment without requiring additional devices.
EMG subspace alignment and visualization for cross-subject hand gesture classification
Electromyograms (EMG)-based hand gesture recognition systems are a promising technology for human/machine interfaces.
Language Modeling on a SpiNNaker 2 Neuromorphic Chip
In this work, we demonstrate the first-ever implementation of a language model on a neuromorphic device - specifically the SpiNNaker 2 chip - based on a recently published event-based architecture called the EGRU.
Physical-Layer Semantic-Aware Network for Zero-Shot Wireless Sensing
Motivated by the observation that signals recorded by wireless receivers are closely related to a set of physical-layer semantic features, we propose a novel zero-shot wireless sensing solution that allows models trained in one or a limited number of locations to be transferred directly to other locations without any labeled data.
Interpretable Underwater Diver Gesture Recognition
In recent years, the use of Autonomous Underwater Vehicles has grown rapidly.