Construct Dynamic Graphs for Hand Gesture Recognition via Spatial-Temporal Attention

20 Jul 2019  ·  Yuxiao Chen, Long Zhao, Xi Peng, Jianbo Yuan, Dimitris N. Metaxas

We propose a Dynamic Graph-Based Spatial-Temporal Attention (DG-STA) method for hand gesture recognition. The key idea is to first construct a fully-connected graph from a hand skeleton, where the node features and edges are then automatically learned via a self-attention mechanism that operates in both the spatial and temporal domains. We further propose to leverage the spatial-temporal cues of joint positions to guarantee robust recognition in challenging conditions. In addition, a novel spatial-temporal mask is applied to cut the computational cost by 99%. We carry out extensive experiments on benchmarks (DHG-14/28 and SHREC'17) and demonstrate the superior performance of our method compared with state-of-the-art methods. The source code can be found at https://github.com/yuxiaochen1103/DG-STA.
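For intuition, the sketch below shows one way masked spatial-temporal self-attention over a fully-connected hand-skeleton graph could be written in PyTorch. It is a minimal illustration, not the authors' released code (see the linked repository for that): the joint count, frame count, single attention head, and names such as `build_st_mask` and `MaskedSTAttention` are illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation) of masked spatial-temporal
# self-attention over a fully-connected hand-skeleton graph, assuming
# T frames x J joints per sequence and a single attention head for clarity.
import torch
import torch.nn as nn
import torch.nn.functional as F


def build_st_mask(T: int, J: int) -> torch.Tensor:
    """Boolean (T*J, T*J) mask: a node attends only to nodes in the same frame
    (spatial edges) or to the same joint across frames (temporal edges)."""
    frame = torch.arange(T).repeat_interleave(J)   # frame index of each node
    joint = torch.arange(J).repeat(T)              # joint index of each node
    same_frame = frame.unsqueeze(0) == frame.unsqueeze(1)
    same_joint = joint.unsqueeze(0) == joint.unsqueeze(1)
    return same_frame | same_joint                 # True = edge is kept


class MaskedSTAttention(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        # x: (batch, T*J, dim) node features of the fully-connected graph
        q, k, v = self.q(x), self.k(x), self.v(x)
        scores = q @ k.transpose(-2, -1) / x.size(-1) ** 0.5
        scores = scores.masked_fill(~mask, float('-inf'))  # prune masked edges
        attn = F.softmax(scores, dim=-1)                    # learned edge weights
        return attn @ v                                     # updated node features


if __name__ == "__main__":
    T, J, dim = 8, 22, 64          # placeholder sizes, e.g. 22 hand joints
    x = torch.randn(2, T * J, dim)
    mask = build_st_mask(T, J)
    out = MaskedSTAttention(dim)(x, mask)
    print(out.shape)               # torch.Size([2, 176, 64])
```

In this sketch the mask keeps only same-frame and same-joint node pairs, shrinking the attended edge set from (TJ)^2 to TJ(T+J-1); a reduction of this kind is what the large computational saving reported in the abstract refers to.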


Datasets


| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|------|---------|-------|-------------|--------------|-------------|
| Hand Gesture Recognition | DHG-14 | DG-STA | Accuracy | 91.9 | #4 |
| Hand Gesture Recognition | DHG-28 | DG-STA | Accuracy | 88 | #4 |
| Hand Gesture Recognition | SHREC 2017 | DG-STA | 14 gestures accuracy | 94.4 | #3 |
| Hand Gesture Recognition | SHREC 2017 | DG-STA | 28 gestures accuracy | 90.7 | #3 |

Methods


No methods listed for this paper.