Continuous and Simultaneous Gesture and Posture Recognition for Commanding a Robotic Wheelchair; Towards Spotting the Signal Patterns

2 Dec 2015 · Ali Boyali, Naohisa Hashimoto, Manolya Kavakli

Spotting signal patterns of varying lengths is still an open problem in the literature. In this study, we describe a signal pattern recognition approach for the continuous and simultaneous classification of a tracked hand's postures and gestures, which are mapped to steering commands for controlling a robotic wheelchair. The developed methodology not only achieves 100% recognition accuracy for continuous recognition on a streaming signal, but also offers a new approach to building a training dictionary that eliminates the need for human intervention to spot gestures or postures in the training signal. In the training phase, we employ a state-of-the-art subspace clustering method to find the most representative state samples. The recognition and training framework intrinsically reveals the boundaries of the patterns in the streaming signal through a successive decision-tree structure. We make use of Collaborative and Block Sparse Representation based classification methods for continuous gesture and posture recognition.
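As a rough illustration of the classification step, the sketch below is a minimal NumPy example of collaborative-representation classification with class-wise residuals, in the spirit of the Collaborative and Block Sparse Representation classifiers mentioned in the abstract. The feature dimension, the steering-command class names, and the toy data are assumptions for illustration only, not details taken from the paper.

```python
import numpy as np

def crc_classify(D, labels, y, lam=1e-3):
    """Classify a query vector y against a class-labelled dictionary D
    using a collaborative (ridge-regularised) representation.

    D      : (d, n) array whose columns are training feature vectors.
    labels : (n,) array of class labels, one per dictionary column.
    y      : (d,) query feature vector (e.g., one window of the streaming
             hand-tracking signal).
    lam    : ridge regularisation weight.
    """
    n = D.shape[1]
    # Collaborative coding: alpha = (D^T D + lam*I)^{-1} D^T y
    alpha = np.linalg.solve(D.T @ D + lam * np.eye(n), D.T @ y)

    best_label, best_score = None, np.inf
    for c in np.unique(labels):
        block = labels == c
        # Class-wise reconstruction residual, normalised by the code energy
        resid = np.linalg.norm(y - D[:, block] @ alpha[block])
        score = resid / (np.linalg.norm(alpha[block]) + 1e-12)
        if score < best_score:
            best_label, best_score = c, score
    return best_label

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy dictionary: 20 samples per (hypothetical) steering class,
    # drawn around class-specific means in a 16-dimensional feature space.
    classes = ["forward", "left", "right", "stop"]
    D = np.hstack([rng.normal(loc=i, scale=0.3, size=(16, 20))
                   for i, _ in enumerate(classes)])
    labels = np.repeat(classes, 20)
    query = rng.normal(loc=2, scale=0.3, size=16)  # resembles "right"
    print(crc_classify(D, labels, query))
```

In a streaming setting, such a classifier would be applied to successive windows of the tracked-hand signal, with the dictionary columns drawn from the representative samples selected by the subspace clustering step described in the abstract.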
