Development of a hand gesture based control interface using Deep Learning

This paper describes the implementation of a control system based on ten different hand gestures, providing a useful approach toward more user-friendly human-machine interfaces. Hands are detected and tracked with fast detection and tracking algorithms, and gestures are classified by a lightweight convolutional neural network. Experimental results show a real-time response with an accuracy of 95.09% at low power consumption. These results demonstrate that the proposed system could be applied in a wide range of applications such as virtual reality, robotics, autonomous driving systems, human-machine interfaces, and augmented reality.
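The abstract does not publish the network architecture, so the following is only an illustrative sketch of what a lightweight CNN classifier for ten gesture classes might look like; every layer size, the input resolution, and the single-channel input are assumptions, not the authors' design.

```python
import torch
import torch.nn as nn

class LightGestureCNN(nn.Module):
    """Illustrative lightweight CNN for 10-class hand-gesture classification.

    All layer sizes are assumptions; the paper does not specify its architecture.
    """
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # assumed grayscale hand crop
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        # 64x64 input halved twice by pooling -> 16x16 feature maps
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = LightGestureCNN()
logits = model(torch.randn(1, 1, 64, 64))  # one assumed 64x64 hand crop
print(logits.shape)
```

In a full pipeline of the kind the abstract describes, a fast detector/tracker would first localize the hand in each frame, and only the cropped hand region would be passed to a small classifier like this, keeping per-frame compute (and hence power consumption) low.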
