A Flexible and Modular Body-Machine Interface for Individuals Living with Severe Disabilities

29 Jul 2020  ·  Cheikh Latyr Fall, Ulysse Côté-Allard, Quentin Mascret, Alexandre Campeau-Lecours, Mounir Boukadoum, Clément Gosselin, Benoit Gosselin

This paper presents a control interface that translates the residual body motions of individuals living with severe disabilities into control commands for body-machine interaction. A custom wireless, wearable multi-sensor network collects motion data from multiple points on the body in real time. The proposed solution successfully leverages electromyography gesture-recognition techniques for the recognition of inertial measurement unit (IMU)-based commands, without the need for cumbersome and noisy surface electrodes. Motion pattern recognition is performed with a computationally inexpensive classifier (linear discriminant analysis) so that the solution can be deployed on lightweight embedded platforms. Five participants (three able-bodied and two living with upper-body disabilities) presenting different motion limitations (e.g., spasms, reduced range of motion) were recruited. They were asked to perform up to nine different motion classes, including head, shoulder, finger, and foot motions, according to their residual functional capacities. The measured prediction performance shows an average accuracy of 99.96% for able-bodied individuals and 91.66% for participants with upper-body disabilities. The recorded dataset has also been made available online to the research community. A proof of concept for real-time use of the system is given through an assembly task replicating activities of daily living, using the JACO arm from Kinova Robotics.
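To illustrate the classification approach the abstract describes, the sketch below trains a scikit-learn linear discriminant analysis classifier on feature vectors extracted from sliding windows of IMU data. This is not the authors' implementation: the window length, the nine-channel layout, the mean/standard-deviation/RMS feature set, and the synthetic data standing in for the recorded dataset are all assumptions made for demonstration.

```python
# Minimal sketch of LDA-based motion classification on windowed IMU features.
# Assumptions (not from the paper): 9-channel IMU windows of 50 samples,
# mean/std/RMS per-channel features, and synthetic data in place of the
# recorded dataset.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split


def extract_features(window: np.ndarray) -> np.ndarray:
    """Compute simple per-channel features from one IMU window.

    window: (n_samples, n_channels) array of accel/gyro readings.
    """
    return np.concatenate([
        window.mean(axis=0),                   # mean per channel
        window.std(axis=0),                    # standard deviation per channel
        np.sqrt((window ** 2).mean(axis=0)),   # RMS per channel
    ])


rng = np.random.default_rng(0)
n_classes, n_windows, n_samples, n_channels = 9, 200, 50, 9

# Synthetic stand-in data: each motion class gets a distinct channel offset.
X, y = [], []
for label in range(n_classes):
    for _ in range(n_windows):
        w = rng.normal(loc=0.5 * label, scale=1.0, size=(n_samples, n_channels))
        X.append(extract_features(w))
        y.append(label)
X, y = np.asarray(X), np.asarray(y)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, random_state=0
)
clf = LinearDiscriminantAnalysis()  # computationally inexpensive, as in the paper
clf.fit(X_train, y_train)
print(f"held-out accuracy: {accuracy_score(y_test, clf.predict(X_test)):.3f}")
```

LDA's appeal here is that prediction reduces to a single matrix-vector product plus an argmax, which is why the paper can target lightweight embedded platforms; heavier classifiers would offer little benefit on low-dimensional windowed features like these.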
