Search Results for author: Mo Han

Found 8 papers, 0 papers with code

Inference of Upcoming Human Grasp Using EMG During Reach-to-Grasp Movement

no code implementations19 Apr 2021 Mo Han, Mehrshad Zandigohar, Sezen Yagmur Gunay, Gunar Schirner, Deniz Erdogmus

We collected and used data from large gesture vocabularies with multiple dynamic actions to encode transitions from one grasp intent to another, based on common grasp-movement sequences.

Electromyography (EMG) General Classification +2

Multimodal Fusion of EMG and Vision for Human Grasp Intent Inference in Prosthetic Hand Control

no code implementations8 Apr 2021 Mehrshad Zandigohar, Mo Han, Mohammadreza Sharif, Sezen Yagmur Gunay, Mariusz P. Furmanek, Mathew Yarossi, Paolo Bonato, Cagdas Onal, Taskin Padir, Deniz Erdogmus, Gunar Schirner

Conclusion: Our experimental data analyses demonstrate that EMG and visual evidence have complementary strengths; as a consequence, fusing the multimodal evidence can outperform each individual modality at any given time.
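The fusion idea above can be illustrated with a minimal sketch. This is not the paper's actual model; it assumes a naive-Bayes style combination in which each modality independently produces a posterior over grasp types, and the per-class log-probabilities are summed and renormalized. The grasp-type names and probability values are hypothetical.

```python
import math

def normalize(probs):
    """Rescale class scores so they sum to one."""
    total = sum(probs.values())
    return {c: v / total for c, v in probs.items()}

def fuse(emg_probs, vision_probs):
    """Combine two modalities by summing per-class log-probabilities
    (naive-Bayes fusion, assuming the modalities are conditionally
    independent given the grasp class), then renormalizing."""
    log_fused = {c: math.log(emg_probs[c]) + math.log(vision_probs[c])
                 for c in emg_probs}
    return normalize({c: math.exp(v) for c, v in log_fused.items()})

# Hypothetical per-grasp-type posteriors from each modality:
emg = {"power": 0.5, "pinch": 0.3, "tripod": 0.2}     # EMG alone favors "power"
vision = {"power": 0.2, "pinch": 0.6, "tripod": 0.2}  # vision alone favors "pinch"

fused = fuse(emg, vision)
best = max(fused, key=fused.get)  # fused evidence favors "pinch"
```

Here the vision evidence is confident enough to override the EMG-only decision, which is the complementary-strengths behavior the abstract describes.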

Electroencephalogram (EEG) Electromyography (EMG)

From Hand-Perspective Visual Information to Grasp Type Probabilities: Deep Learning via Ranking Labels

no code implementations8 Mar 2021 Mo Han, Sezen Yağmur Günay, İlkay Yıldız, Paolo Bonato, Cagdas D. Onal, Taşkın Padır, Gunar Schirner, Deniz Erdoğmuş

Convolutional neural network-based computer vision control of prosthetic hands has received increased attention as a reliable way to replace or complement physiological signals, since a network trained on visual information can predict the intended hand gesture.
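The last step of such a vision pipeline, mapping a network's raw class scores to grasp-type probabilities, can be sketched as a softmax. This is a generic illustration, not the paper's ranking-label method; the grasp-type names and logit values are hypothetical.

```python
import math

def softmax(scores):
    """Turn raw network scores (logits) over grasp types into a
    probability distribution, using the max-subtraction trick for
    numerical stability."""
    m = max(scores.values())
    exps = {g: math.exp(s - m) for g, s in scores.items()}
    z = sum(exps.values())
    return {g: e / z for g, e in exps.items()}

# Hypothetical logits a vision model might emit for one hand-view image:
logits = {"power": 2.0, "pinch": 0.5, "lateral": -1.0}
probs = softmax(logits)
predicted = max(probs, key=probs.get)  # "power"
```

Emitting a full distribution rather than a single hard label is what lets downstream control (or fusion with EMG evidence) weigh how confident the vision channel is.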

HANDS: A Multimodal Dataset for Modeling Towards Human Grasp Intent Inference in Prosthetic Hands

no code implementations8 Mar 2021 Mo Han, Sezen Yağmur Günay, Gunar Schirner, Taşkın Padır, Deniz Erdoğmuş

Specifically, paired images from the human eye-view and hand-view of various objects placed at different orientations were captured at the initial state of grasping trials, followed by paired video, EMG, and IMU recordings from the arm during a grasp, lift, put-down, and retract trial structure.

Motion Planning

Universal Physiological Representation Learning with Soft-Disentangled Rateless Autoencoders

no code implementations28 Sep 2020 Mo Han, Ozan Ozdenizci, Toshiaki Koike-Akino, Ye Wang, Deniz Erdogmus

Human computer interaction (HCI) involves a multidisciplinary fusion of technologies, through which external devices can be controlled by monitoring the physiological status of users.

Disentanglement Subject Transfer

Disentangled Adversarial Autoencoder for Subject-Invariant Physiological Feature Extraction

no code implementations26 Aug 2020 Mo Han, Ozan Ozdenizci, Ye Wang, Toshiaki Koike-Akino, Deniz Erdogmus

Recent developments in biosignal processing have enabled users to exploit their physiological status for manipulating devices in a reliable and safe manner.

Subject Transfer Transfer Learning

Disentangled Adversarial Transfer Learning for Physiological Biosignals

no code implementations15 Apr 2020 Mo Han, Ozan Ozdenizci, Ye Wang, Toshiaki Koike-Akino, Deniz Erdogmus

Recent developments in wearable sensors demonstrate promising results for monitoring physiological status in effective and comfortable ways.

Transfer Learning
