EEG emotion recognition using dynamical graph convolutional neural networks

In this paper, we propose a multichannel EEG emotion recognition method based on a novel dynamical graph convolutional neural network (DGCNN). The basic idea is to model the multichannel EEG features with a graph and then perform emotion classification on this model. Unlike traditional graph convolutional neural network (GCNN) methods, the proposed DGCNN can dynamically learn the intrinsic relationships between electroencephalogram (EEG) channels, represented by an adjacency matrix, during network training; the learned adjacency matrix is then used to extract more discriminative features for EEG emotion recognition. We conduct extensive experiments on the SJTU Emotion EEG Dataset (SEED) and the DREAMER dataset. The experimental results demonstrate that the proposed method outperforms state-of-the-art methods: on SEED, it achieves average recognition accuracies of 90.4% in the subject-dependent experiment and 79.95% in the subject-independent cross-validation experiment; on DREAMER, it obtains average accuracies of 86.23%, 84.54%, and 85.02% for valence, arousal, and dominance classification, respectively.
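To make the idea of a learnable adjacency matrix concrete, here is a minimal sketch of one dynamical graph convolution forward pass over EEG channels. This is an illustration of the general technique, not the authors' exact implementation: the function name `cheb_graph_conv`, the channel/feature dimensions, and the approximation of the maximum Laplacian eigenvalue as 2 are all assumptions. In the full model, `A` would be a trainable parameter updated by backpropagation alongside the filter weights.

```python
import numpy as np

def cheb_graph_conv(X, A, Theta):
    """Chebyshev graph convolution over EEG channels (hypothetical sketch).

    X:     (n_channels, n_features) per-channel EEG features
    A:     (n_channels, n_channels) adjacency matrix; learnable in DGCNN
    Theta: list of K filter matrices, each (n_features, n_out)
    """
    n = A.shape[0]
    A = np.maximum(A, 0.0)            # keep channel weights non-negative
    D = np.diag(A.sum(axis=1))        # degree matrix
    L = D - A                         # graph Laplacian
    # Rescale so eigenvalues lie roughly in [-1, 1]
    # (assumption: lambda_max ~ 2, a common simplification)
    L_tilde = L - np.eye(n)
    # Chebyshev recursion: T0 = I, T1 = L~, Tk = 2 L~ T(k-1) - T(k-2)
    T = [np.eye(n), L_tilde]
    out = T[0] @ X @ Theta[0] + T[1] @ X @ Theta[1]
    for k in range(2, len(Theta)):
        T.append(2.0 * L_tilde @ T[-1] - T[-2])
        out += T[k] @ X @ Theta[k]
    return np.maximum(out, 0.0)       # ReLU activation

# Example dimensions: 62 SEED electrodes, 5 frequency-band features per channel
rng = np.random.default_rng(0)
n_ch, n_feat, n_out, K = 62, 5, 8, 2
X = rng.standard_normal((n_ch, n_feat))
A = rng.random((n_ch, n_ch))          # would be learned, not random, in DGCNN
Theta = [0.1 * rng.standard_normal((n_feat, n_out)) for _ in range(K)]
H = cheb_graph_conv(X, A, Theta)
print(H.shape)                        # (62, 8)
```

Making `A` a trainable parameter is what distinguishes this from a fixed-graph GCNN: the model discovers which electrode pairs are most informative for emotion discrimination rather than relying on a hand-crafted spatial topology.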


Results from the Paper


Task: Electroencephalogram (EEG)
Dataset: SEED-IV
Model: DGCNN
Metric: Accuracy 69.88 (Global Rank #2)
