FOIT: Fast Online Instance Transfer for Improved EEG Emotion Recognition

Electroencephalogram (EEG)-based emotion recognition is promising yet limited by its need for a large amount of training data. Collecting enough labeled samples in the training trials is key to generalization on the test trials, but this process is time-consuming and laborious. In recent years, several studies have proposed semi-supervised learning (e.g., active learning) and transfer learning (e.g., domain adaptation, style transfer mapping) methods to reduce the demand for training data. However, most are iterative methods that require considerable training time and are infeasible in practice. To tackle this problem, we present Fast Online Instance Transfer (FOIT) for improved affective brain-computer interfaces (aBCIs). FOIT heuristically selects auxiliary data from historical sessions and/or other subjects, combines them with the training data for supervised training, and makes predictions on the test trials with a multi-classifier ensemble. As a one-shot algorithm, FOIT avoids time-consuming iterations. Experimental results show that FOIT brings significant accuracy improvements for three-category classification (1%-8%) on the SEED dataset and four-category classification (1%-14%) on the SEED-IV dataset in the cross-subject, cross-session, and cross-all scenarios. The time cost over the baselines is moderate (~35 s on average on our machine), whereas the iterative methods require far more time (~45 s to ~900 s) to reach comparable accuracies. FOIT provides a simple, fast, and practically feasible solution for improving the generalization of aBCIs and allows free choice of classifiers without constraints. Our code is available online.
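The abstract's pipeline (heuristic auxiliary-instance selection, one-shot supervised training, multi-classifier ensemble voting) can be sketched as follows. This is a hypothetical illustration with synthetic data, not the authors' implementation: the selection heuristic (confidence of a reference classifier on auxiliary labels), the 0.7 threshold, and the particular classifiers are all assumptions for demonstration.

```python
# Hypothetical sketch of FOIT-style one-shot instance transfer (not the
# authors' code): heuristically select auxiliary samples, merge them with
# the target training trials, then predict via a majority-vote ensemble.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.svm import LinearSVC
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def make_session(n, shift):
    """Synthetic stand-in for EEG features from one session/subject."""
    X = rng.normal(shift, 1.0, size=(n, 8))
    y = (X[:, 0] + X[:, 1] > 2 * shift).astype(int)
    return X, y

X_train, y_train = make_session(100, 0.0)  # current-session training trials
X_aux, y_aux = make_session(300, 0.3)      # historical sessions / other subjects
X_test, y_test = make_session(100, 0.0)    # test trials

# Assumed heuristic: keep auxiliary instances on which a reference model
# (fit on the training trials) agrees confidently with the auxiliary labels.
ref = LogisticRegression(max_iter=1000).fit(X_train, y_train)
proba = ref.predict_proba(X_aux)
conf = proba[np.arange(len(y_aux)), y_aux]
keep = conf > 0.7                          # assumed confidence threshold
X_all = np.vstack([X_train, X_aux[keep]])
y_all = np.concatenate([y_train, y_aux[keep]])

# One-shot supervised training of a small ensemble; majority vote on test.
models = [LogisticRegression(max_iter=1000),
          LinearSVC(),
          RandomForestClassifier(n_estimators=50, random_state=0)]
votes = np.stack([m.fit(X_all, y_all).predict(X_test) for m in models])
pred = (votes.mean(axis=0) > 0.5).astype(int)
print(f"ensemble accuracy: {(pred == y_test).mean():.2f}")
```

Because selection and training each happen once, there is no iterative refit loop, which is what keeps the runtime low relative to iterative transfer methods.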
