Emotion Classification

94 papers with code • 10 benchmarks • 27 datasets

Emotion classification, or emotion categorization, is the task of recognising emotions and assigning them to the corresponding category. Given an input, classify it as 'neutral or no emotion' or as one or more of several given emotions that best represent the subject's mental state, as expressed through facial expressions, words, and so on. Example benchmarks include ROCStories, Many Faces of Anger (MFA), and GoEmotions. Models can be evaluated with metrics such as the Concordance Correlation Coefficient (CCC) and the Mean Squared Error (MSE).
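As a quick reference for the metrics mentioned above, here is a minimal NumPy sketch of Lin's Concordance Correlation Coefficient alongside MSE; the function names are illustrative, not from any specific library:

```python
import numpy as np

def concordance_ccc(x, y):
    """Lin's Concordance Correlation Coefficient between two 1-D arrays.

    CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))^2)
    Ranges from -1 to 1, with 1 meaning perfect agreement.
    """
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()            # population variances
    cov = ((x - mx) * (y - my)).mean()   # population covariance
    return 2 * cov / (vx + vy + (mx - my) ** 2)

def mse(x, y):
    """Mean Squared Error between predictions and targets."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    return ((x - y) ** 2).mean()
```

For example, `concordance_ccc([1, 2, 3], [1, 2, 3])` is 1.0 (perfect agreement), while `mse` for the same pair is 0.0.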

KAM -- a Kernel Attention Module for Emotion Classification with EEG Data

dykuang/bci-attention 17 Aug 2022

In this work, a kernel attention module is presented for the task of EEG-based emotion classification with neural networks.

A Monotonicity Constrained Attention Module for Emotion Classification with Limited EEG Data

dykuang/bci-attention 17 Aug 2022

In this work, a parameter-efficient attention module is presented for emotion classification using a limited, or relatively small, number of electroencephalogram (EEG) signals.


Dilated Context Integrated Network with Cross-Modal Consensus for Temporal Emotion Localization in Videos

yyjmjc/temporal-emotion-localization-in-videos 3 Aug 2022

In this paper, we introduce a new task, named Temporal Emotion Localization in videos (TEL), which aims to detect human emotions and localize their corresponding temporal boundaries in untrimmed videos with aligned subtitles.


ArmanEmo: A Persian Dataset for Text-based Emotion Detection

arman-rayan-sharif/arman-text-emotion 24 Jul 2022

With the recent proliferation of open textual data on social media platforms, Emotion Detection (ED) from text has received increasing attention in recent years.


BYEL: Bootstrap Your Emotion Latent

rhtm02/Bootstrap-Your-Emotion-Latent 20 Jul 2022

With the improved performance of deep learning, the number of studies trying to apply deep learning to human emotion analysis is increasing rapidly.


GraphCFC: A Directed Graph Based Cross-Modal Feature Complementation Approach for Multimodal Conversational Emotion Recognition

lijfrank-open/GraphCFC 6 Jul 2022

In multimodal ERC, GNNs are capable of extracting both long-distance contextual information and inter-modal interactive information.


Accurate Emotion Strength Assessment for Seen and Unseen Speech Based on Data-Driven Deep Learning

ttslr/strengthnet 15 Jun 2022

In this paper, we propose a data-driven deep learning model, i.e., StrengthNet, to improve the generalization of emotion strength assessment for seen and unseen speech.


None Class Ranking Loss for Document-Level Relation Extraction

yangzhou12/ncrl 1 May 2022

This ignores the context of entity pairs and the label correlations between the none class and pre-defined classes, leading to sub-optimal predictions.


Exploiting Multiple EEG Data Domains with Adversarial Learning

philipph77/acse-framework 16 Apr 2022

Electroencephalography (EEG) has been shown to be a valuable data source for evaluating subjects' mental states.


Is Cross-Attention Preferable to Self-Attention for Multi-Modal Emotion Recognition?

smartcameras/selfcrossattn 18 Feb 2022

Generally, models that fuse complementary information from multiple modalities outperform their uni-modal counterparts.

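To illustrate the distinction this paper's title raises (this is a rough NumPy sketch, not the paper's implementation): in self-attention, queries, keys, and values all come from the same modality, whereas in cross-attention, queries from one modality attend to keys and values of another.

```python
import numpy as np

def attention(q, k, v):
    """Scaled dot-product attention (single head, no masking)."""
    scores = q @ k.T / np.sqrt(q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ v

rng = np.random.default_rng(0)
audio = rng.standard_normal((10, 64))  # e.g. 10 audio frames, 64-dim features
text = rng.standard_normal((7, 64))    # e.g. 7 text tokens, 64-dim features

# Self-attention: audio attends to itself.
self_out = attention(audio, audio, audio)    # shape (10, 64)

# Cross-attention: audio queries attend to text keys/values.
cross_out = attention(audio, text, text)     # shape (10, 64)
```

Both variants produce one output vector per query, so the fused representation keeps the query modality's sequence length; whether fusing via cross-attention beats concatenating self-attended streams is the empirical question the paper investigates.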