Emotion Classification
94 papers with code • 10 benchmarks • 27 datasets
Emotion classification, or emotion categorization, is the task of recognizing emotions and assigning them to the correct category. Given an input, a model classifies it as 'neutral or no emotion' or as one or more of several given emotions that best represent the subject's mental state, as conveyed by facial expression, words, and so on. Example benchmarks include ROCStories, Many Faces of Anger (MFA), and GoEmotions. Models can be evaluated using metrics such as the Concordance Correlation Coefficient (CCC) and the Mean Squared Error (MSE).
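As a rough sketch of how these two metrics are computed, the snippet below implements CCC and MSE from their standard definitions using only the Python standard library. The `true`/`pred` score lists are made-up illustrative values, not data from any of the benchmarks above; dimensional emotion scores (e.g. valence or arousal) are assumed as the prediction target, since CCC applies to continuous outputs.

```python
from statistics import fmean

def ccc(y_true, y_pred):
    # Concordance Correlation Coefficient:
    # CCC = 2*cov(t, p) / (var(t) + var(p) + (mean(t) - mean(p))**2)
    m_t, m_p = fmean(y_true), fmean(y_pred)
    var_t = fmean((t - m_t) ** 2 for t in y_true)
    var_p = fmean((p - m_p) ** 2 for p in y_pred)
    cov = fmean((t - m_t) * (p - m_p) for t, p in zip(y_true, y_pred))
    return 2 * cov / (var_t + var_p + (m_t - m_p) ** 2)

def mse(y_true, y_pred):
    # Mean Squared Error between annotations and predictions
    return fmean((t - p) ** 2 for t, p in zip(y_true, y_pred))

# Hypothetical annotated valence scores vs. model predictions
true = [0.1, 0.4, 0.7, 0.9]
pred = [0.2, 0.35, 0.6, 0.95]
print(ccc(true, pred))  # close to 1.0 for good agreement
print(mse(true, pred))  # close to 0.0 for good agreement
```

Unlike Pearson correlation, CCC penalizes both scale and location shifts between predictions and annotations, which is why it is preferred for dimensional emotion benchmarks.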
Libraries
Use these libraries to find Emotion Classification models and implementations.
Latest papers
EmoVIT: Revolutionizing Emotion Insights with Visual Instruction Tuning
Visual Instruction Tuning represents a novel learning paradigm involving the fine-tuning of pre-trained language models using task-specific instructions.
VLLMs Provide Better Context for Emotion Understanding Through Common Sense Reasoning
In the first stage, we propose prompting VLLMs to generate natural-language descriptions of the subject's apparent emotion relative to the visual context.
GiMeFive: Towards Interpretable Facial Emotion Classification
Deep convolutional neural networks have been shown to successfully recognize facial emotions in recent years in the realm of computer vision.
TONE: A 3-Tiered ONtology for Emotion analysis
Furthermore, we describe three distinct use cases that demonstrate the applicability of our ontology.
Investigating Shallow and Deep Learning Techniques for Emotion Classification in Short Persian Texts
The identification of emotions in short texts of low-resource languages poses a significant challenge, requiring specialized frameworks and computational intelligence techniques.
Learning Arousal-Valence Representation from Categorical Emotion Labels of Speech
In this work, we propose to learn the AV representation from categorical emotion labels of speech.
Impact of time and note duration tokenizations on deep learning symbolic music modeling
Symbolic music is widely used in various deep learning tasks, including generation, transcription, synthesis, and Music Information Retrieval (MIR).
EmoNeXt: an Adapted ConvNeXt for Facial Emotion Recognition
Facial expressions play a crucial role in human communication, serving as a powerful and impactful means of expressing a wide range of emotions.
Emotion4MIDI: a Lyrics-based Emotion-Labeled Symbolic Music Dataset
We present a new large-scale emotion-labeled symbolic music dataset consisting of 12k MIDI songs.
Label-Aware Hyperbolic Embeddings for Fine-grained Emotion Classification
Fine-grained emotion classification (FEC) is a challenging task.