Sentiment Analysis
1296 papers with code • 39 benchmarks • 93 datasets
Sentiment Analysis is the task of classifying the polarity of a given text. For instance, a text-based tweet can be categorized as "positive", "negative", or "neutral". Given the text and accompanying labels, a model can be trained to predict the correct sentiment.
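As a minimal illustration of the supervised setup described above, the sketch below trains a TF-IDF + logistic regression classifier on a handful of hypothetical labeled texts (toy examples, not a real benchmark) and predicts the polarity of a new text:

```python
# Minimal supervised sentiment classifier: TF-IDF features + logistic
# regression, trained on a tiny hypothetical labeled corpus.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "I love this movie, it was great",
    "What a fantastic experience",
    "This was terrible and boring",
    "I hate waiting in long lines",
    "The package arrived on Tuesday",
    "The meeting is scheduled for noon",
]
labels = ["positive", "positive", "negative", "negative", "neutral", "neutral"]

# Fit a text-classification pipeline on the labeled examples.
clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
clf.fit(texts, labels)

# Predict the polarity of an unseen text.
print(clf.predict(["I love this, it was great"])[0])
```

Real systems train on far larger corpora (see the datasets below); the pipeline shape, however, is the same: vectorize text, fit a classifier, predict labels.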
Sentiment Analysis techniques can be categorized into machine learning approaches, lexicon-based approaches, and hybrid methods that combine the two. Subcategories of sentiment analysis research include multimodal sentiment analysis, aspect-based sentiment analysis, fine-grained opinion analysis, and language-specific sentiment analysis.
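A lexicon-based approach needs no training data: it scores a text against hand-curated word lists. The sketch below uses two hypothetical toy lexicons (not a published resource such as ANEW or VADER) to show the idea:

```python
# Toy lexicon-based sentiment: count polarity-bearing words from
# hypothetical positive/negative word lists and compare the totals.
POSITIVE = {"good", "great", "love", "excellent", "happy"}
NEGATIVE = {"bad", "terrible", "hate", "awful", "sad"}

def lexicon_sentiment(text: str) -> str:
    """Classify polarity by positive-minus-negative word count."""
    tokens = text.lower().split()
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(lexicon_sentiment("I love this great phone"))   # positive
print(lexicon_sentiment("This was a terrible day"))   # negative
print(lexicon_sentiment("The train leaves at noon"))  # neutral
```

Hybrid methods typically use such lexicon scores as extra features inside a machine learning classifier.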
More recently, pre-trained deep learning models such as RoBERTa and T5 have been fine-tuned into high-performing sentiment classifiers, which are evaluated with metrics such as precision, recall, and F1. Benchmark datasets such as SST, GLUE, and the IMDb movie reviews corpus are commonly used to evaluate sentiment analysis systems.
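The metrics mentioned above can be computed with scikit-learn; the snippet below uses hypothetical gold labels and predictions (not results from any real benchmark) and macro averaging so each of the three classes counts equally:

```python
# Computing precision, recall, and F1 for a three-class sentiment task
# with scikit-learn, on hypothetical predictions.
from sklearn.metrics import precision_score, recall_score, f1_score

y_true = ["positive", "negative", "neutral", "positive", "negative"]
y_pred = ["positive", "negative", "positive", "positive", "neutral"]

# Macro averaging computes each metric per class, then takes the unweighted
# mean, so minority classes are not drowned out.
p = precision_score(y_true, y_pred, average="macro", zero_division=0)
r = recall_score(y_true, y_pred, average="macro", zero_division=0)
f = f1_score(y_true, y_pred, average="macro", zero_division=0)
print(f"precision={p:.3f} recall={r:.3f} f1={f:.3f}")
```

Leaderboards sometimes report accuracy or micro-averaged scores instead, so the averaging choice matters when comparing systems.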
Subtasks
- Aspect-Based Sentiment Analysis (ABSA)
- Multimodal Sentiment Analysis
- Aspect Sentiment Triplet Extraction
- Twitter Sentiment Analysis
- Aspect Term Extraction and Sentiment Classification
- Target-oriented Opinion Words Extraction
- Arabic Sentiment Analysis
- Persian Sentiment Analysis
- Aspect-oriented Opinion Extraction
- Fine-Grained Opinion Analysis
- Aspect-Sentiment-Opinion Triplet Extraction
- Aspect-Category-Opinion-Sentiment Quadruple Extraction
- Vietnamese Aspect-Based Sentiment Analysis
- Vietnamese Sentiment Analysis
- PCL Detection
Most implemented papers
emoji2vec: Learning Emoji Representations from their Description
Many current natural language processing applications for social media rely on representation learning and utilize pre-trained word embeddings.
Using millions of emoji occurrences to learn any-domain representations for detecting sentiment, emotion and sarcasm
NLP tasks are often limited by scarcity of manually annotated data.
Multi-Task Deep Neural Networks for Natural Language Understanding
In this paper, we present a Multi-Task Deep Neural Network (MT-DNN) for learning representations across multiple natural language understanding (NLU) tasks.
Simplifying Graph Convolutional Networks
Graph Convolutional Networks (GCNs) and their variants have experienced significant attention and have become the de facto methods for learning graph representations.
TinyBERT: Distilling BERT for Natural Language Understanding
To accelerate inference and reduce model size while maintaining accuracy, we first propose a novel Transformer distillation method that is specially designed for knowledge distillation (KD) of the Transformer-based models.
Mockingjay: Unsupervised Speech Representation Learning with Deep Bidirectional Transformer Encoders
We present Mockingjay as a new speech representation learning approach, where bidirectional Transformer encoders are pre-trained on a large amount of unlabeled speech.
ZEN: Pre-training Chinese Text Encoder Enhanced by N-gram Representations
Moreover, it is shown that reasonable performance can be obtained when ZEN is trained on a small corpus, which is important for applying pre-training techniques to scenarios with limited data.
SKEP: Sentiment Knowledge Enhanced Pre-training for Sentiment Analysis
In particular, the prediction of aspect-sentiment pairs is converted into multi-label classification, aiming to capture the dependency between words in a pair.
A new ANEW: Evaluation of a word list for sentiment analysis in microblogs
Sentiment analysis of microblogs such as Twitter has recently gained a fair amount of attention.
Sentiment Analysis of Twitter Data for Predicting Stock Market Movements
In this paper, we apply sentiment analysis and supervised machine learning principles to tweets extracted from Twitter and analyze the correlation between a company's stock market movements and the sentiment expressed in tweets.