Sentiment Classification

305 papers with code • 1 benchmark • 9 datasets

Sentiment classification is the task of assigning a sentiment polarity (e.g., positive, negative, or neutral) to a piece of text such as a review, tweet, or sentence.

Most implemented papers

A Structured Self-attentive Sentence Embedding

jadore801120/attention-is-all-you-need-pytorch 9 Mar 2017

This paper proposes a new model for extracting an interpretable sentence embedding by introducing self-attention.
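The self-attention step this paper introduces can be sketched in a few lines of NumPy. The attention matrix is A = softmax(Ws2 · tanh(Ws1 · Hᵀ)) and the matrix embedding is M = A · H, following the paper's notation; the dimension sizes and random inputs below are illustrative placeholders.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attentive_embedding(H, Ws1, Ws2):
    """Structured self-attentive sentence embedding (Lin et al., 2017).

    H:   (n, 2u)  hidden states of a bidirectional LSTM over n tokens
    Ws1: (da, 2u) first attention projection
    Ws2: (r, da)  one row per attention hop
    Returns A (r, n) attention weights and M (r, 2u), a matrix embedding
    whose rows attend to different parts of the sentence.
    """
    A = softmax(Ws2 @ np.tanh(Ws1 @ H.T), axis=-1)  # (r, n), rows sum to 1
    M = A @ H                                       # (r, 2u)
    return A, M

# Placeholder inputs: 6 tokens, 8-dim hidden states, 5-dim attention, 3 hops.
rng = np.random.default_rng(0)
H = rng.standard_normal((6, 8))
A, M = self_attentive_embedding(H, rng.standard_normal((5, 8)),
                                rng.standard_normal((3, 5)))
```

In the paper each row of A is a separate "view" of the sentence, and a penalty term encourages the rows to attend to different tokens.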

Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks

stanfordnlp/treelstm IJCNLP 2015

Because of their superior ability to preserve sequence information over time, Long Short-Term Memory (LSTM) networks, a type of recurrent neural network with a more complex computational unit, have obtained strong results on a variety of sequence modeling tasks.
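The Child-Sum Tree-LSTM variant from this paper generalizes the LSTM cell from a chain to a tree: a node sums its children's hidden states and keeps one forget gate per child. A minimal NumPy sketch of one node update, with the gate equations following the paper and the dimensions and random inputs as placeholders:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def child_sum_treelstm_node(x, child_h, child_c, W, U, b):
    """One Child-Sum Tree-LSTM cell update (Tai et al., 2015).

    x: (d,) input at this node; child_h, child_c: (k, m) states of k children.
    W: dict of (m, d), U: dict of (m, m), b: dict of (m,) for gates i, f, o, u.
    """
    h_tilde = child_h.sum(axis=0)                       # summed child states
    i = sigmoid(W['i'] @ x + U['i'] @ h_tilde + b['i'])
    o = sigmoid(W['o'] @ x + U['o'] @ h_tilde + b['o'])
    u = np.tanh(W['u'] @ x + U['u'] @ h_tilde + b['u'])
    # One forget gate per child, conditioned on that child's own hidden state.
    f = sigmoid(child_h @ U['f'].T + W['f'] @ x + b['f'])  # (k, m)
    c = i * u + (f * child_c).sum(axis=0)
    h = o * np.tanh(c)
    return h, c

# Placeholder node: 4-dim input, 5-dim state, 3 children.
rng = np.random.default_rng(1)
d, m, k = 4, 5, 3
W = {g: rng.standard_normal((m, d)) for g in 'ifou'}
U = {g: rng.standard_normal((m, m)) for g in 'ifou'}
b = {g: np.zeros(m) for g in 'ifou'}
h, c = child_sum_treelstm_node(rng.standard_normal(d),
                               rng.standard_normal((k, m)),
                               rng.standard_normal((k, m)), W, U, b)
```

A leaf node is the special case where `child_h` and `child_c` are empty, so the update reduces to a standard LSTM step.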

A C-LSTM Neural Network for Text Classification

zackhy/TextClassification 27 Nov 2015

In this work, we combine the strengths of both architectures and propose a novel and unified model called C-LSTM for sentence representation and text classification.
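The combination the paper describes can be sketched as a convolution over word-embedding windows whose feature sequence is then read by an LSTM. This is a simplified NumPy illustration (single filter size, stacked gate weights, random placeholder inputs), not the authors' implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def c_lstm_features(E, F, Wx, Wh, b):
    """C-LSTM sketch (Zhou et al., 2015): CNN windows fed to an LSTM.

    E:  (n, d) word embeddings; F: (k, w*d) conv filters over w-word windows;
    Wx: (4m, k), Wh: (4m, m), b: (4m,) stacked LSTM gate parameters.
    Returns the final LSTM hidden state, an (m,) sentence representation.
    """
    w = F.shape[1] // E.shape[1]
    # CNN half: ReLU feature map over every w-word window.
    feats = np.maximum(
        np.stack([E[i:i + w].ravel() for i in range(len(E) - w + 1)]) @ F.T,
        0.0)
    m = Wh.shape[1]
    h, c = np.zeros(m), np.zeros(m)
    for x in feats:                      # LSTM half over the window features
        z = Wx @ x + Wh @ h + b
        i, f, o = sigmoid(z[:m]), sigmoid(z[m:2*m]), sigmoid(z[2*m:3*m])
        c = f * c + i * np.tanh(z[3*m:])
        h = o * np.tanh(c)
    return h

# Placeholder sentence: 8 words, 4-dim embeddings, 6 filters of width 3.
rng = np.random.default_rng(0)
n, d, w, k, m = 8, 4, 3, 6, 5
sent = c_lstm_features(rng.standard_normal((n, d)),
                       rng.standard_normal((k, w * d)),
                       rng.standard_normal((4 * m, k)),
                       rng.standard_normal((4 * m, m)), np.zeros(4 * m))
```

The final hidden state `sent` would then feed a softmax classifier over sentiment labels.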

Effective LSTMs for Target-Dependent Sentiment Classification

songyouwei/ABSA-PyTorch COLING 2016

Target-dependent sentiment classification remains a challenge: modeling the semantic relatedness of a target with its context words in a sentence.
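The TD-LSTM variant from this paper models the target's context with two LSTMs: one reads the left context up to and including the target, the other reads the right context (reversed) up to and including the target, and their final states are concatenated. A NumPy sketch with placeholder dimensions and random weights:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_last(X, Wx, Wh, b):
    """Run a plain LSTM over the rows of X; return the last hidden state."""
    m = Wh.shape[1]
    h, c = np.zeros(m), np.zeros(m)
    for x in X:
        z = Wx @ x + Wh @ h + b
        i, f, o = sigmoid(z[:m]), sigmoid(z[m:2*m]), sigmoid(z[2*m:3*m])
        c = f * c + i * np.tanh(z[3*m:])
        h = o * np.tanh(c)
    return h

def td_lstm(E, t_start, t_end, left_params, right_params):
    """TD-LSTM sketch (Tang et al., 2016).

    E: (n, d) token embeddings; the target spans tokens [t_start, t_end).
    Concatenates the final states of a left-to-right LSTM over the left
    context + target and a right-to-left LSTM over the right context + target.
    """
    h_left = lstm_last(E[:t_end], *left_params)
    h_right = lstm_last(E[t_start:][::-1], *right_params)
    return np.concatenate([h_left, h_right])

# Placeholder sentence of 9 tokens with the target at positions 3-4.
rng = np.random.default_rng(2)
n, d, m = 9, 4, 5
def params():
    return (rng.standard_normal((4 * m, d)),
            rng.standard_normal((4 * m, m)), np.zeros(4 * m))
rep = td_lstm(rng.standard_normal((n, d)), 3, 5, params(), params())
```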

NEZHA: Neural Contextualized Representation for Chinese Language Understanding

PaddlePaddle/PaddleNLP 31 Aug 2019

Pre-trained language models have achieved great success in various natural language understanding (NLU) tasks due to their capacity to capture deep contextualized information in text by pre-training on large-scale corpora.

Aspect Level Sentiment Classification with Deep Memory Network

songyouwei/ABSA-PyTorch EMNLP 2016

The importance degree of each context word and the text representation are calculated with multiple computational layers, each of which is a neural attention model over an external memory.
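One such computational layer (a "hop") can be sketched as attention over the context-word memory followed by a linear transform of the aspect vector; stacking hops deepens the model. The scoring form and weight shapes below are a simplified illustration with random placeholder inputs, not the authors' exact parameterization:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def memory_hop(memory, aspect, w_att, b_att, W_lin):
    """One attention hop over an external memory (after Tang et al., 2016).

    memory: (n, d) context word embeddings (the external memory);
    aspect: (d,) current aspect representation;
    w_att:  (2d,) attention scoring weights, b_att: scalar bias;
    W_lin:  (d, d) linear transform applied to the aspect.
    """
    scores = np.array([w_att @ np.concatenate([cell, aspect])
                       for cell in memory]) + b_att
    alpha = softmax(scores)            # importance degree of each context word
    return alpha @ memory + W_lin @ aspect

# Placeholder memory of 7 context words with 6-dim embeddings, three hops.
rng = np.random.default_rng(3)
n, d = 7, 6
memory = rng.standard_normal((n, d))
aspect = rng.standard_normal(d)
w_att, b_att = rng.standard_normal(2 * d), 0.0
W_lin = rng.standard_normal((d, d))
for _ in range(3):                     # stacked hops share the output shape
    aspect = memory_hop(memory, aspect, w_att, b_att, W_lin)
```

The final `aspect` vector would feed a softmax classifier over sentiment labels for that aspect.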

Quasi-Recurrent Neural Networks

salesforce/pytorch-qrnn 5 Nov 2016

Recurrent neural networks are a powerful tool for modeling sequential data, but the dependence of each timestep's computation on the previous timestep's output limits parallelism and makes RNNs unwieldy for very long sequences.
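The QRNN's answer to that bottleneck is to compute all gate activations as convolutions (parallel across timesteps) and keep only a cheap element-wise recurrence, called fo-pooling. A NumPy sketch with filter width 1 and random placeholder weights:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def qrnn_layer(X, Wz, Wf, Wo):
    """QRNN layer sketch (Bradbury et al., 2016) with fo-pooling.

    X: (n, d) inputs; Wz, Wf, Wo: (m, d) projections (width-1 "convolutions").
    Z, F, O contain no recurrence, so they parallelize across all timesteps;
    only the element-wise pooling loop below runs sequentially.
    """
    Z = np.tanh(X @ Wz.T)      # candidate vectors, all steps at once
    F = sigmoid(X @ Wf.T)      # forget gates
    O = sigmoid(X @ Wo.T)      # output gates
    c = np.zeros(Wz.shape[0])
    H = np.empty_like(Z)
    for t in range(len(X)):    # fo-pooling: c_t = f_t*c_{t-1} + (1-f_t)*z_t
        c = F[t] * c + (1 - F[t]) * Z[t]
        H[t] = O[t] * c
    return H

# Placeholder sequence: 10 steps of 4-dim input, 5 hidden units.
rng = np.random.default_rng(4)
H = qrnn_layer(rng.standard_normal((10, 4)),
               rng.standard_normal((5, 4)),
               rng.standard_normal((5, 4)),
               rng.standard_normal((5, 4)))
```

With wider filters the Z/F/O computations become genuine 1-D convolutions over the input sequence, but the pooling recurrence is unchanged.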

Mockingjay: Unsupervised Speech Representation Learning with Deep Bidirectional Transformer Encoders

andi611/Self-Supervised-Speech-Pretraining-and-Representation-Learning 25 Oct 2019

We present Mockingjay as a new speech representation learning approach, where bidirectional Transformer encoders are pre-trained on a large amount of unlabeled speech.
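The pre-training objective masks a fraction of acoustic frames and trains the encoder to reconstruct them from bidirectional context. The masking step can be sketched as below; the 15% ratio follows the paper's setup, while the zero-fill shown here is a simplification of its masking policy:

```python
import numpy as np

def mask_frames(frames, mask_ratio=0.15, rng=None):
    """Mockingjay-style frame masking sketch (Liu et al., 2019).

    frames: (n, d) acoustic features. Zeroes out a random subset of frames;
    the Transformer encoder is then trained to reconstruct the originals
    at the masked positions (an L1 reconstruction loss in the paper).
    Returns the masked copy and the masked indices.
    """
    rng = rng or np.random.default_rng()
    n = len(frames)
    masked = frames.copy()
    idx = rng.choice(n, size=max(1, int(n * mask_ratio)), replace=False)
    masked[idx] = 0.0
    return masked, idx

# Placeholder utterance: 20 frames of 3-dim features.
rng = np.random.default_rng(5)
frames = rng.standard_normal((20, 3))
masked, idx = mask_frames(frames, rng=rng)
```

Because reconstruction conditions on both past and future frames, the learned representations are bidirectional, unlike autoregressive speech models.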

Knowing What, How and Why: A Near Complete Solution for Aspect-based Sentiment Analysis

xuuuluuu/SemEval-Triplet-data 5 Nov 2019

In this paper, we introduce a new subtask under ABSA, named aspect sentiment triplet extraction (ASTE).
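To make the subtask concrete, ASTE extracts (aspect, opinion, sentiment) triplets from a sentence. The review below is an invented illustration, not from the paper's data:

```python
# Hypothetical input sentence and the triplets an ASTE system would extract:
# each triplet pairs an aspect term with the opinion term that expresses
# a sentiment about it, plus the resulting polarity.
sentence = "The battery life is great but the screen is too dim"
triplets = [
    ("battery life", "great", "positive"),
    ("screen", "too dim", "negative"),
]
```

Earlier ABSA subtasks recover only parts of this structure (e.g., aspects and polarities); ASTE requires all three elements jointly.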