Sentiment Analysis

213 papers with code · Natural Language Processing

Sentiment analysis is the task of classifying the polarity of a given text.

Greatest papers with code

Adversarial Training Methods for Semi-Supervised Text Classification

25 May 2016 · tensorflow/models

Adversarial training provides a means of regularizing supervised learning algorithms while virtual adversarial training is able to extend supervised learning algorithms to the semi-supervised setting. However, both methods require making small perturbations to numerous entries of the input vector, which is inappropriate for sparse high-dimensional inputs such as one-hot word representations.

SENTIMENT ANALYSIS TEXT CLASSIFICATION WORD EMBEDDINGS
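
The core move of the paper is to apply the perturbation to dense word embeddings instead of the sparse one-hot inputs. A minimal numpy sketch of that perturbation (the function name, epsilon value, and gradient values are illustrative, not from the paper's code):

```python
import numpy as np

def adversarial_perturbation(grad, eps=0.02):
    """L2-normalized adversarial perturbation in embedding space:
    r = eps * g / ||g||_2, where g is the gradient of the loss with
    respect to the word embeddings (not the one-hot inputs)."""
    norm = np.linalg.norm(grad)
    if norm == 0.0:
        return np.zeros_like(grad)
    return eps * grad / norm

# Hypothetical loss gradient for a 3-word sentence with 4-dim embeddings.
g = np.array([[0.1, -0.2, 0.0, 0.4],
              [0.3,  0.0, 0.1, -0.1],
              [0.0,  0.2, -0.3, 0.1]])
r = adversarial_perturbation(g, eps=0.02)
# Training then minimizes the loss on (embeddings + r): the perturbation
# has norm eps, so the adversarial example stays in an eps-ball.
```

Because embeddings are dense and low-dimensional, this small-norm perturbation is meaningful in a way it cannot be for high-dimensional one-hot vectors.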

Effective Use of Word Order for Text Categorization with Convolutional Neural Networks

1 Dec 2014 · tensorflow/models

A convolutional neural network (CNN) is a neural network that can make use of the internal structure of data, such as the 2D structure of image data. This paper studies CNNs for text categorization, exploiting the 1D structure (namely, word order) of text data for accurate prediction.

SENTIMENT ANALYSIS TEXT CATEGORIZATION

Deep contextualized word representations

NAACL-HLT 2018 · zalandoresearch/flair

We introduce a new type of deep contextualized word representation that models both (1) complex characteristics of word use (e.g., syntax and semantics), and (2) how these uses vary across linguistic contexts (i.e., to model polysemy). Our word vectors are learned functions of the internal states of a deep bidirectional language model (biLM), which is pre-trained on a large text corpus.

COREFERENCE RESOLUTION LANGUAGE MODELLING NAMED ENTITY RECOGNITION NATURAL LANGUAGE INFERENCE QUESTION ANSWERING SEMANTIC ROLE LABELING SENTIMENT ANALYSIS
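
The "learned functions of the internal states" are a task-specific weighted sum over the biLM's layers. A minimal numpy sketch of that mixing step (shapes and the helper name are illustrative; the biLM states here are random placeholders):

```python
import numpy as np

def elmo_mix(layer_states, w, gamma=1.0):
    """Collapse biLM layer activations into one representation per token:
    ELMo_k = gamma * sum_j softmax(w)_j * h_{k,j}.
    layer_states: (num_layers, seq_len, dim) array of biLM hidden states;
    w: learned, task-specific scalar weights, one per layer."""
    s = np.exp(w - w.max())
    s = s / s.sum()                          # softmax over layers
    return gamma * np.tensordot(s, layer_states, axes=1)

# Three hypothetical biLM layers for a 2-token sentence with 4-dim states.
h = np.random.rand(3, 2, 4)
rep = elmo_mix(h, w=np.zeros(3))             # uniform weights -> layer average
```

With zero weights the softmax is uniform, so the representation is a plain average of the layers; training lets each downstream task reweight syntax-heavy lower layers against semantics-heavy upper ones.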

Convolutional Neural Networks for Sentence Classification

EMNLP 2014 dennybritz/cnn-text-classification-tf

We report on a series of experiments with convolutional neural networks (CNN) trained on top of pre-trained word vectors for sentence-level classification tasks. We show that a simple CNN with little hyperparameter tuning and static vectors achieves excellent results on multiple benchmarks.

SENTENCE CLASSIFICATION SENTIMENT ANALYSIS
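
The "simple CNN" reduces to sliding a filter over the embedded sentence, applying a nonlinearity, and taking the maximum activation over time, one scalar feature per filter. A toy numpy sketch (the sentence matrix and filter values are made up for illustration):

```python
import numpy as np

def conv_max_over_time(embeds, filt, bias=0.0):
    """One CNN feature: slide a filter of width h over the word-embedding
    sequence, apply ReLU, then max-over-time pooling, yielding a single
    scalar regardless of sentence length."""
    n, _ = embeds.shape
    h = filt.shape[0]
    acts = [max(0.0, float(np.sum(embeds[i:i + h] * filt) + bias))
            for i in range(n - h + 1)]
    return max(acts)

# Toy 5-word sentence with 3-dim static embeddings and one bigram filter.
sent = np.arange(15, dtype=float).reshape(5, 3) / 10.0
filt = np.ones((2, 3))
feature = conv_max_over_time(sent, filt)   # -> 6.9 (last window dominates)
```

A real model applies hundreds of such filters of several widths and feeds the pooled features to a softmax classifier; with static, pre-trained vectors only that final layer and the filters are learned.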

A Structured Self-attentive Sentence Embedding

9 Mar 2017 · jadore801120/attention-is-all-you-need-pytorch

This paper proposes a new model for extracting an interpretable sentence embedding by introducing self-attention. Instead of using a vector, we use a 2-D matrix to represent the embedding, with each row of the matrix attending on a different part of the sentence.

NATURAL LANGUAGE INFERENCE SENTENCE EMBEDDING SENTIMENT ANALYSIS
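
The 2-D embedding comes from an annotation matrix A with r rows of attention weights over the words. A compact numpy sketch of that computation (hidden states and parameter matrices are random placeholders; in the paper H comes from a biLSTM):

```python
import numpy as np

def structured_self_attention(H, W1, W2):
    """Structured self-attentive embedding: A = softmax(W2 @ tanh(W1 @ H.T))
    has r rows, each a distribution over the n words attending to a
    different part of the sentence; the embedding is the matrix M = A @ H."""
    scores = W2 @ np.tanh(W1 @ H.T)                   # (r, n) logits
    e = np.exp(scores - scores.max(axis=1, keepdims=True))
    A = e / e.sum(axis=1, keepdims=True)              # row-wise softmax
    return A @ H, A                                   # M: (r, dim), A: (r, n)

# Hypothetical states for a 4-word sentence (dim 6), with d_a = 5
# attention units and r = 2 attention hops.
rng = np.random.default_rng(0)
H = rng.standard_normal((4, 6))
M, A = structured_self_attention(H, rng.standard_normal((5, 6)),
                                 rng.standard_normal((2, 5)))
```

Each row of A sums to 1, so the rows can be visualized directly as heat maps over the words, which is what makes the embedding interpretable.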

Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks

IJCNLP 2015 · tensorflow/fold

Because of their superior ability to preserve sequence information over time, Long Short-Term Memory (LSTM) networks, a type of recurrent neural network with a more complex computational unit, have obtained strong results on a variety of sequence modeling tasks. The only underlying LSTM structure that has been explored so far is a linear chain.

SENTIMENT ANALYSIS
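
The paper's Child-Sum Tree-LSTM generalizes the linear chain: gates are computed from the sum of the children's hidden states, and each child gets its own forget gate. A sketch of one node update in numpy (the parameter layout `P` and the toy shapes are illustrative):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def child_sum_treelstm(x, child_h, child_c, P):
    """One Child-Sum Tree-LSTM node. Unlike a chain LSTM, the i/o/u gates
    see the SUM of child hidden states, and a separate forget gate per
    child lets the node selectively keep any subtree's memory.
    P maps each gate in 'ifou' to a (W, U, b) parameter triple."""
    h_sum = child_h.sum(axis=0)                                  # \tilde{h}
    i = sigmoid(P['i'][0] @ x + P['i'][1] @ h_sum + P['i'][2])
    o = sigmoid(P['o'][0] @ x + P['o'][1] @ h_sum + P['o'][2])
    u = np.tanh(P['u'][0] @ x + P['u'][1] @ h_sum + P['u'][2])
    f = sigmoid(P['f'][0] @ x + child_h @ P['f'][1].T + P['f'][2])  # per child
    c = i * u + (f * child_c).sum(axis=0)
    h = o * np.tanh(c)
    return h, c

# Toy node: two children, hidden size 3, input size 2, random parameters.
rng = np.random.default_rng(1)
P = {g: (rng.standard_normal((3, 2)), rng.standard_normal((3, 3)),
         np.zeros(3)) for g in 'ifou'}
h, c = child_sum_treelstm(rng.standard_normal(2),
                          rng.standard_normal((2, 3)),
                          rng.standard_normal((2, 3)), P)
```

Because the update only sums over children, it works for nodes of any branching factor, which is what lets the same cell run over dependency or constituency trees.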

The Natural Language Decathlon: Multitask Learning as Question Answering

ICLR 2019 · salesforce/decaNLP

We present a new Multitask Question Answering Network (MQAN) that jointly learns all tasks in decaNLP without any task-specific modules or parameters in the multitask setting. Though designed for decaNLP, MQAN also achieves state-of-the-art results on the WikiSQL semantic parsing task in the single-task setting.

DOMAIN ADAPTATION MACHINE TRANSLATION NAMED ENTITY RECOGNITION NATURAL LANGUAGE INFERENCE QUESTION ANSWERING RELATION EXTRACTION SEMANTIC PARSING SEMANTIC ROLE LABELING SENTIMENT ANALYSIS TEXT CLASSIFICATION TRANSFER LEARNING

Universal Sentence Encoder

29 Mar 2018 · facebookresearch/InferSent

For both variants, we investigate and report the relationship between model complexity, resource consumption, the availability of transfer task training data, and task performance. We find that transfer learning using sentence embeddings tends to outperform word level transfer.

SEMANTIC TEXTUAL SIMILARITY SENTENCE EMBEDDINGS SENTIMENT ANALYSIS SUBJECTIVITY ANALYSIS TEXT CLASSIFICATION TRANSFER LEARNING WORD EMBEDDINGS

Learning to Generate Reviews and Discovering Sentiment

ICLR 2018 · openai/generating-reviews-discovering-sentiment

We explore the properties of byte-level recurrent language models. When given sufficient amounts of capacity, training data, and compute time, the representations learned by these models include disentangled features corresponding to high-level concepts.

SENTIMENT ANALYSIS SUBJECTIVITY ANALYSIS

Quasi-Recurrent Neural Networks

5 Nov 2016 · salesforce/pytorch-qrnn

Recurrent neural networks are a powerful tool for modeling sequential data, but the dependence of each timestep's computation on the previous timestep's output limits parallelism and makes RNNs unwieldy for very long sequences. We introduce quasi-recurrent neural networks (QRNNs), an approach to neural sequence modeling that alternates convolutional layers, which apply in parallel across timesteps, and a minimalist recurrent pooling function that applies in parallel across channels.

LANGUAGE MODELLING MACHINE TRANSLATION SENTIMENT ANALYSIS
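
The only sequential part of a QRNN is the pooling recurrence; the candidate and gate sequences come from convolutions that run in parallel across timesteps. A numpy sketch of the paper's f-pooling step (the Z and F arrays stand in for convolution outputs, which are hypothetical here):

```python
import numpy as np

def qrnn_f_pooling(Z, F):
    """QRNN f-pooling: h_t = f_t * h_{t-1} + (1 - f_t) * z_t.
    Z (candidates) and F (forget gates in [0, 1]) are (seq_len, channels)
    arrays produced by masked convolutions; only this elementwise loop is
    sequential, and it parallelizes across channels."""
    h = np.zeros(Z.shape[1])
    out = []
    for z, f in zip(Z, F):
        h = f * h + (1.0 - f) * z
        out.append(h)
    return np.stack(out)

# Toy sequence: 4 timesteps, 2 channels, constant candidate and gate.
Z = np.ones((4, 2))
F = np.full((4, 2), 0.5)
H = qrnn_f_pooling(Z, F)   # channels converge geometrically toward 1.0
```

Because there are no matrix multiplications inside the recurrence, the per-timestep cost is O(channels) rather than O(channels²), which is where the speedup over a conventional LSTM comes from.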