
Sentiment Analysis

359 papers with code · Natural Language Processing

Sentiment analysis is the task of classifying the polarity of a given text.
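As a concrete illustration of the task, here is a minimal, purely lexicon-based polarity sketch. The word lists and the scoring rule are illustrative assumptions, not any real sentiment lexicon or any method from the papers below:

```python
# Toy polarity classifier: counts hits against small hypothetical
# positive/negative word lists. Purely illustrative word lists.
POSITIVE = {"good", "great", "excellent", "love", "wonderful"}
NEGATIVE = {"bad", "terrible", "awful", "hate", "poor"}

def classify_polarity(text: str) -> str:
    """Return 'positive', 'negative', or 'neutral' for a piece of text."""
    tokens = text.lower().split()
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"
```

Modern systems (including every paper listed here) replace this hand-built rule with a learned classifier over word or sentence representations, but the input/output contract is the same: text in, polarity label out.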


Greatest papers with code

Adversarial Training Methods for Semi-Supervised Text Classification

25 May 2016 · tensorflow/models

Adversarial training provides a means of regularizing supervised learning algorithms while virtual adversarial training is able to extend supervised learning algorithms to the semi-supervised setting.
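The embedding-level perturbation at the heart of this approach can be sketched as a fast-gradient step: r_adv = ε · g / ‖g‖₂, where g is the gradient of the loss with respect to the word embeddings. The toy loss below (a dot product, so its gradient is just the weight vector) is an illustrative assumption, not the paper's model:

```python
import numpy as np

def adversarial_perturbation(grad: np.ndarray, epsilon: float) -> np.ndarray:
    """L2-normalized gradient scaled to epsilon (fast-gradient style)."""
    norm = np.linalg.norm(grad)
    if norm == 0.0:
        return np.zeros_like(grad)
    return epsilon * grad / norm

# Toy example: loss(v) = w . v, so the gradient w.r.t. the embedding v is w.
w = np.array([3.0, 4.0])
r_adv = adversarial_perturbation(w, epsilon=0.1)  # perturbation of L2 norm 0.1
```

Training then adds the loss on the perturbed embeddings v + r_adv as a regularizer; the virtual adversarial variant does the same against the model's own predictions, which is what extends it to unlabeled data.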

SENTIMENT ANALYSIS TEXT CLASSIFICATION WORD EMBEDDINGS

Effective Use of Word Order for Text Categorization with Convolutional Neural Networks

HLT 2015 tensorflow/models

A convolutional neural network (CNN) is a neural network that can exploit the internal structure of data, such as the 2D structure of image data.

SENTIMENT ANALYSIS

DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter

NeurIPS 2019 huggingface/transformers

As Transfer Learning from large-scale pre-trained models becomes more prevalent in Natural Language Processing (NLP), operating these large models in on-the-edge and/or under constrained computational training or inference budgets remains challenging.
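The "distillation" in DistilBERT trains a small student to match the softened output distribution of a large teacher. A minimal sketch of that soft-target objective, a KL divergence between temperature-softened softmaxes, is below; the logit values and temperature are illustrative assumptions, and the real training objective also includes other terms:

```python
import numpy as np

def softmax(logits: np.ndarray, temperature: float) -> np.ndarray:
    """Temperature-softened softmax (higher T -> flatter distribution)."""
    z = logits / temperature
    z = z - z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) over temperature-softened distributions."""
    p = softmax(np.asarray(teacher_logits, dtype=float), temperature)
    q = softmax(np.asarray(student_logits, dtype=float), temperature)
    return float(np.sum(p * (np.log(p) - np.log(q))))

# Toy logits for a 3-class problem.
loss = distillation_loss([4.0, 1.0, -2.0], [3.0, 1.5, -1.0], temperature=2.0)
```

The loss is zero when the student reproduces the teacher exactly and positive otherwise, so minimizing it transfers the teacher's "dark knowledge" (its relative confidences across wrong classes) into the smaller model.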

LANGUAGE MODELLING LINGUISTIC ACCEPTABILITY NATURAL LANGUAGE INFERENCE QUESTION ANSWERING SEMANTIC TEXTUAL SIMILARITY SENTIMENT ANALYSIS TRANSFER LEARNING

RoBERTa: A Robustly Optimized BERT Pretraining Approach

26 Jul 2019 · huggingface/transformers

Language model pretraining has led to significant performance gains but careful comparison between different approaches is challenging.

SOTA for Question Answering on SQuAD2.0 dev (using extra training data)

LANGUAGE MODELLING LEXICAL SIMPLIFICATION NATURAL LANGUAGE INFERENCE QUESTION ANSWERING READING COMPREHENSION SEMANTIC TEXTUAL SIMILARITY SENTIMENT ANALYSIS

XLNet: Generalized Autoregressive Pretraining for Language Understanding

NeurIPS 2019 huggingface/transformers

With the capability of modeling bidirectional contexts, denoising autoencoding based pretraining like BERT achieves better performance than pretraining approaches based on autoregressive language modeling.

DOCUMENT RANKING LANGUAGE MODELLING NATURAL LANGUAGE INFERENCE QUESTION ANSWERING READING COMPREHENSION SEMANTIC TEXTUAL SIMILARITY SENTIMENT ANALYSIS TEXT CLASSIFICATION

Well-Read Students Learn Better: On the Importance of Pre-training Compact Models

ICLR 2020 google-research/bert

Recent developments in natural language representations have been accompanied by large and expensive models that leverage vast amounts of general-domain text through self-supervised pre-training.

LANGUAGE MODELLING MODEL COMPRESSION SENTIMENT ANALYSIS

Deep contextualized word representations

NAACL 2018 zalandoresearch/flair

We introduce a new type of deep contextualized word representation that models both (1) complex characteristics of word use (e.g., syntax and semantics), and (2) how these uses vary across linguistic contexts (i.e., to model polysemy).

CITATION INTENT CLASSIFICATION COREFERENCE RESOLUTION LANGUAGE MODELLING NAMED ENTITY RECOGNITION NATURAL LANGUAGE INFERENCE QUESTION ANSWERING SEMANTIC ROLE LABELING SENTIMENT ANALYSIS