Sentiment analysis is the task of classifying the polarity of a given text.
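As a concrete illustration of polarity classification (not any specific paper's method), a minimal lexicon-based sketch scores a text by counting hits against small hand-made positive/negative word lists; the lexicons below are invented for the example.

```python
# Minimal illustration of polarity classification: score a text by
# counting hits against tiny positive/negative lexicons.
# The word lists are made up for this example.

POSITIVE = {"good", "great", "excellent", "love", "wonderful"}
NEGATIVE = {"bad", "terrible", "awful", "hate", "boring"}

def polarity(text: str) -> str:
    """Return 'positive', 'negative', or 'neutral' for a text."""
    tokens = text.lower().split()
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(polarity("a great and wonderful film"))   # positive
print(polarity("a boring and terrible plot"))   # negative
```

Real systems replace the lexicon with a learned model, but the input/output contract — text in, polarity label out — is the same.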
We introduce a new language representation model called BERT, which stands for Bidirectional Encoder Representations from Transformers.
Ranked #1 on Question Answering on CoQA
Adversarial training provides a means of regularizing supervised learning algorithms while virtual adversarial training is able to extend supervised learning algorithms to the semi-supervised setting.
Ranked #12 on Sentiment Analysis on IMDb
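The core step in adversarial training is computing a worst-case perturbation within a small L2 ball around the input (for text, typically the word embeddings): given the gradient g of the loss with respect to the input, the perturbation is r_adv = eps * g / ||g||. A minimal sketch, with made-up toy values:

```python
import math

# Sketch of the adversarial perturbation used in adversarial training:
# the worst-case L2-bounded perturbation points along the loss gradient.
# The gradient values below are toy numbers for illustration.

def adversarial_perturbation(grad, eps):
    """Return eps * grad / ||grad||_2 (the first-order worst case)."""
    norm = math.sqrt(sum(g * g for g in grad)) or 1.0
    return [eps * g / norm for g in grad]

grad = [3.0, 4.0]                         # toy loss gradient w.r.t. an embedding
r = adversarial_perturbation(grad, eps=0.1)
print(r)   # a vector of L2 norm 0.1 pointing in the direction of grad
```

Virtual adversarial training uses the same idea but perturbs against the divergence between the model's own predictions rather than a supervised loss, which is why it also applies to unlabeled data.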
A convolutional neural network (CNN) is a neural network that can exploit the internal structure of data, such as the 2D structure of image data.
Ranked #15 on Sentiment Analysis on IMDb
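For text, the same idea applies in one dimension: a filter slides over consecutive token positions and produces one activation per window. A minimal sketch with toy numbers (not values from the paper):

```python
# Why a CNN can exploit 1D structure in text: a filter of width k slides
# over consecutive positions, producing one activation per window.
# The sequence and kernel values are toy numbers.

def conv1d(sequence, kernel):
    """Valid 1D convolution (cross-correlation) over a sequence."""
    k = len(kernel)
    return [
        sum(sequence[i + j] * kernel[j] for j in range(k))
        for i in range(len(sequence) - k + 1)
    ]

# A length-6 sequence and a width-3 difference filter:
print(conv1d([1, 2, 3, 4, 5, 6], [1, 0, -1]))   # [-2, -2, -2, -2]
```

In a real text CNN each position holds an embedding vector rather than a scalar, and many filters run in parallel, but the sliding-window computation is the same.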
Recent progress in pre-trained neural language models has significantly improved the performance of many natural language processing (NLP) tasks.
Ranked #1 on Question Answering on SQuAD1.1 dev
As Transfer Learning from large-scale pre-trained models becomes more prevalent in Natural Language Processing (NLP), operating these large models on the edge and/or under constrained computational training or inference budgets remains challenging.
Ranked #8 on Semantic Textual Similarity on MRPC
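A common way to shrink such models is knowledge distillation: a small student is trained to match a large teacher's temperature-softened output distribution. A hedged sketch of that objective, with invented logits (this is the generic distillation loss, not a specific paper's full recipe):

```python
import math

# Sketch of the knowledge-distillation objective: cross-entropy of the
# student's distribution against the teacher's temperature-softened
# "soft targets". All logit values below are invented.

def softmax(logits, T=1.0):
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, T=2.0):
    """Cross-entropy H(p_teacher, q_student) at temperature T."""
    p = softmax(teacher_logits, T)            # teacher soft targets
    q = softmax(student_logits, T)            # student distribution
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))

loss = distillation_loss([4.0, 1.0, 0.5], [3.0, 1.5, 0.2])
print(round(loss, 3))
```

A higher temperature flattens both distributions, exposing the teacher's relative preferences among incorrect classes; the loss is minimized when the student matches the teacher exactly.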
Language model pretraining has led to significant performance gains, but careful comparison between different approaches is challenging.
Ranked #2 on Common Sense Reasoning on SWAG
Because it can model bidirectional contexts, denoising-autoencoding-based pretraining such as BERT achieves better performance than pretraining approaches based on autoregressive language modeling.
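The contrast between the two objectives can be shown on a toy sequence: denoising pretraining corrupts some tokens with a mask and predicts the originals (seeing context on both sides), while autoregressive language modeling predicts each token from its left context only. Mask positions below are chosen by hand for the example:

```python
# Two pretraining objectives on a toy sequence.
# Mask positions are hand-picked for illustration.

tokens = ["the", "movie", "was", "surprisingly", "good"]
mask_positions = {1, 4}

# Denoising (BERT-style): corrupt, then predict the original tokens,
# with bidirectional context available at every position.
masked = [("[MASK]" if i in mask_positions else t) for i, t in enumerate(tokens)]
denoising_targets = {i: tokens[i] for i in mask_positions}

# Autoregressive: predict each token from its left context only.
ar_pairs = [(tokens[:i], tokens[i]) for i in range(1, len(tokens))]

print(masked)               # ['the', '[MASK]', 'was', 'surprisingly', '[MASK]']
print(denoising_targets)    # {1: 'movie', 4: 'good'}
print(ar_pairs[0])          # (['the'], 'movie')
```

The trade-off the sentence above refers to: denoising sees both sides of each masked position but never models the joint probability of the full sequence, while the autoregressive factorization does, at the cost of one-directional context.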
Humans read and write hundreds of billions of messages every day.
Ranked #15 on Natural Language Inference on RTE
Recent developments in natural language representations have been accompanied by large and expensive models that leverage vast amounts of general-domain text through self-supervised pre-training.
This paper explores a simple and efficient baseline for text classification.
Ranked #1 on Sentiment Analysis on Sogou News
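A baseline of this kind typically averages the word embeddings of a text and feeds the result to a linear classifier. A hedged sketch of that pipeline with tiny made-up embeddings and weights (not trained parameters from the paper):

```python
# Sketch of a simple text-classification baseline: average the word
# vectors of a text, then apply a linear classifier. All numbers are
# tiny made-up values, not trained parameters.

EMB = {                        # toy 2-d word embeddings
    "great": [1.0, 0.2],
    "awful": [-0.9, 0.1],
    "film":  [0.1, 0.5],
}

W, b = [1.0, 0.0], 0.0         # toy weights of a linear binary classifier

def score(text):
    """Linear score of the mean embedding; sign gives the class."""
    vecs = [EMB[w] for w in text.split() if w in EMB]
    avg = [sum(col) / len(vecs) for col in zip(*vecs)]   # mean embedding
    return sum(w * x for w, x in zip(W, avg)) + b

print(score("great film") > 0)   # True  (positive class)
print(score("awful film") > 0)   # False (negative class)
```

Because both the averaging and the linear layer are cheap, such baselines train orders of magnitude faster than deep models while remaining competitive on many text-classification benchmarks.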