About

Natural Language Understanding is a major subfield of Natural Language Processing that covers tasks such as text classification, natural language inference, and story comprehension.

Source: Find a Reasonable Ending for Stories: Does Logic Relation Help the Story Cloze Test?

Greatest papers with code

Neural Architecture Search with Reinforcement Learning

5 Nov 2016 tensorflow/models

Our cell achieves a test set perplexity of 62.4 on the Penn Treebank, which is 3.6 perplexity better than the previous state-of-the-art model.
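
The search itself is driven by policy gradient: a controller samples architecture decisions, the sampled child network is trained, and its validation score is fed back as reward. The sketch below is a toy illustration of that REINFORCE loop, assuming a made-up two-decision search space and a stub reward in place of actually training a child model; the paper's controller is an RNN over a much richer per-layer decision space.

```python
import numpy as np

# Hypothetical toy search space; the paper's controller samples far richer
# per-layer decisions (filter sizes, connections, activations, ...).
CHOICES = {"hidden_size": [64, 128, 256], "num_layers": [1, 2, 4]}

rng = np.random.default_rng(0)
# One learnable logit vector per decision (a stand-in for the controller RNN).
logits = {k: np.zeros(len(v)) for k, v in CHOICES.items()}

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def sample_architecture():
    """Sample one child architecture and remember the chosen indices."""
    idx = {k: rng.choice(len(v), p=softmax(logits[k])) for k, v in CHOICES.items()}
    return {k: CHOICES[k][i] for k, i in idx.items()}, idx

def reward(arch):
    """Stand-in for training the child model and reading validation accuracy."""
    return 1.0 / (1.0 + abs(arch["hidden_size"] - 128) / 128 + abs(arch["num_layers"] - 2))

baseline, lr = 0.0, 0.1
for step in range(200):
    arch, idx = sample_architecture()
    r = reward(arch)
    baseline = 0.9 * baseline + 0.1 * r        # moving-average reward baseline
    for k, i in idx.items():                   # REINFORCE: grad log p(i) * (r - b)
        p = softmax(logits[k])
        grad = -p
        grad[i] += 1.0
        logits[k] += lr * (r - baseline) * grad

print({k: CHOICES[k][int(np.argmax(logits[k]))] for k in CHOICES})
```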

IMAGE CLASSIFICATION LANGUAGE MODELLING NATURAL LANGUAGE UNDERSTANDING NEURAL ARCHITECTURE SEARCH

ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators

ICLR 2020 tensorflow/models

Then, instead of training a model that predicts the original identities of the corrupted tokens, we train a discriminative model that predicts whether each token in the corrupted input was replaced by a generator sample or not.
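
The training signal is per-token binary classification. Below is a minimal sketch of how the corrupted input and the replaced/original labels are built, assuming a toy vocabulary and a uniform sampler standing in for the small masked-LM generator:

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB = ["the", "cat", "sat", "on", "a", "mat", "dog", "ran"]

def corrupt(tokens, mask_prob=0.15):
    """Build ELECTRA-style training data: replace a subset of positions with
    generator samples and label each token 1 if replaced, else 0. A uniform
    draw stands in for the small masked-LM generator."""
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            sample = VOCAB[rng.integers(len(VOCAB))]  # generator sample
            corrupted.append(sample)
            # If the generator happens to reproduce the original token,
            # it counts as "original", not "replaced".
            labels.append(int(sample != tok))
        else:
            corrupted.append(tok)
            labels.append(0)
    return corrupted, labels

tokens = ["the", "cat", "sat", "on", "the", "mat"]
corrupted, labels = corrupt(tokens, mask_prob=0.5)
print(corrupted)  # discriminator input
print(labels)     # per-token replaced/original targets
```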

LANGUAGE MODELLING NATURAL LANGUAGE UNDERSTANDING QUESTION ANSWERING

I-BERT: Integer-only BERT Quantization

5 Jan 2021 huggingface/transformers

Transformer based models, like BERT and RoBERTa, have achieved state-of-the-art results in many Natural Language Processing tasks.
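
I-BERT's goal is inference that uses integer arithmetic only. The sketch below shows the basic building block, symmetric uniform quantization plus an int32-accumulated integer matmul; it illustrates the general idea rather than the paper's full recipe, which additionally replaces GELU, Softmax and LayerNorm with integer approximations.

```python
import numpy as np

def quantize_symmetric(x, num_bits=8):
    """Map float values to signed integers with a single scale factor
    (symmetric uniform quantization)."""
    qmax = 2 ** (num_bits - 1) - 1             # 127 for int8
    scale = np.abs(x).max() / qmax
    q = np.clip(np.round(x / scale), -qmax, qmax).astype(np.int8)
    return q, scale

def int_matmul(qa, sa, qb, sb):
    """Integer matrix multiply with int32 accumulation; the float scales are
    only needed to interpret the result, not to compute it."""
    acc = qa.astype(np.int32) @ qb.astype(np.int32)
    return acc, sa * sb                        # dequantize later: acc * (sa * sb)

rng = np.random.default_rng(0)
a, b = rng.standard_normal((4, 8)), rng.standard_normal((8, 4))
qa, sa = quantize_symmetric(a)
qb, sb = quantize_symmetric(b)
acc, scale = int_matmul(qa, sa, qb, sb)
print(np.max(np.abs(acc * scale - a @ b)))     # small quantization error
```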

NATURAL LANGUAGE INFERENCE NATURAL LANGUAGE UNDERSTANDING QUANTIZATION

BARThez: a Skilled Pretrained French Sequence-to-Sequence Model

23 Oct 2020 huggingface/transformers

We show BARThez to be very competitive with state-of-the-art BERT-based French language models such as CamemBERT and FlauBERT.

Ranked #1 on Text Summarization on OrangeSum (using extra training data)
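
Since BARThez ships through huggingface/transformers, a summarization call is a few lines. A minimal usage sketch; the checkpoint name below is an assumption about what is published on the Hugging Face Hub rather than something stated on this page:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Assumed OrangeSum-finetuned checkpoint; substitute the one you use.
ckpt = "moussaKam/barthez-orangesum-abstract"
tokenizer = AutoTokenizer.from_pretrained(ckpt)
model = AutoModelForSeq2SeqLM.from_pretrained(ckpt)

article = (
    "Citant les préoccupations de ses clients, la banque a annoncé "
    "qu'elle fermerait une partie de ses agences l'année prochaine."
)
inputs = tokenizer(article, return_tensors="pt", truncation=True)
summary_ids = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```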

NATURAL LANGUAGE UNDERSTANDING SELF-SUPERVISED LEARNING TEXT SUMMARIZATION TRANSFER LEARNING

Optimal Subarchitecture Extraction For BERT

20 Oct 2020 huggingface/transformers

We extract an optimal subset of architectural parameters for the BERT architecture from Devlin et al. (2018) by applying recent breakthroughs in algorithms for neural architecture search.
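
As a rough illustration of the problem being solved (not the paper's algorithm, which is an FPTAS over trained proxies), the sketch below enumerates a toy BERT-style search space and picks the configuration minimizing a made-up error/size trade-off; every number and function in it is illustrative.

```python
from itertools import product

# Toy search space over the architectural parameters the paper considers:
# depth D, attention heads A, hidden size H, intermediate size I.
SPACE = {"D": [2, 4, 8, 12], "A": [4, 8, 12], "H": [256, 512, 768], "I": [512, 1024, 3072]}

def param_count(D, A, H, I):
    """Rough transformer-encoder parameter count: 4*H*H per layer for the
    attention projections plus 2*H*I for the feed-forward block."""
    return D * (4 * H * H + 2 * H * I)

def surrogate_error(D, A, H, I):
    """Stand-in for the trained proxy the paper uses to estimate error;
    here, just a monotone penalty for small capacity (illustrative only)."""
    return 1.0 / (D * H) + 1.0 / I

def objective(cfg, weight=1e-9):
    # Trade estimated error off against size, in the spirit of the paper's
    # multi-criteria objective (which also includes inference latency).
    return surrogate_error(*cfg) + weight * param_count(*cfg)

best = min(product(*SPACE.values()), key=objective)
print(dict(zip(SPACE, best)), f"{param_count(*best):,} params")
```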

NATURAL LANGUAGE UNDERSTANDING NEURAL ARCHITECTURE SEARCH

Is Supervised Syntactic Parsing Beneficial for Language Understanding? An Empirical Investigation

15 Aug 2020 huggingface/transformers

Traditional NLP has long held (supervised) syntactic parsing necessary for successful higher-level semantic language understanding (LU).

LANGUAGE MODELLING NATURAL LANGUAGE UNDERSTANDING

ConvBERT: Improving BERT with Span-based Dynamic Convolution

NeurIPS 2020 huggingface/transformers

The novel convolution heads, together with the remaining self-attention heads, form a new mixed attention block that is more efficient at both global and local context learning.
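
A minimal sketch of such a mixed attention block, assuming toy dimensions: half of the width goes through an ordinary self-attention head (global context) and half through a dynamic convolution head whose kernel is generated from the hidden state (local context), with the two outputs concatenated. ConvBERT's actual convolution heads are span-based; this uses the simpler per-token form.

```python
import numpy as np

rng = np.random.default_rng(0)
T, H, K = 6, 8, 3            # sequence length, per-branch width, conv kernel size

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, wq, wk, wv):
    """Ordinary scaled dot-product self-attention head (global context)."""
    q, k, v = x @ wq, x @ wk, x @ wv
    return softmax(q @ k.T / np.sqrt(x.shape[1])) @ v

def dynamic_conv(x, w_kernel, wv):
    """Convolution head: each position generates its own depthwise kernel
    from its hidden state and mixes a local window of values (local context)."""
    v = x @ wv
    kernels = softmax(x @ w_kernel)            # (T, K), normalized over taps
    pad = np.pad(v, ((K // 2, K // 2), (0, 0)))
    out = np.zeros_like(v)
    for t in range(x.shape[0]):
        out[t] = kernels[t] @ pad[t:t + K]     # weighted local window
    return out

x = rng.standard_normal((T, 2 * H))
xa, xc = x[:, :H], x[:, H:]                    # split width across the two branches
attn_out = self_attention(xa, *(rng.standard_normal((H, H)) for _ in range(3)))
conv_out = dynamic_conv(xc, rng.standard_normal((H, K)), rng.standard_normal((H, H)))
mixed = np.concatenate([attn_out, conv_out], axis=-1)  # mixed attention output
print(mixed.shape)                             # (6, 16)
```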

NATURAL LANGUAGE UNDERSTANDING