Natural Language Understanding

417 papers with code • 5 benchmarks • 57 datasets

Natural Language Understanding is a branch of Natural Language Processing that encompasses tasks such as text classification, natural language inference, and story comprehension. Applications enabled by natural language understanding range from question answering to automated reasoning.

Source: Find a Reasonable Ending for Stories: Does Logic Relation Help the Story Cloze Test?


Most implemented papers

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

google-research/bert NAACL 2019

We introduce a new language representation model called BERT, which stands for Bidirectional Encoder Representations from Transformers.
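BERT is pretrained with masked language modeling: roughly 15% of input positions are selected, and of those, 80% are replaced with a [MASK] token, 10% with a random vocabulary token, and 10% left unchanged, while the model must predict the original token at every selected position. A minimal sketch of that corruption scheme (the `mask_tokens` helper is illustrative; real BERT operates on WordPiece token ids, not word strings):

```python
import random

def mask_tokens(tokens, vocab, mask_prob=0.15, seed=0):
    """BERT-style masked-LM corruption.

    Selected positions: 80% -> [MASK], 10% -> random vocabulary
    token, 10% -> left unchanged. Returns the corrupted sequence
    and the position -> original-token prediction targets.
    """
    rng = random.Random(seed)
    corrupted = list(tokens)
    targets = {}  # position -> original token the model must predict
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            targets[i] = tok
            r = rng.random()
            if r < 0.8:
                corrupted[i] = "[MASK]"
            elif r < 0.9:
                corrupted[i] = rng.choice(vocab)
            # else: keep the original token (model still predicts it)
    return corrupted, targets

tokens = "the quick brown fox jumps over the lazy dog".split()
corrupted, targets = mask_tokens(tokens, vocab=tokens, mask_prob=0.3)
```

Because the model never knows which positions were corrupted (or left intact), it must build bidirectional representations of every token, which is the core idea behind BERT's pretraining.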

Learning Transferable Visual Models From Natural Language Supervision

openai/CLIP 26 Feb 2021

State-of-the-art computer vision systems are trained to predict a fixed set of predetermined object categories.
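CLIP instead learns a shared embedding space for images and text, which enables zero-shot classification: embed each candidate label as a text prompt, embed the image, and pick the label whose text embedding is most cosine-similar to the image embedding. A toy sketch with hand-made 2-d vectors standing in for CLIP's encoder outputs (the embeddings and `zero_shot_classify` helper are illustrative, not CLIP's actual API):

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def zero_shot_classify(image_emb, text_embs):
    """Return the label whose text embedding is most similar to the image."""
    scores = {label: cosine(image_emb, emb) for label, emb in text_embs.items()}
    return max(scores, key=scores.get)

# Toy embeddings standing in for encoded prompts like "a photo of a cat".
text_embs = {"cat": [1.0, 0.1], "dog": [0.1, 1.0]}
print(zero_shot_classify([0.9, 0.2], text_embs))  # prints "cat"
```

No classifier is trained for the label set: swapping in a new set of labels only requires embedding new text prompts.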

A Neural Conversational Model

farizrahman4u/seq2seq 19 Jun 2015

We find that this straightforward model can generate simple conversations given a large conversational training dataset.

ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators

google-research/electra ICLR 2020

Then, instead of training a model that predicts the original identities of the corrupted tokens, we train a discriminative model that predicts whether each token in the corrupted input was replaced by a generator sample or not.
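The discriminator's training targets are simple to construct: for each position, the label says whether the token in the corrupted input differs from the original. A minimal sketch (the `replaced_token_labels` helper is illustrative; in ELECTRA the corruption comes from a small generator's samples, and a sample that happens to equal the original token is labeled "original"):

```python
def replaced_token_labels(original, corrupted):
    """ELECTRA-style discriminator targets: 1 where the corrupted
    input differs from the original token, 0 where it matches."""
    return [int(o != c) for o, c in zip(original, corrupted)]

original  = ["the", "chef", "cooked", "the", "meal"]
corrupted = ["the", "chef", "ate", "the", "meal"]  # generator replaced "cooked"
print(replaced_token_labels(original, corrupted))  # prints [0, 0, 1, 0, 0]
```

Because the discriminator gets a learning signal from every token rather than only the ~15% that were masked, ELECTRA is markedly more sample-efficient than masked language modeling.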

Cross-lingual Language Model Pretraining

huggingface/transformers NeurIPS 2019

On unsupervised machine translation, we obtain 34.3 BLEU on WMT'16 German-English, improving the previous state of the art by more than 9 BLEU.

Snips Voice Platform: an embedded Spoken Language Understanding system for private-by-design voice interfaces

snipsco/snips-nlu 25 May 2018

This paper presents the machine learning architecture of the Snips Voice Platform, a software solution to perform Spoken Language Understanding on microprocessors typical of IoT devices.

BERT for Joint Intent Classification and Slot Filling

monologg/JointBERT 28 Feb 2019

Intent classification and slot filling are two essential tasks for natural language understanding.
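The joint setup predicts one intent label for the whole utterance and a BIO slot tag per token; the slot tags are then decoded into spans. A minimal sketch of the BIO decoding step (the example intent/slot labels are illustrative, not from a specific dataset):

```python
def extract_slots(tokens, tags):
    """Decode per-token BIO slot tags (B-name, I-name, O) into
    (slot_name, slot_value) spans."""
    slots, cur_name, cur_toks = [], None, []
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if cur_name:
                slots.append((cur_name, " ".join(cur_toks)))
            cur_name, cur_toks = tag[2:], [tok]
        elif tag.startswith("I-") and cur_name == tag[2:]:
            cur_toks.append(tok)
        else:
            if cur_name:
                slots.append((cur_name, " ".join(cur_toks)))
            cur_name, cur_toks = None, []
    if cur_name:
        slots.append((cur_name, " ".join(cur_toks)))
    return slots

tokens = ["book", "a", "flight", "to", "new", "york"]
tags   = ["O", "O", "O", "O", "B-city", "I-city"]
# Intent (one label for the whole utterance): e.g. "book_flight".
print(extract_slots(tokens, tags))  # prints [('city', 'new york')]
```

In JointBERT both predictions share one encoder, so the training loss is simply the sum of the intent-classification loss and the per-token slot-tagging loss.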

Neural Architecture Search with Reinforcement Learning

tensorflow/models 5 Nov 2016

Our cell achieves a test set perplexity of 62.4 on the Penn Treebank, which is 3.6 perplexity better than the previous state-of-the-art model.

Know What You Don't Know: Unanswerable Questions for SQuAD

worksheets/0x9a15a170 ACL 2018

Extractive reading comprehension systems can often locate the correct answer to a question in a context document, but they also tend to make unreliable guesses on questions for which the correct answer is not stated in the context.
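SQuAD 2.0 systems therefore score a "no answer" hypothesis alongside candidate spans and abstain unless the best span clearly wins. A minimal sketch of that decision rule (the scores and the margin `tau`, tuned on a dev set in practice, are illustrative):

```python
def choose_answer(best_span, span_score, null_score, tau=0.0):
    """Return the best span only if it outscores the no-answer
    hypothesis by more than a tuned margin tau; otherwise abstain."""
    if span_score - null_score > tau:
        return best_span
    return None  # question judged unanswerable

print(choose_answer("1867", span_score=7.2, null_score=3.1))  # prints 1867
print(choose_answer("1867", span_score=2.0, null_score=5.5))  # prints None
```

Tuning `tau` trades off precision on answerable questions against the ability to abstain on unanswerable ones.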

GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding

jsalt18-sentence-repl/jiant WS 2018

For natural language understanding (NLU) technology to be maximally useful, both practically and as a scientific object of study, it must be general: it must be able to process language in a way that is not exclusively tailored to any one specific task or dataset.
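GLUE operationalizes this generality by reporting a single macro-average over its tasks, with a task's multiple metrics averaged first. A minimal sketch of that aggregation (the scores below are made up for illustration):

```python
def glue_macro_average(task_scores):
    """Average per-task scores into one leaderboard number; tasks
    with multiple metrics are averaged internally first."""
    per_task = [sum(metrics) / len(metrics) for metrics in task_scores.values()]
    return sum(per_task) / len(per_task)

# Illustrative scores: CoLA reports one metric, MRPC and STS-B report two.
scores = {"CoLA": [52.1], "MRPC": [88.9, 84.8], "STS-B": [87.1, 85.8]}
print(round(glue_macro_average(scores), 2))  # prints 75.13
```

Because every task contributes equally, a model cannot climb the leaderboard by overfitting to a single dataset.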