
Language Modelling

521 papers with code · Natural Language Processing

Language modeling is the task of predicting the next word or character in a document.
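To make the task concrete, here is a minimal sketch of next-word prediction using a count-based bigram model (a toy illustration, not one of the neural models listed below; all names are invented for the example):

```python
from collections import Counter, defaultdict

def train_bigram_lm(tokens):
    """Count bigram frequencies: counts[w] maps each word to a
    Counter of the words that follow it in the training text."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(tokens, tokens[1:]):
        counts[prev][nxt] += 1
    return counts

def predict_next(counts, word):
    """Return the most frequent continuation of `word`, or None
    if the word was never seen during training."""
    if word not in counts:
        return None
    return counts[word].most_common(1)[0][0]

corpus = "the cat sat on the mat and the cat slept".split()
lm = train_bigram_lm(corpus)
print(predict_next(lm, "the"))  # -> cat ("cat" follows "the" twice, "mat" once)
```

The neural models on this page replace the raw counts with a learned distribution over the vocabulary, but the prediction task is the same.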

* indicates models using dynamic evaluation, where the model may adapt at test time to tokens it has already seen in order to improve predictions on subsequent tokens (Mikolov et al., 2010; Krause et al., 2017).
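The idea behind dynamic evaluation can be sketched with a crude count-based analogue (this is not the gradient-based method of Krause et al.; the functions and data here are invented for illustration): the model folds each observed test bigram back into its statistics after predicting it, so it adapts to the test text as it reads.

```python
from collections import Counter, defaultdict

def train(tokens):
    """Build bigram counts from a training token sequence."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(tokens, tokens[1:]):
        counts[prev][nxt] += 1
    return counts

def next_word_accuracy(counts, tokens, adapt=False):
    """Score greedy next-word predictions on `tokens`. With adapt=True,
    each observed bigram is added to the counts after being predicted,
    so later predictions benefit from earlier test tokens."""
    hits = 0
    for prev, nxt in zip(tokens, tokens[1:]):
        guess = counts[prev].most_common(1)[0][0] if counts[prev] else None
        hits += guess == nxt
        if adapt:
            counts[prev][nxt] += 1  # adapt to tokens seen so far
    return hits / (len(tokens) - 1)

train_text = "a b a b a b".split()
test_text = "a c a c a c".split()   # different statistics from training
static = next_word_accuracy(train(train_text), test_text)
dynamic = next_word_accuracy(train(train_text), test_text, adapt=True)
```

When the test distribution drifts from the training distribution, as in this toy example, the adapting model recovers some accuracy that the static model cannot.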

(Image credit: Exploring the Limits of Language Modeling)

Greatest papers with code

Semi-supervised Sequence Learning

NeurIPS 2015 tensorflow/models

In our experiments, we find that long short-term memory (LSTM) recurrent networks, after being pretrained with the two approaches, are more stable and generalize better.

LANGUAGE MODELLING TEXT CLASSIFICATION

Exploring the Limits of Language Modeling

7 Feb 2016 · tensorflow/models

In this work we explore recent advances in Recurrent Neural Networks for large scale Language Modeling, a task central to language understanding.

LANGUAGE MODELLING

One Billion Word Benchmark for Measuring Progress in Statistical Language Modeling

11 Dec 2013 · tensorflow/models

We propose a new benchmark corpus to be used for measuring progress in statistical language modeling.

LANGUAGE MODELLING

FlauBERT: Unsupervised Language Model Pre-training for French

11 Dec 2019 · huggingface/transformers

Language models have become a key step to achieving state-of-the-art results in many different Natural Language Processing (NLP) tasks.

LANGUAGE MODELLING NATURAL LANGUAGE INFERENCE TEXT CLASSIFICATION WORD SENSE DISAMBIGUATION

CamemBERT: a Tasty French Language Model

10 Nov 2019 · huggingface/transformers

We measure the performance of CamemBERT compared to multilingual models in multiple downstream tasks, namely part-of-speech tagging, dependency parsing, named-entity recognition, and natural language inference.

DEPENDENCY PARSING LANGUAGE MODELLING NAMED ENTITY RECOGNITION NATURAL LANGUAGE INFERENCE PART-OF-SPEECH TAGGING

Unsupervised Cross-lingual Representation Learning at Scale

5 Nov 2019 · huggingface/transformers

We also present a detailed empirical analysis of the key factors that are required to achieve these gains, including the trade-offs between (1) positive transfer and capacity dilution and (2) the performance of high and low resource languages at scale.

CROSS-LINGUAL TRANSFER LANGUAGE MODELLING REPRESENTATION LEARNING

DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter

NeurIPS 2019 huggingface/transformers

As Transfer Learning from large-scale pre-trained models becomes more prevalent in Natural Language Processing (NLP), operating these large models in on-the-edge and/or under constrained computational training or inference budgets remains challenging.

LANGUAGE MODELLING LINGUISTIC ACCEPTABILITY NATURAL LANGUAGE INFERENCE QUESTION ANSWERING SEMANTIC TEXTUAL SIMILARITY SENTIMENT ANALYSIS TRANSFER LEARNING

CTRL: A Conditional Transformer Language Model for Controllable Generation

Preprint 2019 huggingface/transformers

Large-scale language models show promising text generation capabilities, but users cannot easily control particular aspects of the generated text.

LANGUAGE MODELLING TEXT GENERATION