
Language Modelling

424 papers with code · Natural Language Processing

Language modeling is the task of predicting the next word or character in a document.
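As a concrete (purely illustrative, not from any listed paper) instance of this task, a minimal character-level bigram model predicts the next character from co-occurrence counts:

```python
from collections import Counter, defaultdict

def train_bigram(text):
    """Count, for each character, which characters follow it."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(text, text[1:]):
        counts[prev][nxt] += 1
    return counts

def predict_next(counts, prev_char):
    """Predict the most frequent continuation seen in training."""
    if not counts[prev_char]:
        return None
    return counts[prev_char].most_common(1)[0][0]

model = train_bigram("abab")
print(predict_next(model, "a"))  # 'b' always followed 'a' in training
```

Neural language models replace the count table with a learned distribution over the vocabulary, but the prediction task is the same.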

* indicates models using dynamic evaluation, where, at test time, models may adapt to previously seen tokens in order to improve performance on subsequent tokens (Mikolov et al., 2010; Krause et al., 2017).
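The dynamic-evaluation idea above can be sketched with the same kind of count-based model: the model keeps updating its statistics on the test stream itself, so later predictions benefit from tokens already seen. This is only an illustration of the principle; the cited papers apply it to neural models via gradient updates.

```python
from collections import Counter, defaultdict

def dynamic_eval_accuracy(train_text, test_text):
    """Next-character accuracy of a bigram model that adapts at test time."""
    # Train bigram counts on the training text.
    counts = defaultdict(Counter)
    for prev, nxt in zip(train_text, train_text[1:]):
        counts[prev][nxt] += 1

    correct = 0
    for prev, nxt in zip(test_text, test_text[1:]):
        # Predict before observing the true next character...
        if counts[prev]:
            pred = counts[prev].most_common(1)[0][0]
            correct += int(pred == nxt)
        # ...then adapt: fold the observed token back into the counts.
        # This update is the dynamic-evaluation step.
        counts[prev][nxt] += 1
    return correct / max(1, len(test_text) - 1)
```

Even when the test distribution is unseen in training, accuracy climbs as the model absorbs the test stream, which is exactly why dynamically evaluated models are starred separately on the leaderboards.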

Leaderboards

Latest papers with code

Understanding and Robustifying Differentiable Architecture Search

ICLR 2020 MetaAnonym/RobustDARTS

Differentiable Architecture Search (DARTS) has attracted a lot of attention due to its simplicity and small search costs achieved by a continuous relaxation and an approximation of the resulting bi-level optimization problem.

DISPARITY ESTIMATION IMAGE CLASSIFICATION LANGUAGE MODELLING REGRESSION

12
01 Jan 2020

FinBERT: Financial Sentiment Analysis with Pre-trained Language Models

ICLR 2020 ProsusAI/finBERT

While many sentiment classification solutions report high accuracy scores in product or movie review datasets, the performance of the methods in niche domains such as finance still largely falls behind.

LANGUAGE MODELLING SENTIMENT ANALYSIS TRANSFER LEARNING

7
01 Jan 2020

Encoder-Agnostic Adaptation for Conditional Language Generation

ICLR 2020 anon37234/encoder-agnostic-adaptation

Large pretrained language models have changed the way researchers approach discriminative natural language understanding tasks, leading to the dominance of approaches that adapt a pretrained model for arbitrary downstream tasks.

LANGUAGE MODELLING TEXT GENERATION

0
01 Jan 2020

DATA: Differentiable ArchiTecture Approximation

NeurIPS 2019 XinbangZhang/DATA-NAS

Neural architecture search (NAS) is inherently subject to the gap of architectures during searching and validating.

IMAGE CLASSIFICATION LANGUAGE MODELLING NEURAL ARCHITECTURE SEARCH SEMANTIC SEGMENTATION

0
01 Dec 2019

SMILES Transformer: Pre-trained Molecular Fingerprint for Low Data Drug Discovery

12 Nov 2019 DSPsleeporg/smiles-transformer

Inspired by Transformer and pre-trained language models from natural language processing, SMILES Transformer learns molecular fingerprints through unsupervised pre-training of the sequence-to-sequence language model using a huge corpus of SMILES, a text representation system for molecules.

DRUG DISCOVERY LANGUAGE MODELLING

2
12 Nov 2019

BP-Transformer: Modelling Long-Range Context via Binary Partitioning

11 Nov 2019 yzh119/BPT

The Transformer model is widely successful on many natural language processing tasks.

LANGUAGE MODELLING MACHINE TRANSLATION TEXT CLASSIFICATION

16
11 Nov 2019

GORC: A large contextual citation graph of academic papers

7 Nov 2019 allenai/s2-gorc

We introduce the Semantic Scholar Graph of References in Context (GORC), a large contextual citation graph of 81.1M academic publications, including parsed full text for 8.1M open access papers, across broad domains of science.

LANGUAGE MODELLING

12
07 Nov 2019

A Programmable Approach to Model Compression

6 Nov 2019 NVlabs/condensa

However, while the results are desirable, finding the best compression strategy for a given neural network, target platform, and optimization objective often requires extensive experimentation.

IMAGE CLASSIFICATION LANGUAGE MODELLING MODEL COMPRESSION QUANTIZATION

68
06 Nov 2019

Inducing brain-relevant bias in natural language processing models

NeurIPS 2019 danrsc/bert_brain_neurips_2019

Progress in natural language processing (NLP) models that estimate representations of word sequences has recently been leveraged to improve the understanding of language processing in the brain.

LANGUAGE MODELLING

2
29 Oct 2019