Search Results for author: Wang Ling

Found 43 papers, 7 papers with code

MAD for Robust Reinforcement Learning in Machine Translation

no code implementations 18 Jul 2022 Domenic Donato, Lei Yu, Wang Ling, Chris Dyer

We introduce a new distributed policy gradient algorithm and show that it outperforms existing reward-aware training procedures such as REINFORCE, minimum risk training (MRT), and proximal policy optimization (PPO) in training stability and generalization performance when optimizing machine translation models.

Machine Translation reinforcement-learning +3
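The paper's MAD algorithm is distributed and more involved than the baselines it is compared against; purely as a point of reference, a minimal REINFORCE-style reward-aware loss (the function name and the mean-reward baseline are illustrative choices, not the paper's method) can be sketched as:

```python
import numpy as np

def reinforce_loss(log_probs, rewards, baseline=None):
    """REINFORCE-style loss over sampled translations.

    log_probs: per-sequence sum of token log-probabilities under the model
    rewards:   per-sequence scalar reward (e.g., sentence-level BLEU)
    """
    log_probs = np.asarray(log_probs, dtype=float)
    rewards = np.asarray(rewards, dtype=float)
    if baseline is None:
        baseline = rewards.mean()  # simple variance-reducing baseline
    # Minimizing this quantity ascends the expected reward.
    return float(-np.mean((rewards - baseline) * log_probs))
```

Gradient descent on this loss pushes up the probability of samples whose reward exceeds the baseline and down the probability of the rest.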

Enabling arbitrary translation objectives with Adaptive Tree Search

no code implementations ICLR 2022 Wang Ling, Wojciech Stokowiec, Domenic Donato, Laurent Sartran, Lei Yu, Austin Matthews, Chris Dyer

When applied to autoregressive models, our algorithm exhibits different biases from beam search, enabling a new analysis of the role of decoding bias in autoregressive models.

Translation

A Mutual Information Maximization Perspective of Language Representation Learning

no code implementations ICLR 2020 Lingpeng Kong, Cyprien de Masson d'Autume, Wang Ling, Lei Yu, Zihang Dai, Dani Yogatama

We show that state-of-the-art word representation learning methods maximize an objective function that is a lower bound on the mutual information between different parts of a word sequence (i.e., a sentence).

Representation Learning Sentence
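One standard lower bound of the kind this perspective builds on is InfoNCE, which scores a positive pair against in-batch negatives; a minimal sketch (the critic-score matrix and function name are assumptions for illustration, not the paper's formulation) is:

```python
import numpy as np

def infonce_bound(scores):
    """InfoNCE lower bound on mutual information.

    scores[i, j] = critic score f(x_i, y_j); diagonal entries are the
    positive pairs, off-diagonal entries act as negatives.
    """
    scores = np.asarray(scores, dtype=float)
    n = scores.shape[0]
    # Numerically stable log-softmax over each row.
    row_max = scores.max(axis=1, keepdims=True)
    log_z = row_max.squeeze(1) + np.log(np.exp(scores - row_max).sum(axis=1))
    log_p_pos = np.diag(scores) - log_z
    # The bound is E[log p(positive)] + log N, and cannot exceed log N.
    return float(log_p_pos.mean() + np.log(n))
```

With strongly matched positives the bound approaches its ceiling of log N, which is why large batches matter for tight estimates.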

Better Document-Level Machine Translation with Bayes' Rule

no code implementations TACL 2020 Lei Yu, Laurent Sartran, Wojciech Stokowiec, Wang Ling, Lingpeng Kong, Phil Blunsom, Chris Dyer

We show that Bayes' rule provides an effective mechanism for creating document translation models that can be learned from only parallel sentences and monolingual documents---a compelling benefit as parallel documents are not always available.

Document Level Machine Translation Document Translation +4
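The Bayes' rule factorization described above scores a candidate translation y of a source x as log p(x|y) + log p(y), letting a document-level language model p(y) be trained on monolingual documents alone. A hedged reranking sketch (the interpolation weight and function names are illustrative, not the paper's exact parameterization) is:

```python
def noisy_channel_score(log_p_src_given_tgt, log_p_tgt, lam=1.0):
    """Bayes'-rule score for a candidate translation y of source x:
    log p(y|x) is proportional to log p(x|y) + lam * log p(y),
    where lam balances the channel model against the language model."""
    return log_p_src_given_tgt + lam * log_p_tgt

def rerank(candidates):
    """candidates: list of (y, log p(x|y), log p(y)); returns the best y."""
    return max(candidates, key=lambda c: noisy_channel_score(c[1], c[2]))[0]
```

In practice the log-probabilities would come from a reverse translation model and a document language model; here they are plain floats.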

Putting Machine Translation in Context with the Noisy Channel Model

no code implementations 25 Sep 2019 Lei Yu, Laurent Sartran, Wojciech Stokowiec, Wang Ling, Lingpeng Kong, Phil Blunsom, Chris Dyer

We show that Bayes' rule provides a compelling mechanism for controlling unconditional document language models, using the long-standing challenge of effectively leveraging document context in machine translation.

Document Translation Language Modelling +3

Relative Pixel Prediction For Autoregressive Image Generation

no code implementations 25 Sep 2019 Wang Ling, Chris Dyer, Lei Yu, Lingpeng Kong, Dani Yogatama, Susannah Young

In natural images, transitions between adjacent pixels tend to be smooth and gradual, a fact that has long been exploited in image compression models based on predictive coding.

Colorization Image Colorization +4
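The predictive-coding idea the abstract refers to can be shown in its simplest form: predict each pixel from its left neighbor and encode only the residual, which is small wherever transitions are smooth. A minimal one-row sketch (function names are illustrative):

```python
import numpy as np

def to_residuals(row):
    """Left-neighbor predictive coding for one pixel row:
    store the first pixel, then the delta to each previous pixel."""
    row = np.asarray(row, dtype=int)
    return np.concatenate(([row[0]], np.diff(row)))

def from_residuals(res):
    """Invert the coding by accumulating the deltas."""
    return np.cumsum(np.asarray(res, dtype=int))
```

Smooth image regions yield residuals near zero, which is what makes relative (delta) prediction an easier target than absolute pixel values.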

Learning and Evaluating General Linguistic Intelligence

no code implementations 31 Jan 2019 Dani Yogatama, Cyprien de Masson d'Autume, Jerome Connor, Tomas Kocisky, Mike Chrzanowski, Lingpeng Kong, Angeliki Lazaridou, Wang Ling, Lei Yu, Chris Dyer, Phil Blunsom

We define general linguistic intelligence as the ability to reuse previously acquired knowledge about a language's lexicon, syntax, semantics, and pragmatic conventions to adapt to new tasks quickly.

Natural Language Understanding Question Answering

Variational Smoothing in Recurrent Neural Network Language Models

no code implementations ICLR 2019 Lingpeng Kong, Gabor Melis, Wang Ling, Lei Yu, Dani Yogatama

We present a new theoretical perspective of data noising in recurrent neural network language models (Xie et al., 2017).

Language Modelling

Memory Architectures in Recurrent Neural Network Language Models

no code implementations ICLR 2018 Dani Yogatama, Yishu Miao, Gabor Melis, Wang Ling, Adhiguna Kuncoro, Chris Dyer, Phil Blunsom

We compare and analyze sequential, random access, and stack memory architectures for recurrent neural network language models.

Program Induction by Rationale Generation : Learning to Solve and Explain Algebraic Word Problems

1 code implementation 11 May 2017 Wang Ling, Dani Yogatama, Chris Dyer, Phil Blunsom

Solving algebraic word problems requires executing a series of arithmetic operations---a program---to obtain a final answer.

Program induction
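The "program" the abstract refers to is a sequence of arithmetic operations whose execution yields the final answer. A toy interpreter for such programs (the operation encoding is an assumption for illustration, not the paper's representation) might look like:

```python
def run_program(ops, x):
    """Execute a small arithmetic program on an initial value.

    ops: sequence of (operation, operand) pairs,
    e.g. [("mul", 2), ("add", 4)] doubles x and then adds 4.
    """
    for op, v in ops:
        if op == "add":
            x += v
        elif op == "sub":
            x -= v
        elif op == "mul":
            x *= v
        elif op == "div":
            x /= v
        else:
            raise ValueError(f"unknown operation: {op}")
    return x
```

For a word problem like "start with 3, double it, then add 4", the induced program would be `[("mul", 2), ("add", 4)]` applied to 3.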

Generative and Discriminative Text Classification with Recurrent Neural Networks

2 code implementations 6 Mar 2017 Dani Yogatama, Chris Dyer, Wang Ling, Phil Blunsom

We empirically characterize the performance of discriminative and generative LSTM models for text classification.

Continual Learning General Classification +2

Learning to Compose Words into Sentences with Reinforcement Learning

no code implementations 28 Nov 2016 Dani Yogatama, Phil Blunsom, Chris Dyer, Edward Grefenstette, Wang Ling

We use reinforcement learning to learn tree-structured neural networks for computing representations of natural language sentences.

Reinforcement Learning (RL)
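Tree-structured composition of this kind is commonly driven by shift-reduce actions: SHIFT pushes the next word, REDUCE combines the top two stack items into one node. A minimal sketch with a string combiner standing in for the learned composition function (the action names and combiner are illustrative assumptions):

```python
def compose(tokens, actions, combine=lambda a, b: f"({a} {b})"):
    """Shift-reduce composition of tokens into a single tree node.

    A learned policy would choose the actions; here they are given.
    """
    stack, it = [], iter(tokens)
    for act in actions:
        if act == "SHIFT":
            stack.append(next(it))
        elif act == "REDUCE":
            right, left = stack.pop(), stack.pop()
            stack.append(combine(left, right))
        else:
            raise ValueError(f"unknown action: {act}")
    assert len(stack) == 1, "actions must reduce to a single root"
    return stack[0]
```

In the paper's setting the action sequence is sampled from a policy trained with reinforcement learning, with downstream task performance as the reward.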

Reference-Aware Language Models

no code implementations EMNLP 2017 Zichao Yang, Phil Blunsom, Chris Dyer, Wang Ling

We propose a general class of language models that treat reference as an explicit stochastic latent variable.

Dialogue Generation Recipe Generation

Learning the Curriculum with Bayesian Optimization for Task-Specific Word Representation Learning

no code implementations ACL 2016 Yulia Tsvetkov, Manaal Faruqui, Wang Ling, Brian MacWhinney, Chris Dyer

We use Bayesian optimization to learn curricula for word representation learning, optimizing performance on downstream tasks that depend on the learned representations as features.

Bayesian Optimization Representation Learning

Character-based Neural Machine Translation

no code implementations 14 Nov 2015 Wang Ling, Isabel Trancoso, Chris Dyer, Alan W. Black

We introduce a neural machine translation model that views the input and output sentences as sequences of characters rather than words.

Machine Translation Translation

Privacy-Preserving Multi-Document Summarization

no code implementations 6 Aug 2015 Luís Marujo, José Portêlo, Wang Ling, David Martins de Matos, João P. Neto, Anatole Gershman, Jaime Carbonell, Isabel Trancoso, Bhiksha Raj

State-of-the-art extractive multi-document summarization systems are usually designed without any concern about privacy issues, meaning that all documents are open to third parties.

Document Summarization Multi-Document Summarization +1

Linguistic Evaluation of Support Verb Constructions by OpenLogos and Google Translate

no code implementations LREC 2014 Anabela Barreiro, Johanna Monti, Brigitte Orliac, Susanne Preuß, Kutz Arrieta, Wang Ling, Fernando Batista, Isabel Trancoso

This paper presents a systematic human evaluation of translations of English support verb constructions produced by a rule-based machine translation (RBMT) system (OpenLogos) and a statistical machine translation (SMT) system (Google Translate) for five languages: French, German, Italian, Portuguese and Spanish.

Machine Translation Translation

Recognition of Named-Event Passages in News Articles

no code implementations COLING 2012 Luis Marujo, Wang Ling, Anatole Gershman, Jaime Carbonell, João P. Neto, David Matos

We extend the concept of Named Entities to Named Events - commonly occurring events such as battles and earthquakes.
