Search Results for author: Marie-Anne Lachaux

Found 6 papers, 5 with code

DOBF: A Deobfuscation Pre-Training Objective for Programming Languages

1 code implementation • 15 Feb 2021 • Baptiste Roziere, Marie-Anne Lachaux, Marc Szafraniec, Guillaume Lample

Recent advances in self-supervised learning have dramatically improved the state of the art on a wide variety of tasks.

Code Search • Code Translation • +2
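As its title indicates, the DOBF objective pre-trains a model to recover the original identifier names from obfuscated source code. A minimal sketch of the obfuscation side of such a setup (the `VAR_i` placeholder scheme and the `obfuscate` helper are illustrative, not the paper's implementation):

```python
import re

def obfuscate(code, names):
    # Replace each identifier with an anonymous placeholder; a model
    # pre-trained on this objective must predict the original mapping.
    mapping = {}
    for i, name in enumerate(names):
        placeholder = f"VAR_{i}"  # illustrative placeholder scheme
        mapping[placeholder] = name
        code = re.sub(rf"\b{re.escape(name)}\b", placeholder, code)
    return code, mapping

src = "def add(a, b):\n    return a + b"
obf, mapping = obfuscate(src, ["add", "a", "b"])
# obf no longer contains the informative names; mapping is the training target.
```

The deobfuscation direction (placeholders back to names) is what the pre-trained model learns to produce.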

Target Conditioning for One-to-Many Generation

no code implementations • Findings of the Association for Computational Linguistics 2020 • Marie-Anne Lachaux, Armand Joulin, Guillaume Lample

In this paper, we propose to explicitly model this one-to-many mapping by conditioning the decoder of an NMT model on a latent variable that represents the domain of the target sentences.

Machine Translation
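One common way to condition a decoder on such a latent domain variable is to add a learned per-domain embedding to every decoder input embedding, so the same source can be decoded into different target domains. A toy numpy sketch of that conditioning step (dimensions and names are illustrative, not the paper's architecture):

```python
import numpy as np

rng = np.random.default_rng(0)
vocab, d_model, n_domains = 100, 16, 4

tok_emb = rng.normal(size=(vocab, d_model))     # token embedding table
dom_emb = rng.normal(size=(n_domains, d_model)) # one learned code per domain

def decoder_input(token_ids, domain_id):
    # Shift every decoder input embedding by the domain embedding;
    # varying domain_id at inference yields different target outputs.
    return tok_emb[token_ids] + dom_emb[domain_id]

x0 = decoder_input([1, 2, 3], 0)
x1 = decoder_input([1, 2, 3], 1)  # same tokens, different domain code
```

During training the domain of each target sentence picks the embedding; at test time, sampling or sweeping the domain id produces diverse translations.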

Unsupervised Translation of Programming Languages

6 code implementations • NeurIPS 2020 • Marie-Anne Lachaux, Baptiste Roziere, Lowik Chanussot, Guillaume Lample

We train our model on source code from open source GitHub projects, and show that it can translate functions between C++, Java, and Python with high accuracy.

Code Translation • Unsupervised Machine Translation
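With no parallel C++/Java/Python corpora, unsupervised translation of this kind relies on back-translation: the model translates monolingual code into the other language, then is trained to reconstruct the original. A toy sketch of that loop (`ToyModel` and its methods are stand-ins, not the authors' model or API):

```python
class ToyModel:
    def translate(self, batch, src, tgt):
        # Stand-in "translation": reverse each token sequence.
        return [list(reversed(seq)) for seq in batch]

    def train_step(self, inputs, targets, src, tgt):
        # Stand-in "loss": fraction of token positions that disagree.
        total = sum(len(t) for t in targets)
        wrong = sum(a != b
                    for inp, t in zip(inputs, targets)
                    for a, b in zip(inp, t))
        return wrong / total

def back_translation_step(model, batch, src_lang, tgt_lang):
    # 1. Translate real source-language code into the target language.
    synthetic = model.translate(batch, src=src_lang, tgt=tgt_lang)
    # 2. Train on the synthetic pair in the reverse direction, so the
    #    model learns to recover the original code from its own output.
    return model.train_step(inputs=synthetic, targets=batch,
                            src=tgt_lang, tgt=src_lang)

batch = [["def", "f", "(", ")", ":"]]
loss = back_translation_step(ToyModel(), batch, "python", "java")
```

Iterating this step in both directions lets translation quality bootstrap itself from monolingual GitHub code alone.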

Poly-encoders: Architectures and Pre-training Strategies for Fast and Accurate Multi-sentence Scoring

2 code implementations • ICLR 2020 • Samuel Humeau, Kurt Shuster, Marie-Anne Lachaux, Jason Weston

The use of deep pre-trained transformers has led to remarkable progress in a number of applications (Devlin et al., 2018).
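The poly-encoder scores a candidate against a context by letting the candidate embedding attend over a small number of pre-computed context codes, then taking a single dot product, which keeps per-candidate scoring cheap. A toy numpy sketch of that scoring step (dimensions and names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
d, m = 16, 4  # embedding dim, number of context codes

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def poly_encoder_score(ctx_codes, cand_vec):
    # ctx_codes: (m, d) pre-computed context vectors; cand_vec: (d,).
    # The candidate attends over the m context codes, then one dot
    # product yields the score, so ranking many candidates stays fast.
    attn = softmax(ctx_codes @ cand_vec)  # (m,) attention weights
    ctx = attn @ ctx_codes                # (d,) attended context vector
    return float(ctx @ cand_vec)

ctx_codes = rng.normal(size=(m, d))
score = poly_encoder_score(ctx_codes, rng.normal(size=d))
```

The context side is encoded once into the m codes; only the cheap attention and dot product run per candidate, which is the trade-off between bi-encoder speed and cross-encoder accuracy the paper targets.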
