Search Results for author: Rahma Chaabouni

Found 11 papers, 7 papers with code

Can Transformers Jump Around Right in Natural Language? Assessing Performance Transfer from SCAN

no code implementations EMNLP (BlackboxNLP) 2021 Rahma Chaabouni, Roberto Dessì, Eugene Kharitonov

We present several focused modifications of Transformer that greatly improve generalization capabilities on SCAN and select one that remains on par with a vanilla Transformer on a standard machine translation (MT) task.

Machine Translation, Translation
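
For context, SCAN maps synthetic navigation commands to action sequences, and the "jump around right" split tests whether a learner can compose a known primitive ("jump") with known modifiers. A minimal illustration of the command-to-action mapping (hand-written pairs for illustration, not the official data loader):

```python
def around_right(action):
    # In SCAN, "around right" expands to a right turn plus the action,
    # repeated four times (a full circle).
    return " ".join(["I_TURN_RIGHT", action] * 4)

scan_pairs = {
    "jump": "I_JUMP",
    "jump right": "I_TURN_RIGHT I_JUMP",
    "jump around right": around_right("I_JUMP"),
    "walk around right": around_right("I_WALK"),
}

for command, actions in scan_pairs.items():
    print(f"{command!r} -> {actions}")
```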

"LazImpa": Lazy and Impatient neural agents learn to communicate efficiently

no code implementations CONLL 2020 Mathieu Rita, Rahma Chaabouni, Emmanuel Dupoux

Previous work has shown that artificial neural agents naturally develop surprisingly non-efficient codes.

"LazImpa": Lazy and Impatient neural agents learn to communicate efficiently

1 code implementation 5 Oct 2020 Mathieu Rita, Rahma Chaabouni, Emmanuel Dupoux

Previous work has shown that artificial neural agents naturally develop surprisingly non-efficient codes.
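
The (in)efficiency at issue is Zipf's Law of Abbreviation: in natural language, frequent words tend to be short, while emergent codes often violate this. A minimal sketch of how one might test a code for the law (the lexicon and its frequencies below are made up for illustration):

```python
from scipy.stats import spearmanr

# Hypothetical lexicon: meaning -> (usage frequency, message length).
# Under Zipf's Law of Abbreviation, frequency and length correlate negatively.
lexicon = {
    "m1": (1000, 2),
    "m2": (400, 3),
    "m3": (150, 5),
    "m4": (60, 5),
    "m5": (10, 9),
}

freqs = [freq for freq, _ in lexicon.values()]
lengths = [length for _, length in lexicon.values()]

rho, p = spearmanr(freqs, lengths)
print(f"frequency/length correlation: rho={rho:.2f} (p={p:.3f})")
# A clearly negative rho indicates an efficient, Zipf-like code; emergent
# languages often come out flat or positive instead ("anti-efficient").
```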

What they do when in doubt: a study of inductive biases in seq2seq learners

1 code implementation ICLR 2021 Eugene Kharitonov, Rahma Chaabouni

Sequence-to-sequence (seq2seq) learners are widely used, but we still have only limited knowledge about what inductive biases shape the way they generalize.

Memorization

Compositionality and Generalization in Emergent Languages

1 code implementation ACL 2020 Rahma Chaabouni, Eugene Kharitonov, Diane Bouchacourt, Emmanuel Dupoux, Marco Baroni

Third, while compositionality is not necessary for generalization, it provides an advantage in terms of language transmission: The more compositional a language is, the more easily it will be picked up by new learners, even when the latter differ in architecture from the original agents.

Disentanglement
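
The standard compositionality measure in this line of work is topographic similarity: the correlation between pairwise distances in meaning space and in message space. A rough sketch with toy data (Hamming distance and the example code below are illustrative choices, not the paper's exact setup):

```python
import itertools
from scipy.stats import spearmanr

def hamming(a, b):
    # Hamming distance between two equal-length sequences.
    return sum(x != y for x, y in zip(a, b))

# Toy meanings (attribute tuples) and the messages agents send for them;
# this particular code is perfectly compositional by construction.
data = {
    (0, 0): "aa",
    (0, 1): "ab",
    (1, 0): "ba",
    (1, 1): "bb",
}

meaning_d, message_d = [], []
for (m1, s1), (m2, s2) in itertools.combinations(data.items(), 2):
    meaning_d.append(hamming(m1, m2))
    message_d.append(hamming(s1, s2))

topsim, _ = spearmanr(meaning_d, message_d)
print(f"topographic similarity: {topsim:.2f}")  # 1.0 for this toy code
```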

EGG: a toolkit for research on Emergence of lanGuage in Games

no code implementations IJCNLP 2019 Eugene Kharitonov, Rahma Chaabouni, Diane Bouchacourt, Marco Baroni

There is renewed interest in simulating language emergence among deep neural agents that communicate to jointly solve a task, spurred by the practical aim to develop language-enabled interactive AIs, as well as by theoretical questions about the evolution of human language.
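
To make the kind of game EGG packages up concrete, here is a bare-bones, from-scratch PyTorch sketch (not EGG's actual API): a sender encodes an input into a discrete symbol via a Gumbel-Softmax relaxation, and a receiver must reconstruct the input from that symbol alone.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

N_INPUTS, VOCAB = 10, 10  # toy sizes, chosen arbitrarily

class Sender(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(N_INPUTS, VOCAB)

    def forward(self, x):
        # Gumbel-Softmax yields a differentiable, (approximately) one-hot symbol.
        return F.gumbel_softmax(self.fc(x), tau=1.0, hard=True)

class Receiver(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(VOCAB, N_INPUTS)

    def forward(self, message):
        return self.fc(message)  # logits over the possible inputs

sender, receiver = Sender(), Receiver()
opt = torch.optim.Adam([*sender.parameters(), *receiver.parameters()], lr=1e-2)

for step in range(500):
    x = torch.eye(N_INPUTS)  # all one-hot inputs as one batch
    logits = receiver(sender(x))
    loss = F.cross_entropy(logits, torch.arange(N_INPUTS))
    opt.zero_grad()
    loss.backward()
    opt.step()

print(f"final reconstruction loss: {loss.item():.3f}")
```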

Entropy Minimization In Emergent Languages

1 code implementation ICML 2020 Eugene Kharitonov, Rahma Chaabouni, Diane Bouchacourt, Marco Baroni

There is growing interest in studying the languages that emerge when neural agents are jointly trained to solve tasks requiring communication through a discrete channel.

Representation Learning
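
The quantity being minimized is the entropy of the sender's distribution over messages. Estimating it from a logged sample of messages is straightforward (the sample below is made up for illustration):

```python
import math
from collections import Counter

# Toy sample of emitted messages; a real run would log these at evaluation time.
messages = ["aa", "aa", "aa", "ab", "ab", "ba"]

counts = Counter(messages)
total = sum(counts.values())
entropy = -sum((c / total) * math.log2(c / total) for c in counts.values())
print(f"message entropy: {entropy:.3f} bits")  # lower = fewer distinct messages used
```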

Anti-efficient encoding in emergent communication

1 code implementation NeurIPS 2019 Rahma Chaabouni, Eugene Kharitonov, Emmanuel Dupoux, Marco Baroni

Despite renewed interest in emergent language simulations with neural networks, little is known about the basic properties of the induced code, and how they compare to human language.

Word-order biases in deep-agent emergent communication

1 code implementation ACL 2019 Rahma Chaabouni, Eugene Kharitonov, Alessandro Lazaric, Emmanuel Dupoux, Marco Baroni

We train models to communicate about paths in a simple gridworld, using miniature languages that reflect or violate various natural language trends, such as the tendency to avoid redundancy or to minimize long-distance dependencies.

Learning weakly supervised multimodal phoneme embeddings

no code implementations 23 Apr 2017 Rahma Chaabouni, Ewan Dunbar, Neil Zeghidour, Emmanuel Dupoux

Recent works have explored deep architectures for learning multimodal speech representation (e.g. audio and images, articulation and audio) in a supervised way.

Multi-Task Learning
