Search Results for author: Emmanuel Chemla

Found 9 papers, 6 papers with code

What Makes Two Language Models Think Alike?

no code implementations • 18 Jun 2024 • Jeanne Salle, Louis Jalouzot, Nur Lan, Emmanuel Chemla, Yair Lakretz

Do architectural differences significantly affect the way models represent and process language?

Metric Learning

The Impact of Syntactic and Semantic Proximity on Machine Translation with Back-Translation

no code implementations • 26 Mar 2024 • Nicolas Guerin, Shane Steinert-Threlkeld, Emmanuel Chemla

Unsupervised on-the-fly back-translation, in conjunction with multilingual pretraining, is the dominant method for unsupervised neural machine translation.

Translation • Unsupervised Machine Translation
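The on-the-fly back-translation loop mentioned in the abstract can be sketched as follows; the function names and interfaces are illustrative stand-ins, not the paper's code:

```python
def back_translation_step(model_s2t, model_t2s, target_sentence, train_fn):
    """One on-the-fly back-translation step (illustrative interfaces):
    back-translate a monolingual target sentence into a synthetic source,
    then make a supervised update to the source->target model on the pair."""
    synthetic_source = model_t2s(target_sentence)          # back-translate
    train_fn(model_s2t, synthetic_source, target_sentence)  # supervised update
    return synthetic_source, target_sentence

# toy usage with stand-in "models" (string functions) and a logging trainer
updates = []
pair = back_translation_step(
    model_s2t=lambda s: s,
    model_t2s=lambda t: t[::-1],  # dummy reverse-string "translator"
    target_sentence="hello",
    train_fn=lambda model, src, tgt: updates.append((src, tgt)),
)
```

The key property is that no parallel data is required: the supervision signal is manufactured from monolingual target text by the reverse-direction model.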

Metric-Learning Encoding Models Identify Processing Profiles of Linguistic Features in BERT's Representations

1 code implementation • 18 Feb 2024 • Louis Jalouzot, Robin Sobczyk, Bastien Lhopitallier, Jeanne Salle, Nur Lan, Emmanuel Chemla, Yair Lakretz

Together, this demonstrates the utility of Metric-Learning Encoding Models (MLEMs) for studying how linguistic features are neurally encoded in language models, and the advantage of MLEMs over traditional methods.

Metric Learning
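The core idea of a metric-learning encoding model can be sketched in simplified form: pairwise distances between representations are modeled as a weighted sum of linguistic-feature mismatches, with larger learned weights indicating more strongly encoded features. The feature names and weights below are hypothetical:

```python
def mlem_distance(weights, features_a, features_b):
    """Predicted representation distance between two stimuli, modeled as a
    weighted count of their linguistic-feature mismatches (a toy
    simplification of the paper's metric-learning setup)."""
    return sum(w for w, fa, fb in zip(weights, features_a, features_b)
               if fa != fb)

# hypothetical features per sentence: (number, tense, animacy)
weights = [2.0, 0.5, 0.1]  # learned weights: "number" dominates this profile
d = mlem_distance(weights,
                  ("sing", "past", "animate"),
                  ("plur", "past", "animate"))
```

Here only the number feature differs between the two stimuli, so the predicted distance equals that feature's weight.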

Bridging the Empirical-Theoretical Gap in Neural Network Formal Language Learning Using Minimum Description Length

1 code implementation • 15 Feb 2024 • Nur Lan, Emmanuel Chemla, Roni Katzir

Neural networks offer good approximation to many tasks but consistently fail to reach perfect generalization, even when theoretical work shows that such perfect solutions can be expressed by certain architectures.

Minimum Description Length Hopfield Networks

1 code implementation • 11 Nov 2023 • Matan Abudy, Nur Lan, Emmanuel Chemla, Roni Katzir

Associative memory architectures are designed for memorization but also offer, through their retrieval method, a form of generalization to unseen inputs: stored memories can be seen as prototypes from this point of view.

Memorization • Retrieval
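The prototype-style generalization described in the abstract is the retrieval dynamic of a classical Hopfield network, which can be sketched as follows (this is the standard Hebbian model, not the paper's MDL-trained variant):

```python
def hopfield_recall(patterns, x, steps=5):
    """Synchronous recall in a classical Hopfield network: Hebbian
    outer-product weights with zeroed diagonal, +/-1 states. A corrupted
    input is iteratively pulled toward the nearest stored pattern."""
    n = len(x)
    W = [[0 if i == j else sum(p[i] * p[j] for p in patterns)
          for j in range(n)] for i in range(n)]
    for _ in range(steps):
        x = [1 if sum(W[i][j] * x[j] for j in range(n)) >= 0 else -1
             for i in range(n)]
    return x

prototype = [1, 1, -1, -1]
noisy = [1, -1, -1, -1]  # one flipped bit
recalled = hopfield_recall([prototype], noisy)
```

Retrieval maps the unseen noisy input onto the stored memory, which is the sense in which stored memories act as prototypes.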

Minimum Description Length Recurrent Neural Networks

1 code implementation • 31 Oct 2021 • Nur Lan, Michal Geyer, Emmanuel Chemla, Roni Katzir

We train neural networks to optimize a Minimum Description Length score, i.e., to balance the complexity of the network against its accuracy at a task.
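The MDL objective described here can be sketched as a two-part code: the cost of describing the network plus the cost of encoding the data given the network. This is a toy formulation with hypothetical bit costs, not the paper's actual encoding scheme:

```python
import math

def mdl_score(num_params, bits_per_param, data_log_likelihood):
    """Toy two-part MDL score in bits: model description length plus the
    length of the data encoded under the model (hypothetical bit costs)."""
    model_cost = num_params * bits_per_param
    data_cost = -data_log_likelihood / math.log(2)  # nats -> bits
    return model_cost + data_cost

# a small network with a slightly worse fit can still win under MDL
small = mdl_score(num_params=10, bits_per_param=8, data_log_likelihood=-50.0)
large = mdl_score(num_params=1000, bits_per_param=8, data_log_likelihood=-45.0)
```

Minimizing this score penalizes networks that buy accuracy with excess complexity, which is the balance the abstract refers to.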

On the Spontaneous Emergence of Discrete and Compositional Signals

1 code implementation • ACL 2020 • Nur Geffen Lan, Emmanuel Chemla, Shane Steinert-Threlkeld

We propose a general framework to study language emergence through signaling games with neural agents.
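A signaling game of the kind this framework builds on can be sketched as follows, with hand-built lookup-table agents standing in for the paper's neural agents:

```python
def play_round(sender, receiver, state):
    """One round of a Lewis signaling game. The sender observes the world
    state and emits a message; the receiver observes only the message and
    acts. The round succeeds iff the action matches the state."""
    message = sender[state]
    action = receiver[message]
    return action == state

# a perfectly coordinated signaling convention over 3 states / 3 messages
sender = {0: "a", 1: "b", 2: "c"}
receiver = {"a": 0, "b": 1, "c": 2}
accuracy = sum(play_round(sender, receiver, s) for s in range(3)) / 3
```

In the paper's setting the agents are neural networks trained on this success signal, and the question is whether discrete, compositional message systems emerge from that training.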
