Search Results for author: Elena Voita

Found 17 papers, 11 papers with code

Neurons in Large Language Models: Dead, N-gram, Positional

no code implementations 9 Sep 2023 Elena Voita, Javier Ferrando, Christoforos Nalmpantis

Specifically, we focus on the OPT family of models, ranging from 125M to 66B parameters, and rely only on whether an FFN neuron is activated or not.

Position
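The activation-based criterion in this abstract can be illustrated with a toy sketch (not the paper's code): count how often each FFN neuron fires over a corpus, and call a neuron "dead" if it never activates. Here `ffn_activations` is a hypothetical stand-in for a real model's post-ReLU FFN layer, with two neurons hard-wired to zero for demonstration.

```python
# Toy sketch of dead-neuron detection: a neuron is "dead" if its
# post-ReLU activation is never positive over the whole corpus.
import random

random.seed(0)

NUM_NEURONS = 8
NUM_TOKENS = 1000

def ffn_activations(token_id):
    """Hypothetical stand-in for a model's FFN layer: returns post-ReLU
    activations. Neurons 5 and 7 are forced to zero to simulate dead ones."""
    acts = [max(0.0, random.gauss(0, 1)) for _ in range(NUM_NEURONS)]
    acts[5] = 0.0
    acts[7] = 0.0
    return acts

fired = [0] * NUM_NEURONS
for token in range(NUM_TOKENS):
    for i, a in enumerate(ffn_activations(token)):
        if a > 0.0:
            fired[i] += 1

dead = [i for i, count in enumerate(fired) if count == 0]
print("dead neurons:", dead)
```

On a real model the same loop would run over actual FFN activations; the criterion needs no gradients or probes, only forward passes.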

Looking for a Needle in a Haystack: A Comprehensive Study of Hallucinations in Neural Machine Translation

2 code implementations 10 Aug 2022 Nuno M. Guerreiro, Elena Voita, André F. T. Martins

Although the problem of hallucinations in neural machine translation (NMT) has received some attention, research on this highly pathological phenomenon lacks solid ground.

Machine Translation · NMT

Language Modeling, Lexical Translation, Reordering: The Training Process of NMT through the Lens of Classical SMT

no code implementations EMNLP 2021 Elena Voita, Rico Sennrich, Ivan Titov

Unlike traditional statistical MT, which decomposes the translation task into distinct, separately learned components, neural machine translation uses a single neural network to model the entire translation process.

Language Modelling · Machine Translation +4

Unsupervised Discovery of Interpretable Latent Manipulations in Language VAEs

no code implementations 1 Jan 2021 Max Ryabinin, Artem Babenko, Elena Voita

In this work, we take a first step towards unsupervised discovery of interpretable directions in language latent spaces.

Sentence · Text Generation

Analyzing the Source and Target Contributions to Predictions in Neural Machine Translation

1 code implementation ACL 2021 Elena Voita, Rico Sennrich, Ivan Titov

We find that models trained with more data tend to rely more on source information and to have sharper token contributions; the training process is non-monotonic, with several stages of a different nature.

Language Modelling · Machine Translation +2

Embedding Words in Non-Vector Space with Unsupervised Graph Learning

1 code implementation EMNLP 2020 Max Ryabinin, Sergei Popov, Liudmila Prokhorenkova, Elena Voita

We adopt a recent method learning a representation of data in the form of a differentiable weighted graph and use it to modify the GloVe training algorithm.

Graph Learning · Word Embeddings +1

Information-Theoretic Probing with Minimum Description Length

2 code implementations EMNLP 2020 Elena Voita, Ivan Titov

Instead, we propose an alternative to the standard probes, information-theoretic probing with minimum description length (MDL).
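The MDL idea can be illustrated with a toy prequential (online) code: each label is encoded with a probe trained on all earlier examples, and the total codelength in bits measures how much the representations help. This is a sketch under stated assumptions, not the paper's implementation; the Laplace-smoothed lookup-table "probe" and the synthetic data stand in for a neural probe over real embeddings.

```python
# Toy prequential codelength: encode each label with a probe trained on the
# prefix seen so far, then update the probe. Short codes = informative features.
import math
import random
from collections import defaultdict

def online_codelength(data, num_classes):
    """data: list of (feature, label) pairs; returns total codelength in bits."""
    bits = 0.0
    # Laplace-smoothed count table standing in for a trained probe
    counts = defaultdict(lambda: [1] * num_classes)
    for feature, label in data:
        c = counts[feature]
        bits += -math.log2(c[label] / sum(c))  # cost under the current probe
        c[label] += 1                          # online probe update
    return bits

random.seed(0)
# Labels fully predictable from the feature -> short code;
# random labels -> codelength near the uniform baseline of 200 bits.
easy = [(i % 2, i % 2) for i in range(200)]
hard = [(i % 2, random.randrange(2)) for i in range(200)]
print(round(online_codelength(easy, 2), 1), round(online_codelength(hard, 2), 1))
```

The gap between the two codelengths, rather than raw probe accuracy, is what MDL probing uses as evidence that a property is encoded in the representations.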

Sequence Modeling with Unconstrained Generation Order

1 code implementation NeurIPS 2019 Dmitrii Emelianenko, Elena Voita, Pavel Serdyukov

The dominant approach to sequence generation is to produce a sequence in some predefined order, e.g. left to right.

Image Captioning · Machine Translation +1

The Bottom-up Evolution of Representations in the Transformer: A Study with Machine Translation and Language Modeling Objectives

no code implementations IJCNLP 2019 Elena Voita, Rico Sennrich, Ivan Titov

In this work, we use canonical correlation analysis and mutual information estimators to study how information flows across Transformer layers and how this process depends on the choice of learning objective.

Language Modelling · Machine Translation +2

When a Good Translation is Wrong in Context: Context-Aware Machine Translation Improves on Deixis, Ellipsis, and Lexical Cohesion

1 code implementation ACL 2019 Elena Voita, Rico Sennrich, Ivan Titov

Though machine translation errors caused by the lack of context beyond one sentence have long been acknowledged, the development of context-aware NMT systems is hampered by several problems.

Machine Translation · NMT +2

A Large-Scale Test Set for the Evaluation of Context-Aware Pronoun Translation in Neural Machine Translation

1 code implementation WS 2018 Mathias Müller, Annette Rios, Elena Voita, Rico Sennrich

We show that, while gains in BLEU are moderate for those systems, they outperform baselines by a large margin in terms of accuracy on our contrastive test set.

Machine Translation · Sentence +1

Context-Aware Neural Machine Translation Learns Anaphora Resolution

no code implementations ACL 2018 Elena Voita, Pavel Serdyukov, Rico Sennrich, Ivan Titov

Standard machine translation systems process sentences in isolation and hence ignore extra-sentential information, even though extended context can both prevent mistakes in ambiguous cases and improve translation coherence.

Machine Translation · Translation
