Search Results for author: Eva Hasler

Found 19 papers, 6 papers with code

A Preference-driven Paradigm for Enhanced Translation with Large Language Models

no code implementations • 17 Apr 2024 Dawei Zhu, Sony Trenous, Xiaoyu Shen, Dietrich Klakow, Bill Byrne, Eva Hasler

Recent research has shown that large language models (LLMs) can achieve remarkable translation performance through supervised fine-tuning (SFT) using only a small amount of parallel data.

Sentence Translation

Trained MT Metrics Learn to Cope with Machine-translated References

1 code implementation • 1 Dec 2023 Jannis Vamvas, Tobias Domhan, Sony Trenous, Rico Sennrich, Eva Hasler

Neural metrics trained on human evaluations of MT tend to correlate well with human judgments, but their behavior is not fully understood.

Automatic Evaluation and Analysis of Idioms in Neural Machine Translation

1 code implementation • 10 Oct 2022 Christos Baziotis, Prashant Mathur, Eva Hasler

A major open problem in neural machine translation (NMT) is the translation of idiomatic expressions, such as "under the weather".

Machine Translation • NMT • +1

The Devil is in the Details: On the Pitfalls of Vocabulary Selection in Neural Machine Translation

1 code implementation • NAACL 2022 Tobias Domhan, Eva Hasler, Ke Tran, Sony Trenous, Bill Byrne, Felix Hieber

Vocabulary selection, or lexical shortlisting, is a well-known technique to improve latency of Neural Machine Translation models by constraining the set of allowed output words during inference.

Machine Translation • Sentence • +1
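The lexical shortlisting described above can be sketched as a softmax restricted to an allowed token set, with all other logits masked out. This is a minimal illustration of the general technique, not the paper's implementation; `shortlist_probs` and its inputs are hypothetical names.

```python
import math

def shortlist_probs(logits, shortlist):
    """Softmax over only the shortlisted token ids.

    Tokens outside `shortlist` receive probability 0, mimicking how
    vocabulary selection constrains the allowed output words during
    inference. Illustrative sketch only.
    """
    kept = {i: logits[i] for i in shortlist}
    m = max(kept.values())  # subtract max for numerical stability
    exps = {i: math.exp(v - m) for i, v in kept.items()}
    z = sum(exps.values())
    return [exps.get(i, 0.0) / z for i in range(len(logits))]

# Toy example over a 4-token vocabulary where only tokens 1 and 3 are allowed:
probs = shortlist_probs([0.0, 1.0, 2.0, 3.0], [1, 3])
```

Because the decoder's output projection and softmax only need to cover the shortlist, this reduces per-step computation, which is the latency benefit the paper analyzes.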

Controlling Japanese Honorifics in English-to-Japanese Neural Machine Translation

no code implementations • WS 2019 Weston Feely, Eva Hasler, Adrià de Gispert

In the Japanese language different levels of honorific speech are used to convey respect, deference, humility, formality and social distance.

Machine Translation • NMT • +2

Neural Machine Translation Decoding with Terminology Constraints

no code implementations • NAACL 2018 Eva Hasler, Adrià de Gispert, Gonzalo Iglesias, Bill Byrne

Despite the impressive quality improvements yielded by neural machine translation (NMT) systems, controlling their translation output to adhere to user-provided terminology constraints remains an open problem.

Machine Translation • NMT • +1

Accelerating NMT Batched Beam Decoding with LMBR Posteriors for Deployment

no code implementations • NAACL 2018 Gonzalo Iglesias, William Tambellini, Adrià de Gispert, Eva Hasler, Bill Byrne

We describe a batched beam decoding algorithm for NMT with LMBR n-gram posteriors, showing that LMBR techniques still yield gains on top of the best recently reported results with Transformers.


A Comparison of Neural Models for Word Ordering

1 code implementation • WS 2017 Eva Hasler, Felix Stahlberg, Marcus Tomalin, Adrià de Gispert, Bill Byrne

We compare several language models for the word-ordering task and propose a new bag-to-sequence neural model based on attention-based sequence-to-sequence models.

Neural Machine Translation by Minimising the Bayes-risk with Respect to Syntactic Translation Lattices

no code implementations • EACL 2017 Felix Stahlberg, Adrià de Gispert, Eva Hasler, Bill Byrne

This makes our approach much more flexible than $n$-best list or lattice rescoring as the neural decoder is not restricted to the SMT search space.

Machine Translation • NMT • +1

Multilingual Image Description with Neural Sequence Models

1 code implementation • 15 Oct 2015 Desmond Elliott, Stella Frank, Eva Hasler

In this paper we present an approach to multi-language image description bringing together insights from neural machine translation and neural image description.

Image Captioning • Translation
