Search Results for author: Jindřich Helcl

Found 18 papers, 5 papers with code

Non-Autoregressive Machine Translation: It’s Not as Fast as it Seems

no code implementations NAACL 2022 Jindřich Helcl, Barry Haddow, Alexandra Birch

In this paper, we point out flaws in the evaluation methodology present in the literature on NAR models and we provide a fair comparison between a state-of-the-art NAR model and the autoregressive submissions to the shared task.

Machine Translation · Translation

UFAL Submissions to the IWSLT 2016 MT Track

no code implementations IWSLT 2016 Ondřej Bojar, Ondřej Cífka, Jindřich Helcl, Tom Kocmi, Roman Sudarikov

We present our submissions to the IWSLT 2016 machine translation task, as our first attempt to translate subtitles and one of our early experiments with neural machine translation (NMT).

Machine Translation · NMT +1

CUNI Submission to MRL 2023 Shared Task on Multi-lingual Multi-task Information Retrieval

no code implementations 25 Oct 2023 Jindřich Helcl, Jindřich Libovický

The goal of the shared task was to develop systems for named entity recognition and question answering in several under-represented languages.

Information Retrieval · Machine Translation +5

CUNI Systems for the WMT22 Czech-Ukrainian Translation Task

no code implementations 1 Dec 2022 Martin Popel, Jindřich Libovický, Jindřich Helcl

We present Charles University submissions to the WMT22 General Translation Shared Task on Czech-Ukrainian and Ukrainian-Czech machine translation.

Machine Translation · Translation

CUNI Non-Autoregressive System for the WMT 22 Efficient Translation Shared Task

no code implementations 1 Dec 2022 Jindřich Helcl

We present a non-autoregressive system submission to the WMT 22 Efficient Translation Shared Task.

Translation

Improving Fluency of Non-Autoregressive Machine Translation

no code implementations 7 Apr 2020 Zdeněk Kasner, Jindřich Libovický, Jindřich Helcl

Non-autoregressive (NAR) models for machine translation (MT) achieve superior decoding speed compared to autoregressive (AR) models, at the expense of impaired fluency of their outputs.

Machine Translation · Translation

Input Combination Strategies for Multi-Source Transformer Decoder

no code implementations 12 Nov 2018 Jindřich Libovický, Jindřich Helcl, David Mareček

In multi-source sequence-to-sequence tasks, the attention mechanism can be modeled in several ways.

Translation
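One of the ways multi-source attention can be modeled is hierarchical combination: attend within each source independently, then attend over the resulting per-source context vectors to merge them. A minimal pure-Python sketch of that idea (all function names are illustrative, not taken from the paper's code):

```python
import math

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def attend(query, keys, values):
    # Scaled dot-product attention over one source (illustrative).
    scores = [dot(k, query) / math.sqrt(len(query)) for k in keys]
    w = softmax(scores)
    # Weighted sum of the value vectors.
    return [sum(wi * v[j] for wi, v in zip(w, values))
            for j in range(len(values[0]))]

def hierarchical_combine(query, sources):
    # 1) Attend within each source independently.
    contexts = [attend(query, keys, values) for keys, values in sources]
    # 2) Attend over the per-source contexts to combine them.
    return attend(query, contexts, contexts)
```

Other strategies compared in this line of work include simply concatenating the source representations or projecting the contexts through a shared layer; the hierarchical variant above is just one point in that design space.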

End-to-End Non-Autoregressive Neural Machine Translation with Connectionist Temporal Classification

1 code implementation 12 Nov 2018 Jindřich Libovický, Jindřich Helcl

Autoregressive decoding is the only part of sequence-to-sequence models that prevents them from massive parallelization at inference time.

General Classification · Machine Translation +1
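The connectionist temporal classification (CTC) objective named in the title is what removes the autoregressive bottleneck: the model emits a (possibly blank-padded) token at every position in parallel, and the standard CTC rule then collapses repeated tokens and strips blanks. A minimal sketch of that greedy decoding step, with an assumed blank index (not taken from the paper's code):

```python
BLANK = 0  # index of the CTC blank symbol (an assumption for this sketch)

def ctc_greedy_decode(argmax_tokens):
    """Best-path CTC decoding: collapse consecutive repeats, then drop blanks."""
    out = []
    prev = None
    for tok in argmax_tokens:
        if tok != prev and tok != BLANK:  # keep only new, non-blank tokens
            out.append(tok)
        prev = tok
    return out

# e.g. ctc_greedy_decode([0, 3, 3, 0, 0, 5, 5, 5, 2]) -> [3, 5, 2]
```

Because every output position is predicted independently, the whole sequence can be produced in a single parallel step, which is the speed advantage non-autoregressive models trade fluency for.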

CUNI System for the WMT18 Multimodal Translation Task

no code implementations 12 Nov 2018 Jindřich Helcl, Jindřich Libovický, Dušan Variš

For our submission, we acquired both textual and multimodal additional data.

Translation

CUNI System for the WMT17 Multimodal Translation Task

no code implementations 14 Jul 2017 Jindřich Helcl, Jindřich Libovický

For Task 1 (multimodal translation), our best scoring system is a purely textual neural translation of the source image caption to the target language.

Image Captioning · Task 2 +1

Attention Strategies for Multi-Source Sequence-to-Sequence Learning

1 code implementation 21 Apr 2017 Jindřich Libovický, Jindřich Helcl

Modeling attention in neural multi-source sequence-to-sequence learning remains a relatively unexplored area, despite its usefulness in tasks that incorporate multiple source languages or modalities.

Automatic Post-Editing · Translation

CUNI System for WMT16 Automatic Post-Editing and Multimodal Translation Tasks

no code implementations WS 2016 Jindřich Libovický, Jindřich Helcl, Marek Tlustý, Pavel Pecina, Ondřej Bojar

Neural sequence to sequence learning recently became a very promising paradigm in machine translation, achieving competitive results with statistical phrase-based systems.

Automatic Post-Editing · Multimodal Machine Translation +1
