Search Results for author: Dimitar Shterionov

Found 17 papers, 1 paper with code

Selecting Parallel In-domain Sentences for Neural Machine Translation Using Monolingual Texts

1 code implementation 11 Dec 2021 Javad PourMostafa Roshan Sharami, Dimitar Shterionov, Pieter Spronck

We then select the top K sentences with the highest similarity score to train a new machine translation system tuned to the specific in-domain data.
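The top-K selection step described above can be sketched as follows. This is an illustrative toy using bag-of-words cosine similarity; the paper's actual sentence representations and scoring method differ, and all function names here are hypothetical.

```python
from collections import Counter
from math import sqrt

def bow_vector(sentence):
    # Bag-of-words term-frequency vector -- a simple stand-in for the
    # sentence representations used in the paper.
    return Counter(sentence.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse term-frequency vectors.
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def select_top_k(candidates, in_domain_text, k):
    """Rank candidate (source, target) sentence pairs by similarity of
    the source side to the in-domain text; return the top-k pairs,
    highest similarity first."""
    ref = bow_vector(in_domain_text)
    scored = [(cosine(bow_vector(src), ref), src, tgt)
              for src, tgt in candidates]
    scored.sort(key=lambda x: x[0], reverse=True)
    return [(src, tgt) for _, src, tgt in scored[:k]]
```

The selected pairs would then be used to train or fine-tune an MT system on the in-domain distribution.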

Machine Translation · Translation

Machine Translationese: Effects of Algorithmic Bias on Linguistic Complexity in Machine Translation

no code implementations EACL 2021 Eva Vanmassenhove, Dimitar Shterionov, Matthew Gwilliam

Recent studies in the field of Machine Translation (MT) and Natural Language Processing (NLP) have shown that existing models amplify biases observed in the training data.

Machine Translation · Translation

An Investigative Study of Multi-Modal Cross-Lingual Retrieval

no code implementations LREC 2020 Piyush Arora, Dimitar Shterionov, Yasufumi Moriya, Abhishek Kaushik, Daria Dzendzik, Gareth Jones

In this paper we devote special attention to the automatic translation (AT) component which is crucial for the overall quality of the MMCLIR system.

Information Retrieval · Machine Translation · +2

Selecting Backtranslated Data from Multiple Sources for Improved Neural Machine Translation

no code implementations ACL 2020 Xabier Soto, Dimitar Shterionov, Alberto Poncelas, Andy Way

Machine translation (MT) has benefited from using synthetic training data originating from translating monolingual corpora, a technique known as backtranslation.
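The backtranslation set-up the abstract refers to can be sketched as follows: each authentic target-language sentence is paired with a machine translation of itself back into the source language, yielding synthetic parallel data. `reverse_translate` is a hypothetical stand-in for any target-to-source MT system, not the paper's actual pipeline.

```python
def backtranslate(monolingual_target, reverse_translate):
    """Create synthetic parallel data: pair each authentic target
    sentence with its machine translation back into the source
    language (the 'backtranslation')."""
    synthetic_pairs = []
    for tgt in monolingual_target:
        synthetic_src = reverse_translate(tgt)  # target -> source MT system
        synthetic_pairs.append((synthetic_src, tgt))
    return synthetic_pairs
```

The synthetic pairs are typically mixed with authentic parallel data when training the forward (source-to-target) system.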

Machine Translation · Translation

Combining SMT and NMT Back-Translated Data for Efficient NMT

no code implementations 9 Sep 2019 Alberto Poncelas, Maja Popovic, Dimitar Shterionov, Gideon Maillette de Buy Wenniger, Andy Way

Neural Machine Translation (NMT) models achieve their best performance when large sets of parallel data are used for training.

Machine Translation · Translation

APE through Neural and Statistical MT with Augmented Data: ADAPT/DCU Submission to the WMT 2019 APE Shared Task

no code implementations WS 2019 Dimitar Shterionov, Joachim Wagner, Félix do Carmo

Automatic post-editing (APE) can be reduced to a machine translation (MT) task, where the source is the output of a specific MT system and the target is its post-edited variant.
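The reduction described above amounts to building MT-style training pairs whose input is raw MT output and whose reference is the post-edited text. A minimal sketch, with hypothetical names; the optional source concatenation shown here is a common APE set-up, not necessarily the paper's exact input format.

```python
def ape_training_pairs(sources, mt_outputs, post_edits, joint_input=False):
    """Frame APE as a translation task: the input is the raw MT output
    (optionally concatenated with the original source sentence), and
    the reference is the human post-edited variant."""
    pairs = []
    for src, mt, pe in zip(sources, mt_outputs, post_edits):
        inp = f"{src} ||| {mt}" if joint_input else mt
        pairs.append((inp, pe))
    return pairs
```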

Automatic Post-Editing · Domain Adaptation · +1

Lost in Translation: Loss and Decay of Linguistic Richness in Machine Translation

no code implementations WS 2019 Eva Vanmassenhove, Dimitar Shterionov, Andy Way

This work presents an empirical approach to quantifying the loss of lexical richness in Machine Translation (MT) systems compared to Human Translation (HT).
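One of the simplest ways to quantify lexical richness is the type/token ratio (TTR). The sketch below is illustrative only; the paper uses more elaborate metrics, and the relative-loss measure here is an assumption for demonstration.

```python
def type_token_ratio(tokens):
    """Type/token ratio: number of unique word forms divided by the
    total number of tokens. Higher means lexically richer text."""
    return len(set(tokens)) / len(tokens) if tokens else 0.0

def richness_loss(human_translation, machine_translation):
    """Relative drop in lexical richness of MT output versus human
    translation, measured here with TTR as an illustration."""
    ht = type_token_ratio(human_translation.lower().split())
    mt = type_token_ratio(machine_translation.lower().split())
    return (ht - mt) / ht if ht else 0.0
```

Note that TTR is sensitive to text length, so in practice the two texts being compared should be of comparable size.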

Machine Translation · Translation

ABI Neural Ensemble Model for Gender Prediction: Adapt Bar-Ilan Submission for the CLIN29 Shared Task on Gender Prediction

no code implementations 23 Feb 2019 Eva Vanmassenhove, Amit Moryossef, Alberto Poncelas, Andy Way, Dimitar Shterionov

In contrast to the results described in previous comparable shared tasks, our neural models performed better than our best traditional approaches with our best feature set-up.

Gender Prediction

Investigating Backtranslation in Neural Machine Translation

no code implementations 17 Apr 2018 Alberto Poncelas, Dimitar Shterionov, Andy Way, Gideon Maillette de Buy Wenniger, Peyman Passban

A prerequisite for training corpus-based machine translation (MT) systems -- either Statistical MT (SMT) or Neural MT (NMT) -- is the availability of high-quality parallel data.

Machine Translation · Translation

Inference and learning in probabilistic logic programs using weighted Boolean formulas

no code implementations 25 Apr 2013 Daan Fierens, Guy Van Den Broeck, Joris Renkens, Dimitar Shterionov, Bernd Gutmann, Ingo Thon, Gerda Janssens, Luc De Raedt

This paper investigates how classical inference and learning tasks known from the graphical model community can be tackled for probabilistic logic programs.
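The weighted-Boolean-formula reduction the paper studies can be illustrated with a tiny weighted model counter: the probability of a query is the weighted count of models satisfying the formula and the query, normalized by the weighted count of all models. This brute-force enumeration is for illustration only; ProbLog-style systems use knowledge compilation rather than enumerating assignments.

```python
from itertools import product

def wmc(formula, weights, variables):
    """Weighted model count: sum, over all truth assignments that
    satisfy `formula`, of the product of per-literal weights.
    `weights[v]` holds (weight_if_true, weight_if_false)."""
    total = 0.0
    for values in product([True, False], repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if formula(assignment):
            w = 1.0
            for v, val in assignment.items():
                w *= weights[v][0] if val else weights[v][1]
            total += w
    return total

# Toy example: independent probabilistic facts a (p=0.3) and b (p=0.6),
# query q = a OR b.  P(q) = WMC(program AND q) / WMC(program).
weights = {"a": (0.3, 0.7), "b": (0.6, 0.4)}
vars_ = ["a", "b"]
program = lambda m: True                 # no further constraints here
query = lambda m: m["a"] or m["b"]
p = wmc(lambda m: program(m) and query(m), weights, vars_) / wmc(program, weights, vars_)
# p == 1 - 0.7 * 0.4 == 0.72
```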
