Search Results for author: Amirhossein Tebbifakhr

Found 8 papers, 0 papers with code

Automatic Translation for Multiple NLP tasks: a Multi-task Approach to Machine-oriented NMT Adaptation

no code implementations EAMT 2020 Amirhossein Tebbifakhr, Matteo Negri, Marco Turchi

We address this problem by proposing a multi-task approach to machine-oriented NMT adaptation, which is capable of serving multiple downstream tasks with a single system.

Machine Translation, NMT (+1 more)
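
The snippet above does not spell out the mechanism, so the following minimal Python sketch only illustrates one common way to let a single NMT system serve several downstream tasks: prepending a task token to the source, in the style of multilingual NMT. The token names and the tag_for_task helper are illustrative assumptions, not details taken from the paper.

# Illustrative sketch: task tokens and tag_for_task are assumptions,
# not details from the paper.
TASK_TOKENS = {
    "sentiment": "<to-sentiment>",
    "ner": "<to-ner>",
}

def tag_for_task(source_sentence: str, task: str) -> str:
    """Prepend a task token so one shared model can adapt its output
    to the downstream task that will consume the translation."""
    return "{} {}".format(TASK_TOKENS[task], source_sentence)

print(tag_for_task("Questo film è fantastico.", "sentiment"))
# -> <to-sentiment> Questo film è fantastico.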

Machine-oriented NMT Adaptation for Zero-shot NLP tasks: Comparing the Usefulness of Close and Distant Languages

no code implementations VarDial (COLING) 2020 Amirhossein Tebbifakhr, Matteo Negri, Marco Turchi

In this work, we tackle the problem in a multilingual setting where a single NMT model translates from multiple languages for downstream automatic processing in the target language.

Machine Translation, NMT

Machine Translation for Machines: the Sentiment Classification Use Case

no code implementations IJCNLP 2019 Amirhossein Tebbifakhr, Luisa Bentivogli, Matteo Negri, Marco Turchi

Towards this objective, we present a reinforcement learning technique based on a new candidate sampling strategy, which exploits the results obtained on the downstream task as weak feedback.

Classification, General Classification (+7 more)
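
As a rough, hedged illustration of the idea in the snippet above, the Python sketch below samples several candidate translations, scores each one with the outcome of the downstream task, and keeps the highest-reward candidate to weight the training update. sample_candidates and downstream_reward are placeholders standing in for a real NMT decoder and a real sentiment classifier; the paper's actual sampling strategy and update rule may differ.

import random

def sample_candidates(source: str, k: int = 4):
    # Placeholder: a real system would sample k translations from the NMT model.
    return ["translation-{} of: {}".format(i, source) for i in range(k)]

def downstream_reward(translation: str, gold_label: str) -> float:
    # Placeholder: 1.0 if the downstream sentiment classifier, run on the
    # translation, recovers the gold label; 0.0 otherwise.
    return random.choice([0.0, 1.0])

def select_for_update(source: str, gold_label: str):
    """Return the sampled candidate with the highest downstream reward;
    this (candidate, reward) pair would then weight the NMT training loss."""
    scored = [(c, downstream_reward(c, gold_label))
              for c in sample_candidates(source)]
    return max(scored, key=lambda pair: pair[1])

best, reward = select_for_update("Questo film è fantastico.", "positive")
print(best, reward)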

Effort-Aware Neural Automatic Post-Editing

no code implementations WS 2019 Amirhossein Tebbifakhr, Matteo Negri, Marco Turchi

For this purpose, following the common approach in multilingual NMT, we prepend a special token to the beginning of both the source text and the MT output indicating the required amount of post-editing.

Automatic Post-Editing, NMT (+1 more)
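
The snippet above states the mechanism directly: the same special token is prepended to both the source text and the MT output to signal how much post-editing is expected. The Python sketch below shows that tagging step; the bucket names and HTER thresholds are illustrative assumptions, since the paper defines its own scheme for quantifying post-editing effort.

def effort_token(hter: float) -> str:
    # Map an edit-effort score (e.g. HTER) to a coarse effort bucket.
    # Bucket names and thresholds are assumptions, not the paper's values.
    if hter == 0.0:
        return "<no-edit>"
    if hter <= 0.3:
        return "<light-edit>"
    return "<heavy-edit>"

def build_ape_input(source: str, mt_output: str, hter: float):
    """Prepend the same effort token to both the source text and the
    MT output, in the style of multilingual NMT tagging."""
    token = effort_token(hter)
    return "{} {}".format(token, source), "{} {}".format(token, mt_output)

src, mt = build_ape_input("the cat sat", "die Katze sass", 0.2)
print(src)  # <light-edit> the cat sat
print(mt)   # <light-edit> die Katze sass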

Multi-source transformer with combined losses for automatic post editing

no code implementations WS 2018 Amirhossein Tebbifakhr, Ruchit Agrawal, Matteo Negri, Marco Turchi

In the first subtask, our system improves over the baseline by up to -5.3 TER and +8.23 BLEU points, ranking second out of 11 submitted runs.

Automatic Post-Editing, NMT (+2 more)

Dimension Projection among Languages based on Pseudo-relevant Documents for Query Translation

no code implementations 25 May 2016 Javid Dadashkarimi, Mahsa S. Shahshahani, Amirhossein Tebbifakhr, Heshaam Faili, Azadeh Shakery

Using top-ranked documents in response to a query has been shown to be an effective approach to improve the quality of query translation in dictionary-based cross-language information retrieval.

Information Retrieval, Machine Translation (+2 more)
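
As a toy, hedged illustration of the snippet above, the Python sketch below picks among dictionary translations of a query term by counting how often each candidate occurs in the top-ranked (pseudo-relevant) target-language documents. This frequency-based scoring is only a stand-in; the paper's actual method projects dimensions between languages rather than counting terms.

from collections import Counter

def rank_translations(candidate_translations, pseudo_relevant_docs):
    """Score each dictionary translation of a query term by its frequency
    in the pseudo-relevant (top-ranked) target-language documents."""
    counts = Counter({cand: 0 for cand in candidate_translations})
    for doc in pseudo_relevant_docs:
        tokens = doc.lower().split()
        for cand in candidate_translations:
            counts[cand] += tokens.count(cand.lower())
    return counts.most_common()

docs = ["dairy products such as milk and cheese", "milk prices rose sharply"]
print(rank_translations(["milk", "lion"], docs))
# -> [('milk', 2), ('lion', 0)]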
