Search Results for author: Leonard Dahlmann

Found 5 papers, 0 papers with code

Deploying a BERT-based Query-Title Relevance Classifier in a Production System: a View from the Trenches

no code implementations • 23 Aug 2021 • Leonard Dahlmann, Tomer Lancewicki

We successfully optimize a Query-Title Relevance (QTR) classifier for deployment via a compact model, which we name BERT Bidirectional Long Short-Term Memory (BertBiLSTM).

Data Augmentation • Knowledge Distillation • +5
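The paper is listed without code, but its tags and abstract point at knowledge distillation into a compact student model (the BiLSTM standing in for BERT). As a rough illustration of the standard temperature-scaled distillation objective such a student is typically trained with, here is a minimal numpy sketch — the function names, temperature, and mixing weight are assumptions, not the authors' implementation:

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend of soft-target KL (teacher -> student) and hard-label
    cross-entropy, in the usual Hinton-style distillation form.
    T and alpha are illustrative hyperparameter choices."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # KL(teacher || student) per example, scaled by T^2 as is conventional
    kl = np.sum(p_teacher * (np.log(p_teacher) - np.log(p_student)),
                axis=-1).mean() * T * T
    p_hard = softmax(student_logits)  # T=1 for the hard-label term
    ce = -np.log(p_hard[np.arange(len(labels)), labels]).mean()
    return alpha * kl + (1 - alpha) * ce

# Toy batch: two query-title pairs, binary relevant/irrelevant logits
teacher = np.array([[2.0, -1.0], [-0.5, 1.5]])  # e.g. from a fine-tuned BERT teacher
student = np.array([[1.0, 0.0], [0.0, 1.0]])    # compact BiLSTM student
loss = distillation_loss(student, teacher, labels=np.array([0, 1]))
```

A student whose logits match the teacher's incurs zero KL term, so the loss reduces to the (down-weighted) hard-label cross-entropy.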

Diving Deep into Context-Aware Neural Machine Translation

no code implementations • WMT (EMNLP) 2020 • Jingjing Huo, Christian Herold, Yingbo Gao, Leonard Dahlmann, Shahram Khadivi, Hermann Ney

Context-aware neural machine translation (NMT) is a promising direction for improving translation quality by making use of additional context, e.g., in document-level translation, or by incorporating meta-information.

Machine Translation • NMT • +1
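A common baseline for the document-level context this abstract mentions is simply concatenating previous source sentences to the current one with a separator pseudo-token before feeding the encoder. A minimal sketch of that preprocessing step — the `<sep>` token and function name are illustrative assumptions, not this paper's method:

```python
def build_context_input(prev_sents, current, sep="<sep>", max_context=1):
    """Prepend up to max_context previous source sentences to the current
    sentence, joined by a separator token, as in the concatenation
    baseline for document-level NMT."""
    context = prev_sents[-max_context:] if max_context > 0 else []
    return f" {sep} ".join(context + [current])

doc = ["he saw a bat .", "it flew away ."]
inp = build_context_input(doc[:1], doc[1])
# e.g. "he saw a bat . <sep> it flew away ." — the context sentence lets the
# model disambiguate pronouns like "it"
```

With `max_context=0` the function degrades gracefully to the sentence-level input, which makes it easy to ablate context in experiments.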

Neural and Statistical Methods for Leveraging Meta-information in Machine Translation

no code implementations • MTSummit 2017 • Shahram Khadivi, Patrick Wilken, Leonard Dahlmann, Evgeny Matusov

In this paper, we discuss different methods that use meta-information and richer context accompanying the source language input to improve machine translation quality.

Machine Translation • Translation
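One simple way to expose source-side meta-information to an NMT system, in the spirit of the methods this abstract surveys, is to prepend pseudo-token tags encoding attributes such as domain or topic. A hedged sketch — the tag format and function name are assumptions for illustration, not the paper's exact scheme:

```python
def add_meta_tags(source, meta):
    """Prepend pseudo-token tags encoding meta-information (e.g. domain,
    product category) to a source sentence so a standard NMT encoder can
    condition on them without architectural changes. Sorted keys keep the
    tag order deterministic."""
    tags = [f"<{key}:{value}>" for key, value in sorted(meta.items())]
    return " ".join(tags + [source])

tagged = add_meta_tags("good morning", {"domain": "chat"})
# e.g. "<domain:chat> good morning"
```

Because the tags live in the vocabulary like ordinary tokens, the same trained model can be steered at inference time just by changing the tag values.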
