Search Results for author: Tamali Banerjee

Found 5 papers, 0 papers with code

Denoising-based UNMT is more robust to word-order divergence than MASS-based UNMT

no code implementations · 2 Mar 2023 · Tamali Banerjee, Rudra Murthy V, Pushpak Bhattacharyya

We aim to investigate whether UNMT approaches with self-supervised pre-training are robust to word-order divergence between language pairs.

Denoising Translation

Crosslingual Embeddings are Essential in UNMT for Distant Languages: An English to IndoAryan Case Study

no code implementations · MTSummit 2021 · Tamali Banerjee, Rudra Murthy V, Pushpak Bhattacharyya

In this paper, we show that initializing the embedding layer of UNMT models with cross-lingual embeddings yields significant BLEU-score improvements over existing approaches that initialize embeddings randomly.

Denoising Translation +1
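The abstract above argues for seeding the embedding layer with pre-trained cross-lingual vectors rather than random values. A minimal, framework-free sketch of that initialization step (the function name, the fallback range, and the idea of sourcing vectors from a tool such as VecMap or MUSE are illustrative assumptions, not the paper's code):

```python
import random

def init_embeddings(vocab, crosslingual_vecs, dim=4, seed=0):
    """Hypothetical sketch: build an embedding matrix where tokens that
    have a pre-trained cross-lingual vector (e.g. from VecMap/MUSE)
    copy it, and all other tokens fall back to small random init."""
    rng = random.Random(seed)
    matrix = []
    for tok in vocab:
        vec = crosslingual_vecs.get(tok)
        if vec is None:
            # token not covered by the cross-lingual vectors: random init
            vec = [rng.uniform(-0.1, 0.1) for _ in range(dim)]
        matrix.append(list(vec))
    return matrix

emb = init_embeddings(["cat", "dog"], {"cat": [1.0, 0.0, 0.0, 0.0]})
```

In a real UNMT system this matrix would be copied into the model's shared source/target embedding layer before training begins.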

Scrambled Translation Problem: A Problem of Denoising UNMT

no code implementations · MTSummit 2021 · Tamali Banerjee, Rudra Murthy V, Pushpak Bhattacharyya

We hypothesise that the cause of the 'scrambled translation problem' is the 'shuffling noise' introduced into every input sentence as a denoising strategy.

Denoising Machine Translation +2
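The 'shuffling noise' referred to above is the local word-shuffling corruption used when pre-training denoising autoencoders for UNMT. A common formulation (this sketch follows the jitter-and-sort construction used in denoising NMT work generally; it is not taken from this paper's code) limits how far each token can drift from its original position:

```python
import random

def shuffle_noise(tokens, k=3, seed=0):
    """Local shuffling noise for denoising pre-training: each token is
    assigned the key (original index + uniform jitter in [0, k]) and the
    sequence is re-sorted on those keys, so no token moves more than
    about k positions from its original slot."""
    rng = random.Random(seed)
    keys = [i + rng.uniform(0, k) for i in range(len(tokens))]
    return [tok for _, tok in sorted(zip(keys, tokens), key=lambda p: p[0])]

noisy = shuffle_noise("the cat sat on the mat".split())
```

The denoising objective then trains the model to reconstruct the original sentence from such a corrupted input; the hypothesis in the abstract is that this same corruption teaches the model to under-weight word order at translation time.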

Meaningless yet meaningful: Morphology grounded subword-level NMT

no code implementations · WS 2018 · Tamali Banerjee, Pushpak Bhattacharyya

We explore the use of two independent subsystems, Byte Pair Encoding (BPE) and Morfessor, for producing the basic units of subword-level neural machine translation (NMT).

NMT Segmentation +2
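BPE, one of the two segmentation subsystems named in the abstract above, learns subword units by repeatedly merging the most frequent adjacent symbol pair in the training vocabulary. A toy sketch of that merge-learning loop (illustrative only; real systems use implementations such as subword-nmt or SentencePiece, and the `</w>` end-of-word marker follows the original BPE convention):

```python
from collections import Counter

def learn_bpe(words, num_merges):
    """Toy Byte Pair Encoding: start from character sequences with an
    end-of-word marker, then repeatedly merge the most frequent
    adjacent symbol pair, recording each merge."""
    vocab = Counter(tuple(w) + ('</w>',) for w in words)
    merges = []
    for _ in range(num_merges):
        # count all adjacent symbol pairs, weighted by word frequency
        pairs = Counter()
        for seq, freq in vocab.items():
            for a, b in zip(seq, seq[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        # apply the merge everywhere it occurs
        new_vocab = Counter()
        for seq, freq in vocab.items():
            out, i = [], 0
            while i < len(seq):
                if i < len(seq) - 1 and (seq[i], seq[i + 1]) == best:
                    out.append(seq[i] + seq[i + 1])
                    i += 2
                else:
                    out.append(seq[i])
                    i += 1
            new_vocab[tuple(out)] += freq
        vocab = new_vocab
    return merges

merges = learn_bpe(["low", "low", "lower", "newest", "newest"], 2)
```

Morfessor, by contrast, segments words at statistically induced morpheme boundaries, which is what makes the comparison between the two unit types linguistically interesting.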
