Search Results for author: Rudra Murthy V

Found 9 papers, 3 papers with code

Prompting with Pseudo-Code Instructions

1 code implementation · 19 May 2023 · Mayank Mishra, Prince Kumar, Riyaz Bhat, Rudra Murthy V, Danish Contractor, Srikanth Tamilselvam

Prompting with natural language instructions has recently emerged as a popular method of harnessing the capabilities of large language models.
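To make the contrast concrete, here is a minimal sketch of the same task phrased as a natural-language instruction and as a pseudo-code instruction. The prompt templates and the function name `classify_sentiment` are illustrative assumptions, not taken from the paper.

```python
# Hypothetical prompt templates: the task name and wording are
# illustrative assumptions, not the paper's actual prompts.

natural_prompt = (
    "Classify the sentiment of the following review as positive or negative.\n"
    "Review: {review}\n"
    "Sentiment:"
)

pseudocode_prompt = (
    "def classify_sentiment(review: str) -> str:\n"
    "    \"\"\"Return 'positive' or 'negative' for the given review.\"\"\"\n"
    "\n"
    "classify_sentiment(\"{review}\")"
)

def render(template: str, review: str) -> str:
    """Fill the review text into either prompt template."""
    return template.format(review=review)

print(render(pseudocode_prompt, "Great movie, loved it."))
```

Both templates describe the same task; only the surface form of the instruction changes.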

Denoising-based UNMT is more robust to word-order divergence than MASS-based UNMT

no code implementations · 2 Mar 2023 · Tamali Banerjee, Rudra Murthy V, Pushpak Bhattacharyya

We aim to investigate whether UNMT approaches with self-supervised pre-training are robust to word-order divergence between language pairs.

Tasks: Denoising, Translation

Semi-Structured Object Sequence Encoders

no code implementations · 3 Jan 2023 · Rudra Murthy V, Riyaz Bhat, Chulaka Gunasekara, Siva Sankalp Patel, Hui Wan, Tejas Indulal Dhamecha, Danish Contractor, Marina Danilevsky

In this paper we explore the task of modeling semi-structured object sequences; in particular, we focus our attention on the problem of developing a structure-aware input representation for such sequences.

Naamapadam: A Large-Scale Named Entity Annotated Data for Indic Languages

1 code implementation · 20 Dec 2022 · Arnav Mhaske, Harshit Kedia, Sumanth Doddapaneni, Mitesh M. Khapra, Pratyush Kumar, Rudra Murthy V, Anoop Kunchukuttan

The dataset contains more than 400k sentences annotated with a total of at least 100k entities from three standard entity categories (Person, Location, and Organization) for 9 out of the 11 languages.

Tasks: Named Entity Recognition, Sentence

Crosslingual Embeddings are Essential in UNMT for Distant Languages: An English to IndoAryan Case Study

no code implementations · MTSummit 2021 · Tamali Banerjee, Rudra Murthy V, Pushpak Bhattacharyya

In this paper, we show that initializing the embedding layer of UNMT models with cross-lingual embeddings yields significant BLEU score improvements over existing approaches that initialize the embeddings randomly.

Tasks: Denoising, Translation +1
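A minimal sketch of the initialization scheme described above, using plain Python lists in place of a real embedding layer: rows for words covered by the pretrained cross-lingual vectors are copied in, and the remaining rows fall back to small random values. Function and variable names are illustrative assumptions.

```python
import random

def init_embedding_matrix(vocab, pretrained, dim, seed=0):
    """Build an embedding matrix (list of rows, one per vocab word).

    Rows are copied from `pretrained` cross-lingual vectors when the
    word is covered; otherwise they are randomly initialized, which is
    the baseline the paper compares against for the whole matrix.
    """
    rng = random.Random(seed)
    matrix = []
    for word in vocab:
        if word in pretrained:
            matrix.append(list(pretrained[word]))       # cross-lingual init
        else:
            matrix.append([rng.uniform(-0.1, 0.1) for _ in range(dim)])
    return matrix

# Toy usage: one covered word, one out-of-vocabulary word.
pretrained = {"ghar": [0.3, -0.2]}   # hypothetical cross-lingual vector
matrix = init_embedding_matrix(["ghar", "kitaab"], pretrained, dim=2)
print(matrix[0])
```

In a real UNMT model this matrix would be loaded into the encoder/decoder embedding layers before training begins.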

Scrambled Translation Problem: A Problem of Denoising UNMT

no code implementations · MTSummit 2021 · Tamali Banerjee, Rudra Murthy V, Pushpak Bhattacharyya

We hypothesise that the reason behind the 'scrambled translation problem' is the 'shuffling noise' introduced into every input sentence as a denoising strategy.

Tasks: Denoising, Machine Translation +2

Addressing word-order Divergence in Multilingual Neural Machine Translation for extremely Low Resource Languages

no code implementations · NAACL 2019 · Rudra Murthy V, Anoop Kunchukuttan, Pushpak Bhattacharyya

To bridge this divergence, we propose to pre-order the assisting-language sentence to match the word order of the source language and train the parent model on the result.

Tasks: Machine Translation, NMT +3
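As a toy illustration of pre-ordering, suppose each assisting-language token aligns 1:1 to a position in a source-language sentence (a simplifying assumption; the paper uses syntax-based reordering rules, not alignments supplied by hand). Reordering the assisting tokens by their aligned source positions then mimics the source word order:

```python
def preorder(assist_tokens, alignment):
    """Reorder assisting-language tokens to follow source word order.

    `alignment[i]` is the source-sentence position that assisting
    token i corresponds to (hypothetical 1:1 alignment for clarity).
    """
    order = sorted(range(len(assist_tokens)), key=lambda i: alignment[i])
    return [assist_tokens[i] for i in order]

# Toy SVO -> SOV example with a hand-picked alignment:
tokens = ["John", "eats", "apples"]   # assisting language (SVO)
alignment = [0, 2, 1]                 # source language is verb-final (SOV)
print(preorder(tokens, alignment))    # ['John', 'apples', 'eats']
```

The pre-ordered assisting corpus is then used to train the parent model, so the child (low-resource) language sees a word order closer to its own.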

Sharing Network Parameters for Crosslingual Named Entity Recognition

no code implementations · 1 Jul 2016 · Rudra Murthy V, Mitesh Khapra, Pushpak Bhattacharyya

In this paper, we propose a neural network based model that shares the decoder as well as the word- and character-level parameters between two languages, thereby allowing a resource-fortunate language to aid a resource-deprived language.

Tasks: Decoder, Named Entity Recognition +2
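A rough sketch of the sharing scheme the abstract describes: the character-level, word-level, and decoder parameters are held in one shared set used by both languages, while each language keeps its own vocabulary lookup. All names here are illustrative placeholders, not the paper's actual architecture code.

```python
# Parameter groups shared across the two languages (per the abstract).
SHARED = {
    "char_params": "shared/char_params",   # character-level parameters
    "word_params": "shared/word_params",   # word-level parameters
    "decoder": "shared/decoder",           # shared decoder
}

def build_tagger(language):
    """Assemble the parameter names used for one language's NER tagger.

    Shared entries are identical across languages; the vocabulary
    lookup (a hypothetical language-specific piece) is not.
    """
    params = dict(SHARED)
    params["vocab_lookup"] = f"{language}/vocab_lookup"
    return params

en, hi = build_tagger("en"), build_tagger("hi")
print(en["decoder"] == hi["decoder"])   # the decoder is shared
```

Training updates to any shared group benefit both languages, which is how the resource-fortunate language aids the resource-deprived one.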
