Search Results for author: Bhavitvya Malik

Found 4 papers, 2 papers with code

Quality or Quantity? On Data Scale and Diversity in Adapting Large Language Models for Low-Resource Translation

no code implementations • 23 Aug 2024 • Vivek Iyer, Bhavitvya Malik, Pavel Stepachev, Pinzhen Chen, Barry Haddow, Alexandra Birch

Despite the recent popularity of Large Language Models (LLMs) in Machine Translation (MT), their performance in low-resource languages (LRLs) still lags significantly behind Neural Machine Translation (NMT) models.

Diversity, Machine Translation, +2

UDApter -- Efficient Domain Adaptation Using Adapters

1 code implementation • 7 Feb 2023 • Bhavitvya Malik, Abhinav Ramesh Kashyap, Min-Yen Kan, Soujanya Poria

We even outperform unsupervised domain adaptation methods such as DANN and DSN in sentiment classification, and we are within 0.85% F1 on the natural language inference task, while fine-tuning only a fraction of the full model parameters.

Language Modeling, Language Modelling, +4
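The UDApter entry above rests on adapter-based, parameter-efficient fine-tuning: small bottleneck modules are inserted into a frozen pretrained backbone, so only a small fraction of parameters is updated for a new domain. The PyTorch sketch below is a generic illustration of that idea, not the authors' released code; the module names, dimensions, and toy backbone are assumptions made for the example.

```python
import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    """Generic bottleneck adapter: down-project, nonlinearity, up-project, residual."""
    def __init__(self, hidden_dim: int, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.act = nn.ReLU()
        self.up = nn.Linear(bottleneck_dim, hidden_dim)
        # Zero-init the up-projection so the adapter starts as (near) identity.
        nn.init.zeros_(self.up.weight)
        nn.init.zeros_(self.up.bias)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Residual connection preserves the frozen backbone's representation.
        return x + self.up(self.act(self.down(x)))

class AdaptedLayer(nn.Module):
    """Wraps one frozen backbone layer with a trainable adapter."""
    def __init__(self, backbone_layer: nn.Module, hidden_dim: int):
        super().__init__()
        self.backbone_layer = backbone_layer
        self.adapter = BottleneckAdapter(hidden_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.adapter(self.backbone_layer(x))

# Toy frozen "backbone" standing in for a pretrained encoder (assumption).
hidden_dim = 768
backbone = nn.Sequential(*[nn.Linear(hidden_dim, hidden_dim) for _ in range(4)])
for p in backbone.parameters():
    p.requires_grad_(False)  # freeze every backbone weight

# Insert an adapter after each backbone layer; only adapter weights remain trainable.
model = nn.Sequential(*[AdaptedLayer(layer, hidden_dim) for layer in backbone])

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"trainable fraction: {trainable / total:.2%}")
```

With the up-projection zero-initialized, the adapted model initially reproduces the frozen backbone's outputs, and training moves only the small adapter weights, which is what keeps the trainable parameter count to a fraction of the full model.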
