Search Results for author: Tsendsuren Munkhdalai

Found 20 papers, 8 papers with code

Contextual Biasing with the Knuth-Morris-Pratt Matching Algorithm

no code implementations29 Sep 2023 Weiran Wang, Zelin Wu, Diamantino Caseiro, Tsendsuren Munkhdalai, Khe Chai Sim, Pat Rondon, Golan Pundak, Gan Song, Rohit Prabhavalkar, Zhong Meng, Ding Zhao, Tara Sainath, Pedro Moreno Mengibar

Contextual biasing refers to the problem of biasing automatic speech recognition (ASR) systems toward rare entities that are relevant to a specific user or application scenario.

Automatic Speech Recognition (ASR) +1
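The title references the classical Knuth-Morris-Pratt string-matching algorithm. As a point of reference (this is the standard textbook algorithm, not the paper's biasing method), a minimal KMP implementation looks like:

```python
def kmp_failure(pattern):
    """Failure function: length of the longest proper prefix of
    pattern[:i+1] that is also a suffix, for each position i."""
    fail = [0] * len(pattern)
    k = 0
    for i in range(1, len(pattern)):
        while k > 0 and pattern[i] != pattern[k]:
            k = fail[k - 1]
        if pattern[i] == pattern[k]:
            k += 1
        fail[i] = k
    return fail

def kmp_search(text, pattern):
    """Return the start indices of all occurrences of pattern in text,
    scanning the text once and never backing up."""
    fail = kmp_failure(pattern)
    matches, k = [], 0
    for i, ch in enumerate(text):
        while k > 0 and ch != pattern[k]:
            k = fail[k - 1]
        if ch == pattern[k]:
            k += 1
        if k == len(pattern):
            matches.append(i - k + 1)
            k = fail[k - 1]
    return matches
```

The failure function is what makes KMP linear-time: on a mismatch, the matcher falls back within the pattern instead of rescanning the text, which is also what makes it attractive for streaming settings such as biasing an ASR decoder toward entity phrases.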

Diverse Distributions of Self-Supervised Tasks for Meta-Learning in NLP

no code implementations EMNLP 2021 Trapit Bansal, Karthick Gunasekaran, Tong Wang, Tsendsuren Munkhdalai, Andrew McCallum

Meta-learning considers the problem of learning an efficient learning process that can leverage its past experience to accurately solve new tasks.

Few-Shot Learning

Fast Contextual Adaptation with Neural Associative Memory for On-Device Personalized Speech Recognition

no code implementations5 Oct 2021 Tsendsuren Munkhdalai, Khe Chai Sim, Angad Chandorkar, Fan Gao, Mason Chua, Trevor Strohman, Françoise Beaufays

Fast contextual adaptation has been shown to be effective in improving automatic speech recognition (ASR) of rare words, and when combined with on-device personalized training, it can yield even better recognition results.

Automatic Speech Recognition (ASR) +2

A Locally Adaptive Interpretable Regression

1 code implementation7 May 2020 Lkhagvadorj Munkhdalai, Tsendsuren Munkhdalai, Keun Ho Ryu

LoAIR is a step toward bridging the gap between econometrics, statistics, and machine learning by improving the predictive ability of linear regression without sacrificing its interpretability.

BIG-bench Machine Learning Econometrics +1
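The general idea of locally adaptive regression, coefficients that vary with the query point while each local fit stays an interpretable linear model, can be illustrated with a generic locally weighted least-squares sketch (an assumption for illustration; this is not the paper's LoAIR model):

```python
import numpy as np

def locally_weighted_fit(X, y, x_query, tau=0.5):
    """Fit a linear model whose coefficients adapt to x_query:
    each training example is weighted by a Gaussian kernel on its
    distance to the query, then solved by weighted least squares."""
    X1 = np.hstack([np.ones((len(X), 1)), X])   # add intercept column
    d2 = ((X - x_query) ** 2).sum(axis=1)
    w = np.exp(-d2 / (2 * tau ** 2))            # local kernel weights
    W = np.diag(w)
    # Weighted normal equations: (X'WX) beta = X'Wy
    beta = np.linalg.solve(X1.T @ W @ X1, X1.T @ W @ y)
    return beta                                 # locally interpretable coefficients

def predict(beta, x_query):
    return beta[0] + beta[1:] @ x_query
```

Because the fitted coefficients are ordinary regression weights, they can be read off and inspected at each query point, which is the interpretability property the snippet above alludes to.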

Exploring and Predicting Transferability across NLP Tasks

1 code implementation EMNLP 2020 Tu Vu, Tong Wang, Tsendsuren Munkhdalai, Alessandro Sordoni, Adam Trischler, Andrew Mattarella-Micke, Subhransu Maji, Mohit Iyyer

We also develop task embeddings that can be used to predict the most transferable source tasks for a given target task, and we validate their effectiveness in experiments controlled for source and target data size.

Language Modelling Part-Of-Speech Tagging +4

Metalearned Neural Memory

1 code implementation NeurIPS 2019 Tsendsuren Munkhdalai, Alessandro Sordoni, Tong Wang, Adam Trischler

We augment recurrent neural networks with an external memory mechanism that builds upon recent progress in metalearning.

Question Answering Reinforcement Learning +1

Building Dynamic Knowledge Graphs from Text using Machine Reading Comprehension

no code implementations ICLR 2019 Rajarshi Das, Tsendsuren Munkhdalai, Xingdi Yuan, Adam Trischler, Andrew McCallum

We harness and extend a recently proposed machine reading comprehension (MRC) model to query for entity states, since these states are generally communicated in spans of text and MRC models perform well in extracting entity-centric spans.

Knowledge Graphs Machine Reading Comprehension +2

Metalearning with Hebbian Fast Weights

no code implementations12 Jul 2018 Tsendsuren Munkhdalai, Adam Trischler

We unify recent neural approaches to one-shot learning with older ideas of associative memory in a model for metalearning.

One-Shot Learning
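The core of a Hebbian fast-weight associative memory is an outer-product write rule paired with a matrix-vector read. The following is a minimal sketch of that classical idea (function names and the plain update rule are illustrative simplifications, not the paper's exact model):

```python
import numpy as np

def hebbian_store(M, key, value, lr=1.0):
    """Hebbian outer-product update: the fast-weight matrix M
    accumulates an association from key to value."""
    return M + lr * np.outer(value, key)

def hebbian_recall(M, key):
    """Retrieve the value associated with (approximately) this key
    by a single matrix-vector product."""
    return M @ key
```

With orthonormal keys, recall is exact; with correlated keys, retrieved values interfere, which is one motivation for learning the write rule itself via metalearning rather than fixing it to the plain Hebbian form.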

Sentence Simplification with Memory-Augmented Neural Networks

no code implementations NAACL 2018 Tu Vu, Baotian Hu, Tsendsuren Munkhdalai, Hong Yu

Sentence simplification aims to simplify the content and structure of complex sentences, making them easier for human readers to interpret and easier for downstream NLP applications to process.

Machine Translation Sentence +2

Rapid Adaptation with Conditionally Shifted Neurons

no code implementations ICML 2018 Tsendsuren Munkhdalai, Xingdi Yuan, Soroush Mehri, Adam Trischler

We describe a mechanism, which we call conditionally shifted neurons, by which artificial neural networks can learn rapid adaptation: the ability to adapt on the fly, with little data, to new tasks.

Few-Shot Image Classification
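The name suggests the shape of the mechanism: a layer's pre-activations receive an additive, task-conditioned shift, so adaptation changes only the shifts rather than the base weights. A deliberately simplified sketch of that shape (the shift here is just a given vector; in the paper it is computed from task conditioning information):

```python
import numpy as np

def shifted_layer(x, W, b, shift):
    """ReLU layer whose pre-activations are adapted by an additive,
    task-conditioned shift; with shift == 0 it is an ordinary layer."""
    return np.maximum(0.0, W @ x + b + shift)
```

Because only the shift vector varies per task, adaptation is cheap: the expensive weight matrices stay fixed across tasks.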

Meta Networks

1 code implementation ICML 2017 Tsendsuren Munkhdalai, Hong Yu

Neural networks have been successfully applied to problems with large amounts of labeled data.

Continual Learning Meta-Learning

Neural Semantic Encoders

3 code implementations EACL 2017 Tsendsuren Munkhdalai, Hong Yu

We present a memory augmented neural network for natural language understanding: Neural Semantic Encoders.

General Classification Machine Translation +7
