Search Results for author: Shankar Ananthakrishnan

Found 6 papers, 1 paper with code

Design Considerations For Hypothesis Rejection Modules In Spoken Language Understanding Systems

no code implementations • 31 Oct 2022 • Aman Alok, Rahul Gupta, Shankar Ananthakrishnan

Hypothesis rejection modules in both schemes accept or reject a hypothesis based on features drawn from the utterance directed to the SLU system, the associated SLU hypothesis, and the SLU confidence score.

Spoken Language Understanding
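The abstract describes a rejection module that scores a hypothesis from utterance-, hypothesis-, and confidence-derived features. A minimal sketch of that interface, assuming a simple confidence-threshold rule in place of the paper's learned classifier (the field names and threshold below are illustrative, not from the paper):

```python
from dataclasses import dataclass

@dataclass
class SLUHypothesis:
    utterance: str       # text directed at the SLU system
    interpretation: str  # hypothesised intent, e.g. "PlayMusic" (hypothetical label)
    confidence: float    # SLU confidence score in [0, 1]

def reject(hyp: SLUHypothesis, threshold: float = 0.5) -> bool:
    """Reject a hypothesis when its features suggest a misrecognition.

    Only the confidence score is used here; a real module would feed
    utterance- and hypothesis-derived features into a trained classifier.
    """
    return hyp.confidence < threshold

print(reject(SLUHypothesis("play some jazz", "PlayMusic", 0.92)))  # False
```

Accepted hypotheses flow on to downstream handling; rejected ones can trigger a fallback (e.g. a clarification prompt).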

AlexaTM 20B: Few-Shot Learning Using a Large-Scale Multilingual Seq2Seq Model

1 code implementation • 2 Aug 2022 • Saleh Soltan, Shankar Ananthakrishnan, Jack FitzGerald, Rahul Gupta, Wael Hamza, Haidar Khan, Charith Peris, Stephen Rawls, Andy Rosenbaum, Anna Rumshisky, Chandana Satya Prakash, Mukund Sridhar, Fabian Triefenbach, Apurv Verma, Gokhan Tur, Prem Natarajan

In this work, we demonstrate that multilingual large-scale sequence-to-sequence (seq2seq) models, pre-trained on a mixture of denoising and Causal Language Modeling (CLM) tasks, are more efficient few-shot learners than decoder-only models on various tasks.

Causal Language Modeling • Common Sense Reasoning • +8
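The abstract mentions pre-training on a mixture of denoising and Causal Language Modeling (CLM) objectives. A toy sketch of how such a task mixture can be sampled per example — the word-level masking and the 20% CLM fraction here are illustrative assumptions, not the paper's actual corruption scheme:

```python
import random

def make_clm_example(text: str) -> dict:
    # Causal LM objective: predict the continuation from a prefix.
    tokens = text.split()
    cut = len(tokens) // 2
    return {"input": " ".join(tokens[:cut]), "target": " ".join(tokens[cut:])}

def make_denoising_example(text: str, rng: random.Random) -> dict:
    # Denoising objective: corrupt the input and reconstruct the full text.
    tokens = text.split()
    i = rng.randrange(len(tokens))
    corrupted = tokens[:i] + ["<mask>"] + tokens[i + 1:]
    return {"input": " ".join(corrupted), "target": text}

def sample_task(text: str, rng: random.Random, clm_fraction: float = 0.2) -> dict:
    # Mix the two objectives during pre-training (fraction is illustrative).
    if rng.random() < clm_fraction:
        return make_clm_example(text)
    return make_denoising_example(text, rng)
```

Each sampled example is an (input, target) pair that a seq2seq model can consume directly, which is what lets one encoder-decoder be trained on both objectives.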

One-vs-All Models for Asynchronous Training: An Empirical Analysis

no code implementations • 20 Jun 2019 • Rahul Gupta, Aman Alok, Shankar Ananthakrishnan

An OVA system consists of as many OVA models as there are classes, providing the advantage of asynchrony: each OVA model can be retrained independently of the other models.

General Classification • Natural Language Understanding • +1
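The one-model-per-class structure described above can be sketched as follows. The word-overlap "model" is a deliberately trivial stand-in for the paper's classifiers, and the intent names are hypothetical; the point is that each class's model is trained and replaced independently:

```python
def train_binary_model(examples, positive_class):
    # Trivial stand-in model: the vocabulary of the class's positive examples.
    vocab = set()
    for text, label in examples:
        if label == positive_class:
            vocab.update(text.split())
    return vocab

def ova_predict(models, text):
    # Each OVA model scores the input independently; take the argmax.
    words = set(text.split())
    scores = {c: len(words & vocab) / max(len(words), 1)
              for c, vocab in models.items()}
    return max(scores, key=scores.get)

data = [("play some jazz", "PlayMusic"), ("what is the weather", "GetWeather")]
models = {c: train_binary_model(data, c) for c in ("PlayMusic", "GetWeather")}

# Asynchrony: retraining one class touches only that class's model.
models["PlayMusic"] = train_binary_model(
    data + [("play rock", "PlayMusic")], "PlayMusic")

print(ova_predict(models, "play jazz"))  # PlayMusic
```

Because the models share no parameters, updating one class never requires re-validating the others, which is the asynchrony advantage the paper analyses.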

A Re-ranker Scheme for Integrating Large Scale NLU models

no code implementations • 25 Sep 2018 • Chengwei Su, Rahul Gupta, Shankar Ananthakrishnan, Spyros Matsoukas

An ideal re-ranker will exhibit the following two properties: (a) it should prefer the most relevant hypothesis for the given input as the top hypothesis, and (b) the interpretation scores corresponding to each hypothesis produced by the re-ranker should be calibrated.

Natural Language Understanding
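The two re-ranker properties above can be illustrated with a minimal sketch: a softmax over raw hypothesis scores yields a ranked list whose scores form a normalised distribution. Note that normalisation alone does not guarantee calibration in the paper's sense (scores matching empirical correctness rates); that must be verified on held-out data. The intent names and scores are hypothetical:

```python
import math

def rerank(hypotheses):
    """Re-rank (hypothesis, raw_score) pairs via a softmax over scores.

    Returns the list sorted by normalised score, highest first.
    """
    z = [math.exp(s) for _, s in hypotheses]
    total = sum(z)
    return sorted(
        ((h, e / total) for (h, _), e in zip(hypotheses, z)),
        key=lambda pair: pair[1],
        reverse=True,
    )

hyps = [("PlayMusic", 2.0), ("GetWeather", 0.5), ("SetAlarm", -1.0)]
for intent, score in rerank(hyps):
    print(intent, round(score, 3))
```

Property (a) corresponds to the most relevant hypothesis landing at rank 1; property (b) is the stronger requirement that a normalised score of, say, 0.8 should be correct about 80% of the time.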
