no code implementations • 23 May 2024 • Zi Yang, Samridhi Choudhary, Xinfeng Xie, Cao Gao, Siegfried Kunzmann, Zheng Zhang
CoMERA achieves end-to-end rank-adaptive tensor-compressed training via a multi-objective optimization formulation, delivering both a high compression ratio and excellent accuracy during training.
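A minimal sketch of the idea behind rank-adaptive tensor-compressed training: a layer's weight is kept in factorized form, a mask selects how many ranks survive, and the objective trades task loss against a compression penalty. All names (`mask`, `lam`) and the simple matrix factorization stand in for the paper's tensor formats and are illustrative assumptions, not CoMERA's actual implementation.

```python
import numpy as np

# Illustrative low-rank weight with a rank-selection mask (assumption:
# a plain matrix factorization stands in for the paper's tensor formats).
rng = np.random.default_rng(0)
d, r_max = 8, 4

U = rng.standard_normal((d, r_max))
V = rng.standard_normal((r_max, d))
mask = np.ones(r_max)          # soft rank selector; zeroed entries prune ranks
mask[2:] = 0.0                 # e.g. keep only the first 2 ranks

W = U @ np.diag(mask) @ V      # effective low-rank weight

x = rng.standard_normal(d)
y_target = rng.standard_normal(d)
task_loss = np.mean((W @ x - y_target) ** 2)
rank_penalty = mask.sum()      # proxy for parameter count
lam = 0.1                      # illustrative trade-off weight
total_loss = task_loss + lam * rank_penalty  # multi-objective sum

effective_rank = int(np.linalg.matrix_rank(W))
print(effective_rank)  # 2
```

Shrinking the mask lowers the penalty term but can raise the task loss, which is the trade-off the multi-objective formulation navigates during training.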
no code implementations • 1 Jun 2023 • Zi Yang, Samridhi Choudhary, Siegfried Kunzmann, Zheng Zhang
To improve convergence, layer-by-layer distillation is applied to distill a quantized, tensor-compressed student model from a pre-trained transformer.
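The layer-by-layer distillation step can be sketched as fitting each student layer to reproduce the matching teacher layer's output. The uniform 8-bit quantizer and the per-layer MSE objective below are illustrative assumptions (the paper's student is also tensor-compressed, which is omitted here for brevity).

```python
import numpy as np

# Illustrative layer-by-layer distillation: one linear "layer" of a
# quantized student is trained to match the teacher layer's output.
rng = np.random.default_rng(1)
d = 6
x = rng.standard_normal((16, d))         # a batch of layer inputs

W_teacher = rng.standard_normal((d, d))
teacher_out = x @ W_teacher

def quantize(w, bits=8):
    """Uniform symmetric quantization to `bits` bits (illustrative scheme)."""
    scale = np.abs(w).max() / (2 ** (bits - 1) - 1)
    return np.round(w / scale) * scale

W_student = quantize(W_teacher)          # student starts from quantized weights
student_out = x @ W_student

# Per-layer distillation objective: match the teacher's intermediate output.
layer_distill_loss = np.mean((student_out - teacher_out) ** 2)
print(layer_distill_loss >= 0.0)
```

Matching intermediate outputs layer by layer gives the compressed student a denser training signal than matching only the final logits, which is what helps convergence.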
no code implementations • 16 Jun 2021 • Michael Saxon, Samridhi Choudhary, Joseph P. McKenna, Athanasios Mouchtaris
End-to-end (E2E) spoken language understanding (SLU) systems predict utterance semantics directly from speech using a single model.
Ranked #10 on Spoken Language Understanding on Fluent Speech Commands (using extra training data)
no code implementations • COLING 2020 • Kanthashree Mysore Sathyendra, Samridhi Choudhary, Leah Nicolich-Henkin
In this paper, we propose and experiment with techniques for extreme compression of neural natural language understanding (NLU) models, making them suitable for execution on resource-constrained devices.
no code implementations • 18 Nov 2020 • Bhuvan Agrawal, Markus Müller, Martin Radfar, Samridhi Choudhary, Athanasios Mouchtaris, Siegfried Kunzmann
In this paper, we treat an E2E system as a multi-modal model, with audio and text functioning as its two modalities, and use a cross-modal latent space (CMLS) architecture, where a shared latent space is learned between the 'acoustic' and 'text' embeddings.
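The shared latent space described above can be sketched as two learned projections mapping the acoustic and text embeddings into one common space, with a loss that pulls matched pairs together. The projection matrices, dimensions, and the plain L2 distance are illustrative assumptions, not the paper's exact architecture or objective.

```python
import numpy as np

# Illustrative cross-modal latent space (CMLS): project each modality's
# embedding into a shared space and penalize distance between matched pairs.
rng = np.random.default_rng(2)
d_audio, d_text, d_shared = 10, 7, 4

P_audio = rng.standard_normal((d_audio, d_shared)) * 0.1  # learned in practice
P_text = rng.standard_normal((d_text, d_shared)) * 0.1    # learned in practice

audio_emb = rng.standard_normal(d_audio)   # e.g. from an acoustic encoder
text_emb = rng.standard_normal(d_text)     # e.g. from a text encoder

z_audio = audio_emb @ P_audio              # both now live in the shared space
z_text = text_emb @ P_text

# Pull the matched acoustic/text pair together (L2 here; the paper may use
# a different distance or a contrastive formulation).
cmls_loss = np.sum((z_audio - z_text) ** 2)
print(z_audio.shape == z_text.shape)  # True
```

Because both modalities land in the same space, the model can align what an utterance sounds like with what its transcript means, even though the raw embedding dimensions differ.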
no code implementations • 6 Aug 2020 • Joseph P. McKenna, Samridhi Choudhary, Michael Saxon, Grant P. Strimel, Athanasios Mouchtaris
We perform experiments where we vary the semantic complexity of a large, proprietary dataset and show that STI model performance correlates with our semantic complexity measures, such that performance increases as complexity values decrease.
no code implementations • ACL 2019 • James Fiacco, Samridhi Choudhary, Carolyn Rose
We introduce a general method for the interpretation and comparison of neural models.
no code implementations • WS 2017 • Shrimai Prabhumoye, Samridhi Choudhary, Evangelia Spiliopoulou, Christopher Bogart, Carolyn Penstein Rose, Alan W. Black
There has been a long-standing interest in understanding 'Social Influence' both in the Social Sciences and in Computational Linguistics.