Search Results for author: Samridhi Choudhary

Found 7 papers, 0 papers with code

Semantic Complexity in End-to-End Spoken Language Understanding

no code implementations • 6 Aug 2020 Joseph P. McKenna, Samridhi Choudhary, Michael Saxon, Grant P. Strimel, Athanasios Mouchtaris

We perform experiments where we vary the semantic complexity of a large, proprietary dataset and show that speech-to-interpretation (STI) model performance correlates with our semantic complexity measures, such that performance increases as complexity values decrease.

Spoken Language Understanding
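The excerpt doesn't define the complexity measures, so the sketch below is purely illustrative: it uses one plausible proxy, the entropy of the semantic-label distribution, and correlates it with a hypothetical accuracy column to mirror the reported trend. Both the `label_entropy` proxy and the toy data are assumptions, not the paper's actual measures or results.

```python
# Illustrative only: label entropy as one plausible semantic-complexity
# proxy, correlated against model accuracy. Not the paper's measures.
import math
from collections import Counter

def label_entropy(labels):
    """Shannon entropy (bits) of a dataset's semantic-label distribution."""
    counts = Counter(labels)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical dataset variants with rising complexity and falling accuracy,
# mirroring the reported trend (performance up as complexity goes down).
variants = [
    (["play", "play", "stop", "play"], 0.95),
    (["play", "stop", "pause", "next"], 0.90),
    (["play", "stop", "pause", "next", "shuffle", "repeat"], 0.84),
]
complexities = [label_entropy(labels) for labels, _ in variants]
accuracies = [acc for _, acc in variants]
print(pearson(complexities, accuracies))  # negative: higher complexity, lower accuracy
```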

Tie Your Embeddings Down: Cross-Modal Latent Spaces for End-to-end Spoken Language Understanding

no code implementations • 18 Nov 2020 Bhuvan Agrawal, Markus Müller, Martin Radfar, Samridhi Choudhary, Athanasios Mouchtaris, Siegfried Kunzmann

In this paper, we treat an E2E system as a multi-modal model, with audio and text functioning as its two modalities, and use a cross-modal latent space (CMLS) architecture, where a shared latent space is learned between the 'acoustic' and 'text' embeddings.

Spoken Language Understanding
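As a rough, hedged sketch of the CMLS idea (one shared latent space between an 'acoustic' and a 'text' encoder), the PyTorch snippet below pairs a cross-modal distance loss with an intent classifier. The layer sizes, the MSE tying loss, and the 10-intent head are illustrative assumptions rather than the paper's exact architecture (which may use a different loss, e.g. a triplet loss).

```python
# Minimal sketch of a cross-modal latent space (CMLS): acoustic and text
# encoders project into one shared space; a distance loss ties paired
# embeddings together. Dimensions and loss choice are assumptions.
import torch
import torch.nn as nn

class CMLS(nn.Module):
    def __init__(self, audio_dim=80, text_dim=300, latent_dim=256):
        super().__init__()
        # 'acoustic' branch: pooled audio features -> shared latent space
        self.audio_enc = nn.Sequential(nn.Linear(audio_dim, 512), nn.ReLU(),
                                       nn.Linear(512, latent_dim))
        # 'text' branch: pooled text embeddings -> same latent space
        self.text_enc = nn.Sequential(nn.Linear(text_dim, 512), nn.ReLU(),
                                      nn.Linear(512, latent_dim))
        self.classifier = nn.Linear(latent_dim, 10)  # e.g. 10 intents

    def forward(self, audio_feats, text_feats):
        z_audio = self.audio_enc(audio_feats)
        z_text = self.text_enc(text_feats)
        return z_audio, z_text, self.classifier(z_audio)

model = CMLS()
audio = torch.randn(8, 80)    # batch of pooled acoustic features
text = torch.randn(8, 300)    # batch of pooled text embeddings
intents = torch.randint(0, 10, (8,))
z_a, z_t, logits = model(audio, text)
# Cross-modal loss: pull each utterance's acoustic embedding toward its
# text embedding, plus the usual intent-classification loss.
loss = nn.functional.mse_loss(z_a, z_t) + nn.functional.cross_entropy(logits, intents)
loss.backward()
```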

Extreme Model Compression for On-device Natural Language Understanding

no code implementations • COLING 2020 Kanthashree Mysore Sathyendra, Samridhi Choudhary, Leah Nicolich-Henkin

In this paper, we propose and experiment with techniques for extreme compression of neural natural language understanding (NLU) models, making them suitable for execution on resource-constrained devices.

Model Compression • Natural Language Understanding
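The excerpt doesn't enumerate the compression techniques, so the snippet below shows just one representative (and assumed) example: 8-bit post-training quantization of an embedding table, which cuts storage roughly 4x versus float32. This is a generic illustration, not necessarily one of the paper's methods.

```python
# One representative compression technique (an assumption, not necessarily
# the paper's method): 8-bit post-training quantization of an embedding
# matrix, roughly a 4x size reduction versus float32.
import numpy as np

def quantize_int8(weights):
    """Symmetric per-tensor quantization to int8 with a float scale."""
    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

embeddings = np.random.randn(10000, 100).astype(np.float32)  # toy NLU embedding table
q, scale = quantize_int8(embeddings)
restored = dequantize(q, scale)

print(f"size: {embeddings.nbytes / 1e6:.1f} MB -> {q.nbytes / 1e6:.1f} MB")
print(f"max abs error: {np.abs(embeddings - restored).max():.4f}")
```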

End-to-End Spoken Language Understanding for Generalized Voice Assistants

no code implementations • 16 Jun 2021 Michael Saxon, Samridhi Choudhary, Joseph P. McKenna, Athanasios Mouchtaris

End-to-end (E2E) spoken language understanding (SLU) systems predict utterance semantics directly from speech using a single model.

Ranked #10 on Spoken Language Understanding on Fluent Speech Commands (using extra training data)

Spoken Language Understanding
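As a toy illustration of the single-model E2E setup, the sketch below maps acoustic frames directly to utterance-level intent logits with no intermediate transcript. The GRU encoder, layer sizes, and the 31-way intent head (e.g., the Fluent Speech Commands label count) are illustrative assumptions, not the paper's architecture.

```python
# Toy sketch of end-to-end SLU: one model maps acoustic frames directly
# to an utterance-level intent, with no intermediate ASR transcript.
# Architecture and sizes are illustrative assumptions.
import torch
import torch.nn as nn

class E2ESLU(nn.Module):
    def __init__(self, n_mels=80, hidden=256, n_intents=31):
        super().__init__()
        self.encoder = nn.GRU(n_mels, hidden, num_layers=2, batch_first=True)
        self.intent_head = nn.Linear(hidden, n_intents)

    def forward(self, mel_frames):           # (batch, time, n_mels)
        _, h = self.encoder(mel_frames)      # final hidden state summarizes the utterance
        return self.intent_head(h[-1])       # (batch, n_intents)

model = E2ESLU()
mels = torch.randn(4, 120, 80)               # 4 utterances, 120 frames each
logits = model(mels)
print(logits.shape)                           # torch.Size([4, 31])
```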

Quantization-Aware and Tensor-Compressed Training of Transformers for Natural Language Understanding

no code implementations • 1 Jun 2023 Zi Yang, Samridhi Choudhary, Siegfried Kunzmann, Zheng Zhang

To improve convergence, layer-by-layer distillation is applied to distill a quantized and tensor-compressed student model from a pre-trained transformer.

Natural Language Understanding • Quantization
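As a hedged sketch of the layer-by-layer distillation step, the snippet below trains each (narrower) student layer to match the corresponding frozen teacher layer's hidden states via MSE, with a projection to bridge the width gap. The matching loss and projection are assumptions; the paper's tensor-compression and quantization-aware pieces are not reproduced here.

```python
# Hedged sketch of layer-by-layer distillation: each (compressed) student
# layer is trained to match the corresponding teacher layer's hidden
# states. The MSE matching loss and the width projection are assumptions;
# tensor-train compression and quantization steps are not shown.
import torch
import torch.nn as nn

teacher_dim, student_dim, n_layers = 768, 256, 4
teacher_layers = nn.ModuleList([nn.Linear(teacher_dim, teacher_dim) for _ in range(n_layers)])
student_layers = nn.ModuleList([nn.Linear(student_dim, student_dim) for _ in range(n_layers)])
# Project student activations up to teacher width for comparison.
proj = nn.Linear(student_dim, teacher_dim)

x_t = torch.randn(8, teacher_dim)   # teacher input activations
x_s = torch.randn(8, student_dim)   # student input activations

loss = 0.0
for t_layer, s_layer in zip(teacher_layers, student_layers):
    with torch.no_grad():
        x_t = t_layer(x_t)           # teacher is frozen
    x_s = s_layer(x_s)
    loss = loss + nn.functional.mse_loss(proj(x_s), x_t)  # per-layer match
loss.backward()
```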
