Search Results for author: David Qiu

Found 9 papers, 1 paper with code

2-bit Conformer quantization for automatic speech recognition

no code implementations · 26 May 2023 · Oleg Rybakov, Phoenix Meadowlark, Shaojin Ding, David Qiu, Jian Li, David Rim, Yanzhang He

With large-scale training data, we obtain a 2-bit Conformer model with over 40% model size reduction relative to the 4-bit version, at the cost of a 17% relative word error rate degradation.
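To make the 2-bit setting concrete, here is a minimal sketch of symmetric per-tensor fake quantization to 2 bits in NumPy. This is an illustrative baseline, not the paper's actual quantization recipe (which trains the Conformer with quantization-aware methods); the function name and per-tensor scaling are assumptions for the example.

```python
import numpy as np

def fake_quantize_2bit(w: np.ndarray) -> np.ndarray:
    """Symmetric per-tensor fake quantization to 2 bits.

    A signed 2-bit integer covers [-2, 1]; the scale maps the
    largest-magnitude weight onto that integer range, and the
    dequantized weights (at most 4 distinct values) are returned
    for use in the forward pass.
    """
    qmin, qmax = -2, 1
    max_abs = float(np.max(np.abs(w)))
    if max_abs == 0.0:
        return w.copy()
    scale = max_abs / abs(qmin)
    q = np.clip(np.round(w / scale), qmin, qmax)  # integer codes
    return q * scale  # dequantize back to float
```

With only 4 representable values per tensor, the memory saving over 4-bit (16 values) comes directly from halving the bits per weight, which is why accuracy-preserving training tricks matter at this precision.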

Automatic Speech Recognition (ASR) +3

RAND: Robustness Aware Norm Decay For Quantized Seq2seq Models

no code implementations · 24 May 2023 · David Qiu, David Rim, Shaojin Ding, Oleg Rybakov, Yanzhang He

With the rapid increase in the size of neural networks, model compression has become an important area of research.

Machine Translation Model Compression +3

Improving Confidence Estimation on Out-of-Domain Data for End-to-End Speech Recognition

no code implementations · 7 Oct 2021 · Qiujia Li, Yu Zhang, David Qiu, Yanzhang He, Liangliang Cao, Philip C. Woodland

As end-to-end automatic speech recognition (ASR) models reach promising performance, various downstream tasks rely on good confidence estimators for these systems.

Automatic Speech Recognition (ASR) +2

Large-scale ASR Domain Adaptation using Self- and Semi-supervised Learning

no code implementations · 1 Oct 2021 · Dongseong Hwang, Ananya Misra, Zhouyuan Huo, Nikhil Siddhartha, Shefali Garg, David Qiu, Khe Chai Sim, Trevor Strohman, Françoise Beaufays, Yanzhang He

Self- and semi-supervised learning methods have been actively investigated to reduce the need for labeled training data and to enhance model performance.
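A common semi-supervised ingredient in such pipelines is confidence-filtered pseudo-labeling: a seed model labels unlabeled audio, and only high-confidence predictions are kept for retraining. The sketch below shows that filtering step on per-example logits; the function name and the 0.9 threshold are illustrative assumptions, not details from this paper.

```python
import numpy as np

def select_pseudo_labels(logits: np.ndarray, threshold: float = 0.9):
    """Confidence-filtered pseudo-labeling (illustrative).

    Given a (batch, classes) array of logits from a seed model,
    return (labels, keep_mask): argmax labels and a boolean mask of
    examples whose max softmax probability meets the threshold.
    """
    # Numerically stable softmax over the class axis
    z = logits - logits.max(axis=-1, keepdims=True)
    probs = np.exp(z) / np.exp(z).sum(axis=-1, keepdims=True)
    confidence = probs.max(axis=-1)
    labels = probs.argmax(axis=-1)
    return labels, confidence >= threshold
```

Only the examples passing the mask would be added to the training set for the next round; low-confidence utterances are dropped rather than risk reinforcing the seed model's errors.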

Domain Adaptation

Probabilistic Clustering Using Maximal Matrix Norm Couplings

no code implementations · 10 Oct 2018 · David Qiu, Anuran Makur, Lizhong Zheng

In this paper, we present a local information-theoretic approach to explicitly learn a probabilistic clustering of a discrete random variable.

Clustering Sentence +1
