Search Results for author: Brett Meyer

Found 2 papers, 1 paper with code

Standard Deviation-Based Quantization for Deep Neural Networks

no code implementations • 24 Feb 2022 • Amir Ardakani, Arash Ardakani, Brett Meyer, James J. Clark, Warren J. Gross

Quantization of deep neural networks is a promising approach that reduces the inference cost, making it feasible to run deep networks on resource-restricted devices.

Quantization
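As a rough illustration of the idea in the title, the snippet below sketches symmetric weight quantization where the clipping range is derived from the standard deviation of the weights. The multiplier `k`, bit width, and function names are hypothetical choices for illustration, not the authors' exact method.

```python
import numpy as np

def std_quantize(weights, n_bits=4, k=3.0):
    """Quantize weights symmetrically, clipping at k standard deviations.

    The clipping range [-k*sigma, k*sigma] is set from the empirical
    standard deviation of the weight distribution; k and n_bits are
    illustrative hyperparameters, not values from the paper.
    """
    sigma = float(weights.std())
    clip = k * sigma
    levels = 2 ** (n_bits - 1) - 1            # e.g. 7 positive levels for 4 bits
    scale = clip / levels
    q = np.clip(np.round(weights / scale), -levels, levels)
    return q * scale                           # dequantized values for inference

rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.02, size=(64, 64)).astype(np.float32)
w_q = std_quantize(w, n_bits=4)
print(np.unique(w_q).size)  # at most 15 distinct values for 4-bit symmetric quantization
```

Because the threshold tracks the weight distribution rather than its extreme values, a few outlier weights do not stretch the quantization grid and waste resolution on rarely used levels.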

Using Speech Synthesis to Train End-to-End Spoken Language Understanding Models

2 code implementations • 21 Oct 2019 • Loren Lugosch, Brett Meyer, Derek Nowrouzezahrai, Mirco Ravanelli

End-to-end models are an attractive new approach to spoken language understanding (SLU) in which the meaning of an utterance is inferred directly from the raw audio without employing the standard pipeline composed of a separately trained speech recognizer and natural language understanding module.

Data Augmentation • Natural Language Understanding +2
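To make the end-to-end framing concrete, here is a minimal sketch of a model that maps a raw waveform directly to intent logits with no separate speech-recognizer and language-understanding stages. All names, shapes, and the random projections standing in for learned weights are illustrative assumptions, not the authors' architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def end_to_end_slu(audio, n_intents=8, frame=160, hidden=32):
    """Map a raw waveform directly to intent logits (toy sketch).

    There is no intermediate transcript: the utterance's meaning is
    inferred straight from the audio, as in the end-to-end SLU setting.
    """
    n_frames = len(audio) // frame
    frames = audio[: n_frames * frame].reshape(n_frames, frame)
    # Random projections stand in for learned encoder/classifier weights.
    w_enc = rng.normal(0.0, 0.1, size=(frame, hidden))
    w_out = rng.normal(0.0, 0.1, size=(hidden, n_intents))
    h = np.tanh(frames @ w_enc)        # frame-level acoustic encoder
    pooled = h.mean(axis=0)            # utterance-level summary vector
    return pooled @ w_out              # one logit per intent class

audio = rng.normal(0.0, 1.0, size=16000)  # 1 s of 16 kHz "audio"
logits = end_to_end_slu(audio)
print(logits.shape)  # (8,)
```

In this setting, synthesized speech can serve as a data-augmentation source: generated utterances paired with known intents expand the training set without additional recordings.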
