Efficient keyword spotting using dilated convolutions and gating

19 Nov 2018 · Alice Coucke, Mohammed Chlieh, Thibault Gisselbrecht, David Leroy, Mathieu Poumeyrol, Thibaut Lavril

We explore the application of end-to-end stateless temporal modeling to small-footprint keyword spotting, as opposed to recurrent networks that model long-term temporal dependencies using internal states. We propose a model inspired by the recent success of dilated convolutions in sequence modeling applications, allowing deeper architectures to be trained in resource-constrained configurations...
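As a rough illustration of the kind of building block the title refers to, here is a minimal PyTorch sketch of a dilated 1-D convolution with a gated (tanh × sigmoid) activation and a residual connection. The channel count, kernel size, and dilation schedule are illustrative assumptions, not the configuration used in the paper.

```python
import torch
import torch.nn as nn


class GatedDilatedConvBlock(nn.Module):
    """One dilated 1-D convolution block with a gated activation.

    Hypothetical sketch: layer sizes and the residual connection are
    illustrative choices, not the paper's exact architecture.
    """

    def __init__(self, channels: int, kernel_size: int = 3, dilation: int = 1):
        super().__init__()
        # Causal left padding so the output length matches the input length.
        self.pad = (kernel_size - 1) * dilation
        self.filter_conv = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)
        self.gate_conv = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time)
        padded = nn.functional.pad(x, (self.pad, 0))
        out = torch.tanh(self.filter_conv(padded)) * torch.sigmoid(self.gate_conv(padded))
        return x + out  # residual connection


# Stacking blocks with exponentially increasing dilation grows the receptive
# field while keeping the model stateless and shallow enough for small footprints.
model = nn.Sequential(*[GatedDilatedConvBlock(64, dilation=2 ** i) for i in range(6)])
features = torch.randn(1, 64, 100)  # e.g. 100 frames of 64-dim acoustic features
print(model(features).shape)        # -> torch.Size([1, 64, 100])
```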
