To address this challenge, neural architecture search (NAS) promises to help design accurate ML models that meet the tight memory, latency, and energy constraints of MCUs.
Tuning hyperparameters for machine learning algorithms is a tedious task, one that is typically done manually.
Modern speech enhancement algorithms achieve remarkable noise suppression by means of large recurrent neural networks (RNNs).
This paper introduces a method to compress RNNs for resource-constrained environments using Kronecker products (KP).
Recurrent Neural Networks (RNNs) can be difficult to deploy on resource-constrained devices due to their size. As a result, there is a need for compression techniques that can significantly shrink RNNs without negatively impacting task accuracy.
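To make the Kronecker-product idea concrete, here is a minimal NumPy sketch (with hypothetical sizes, not taken from the paper) showing how a large weight matrix can be represented as the Kronecker product of two small factors, and how the matrix-vector product can be computed without ever materializing the full matrix:

```python
import numpy as np

# Hypothetical sizes: represent a 256x256 weight matrix as the
# Kronecker product of two 16x16 factors (512 parameters vs 65,536).
rng = np.random.default_rng(0)
A = rng.standard_normal((16, 16))
B = rng.standard_normal((16, 16))

W = np.kron(A, B)                      # full matrix, built here only to verify
x = rng.standard_normal(256)

# Structured matvec using the identity (A kron B) vec(X) = vec(B X A^T),
# where vec() stacks columns (column-major, order="F" in NumPy).
X = x.reshape(16, 16, order="F")
y_kp = (B @ X @ A.T).flatten(order="F")

y_full = W @ x
assert np.allclose(y_kp, y_full)       # same result, far fewer parameters
```

The structured product costs two small matrix multiplies instead of one large one, which is why KP factors suit memory-limited devices.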
The vast majority of processors in the world are actually microcontroller units (MCUs), which find widespread use performing simple control tasks in applications ranging from automobiles to medical devices and office equipment.
We propose a novel method called the Relevance Subject Machine (RSM) to solve the person re-identification (re-id) problem.
In this paper, we present a novel Bayesian approach to simultaneously recover block-sparse signals in the presence of outliers.
We show that the proposed framework encompasses a large class of sparse non-negative least squares (S-NNLS) algorithms and provide a computationally efficient inference procedure based on multiplicative update rules.
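As a rough illustration of what multiplicative updates look like, here is a minimal sketch of the classic Lee-Seung-style update for plain non-negative least squares, the building block underlying S-NNLS solvers. The problem sizes, initialization, and iteration count are illustrative assumptions, not the paper's procedure:

```python
import numpy as np

# Solve min_x ||y - A x||^2 subject to x >= 0 via multiplicative updates.
rng = np.random.default_rng(1)
A = np.abs(rng.standard_normal((30, 10)))   # non-negative dictionary
x_true = np.abs(rng.standard_normal(10))
y = A @ x_true

x = np.ones(10)                             # strictly positive start
res0 = np.linalg.norm(A @ x - y)
for _ in range(2000):
    # Elementwise ratio update: x stays non-negative by construction,
    # so no explicit projection step is needed.
    x *= (A.T @ y) / (A.T @ A @ x + 1e-12)

assert np.all(x >= 0)
assert np.linalg.norm(A @ x - y) < res0     # objective has decreased
```

The appeal of multiplicative rules is exactly what the assertion checks: the non-negativity constraint is enforced for free by the elementwise ratio, so each iteration is just a couple of matrix products.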