no code implementations • 20 Mar 2022 • Zuzana Jelčicová, Marian Verhelst
Moreover, a reduction of ~87-94% in operations can be achieved while degrading accuracy by only 1-4%, speeding up multi-head self-attention inference by a factor of ~7.5-16.
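As a quick sanity check on those numbers (assuming, as an idealization, that runtime scales with the number of remaining operations), skipping 87-94% of the operations leaves 6-13% of the work, which corresponds to roughly a 7.5-16x speedup:

```python
# Idealized relation between skipped operations and speedup,
# assuming runtime is proportional to the remaining operation count.
for reduction in (0.87, 0.94):
    remaining = 1.0 - reduction   # fraction of operations still executed
    speedup = 1.0 / remaining     # ideal speedup factor
    print(f"{reduction:.0%} fewer ops -> ~{speedup:.1f}x speedup")
# 87% fewer ops -> ~7.7x speedup
# 94% fewer ops -> ~16.7x speedup
```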
1 code implementation • 27 Feb 2021 • Nimish Shah, Laura I. Galindez Olascoaga, Wannes Meert, Marian Verhelst
Bayesian reasoning is a powerful mechanism for probabilistic inference in smart edge devices.
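As a minimal, self-contained illustration of the kind of Bayesian update meant here (the event names and probabilities are made up for the example, and this is not the paper's inference engine):

```python
# Hypothetical example: a smart sensor updating its belief that a person
# is present, given a noisy motion detection. All numbers are illustrative.
p_present = 0.1                 # prior P(present)
p_detect_given_present = 0.9    # sensor sensitivity P(detect | present)
p_detect_given_absent = 0.05    # false-positive rate P(detect | absent)

# Bayes' rule: P(present | detect) = P(detect | present) P(present) / P(detect)
p_detect = (p_detect_given_present * p_present
            + p_detect_given_absent * (1.0 - p_present))
p_present_given_detect = p_detect_given_present * p_present / p_detect
print(f"P(present | detect) = {p_present_given_detect:.2f}")  # ~0.67
```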
no code implementations • 21 Sep 2020 • Robby Neven, Marian Verhelst, Tinne Tuytelaars, Toon Goedemé
By first training the SGMs in a meta-learning manner on a set of common objects, the SGMs could provide the model with accurate gradients during fine-tuning, allowing it to successfully learn to grasp new objects.
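The sketch below shows the general synthetic-gradient idea in PyTorch, assuming an SGM is a small network that predicts the gradient of the loss with respect to a feature map; the module sizes, names, and shapes are illustrative, not the paper's architecture:

```python
import torch
import torch.nn as nn

# Minimal sketch of a synthetic gradient module (SGM): a small network that
# predicts dL/d(features), so the backbone can be updated without
# backpropagating the true task loss. Shapes and layer sizes are illustrative.
class SyntheticGradientModule(nn.Module):
    def __init__(self, feat_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim, 128), nn.ReLU(), nn.Linear(128, feat_dim)
        )

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        return self.net(features)  # predicted gradient w.r.t. features

backbone = nn.Linear(64, 64)                 # stand-in for a feature extractor
sgm = SyntheticGradientModule(feat_dim=64)
opt = torch.optim.SGD(backbone.parameters(), lr=1e-2)

x = torch.randn(8, 64)
features = backbone(x)
synthetic_grad = sgm(features).detach()      # gradient predicted by the SGM
opt.zero_grad()
features.backward(synthetic_grad)            # inject predicted gradient
opt.step()
```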
1 code implementation • 22 Jul 2020 • Linyan Mei, Pouya Houshmand, Vikram Jain, Sebastian Giraldo, Marian Verhelst
This work introduces ZigZag, a memory-centric rapid DNN accelerator design-space exploration (DSE) framework, which extends the DSE search space with uneven mapping opportunities, in which operands at shared memory levels are no longer bound to use the same memory level for each loop index.
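A toy sketch of what uneven mapping buys in search-space terms, assuming three operands and four memory levels; the names and per-access costs below are hypothetical and unrelated to ZigZag's actual API or cost model:

```python
from itertools import product

# Toy illustration of "uneven mapping": each operand (weights W, inputs I,
# outputs O) may be assigned its own memory level, instead of all operands
# being bound to the same level. Costs are placeholder energy-per-access values.
operands = ("W", "I", "O")
memory_levels = ("reg", "local_buf", "global_buf", "DRAM")
access_cost = {"reg": 1, "local_buf": 6, "global_buf": 26, "DRAM": 200}

def mapping_cost(assignment):
    """Sum of per-access costs for one per-operand memory-level assignment."""
    return sum(access_cost[level] for level in assignment.values())

# Even mapping: one shared level for all operands -> 4 options.
even = [dict.fromkeys(operands, lvl) for lvl in memory_levels]
# Uneven mapping: independent level per operand -> 4**3 = 64 options.
uneven = [dict(zip(operands, combo))
          for combo in product(memory_levels, repeat=len(operands))]
print(len(even), "even mappings vs", len(uneven), "uneven mappings")

example = {"W": "reg", "I": "local_buf", "O": "global_buf"}
print("cost of one uneven mapping:", mapping_cost(example))   # 33
```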
1 code implementation • 10 Mar 2020 • Colby R. Banbury, Vijay Janapa Reddi, Max Lam, William Fu, Amin Fazel, Jeremy Holleman, Xinyuan Huang, Robert Hurtado, David Kanter, Anton Lokhmotov, David Patterson, Danilo Pau, Jae-sun Seo, Jeff Sieracki, Urmish Thakker, Marian Verhelst, Poonam Yadav
In this position paper, we present the current landscape of TinyML and discuss the challenges and direction towards developing a fair and useful hardware benchmark for TinyML workloads.
1 code implementation • NeurIPS 2019 • Laura I. Galindez Olascoaga, Wannes Meert, Nimish Shah, Marian Verhelst, Guy Van Den Broeck
We showcase our framework on a mobile activity recognition scenario, and on a variety of benchmark datasets representative of the field of tractable learning and of the applications of interest.
1 code implementation • 17 Dec 2018 • Gert Dekkers, Fernando Rosas, Steven Lauwereins, Sreeraj Rajendran, Sofie Pollin, Bart Vanrumste, Toon van Waterschoot, Marian Verhelst, Peter Karsmakers
This model provides a first step of exploration prior to the custom design of a smart wireless acoustic sensor, and can also be used to compare the energy consumption of different protocols.
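A minimal sketch of such a first-order energy model, with all per-unit energies and protocol costs as placeholder values rather than figures from the paper:

```python
# Hypothetical first-order energy model for a wireless acoustic sensor node:
# total energy = sensing + on-node processing + transmission. All per-unit
# energy numbers below are placeholders, not measured values from the paper.
def node_energy(duration_s, tx_bytes, e_sense_per_s, e_proc_per_s, e_tx_per_byte):
    return (e_sense_per_s * duration_s
            + e_proc_per_s * duration_s
            + e_tx_per_byte * tx_bytes)

protocols = {       # illustrative per-byte transmission costs (uJ)
    "BLE": 0.5,
    "Zigbee": 1.2,
    "WiFi": 3.0,
}
for name, e_tx in protocols.items():
    e = node_energy(duration_s=60, tx_bytes=16_000,
                    e_sense_per_s=50.0, e_proc_per_s=200.0, e_tx_per_byte=e_tx)
    print(f"{name}: {e / 1e6:.3f} J")   # all energies in uJ
```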
no code implementations • 16 Apr 2018 • Bert Moons, Daniel Bankman, Lita Yang, Boris Murmann, Marian Verhelst
This paper introduces BinarEye: a digital processor for always-on Binary Convolutional Neural Networks.
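For context, the sketch below shows the standard XNOR-popcount arithmetic that binary CNN accelerators exploit; it is a generic illustration, not a description of BinarEye's datapath:

```python
import numpy as np

# Binary-network dot product via XNOR + popcount, shown in NumPy.
def binary_dot(a_bits: np.ndarray, w_bits: np.ndarray) -> int:
    """a_bits, w_bits: arrays of {0, 1} encoding {-1, +1} activations/weights."""
    n = a_bits.size
    matches = np.count_nonzero(a_bits == w_bits)   # XNOR + popcount
    return 2 * matches - n                          # equals the sum of (+/-1) products

rng = np.random.default_rng(0)
a = rng.integers(0, 2, size=256)
w = rng.integers(0, 2, size=256)
ref = np.sum((2 * a - 1) * (2 * w - 1))             # full-precision reference
assert binary_dot(a, w) == ref
print(binary_dot(a, w))
```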
no code implementations • 13 Mar 2018 • Matthijs Van keirsbilck, Bert Moons, Marian Verhelst
Performing multi-modal speech recognition, i.e. processing acoustic speech signals and lip-reading video simultaneously, significantly enhances the performance of such systems, especially in noisy environments.
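A minimal sketch of feature-level audio-visual fusion, assuming separate per-modality encoders whose states are concatenated before classification; the dimensions and layers are illustrative, not the paper's network:

```python
import torch
import torch.nn as nn

# Minimal sketch of audio-visual fusion for speech recognition: encode each
# modality separately, concatenate, then classify. Dimensions are illustrative.
class AVFusionModel(nn.Module):
    def __init__(self, audio_dim=40, video_dim=512, hidden=128, n_classes=30):
        super().__init__()
        self.audio_enc = nn.GRU(audio_dim, hidden, batch_first=True)
        self.video_enc = nn.GRU(video_dim, hidden, batch_first=True)
        self.classifier = nn.Linear(2 * hidden, n_classes)

    def forward(self, audio, video):
        _, h_a = self.audio_enc(audio)            # (1, B, hidden)
        _, h_v = self.video_enc(video)            # (1, B, hidden)
        fused = torch.cat([h_a[-1], h_v[-1]], dim=-1)
        return self.classifier(fused)

model = AVFusionModel()
logits = model(torch.randn(2, 100, 40), torch.randn(2, 25, 512))
print(logits.shape)   # torch.Size([2, 30])
```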
no code implementations • 1 Nov 2017 • Bert Moons, Koen Goetschalckx, Nick Van Berckelaer, Marian Verhelst
To this end, the energy consumption of inference is modeled for a generic hardware platform.
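A minimal sketch of such a first-order model, counting multiply-accumulate operations and memory accesses with placeholder per-operation energies (not values from the paper):

```python
# First-order energy model for DNN inference on a generic platform:
# E = (#MACs) * E_MAC + (#memory accesses) * E_access. The per-operation
# energies below are placeholders, not values from the paper.
def inference_energy(n_macs, n_mem_accesses, e_mac_pj=0.5, e_access_pj=10.0):
    return n_macs * e_mac_pj + n_mem_accesses * e_access_pj   # picojoules

# Example: a layer with 100M MACs and 5M memory accesses.
e_pj = inference_energy(n_macs=100e6, n_mem_accesses=5e6)
print(f"{e_pj / 1e6:.1f} uJ")   # 1 uJ = 1e6 pJ
```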
no code implementations • 22 Mar 2016 • Bert Moons, Bert de Brabandere, Luc van Gool, Marian Verhelst
Recently, ConvNets or convolutional neural networks (CNNs) have emerged as state-of-the-art classification and detection algorithms, achieving near-human performance in visual detection.