Search Results for author: Bernhard Klein

Found 5 papers, 1 paper with code

On the Non-Associativity of Analog Computations

no code implementations • 25 Sep 2023 • Lisa Kuhn, Bernhard Klein, Holger Fröning

With this model we assess the importance of ordering by comparing the test accuracy of a neural network for keyword spotting, trained either on an ordered model, on a non-ordered variant, or on real hardware.
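The ordering effect studied here has a familiar digital counterpart: finite-precision addition is itself non-associative, so the grouping of operands changes the result. A minimal illustration (my own, not the paper's code):

```python
# Finite-precision addition is non-associative: grouping changes the result.
a, b, c = 0.1, 0.2, 0.3
left = (a + b) + c    # 0.6000000000000001 in IEEE-754 double precision
right = a + (b + c)   # 0.6
print(left == right)  # False
```

In analog hardware the same principle applies, except the per-operation perturbation comes from device physics rather than rounding.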

Keyword Spotting

Walking Noise: Understanding Implications of Noisy Computations on Classification Tasks

no code implementations • 20 Dec 2022 • Hendrik Borras, Bernhard Klein, Holger Fröning

We then investigate the implications of additive and multiplicative noise for different classification tasks and model architectures, with and without batch normalization.
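As a rough sketch of the two noise models named in the abstract (additive vs. multiplicative Gaussian noise on activations), here is a hypothetical NumPy illustration; the function and parameter names are mine, not the authors' API:

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy(x, sigma=0.1, mode="additive"):
    """Inject Gaussian noise into a tensor of computed values.

    Illustrative sketch only: "additive" perturbs values by a fixed-scale
    noise term, "multiplicative" scales each value by (1 + noise), so the
    perturbation grows with the magnitude of the signal.
    """
    noise = rng.normal(0.0, sigma, size=x.shape)
    if mode == "additive":
        return x + noise            # x + n,  n ~ N(0, sigma^2)
    return x * (1.0 + noise)        # x * (1 + n)

x = np.ones(4)
print(noisy(x, 0.1, "additive"))
print(noisy(x, 0.1, "multiplicative"))
```

Note the qualitative difference: multiplicative noise vanishes on zero-valued activations, which is one reason the two models interact differently with batch normalization.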

Towards Hardware-Specific Automatic Compression of Neural Networks

no code implementations • 15 Dec 2022 • Torben Krieger, Bernhard Klein, Holger Fröning

Moreover, we can demonstrate that a joint search and compression using pruning and quantization is superior to an individual search for policies using a single compression method.
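A joint pruning-plus-quantization policy of the kind compared above can be sketched in a few lines; this is an illustrative NumPy toy (magnitude pruning, symmetric uniform quantization), not the paper's search method:

```python
import numpy as np

def prune(w, sparsity):
    # Magnitude pruning: zero out the weights with the smallest |w|.
    thresh = np.quantile(np.abs(w), sparsity)
    return np.where(np.abs(w) >= thresh, w, 0.0)

def quantize(w, bits):
    # Symmetric uniform quantization: map weights onto a signed integer grid.
    scale = np.max(np.abs(w)) / (2 ** (bits - 1) - 1)
    return w if scale == 0 else np.round(w / scale) * scale

def compress(w, sparsity, bits):
    # A joint policy applies both steps together; a search would tune
    # (sparsity, bits) per layer against a hardware cost model.
    return quantize(prune(w, sparsity), bits)

w = np.linspace(-1.0, 1.0, 101)
print(compress(w, sparsity=0.5, bits=4))
```

The point of a joint search is that the two knobs interact: heavy pruning changes the weight distribution that quantization must cover, so optimizing them separately leaves accuracy on the table.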

Quantization • reinforcement-learning +1

Understanding Cache Boundness of ML Operators on ARM Processors

1 code implementation • 1 Feb 2021 • Bernhard Klein, Christoph Gratl, Manfred Mücke, Holger Fröning

Machine learning compilers like TVM allow fast and flexible deployment on embedded CPUs.
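Cache boundness is commonly diagnosed via arithmetic intensity (FLOPs per byte moved); a back-of-the-envelope sketch for a GEMM, my own illustration rather than the paper's analysis:

```python
def gemm_arithmetic_intensity(m, n, k, dtype_bytes=4):
    """FLOPs per byte for C[m,n] = A[m,k] @ B[k,n], assuming each matrix
    crosses the memory hierarchy exactly once (ideal reuse inside cache)."""
    flops = 2 * m * n * k                               # one multiply + one add per MAC
    bytes_moved = dtype_bytes * (m * k + k * n + m * n) # read A, read B, write C
    return flops / bytes_moved

# Small, embedded-scale GEMMs have low intensity, i.e. they tend to be
# cache/memory bound rather than compute bound on ARM cores.
print(gemm_arithmetic_intensity(64, 64, 64))
```

When this ratio falls below the machine balance (peak FLOP/s divided by memory bandwidth), the operator is limited by data movement, not arithmetic.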

Quantization

Resource-Efficient Neural Networks for Embedded Systems

no code implementations • 7 Jan 2020 • Wolfgang Roth, Günther Schindler, Bernhard Klein, Robert Peharz, Sebastian Tschiatschek, Holger Fröning, Franz Pernkopf, Zoubin Ghahramani

While machine learning is traditionally a resource-intensive task, embedded systems, autonomous navigation, and the vision of the Internet of Things fuel interest in resource-efficient approaches.

Autonomous Navigation • BIG-bench Machine Learning +2
