Search Results for author: Lukas Enderich

Found 3 papers, 0 papers with code

Holistic Filter Pruning for Efficient Deep Neural Networks

no code implementations · 17 Sep 2020 · Lukas Enderich, Fabian Timm, Wolfram Burgard

Deep neural networks (DNNs) are usually over-parameterized to increase the likelihood of obtaining adequate initial weights from random initialization.
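The snippet above motivates pruning but does not detail the paper's holistic criterion. As a general illustration only, here is a minimal sketch of magnitude-based (L1-norm) filter pruning, a common baseline; the function name and keep-ratio parameter are illustrative assumptions, not the authors' method:

```python
import numpy as np

def prune_filters_l1(conv_weights, keep_ratio=0.5):
    """Rank conv filters by L1 norm and keep the strongest fraction.

    conv_weights: array of shape (out_channels, in_channels, kH, kW).
    Returns the pruned weight tensor and the sorted indices of kept filters.
    Note: this is a generic baseline, not the paper's holistic criterion.
    """
    norms = np.abs(conv_weights).reshape(conv_weights.shape[0], -1).sum(axis=1)
    n_keep = max(1, int(round(keep_ratio * conv_weights.shape[0])))
    kept = np.sort(np.argsort(norms)[::-1][:n_keep])
    return conv_weights[kept], kept
```

In practice, pruning whole filters (rather than individual weights) shrinks the actual tensor shapes, so the saved compute is realized without sparse-kernel support.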

SYMOG: learning symmetric mixture of Gaussian modes for improved fixed-point quantization

no code implementations · 19 Feb 2020 · Lukas Enderich, Fabian Timm, Wolfram Burgard

We propose SYMOG (symmetric mixture of Gaussian modes), which significantly decreases the complexity of DNNs through low-bit fixed-point quantization.

Quantization
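SYMOG's learned mixture-of-Gaussians modes are not reproduced here, but the target representation it trains toward, symmetric low-bit fixed-point weights, can be sketched with a simple max-scaled quantizer (the scale choice below is a common simplification, not the paper's approach):

```python
import numpy as np

def quantize_symmetric_fixed_point(w, bits=4):
    """Map float weights onto a symmetric low-bit fixed-point grid.

    Uses 2**(bits-1) - 1 levels per sign; the scale is taken from the
    max absolute weight -- a simple choice, not SYMOG's learned modes.
    Returns (dequantized weights, integer codes).
    """
    qmax = 2 ** (bits - 1) - 1          # e.g. 7 for 4-bit
    scale = np.max(np.abs(w)) / qmax
    q = np.clip(np.round(w / scale), -qmax, qmax)
    return q * scale, q.astype(int)
```

Symmetric grids are attractive for deployment because zero maps exactly to the code 0 and multiplication reduces to integer arithmetic plus one shared per-tensor scale.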

Learning Multimodal Fixed-Point Weights using Gradient Descent

no code implementations · 16 Jul 2019 · Lukas Enderich, Fabian Timm, Lars Rosenbaum, Wolfram Burgard

Due to their high computational complexity, deep neural networks are still limited to powerful processing units.

Quantization
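Learning fixed-point weights with gradient descent requires passing gradients through a non-differentiable rounding step. A standard way to do this (not necessarily the paper's exact scheme) is the straight-through estimator: quantize on the forward pass, but apply the gradient to the full-precision copy. A minimal single-step sketch, where `grad_fn` is an assumed callback returning dLoss/dW at the quantized weights:

```python
import numpy as np

def ste_step(w_float, grad_fn, lr=0.1, bits=2):
    """One gradient step on full-precision weights through a quantizer.

    Forward uses the quantized weights; backward passes the gradient
    straight through to the float copy (straight-through estimator).
    grad_fn(w_q) returns dLoss/dW evaluated at the quantized weights.
    """
    qmax = 2 ** (bits - 1) - 1
    scale = np.max(np.abs(w_float)) / qmax if np.any(w_float) else 1.0
    w_q = np.clip(np.round(w_float / scale), -qmax, qmax) * scale
    # STE: the gradient computed at w_q updates the float weights directly.
    return w_float - lr * grad_fn(w_q)
```

After training, only the quantized copy `w_q` is deployed; the float weights exist solely to accumulate small gradient updates across steps.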
