no code implementations • 17 Sep 2020 • Lukas Enderich, Fabian Timm, Wolfram Burgard
Deep neural networks (DNNs) are usually over-parameterized to increase the likelihood that random initialization yields adequate initial weights.
no code implementations • 19 Feb 2020 • Lukas Enderich, Fabian Timm, Wolfram Burgard
We propose SYMOG (symmetric mixture of Gaussian modes), which significantly decreases the complexity of DNNs through low-bit fixed-point quantization.
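The SYMOG training objective itself (modeling weight distributions as a symmetric mixture of Gaussian modes) is not detailed in this snippet. As a rough illustration of the low-bit fixed-point target format that such quantization methods aim for, here is a minimal NumPy sketch of symmetric uniform quantization; the function name `symmetric_fixed_point_quantize`, the 3-bit width, and the max-abs scaling rule are illustrative assumptions, not the paper's method.

```python
import numpy as np

def symmetric_fixed_point_quantize(weights, num_bits=3):
    """Map weights onto a symmetric low-bit fixed-point grid.

    Illustrative sketch, not the SYMOG algorithm: the grid is
    {-q_max, ..., -1, 0, 1, ..., q_max} * step, symmetric around zero.
    """
    # Largest representable integer of a symmetric signed grid.
    q_max = 2 ** (num_bits - 1) - 1
    # Step size chosen so the grid spans the full weight range
    # (an assumed scaling rule; methods differ here).
    step = np.max(np.abs(weights)) / q_max
    # Round to the nearest grid point, clip to the symmetric range.
    q = np.clip(np.round(weights / step), -q_max, q_max)
    return q * step, q.astype(np.int8), step

# Example: quantize a random weight matrix to 3-bit fixed point.
rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=(4, 4))
w_q, q_int, step = symmetric_fixed_point_quantize(w, num_bits=3)
print("max abs error:", np.max(np.abs(w - w_q)))
```

A symmetric grid of this kind is attractive for deployment because each weight reduces to a small signed integer plus one shared scale factor per tensor, which is cheap to store and to multiply on integer hardware.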
no code implementations • 16 Jul 2019 • Lukas Enderich, Fabian Timm, Lars Rosenbaum, Wolfram Burgard
Due to their high computational complexity, deep neural networks are still largely restricted to powerful processing units.