no code implementations • 16 Jan 2024 • Daniel Coquelin, Katharina Flügel, Marie Weiel, Nicholas Kiefer, Charlotte Debus, Achim Streit, Markus Götz
This study explores the learning dynamics of neural networks by analyzing the singular value decomposition (SVD) of their weights throughout training.
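The abstract describes tracking the SVD of network weights over the course of training. A minimal sketch of that idea, with an assumed layer shape and random stand-in updates (not the paper's actual training setup), might look like:

```python
import numpy as np

# Hedged sketch: track the singular-value spectrum of one weight matrix
# across simulated training steps. The layer shape and the random
# "updates" are illustrative assumptions, not the paper's setup.
rng = np.random.default_rng(0)
W = rng.standard_normal((64, 32))

spectra = []
for step in range(5):
    # stand-in for a gradient update
    W += 0.01 * rng.standard_normal(W.shape)
    # singular values only; the study analyzes the SVD of the weights
    s = np.linalg.svd(W, compute_uv=False)
    spectra.append(s)

spectra = np.array(spectra)  # shape: (steps, min(m, n))
print(spectra.shape)
```

Plotting each row of `spectra` over training steps would show how the spectrum evolves, which is the kind of dynamics such an analysis examines.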
1 code implementation • 26 Apr 2023 • Katharina Flügel, Daniel Coquelin, Marie Weiel, Charlotte Debus, Achim Streit, Markus Götz
Backpropagation has long been criticized for being biologically implausible, relying on concepts that are not viable in natural learning processes.
1 code implementation • 20 Jan 2023 • Oskar Taubert, Marie Weiel, Daniel Coquelin, Anis Farshian, Charlotte Debus, Alexander Schug, Achim Streit, Markus Götz
We present Propulate, an evolutionary optimization algorithm and software package for global optimization and, in particular, hyperparameter search.
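To illustrate the general idea of evolutionary hyperparameter search, here is a generic truncation-selection sketch on a toy objective; the function names, parameters, and mutation scheme are illustrative assumptions, not Propulate's actual API or algorithm:

```python
import random

def fitness(params):
    # Toy objective: minimize a quadratic centered on (lr=0.01, momentum=0.9).
    lr, momentum = params["lr"], params["momentum"]
    return (lr - 0.01) ** 2 + (momentum - 0.9) ** 2

def mutate(params, rng):
    # Gaussian perturbation, clipped to plausible ranges (illustrative choice).
    return {
        "lr": max(1e-5, params["lr"] + rng.gauss(0, 0.005)),
        "momentum": min(0.999, max(0.0, params["momentum"] + rng.gauss(0, 0.05))),
    }

def evolve(generations=30, pop_size=16, seed=0):
    rng = random.Random(seed)
    pop = [
        {"lr": rng.uniform(1e-4, 0.1), "momentum": rng.uniform(0.5, 0.99)}
        for _ in range(pop_size)
    ]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]  # truncation selection
        pop = survivors + [
            mutate(rng.choice(survivors), rng)
            for _ in range(pop_size - len(survivors))
        ]
    return min(pop, key=fitness)

best = evolve()
print(best)
```

In a real hyperparameter search, `fitness` would train and validate a model; Propulate additionally parallelizes this process across workers.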
1 code implementation • 14 Apr 2022 • Daniel Coquelin, Behnood Rasti, Markus Götz, Pedram Ghamisi, Richard Gloaguen, Achim Streit
Furthermore, we present a method for training DNNs to denoise HSIs that are not spatially related to the training dataset, i.e., training on ground-level HSIs to denoise HSIs from other perspectives, including airborne, drone-borne, and space-borne.
no code implementations • 12 Apr 2021 • Daniel Coquelin, Charlotte Debus, Markus Götz, Fabrice von der Lehr, James Kahn, Martin Siggel, Achim Streit
With increasing data and model complexities, the time required to train neural networks has become prohibitively large.
1 code implementation • 27 Jul 2020 • Markus Götz, Daniel Coquelin, Charlotte Debus, Kai Krajsek, Claudia Comito, Philipp Knechtges, Björn Hagemeier, Michael Tarnawa, Simon Hanselmann, Martin Siggel, Achim Basermann, Achim Streit
With HeAT, it is possible for a NumPy user to take full advantage of their available resources, significantly lowering the barrier to distributed data analysis.