no code implementations • 16 Jan 2024 • Daniel Coquelin, Katharina Flügel, Marie Weiel, Nicholas Kiefer, Charlotte Debus, Achim Streit, Markus Götz
This study explores the learning dynamics of neural networks by analyzing the singular value decomposition (SVD) of their weights throughout training.
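A minimal sketch of this kind of analysis (assumed for illustration, not the authors' code): one quantity such an SVD-based study tracks is the largest singular value of each weight matrix over training. The sketch below estimates it with pure-Python power iteration on `W^T W` instead of a full SVD.

```python
import math
import random

def top_singular_value(W, iters=100):
    """Estimate the largest singular value of W (a list of rows)
    via power iteration on W^T W; a full SVD would give the whole spectrum."""
    m, n = len(W), len(W[0])
    v = [random.random() + 0.1 for _ in range(n)]  # nonzero start vector
    for _ in range(iters):
        u = [sum(W[i][j] * v[j] for j in range(n)) for i in range(m)]  # u = W v
        w = [sum(W[i][j] * u[i] for i in range(m)) for j in range(n)]  # w = W^T u
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]                  # normalized right vector
    # sigma_max = ||W v|| once v approximates the top right singular vector
    u = [sum(W[i][j] * v[j] for j in range(n)) for i in range(m)]
    return math.sqrt(sum(x * x for x in u))
```

Logging this value per layer after each training step shows how the dominant directions of the weights evolve during learning.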
1 code implementation • 26 Apr 2023 • Katharina Flügel, Daniel Coquelin, Marie Weiel, Charlotte Debus, Achim Streit, Markus Götz
Backpropagation has long been criticized for being biologically implausible, relying on concepts that are not viable in natural learning processes.
1 code implementation • 20 Jan 2023 • Oskar Taubert, Marie Weiel, Daniel Coquelin, Anis Farshian, Charlotte Debus, Alexander Schug, Achim Streit, Markus Götz
We present Propulate, an evolutionary optimization algorithm and software package for global optimization, in particular hyperparameter search.
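As a hedged illustration of the underlying idea only — this is a generic mutation-and-selection loop, not the Propulate API — an evolutionary hyperparameter search can be sketched as:

```python
import random

def evolve(fitness, bounds, pop_size=20, generations=30, seed=0):
    """Toy evolutionary search: mutate real-valued hyperparameter vectors,
    keep the fitter half each generation. Illustrative only."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)                    # lower fitness = better
        parents = pop[: pop_size // 2]           # elitist selection
        children = []
        for p in parents:
            # Gaussian mutation, clipped back into the search bounds
            child = [min(max(x + rng.gauss(0, 0.1), lo), hi)
                     for x, (lo, hi) in zip(p, bounds)]
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

# e.g. minimize a toy "validation loss" over (learning_rate, dropout)
loss = lambda h: (h[0] - 0.01) ** 2 + (h[1] - 0.3) ** 2
best = evolve(loss, bounds=[(1e-4, 1.0), (0.0, 0.9)])
```

In a real setting the fitness function would train and validate a model for each candidate, which is why packages like Propulate parallelize these evaluations across compute nodes.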
no code implementations • 3 Dec 2022 • René Caspart, Sebastian Ziegler, Arvid Weyrauch, Holger Obermaier, Simon Raffeiner, Leon Pascal Schuhmacher, Jan Scholtyssek, Darya Trofimova, Marco Nolden, Ines Reinartz, Fabian Isensee, Markus Götz, Charlotte Debus
Therefore, accurate measurements of the power draw of AI workflows on different types of compute nodes are key to algorithmic improvements and the design of future compute clusters and hardware.
no code implementations • 12 Apr 2021 • Daniel Coquelin, Charlotte Debus, Markus Götz, Fabrice von der Lehr, James Kahn, Martin Siggel, Achim Streit
With increasing data and model complexities, the time required to train neural networks has become prohibitively large.
1 code implementation • 27 Jul 2020 • Markus Götz, Daniel Coquelin, Charlotte Debus, Kai Krajsek, Claudia Comito, Philipp Knechtges, Björn Hagemeier, Michael Tarnawa, Simon Hanselmann, Martin Siggel, Achim Basermann, Achim Streit
With HeAT, it is possible for a NumPy user to take full advantage of their available resources, significantly lowering the barrier to distributed data analysis.