1 code implementation • 20 Jun 2023 • Jiawei Zhao, Yifei Zhang, Beidi Chen, Florian Schäfer, Anima Anandkumar
To remedy this, we design a new training algorithm, Incremental Low-Rank Learning (InRank), which explicitly expresses cumulative weight updates as low-rank matrices while incrementally augmenting their ranks during training.
1 code implementation • 3 Apr 2023 • Yifan Chen, Houman Owhadi, Florian Schäfer
The primary goal of this paper is to provide a near-linear complexity algorithm for working with such kernel matrices.
1 code implementation • 23 Apr 2022 • Qi Zeng, Yash Kothari, Spencer H. Bryngelson, Florian Schäfer
Neural networks can be trained to solve partial differential equations (PDEs) by using the PDE residual as the loss function.
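A toy illustration of a PDE-residual loss for the 1D Poisson problem $u''(x) = f(x)$; this sketch uses a finite-difference residual on a grid rather than automatic differentiation through a network, so it only illustrates the loss, not the training method.

```python
import numpy as np

def pde_residual_loss(u, x, f):
    # mean squared residual of u'' - f on interior grid points,
    # using a second-order central difference for u''
    h = x[1] - x[0]
    u_xx = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / h**2
    return np.mean((u_xx - f(x[1:-1])) ** 2)

x = np.linspace(0.0, 1.0, 101)
f = lambda t: -np.pi**2 * np.sin(np.pi * t)   # u(x) = sin(pi x) solves u'' = f

loss_exact = pde_residual_loss(np.sin(np.pi * x), x, f)   # near zero
loss_wrong = pde_residual_loss(np.zeros_like(x), x, f)    # large
```

The residual loss is (near) zero at the true solution and large for a poor candidate, which is what makes it usable as a training objective.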
no code implementations • 16 Nov 2021 • Jeffrey Ma, Alistair Letcher, Florian Schäfer, Yuanyuan Shi, Anima Anandkumar
In this work we propose polymatrix competitive gradient descent (PCGD) as a method for solving general-sum competitive optimization problems involving an arbitrary number of agents.
1 code implementation • 25 Oct 2021 • Jiawei Zhao, Florian Schäfer, Anima Anandkumar
Deep neural networks are usually initialized with random weights, with adequately selected initial variance to ensure stable signal propagation during training.
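The role of the initial variance can be seen in a minimal sketch (my own toy setup, a deep linear stack without nonlinearities): scaling weights by $1/\sqrt{\text{fan-in}}$ keeps the signal variance roughly constant with depth, while a fixed smaller scale makes it vanish exponentially.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward_variance(init_std, depth=20, width=512):
    # propagate a random signal through a deep linear stack and
    # return its variance after `depth` layers at a given weight scale
    x = rng.standard_normal(width)
    for _ in range(depth):
        W = rng.standard_normal((width, width)) * init_std
        x = W @ x
    return x.var()

stable = forward_variance(1.0 / np.sqrt(512))  # variance stays O(1)
unstable = forward_variance(0.02)              # variance collapses
```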
3 code implementations • 17 Jun 2020 • Florian Schäfer, Anima Anandkumar, Houman Owhadi
Finally, we obtain the next iterate by following this direction according to the dual geometry induced by the Bregman potential.
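A standard instance of stepping "according to the dual geometry induced by the Bregman potential" is mirror descent with the negative-entropy potential on the probability simplex, where the dual-space gradient step becomes a multiplicative update. The sketch below is this textbook special case, not the paper's algorithm.

```python
import numpy as np

def mirror_descent_step(x, grad, eta):
    # negative-entropy Bregman potential: map to dual coordinates via log,
    # take a gradient step there, and map back via softmax
    y = np.log(x) - eta * grad
    z = np.exp(y - y.max())        # shift for numerical stability
    return z / z.sum()

# minimize the linear loss <c, x> over the simplex
c = np.array([3.0, 1.0, 2.0])
x = np.full(3, 1.0 / 3.0)
for _ in range(200):
    x = mirror_descent_step(x, c, eta=0.1)
# the iterate stays on the simplex and concentrates on the cheapest coordinate
```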
1 code implementation • 29 Apr 2020 • Florian Schäfer, Matthias Katzfuss, Houman Owhadi
We propose to compute a sparse approximate inverse Cholesky factor $L$ of a dense covariance matrix $\Theta$ by minimizing the Kullback-Leibler divergence between the Gaussian distributions $\mathcal{N}(0, \Theta)$ and $\mathcal{N}(0, L^{-\top} L^{-1})$, subject to a sparsity constraint.
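The objective in this abstract has a closed form. A minimal numpy sketch (ignoring the sparsity constraint and the elimination ordering that the paper's method relies on): for lower- or upper-triangular $L$, $\log\det(L^{-\top}L^{-1}) = -2\log\det L$, and the divergence vanishes exactly when $L^{-\top}L^{-1} = \Theta$.

```python
import numpy as np

def kl_gaussian(Theta, L):
    # KL( N(0, Theta) || N(0, L^{-T} L^{-1}) )
    #   = 0.5 * ( tr(L^T Theta L) - d - 2 log det L - log det Theta )
    d = Theta.shape[0]
    _, logdet_Theta = np.linalg.slogdet(Theta)
    _, logdet_L = np.linalg.slogdet(L)
    return 0.5 * (np.trace(L.T @ Theta @ L) - d - 2.0 * logdet_L - logdet_Theta)

rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))
Theta = B @ B.T + 5.0 * np.eye(5)     # a dense SPD covariance matrix
C = np.linalg.cholesky(Theta)
L_exact = np.linalg.inv(C).T          # satisfies L^{-T} L^{-1} = Theta
```

For the exact factor `kl_gaussian(Theta, L_exact)` is zero up to round-off, and any other choice of `L` gives a strictly positive divergence, which is what makes the KL objective a sensible fitting criterion.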
Numerical Analysis; Optimization and Control; Statistics Theory; Computation
3 code implementations • ICML 2020 • Florian Schäfer, Hongkai Zheng, Anima Anandkumar
We show that opponent-aware modelling of generator and discriminator, as present in competitive gradient descent (CGD), can significantly strengthen ICR and thus stabilize GAN training without explicit regularization.
8 code implementations • NeurIPS 2019 • Florian Schäfer, Anima Anandkumar
We introduce a new algorithm for the numerical computation of Nash equilibria of competitive two-player games.
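A sketch of the zero-sum special case of the competitive gradient descent update, written from the update equations as I recall them, on the bilinear game $f(x, y) = x^\top A y$ where $x$ minimizes and $y$ maximizes: each player's step accounts for the other's anticipated update through the mixed second derivative $A$.

```python
import numpy as np

def cgd_step(x, y, A, eta):
    # competitive gradient descent for the zero-sum game f(x, y) = x^T A y;
    # the matrix solves couple the two players' updates
    gx, gy = A @ y, A.T @ x                     # grad_x f, grad_y f
    n, m = len(x), len(y)
    dx = -eta * np.linalg.solve(np.eye(n) + eta**2 * A @ A.T, gx + eta * A @ gy)
    dy = eta * np.linalg.solve(np.eye(m) + eta**2 * A.T @ A, gy - eta * A.T @ gx)
    return x + dx, y + dy

A = np.eye(2)
x, y = np.ones(2), np.ones(2)
for _ in range(300):
    x, y = cgd_step(x, y, A, eta=0.2)
# the iterates contract toward the Nash equilibrium (0, 0), whereas plain
# simultaneous gradient descent-ascent spirals outward on this game
```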
3 code implementations • 20 Sep 2017 • Leon Thurner, Alexander Scheidler, Florian Schäfer, Jan-Hendrik Menke, Julian Dollichon, Friederike Meier, Steffen Meinecke, Martin Braun
pandapower is a Python-based, BSD-licensed power system analysis tool aimed at the automation of static and quasi-static analysis and optimization of balanced power systems.
Computational Engineering, Finance, and Science
1 code implementation • 7 Jun 2017 • Florian Schäfer, T. J. Sullivan, Houman Owhadi
This block-factorisation can provably be obtained in complexity $\mathcal{O} ( N \log( N ) \log^{d}( N /\epsilon) )$ in space and $\mathcal{O} ( N \log^{2}( N ) \log^{2d}( N /\epsilon) )$ in time.
Numerical Analysis; Computational Complexity; Data Structures and Algorithms; Probability. MSC: 65F30, 42C40, 65F50, 65N55, 65N75, 60G42, 68Q25, 68W40