no code implementations • 14 Nov 2022 • Ryan Abbott, Michael S. Albergo, Aleksandar Botev, Denis Boyda, Kyle Cranmer, Daniel C. Hackett, Alexander G. D. G. Matthews, Sébastien Racanière, Ali Razavi, Danilo J. Rezende, Fernando Romero-López, Phiala E. Shanahan, Julian M. Urban
Recent applications of machine-learned normalizing flows to sampling in lattice field theory suggest that such methods may be able to mitigate critical slowing down and topological freezing.
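To make the approach concrete, here is a minimal sketch of flow-based sampling in general, not the paper's specific architecture: one affine coupling layer maps base-distribution noise to a configuration, and the change-of-variables formula gives an exact log-density that can be used to reweight or Metropolis-correct proposals against a lattice action. The conditioner networks `scale_net`/`shift_net` are hypothetical stand-ins for trained models.

```python
import numpy as np

rng = np.random.default_rng(0)

def affine_coupling(z, scale_net, shift_net):
    """One affine coupling layer: transform half the sites conditioned
    on the other half; returns the output and log|det J|."""
    z1, z2 = z[: len(z) // 2], z[len(z) // 2 :]
    s, t = scale_net(z1), shift_net(z1)   # conditioner outputs
    x2 = z2 * np.exp(s) + t               # invertible affine map
    return np.concatenate([z1, x2]), np.sum(s)

# Hypothetical conditioners: fixed linear maps standing in for trained MLPs.
W_s, W_t = 0.1 * rng.standard_normal((2, 4, 4))
scale_net = lambda h: np.tanh(W_s @ h)
shift_net = lambda h: W_t @ h

def sample_with_logprob(dim=8):
    z = rng.standard_normal(dim)          # base distribution N(0, I)
    logq = -0.5 * np.sum(z**2) - 0.5 * dim * np.log(2 * np.pi)
    x, logdet = affine_coupling(z, scale_net, shift_net)
    return x, logq - logdet               # change-of-variables density

x, logq = sample_with_logprob()
# Against a target lattice action S(x), the importance weight is
# proportional to exp(-S(x)) / exp(logq).
```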
1 code implementation • ICLR 2022 • Guodong Zhang, Aleksandar Botev, James Martens
However, this method (called Deep Kernel Shaping) isn't fully compatible with ReLUs, and produces networks that overfit significantly more than ResNets on ImageNet.
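For reference, the kind of activation such kernel-shaping schemes can work with is a leaky ReLU with a tunable negative slope; a plain ReLU is the degenerate slope-zero case. Below is only an illustration of a variance-preserving leaky ReLU, with `alpha` treated as a free hyperparameter rather than the tailored value the paper derives.

```python
import numpy as np

def scaled_leaky_relu(x, alpha=0.5):
    """Leaky ReLU rescaled so E[f(x)^2] = 1 for x ~ N(0, 1):
    E[(max(x,0) + alpha*min(x,0))^2] = (1 + alpha^2) / 2.
    The slope alpha is a free knob here, not the paper's tailored choice."""
    return np.sqrt(2.0 / (1.0 + alpha**2)) * np.where(x > 0, x, alpha * x)
```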
1 code implementation • NeurIPS 2021 • Irina Higgins, Peter Wirnsberger, Andrew Jaegle, Aleksandar Botev
Using SyMetric, we identify a set of architectural choices that significantly improve the performance of a previously proposed model for inferring latent dynamics from pixels, the Hamiltonian Generative Network (HGN).
1 code implementation • 9 Nov 2021 • Aleksandar Botev, Andrew Jaegle, Peter Wirnsberger, Daniel Hennes, Irina Higgins
Learning dynamics is at the heart of many important applications of machine learning (ML), such as robotics and autonomous driving.
1 code implementation • 13 Nov 2020 • James S. Spencer, David Pfau, Aleksandar Botev, W. M. C. Foulkes
The Fermionic Neural Network (FermiNet) is a recently developed neural network architecture that can be used as a wavefunction Ansatz for many-electron systems, and has already demonstrated high accuracy on small systems.
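The core construction is a determinant of an orbital matrix, which makes the wavefunction antisymmetric under electron exchange as fermions require. Below is a toy numpy sketch with hypothetical Gaussian orbitals; FermiNet itself learns permutation-equivariant neural orbitals that depend on all electron positions.

```python
import numpy as np

def slater_logpsi(electrons, orbitals):
    """log |det Phi| for Phi[i, j] = orbital_i(electron_j).
    Swapping two electrons swaps two columns of Phi, flipping the
    sign of det Phi -- the antisymmetry a fermionic Ansatz needs."""
    phi = np.array([[orb(x) for x in electrons] for orb in orbitals])
    sign, logdet = np.linalg.slogdet(phi)
    return logdet

# Hypothetical single-particle orbitals, for illustration only.
orbitals = [
    lambda x: np.exp(-np.sum(x**2)),
    lambda x: x[0] * np.exp(-np.sum(x**2)),
]
electrons = np.random.default_rng(0).standard_normal((2, 3))  # 2 electrons in 3D
print(slater_logpsi(electrons, orbitals))
```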
1 code implementation • NeurIPS 2020 • David Pfau, Irina Higgins, Aleksandar Botev, Sébastien Racanière
We present a novel nonparametric algorithm for symmetry-based disentangling of data manifolds, the Geometric Manifold Component Estimator (GEOMANCER).
1 code implementation • ICLR 2020 • Peter Toth, Danilo Jimenez Rezende, Andrew Jaegle, Sébastien Racanière, Aleksandar Botev, Irina Higgins
The Hamiltonian formalism plays a central role in classical and quantum physics.
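Concretely, Hamiltonian dynamics evolve position q and momentum p under dq/dt = ∂H/∂p and dp/dt = -∂H/∂q. For a separable H(q, p) = p²/2m + V(q), the standard symplectic leapfrog integrator below conserves energy to second order in the step size; this is textbook machinery for context, not the HGN model itself.

```python
import numpy as np

def leapfrog(q, p, grad_V, mass=1.0, dt=0.01, steps=100):
    """Symplectic integration of dq/dt = p/m, dp/dt = -dV/dq."""
    p = p - 0.5 * dt * grad_V(q)          # half-step momentum kick
    for _ in range(steps - 1):
        q = q + dt * p / mass             # full-step position drift
        p = p - dt * grad_V(q)            # full-step momentum kick
    q = q + dt * p / mass
    p = p - 0.5 * dt * grad_V(q)          # final half-step kick
    return q, p

# Harmonic oscillator V(q) = q^2 / 2: total energy stays near 0.5.
q, p = leapfrog(np.array([1.0]), np.array([0.0]), grad_V=lambda q: q)
```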
no code implementations • NeurIPS 2018 • Hippolyt Ritter, Aleksandar Botev, David Barber
In order to make our method scalable, we leverage recent block-diagonal Kronecker factored approximations to the curvature.
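A rough sketch of the Kronecker-factored idea: a layer's curvature block is approximated as A ⊗ G, with A built from input activations and G from backpropagated output gradients, so a Laplace posterior over that layer's weight matrix can be sampled as a matrix normal without ever forming the full covariance. The toy factors and the damping `lam` below are placeholders for statistics accumulated during training.

```python
import numpy as np

def sample_kf_laplace(W_map, A, G, lam=1e-2, rng=None):
    """Sample W ~ N(W_map, (G + lam I)^-1 (x) (A + lam I)^-1), the
    Laplace posterior under a Kronecker-factored curvature block.
    A: (in, in) activation second moment; G: (out, out) gradient one."""
    rng = rng or np.random.default_rng()
    out_dim, in_dim = W_map.shape
    # Cholesky factors of the *inverse* factors give a matrix-normal sample.
    L_G = np.linalg.cholesky(np.linalg.inv(G + lam * np.eye(out_dim)))
    L_A = np.linalg.cholesky(np.linalg.inv(A + lam * np.eye(in_dim)))
    E = rng.standard_normal(W_map.shape)
    return W_map + L_G @ E @ L_A.T

# Toy factors standing in for curvature statistics from training.
rng = np.random.default_rng(0)
A = np.cov(rng.standard_normal((5, 200)))   # activation covariance (in=5)
G = np.cov(rng.standard_normal((3, 200)))   # output-gradient covariance (out=3)
W_sample = sample_kf_laplace(rng.standard_normal((3, 5)), A, G, rng=rng)
```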
1 code implementation • ICLR 2018 • Hippolyt Ritter, Aleksandar Botev, David Barber
PyTorch implementations of Bayes By Backprop, MC Dropout, SGLD, the Local Reparametrization Trick, KF-Laplace, and more
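Of the methods listed, MC Dropout is the simplest to illustrate: keep dropout stochastic at test time and average several forward passes to obtain a predictive mean and a spread that serves as an uncertainty estimate. A minimal sketch follows, not the repository's own code.

```python
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(10, 64), nn.ReLU(), nn.Dropout(p=0.5), nn.Linear(64, 1)
)

def mc_dropout_predict(model, x, n_samples=50):
    """Predictive mean/std by keeping dropout active at test time."""
    model.train()                  # leaves the Dropout layers stochastic
    with torch.no_grad():
        preds = torch.stack([model(x) for _ in range(n_samples)])
    return preds.mean(0), preds.std(0)

mean, std = mc_dropout_predict(model, torch.randn(8, 10))
```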
no code implementations • ICML 2017 • Aleksandar Botev, Hippolyt Ritter, David Barber
We present an efficient block-diagonal approximation to the Gauss-Newton matrix for feedforward neural networks.
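To make "block-diagonal Gauss-Newton" concrete: for a squared loss the Gauss-Newton matrix is JᵀJ, and the block-diagonal approximation keeps only within-layer blocks, so each layer's update solves a small damped system. The toy finite-difference Jacobian below is purely illustrative; the paper derives the blocks analytically via Kronecker-factored recursions.

```python
import numpy as np

def residuals(theta, x, t):
    """Tiny 2-layer MLP, r = f(x; theta) - t; layer slices are fixed."""
    W1 = theta[:6].reshape(3, 2)          # layer-1 weights
    W2 = theta[6:].reshape(1, 3)          # layer-2 weights
    return (W2 @ np.tanh(W1 @ x) - t).ravel()

def jacobian(f, theta, eps=1e-6):
    """Finite-difference Jacobian of the residuals w.r.t. parameters."""
    r0 = f(theta)
    J = np.zeros((r0.size, theta.size))
    for i in range(theta.size):
        d = np.zeros_like(theta)
        d[i] = eps
        J[:, i] = (f(theta + d) - r0) / eps
    return J

rng = np.random.default_rng(0)
theta = rng.standard_normal(9)
x, t = rng.standard_normal((2, 4)), rng.standard_normal((1, 4))

f = lambda th: residuals(th, x, t)
J, r = jacobian(f, theta), f(theta)
g = J.T @ r                               # gradient of 0.5 * ||r||^2
# Block-diagonal Gauss-Newton: solve each layer's block independently.
for sl in (slice(0, 6), slice(6, 9)):
    Jb = J[:, sl]
    Gb = Jb.T @ Jb + 1e-3 * np.eye(Jb.shape[1])  # damped layer block
    theta[sl] -= np.linalg.solve(Gb, g[sl])
```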
no code implementations • 7 Jul 2016 • Aleksandar Botev, Guy Lever, David Barber
We present a unifying framework for adapting the update direction in gradient-based iterative optimization methods.
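The common template such a framework generalizes is the preconditioned iteration θ ← θ − η B g: the matrix B shapes the update direction, with B = I recovering gradient descent and B = H⁻¹ recovering Newton's method. A minimal sketch on a quadratic, with B as a placeholder for whatever adaptation rule is plugged in:

```python
import numpy as np

# Quadratic objective f(theta) = 0.5 * theta^T H theta - b^T theta.
H = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -2.0])
grad = lambda th: H @ th - b

def optimize(B, lr, steps=200):
    """Generic preconditioned iteration: theta <- theta - lr * B @ grad."""
    theta = np.zeros(2)
    for _ in range(steps):
        theta -= lr * B @ grad(theta)
    return theta

sgd    = optimize(B=np.eye(2),        lr=0.1)   # B = I: gradient descent
newton = optimize(B=np.linalg.inv(H), lr=1.0)   # B = H^-1: Newton's method
# Both converge to the minimizer H^-1 b; Newton reaches it in one step.
```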
no code implementations • 22 Jun 2016 • David Barber, Aleksandar Botev
We consider training probabilistic classifiers in the case of a large number of classes.
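The bottleneck in this regime is the softmax normalizer, which sums over every class. One standard workaround, shown here for illustration and not necessarily this paper's estimator, is to approximate the log-partition using the target class plus a sampled subset of negatives, with a correction for the subsampling rate.

```python
import numpy as np

def sampled_softmax_loss(scores, target, n_neg=50, rng=None):
    """Approximate -log p(target) when scores covers many classes:
    estimate the log-partition from the target plus sampled negatives.
    A standard estimator for illustration, not this paper's method."""
    rng = rng or np.random.default_rng()
    C = scores.shape[0]
    neg = rng.choice(np.delete(np.arange(C), target), size=n_neg, replace=False)
    # Correct for subsampling: each negative stands for (C - 1) / n_neg classes.
    logZ = np.logaddexp(
        scores[target],
        np.log((C - 1) / n_neg) + np.logaddexp.reduce(scores[neg]),
    )
    return logZ - scores[target]

rng = np.random.default_rng(0)
scores = rng.standard_normal(100_000)     # logits over 100k classes
loss = sampled_softmax_loss(scores, target=42, rng=rng)
```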