2 code implementations • 31 Aug 2023 • David Pfau, Simon Axelrod, Halvard Sutterud, Ingrid von Glehn, James S. Spencer
We present a variational Monte Carlo algorithm for estimating the lowest excited states of a quantum system which is a natural generalization of the estimation of ground states.
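The excited-state method is described as a natural generalization of ground-state variational Monte Carlo, so it helps to see the ground-state baseline it generalizes. Below is a minimal VMC sketch (not the paper's method) for the 1D harmonic oscillator with a Gaussian trial wavefunction ψ(x) = exp(-αx²/2); the units ħ = m = ω = 1 and the Metropolis step size are illustrative assumptions.

```python
import numpy as np

def local_energy(x, alpha):
    # E_L = -(1/2) psi''/psi + V(x) for psi = exp(-alpha x^2 / 2), V(x) = x^2 / 2
    return alpha / 2 + 0.5 * x**2 * (1 - alpha**2)

def vmc_energy(alpha, n_steps=20000, step=1.0, seed=0):
    """Estimate <E> by Metropolis sampling of |psi|^2 (toy 1D sketch)."""
    rng = np.random.default_rng(seed)
    x = 0.0
    energies = []
    for i in range(n_steps):
        x_new = x + rng.uniform(-step, step)
        # accept with probability min(1, |psi(x_new)|^2 / |psi(x)|^2)
        if rng.random() < np.exp(-alpha * (x_new**2 - x**2)):
            x = x_new
        if i > n_steps // 10:  # discard burn-in samples
            energies.append(local_energy(x, alpha))
    return np.mean(energies)

# At alpha = 1 the trial wavefunction is exact, so the local energy is
# constant and the estimate equals the ground-state energy 1/2 exactly.
print(vmc_energy(1.0))  # → 0.5
```

The zero-variance property visible at α = 1 (the local energy is constant when the trial wavefunction is exact) is a standard feature of VMC that the excited-state generalization inherits.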
no code implementations • 11 May 2023 • Wan Tong Lou, Halvard Sutterud, Gino Cassella, W. M. C. Foulkes, Johannes Knolle, David Pfau, James S. Spencer
We demonstrate key limitations of the FermiNet Ansatz in studying the unitary Fermi gas and propose a simple modification based on the idea of an antisymmetric geminal power singlet (AGPs) wave function.
3 code implementations • 24 Nov 2022 • Ingrid von Glehn, James S. Spencer, David Pfau
In recent years, deep neural networks like the FermiNet and PauliNet have been used to significantly improve the accuracy of these first-principles calculations, but they lack an attention-like mechanism for gating interactions between electrons.
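The reason self-attention is a natural fit for electrons is that it is permutation-equivariant: relabeling the electrons relabels the outputs in the same way, which is exactly what a fermionic Ansatz needs before the final determinant. A minimal sketch (plain dot-product attention with no learned weights, purely for illustration):

```python
import numpy as np

def self_attention(X):
    # single-head dot-product self-attention over rows of X, no learned
    # projections -- just enough to exhibit permutation equivariance
    scores = X @ X.T / np.sqrt(X.shape[1])
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ X

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))      # 4 "electrons", 3 features each (toy sizes)
perm = rng.permutation(4)

# equivariance: attending then permuting equals permuting then attending
print(np.allclose(self_attention(X)[perm], self_attention(X[perm])))  # True
```

Adding learned query/key/value projections preserves this property, since they act on each electron's features independently.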
no code implementations • 26 Aug 2022 • Jan Hermann, James Spencer, Kenny Choo, Antonio Mezzacapo, W. M. C. Foulkes, David Pfau, Giuseppe Carleo, Frank Noé
Machine learning and specifically deep-learning methods have outperformed human capabilities in many pattern recognition and data processing problems, as well as in game playing, and they now also play an increasingly important role in scientific discovery.
no code implementations • 3 Dec 2020 • David Pfau, Danilo Rezende
This reverses the conventional task of normalizing flows -- rather than being given samples from an unknown target distribution and learning a flow that approximates the distribution, we are given a perturbation to an initial distribution and aim to reconstruct a flow that would generate samples from the known perturbed distribution.
2 code implementations • 13 Nov 2020 • James S. Spencer, David Pfau, Aleksandar Botev, W. M. C. Foulkes
The Fermionic Neural Network (FermiNet) is a recently developed neural network architecture that can be used as a wavefunction Ansatz for many-electron systems, and has already demonstrated high accuracy on small systems.
1 code implementation • NeurIPS 2020 • David Pfau, Irina Higgins, Aleksandar Botev, Sébastien Racanière
We present a novel nonparametric algorithm for symmetry-based disentangling of data manifolds, the Geometric Manifold Component Estimator (GEOMANCER).
1 code implementation • 5 Sep 2019 • David Pfau, James S. Spencer, Alexander G. de G. Matthews, W. M. C. Foulkes
Here we introduce a novel deep learning architecture, the Fermionic Neural Network, as a powerful wavefunction Ansatz for many-electron systems.
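The structural core of the FermiNet is a determinant of (generalized) orbitals, which guarantees the fermionic antisymmetry of the wavefunction. The sketch below shows only that property for a plain Slater determinant with two made-up single-particle orbitals; in the FermiNet itself the orbitals are permutation-equivariant neural networks of all electron positions.

```python
import numpy as np

def slater_wavefunction(orbitals, positions):
    # Psi(x_1, ..., x_n) = det[ phi_i(x_j) ]; the determinant makes Psi
    # antisymmetric under exchange of any two particle positions
    mat = np.array([[phi(x) for x in positions] for phi in orbitals])
    return np.linalg.det(mat)

# hypothetical 1D single-particle orbitals, for illustration only
orbitals = [lambda x: np.exp(-x**2), lambda x: x * np.exp(-x**2)]
r = [0.3, -0.7]
r_swapped = [-0.7, 0.3]

psi = slater_wavefunction(orbitals, r)
psi_swapped = slater_wavefunction(orbitals, r_swapped)
print(np.isclose(psi_swapped, -psi))  # True: exchanging electrons flips the sign
```

Swapping two positions swaps two columns of the orbital matrix, which flips the sign of the determinant; this holds regardless of how expressive the orbitals are.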
1 code implementation • 5 Dec 2018 • Irina Higgins, David Amos, David Pfau, Sebastien Racaniere, Loic Matthey, Danilo Rezende, Alexander Lerchner
Here we propose that a principled solution to characterising disentangled representations can be found by focusing on the transformation properties of the world.
2 code implementations • ICLR 2019 • David Pfau, Stig Petersen, Ashish Agarwal, David G. T. Barrett, Kimberly L. Stachenfeld
We present Spectral Inference Networks, a framework for learning eigenfunctions of linear operators by stochastic optimization.
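A minimal finite-dimensional analogue of learning eigenfunctions by stochastic optimization is Oja's rule, which estimates the top eigenvector of a covariance operator from streaming samples. This is only an analogue, not Spectral Inference Networks themselves, which handle general linear operators and multiple eigenfunctions parameterized by neural networks; the matrix, learning rate, and step count below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.diag([3.0, 1.0, 0.5])   # covariance operator; top eigenvector is e_1
w = rng.normal(size=3)
w /= np.linalg.norm(w)

for t in range(2000):
    # the operator is only accessed stochastically, through samples x ~ N(0, A)
    x = rng.multivariate_normal(np.zeros(3), A)
    y = w @ x
    w += 0.01 * y * (x - y * w)    # Oja's stochastic update
    w /= np.linalg.norm(w)         # renormalize for numerical stability

print(np.abs(w))  # close to [1, 0, 0], the top eigenvector of A
```

The key shared idea is that an eigenproblem is recast as stochastic optimization over samples, so it scales to operators that are never formed explicitly.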
9 code implementations • 7 Nov 2016 • Luke Metz, Ben Poole, David Pfau, Jascha Sohl-Dickstein
We introduce a method to stabilize Generative Adversarial Networks (GANs) by defining the generator objective with respect to an unrolled optimization of the discriminator.
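The effect of unrolling can be seen on a scalar bilinear toy game L(θ, φ) = θφ (a Dirac-GAN-style illustration, not the paper's full method): simultaneous gradient descent/ascent spirals outward, while differentiating the generator loss through one unrolled discriminator step adds a damping term that pulls the game toward its equilibrium. The unrolled gradient is computed analytically here rather than by backprop, and the step sizes are illustrative.

```python
import numpy as np

def simultaneous(theta, phi, h=0.1, steps=200):
    # plain alternating GAN-style updates on L(theta, phi) = theta * phi
    for _ in range(steps):
        theta, phi = theta - h * phi, phi + h * theta
    return theta, phi

def unrolled(theta, phi, h=0.1, eta=0.5, steps=200):
    # generator differentiates through one discriminator step:
    # phi_unrolled = phi + eta * theta, so dL/dtheta = phi + 2 * eta * theta
    for _ in range(steps):
        theta, phi = theta - h * (phi + 2 * eta * theta), phi + h * theta
    return theta, phi

print(np.hypot(*simultaneous(1.0, 1.0)))  # norm grows: updates spiral outward
print(np.hypot(*unrolled(1.0, 1.0)))      # norm shrinks toward equilibrium at 0
```

The extra term 2ηθ in the unrolled generator gradient is precisely the signal about how the discriminator will react, which plain alternating updates ignore.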
no code implementations • 6 Oct 2016 • David Pfau, Oriol Vinyals
Both generative adversarial networks (GAN) in unsupervised learning and actor-critic methods in reinforcement learning (RL) have gained a reputation for being difficult to optimize.
8 code implementations • NeurIPS 2016 • Marcin Andrychowicz, Misha Denil, Sergio Gomez, Matthew W. Hoffman, David Pfau, Tom Schaul, Brendan Shillingford, Nando de Freitas
The move from hand-designed features to learned features in machine learning has been wildly successful.
no code implementations • 8 Jun 2016 • Chrisantha Fernando, Dylan Banarse, Malcolm Reynolds, Frederic Besse, David Pfau, Max Jaderberg, Marc Lanctot, Daan Wierstra
In this work we introduce a differentiable version of the Compositional Pattern Producing Network, called the DPPN.
11 code implementations • 9 Sep 2014 • Eftychios A. Pnevmatikakis, Yuanjun Gao, Daniel Soudry, David Pfau, Clay Lacefield, Kira Poskanzer, Randy Bruno, Rafael Yuste, Liam Paninski
We present a structured matrix factorization approach to analyzing calcium imaging recordings of large neuronal ensembles.
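At its core, this decomposes the movie into nonnegative spatial footprints times nonnegative temporal traces. The sketch below runs plain Lee–Seung multiplicative-update NMF on synthetic noise-free data; the paper's constrained method additionally imposes calcium-indicator dynamics and sparsity, which are omitted here. All sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic data: footprints A_true (pixels x neurons), traces C_true (neurons x frames)
A_true = rng.random((50, 3))
C_true = rng.random((3, 100))
Y = A_true @ C_true          # idealized, noise-free movie

# Lee-Seung multiplicative updates for Y ~ A C with A, C >= 0
A = rng.random((50, 3)) + 0.1
C = rng.random((3, 100)) + 0.1
for _ in range(300):
    C *= (A.T @ Y) / (A.T @ A @ C + 1e-12)
    A *= (Y @ C.T) / (A @ C @ C.T + 1e-12)

err = np.linalg.norm(Y - A @ C) / np.linalg.norm(Y)
print(err)  # relative reconstruction error, small after convergence
```

Multiplicative updates keep the factors nonnegative automatically, which is why they are a common starting point for this kind of structured factorization.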
Neurons and Cognition • Quantitative Methods • Applications
no code implementations • NeurIPS 2013 • David Pfau, Eftychios A. Pnevmatikakis, Liam Paninski
We show on model data that the parameters of latent linear dynamical systems can be recovered, and that even if the dynamics are not stationary we can still recover the true latent subspace.
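A simplified, stationary-Gaussian analogue of subspace recovery: simulate a latent linear dynamical system, observe it through a matrix C plus isotropic noise, and check that the top principal directions of the observations span col(C). The paper's setting (spiking observations, nonstationary dynamics) is harder; this sketch only shows the geometric idea, with all dimensions and noise levels chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
d_latent, d_obs, T = 2, 10, 5000
A = np.array([[0.9, 0.1], [-0.1, 0.9]])   # stable latent dynamics
C = rng.normal(size=(d_obs, d_latent))    # observation matrix

x = np.zeros(d_latent)
Y = np.empty((T, d_obs))
for t in range(T):
    x = A @ x + rng.normal(scale=0.5, size=d_latent)
    Y[t] = C @ x + rng.normal(scale=0.05, size=d_obs)

# with isotropic observation noise, the top principal directions of the
# observations span the latent subspace col(C)
U = np.linalg.svd(Y - Y.mean(0), full_matrices=False)[2][:d_latent].T
# principal angles between estimated and true subspaces (near zero on success)
Qc = np.linalg.qr(C)[0]
Qu = np.linalg.qr(U)[0]
angles = np.arccos(np.clip(np.linalg.svd(Qc.T @ Qu)[1], -1.0, 1.0))
print(angles)  # both angles close to 0
```

Measuring recovery with principal angles, as above, is the standard way to compare an estimated subspace to the true one independent of basis.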
no code implementations • NeurIPS 2010 • David Pfau, Nicholas Bartlett, Frank Wood
We suggest that our method for averaging over PDFAs is a novel approach to predictive distribution smoothing.