Search Results for author: Andres Potapczynski

Found 8 papers, 8 papers with code

CoLA: Exploiting Compositional Structure for Automatic and Efficient Numerical Linear Algebra

1 code implementation • NeurIPS 2023 • Andres Potapczynski, Marc Finzi, Geoff Pleiss, Andrew Gordon Wilson

In this paper, we propose a simple but general framework for large-scale linear algebra problems in machine learning, named CoLA (Compositional Linear Algebra).

CoLA • Gaussian Processes
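The flavor of what such a framework dispatches on can be shown without the library itself. Below is a minimal NumPy sketch (not the CoLA API; the sizes and the vec-trick identity are purely illustrative) of how Kronecker structure turns one large solve into two small ones:

import numpy as np

rng = np.random.default_rng(0)
n, m = 30, 40
A = rng.standard_normal((n, n)) + n * np.eye(n)   # well-conditioned factors
B = rng.standard_normal((m, m)) + m * np.eye(m)
y = rng.standard_normal(n * m)

# Dense route: materialize the (n*m) x (n*m) matrix, O((nm)^3) solve.
x_dense = np.linalg.solve(np.kron(A, B), y)

# Structured route: (A kron B) vec(X) = vec(B X A^T) with column-major
# vec, so the big solve reduces to two small ones, O(n^3 + m^3).
Y = y.reshape(m, n, order="F")            # unvec: Y is m x n
X = np.linalg.solve(B, Y)                 # B^{-1} Y
X = np.linalg.solve(A, X.T).T             # (B^{-1} Y) A^{-T}
x_fast = X.reshape(-1, order="F")

assert np.allclose(x_dense, x_fast)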

Simple and Fast Group Robustness by Automatic Feature Reweighting

1 code implementation • 19 Jun 2023 • Shikai Qiu, Andres Potapczynski, Pavel Izmailov, Andrew Gordon Wilson

A major challenge to out-of-distribution generalization is reliance on spurious features -- patterns that are predictive of the class label in the training data distribution, but not causally related to the target.

Out-of-Distribution Generalization
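A heavily simplified sketch of the two-stage reweighting idea follows (the weighting function and training protocol here are illustrative assumptions, not the paper's exact AFR procedure): train a standard ERM model, then refit only the last layer with per-example weights that emphasize points the ERM model predicts poorly, since those are where spurious features fail.

import torch

# Illustrative two-stage reweighting, NOT the exact AFR algorithm.
# features: penultimate-layer embeddings from a standard ERM model.
# p_correct: the ERM model's probability of the true class per example.
def refit_last_layer(features, labels, p_correct, gamma=2.0,
                     steps=500, lr=1e-2):
    w = torch.exp(-gamma * p_correct)        # upweight hard examples
    w = w / w.sum()
    head = torch.nn.Linear(features.shape[1], int(labels.max()) + 1)
    opt = torch.optim.SGD(head.parameters(), lr=lr)
    for _ in range(steps):
        losses = torch.nn.functional.cross_entropy(
            head(features), labels, reduction="none")
        loss = (w * losses).sum()            # weighted ERM on the head
        opt.zero_grad(); loss.backward(); opt.step()
    return head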

A Stable and Scalable Method for Solving Initial Value PDEs with Neural Networks

1 code implementation • 28 Apr 2023 • Marc Finzi, Andres Potapczynski, Matthew Choptuik, Andrew Gordon Wilson

Unlike conventional grid and mesh based methods for solving partial differential equations (PDEs), neural networks have the potential to break the curse of dimensionality, providing approximate solutions to problems where using classical solvers is difficult or impossible.
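The general appeal is visible in a generic PINN-style sketch (a residual loss for the 1-D heat equation u_t = u_xx; this is the conventional baseline being improved upon, not this paper's method): the network can be queried at arbitrary points rather than storing values on a mesh.

import torch

net = torch.nn.Sequential(
    torch.nn.Linear(2, 64), torch.nn.Tanh(), torch.nn.Linear(64, 1))

def residual(xt):
    # PDE residual u_t - u_xx at collocation points xt = (x, t).
    xt = xt.requires_grad_(True)
    u = net(xt)
    du = torch.autograd.grad(u.sum(), xt, create_graph=True)[0]
    u_x, u_t = du[:, 0], du[:, 1]
    u_xx = torch.autograd.grad(u_x.sum(), xt, create_graph=True)[0][:, 0]
    return u_t - u_xx

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(100):
    xt = torch.rand(256, 2)               # random collocation points
    loss = residual(xt).pow(2).mean()     # plus IC/BC terms in practice
    opt.zero_grad(); loss.backward(); opt.step()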

PAC-Bayes Compression Bounds So Tight That They Can Explain Generalization

1 code implementation • 24 Nov 2022 • Sanae Lotfi, Marc Finzi, Sanyam Kapoor, Andres Potapczynski, Micah Goldblum, Andrew Gordon Wilson

While there has been progress in developing non-vacuous generalization bounds for deep neural networks, these bounds tend to be uninformative about why deep learning works.

Generalization Bounds • Transfer Learning
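As background, a standard McAllester-style PAC-Bayes bound (the generic form, not the paper's tighter compression-based variant): with probability at least 1 - δ over n i.i.d. training samples, for all posteriors Q over hypotheses,

\[
R(Q) \;\le\; \widehat{R}(Q) + \sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln\!\left(2\sqrt{n}/\delta\right)}{2n}},
\]

where P is a data-independent prior. Compression-based bounds make the KL term small by restricting attention to hypotheses with short description length, which is what lets them become non-vacuous for deep networks.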

Low-Precision Arithmetic for Fast Gaussian Processes

1 code implementation • 14 Jul 2022 • Wesley J. Maddox, Andres Potapczynski, Andrew Gordon Wilson

Low-precision arithmetic has had a transformative effect on the training of neural networks, reducing computation, memory and energy requirements.

Gaussian Processes
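A toy illustration of the storage side of the trade (the paper's contribution concerns keeping iterative solvers such as conjugate gradients stable at low precision; this only shows the memory saved and the rounding error introduced):

import torch

x = torch.randn(4096, 8)
K = torch.exp(-0.5 * torch.cdist(x, x).pow(2))   # float32 RBF kernel
K_half = K.half()                                # float16 storage

# 4096 x 4096 in float32 is ~67 MB; float16 halves it to ~34 MB.
print(K.element_size() * K.nelement())
print(K_half.element_size() * K_half.nelement())

# Rounding error from the cast, which a low-precision GP pipeline
# must keep under control inside its iterative solves:
print((K_half.float() - K).abs().max())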

On the Normalizing Constant of the Continuous Categorical Distribution

2 code implementations • 28 Apr 2022 • Elliott Gordon-Rodriguez, Gabriel Loaiza-Ganem, Andres Potapczynski, John P. Cunningham

The continuous categorical family enjoys remarkable mathematical simplicity; its density function resembles that of the Dirichlet distribution, but with a normalizing constant that can be written in closed form using elementary functions only.
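For reference, the continuous categorical density over the simplex has the kernel

\[
p(x;\lambda) \;\propto\; \prod_{i=1}^{K} \lambda_i^{x_i},
\qquad x \in \Delta^{K-1},
\]

which mirrors the Dirichlet kernel \(\prod_i x_i^{\alpha_i - 1}\) with the roles of parameters and arguments exchanged; this paper studies the elementary closed form of the resulting normalizing constant C(λ).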

Bias-Free Scalable Gaussian Processes via Randomized Truncations

1 code implementation • 12 Feb 2021 • Andres Potapczynski, Luhuan Wu, Dan Biderman, Geoff Pleiss, John P. Cunningham

In the case of random Fourier features (RFF), we show that the bias-to-variance conversion is indeed a trade-off: the additional variance proves detrimental to optimization.

Gaussian Processes
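The debiasing device behind such randomized truncations can be sketched generically: truncate an infinite series at a random point and reweight the surviving terms by their inclusion probabilities (a Russian-roulette estimator), so the truncation bias cancels in expectation at the cost of variance. A minimal sketch with a hypothetical geometric stopping law:

import numpy as np

rng = np.random.default_rng(0)

def rr_estimate(delta, q=0.7, max_terms=1000):
    # Unbiased estimate of S = sum_n delta(n): term n survives with
    # probability q**n, so it is reweighted by 1 / q**n.
    est, n = 0.0, 0
    while n < max_terms:
        est += delta(n) / q**n
        if rng.random() > q:      # stop with probability 1 - q
            break
        n += 1
    return est

# Example: the geometric series sum_n 0.5**n = 2.
delta = lambda n: 0.5**n
print(np.mean([rr_estimate(delta) for _ in range(20000)]))  # ~2.0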

Invertible Gaussian Reparameterization: Revisiting the Gumbel-Softmax

1 code implementation • NeurIPS 2020 • Andres Potapczynski, Gabriel Loaiza-Ganem, John P. Cunningham

The Gumbel-Softmax is a continuous distribution over the simplex that is often used as a relaxation of discrete distributions.
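For background, the standard Gumbel-Softmax sampler that the paper revisits (the proposed invertible Gaussian reparameterization instead pushes Gaussian noise through an invertible map onto the simplex; below is only the baseline):

import torch

def gumbel_softmax(logits, tau=0.5):
    # Sample Gumbel(0, 1) noise via inverse transform, then relax the
    # argmax with a temperature-controlled softmax.
    u = torch.rand_like(logits).clamp(1e-9, 1 - 1e-9)
    g = -torch.log(-torch.log(u))
    return torch.softmax((logits + g) / tau, dim=-1)

probs = gumbel_softmax(torch.tensor([1.0, 0.5, -0.2]))
print(probs, probs.sum())   # a point on the simplex, differentiable in logits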
