no code implementations • 13 Sep 2015 • Andrew B. Berger, Mayur Mudigonda, Michael R. DeWeese, Jascha Sohl-Dickstein
In most sampling algorithms, including Hamiltonian Monte Carlo, transition rates between states correspond to the probability of making a transition in a single time step, and are constrained to be less than or equal to 1.
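The cap the abstract refers to can be seen in the standard Metropolis acceptance rule, sketched below. This is a generic illustration, not the transition operator developed in the paper:

```python
import math

def metropolis_acceptance(log_p_current, log_p_proposed):
    # Probability of accepting a move from the current state to the
    # proposed state: min(1, p_proposed / p_current).
    # The min(1, .) is the "less than or equal to 1" constraint.
    return min(1.0, math.exp(log_p_proposed - log_p_current))

# Uphill moves are always accepted: the rate saturates at 1.
assert metropolis_acceptance(-2.0, -1.0) == 1.0
# Downhill moves are accepted with probability p_proposed / p_current < 1.
assert abs(metropolis_acceptance(-1.0, -2.0) - math.exp(-1.0)) < 1e-12
```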
no code implementations • 18 Apr 2015 • Sarah E. Marzen, Michael R. DeWeese, James P. Crutchfield
A first step towards that larger goal is to develop information measures for individual output processes, including information generation (entropy rate), stored information (statistical complexity), predictable information (excess entropy), and active information accumulation (bound information rate).
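Of the measures listed, the entropy rate is the simplest to compute for a concrete process. A minimal sketch for a stationary Markov chain (an illustrative special case, not the general output processes treated in the paper):

```python
import numpy as np

def entropy_rate(P):
    # Entropy rate (bits per symbol) of a stationary Markov chain with
    # row-stochastic transition matrix P:
    #   H = -sum_i pi_i sum_j P_ij log2 P_ij
    # where pi is the stationary distribution (left eigenvector of P).
    evals, evecs = np.linalg.eig(P.T)
    pi = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
    pi = pi / pi.sum()
    logP = np.zeros_like(P)
    mask = P > 0
    logP[mask] = np.log2(P[mask])
    return float(-np.sum(pi[:, None] * P * logP))

# A fair coin (i.i.d. uniform bits) generates exactly 1 bit per symbol.
P_fair = np.array([[0.5, 0.5], [0.5, 0.5]])
assert abs(entropy_rate(P_fair) - 1.0) < 1e-12
```

The golden-mean process (no two consecutive 1s) gives 2/3 bit per symbol, a standard check that the stationary distribution is being weighted correctly.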
no code implementations • 29 Jan 2019 • Charles G. Frye, Neha S. Wadia, Michael R. DeWeese, Kristofer E. Bouchard
Numerically locating the critical points of non-convex surfaces is a long-standing problem central to many fields.
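One standard way to pose this problem is to minimize the squared gradient norm, whose zeros are exactly the critical points. A minimal sketch on a toy saddle (the function and step sizes here are illustrative, not the method or benchmarks of the paper):

```python
import numpy as np

def grad_f(z):
    # Gradient of the saddle f(x, y) = x^2 - y^2. Its only critical point
    # is the origin, which gradient descent on f itself would escape.
    x, y = z
    return np.array([2 * x, -2 * y])

def find_critical_point(z0, lr=0.05, steps=500):
    # Gradient descent on g(z) = ||grad f(z)||^2, using
    # grad g = 2 J_f^T grad f; for this f, J_f = diag(2, -2).
    z = np.array(z0, dtype=float)
    for _ in range(steps):
        gx, gy = grad_f(z)
        z -= lr * np.array([2 * 2 * gx, 2 * (-2) * gy])
    return z

z_star = find_critical_point([1.0, -0.7])
assert np.linalg.norm(grad_f(z_star)) < 1e-6  # a critical point of f
```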
no code implementations • 23 Mar 2020 • Charles G. Frye, James Simon, Neha S. Wadia, Andrew Ligeralde, Michael R. DeWeese, Kristofer E. Bouchard
Despite the fact that the loss functions of deep neural networks are highly non-convex, gradient-based optimization algorithms converge to approximately the same performance from many random initial points.
no code implementations • 15 Dec 2020 • Adam G. Frim, Adrianne Zhong, Shi-Fan Chen, Dibyendu Mandal, Michael R. DeWeese
Engineered swift equilibration (ESE) is a class of driving protocols that enforce an equilibrium distribution with respect to external control parameters at the beginning and end of rapid state transformations of open, classical non-equilibrium systems.
2 code implementations • 6 Jun 2021 • James B. Simon, Sajant Anand, Michael R. DeWeese
The development of methods to guide the design of neural networks is an important open challenge for deep learning theory.
1 code implementation • 13 Dec 2019 • Michael Y. -S. Fang, Sasikanth Manipatruni, Casimir Wierzynski, Amir Khosrowshahi, Michael R. DeWeese
To aid the design of scalable, fault-resistant optical neural networks (ONNs), we investigate the effects that architectural choices have on an ONN's robustness to imprecise components.
2 code implementations • 18 Sep 2014 • Jascha Sohl-Dickstein, Mayur Mudigonda, Michael R. DeWeese
We present a method for performing Hamiltonian Monte Carlo that largely eliminates sample rejection for typical hyperparameters.
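For context, the standard HMC transition that the method modifies is sketched below: a leapfrog trajectory followed by a Metropolis accept/reject step, which is the source of the rejections the paper aims to eliminate. The target (a standard normal) and hyperparameters are illustrative:

```python
import numpy as np

def leapfrog(q, p, grad_U, eps, n_steps):
    # Standard leapfrog integrator for Hamiltonian dynamics.
    p = p - 0.5 * eps * grad_U(q)
    for _ in range(n_steps - 1):
        q = q + eps * p
        p = p - eps * grad_U(q)
    q = q + eps * p
    p = p - 0.5 * eps * grad_U(q)
    return q, p

def hmc_step(q, U, grad_U, rng, eps=0.1, n_steps=20):
    # One vanilla HMC transition. Integration error makes dH nonzero,
    # so some proposals are rejected; the paper reworks this step.
    p = rng.standard_normal(q.shape)
    q_new, p_new = leapfrog(q, p, grad_U, eps, n_steps)
    dH = (U(q_new) + 0.5 * p_new @ p_new) - (U(q) + 0.5 * p @ p)
    if rng.random() < np.exp(-dH):
        return q_new
    return q  # rejected: the chain stays put for this step

# Sample from a standard normal: U(q) = q^2 / 2, grad U(q) = q.
U = lambda q: 0.5 * float(q @ q)
grad_U = lambda q: q
rng = np.random.default_rng(0)
q = np.zeros(1)
samples = []
for _ in range(2000):
    q = hmc_step(q, U, grad_U, rng)
    samples.append(q[0])
```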
1 code implementation • 8 Oct 2021 • James B. Simon, Madeline Dickens, Dhruva Karkada, Michael R. DeWeese
We derive simple closed-form estimates for the test risk and other generalization metrics of kernel ridge regression (KRR).
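The KRR predictor those estimates apply to has a simple closed form, sketched below. The RBF kernel, ridge parameter, and test task are illustrative choices, not the paper's estimator or experiments:

```python
import numpy as np

def krr_fit_predict(X_train, y_train, X_test, lam=1e-3, lengthscale=1.0):
    # Kernel ridge regression: f(x) = k(x, X) (K + lam I)^{-1} y
    def rbf(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / lengthscale**2)
    K = rbf(X_train, X_train)
    alpha = np.linalg.solve(K + lam * np.eye(len(X_train)), y_train)
    return rbf(X_test, X_train) @ alpha

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(40, 1))
y = np.sin(X[:, 0])
X_test = np.linspace(-2, 2, 50)[:, None]
pred = krr_fit_predict(X, y, X_test)
# Empirical test risk: the quantity the paper's closed forms estimate.
test_risk = np.mean((pred - np.sin(X_test[:, 0])) ** 2)
```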
1 code implementation • 17 Jul 2020 • Jascha Sohl-Dickstein, Peter Battaglino, Michael R. DeWeese
Fitting probabilistic models to data is often difficult, due to the general intractability of the partition function.
1 code implementation • 25 Jun 2009 • Jascha Sohl-Dickstein, Peter Battaglino, Michael R. DeWeese
Fitting probabilistic models to data is often difficult, due to the general intractability of the partition function and its derivatives.
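The intractability is concrete: even evaluating the partition function of a small binary model requires summing over all 2^N states. A brute-force sketch for tiny N (the model family here is a generic Ising-like example, not the paper's method, which avoids computing Z altogether):

```python
import itertools
import numpy as np

def log_partition(J, b):
    # Exact log partition function of p(s) ∝ exp(s^T J s / 2 + b^T s),
    # s in {-1, +1}^N. Cost is O(2^N): this enumeration is exactly
    # what becomes intractable for large N.
    energies = []
    for s in itertools.product([-1, 1], repeat=len(b)):
        s = np.array(s, dtype=float)
        energies.append(0.5 * s @ J @ s + b @ s)
    m = max(energies)  # log-sum-exp trick for numerical stability
    return m + np.log(sum(np.exp(e - m) for e in energies))

# Sanity check: with independent spins (J = 0), Z factorizes into
# prod_i 2 cosh(b_i).
b = np.array([0.3, -0.5, 1.0])
J = np.zeros((3, 3))
expected = np.sum(np.log(2 * np.cosh(b)))
assert abs(log_partition(J, b) - expected) < 1e-12
```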