no code implementations • 30 Sep 2019 • Danilo Jimenez Rezende, Sébastien Racanière, Irina Higgins, Peter Toth
This paper introduces Equivariant Hamiltonian Flows, a method for learning expressive densities that are invariant with respect to a known Lie algebra of local symmetry transformations while providing an equivariant representation of the data.
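The notion of a density invariant under a symmetry group can be made concrete with a toy example (generic illustration, not the paper's method): a density that depends only on the radius of its argument is invariant under the rotation group SO(2), whose Lie algebra generates those rotations infinitesimally.

```python
import numpy as np

def log_density(x):
    """Unnormalised log-density of an isotropic Gaussian.

    It depends only on ||x||, so it is invariant under rotations.
    """
    return -0.5 * np.sum(x ** 2)

def rotate(x, theta):
    """Apply a rotation from SO(2) to a 2-D point."""
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return R @ x

x = np.array([1.0, 2.0])
for theta in (0.3, 1.1, 2.7):
    # Invariance: the density is unchanged by any rotation of its argument.
    assert np.isclose(log_density(x), log_density(rotate(x, theta)))
```

The flows in the paper are far more expressive than this isotropic Gaussian, but the invariance property they enforce is of the same kind.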
1 code implementation • 30 Sep 2019 • Joel Veness, Tor Lattimore, David Budden, Avishkar Bhoopchand, Christopher Mattern, Agnieszka Grabska-Barwinska, Eren Sezener, Jianan Wang, Peter Toth, Simon Schmitt, Marcus Hutter
This paper presents a new family of backpropagation-free neural architectures, Gated Linear Networks (GLNs).
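The core GLN mechanism can be sketched as follows (my reading of the idea, not the authors' code): each neuron geometrically mixes input probabilities via `sigmoid(w · logit(p))`, selects its weight vector with a gating (context) function of side information, and trains by local online gradient descent on the log loss, with no backpropagation through other neurons.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logit(p):
    return np.log(p / (1.0 - p))

class GLNNeuron:
    """One neuron of a Gated Linear Network (simplified sketch)."""

    def __init__(self, n_inputs, lr=0.2, seed=0):
        rng = np.random.default_rng(seed)
        # Two weight vectors, selected ("gated") by the sign of a random
        # hyperplane applied to the side information -- a simple context function.
        self.W = np.full((2, n_inputs), 1.0 / n_inputs)
        self.hyperplane = rng.standard_normal(n_inputs)
        self.lr = lr

    def context(self, side_info):
        return int(self.hyperplane @ side_info > 0)

    def predict(self, p, side_info):
        # Gated geometric mixing of the input probabilities p.
        c = self.context(side_info)
        return sigmoid(self.W[c] @ logit(p)), c

    def update(self, p, side_info, target):
        # Local online gradient descent on the log loss: no backpropagation.
        pred, c = self.predict(p, side_info)
        self.W[c] -= self.lr * (pred - target) * logit(p)
        return pred

neuron = GLNNeuron(n_inputs=2)
p = np.array([0.6, 0.7])          # base probability estimates from the layer below
side_info = np.array([1.0, -0.5])
first = neuron.update(p, side_info, target=1.0)
for _ in range(100):
    last = neuron.update(p, side_info, target=1.0)
assert last > first               # the neuron sharpens towards the target
```

Because each neuron has its own convex log-loss objective, learning is local and online; a full GLN stacks layers of such neurons.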
1 code implementation • ICLR 2020 • Peter Toth, Danilo Jimenez Rezende, Andrew Jaegle, Sébastien Racanière, Aleksandar Botev, Irina Higgins
The Hamiltonian formalism plays a central role in classical and quantum physics.
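The formalism in question is Hamilton's equations, dq/dt = ∂H/∂p and dp/dt = -∂H/∂q, which generate volume-preserving dynamics in phase space. A generic physics illustration (not the paper's learned model) integrates a harmonic oscillator, H(q, p) = (q² + p²)/2, with the symplectic leapfrog scheme:

```python
def grad_H_q(q):
    """dH/dq for the harmonic oscillator H = (q^2 + p^2) / 2."""
    return q

def grad_H_p(p):
    """dH/dp for the harmonic oscillator."""
    return p

def leapfrog(q, p, dt, steps):
    """Symplectic leapfrog integration of Hamilton's equations."""
    for _ in range(steps):
        p -= 0.5 * dt * grad_H_q(q)   # half kick
        q += dt * grad_H_p(p)         # drift
        p -= 0.5 * dt * grad_H_q(q)   # half kick
    return q, p

q0, p0 = 1.0, 0.0
q, p = leapfrog(q0, p0, dt=0.01, steps=1000)
E0 = 0.5 * (q0 ** 2 + p0 ** 2)
E = 0.5 * (q ** 2 + p ** 2)
# Symplectic integration keeps the energy nearly conserved.
assert abs(E - E0) < 1e-3
```

This near-conservation of energy under discretisation is what makes Hamiltonian structure attractive as an inductive bias for learned dynamics.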
no code implementations • 5 Dec 2017 • Joel Veness, Tor Lattimore, Avishkar Bhoopchand, Agnieszka Grabska-Barwinska, Christopher Mattern, Peter Toth
This paper describes a family of probabilistic architectures designed for online learning under the logarithmic loss.
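In the log-loss setting, the predictor outputs a probability p_t before each binary outcome x_t and pays -log p_t(x_t). A classic online predictor for this loss is the Krichevsky-Trofimov (KT) estimator (a standard textbook example, not necessarily the architecture in the paper):

```python
import math

def kt_log_loss(sequence):
    """Cumulative log loss (in bits) of the KT estimator on a binary sequence."""
    n0 = n1 = 0
    total = 0.0
    for x in sequence:
        p1 = (n1 + 0.5) / (n0 + n1 + 1.0)   # KT probability of the next symbol being 1
        p = p1 if x == 1 else 1.0 - p1
        total += -math.log2(p)               # log loss in bits
        if x == 1:
            n1 += 1
        else:
            n0 += 1
    return total

# A highly biased sequence is cheap to code; a balanced
# alternating sequence costs at least one bit per symbol.
print(kt_log_loss([1] * 20))
print(kt_log_loss([0, 1] * 10))
```

The KT estimator's total loss exceeds that of the best fixed Bernoulli predictor by at most about ½·log₂(n) + 1 bits, which is the kind of regret guarantee typically sought in this setting.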
no code implementations • 31 May 2017 • Dan Oprisa, Peter Toth
Guided by critical systems found in nature, we develop a novel mechanism, consisting of inhomogeneous polynomial regularisation, via which we can induce scale invariance in deep learning systems.
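The paper's exact construction is not given in this excerpt; as a purely illustrative sketch, an "inhomogeneous polynomial" regulariser could mix penalty terms of several different degrees, so that no single rescaling of the weights rescales every term identically (all names and coefficients below are hypothetical):

```python
import numpy as np

def inhomogeneous_poly_reg(w, lambdas=(1e-3, 1e-4, 1e-5), degrees=(2, 3, 4)):
    """Weighted sum of polynomial penalties of several degrees (hypothetical sketch).

    A homogeneous penalty like ||w||^2 rescales by a single factor under
    w -> c * w; mixing degrees makes the regulariser inhomogeneous.
    """
    w = np.asarray(w)
    return sum(lam * np.sum(np.abs(w) ** d) for lam, d in zip(lambdas, degrees))

print(inhomogeneous_poly_reg([0.5, -1.0, 2.0]))
```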
no code implementations • 26 Feb 2017 • Dan Oprisa, Peter Toth
Motivated by the idea that criticality and universality of phase transitions might play a crucial role in achieving and sustaining learning and intelligent behaviour in biological and artificial networks, we analyse a theoretical and a pragmatic experimental setup for critical phenomena in deep learning.