no code implementations • 10 Jul 2023 • Alison Pouplin, Hrittik Roy, Sidak Pal Singh, Georgios Arvanitidis
In this work, we consider the loss landscape as an embedded Riemannian manifold and show that the manifold's differential-geometric properties can be used to analyze the generalization ability of a deep network.
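The basic construction admits a compact sketch: treating the loss surface as the graph embedding θ ↦ (θ, L(θ)), the induced (pullback) metric is g(θ) = I + ∇L(θ) ∇L(θ)ᵀ. Below is a minimal illustration, assuming a toy quadratic loss in place of a real network's loss; `induced_metric` is an illustrative name, not code from the paper:

```python
import torch

def induced_metric(loss_fn, theta):
    """Pullback metric of the graph embedding theta -> (theta, L(theta))."""
    grad = torch.autograd.functional.jacobian(loss_fn, theta)  # dL/dtheta, shape (d,)
    return torch.eye(theta.numel()) + torch.outer(grad, grad)

# Toy quadratic loss as a stand-in for a deep network's training loss.
loss = lambda th: 0.5 * (th ** 2).sum()
g = induced_metric(loss, torch.tensor([1.0, -2.0]))
print(g)  # identity plus the rank-one term grad grad^T
```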
1 code implementation • 20 Dec 2022 • Alison Pouplin, David Eklund, Carl Henrik Ek, Søren Hauberg
Generative models are often stochastic, causing the data space, the Riemannian metric, and the geodesics to be stochastic as well.
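One common way such a stochastic metric arises: for a Gaussian decoder x ∼ N(μ(z), σ(z)² I), the expected pullback metric is E[JᵀJ] = J_μᵀ J_μ + J_σᵀ J_σ. A hedged sketch with toy stand-ins for μ and σ (the paper's actual construction is not reproduced here):

```python
import torch

def expected_metric(mu, sigma, z):
    """Expected pullback metric E[J^T J] for a Gaussian decoder (mu, sigma)."""
    J_mu = torch.autograd.functional.jacobian(mu, z)        # shape (D, d)
    J_sigma = torch.autograd.functional.jacobian(sigma, z)  # shape (D, d)
    return J_mu.T @ J_mu + J_sigma.T @ J_sigma

# Toy decoder mean and standard deviation, R^2 -> R^3 (illustrative only).
mu = lambda z: torch.stack([z[0] ** 2, z[1], z[0] * z[1]])
sigma = lambda z: torch.stack([z.norm(), 1 + z[0] ** 2, z[1] ** 2])
print(expected_metric(mu, sigma, torch.tensor([0.5, -1.0])))
```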
1 code implementation • 23 May 2022 • Paul Scherer, Thomas Gaudelet, Alison Pouplin, Alice Del Vecchio, Suraj M S, Oliver Bolton, Jyothish Soman, Jake P. Taylor-King, Lindsay Edwards
Active learning (AL) is a sub-field of ML focused on developing methods to iteratively and economically acquire data by strategically querying the new data points that are most useful for a particular task.
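For orientation, here is a generic pool-based acquisition loop with uncertainty sampling, a standard AL baseline rather than necessarily the strategy studied in the paper; all names and the synthetic data are illustrative:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X_pool = rng.normal(size=(500, 5))
y_pool = (X_pool[:, 0] + X_pool[:, 1] > 0).astype(int)  # hidden oracle labels

labelled = list(rng.choice(len(X_pool), 10, replace=False))  # small seed set
for _ in range(20):                                   # acquisition rounds
    clf = LogisticRegression().fit(X_pool[labelled], y_pool[labelled])
    proba = clf.predict_proba(X_pool)[:, 1]
    uncertainty = -np.abs(proba - 0.5)                # closest to the boundary
    uncertainty[labelled] = -np.inf                   # never re-query a point
    labelled.append(int(np.argmax(uncertainty)))      # query the oracle
print("accuracy:", clf.score(X_pool, y_pool))
```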
1 code implementation • 9 Jun 2021 • Georgios Arvanitidis, Miguel González-Duque, Alison Pouplin, Dimitris Kalatzis, Søren Hauberg
Latent space geometry has shown itself to provide a rich and rigorous framework for interacting with the latent variables of deep generative models.
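As a concrete instance of that framework, geodesics under the pullback metric of a decoder f can be approximated by minimizing the discrete curve energy Σ‖f(z_{t+1}) − f(z_t)‖² over the interior points of a latent curve. A sketch with a toy decoder (illustrative, not the paper's code):

```python
import torch

# Toy decoder f: R^2 -> R^3 mapping latents onto a curved surface (a paraboloid).
f = lambda Z: torch.stack([Z[:, 0], Z[:, 1], (Z ** 2).sum(dim=1)], dim=1)

z0, z1 = torch.tensor([-1.0, 0.0]), torch.tensor([0.0, 1.0])  # curve endpoints
ts = torch.linspace(0, 1, 12)[1:-1, None]
inner = (z0 * (1 - ts) + z1 * ts).clone().requires_grad_(True)  # free interior points
opt = torch.optim.Adam([inner], lr=1e-2)
for _ in range(500):
    x = f(torch.cat([z0[None], inner, z1[None]]))   # decode the whole curve
    energy = (x[1:] - x[:-1]).pow(2).sum()          # discrete curve energy
    opt.zero_grad(); energy.backward(); opt.step()
print(inner.detach())  # interior points of the approximate geodesic
```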
no code implementations • 7 Jun 2021 • Dimitris Kalatzis, Johan Ziruo Ye, Alison Pouplin, Jesper Wohlert, Søren Hauberg
We present a framework for learning probability distributions on topologically non-trivial manifolds, utilizing normalizing flows.
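A minimal example of a flow on a topologically non-trivial manifold, the circle S¹: θ ↦ θ + s·sin(θ) with |s| < 1 is a diffeomorphism of the circle, and the usual change of variables gives the pushed-forward density. This is a hand-picked illustration, not the learned flows of the paper:

```python
import numpy as np

s = 0.7  # flow parameter; |s| < 1 keeps the map a diffeomorphism

def flow(theta):
    # theta -> theta + s*sin(theta) fixes 0 and 2*pi, so it descends to S^1.
    return (theta + s * np.sin(theta)) % (2 * np.pi)

rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2 * np.pi, size=5)   # base samples, uniform on S^1
samples = flow(theta)                         # flowed samples on the circle
# Change of variables: log p(flow(theta)) = log p0(theta) - log |f'(theta)|.
logp = -np.log(2 * np.pi) - np.log(1 + s * np.cos(theta))
print(samples, logp)
```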
no code implementations • 2 Jan 2018 • Antonia Creswell, Alison Pouplin, Anil A. Bharath
We propose a novel deep learning model for classifying medical images in the setting where a large amount of unlabelled medical data is available but labelled data is in limited supply.
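One common recipe in this setting, sketched here under the assumption of a plain autoencoder (the paper's actual model may differ): pretrain a reconstruction model on all data, then fit a classifier head on encoder features using only the labelled subset.

```python
import torch, torch.nn as nn

enc = nn.Sequential(nn.Linear(64, 16), nn.ReLU())  # toy encoder
dec = nn.Sequential(nn.Linear(16, 64))             # toy decoder
head = nn.Linear(16, 2)                            # classifier head

X_all = torch.randn(1000, 64)                      # unlabelled pool (toy stand-in)
X_lab, y_lab = X_all[:50], torch.randint(0, 2, (50,))  # scarce labels

opt = torch.optim.Adam([*enc.parameters(), *dec.parameters()], lr=1e-3)
for _ in range(200):                               # unsupervised reconstruction phase
    loss = ((dec(enc(X_all)) - X_all) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()

opt2 = torch.optim.Adam(head.parameters(), lr=1e-3)
for _ in range(200):                               # supervised phase on frozen features
    loss = nn.functional.cross_entropy(head(enc(X_lab).detach()), y_lab)
    opt2.zero_grad(); loss.backward(); opt2.step()
```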