1 code implementation • 15 Dec 2023 • Peter Sorrenson, Felix Draxler, Armand Rousselot, Sander Hummerich, Ullrich Köthe
Many real-world data, particularly in the natural sciences and computer vision, lie on known Riemannian manifolds such as spheres, tori, or the group of rotation matrices.
1 code implementation • 25 Oct 2023 • Felix Draxler, Peter Sorrenson, Lea Zimmermann, Armand Rousselot, Ullrich Köthe
Normalizing Flows are generative models that directly maximize the likelihood.
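The likelihood that Normalizing Flows maximize comes from the change-of-variables formula: for an invertible map f, log p_x(x) = log p_z(f(x)) + log |det J_f(x)|. A minimal one-dimensional sketch of this objective, using an affine map and a standard-normal base density (an illustration only, not the paper's model):

```python
import numpy as np

def flow_nll(x, scale, shift):
    """Average negative log-likelihood of a 1-D affine 'flow' z = scale*x + shift
    under a standard-normal base density (minimal illustration)."""
    z = scale * x + shift                            # invertible map f(x)
    log_base = -0.5 * (z**2 + np.log(2 * np.pi))     # log N(z; 0, 1)
    log_det = np.log(np.abs(scale))                  # log |df/dx|
    return -(log_base + log_det).mean()

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=0.5, size=1000)

# Parameters that map the data close to N(0, 1) achieve a lower NLL.
good = flow_nll(x, scale=2.0, shift=-4.0)   # z = 2x - 4 is approximately N(0, 1)
bad = flow_nll(x, scale=1.0, shift=0.0)     # z = x stays N(2, 0.5)
```

Training a real flow amounts to minimizing this NLL over the parameters of a much richer invertible map.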
1 code implementation • 23 Jun 2023 • Felix Draxler, Lars Kühmichel, Armand Rousselot, Jens Müller, Christoph Schnörr, Ullrich Köthe
Gaussianization is a simple generative model that can be trained without backpropagation.
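The backpropagation-free training can be illustrated with the classic iterative recipe: alternately rotate the data and Gaussianize each marginal via a rank transform. The sketch below is a minimal rank-based version of that idea (an assumption for illustration, not the paper's exact algorithm):

```python
import numpy as np
from scipy.stats import norm

def marginal_gaussianize(x):
    """Map each column to standard-normal quantiles via its ranks —
    a closed-form update, no gradients involved."""
    ranks = np.argsort(np.argsort(x, axis=0), axis=0) + 1
    u = ranks / (x.shape[0] + 1)          # ranks -> uniform in (0, 1)
    return norm.ppf(u)                    # uniform -> standard normal

def gaussianize(x, iters=5, seed=0):
    """Alternate random rotations with marginal Gaussianization."""
    rng = np.random.default_rng(seed)
    d = x.shape[1]
    for _ in range(iters):
        q, _ = np.linalg.qr(rng.normal(size=(d, d)))  # random rotation
        x = marginal_gaussianize(x @ q)
    return x

# Correlated, shifted 2-D data; after the loop each marginal is
# (by construction of the rank transform) close to N(0, 1).
x = np.random.default_rng(1).multivariate_normal(
    [1.0, -1.0], [[1.0, 0.8], [0.8, 1.0]], size=400)
z = gaussianize(x)
```

Each step is a deterministic transform fitted in closed form, which is what makes backpropagation unnecessary.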
2 code implementations • 2 Jun 2023 • Peter Sorrenson, Felix Draxler, Armand Rousselot, Sander Hummerich, Lea Zimmermann, Ullrich Köthe
Normalizing Flows explicitly maximize a full-dimensional likelihood on the training data.
1 code implementation • 24 Feb 2023 • Armand Rousselot, Michael Spannowsky
Invertible Neural Networks (INNs) have become established tools for the simulation and generation of highly complex data.
no code implementations • 22 Oct 2021 • Anja Butter, Theo Heimel, Sander Hummerich, Tobias Krebs, Tilman Plehn, Armand Rousselot, Sophia Vent
Generative networks are opening new avenues in fast event generation for the LHC.