no code implementations • 9 Jan 2025 • Kristian G. Barman, Sascha Caron, Emily Sullivan, Henk W. de Regt, Roberto Ruiz de Austri, Mieke Boon, Michael Färber, Stefan Fröse, Faegheh Hasibi, Andreas Ipp, Rukshak Kapoor, Gregor Kasieczka, Daniel Kostić, Michael Krämer, Tobias Golling, Luis G. Lopez, Jesus Marco, Sydney Otten, Pawel Pawlowski, Pietro Vischia, Erik Weber, Christoph Weniger
This paper explores ideas and provides a potential roadmap for the development and evaluation of physics-specific large-scale AI models, which we call Large Physics Models (LPMs).
no code implementations • 14 Nov 2024 • Franck Rothen, Samuel Klein, Matthew Leigh, Tobias Golling
This study aims to enhance the generalization properties of supervised models by reducing the sharpness of local minima.
no code implementations • 29 Oct 2024 • Malte Algren, Christopher Pollard, John Andrew Raine, Tobias Golling
In this paper, we present a novel method for pile-up removal of pp interactions using variational inference with diffusion models, called Vipr.
no code implementations • 19 Sep 2024 • Matthew Leigh, Samuel Klein, François Charton, Tobias Golling, Lukas Heinrich, Michael Kagan, Inês Ochoa, Margarita Osadchy
In this work, we significantly enhance masked particle modeling (MPM), a self-supervised learning scheme for constructing highly expressive representations of unordered sets relevant to developing foundation models for high-energy physics.
1 code implementation • 18 Jun 2024 • Guillaume Quétant, John Andrew Raine, Matthew Leigh, Debajyoti Sengupta, Tobias Golling
This paper presents a novel approach for directly generating full events at detector-level from parton-level information, leveraging cutting-edge machine learning techniques.
1 code implementation • 24 Jan 2024 • Tobias Golling, Lukas Heinrich, Michael Kagan, Samuel Klein, Matthew Leigh, Margarita Osadchy, John Andrew Raine
We propose masked particle modeling (MPM) as a self-supervised method for learning generic, transferable, and reusable representations on unordered sets of inputs for use in high energy physics (HEP) scientific data.
no code implementations • 15 Dec 2023 • Debajyoti Sengupta, Matthew Leigh, John Andrew Raine, Samuel Klein, Tobias Golling
We introduce a new technique called Drapes to enhance the sensitivity in searches for new physics at the LHC.
no code implementations • 29 Sep 2023 • Erik Buhmann, Cedric Ewen, Darius A. Faroughy, Tobias Golling, Gregor Kasieczka, Matthew Leigh, Guillaume Quétant, John Andrew Raine, Debajyoti Sengupta, David Shih
In addition, we introduce \epcfm, the first permutation equivariant continuous normalizing flow (CNF) for particle cloud generation.
no code implementations • 12 Sep 2023 • Tobias Golling, Samuel Klein, Radha Mastandrea, Benjamin Nachman, John Andrew Raine
We propose a protocol called flows for flows for training normalizing flows to morph one dataset into another even if the underlying probability density of neither dataset is known explicitly.
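The core idea can be sketched in a toy setting (this is an illustrative reconstruction, not the authors' code, and the affine "flows" and dataset parameters are invented for the example): if a flow maps dataset A to a shared base distribution and another flow does the same for dataset B, composing the first with the inverse of the second morphs samples of A into B without an explicit density for either dataset.

```python
import random
import statistics

random.seed(0)

def fit_affine_flow(samples):
    """Fit x -> (x - mu) / sigma, an affine map onto a standardised base.

    A stand-in for a learned normalizing flow: forward maps data to the
    base space, inverse maps base samples back to data space.
    """
    mu = statistics.fmean(samples)
    sigma = statistics.stdev(samples)
    return (lambda x: (x - mu) / sigma), (lambda z: z * sigma + mu)

# Two toy 'datasets' with different location and scale.
data_a = [random.gauss(0.0, 1.0) for _ in range(5000)]
data_b = [random.gauss(5.0, 2.0) for _ in range(5000)]

fwd_a, _ = fit_affine_flow(data_a)   # flow A: data A -> base
_, inv_b = fit_affine_flow(data_b)   # flow B: base -> data B

# Morph A into B through the shared base space: inv_b(fwd_a(x)).
morphed = [inv_b(fwd_a(x)) for x in data_a]
print(statistics.fmean(morphed), statistics.stdev(morphed))
```

The morphed sample picks up the location and scale of dataset B (mean near 5, spread near 2) even though neither density was ever written down; in the actual method, the affine maps are replaced by learned flows.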
no code implementations • 13 Jul 2023 • Matthew Leigh, Debajyoti Sengupta, John Andrew Raine, Guillaume Quétant, Tobias Golling
Building on the success of PC-JeDi we introduce PC-Droid, a substantially improved diffusion model for the generation of jet particle clouds.
1 code implementation • 11 Jul 2023 • Malte Algren, John Andrew Raine, Tobias Golling
The ability to decorrelate a feature space from protected attributes is an area of active research in ethics and fairness, as well as in the natural sciences.
1 code implementation • 5 Jul 2023 • John Andrew Raine, Matthew Leigh, Knut Zoch, Tobias Golling
In this work we introduce $\nu^2$-Flows, an extension of the $\nu$-Flows method to final states containing multiple neutrinos.
no code implementations • 8 May 2023 • Debajyoti Sengupta, Samuel Klein, John Andrew Raine, Tobias Golling
Model independent techniques for constructing background data templates using generative models have shown great promise for use in searches for new physics processes at the LHC.
no code implementations • 28 Apr 2023 • Malte Algren, Tobias Golling, Manuel Guth, Chris Pollard, John Andrew Raine
We present an alternative to reweighting techniques for modifying distributions to account for a desired change in an underlying conditional distribution, as is often needed to correct for mis-modelling in a simulated sample.
1 code implementation • 24 Mar 2023 • Lukas Ehrke, John Andrew Raine, Knut Zoch, Manuel Guth, Tobias Golling
We present a new approach, the Topograph, which reconstructs underlying physics processes, including the intermediary particles, by leveraging priors from the nature of particle physics decays and the flexibility of message passing graph neural networks.
1 code implementation • 9 Mar 2023 • Matthew Leigh, Debajyoti Sengupta, Guillaume Quétant, John Andrew Raine, Knut Zoch, Tobias Golling
In this paper, we present a new method to efficiently generate jets in high energy physics, called PC-JeDi.
1 code implementation • 4 Nov 2022 • Samuel Klein, Tobias Golling
The sensitivity of many physics analyses can be enhanced by constructing discriminants that preferentially select signal events.
1 code implementation • 4 Nov 2022 • Samuel Klein, John Andrew Raine, Tobias Golling
Normalizing flows are constructed from a base distribution with a known density and a diffeomorphism with a tractable Jacobian.
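The construction described here can be illustrated with a minimal sketch (not the authors' implementation; the one-dimensional affine map is chosen purely for clarity): a base density of known form, an invertible transformation, and the change-of-variables formula with the Jacobian determinant together give an explicit density on the transformed variable.

```python
import math

def base_log_prob(x):
    """Standard-normal log density: the base distribution with known form."""
    return -0.5 * (x * x + math.log(2.0 * math.pi))

class AffineFlow:
    """y = scale * x + shift, a trivially invertible map on the real line."""

    def __init__(self, scale, shift):
        assert scale != 0.0, "zero scale is not invertible"
        self.scale, self.shift = scale, shift

    def forward(self, x):
        return self.scale * x + self.shift

    def inverse(self, y):
        return (y - self.shift) / self.scale

    def log_prob(self, y):
        # Change of variables: log p(y) = log p_base(f^{-1}(y)) - log|det J|
        x = self.inverse(y)
        return base_log_prob(x) - math.log(abs(self.scale))

flow = AffineFlow(scale=2.0, shift=1.0)
print(flow.log_prob(1.0))  # log density of N(1, 4) at its mean
```

In a practical flow the affine map is replaced by a stack of learned bijections whose Jacobian determinants remain cheap to evaluate, but the log-probability bookkeeping is exactly this.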
1 code implementation • 30 May 2022 • Bálint Máté, Samuel Klein, Tobias Golling, François Fleuret
Neural networks, on the other hand, only perform a forward pass on the input: there is neither a notion of the inverse of a neural network nor of its likelihood contribution.
no code implementations • 20 Dec 2021 • Guillaume Quétant, Mariia Drozdova, Vitaliy Kinakh, Tobias Golling, Slava Voloshynovskiy
We present Turbo-Sim, a generalised autoencoder framework derived from principles of information theory that can be used as a generative model.
1 code implementation • 17 Dec 2021 • Vitaliy Kinakh, Mariia Drozdova, Guillaume Quétant, Tobias Golling, Slava Voloshynovskiy
The InfoSCC-GAN architecture is based on an unsupervised contrastive encoder built on the InfoNCE paradigm, an attribute classifier and an EigenGAN generator.
1 code implementation • 15 Dec 2021 • Samuel Klein, John A. Raine, Sebastian Pina-Otey, Slava Voloshynovskiy, Tobias Golling
Normalizing flows are diffeomorphic, typically dimension-preserving models trained by maximizing the likelihood of the data under the model.
1 code implementation • 3 May 2021 • Sabrina Amrouche, Laurent Basara, Paolo Calafiura, Dmitry Emeliyanov, Victor Estrade, Steven Farrell, Cécile Germain, Vladimir Vava Gligorov, Tobias Golling, Sergey Gorbunov, Heather Gray, Isabelle Guyon, Mikhail Hushchyn, Vincenzo Innocente, Moritz Kiehn, Marcel Kunze, Edward Moyse, David Rousseau, Andreas Salzburger, Andrey Ustyuzhanin, Jean-Roch Vlimant
Both were measured on the Codalab platform, where the participants had to upload their software.
no code implementations • 16 Jan 2021 • Sabrina Amrouche, Moritz Kiehn, Tobias Golling, Andreas Salzburger
We propose a novel approach to charged particle tracking at high intensity particle colliders based on Approximate Nearest Neighbors search.
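A rough sketch of why approximate nearest neighbours helps here (illustrative only, with an invented bucketing scheme; the paper's index and metric will differ): hashing detector hits into azimuthal-angle buckets lets a track seed retrieve nearby candidate hits without scanning every hit in the event.

```python
import math
from collections import defaultdict

class AngularBuckets:
    """Approximate nearest-neighbour lookup via azimuthal-angle bucketing."""

    def __init__(self, n_bins=64):
        self.n_bins = n_bins
        self.buckets = defaultdict(list)

    def _bin(self, x, y):
        phi = math.atan2(y, x)  # azimuthal angle in (-pi, pi]
        return int((phi + math.pi) / (2 * math.pi) * self.n_bins) % self.n_bins

    def add(self, hit):
        self.buckets[self._bin(hit[0], hit[1])].append(hit)

    def neighbours(self, x, y):
        """Hits in the query's bucket and the two adjacent buckets."""
        b = self._bin(x, y)
        out = []
        for db in (-1, 0, 1):
            out.extend(self.buckets[(b + db) % self.n_bins])
        return out

index = AngularBuckets()
for hit in [(1.0, 0.1), (0.99, -0.05), (-1.0, 0.0)]:
    index.add(hit)
print(sorted(index.neighbours(1.0, 0.0)))  # -> [(0.99, -0.05), (1.0, 0.1)]
```

The lookup is approximate (a hit just outside the adjacent buckets is missed) but runs in time proportional to the bucket occupancy rather than the total number of hits, which is the trade-off that makes the approach attractive at high intensity.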
1 code implementation • 3 Jul 2020 • Taoli Cheng, Jean-François Arguin, Julien Leissner-Martin, Jacinthe Pilette, Tobias Golling
To build a performant mass-decorrelated anomalous jet tagger, we propose the Outlier Exposed VAE (OE-VAE), in which outlier samples are introduced during training to guide the learned representation.
no code implementations • 8 Jul 2018 • Kim Albertsson, Piero Altoe, Dustin Anderson, John Anderson, Michael Andrews, Juan Pedro Araque Espinosa, Adam Aurisano, Laurent Basara, Adrian Bevan, Wahid Bhimji, Daniele Bonacorsi, Bjorn Burkle, Paolo Calafiura, Mario Campanelli, Louis Capps, Federico Carminati, Stefano Carrazza, Yi-fan Chen, Taylor Childers, Yann Coadou, Elias Coniavitis, Kyle Cranmer, Claire David, Douglas Davis, Andrea De Simone, Javier Duarte, Martin Erdmann, Jonas Eschle, Amir Farbin, Matthew Feickert, Nuno Filipe Castro, Conor Fitzpatrick, Michele Floris, Alessandra Forti, Jordi Garra-Tico, Jochen Gemmler, Maria Girone, Paul Glaysher, Sergei Gleyzer, Vladimir Gligorov, Tobias Golling, Jonas Graw, Lindsey Gray, Dick Greenwood, Thomas Hacker, John Harvey, Benedikt Hegner, Lukas Heinrich, Ulrich Heintz, Ben Hooberman, Johannes Junggeburth, Michael Kagan, Meghan Kane, Konstantin Kanishchev, Przemysław Karpiński, Zahari Kassabov, Gaurav Kaul, Dorian Kcira, Thomas Keck, Alexei Klimentov, Jim Kowalkowski, Luke Kreczko, Alexander Kurepin, Rob Kutschke, Valentin Kuznetsov, Nicolas Köhler, Igor Lakomov, Kevin Lannon, Mario Lassnig, Antonio Limosani, Gilles Louppe, Aashrita Mangu, Pere Mato, Narain Meenakshi, Helge Meinhard, Dario Menasce, Lorenzo Moneta, Seth Moortgat, Mark Neubauer, Harvey Newman, Sydney Otten, Hans Pabst, Michela Paganini, Manfred Paulini, Gabriel Perdue, Uzziel Perez, Attilio Picazio, Jim Pivarski, Harrison Prosper, Fernanda Psihas, Alexander Radovic, Ryan Reece, Aurelius Rinkevicius, Eduardo Rodrigues, Jamal Rorie, David Rousseau, Aaron Sauers, Steven Schramm, Ariel Schwartzman, Horst Severini, Paul Seyfert, Filip Siroky, Konstantin Skazytkin, Mike Sokoloff, Graeme Stewart, Bob Stienen, Ian Stockdale, Giles Strong, Wei Sun, Savannah Thais, Karen Tomko, Eli Upfal, Emanuele Usai, Andrey Ustyuzhanin, Martin Vala, Justin Vasel, Sofia Vallecorsa, Mauro Verzetti, Xavier Vilasís-Cardona, Jean-Roch Vlimant, Ilija Vukotic, Sean-Jiun Wang, Gordon 
Watts, Michael Williams, Wenjing Wu, Stefan Wunsch, Kun Yang, Omar Zapata
In this document we discuss promising future research and development areas for machine learning in particle physics.