Search Results for author: Michael Kagan

Found 19 papers, 7 papers with code

Learning to Pivot with Adversarial Networks

5 code implementations NeurIPS 2017 Gilles Louppe, Michael Kagan, Kyle Cranmer

Several techniques for domain adaptation have been proposed to account for differences in the distribution of the data used for training and testing.

Domain Adaptation, Fairness
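As a rough illustration of the pivoting idea behind this paper, the sketch below trains a classifier jointly against an adversary that tries to recover a nuisance parameter from the classifier's output, pushing the classifier toward decisions that are independent of that nuisance. The data, network sizes, nuisance variable z, and penalty weight lam are illustrative placeholders, not the authors' code.

```python
import torch
import torch.nn as nn

# Toy data: features x, binary labels y, and a continuous nuisance parameter z.
x = torch.randn(1024, 5)
z = torch.randn(1024, 1)                          # nuisance (e.g. pile-up, smearing)
y = ((x[:, :1] + 0.5 * z) > 0).float()            # labels correlated with the nuisance

clf = nn.Sequential(nn.Linear(5, 32), nn.ReLU(), nn.Linear(32, 1))   # classifier f
adv = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))   # adversary r

opt_clf = torch.optim.Adam(clf.parameters(), lr=1e-3)
opt_adv = torch.optim.Adam(adv.parameters(), lr=1e-3)
bce, mse, lam = nn.BCEWithLogitsLoss(), nn.MSELoss(), 10.0

for step in range(500):
    # 1) adversary step: try to recover the nuisance z from the classifier output
    f_out = torch.sigmoid(clf(x)).detach()
    opt_adv.zero_grad()
    mse(adv(f_out), z).backward()
    opt_adv.step()

    # 2) classifier step: be accurate while making the adversary fail (minimax)
    opt_clf.zero_grad()
    logits = clf(x)
    loss = bce(logits, y) - lam * mse(adv(torch.sigmoid(logits)), z)
    loss.backward()
    opt_clf.step()
```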

Jet-Images -- Deep Learning Edition

1 code implementation 16 Nov 2015 Luke de Oliveira, Michael Kagan, Lester Mackey, Benjamin Nachman, Ariel Schwartzman

Building on the notion of a particle physics detector as a camera, and of the collimated streams of high energy particles, or jets, that it measures as images, we investigate the potential of machine learning techniques based on deep learning architectures to identify highly boosted W bosons.

Jet Tagging
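A schematic of the jet-image pipeline this entry refers to, with an illustrative binning and network that are not the paper's configuration: a jet's constituents are pixelated into a pT-weighted (eta, phi) image, which a small CNN can then score, e.g. boosted W versus QCD.

```python
import torch
import torch.nn as nn

def jet_image(eta, phi, pt, bins=25, extent=1.25):
    """Pixelate constituents into a pT-weighted (eta, phi) calorimeter-style image."""
    edges = torch.linspace(-extent, extent, bins + 1)
    ix = torch.bucketize(eta.clamp(-extent + 1e-6, extent - 1e-6), edges) - 1
    iy = torch.bucketize(phi.clamp(-extent + 1e-6, extent - 1e-6), edges) - 1
    img = torch.zeros(bins, bins)
    img.index_put_((ix, iy), pt, accumulate=True)      # sum pT per pixel
    return img

# One toy jet with 50 constituents
eta, phi, pt = torch.randn(50) * 0.4, torch.randn(50) * 0.4, torch.rand(50)
img = jet_image(eta, phi, pt).unsqueeze(0).unsqueeze(0)   # (1, 1, 25, 25)

cnn = nn.Sequential(nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                    nn.Flatten(), nn.Linear(8 * 12 * 12, 1))   # W-vs-QCD score
print(cnn(img).shape)                                   # torch.Size([1, 1])
```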

Neural Empirical Bayes: Source Distribution Estimation and its Applications to Simulation-Based Inference

1 code implementation 11 Nov 2020 Maxime Vandegar, Michael Kagan, Antoine Wehenkel, Gilles Louppe

We revisit empirical Bayes in the absence of a tractable likelihood function, as is typical in scientific domains relying on computer simulations.

Black-Box Optimization with Local Generative Surrogates

1 code implementation NeurIPS 2020 Sergey Shirobokov, Vladislav Belavin, Michael Kagan, Andrey Ustyuzhanin, Atılım Güneş Baydin

To address such cases, we introduce the use of deep generative models to iteratively approximate the simulator in local neighborhoods of the parameter space.

Bayesian Optimization
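A schematic of the local-surrogate loop described here, under strong simplifications: a small deterministic regression network stands in for the paper's deep generative surrogate, and the black-box simulator is a toy 1D function. Only the probe / fit / gradient-step structure is meant to carry over.

```python
import torch
import torch.nn as nn

def simulator(psi, noise=0.1):
    """Stand-in black-box simulator: non-differentiable from the optimizer's view."""
    with torch.no_grad():
        return (psi - 2.0) ** 2 + noise * torch.randn_like(psi)

psi = torch.tensor([5.0])                          # simulator parameter to tune
for it in range(20):
    # 1) probe the simulator in a local neighbourhood of psi
    local = psi + 0.5 * torch.randn(64, 1)
    obs = simulator(local)

    # 2) fit a differentiable surrogate of the simulator on those samples
    surrogate = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
    opt = torch.optim.Adam(surrogate.parameters(), lr=1e-2)
    for _ in range(200):
        opt.zero_grad()
        nn.functional.mse_loss(surrogate(local), obs).backward()
        opt.step()

    # 3) gradient step on psi through the surrogate instead of the simulator
    psi = psi.clone().requires_grad_(True)
    surrogate(psi.unsqueeze(0)).mean().backward()
    psi = (psi - 0.2 * psi.grad).detach()

print("estimated optimum:", psi.item())            # should drift toward the minimum at 2.0
```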

A Roadmap for HEP Software and Computing R&D for the 2020s

1 code implementation 18 Dec 2017 Johannes Albrecht, Antonio Augusto Alves Jr, Guilherme Amadio, Giuseppe Andronico, Nguyen Anh-Ky, Laurent Aphecetche, John Apostolakis, Makoto Asai, Luca Atzori, Marian Babik, Giuseppe Bagliesi, Marilena Bandieramonte, Sunanda Banerjee, Martin Barisits, Lothar A. T. Bauerdick, Stefano Belforte, Douglas Benjamin, Catrin Bernius, Wahid Bhimji, Riccardo Maria Bianchi, Ian Bird, Catherine Biscarat, Jakob Blomer, Kenneth Bloom, Tommaso Boccali, Brian Bockelman, Tomasz Bold, Daniele Bonacorsi, Antonio Boveia, Concezio Bozzi, Marko Bracko, David Britton, Andy Buckley, Predrag Buncic, Paolo Calafiura, Simone Campana, Philippe Canal, Luca Canali, Gianpaolo Carlino, Nuno Castro, Marco Cattaneo, Gianluca Cerminara, Javier Cervantes Villanueva, Philip Chang, John Chapman, Gang Chen, Taylor Childers, Peter Clarke, Marco Clemencic, Eric Cogneras, Jeremy Coles, Ian Collier, David Colling, Gloria Corti, Gabriele Cosmo, Davide Costanzo, Ben Couturier, Kyle Cranmer, Jack Cranshaw, Leonardo Cristella, David Crooks, Sabine Crépé-Renaudin, Robert Currie, Sünje Dallmeier-Tiessen, Kaushik De, Michel De Cian, Albert De Roeck, Antonio Delgado Peris, Frédéric Derue, Alessandro Di Girolamo, Salvatore Di Guida, Gancho Dimitrov, Caterina Doglioni, Andrea Dotti, Dirk Duellmann, Laurent Duflot, Dave Dykstra, Katarzyna Dziedziniewicz-Wojcik, Agnieszka Dziurda, Ulrik Egede, Peter Elmer, Johannes Elmsheuser, V. Daniel Elvira, Giulio Eulisse, Steven Farrell, Torben Ferber, Andrej Filipcic, Ian Fisk, Conor Fitzpatrick, José Flix, Andrea Formica, Alessandra Forti, Giovanni Franzoni, James Frost, Stu Fuess, Frank Gaede, Gerardo Ganis, Robert Gardner, Vincent Garonne, Andreas Gellrich, Krzysztof Genser, Simon George, Frank Geurts, Andrei Gheata, Mihaela Gheata, Francesco Giacomini, Stefano Giagu, Manuel Giffels, Douglas Gingrich, Maria Girone, Vladimir V. Gligorov, Ivan Glushkov, Wesley Gohn, Jose Benito Gonzalez Lopez, Isidro González Caballero, Juan R. González Fernández, Giacomo Govi, Claudio Grandi, Hadrien Grasland, Heather Gray, Lucia Grillo, Wen Guan, Oliver Gutsche, Vardan Gyurjyan, Andrew Hanushevsky, Farah Hariri, Thomas Hartmann, John Harvey, Thomas Hauth, Benedikt Hegner, Beate Heinemann, Lukas Heinrich, Andreas Heiss, José M. Hernández, Michael Hildreth, Mark Hodgkinson, Stefan Hoeche, Burt Holzman, Peter Hristov, Xingtao Huang, Vladimir N. Ivanchenko, Todor Ivanov, Jan Iven, Brij Jashal, Bodhitha Jayatilaka, Roger Jones, Michel Jouvin, Soon Yung Jun, Michael Kagan, Charles William Kalderon, Meghan Kane, Edward Karavakis, Daniel S. Katz, Dorian Kcira, Oliver Keeble, Borut Paul Kersevan, Michael Kirby, Alexei Klimentov, Markus Klute, Ilya Komarov, Dmitri Konstantinov, Patrick Koppenburg, Jim Kowalkowski, Luke Kreczko, Thomas Kuhr, Robert Kutschke, Valentin Kuznetsov, Walter Lampl, Eric Lancon, David Lange, Mario Lassnig, Paul Laycock, Charles Leggett, James Letts, Birgit Lewendel, Teng Li, Guilherme Lima, Jacob Linacre, Tomas Linden, Miron Livny, Giuseppe Lo Presti, Sebastian Lopienski, Peter Love, Adam Lyon, Nicolò Magini, Zachary L. Marshall, Edoardo Martelli, Stewart Martin-Haugh, Pere Mato, Kajari Mazumdar, Thomas McCauley, Josh McFayden, Shawn McKee, Andrew McNab, Rashid Mehdiyev, Helge Meinhard, Dario Menasce, Patricia Mendez Lorenzo, Alaettin Serhan Mete, Michele Michelotto, Jovan Mitrevski, Lorenzo Moneta, Ben Morgan, Richard Mount, Edward Moyse, Sean Murray, Armin Nairz, Mark S. Neubauer, Andrew Norman, Sérgio Novaes, Mihaly Novak, Arantza Oyanguren, Nurcan Ozturk, Andres Pacheco Pages, Michela Paganini, Jerome Pansanel, Vincent R. Pascuzzi, Glenn Patrick, Alex Pearce, Ben Pearson, Kevin Pedro, Gabriel Perdue, Antonio Perez-Calero Yzquierdo, Luca Perrozzi, Troels Petersen, Marko Petric, Andreas Petzold, Jónatan Piedra, Leo Piilonen, Danilo Piparo, Jim Pivarski, Witold Pokorski, Francesco Polci, Karolos Potamianos, Fernanda Psihas, Albert Puig Navarro, Günter Quast, Gerhard Raven, Jürgen Reuter, Alberto Ribon, Lorenzo Rinaldi, Martin Ritter, James Robinson, Eduardo Rodrigues, Stefan Roiser, David Rousseau, Gareth Roy, Grigori Rybkine, Andre Sailer, Tai Sakuma, Renato Santana, Andrea Sartirana, Heidi Schellman, Jaroslava Schovancová, Steven Schramm, Markus Schulz, Andrea Sciabà, Sally Seidel, Sezen Sekmen, Cedric Serfon, Horst Severini, Elizabeth Sexton-Kennedy, Michael Seymour, Davide Sgalaberna, Illya Shapoval, Jamie Shiers, Jing-Ge Shiu, Hannah Short, Gian Piero Siroli, Sam Skipsey, Tim Smith, Scott Snyder, Michael D. Sokoloff, Panagiotis Spentzouris, Hartmut Stadie, Giordon Stark, Gordon Stewart, Graeme A. Stewart, Arturo Sánchez, Alberto Sánchez-Hernández, Anyes Taffard, Umberto Tamponi, Jeff Templon, Giacomo Tenaglia, Vakhtang Tsulaia, Christopher Tunnell, Eric Vaandering, Andrea Valassi, Sofia Vallecorsa, Liviu Valsan, Peter Van Gemmeren, Renaud Vernet, Brett Viren, Jean-Roch Vlimant, Christian Voss, Margaret Votava, Carl Vuosalo, Carlos Vázquez Sierra, Romain Wartel, Gordon T. Watts, Torre Wenaus, Sandro Wenzel, Mike Williams, Frank Winklmeier, Christoph Wissing, Frank Wuerthwein, Benjamin Wynne, Zhang Xiaomei, Wei Yang, Efe Yazgan

Particle physics has an ambitious and broad experimental programme for the coming decades.

Computational Physics, High Energy Physics - Experiment

Continual Learning via Neural Pruning

1 code implementation 11 Mar 2019 Siavash Golkar, Michael Kagan, Kyunghyun Cho

We introduce Continual Learning via Neural Pruning (CLNP), a new method for lifelong learning in fixed-capacity models based on neuronal model sparsification.

Continual Learning
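A rough sketch of the sparsification idea, assuming a toy importance measure (mean absolute activation) and protecting only the first layer; the actual CLNP procedure, thresholds, and bookkeeping differ. The point is that neurons claimed by an earlier task are frozen by masking their gradients, leaving the remaining capacity free for the next task.

```python
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(10, 64), nn.ReLU(), nn.Linear(64, 2))
x_a = torch.randn(512, 10)

# ... assume net was already trained on task A ...
with torch.no_grad():
    acts = torch.relu(net[0](x_a)).abs().mean(dim=0)   # per-neuron importance on task A
used = acts > acts.median()                             # neurons claimed by task A

def freeze_task_a_grads():
    """Zero the gradients of weights feeding the task-A neurons so task-B
    updates cannot overwrite them."""
    net[0].weight.grad[used] = 0.0
    net[0].bias.grad[used] = 0.0

# One task-B training step with the task-A neurons protected
opt = torch.optim.SGD(net.parameters(), lr=0.1)
x_b, y_b = torch.randn(512, 10), torch.randint(0, 2, (512,))
loss = nn.functional.cross_entropy(net(x_b), y_b)
opt.zero_grad()
loss.backward()
freeze_task_a_grads()
opt.step()
```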

Differentiable Vertex Fitting for Jet Flavour Tagging

1 code implementation 19 Oct 2023 Rachel E. C. Smith, Inês Ochoa, Rúben Inácio, Jonathan Shoemaker, Michael Kagan

We propose a differentiable vertex fitting algorithm that can be used for secondary vertex fitting, and that can be seamlessly integrated into neural networks for jet flavour tagging.
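A minimal sketch of what a differentiable vertex fit can look like, assuming straight-line tracks and a plain least-squares objective (the paper's fitter and track model are more elaborate): because the fit is written entirely in differentiable tensor operations, gradients flow from the fitted vertex back to the track parameters, so the fit can sit inside a flavour-tagging network.

```python
import torch

def fit_vertex(points, dirs):
    """Least-squares vertex: the point minimizing the summed squared
    perpendicular distance to a set of straight-line tracks."""
    dirs = dirs / dirs.norm(dim=1, keepdim=True)            # (n, 3) unit directions
    eye = torch.eye(3, dtype=points.dtype)
    proj = eye - dirs.unsqueeze(2) * dirs.unsqueeze(1)       # (n, 3, 3) projectors
    A = proj.sum(dim=0)                                      # (3, 3)
    b = (proj @ points.unsqueeze(2)).sum(dim=0).squeeze(1)   # (3,)
    return torch.linalg.solve(A, b)

# Toy usage: three tracks that approximately cross near the origin
points = torch.tensor([[1.0, 0.0, 0.1], [0.0, 1.0, -0.1], [0.5, 0.5, 0.0]],
                      requires_grad=True)
dirs = torch.tensor([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [1.0, 1.0, 0.0]])
v = fit_vertex(points, dirs)
v.sum().backward()                     # gradients w.r.t. the track parameters exist
print(v, points.grad.shape)
```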

Machine Learning in High Energy Physics Community White Paper

no code implementations 8 Jul 2018 Kim Albertsson, Piero Altoe, Dustin Anderson, John Anderson, Michael Andrews, Juan Pedro Araque Espinosa, Adam Aurisano, Laurent Basara, Adrian Bevan, Wahid Bhimji, Daniele Bonacorsi, Bjorn Burkle, Paolo Calafiura, Mario Campanelli, Louis Capps, Federico Carminati, Stefano Carrazza, Yi-fan Chen, Taylor Childers, Yann Coadou, Elias Coniavitis, Kyle Cranmer, Claire David, Douglas Davis, Andrea De Simone, Javier Duarte, Martin Erdmann, Jonas Eschle, Amir Farbin, Matthew Feickert, Nuno Filipe Castro, Conor Fitzpatrick, Michele Floris, Alessandra Forti, Jordi Garra-Tico, Jochen Gemmler, Maria Girone, Paul Glaysher, Sergei Gleyzer, Vladimir Gligorov, Tobias Golling, Jonas Graw, Lindsey Gray, Dick Greenwood, Thomas Hacker, John Harvey, Benedikt Hegner, Lukas Heinrich, Ulrich Heintz, Ben Hooberman, Johannes Junggeburth, Michael Kagan, Meghan Kane, Konstantin Kanishchev, Przemysław Karpiński, Zahari Kassabov, Gaurav Kaul, Dorian Kcira, Thomas Keck, Alexei Klimentov, Jim Kowalkowski, Luke Kreczko, Alexander Kurepin, Rob Kutschke, Valentin Kuznetsov, Nicolas Köhler, Igor Lakomov, Kevin Lannon, Mario Lassnig, Antonio Limosani, Gilles Louppe, Aashrita Mangu, Pere Mato, Narain Meenakshi, Helge Meinhard, Dario Menasce, Lorenzo Moneta, Seth Moortgat, Mark Neubauer, Harvey Newman, Sydney Otten, Hans Pabst, Michela Paganini, Manfred Paulini, Gabriel Perdue, Uzziel Perez, Attilio Picazio, Jim Pivarski, Harrison Prosper, Fernanda Psihas, Alexander Radovic, Ryan Reece, Aurelius Rinkevicius, Eduardo Rodrigues, Jamal Rorie, David Rousseau, Aaron Sauers, Steven Schramm, Ariel Schwartzman, Horst Severini, Paul Seyfert, Filip Siroky, Konstantin Skazytkin, Mike Sokoloff, Graeme Stewart, Bob Stienen, Ian Stockdale, Giles Strong, Wei Sun, Savannah Thais, Karen Tomko, Eli Upfal, Emanuele Usai, Andrey Ustyuzhanin, Martin Vala, Justin Vasel, Sofia Vallecorsa, Mauro Verzetti, Xavier Vilasís-Cardona, Jean-Roch Vlimant, Ilija Vukotic, Sean-Jiun Wang, Gordon Watts, Michael Williams, Wenjing Wu, Stefan Wunsch, Kun Yang, Omar Zapata

In this document we discuss promising future research and development areas for machine learning in particle physics.

BIG-bench Machine Learning, Vocal Bursts Intensity Prediction

Image-Based Jet Analysis

no code implementations 17 Dec 2020 Michael Kagan

Image-based jet analysis is built upon the jet image representation of jets that enables a direct connection between high energy physics and the fields of computer vision and deep learning.

Anomaly Detection, General Classification

Differentiable Matrix Elements with MadJax

no code implementations 28 Feb 2022 Lukas Heinrich, Michael Kagan

MadJax is a tool for generating and evaluating differentiable matrix elements of high energy scattering processes.

Graph Neural Networks in Particle Physics: Implementations, Innovations, and Challenges

no code implementations 23 Mar 2022 Savannah Thais, Paolo Calafiura, Grigorios Chachamis, Gage DeZoort, Javier Duarte, Sanmay Ganguly, Michael Kagan, Daniel Murnane, Mark S. Neubauer, Kazuhiro Terao

Whereas these data were previously formulated as series or image data to match the available machine learning architectures, with the advent of graph neural networks (GNNs) these systems can now be learned natively as graphs.
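A schematic of the graph formulation, using toy detector hits, a k-nearest-neighbour graph, and a single hand-written message-passing round in plain PyTorch; real implementations typically rely on a graph library and task-specific node and edge features.

```python
import torch
import torch.nn as nn

# Connect each hit to its k nearest neighbours, build messages from
# (sender, receiver) pairs, aggregate, and update the node representation.
hits = torch.randn(200, 3)                              # toy (x, y, z) hit positions
k = 8
dist = torch.cdist(hits, hits)
nbrs = dist.topk(k + 1, largest=False).indices[:, 1:]   # drop the self-loop

msg_fn = nn.Sequential(nn.Linear(6, 32), nn.ReLU(), nn.Linear(32, 16))
upd_fn = nn.Sequential(nn.Linear(3 + 16, 32), nn.ReLU(), nn.Linear(32, 16))

senders = hits[nbrs]                                    # (200, k, 3) neighbour features
receivers = hits.unsqueeze(1).expand(-1, k, -1)         # (200, k, 3) receiver copies
messages = msg_fn(torch.cat([senders, receivers], dim=-1)).sum(dim=1)   # aggregate
node_repr = upd_fn(torch.cat([hits, messages], dim=-1))                 # (200, 16)
print(node_repr.shape)
```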

Novel Light Field Imaging Device with Enhanced Light Collection for Cold Atom Clouds

no code implementations 23 May 2022 Sanha Cheong, Josef C. Frisch, Sean Gasiorowski, Jason M. Hogan, Michael Kagan, Murtaza Safdari, Ariel Schwartzman, Maxime Vandegar

In particular, for atom clouds used in atom interferometry experiments, the system can reconstruct 3D fringe patterns with size $\mathcal{O}$(100 $\mu$m).

3D Reconstruction

Interpretable Uncertainty Quantification in AI for HEP

no code implementations 5 Aug 2022 Thomas Y. Chen, Biprateep Dey, Aishik Ghosh, Michael Kagan, Brian Nord, Nesar Ramachandra

Estimating uncertainty is at the core of performing scientific measurements in HEP: a measurement is not useful without an estimate of its uncertainty.

Decision Making, Uncertainty Quantification

Branches of a Tree: Taking Derivatives of Programs with Discrete and Branching Randomness in High Energy Physics

no code implementations 31 Aug 2023 Michael Kagan, Lukas Heinrich

We propose to apply several gradient estimation techniques to enable the differentiation of programs with discrete randomness in High Energy Physics.

Clustering
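One standard option among the gradient-estimation techniques referred to here is the score-function (REINFORCE) estimator. The toy below, which is not the paper's code, differentiates the expectation of a program with a theta-dependent discrete branch and compares against the analytic gradient.

```python
import torch

# A toy program with discrete, branching randomness: flip a theta-dependent
# coin, then follow one of two branches. The branch choice itself is not
# differentiable, so a score-function surrogate supplies that part of the gradient.
theta = torch.tensor(0.3, requires_grad=True)
p = torch.sigmoid(theta)                               # branch probability
branch = torch.bernoulli(p.expand(100_000)).detach()   # discrete randomness
f = torch.where(branch.bool(), 3.0 * p, p ** 2)        # the two branches
log_q = branch * torch.log(p) + (1 - branch) * torch.log1p(-p)
surrogate = (f + f.detach() * log_q).mean()            # pathwise + score-function terms
surrogate.backward()

# Analytic check: E[f] = 4 p^2 - p^3, so dE[f]/dtheta = (8p - 3p^2) p (1 - p).
print(theta.grad.item(), ((8 * p - 3 * p ** 2) * p * (1 - p)).item())
```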

Masked Particle Modeling on Sets: Towards Self-Supervised High Energy Physics Foundation Models

no code implementations 24 Jan 2024 Lukas Heinrich, Tobias Golling, Michael Kagan, Samuel Klein, Matthew Leigh, Margarita Osadchy, John Andrew Raine

We propose masked particle modeling (MPM) as a self-supervised method for learning generic, transferable, and reusable representations of unordered sets of inputs for use on high energy physics (HEP) scientific data.

Self-Supervised Learning
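A schematic of the masked-particle-modeling objective, analogous to masked language modeling but applied to an unordered set of particles; the encoder, masking rate, and regression loss below are illustrative placeholders rather than the paper's model.

```python
import torch
import torch.nn as nn

# Hide a random subset of particles in a jet (a set of per-particle feature
# vectors) and train an encoder to reconstruct the hidden ones from the rest.
# No positional encoding is used, since the input is an unordered set.
batch, n_particles, n_features, d_model = 8, 16, 4, 32
particles = torch.randn(batch, n_particles, n_features)

embed = nn.Linear(n_features, d_model)
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True), num_layers=2)
head = nn.Linear(d_model, n_features)                  # reconstruction head
mask_token = nn.Parameter(torch.zeros(d_model))        # learned "masked" embedding

mask = torch.rand(batch, n_particles) < 0.3            # mask roughly 30% of particles
emb = embed(particles)
tokens = torch.where(mask.unsqueeze(-1), mask_token.expand_as(emb), emb)
pred = head(encoder(tokens))

loss = nn.functional.mse_loss(pred[mask], particles[mask])   # loss on masked slots only
loss.backward()
```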

Re-Simulation-based Self-Supervised Learning for Pre-Training Foundation Models

no code implementations 11 Mar 2024 Philip Harris, Michael Kagan, Jeffrey Krupa, Benedikt Maier, Nathaniel Woodward

Self-Supervised Learning (SSL) is at the core of training modern large machine learning models, providing a scheme for learning powerful representations that can be used in a variety of downstream tasks.

Contrastive Learning, Data Augmentation +1
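A schematic contrastive pre-training step of the kind this work builds on, using an InfoNCE-style loss; in the paper the second view of an event comes from re-simulation, whereas here a random perturbation stands in, and the encoder and feature sizes are placeholders.

```python
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.1):
    """NT-Xent / InfoNCE-style loss: matching views sit on the diagonal."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature            # (B, B) similarity matrix
    targets = torch.arange(z1.size(0))            # positives on the diagonal
    return F.cross_entropy(logits, targets)

encoder = torch.nn.Sequential(torch.nn.Linear(20, 64), torch.nn.ReLU(),
                              torch.nn.Linear(64, 16))
events = torch.randn(128, 20)
view1 = events + 0.05 * torch.randn_like(events)  # stand-in for the re-simulated view
view2 = events + 0.05 * torch.randn_like(events)
loss = info_nce(encoder(view1), encoder(view2))
loss.backward()
```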
