no code implementations • 9 May 2024 • Tianji Cai, Garrett W. Merz, François Charton, Niklas Nolte, Matthias Wilhelm, Kyle Cranmer, Lance J. Dixon
We pursue the use of deep learning methods to improve state-of-the-art computations in theoretical high-energy physics.
no code implementations • 16 Jan 2024 • Abhijith Gandrakota, Lily Zhang, Aahlad Puli, Kyle Cranmer, Jennifer Ngadiuba, Rajesh Ranganath, Nhan Tran
Anomaly, or out-of-distribution, detection is a promising tool for aiding discoveries of new particles or processes in particle physics.
no code implementations • 3 Sep 2023 • Kyle Cranmer, Gurtej Kanwar, Sébastien Racanière, Danilo J. Rezende, Phiala E. Shanahan
This Perspective outlines the advances in ML-based sampling motivated by lattice quantum field theory, in particular for the theory of quantum chromodynamics.
no code implementations • 3 May 2023 • Ryan Abbott, Michael S. Albergo, Aleksandar Botev, Denis Boyda, Kyle Cranmer, Daniel C. Hackett, Gurtej Kanwar, Alexander G. D. G. Matthews, Sébastien Racanière, Ali Razavi, Danilo J. Rezende, Fernando Romero-López, Phiala E. Shanahan, Julian M. Urban
Applications of normalizing flows to the sampling of field configurations in lattice gauge theory have so far been explored almost exclusively in two space-time dimensions.
no code implementations • 7 Mar 2023 • Philipp Berens, Kyle Cranmer, Neil D. Lawrence, Ulrike Von Luxburg, Jessica Montgomery
This report summarises the discussions from the seminar and provides a roadmap to suggest how different communities can collaborate to deliver a new wave of progress in AI and its application for scientific discovery.
no code implementations • 3 Mar 2023 • Francesco Armando Di Bello, Anton Charkin-Gorbulin, Kyle Cranmer, Etienne Dreyer, Sanmay Ganguly, Eilam Gross, Lukas Heinrich, Lorenzo Santi, Marumi Kado, Nilotpal Kakati, Patrick Rieck, Matteo Tusoni
A configurable calorimeter simulation for AI applications (COCOA) is presented, based on the Geant4 toolkit and interfaced with the Pythia event generator.
no code implementations • 14 Nov 2022 • Ryan Abbott, Michael S. Albergo, Aleksandar Botev, Denis Boyda, Kyle Cranmer, Daniel C. Hackett, Alexander G. D. G. Matthews, Sébastien Racanière, Ali Razavi, Danilo J. Rezende, Fernando Romero-López, Phiala E. Shanahan, Julian M. Urban
Recent applications of machine-learned normalizing flows to sampling in lattice field theory suggest that such methods may be able to mitigate critical slowing down and topological freezing.
no code implementations • 18 Jul 2022 • Ryan Abbott, Michael S. Albergo, Denis Boyda, Kyle Cranmer, Daniel C. Hackett, Gurtej Kanwar, Sébastien Racanière, Danilo J. Rezende, Fernando Romero-López, Phiala E. Shanahan, Betsy Tian, Julian M. Urban
This work presents gauge-equivariant architectures for flow-based sampling in fermionic lattice field theories using pseudofermions as stochastic estimators for the fermionic determinant.
no code implementations • 23 Feb 2022 • Michael S. Albergo, Denis Boyda, Kyle Cranmer, Daniel C. Hackett, Gurtej Kanwar, Sébastien Racanière, Danilo J. Rezende, Fernando Romero-López, Phiala E. Shanahan, Julian M. Urban
In this work, we provide a numerical demonstration of robust flow-based sampling in the Schwinger model at the critical value of the fermion mass.
no code implementations • 6 Dec 2021 • Alexander Lavin, David Krakauer, Hector Zenil, Justin Gottschlich, Tim Mattson, Johann Brehmer, Anima Anandkumar, Sanjay Choudry, Kamil Rocki, Atılım Güneş Baydin, Carina Prunkl, Brooks Paige, Olexandr Isayev, Erik Peterson, Peter L. McMahon, Jakob Macke, Kyle Cranmer, Jiaxin Zhang, Haruko Wainwright, Adi Hanuka, Manuela Veloso, Samuel Assefa, Stephan Zheng, Avi Pfeffer
We present the "Nine Motifs of Simulation Intelligence", a roadmap for the development and integration of the essential algorithms necessary for a merger of scientific computing, scientific simulation, and artificial intelligence.
1 code implementation • 13 Oct 2021 • Siddharth Mishra-Sharma, Kyle Cranmer
The nature of the Fermi gamma-ray Galactic Center Excess (GCE) has remained a persistent mystery for over a decade.
no code implementations • 1 Jul 2021 • Daniel C. Hackett, Chung-Chun Hsieh, Michael S. Albergo, Denis Boyda, Jiunn-Wei Chen, Kai-Feng Chen, Kyle Cranmer, Gurtej Kanwar, Phiala E. Shanahan
Recent results have demonstrated that samplers constructed with flow-based generative models are a promising new approach for configuration generation in lattice field theory.
no code implementations • 10 Jun 2021 • Michael S. Albergo, Gurtej Kanwar, Sébastien Racanière, Danilo J. Rezende, Julian M. Urban, Denis Boyda, Kyle Cranmer, Daniel C. Hackett, Phiala E. Shanahan
Algorithms based on normalizing flows are emerging as promising machine learning approaches to sampling complicated probability distributions in a way that can be made asymptotically exact.
no code implementations • 14 Apr 2021 • Craig S. Greenberg, Sebastian Macaluso, Nicholas Monath, Avinava Dubey, Patrick Flaherty, Manzil Zaheer, Amr Ahmed, Kyle Cranmer, Andrew McCallum
In those cases, hierarchical clustering can be seen as a combinatorial optimization problem.
no code implementations • 20 Jan 2021 • Michael S. Albergo, Denis Boyda, Daniel C. Hackett, Gurtej Kanwar, Kyle Cranmer, Sébastien Racanière, Danilo Jimenez Rezende, Phiala E. Shanahan
This notebook tutorial demonstrates a method for sampling Boltzmann distributions of lattice field theories using a class of machine learning models known as normalizing flows.
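To make the sampling step concrete, here is a minimal, hedged sketch of the independence-Metropolis accept/reject correction that makes flow-based sampling asymptotically exact. A plain Gaussian proposal stands in for a trained flow, and the double-well `action` is a toy target; all function names are illustrative, not the tutorial's API.

```python
import numpy as np

rng = np.random.default_rng(0)

def action(x):
    # Toy scalar double-well action; the target density is p(x) ~ exp(-S(x)).
    return (x**2 - 1.0)**2

# Stand-in for a trained normalizing flow: a fixed Gaussian proposal with a
# tractable log-density. A real flow would provide sample() and log_prob().
def proposal_sample():
    return rng.normal(0.0, 1.5)

def proposal_logprob(x):
    return -0.5 * (x / 1.5)**2 - np.log(1.5 * np.sqrt(2 * np.pi))

def flow_mcmc(n_steps):
    chain, x = [], proposal_sample()
    for _ in range(n_steps):
        x_new = proposal_sample()
        # Independence-Metropolis acceptance: corrects the imperfect proposal
        # so the Markov chain is asymptotically exact for p(x) ~ exp(-S(x)).
        log_a = (-action(x_new) + action(x)
                 + proposal_logprob(x) - proposal_logprob(x_new))
        if np.log(rng.uniform()) < log_a:
            x = x_new
        chain.append(x)
    return np.array(chain)

samples = flow_mcmc(10_000)
print("mean |x| of the chain:", np.abs(samples).mean())
```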
1 code implementation • 16 Nov 2020 • Johann Brehmer, Sebastian Macaluso, Duccio Pappadopulo, Kyle Cranmer
Particle physics experiments often require the reconstruction of decay patterns through a hierarchical clustering of the observed final-state particles.
1 code implementation • 20 Oct 2020 • Siddharth Mishra-Sharma, Kyle Cranmer
Mismodeling the uncertain, diffuse emission of Galactic origin can seriously bias the characterization of astrophysical gamma-ray data, particularly in the region of the Inner Milky Way where such emission can make up over 80% of the photon counts observed at ~GeV energies.
no code implementations • 13 Oct 2020 • Johann Brehmer, Kyle Cranmer
Our predictions for particle physics processes are realized in a chain of complex simulators.
no code implementations • 12 Aug 2020 • Denis Boyda, Gurtej Kanwar, Sébastien Racanière, Danilo Jimenez Rezende, Michael S. Albergo, Kyle Cranmer, Daniel C. Hackett, Phiala E. Shanahan
We develop a flow-based sampling algorithm for $SU(N)$ lattice gauge theories that is gauge-invariant by construction.
1 code implementation • 6 Aug 2020 • Jonathan Shlomi, Sanmay Ganguly, Eilam Gross, Kyle Cranmer, Yaron Lipman, Hadar Serviansky, Haggai Maron, Nimrod Segol
Jet classification is an important ingredient in measurements and searches for new physics at particle colliders, and secondary vertex reconstruction is a key intermediate step in building powerful jet classifiers.
High Energy Physics - Experiment • High Energy Physics - Phenomenology
3 code implementations • NeurIPS 2020 • Miles Cranmer, Alvaro Sanchez-Gonzalez, Peter Battaglia, Rui Xu, Kyle Cranmer, David Spergel, Shirley Ho
The technique works as follows: we first encourage sparse latent representations when we train a GNN in a supervised setting, and then apply symbolic regression to components of the learned model to extract explicit physical relations.
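A hedged sketch of that two-stage pipeline, with a toy two-body system in place of a full graph network: an L1 penalty on the latent messages encourages sparsity, after which a symbolic-regression tool (e.g. PySR) can be fit to the dominant message component. Network sizes and the synthetic data are illustrative assumptions.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Synthetic two-body "spring" data in 1D: the force on particle 1 is -k*(x1 - x2).
k = 2.0
x = torch.randn(4096, 2)                      # positions of the two particles
force1 = -k * (x[:, 0] - x[:, 1])             # regression target

# Edge ("message") function and decoder of a tiny graph network.
message_fn = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 8))
decoder = nn.Linear(8, 1)
opt = torch.optim.Adam(list(message_fn.parameters()) + list(decoder.parameters()), lr=1e-3)

for step in range(2000):
    msg = message_fn(x)                       # latent edge messages
    pred = decoder(msg).squeeze(-1)
    # The L1 penalty pushes the latent messages towards sparsity, so only a few
    # components end up carrying the physically meaningful quantity.
    loss = ((pred - force1) ** 2).mean() + 1e-2 * msg.abs().mean()
    opt.zero_grad(); loss.backward(); opt.step()

# Second stage: fit symbolic regression to the most active message component,
# e.g. PySRRegressor().fit(x.numpy(), message_fn(x)[:, i].detach().numpy());
# for this toy it should recover an expression proportional to (x1 - x2).
i = int(message_fn(x).abs().mean(0).argmax())
print("most active message component:", i)
```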
2 code implementations • NeurIPS 2020 • Johann Brehmer, Kyle Cranmer
We introduce manifold-learning flows (M-flows), a new class of generative models that simultaneously learn the data manifold as well as a tractable probability density on that manifold.
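For background, the tractability of the density on the manifold rests on the injective change-of-variables formula, stated here in generic form (the paper's specific parameterization may differ):

```latex
% Injective change of variables for a manifold-learning flow:
% if x = g(z) with g: R^n -> R^d (n < d) injective and Jacobian J_g(z),
% the density induced on the learned manifold is
p_X(x) \;=\; p_Z(z)\,\det\!\big(J_g(z)^{\top} J_g(z)\big)^{-1/2},
\qquad z = g^{-1}(x)\ \text{for } x \text{ on the manifold.}
```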
no code implementations • 13 Mar 2020 • Gurtej Kanwar, Michael S. Albergo, Denis Boyda, Kyle Cranmer, Daniel C. Hackett, Sébastien Racanière, Danilo Jimenez Rezende, Phiala E. Shanahan
We define a class of machine-learned flow-based sampling algorithms for lattice gauge theories that are gauge-invariant by construction.
1 code implementation • 26 Feb 2020 • Craig S. Greenberg, Sebastian Macaluso, Nicholas Monath, Ji-Ah Lee, Patrick Flaherty, Kyle Cranmer, Andrew Mcgregor, Andrew McCallum
In contrast to existing methods, we present dynamic-programming algorithms for exact inference in hierarchical clustering based on a novel trellis data structure, and we prove that we can exactly compute the partition function, the maximum-likelihood hierarchy, and marginal probabilities of sub-hierarchies and clusters.
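A minimal sketch of the partition-function recursion over the subset lattice, which is the idea behind the trellis: assuming the energy of a hierarchy factorizes into potentials over sibling splits, the partition function of a set is a memoized sum over its binary splits. The potential `psi` below is an illustrative stand-in, not the paper's model.

```python
import math
from functools import lru_cache
from itertools import combinations

# Toy 1D points; psi({A}, {B}) is an assumed potential over a sibling split.
points = {0: 0.0, 1: 0.1, 2: 1.0, 3: 1.2}

def psi(a, b):
    d = min(abs(points[i] - points[j]) for i in a for j in b)
    return math.exp(-d)

@lru_cache(maxsize=None)
def partition_function(elements):
    """Sum of energies over all binary hierarchies of `elements` (a frozenset),
    computed by dynamic programming over the subset lattice (the trellis)."""
    if len(elements) <= 1:
        return 1.0
    elems = sorted(elements)
    anchor, rest = elems[0], elems[1:]
    z = 0.0
    # Enumerate each unordered split {A, B} exactly once by forcing the
    # smallest element into A.
    for r in range(len(rest)):
        for combo in combinations(rest, r):
            a = frozenset((anchor,) + combo)
            b = elements - a
            z += psi(a, b) * partition_function(a) * partition_function(b)
    return z

print("partition function:", partition_function(frozenset(points)))
```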
1 code implementation • NeurIPS 2020 • Hadar Serviansky, Nimrod Segol, Jonathan Shlomi, Kyle Cranmer, Eilam Gross, Haggai Maron, Yaron Lipman
Many problems in machine learning can be cast as learning functions from sets to graphs, or more generally to hypergraphs; in short, Set2Graph functions.
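A hedged sketch of one way to realize a set-to-graph map in PyTorch: a permutation-equivariant element encoder followed by a symmetric edge scorer over all pairs. The layer choices and the symmetric pairing are illustrative, not the paper's architecture.

```python
import torch
import torch.nn as nn

class Set2Edges(nn.Module):
    """Minimal set-to-graph sketch: an equivariant node encoder with a global
    set context, followed by a symmetric scorer on every pair of elements."""
    def __init__(self, d_in, d_hid=64):
        super().__init__()
        self.phi = nn.Sequential(nn.Linear(d_in, d_hid), nn.ReLU(), nn.Linear(d_hid, d_hid))
        self.edge = nn.Sequential(nn.Linear(2 * d_hid, d_hid), nn.ReLU(), nn.Linear(d_hid, 1))

    def forward(self, x):                   # x: (batch, n, d_in)
        h = self.phi(x)                     # per-element embeddings
        ctx = h.mean(dim=1, keepdim=True)   # permutation-invariant set context
        h = h + ctx                         # broadcast context back to elements
        n = h.shape[1]
        hi = h.unsqueeze(2).expand(-1, -1, n, -1)
        hj = h.unsqueeze(1).expand(-1, n, -1, -1)
        pair = torch.cat([hi + hj, (hi - hj).abs()], dim=-1)  # symmetric pairing
        return self.edge(pair).squeeze(-1)  # (batch, n, n) edge scores

model = Set2Edges(d_in=5)
scores = model(torch.randn(2, 7, 5))
print(scores.shape)  # torch.Size([2, 7, 7])
```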
4 code implementations • ICML 2020 • Danilo Jimenez Rezende, George Papamakarios, Sébastien Racanière, Michael S. Albergo, Gurtej Kanwar, Phiala E. Shanahan, Kyle Cranmer
Normalizing flows are a powerful tool for building expressive distributions in high dimensions.
no code implementations • 4 Nov 2019 • Kyle Cranmer, Johann Brehmer, Gilles Louppe
Many domains of science have developed complex simulations to describe phenomena of interest.
no code implementations • 27 Sep 2019 • Alvaro Sanchez-Gonzalez, Victor Bapst, Kyle Cranmer, Peter Battaglia
We introduce an approach for imposing physically informed inductive biases in learned simulation models.
3 code implementations • 4 Sep 2019 • Johann Brehmer, Siddharth Mishra-Sharma, Joeri Hermans, Gilles Louppe, Kyle Cranmer
The subtle and unique imprint of dark matter substructure on extended arcs in strong lensing systems contains a wealth of information about the properties and distribution of dark matter on small scales and, consequently, about the underlying particle physics.
5 code implementations • 24 Jul 2019 • Johann Brehmer, Felix Kling, Irina Espejo, Kyle Cranmer
Precision measurements at the LHC often require analyzing high-dimensional event data for subtle kinematic signatures, which is challenging for established analysis methods.
3 code implementations • 8 Jul 2019 • Atılım Güneş Baydin, Lei Shao, Wahid Bhimji, Lukas Heinrich, Lawrence Meadows, Jialin Liu, Andreas Munk, Saeid Naderiparizi, Bradley Gram-Hansen, Gilles Louppe, Mingfei Ma, Xiaohui Zhao, Philip Torr, Victor Lee, Kyle Cranmer, Prabhat, Frank Wood
Probabilistic programming languages (PPLs) are receiving widespread attention for performing Bayesian inference in complex generative models.
no code implementations • 4 Jun 2019 • Johann Brehmer, Kyle Cranmer, Irina Espejo, Felix Kling, Gilles Louppe, Juan Pavez
One major challenge for the legacy measurements at the LHC is that the likelihood function is not tractable when the collected data is high-dimensional and the detector response has to be modeled.
no code implementations • 11 Apr 2019 • Kyle Cranmer, Siavash Golkar, Duccio Pappadopulo
We also introduce quantum flows, the quantum analog of normalizing flows, which can be used to increase the expressivity of this variational family.
1 code implementation • 25 Mar 2019 • Giuseppe Carleo, Ignacio Cirac, Kyle Cranmer, Laurent Daudet, Maria Schuld, Naftali Tishby, Leslie Vogt-Maranto, Lenka Zdeborová
Machine learning encompasses a broad range of algorithms and modeling tools for a vast array of data-processing tasks, and it has entered most scientific disciplines in recent years.
Computational Physics • Cosmology and Nongalactic Astrophysics • Disordered Systems and Neural Networks • High Energy Physics - Theory • Quantum Physics
no code implementations • 2 Aug 2018 • Markus Stoye, Johann Brehmer, Gilles Louppe, Juan Pavez, Kyle Cranmer
We extend recent work (Brehmer et al., 2018).
3 code implementations • NeurIPS 2019 • Atılım Güneş Baydin, Lukas Heinrich, Wahid Bhimji, Lei Shao, Saeid Naderiparizi, Andreas Munk, Jialin Liu, Bradley Gram-Hansen, Gilles Louppe, Lawrence Meadows, Philip Torr, Victor Lee, Prabhat, Kyle Cranmer, Frank Wood
We present a novel probabilistic programming framework that couples directly to existing large-scale simulators through a cross-platform probabilistic execution protocol, which allows general-purpose inference engines to record and control random number draws within simulators in a language-agnostic way.
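The essential pattern, sketched in plain Python rather than the cross-platform protocol itself: every random draw inside the simulator is routed through an addressable interface, so an inference engine can record the full execution trace or force chosen values. All class and function names below are illustrative.

```python
import random

class TraceRecorder:
    """Minimal stand-in for a probabilistic execution protocol: the simulator
    requests every random draw through this object, so an inference engine can
    record the full trace or override ("control") individual draws."""
    def __init__(self, forced=None):
        self.trace = {}
        self.forced = forced or {}

    def sample(self, address, sampler):
        value = self.forced.get(address, sampler())
        self.trace[address] = value
        return value

def simulator(io):
    # A toy "simulator" whose random choices are all routed through `io`.
    n = io.sample("n_particles", lambda: random.randint(1, 3))
    energies = [io.sample(f"energy_{i}", lambda: random.expovariate(1.0))
                for i in range(n)]
    return sum(energies)

random.seed(0)
io = TraceRecorder()
print("observable:", simulator(io))
print("recorded trace:", io.trace)

# The same simulator can be replayed with chosen latent values:
replay = TraceRecorder(forced={"n_particles": 2, "energy_0": 0.5, "energy_1": 0.5})
print("controlled run:", simulator(replay))
```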
no code implementations • 8 Jul 2018 • Kim Albertsson, Piero Altoe, Dustin Anderson, John Anderson, Michael Andrews, Juan Pedro Araque Espinosa, Adam Aurisano, Laurent Basara, Adrian Bevan, Wahid Bhimji, Daniele Bonacorsi, Bjorn Burkle, Paolo Calafiura, Mario Campanelli, Louis Capps, Federico Carminati, Stefano Carrazza, Yi-fan Chen, Taylor Childers, Yann Coadou, Elias Coniavitis, Kyle Cranmer, Claire David, Douglas Davis, Andrea De Simone, Javier Duarte, Martin Erdmann, Jonas Eschle, Amir Farbin, Matthew Feickert, Nuno Filipe Castro, Conor Fitzpatrick, Michele Floris, Alessandra Forti, Jordi Garra-Tico, Jochen Gemmler, Maria Girone, Paul Glaysher, Sergei Gleyzer, Vladimir Gligorov, Tobias Golling, Jonas Graw, Lindsey Gray, Dick Greenwood, Thomas Hacker, John Harvey, Benedikt Hegner, Lukas Heinrich, Ulrich Heintz, Ben Hooberman, Johannes Junggeburth, Michael Kagan, Meghan Kane, Konstantin Kanishchev, Przemysław Karpiński, Zahari Kassabov, Gaurav Kaul, Dorian Kcira, Thomas Keck, Alexei Klimentov, Jim Kowalkowski, Luke Kreczko, Alexander Kurepin, Rob Kutschke, Valentin Kuznetsov, Nicolas Köhler, Igor Lakomov, Kevin Lannon, Mario Lassnig, Antonio Limosani, Gilles Louppe, Aashrita Mangu, Pere Mato, Narain Meenakshi, Helge Meinhard, Dario Menasce, Lorenzo Moneta, Seth Moortgat, Mark Neubauer, Harvey Newman, Sydney Otten, Hans Pabst, Michela Paganini, Manfred Paulini, Gabriel Perdue, Uzziel Perez, Attilio Picazio, Jim Pivarski, Harrison Prosper, Fernanda Psihas, Alexander Radovic, Ryan Reece, Aurelius Rinkevicius, Eduardo Rodrigues, Jamal Rorie, David Rousseau, Aaron Sauers, Steven Schramm, Ariel Schwartzman, Horst Severini, Paul Seyfert, Filip Siroky, Konstantin Skazytkin, Mike Sokoloff, Graeme Stewart, Bob Stienen, Ian Stockdale, Giles Strong, Wei Sun, Savannah Thais, Karen Tomko, Eli Upfal, Emanuele Usai, Andrey Ustyuzhanin, Martin Vala, Justin Vasel, Sofia Vallecorsa, Mauro Verzetti, Xavier Vilasís-Cardona, Jean-Roch Vlimant, Ilija Vukotic, Sean-Jiun Wang, Gordon Watts, Michael Williams, Wenjing Wu, Stefan Wunsch, Kun Yang, Omar Zapata
In this document we discuss promising future research and development areas for machine learning in particle physics.
1 code implementation • ICLR 2019 • Siavash Golkar, Kyle Cranmer
We introduce backdrop, a flexible and simple-to-implement method, intuitively described as dropout acting only along the backpropagation pipeline.
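A minimal PyTorch sketch of the idea, assuming its simplest variant: the forward pass is the identity, while gradients are randomly masked (and rescaled) in the backward pass, so the stochasticity acts only along the backpropagation pipeline.

```python
import torch

class BackdropFn(torch.autograd.Function):
    """Identity in the forward pass; randomly masks and rescales gradients in
    the backward pass."""
    @staticmethod
    def forward(ctx, x, p):
        ctx.p = p
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        keep = 1.0 - ctx.p
        mask = (torch.rand_like(grad_output) < keep).to(grad_output.dtype) / keep
        return grad_output * mask, None

def backdrop(x, p=0.5):
    return BackdropFn.apply(x, p)

# Usage: the forward value is untouched, but gradient flow is stochastically masked.
x = torch.randn(4, 3, requires_grad=True)
backdrop(x, p=0.5).sum().backward()
print(x.grad)  # roughly half the entries are zero, the rest scaled by 1/(1-p)
```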
5 code implementations • 30 May 2018 • Johann Brehmer, Gilles Louppe, Juan Pavez, Kyle Cranmer
Simulators often provide the best description of real-world phenomena.
1 code implementation • 30 Apr 2018 • Johann Brehmer, Kyle Cranmer, Gilles Louppe, Juan Pavez
We present powerful new analysis techniques to constrain effective field theories at the LHC.
2 code implementations • 30 Apr 2018 • Johann Brehmer, Kyle Cranmer, Gilles Louppe, Juan Pavez
We develop, discuss, and compare several inference techniques to constrain theory parameters in collider experiments.
no code implementations • 21 Dec 2017 • Mario Lezcano Casado, Atilim Gunes Baydin, David Martinez Rubio, Tuan Anh Le, Frank Wood, Lukas Heinrich, Gilles Louppe, Kyle Cranmer, Karen Ng, Wahid Bhimji, Prabhat
We consider the problem of Bayesian inference in the family of probabilistic models implicitly defined by stochastic generative models of data.
1 code implementation • 18 Dec 2017 • Johannes Albrecht, Antonio Augusto Alves Jr, Guilherme Amadio, Giuseppe Andronico, Nguyen Anh-Ky, Laurent Aphecetche, John Apostolakis, Makoto Asai, Luca Atzori, Marian Babik, Giuseppe Bagliesi, Marilena Bandieramonte, Sunanda Banerjee, Martin Barisits, Lothar A. T. Bauerdick, Stefano Belforte, Douglas Benjamin, Catrin Bernius, Wahid Bhimji, Riccardo Maria Bianchi, Ian Bird, Catherine Biscarat, Jakob Blomer, Kenneth Bloom, Tommaso Boccali, Brian Bockelman, Tomasz Bold, Daniele Bonacorsi, Antonio Boveia, Concezio Bozzi, Marko Bracko, David Britton, Andy Buckley, Predrag Buncic, Paolo Calafiura, Simone Campana, Philippe Canal, Luca Canali, Gianpaolo Carlino, Nuno Castro, Marco Cattaneo, Gianluca Cerminara, Javier Cervantes Villanueva, Philip Chang, John Chapman, Gang Chen, Taylor Childers, Peter Clarke, Marco Clemencic, Eric Cogneras, Jeremy Coles, Ian Collier, David Colling, Gloria Corti, Gabriele Cosmo, Davide Costanzo, Ben Couturier, Kyle Cranmer, Jack Cranshaw, Leonardo Cristella, David Crooks, Sabine Crépé-Renaudin, Robert Currie, Sünje Dallmeier-Tiessen, Kaushik De, Michel De Cian, Albert De Roeck, Antonio Delgado Peris, Frédéric Derue, Alessandro Di Girolamo, Salvatore Di Guida, Gancho Dimitrov, Caterina Doglioni, Andrea Dotti, Dirk Duellmann, Laurent Duflot, Dave Dykstra, Katarzyna Dziedziniewicz-Wojcik, Agnieszka Dziurda, Ulrik Egede, Peter Elmer, Johannes Elmsheuser, V. Daniel Elvira, Giulio Eulisse, Steven Farrell, Torben Ferber, Andrej Filipcic, Ian Fisk, Conor Fitzpatrick, José Flix, Andrea Formica, Alessandra Forti, Giovanni Franzoni, James Frost, Stu Fuess, Frank Gaede, Gerardo Ganis, Robert Gardner, Vincent Garonne, Andreas Gellrich, Krzysztof Genser, Simon George, Frank Geurts, Andrei Gheata, Mihaela Gheata, Francesco Giacomini, Stefano Giagu, Manuel Giffels, Douglas Gingrich, Maria Girone, Vladimir V. Gligorov, Ivan Glushkov, Wesley Gohn, Jose Benito Gonzalez Lopez, Isidro González Caballero, Juan R. González Fernández, Giacomo Govi, Claudio Grandi, Hadrien Grasland, Heather Gray, Lucia Grillo, Wen Guan, Oliver Gutsche, Vardan Gyurjyan, Andrew Hanushevsky, Farah Hariri, Thomas Hartmann, John Harvey, Thomas Hauth, Benedikt Hegner, Beate Heinemann, Lukas Heinrich, Andreas Heiss, José M. Hernández, Michael Hildreth, Mark Hodgkinson, Stefan Hoeche, Burt Holzman, Peter Hristov, Xingtao Huang, Vladimir N. Ivanchenko, Todor Ivanov, Jan Iven, Brij Jashal, Bodhitha Jayatilaka, Roger Jones, Michel Jouvin, Soon Yung Jun, Michael Kagan, Charles William Kalderon, Meghan Kane, Edward Karavakis, Daniel S. Katz, Dorian Kcira, Oliver Keeble, Borut Paul Kersevan, Michael Kirby, Alexei Klimentov, Markus Klute, Ilya Komarov, Dmitri Konstantinov, Patrick Koppenburg, Jim Kowalkowski, Luke Kreczko, Thomas Kuhr, Robert Kutschke, Valentin Kuznetsov, Walter Lampl, Eric Lancon, David Lange, Mario Lassnig, Paul Laycock, Charles Leggett, James Letts, Birgit Lewendel, Teng Li, Guilherme Lima, Jacob Linacre, Tomas Linden, Miron Livny, Giuseppe Lo Presti, Sebastian Lopienski, Peter Love, Adam Lyon, Nicolò Magini, Zachary L. Marshall, Edoardo Martelli, Stewart Martin-Haugh, Pere Mato, Kajari Mazumdar, Thomas McCauley, Josh McFayden, Shawn McKee, Andrew McNab, Rashid Mehdiyev, Helge Meinhard, Dario Menasce, Patricia Mendez Lorenzo, Alaettin Serhan Mete, Michele Michelotto, Jovan Mitrevski, Lorenzo Moneta, Ben Morgan, Richard Mount, Edward Moyse, Sean Murray, Armin Nairz, Mark S. 
Neubauer, Andrew Norman, Sérgio Novaes, Mihaly Novak, Arantza Oyanguren, Nurcan Ozturk, Andres Pacheco Pages, Michela Paganini, Jerome Pansanel, Vincent R. Pascuzzi, Glenn Patrick, Alex Pearce, Ben Pearson, Kevin Pedro, Gabriel Perdue, Antonio Perez-Calero Yzquierdo, Luca Perrozzi, Troels Petersen, Marko Petric, Andreas Petzold, Jónatan Piedra, Leo Piilonen, Danilo Piparo, Jim Pivarski, Witold Pokorski, Francesco Polci, Karolos Potamianos, Fernanda Psihas, Albert Puig Navarro, Günter Quast, Gerhard Raven, Jürgen Reuter, Alberto Ribon, Lorenzo Rinaldi, Martin Ritter, James Robinson, Eduardo Rodrigues, Stefan Roiser, David Rousseau, Gareth Roy, Grigori Rybkine, Andre Sailer, Tai Sakuma, Renato Santana, Andrea Sartirana, Heidi Schellman, Jaroslava Schovancová, Steven Schramm, Markus Schulz, Andrea Sciabà, Sally Seidel, Sezen Sekmen, Cedric Serfon, Horst Severini, Elizabeth Sexton-Kennedy, Michael Seymour, Davide Sgalaberna, Illya Shapoval, Jamie Shiers, Jing-Ge Shiu, Hannah Short, Gian Piero Siroli, Sam Skipsey, Tim Smith, Scott Snyder, Michael D. Sokoloff, Panagiotis Spentzouris, Hartmut Stadie, Giordon Stark, Gordon Stewart, Graeme A. Stewart, Arturo Sánchez, Alberto Sánchez-Hernández, Anyes Taffard, Umberto Tamponi, Jeff Templon, Giacomo Tenaglia, Vakhtang Tsulaia, Christopher Tunnell, Eric Vaandering, Andrea Valassi, Sofia Vallecorsa, Liviu Valsan, Peter Van Gemmeren, Renaud Vernet, Brett Viren, Jean-Roch Vlimant, Christian Voss, Margaret Votava, Carl Vuosalo, Carlos Vázquez Sierra, Romain Wartel, Gordon T. Watts, Torre Wenaus, Sandro Wenzel, Mike Williams, Frank Winklmeier, Christoph Wissing, Frank Wuerthwein, Benjamin Wynne, Zhang Xiaomei, Wei Yang, Efe Yazgan
Particle physics has an ambitious and broad experimental programme for the coming decades.
Computational Physics • High Energy Physics - Experiment
no code implementations • 17 Sep 2017 • Meghan Frate, Kyle Cranmer, Saarik Kalia, Alexander Vandenberg-Rodes, Daniel Whiteson
We demonstrate the application of this approach to modeling the background to searches for dijet resonances at the Large Hadron Collider and describe how the approach can be used in the search for generic localized signals.
Data Analysis, Statistics and Probability • High Energy Physics - Experiment • High Energy Physics - Phenomenology
2 code implementations • 22 Jul 2017 • Gilles Louppe, Joeri Hermans, Kyle Cranmer
We adapt the training procedure of generative adversarial networks by replacing the differentiable generative network with a domain-specific simulator.
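A hedged toy version of that setup: a discriminator is trained exactly as in a GAN, but the "generator" is a non-differentiable simulator whose parameter is updated with a score-function (REINFORCE-style) gradient instead of backpropagation. The Gaussian simulator and the point estimate of the parameter are simplifications of the paper's variational treatment.

```python
import numpy as np
import torch
import torch.nn as nn

rng = np.random.default_rng(0)
torch.manual_seed(0)

def simulator(theta, n):
    # Stand-in for a non-differentiable, domain-specific simulator: a Gaussian
    # with unknown mean `theta`. We can sample from it but not backpropagate.
    return rng.normal(theta, 1.0, size=(n, 1))

real_data = rng.normal(2.5, 1.0, size=(10_000, 1))     # "observed" events

disc = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))
opt_d = torch.optim.Adam(disc.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

theta, lr_theta = 0.0, 0.05
for step in range(2000):
    x_sim = simulator(theta, 256)
    x_real = real_data[rng.integers(0, len(real_data), 256)]

    # Discriminator update, exactly as in a GAN: real vs. simulated.
    logits_real = disc(torch.as_tensor(x_real, dtype=torch.float32))
    logits_sim = disc(torch.as_tensor(x_sim, dtype=torch.float32))
    loss_d = (bce(logits_real, torch.ones_like(logits_real))
              + bce(logits_sim, torch.zeros_like(logits_sim)))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # "Generator" update: we cannot backpropagate through the simulator, so the
    # non-saturating loss -log D(x) is minimized with a score-function gradient,
    # using d/dtheta log p(x|theta) = (x - theta) for a unit-width Gaussian.
    neg_log_d = -torch.nn.functional.logsigmoid(logits_sim.detach()).numpy()
    grad_theta = float(np.mean(neg_log_d * (x_sim - theta)))
    theta -= lr_theta * grad_theta

print("fitted theta:", round(theta, 2), "| true value: 2.5")
```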
6 code implementations • 2 Feb 2017 • Gilles Louppe, Kyunghyun Cho, Cyril Becot, Kyle Cranmer
Recent progress in applying machine learning for jet physics has been built upon an analogy between calorimeters and images.
5 code implementations • NeurIPS 2017 • Gilles Louppe, Michael Kagan, Kyle Cranmer
Several techniques for domain adaptation have been proposed to account for differences in the distribution of the data used for training and testing.
2 code implementations • 28 Jan 2016 • Pierre Baldi, Kyle Cranmer, Taylor Faucett, Peter Sadowski, Daniel Whiteson
We investigate a new structure for machine learning classifiers applied to problems in high-energy physics by expanding the inputs to include not only measured features but also physics parameters.
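A toy illustration of a parameterized classifier: the physics parameter enters as an additional input during training, so a single network can afterwards be evaluated at any hypothesis value. The one-dimensional bump-versus-flat data are an assumption made purely for the sketch.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Signal events are generated at many values of a physics parameter `theta`
# (here, the mean of a Gaussian bump); the background is theta-independent.
n = 20_000
theta = torch.rand(n, 1) * 4.0 - 2.0            # training range of the parameter
x_sig = theta + 0.5 * torch.randn(n, 1)         # signal: bump centered at theta
x_bkg = 12.0 * (torch.rand(n, 1) - 0.5)         # background: flat on (-6, 6)

x = torch.cat([x_sig, x_bkg])
t = torch.cat([theta, theta])                   # the parameter is an input for both classes
y = torch.cat([torch.ones(n, 1), torch.zeros(n, 1)])

clf = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.Adam(clf.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(3000):
    idx = torch.randint(0, len(x), (512,))
    inputs = torch.cat([x[idx], t[idx]], dim=1)  # measured feature AND physics parameter
    loss = bce(clf(inputs), y[idx])
    opt.zero_grad(); loss.backward(); opt.step()

# A single network now interpolates across parameter values:
probe = torch.tensor([[1.0, 1.0], [1.0, -1.0]])  # same event, two hypotheses for theta
print(torch.sigmoid(clf(probe)))
```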
2 code implementations • 6 Jun 2015 • Kyle Cranmer, Juan Pavez, Gilles Louppe
This leads to a new machine learning-based approach to likelihood-free inference that is complementary to Approximate Bayesian Computation, and which does not require a prior on the model parameters.
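The heart of the method is the likelihood-ratio trick: a classifier trained to separate samples drawn from two hypotheses can be inverted into an estimate of their density ratio. Below is a minimal sketch on two Gaussians, where the exact ratio is known and can be checked against; the network size and training details are illustrative.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Samples from two hypotheses; the exact likelihood ratio is known here only
# so the classifier-based estimate can be checked.
n = 50_000
x0 = torch.randn(n, 1)          # p0 = N(0, 1)
x1 = torch.randn(n, 1) + 1.0    # p1 = N(1, 1)
x = torch.cat([x0, x1])
y = torch.cat([torch.zeros(n, 1), torch.ones(n, 1)])

clf = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))
opt = torch.optim.Adam(clf.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(2000):
    idx = torch.randint(0, len(x), (1024,))
    loss = bce(clf(x[idx]), y[idx])
    opt.zero_grad(); loss.backward(); opt.step()

# With balanced classes, the optimal classifier satisfies
#   s(x) = p1(x) / (p0(x) + p1(x)),  so  r(x) = p1(x)/p0(x) = s(x)/(1 - s(x)),
# which for a sigmoid-output network is simply exp(logit).
x_test = torch.tensor([[-1.0], [0.0], [1.0]])
r_hat = torch.exp(clf(x_test)).squeeze()
r_true = torch.exp(x_test.squeeze() - 0.5)   # exact ratio for these two Gaussians
print(torch.stack([r_hat.detach(), r_true], dim=1))
```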
1 code implementation • 12 Oct 2010 • Kyle Cranmer, Itay Yavin
Searches for new physics by experimental collaborations represent a significant investment in time and resources.
High Energy Physics - Experiment • High Energy Physics - Phenomenology • Data Analysis, Statistics and Probability
9 code implementations • 10 Jul 2010 • Glen Cowan, Kyle Cranmer, Eilam Gross, Ofer Vitells
We describe likelihood-based statistical tests for use in high energy physics for the discovery of new phenomena and for construction of confidence intervals on model parameters.
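As a worked example, a standard special case of these likelihood-based tests is the asymptotic discovery significance for a single-bin counting experiment with known expected background, which reduces to a one-liner:

```python
import math

def discovery_significance(n_obs, b):
    """Asymptotic discovery significance Z = sqrt(q0) for a single-bin counting
    experiment with known expected background b, where
    q0 = 2 * (n * ln(n/b) - (n - b)) for n > b (and 0 otherwise)."""
    if n_obs <= b:
        return 0.0
    q0 = 2.0 * (n_obs * math.log(n_obs / b) - (n_obs - b))
    return math.sqrt(q0)

# Example: 120 observed events on an expected background of 100.
print(round(discovery_significance(120, 100.0), 2))   # ≈ 1.94, vs naive s/sqrt(b) = 2.0
```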
Data Analysis, Statistics and Probability • High Energy Physics - Experiment