Search Results for author: Kyle Cranmer

Found 49 papers, 27 papers with code

Advances in machine-learning-based sampling motivated by lattice quantum chromodynamics

no code implementations · 3 Sep 2023 · Kyle Cranmer, Gurtej Kanwar, Sébastien Racanière, Danilo J. Rezende, Phiala E. Shanahan

This Perspective outlines the advances in ML-based sampling motivated by lattice quantum field theory, in particular for the theory of quantum chromodynamics.


AI for Science: An Emerging Agenda

no code implementations · 7 Mar 2023 · Philipp Berens, Kyle Cranmer, Neil D. Lawrence, Ulrike Von Luxburg, Jessica Montgomery

This report summarises the discussions from the seminar and provides a roadmap to suggest how different communities can collaborate to deliver a new wave of progress in AI and its application for scientific discovery.

Aspects of scaling and scalability for flow-based sampling of lattice QCD

no code implementations · 14 Nov 2022 · Ryan Abbott, Michael S. Albergo, Aleksandar Botev, Denis Boyda, Kyle Cranmer, Daniel C. Hackett, Alexander G. D. G. Matthews, Sébastien Racanière, Ali Razavi, Danilo J. Rezende, Fernando Romero-López, Phiala E. Shanahan, Julian M. Urban

Recent applications of machine-learned normalizing flows to sampling in lattice field theory suggest that such methods may be able to mitigate critical slowing down and topological freezing.

Gauge-equivariant flow models for sampling in lattice field theories with pseudofermions

no code implementations · 18 Jul 2022 · Ryan Abbott, Michael S. Albergo, Denis Boyda, Kyle Cranmer, Daniel C. Hackett, Gurtej Kanwar, Sébastien Racanière, Danilo J. Rezende, Fernando Romero-López, Phiala E. Shanahan, Betsy Tian, Julian M. Urban

This work presents gauge-equivariant architectures for flow-based sampling in fermionic lattice field theories using pseudofermions as stochastic estimators for the fermionic determinant.

Flow-based sampling in the lattice Schwinger model at criticality

no code implementations · 23 Feb 2022 · Michael S. Albergo, Denis Boyda, Kyle Cranmer, Daniel C. Hackett, Gurtej Kanwar, Sébastien Racanière, Danilo J. Rezende, Fernando Romero-López, Phiala E. Shanahan, Julian M. Urban

In this work, we provide a numerical demonstration of robust flow-based sampling in the Schwinger model at the critical value of the fermion mass.

A neural simulation-based inference approach for characterizing the Galactic Center $\gamma$-ray excess

1 code implementation · 13 Oct 2021 · Siddharth Mishra-Sharma, Kyle Cranmer

The nature of the Fermi gamma-ray Galactic Center Excess (GCE) has remained a persistent mystery for over a decade.

Density Estimation

Flow-based sampling for multimodal distributions in lattice field theory

no code implementations · 1 Jul 2021 · Daniel C. Hackett, Chung-Chun Hsieh, Michael S. Albergo, Denis Boyda, Jiunn-Wei Chen, Kai-Feng Chen, Kyle Cranmer, Gurtej Kanwar, Phiala E. Shanahan

Recent results have demonstrated that samplers constructed with flow-based generative models are a promising new approach for configuration generation in lattice field theory.

Flow-based sampling for fermionic lattice field theories

no code implementations · 10 Jun 2021 · Michael S. Albergo, Gurtej Kanwar, Sébastien Racanière, Danilo J. Rezende, Julian M. Urban, Denis Boyda, Kyle Cranmer, Daniel C. Hackett, Phiala E. Shanahan

Algorithms based on normalizing flows are emerging as promising machine learning approaches to sampling complicated probability distributions in a way that can be made asymptotically exact.

Introduction to Normalizing Flows for Lattice Field Theory

no code implementations · 20 Jan 2021 · Michael S. Albergo, Denis Boyda, Daniel C. Hackett, Gurtej Kanwar, Kyle Cranmer, Sébastien Racanière, Danilo Jimenez Rezende, Phiala E. Shanahan

This notebook tutorial demonstrates a method for sampling Boltzmann distributions of lattice field theories using a class of machine learning models known as normalizing flows.

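The tutorial above rests on the change-of-variables trick behind normalizing flows. A minimal, self-contained sketch of that idea (a single affine layer and a toy action; this is an illustration, not the notebook's actual code):

```python
import math
import random

# A single affine "flow" layer maps a standard-normal base variable z to
# x = a*z + b.  By the change-of-variables formula,
#     log q(x) = log N(z; 0, 1) - log|a|.
# Reweighting samples by exp(-S(x)) / q(x) makes expectation values under
# the Boltzmann distribution p(x) ∝ exp(-S(x)) asymptotically exact.

def base_logprob(z):
    return -0.5 * z * z - 0.5 * math.log(2.0 * math.pi)

def flow_forward(z, a=0.8, b=0.3):
    x = a * z + b
    log_q = base_logprob(z) - math.log(abs(a))
    return x, log_q

def action(x):
    # toy one-site "lattice" action: S(x) = x^2 / 2
    return 0.5 * x * x

random.seed(0)
samples = [flow_forward(random.gauss(0.0, 1.0)) for _ in range(5000)]
# unnormalized importance weights p(x) / q(x):
weights = [math.exp(-action(x) - log_q) for x, log_q in samples]
# effective sample size: a standard diagnostic of flow quality
ess = sum(weights) ** 2 / sum(w * w for w in weights)
```

In real lattice applications the affine parameters become learned coupling layers and the reweighting is typically replaced by a Metropolis accept/reject step, but the log-density bookkeeping is the same.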

Hierarchical clustering in particle physics through reinforcement learning

1 code implementation · 16 Nov 2020 · Johann Brehmer, Sebastian Macaluso, Duccio Pappadopulo, Kyle Cranmer

Particle physics experiments often require the reconstruction of decay patterns through a hierarchical clustering of the observed final-state particles.

Clustering · Reinforcement Learning

Semi-parametric $γ$-ray modeling with Gaussian processes and variational inference

1 code implementation · 20 Oct 2020 · Siddharth Mishra-Sharma, Kyle Cranmer

Mismodeling the uncertain, diffuse emission of Galactic origin can seriously bias the characterization of astrophysical gamma-ray data, particularly in the region of the Inner Milky Way where such emission can make up over 80% of the photon counts observed at ~GeV energies.

Gaussian Processes · Variational Inference

Simulation-based inference methods for particle physics

no code implementations · 13 Oct 2020 · Johann Brehmer, Kyle Cranmer

Our predictions for particle physics processes are realized in a chain of complex simulators.

Probabilistic Programming

Sampling using $SU(N)$ gauge equivariant flows

no code implementations · 12 Aug 2020 · Denis Boyda, Gurtej Kanwar, Sébastien Racanière, Danilo Jimenez Rezende, Michael S. Albergo, Kyle Cranmer, Daniel C. Hackett, Phiala E. Shanahan

We develop a flow-based sampling algorithm for $SU(N)$ lattice gauge theories that is gauge-invariant by construction.

Secondary Vertex Finding in Jets with Neural Networks

1 code implementation · 6 Aug 2020 · Jonathan Shlomi, Sanmay Ganguly, Eilam Gross, Kyle Cranmer, Yaron Lipman, Hadar Serviansky, Haggai Maron, Nimrod Segol

Jet classification is an important ingredient in measurements and searches for new physics at particle colliders, and secondary vertex reconstruction is a key intermediate step in building powerful jet classifiers.

High Energy Physics - Experiment · High Energy Physics - Phenomenology

Discovering Symbolic Models from Deep Learning with Inductive Biases

3 code implementations · NeurIPS 2020 · Miles Cranmer, Alvaro Sanchez-Gonzalez, Peter Battaglia, Rui Xu, Kyle Cranmer, David Spergel, Shirley Ho

The technique works as follows: we first encourage sparse latent representations when we train a GNN in a supervised setting, then we apply symbolic regression to components of the learned model to extract explicit physical relations.

Symbolic Regression
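The second stage of the recipe above (distilling a learned component into an explicit formula) can be illustrated with a deliberately tiny, hypothetical stand-in for symbolic regression; the expression library and data below are made up for illustration, and real systems search a vastly larger expression space:

```python
# Toy symbolic regression: given samples of some learned quantity as a
# function of its inputs, search a small library of candidate expressions
# and keep the one with the lowest squared error.

CANDIDATES = {
    "x + y":    lambda x, y: x + y,
    "x * y":    lambda x, y: x * y,
    "x / y":    lambda x, y: x / y,
    "x - y":    lambda x, y: x - y,
    "x**2 * y": lambda x, y: x * x * y,
}

def fit_symbolic(data):
    """data: list of (x, y, target) triples; returns best expression name."""
    def err(f):
        return sum((f(x, y) - t) ** 2 for x, y, t in data)
    return min(CANDIDATES, key=lambda name: err(CANDIDATES[name]))

# if the hidden relation is t = x * y, the search recovers "x * y"
data = [(x, y, x * y) for x in (1.0, 2.0, 3.0) for y in (0.5, 1.5)]
```

The paper's point is that encouraging sparse GNN messages makes them amenable to exactly this kind of search.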

Flows for simultaneous manifold learning and density estimation

2 code implementations · NeurIPS 2020 · Johann Brehmer, Kyle Cranmer

We introduce manifold-learning flows (M-flows), a new class of generative models that simultaneously learn the data manifold as well as a tractable probability density on that manifold.

Denoising · Density Estimation

Equivariant flow-based sampling for lattice gauge theory

no code implementations · 13 Mar 2020 · Gurtej Kanwar, Michael S. Albergo, Denis Boyda, Kyle Cranmer, Daniel C. Hackett, Sébastien Racanière, Danilo Jimenez Rezende, Phiala E. Shanahan

We define a class of machine-learned flow-based sampling algorithms for lattice gauge theories that are gauge-invariant by construction.

Data Structures & Algorithms for Exact Inference in Hierarchical Clustering

1 code implementation · 26 Feb 2020 · Craig S. Greenberg, Sebastian Macaluso, Nicholas Monath, Ji-Ah Lee, Patrick Flaherty, Kyle Cranmer, Andrew McGregor, Andrew McCallum

In contrast to existing methods, we present novel dynamic-programming algorithms for \emph{exact} inference in hierarchical clustering based on a novel trellis data structure, and we prove that we can exactly compute the partition function, maximum likelihood hierarchy, and marginal probabilities of sub-hierarchies and clusters.

Clustering · Small Data Image Classification
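The exact recursion behind a trellis of this kind can be sketched in a few lines. This is a hedged illustration, not the paper's data structure: the split potential `phi` and the leaf labels are made up, and with `phi ≡ 1` the partition function simply counts the (2n−3)!! binary hierarchies over n leaves:

```python
from functools import lru_cache
from itertools import combinations

# Partition function over all binary hierarchies of a leaf set S:
#     Z(S) = sum over splits S = L ∪ R of  phi(L, R) * Z(L) * Z(R)
# Memoizing on subsets gives exact inference over an exponential space
# of hierarchies without enumerating them one by one.

def phi(left, right):
    return 1.0  # toy split potential; a real model scores each split

@lru_cache(maxsize=None)
def Z(S):
    # S is a frozenset of leaf labels
    if len(S) <= 1:
        return 1.0
    total = 0.0
    items = sorted(S)
    pivot, rest = items[0], items[1:]
    # fixing one element on the left counts each unordered split once
    for r in range(len(rest) + 1):
        for combo in combinations(rest, r):
            left = frozenset((pivot,) + combo)
            right = S - left
            if right:
                total += phi(left, right) * Z(left) * Z(right)
    return total
```

Swapping in a data-dependent `phi` turns the same recursion into the (unnormalized) posterior over hierarchies, and max in place of sum yields the maximum-likelihood hierarchy.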

Set2Graph: Learning Graphs From Sets

1 code implementation · NeurIPS 2020 · Hadar Serviansky, Nimrod Segol, Jonathan Shlomi, Kyle Cranmer, Eilam Gross, Haggai Maron, Yaron Lipman

Many problems in machine learning can be cast as learning functions from sets to graphs, or more generally to hypergraphs; in short, Set2Graph functions.

Clustering

The frontier of simulation-based inference

no code implementations · 4 Nov 2019 · Kyle Cranmer, Johann Brehmer, Gilles Louppe

Many domains of science have developed complex simulations to describe phenomena of interest.

Hamiltonian Graph Networks with ODE Integrators

no code implementations · 27 Sep 2019 · Alvaro Sanchez-Gonzalez, Victor Bapst, Kyle Cranmer, Peter Battaglia

We introduce an approach for imposing physically informed inductive biases in learned simulation models.

Mining for Dark Matter Substructure: Inferring subhalo population properties from strong lenses with machine learning

3 code implementations · 4 Sep 2019 · Johann Brehmer, Siddharth Mishra-Sharma, Joeri Hermans, Gilles Louppe, Kyle Cranmer

The subtle and unique imprint of dark matter substructure on extended arcs in strong lensing systems contains a wealth of information about the properties and distribution of dark matter on small scales and, consequently, about the underlying particle physics.


MadMiner: Machine learning-based inference for particle physics

5 code implementations · 24 Jul 2019 · Johann Brehmer, Felix Kling, Irina Espejo, Kyle Cranmer

Precision measurements at the LHC often require analyzing high-dimensional event data for subtle kinematic signatures, which is challenging for established analysis methods.


Effective LHC measurements with matrix elements and machine learning

no code implementations · 4 Jun 2019 · Johann Brehmer, Kyle Cranmer, Irina Espejo, Felix Kling, Gilles Louppe, Juan Pavez

One major challenge for the legacy measurements at the LHC is that the likelihood function is not tractable when the collected data is high-dimensional and the detector response has to be modeled.

Density Estimation

Inferring the quantum density matrix with machine learning

no code implementations · 11 Apr 2019 · Kyle Cranmer, Siavash Golkar, Duccio Pappadopulo

We also introduce quantum flows, the quantum analog of normalizing flows, which can be used to increase the expressivity of this variational family.

Variational Inference

Machine learning and the physical sciences

1 code implementation · 25 Mar 2019 · Giuseppe Carleo, Ignacio Cirac, Kyle Cranmer, Laurent Daudet, Maria Schuld, Naftali Tishby, Leslie Vogt-Maranto, Lenka Zdeborová

Machine learning encompasses a broad range of algorithms and modeling tools used for a vast array of data processing tasks, and it has entered most scientific disciplines in recent years.

Computational Physics · Cosmology and Nongalactic Astrophysics · Disordered Systems and Neural Networks · High Energy Physics - Theory · Quantum Physics

Efficient Probabilistic Inference in the Quest for Physics Beyond the Standard Model

3 code implementations · NeurIPS 2019 · Atılım Güneş Baydin, Lukas Heinrich, Wahid Bhimji, Lei Shao, Saeid Naderiparizi, Andreas Munk, Jialin Liu, Bradley Gram-Hansen, Gilles Louppe, Lawrence Meadows, Philip Torr, Victor Lee, Prabhat, Kyle Cranmer, Frank Wood

We present a novel probabilistic programming framework that couples directly to existing large-scale simulators through a cross-platform probabilistic execution protocol, which allows general-purpose inference engines to record and control random number draws within simulators in a language-agnostic way.

Probabilistic Programming

Machine Learning in High Energy Physics Community White Paper

no code implementations · 8 Jul 2018 · Kim Albertsson, Piero Altoe, Dustin Anderson, John Anderson, Michael Andrews, Juan Pedro Araque Espinosa, Adam Aurisano, Laurent Basara, Adrian Bevan, Wahid Bhimji, Daniele Bonacorsi, Bjorn Burkle, Paolo Calafiura, Mario Campanelli, Louis Capps, Federico Carminati, Stefano Carrazza, Yi-fan Chen, Taylor Childers, Yann Coadou, Elias Coniavitis, Kyle Cranmer, Claire David, Douglas Davis, Andrea De Simone, Javier Duarte, Martin Erdmann, Jonas Eschle, Amir Farbin, Matthew Feickert, Nuno Filipe Castro, Conor Fitzpatrick, Michele Floris, Alessandra Forti, Jordi Garra-Tico, Jochen Gemmler, Maria Girone, Paul Glaysher, Sergei Gleyzer, Vladimir Gligorov, Tobias Golling, Jonas Graw, Lindsey Gray, Dick Greenwood, Thomas Hacker, John Harvey, Benedikt Hegner, Lukas Heinrich, Ulrich Heintz, Ben Hooberman, Johannes Junggeburth, Michael Kagan, Meghan Kane, Konstantin Kanishchev, Przemysław Karpiński, Zahari Kassabov, Gaurav Kaul, Dorian Kcira, Thomas Keck, Alexei Klimentov, Jim Kowalkowski, Luke Kreczko, Alexander Kurepin, Rob Kutschke, Valentin Kuznetsov, Nicolas Köhler, Igor Lakomov, Kevin Lannon, Mario Lassnig, Antonio Limosani, Gilles Louppe, Aashrita Mangu, Pere Mato, Narain Meenakshi, Helge Meinhard, Dario Menasce, Lorenzo Moneta, Seth Moortgat, Mark Neubauer, Harvey Newman, Sydney Otten, Hans Pabst, Michela Paganini, Manfred Paulini, Gabriel Perdue, Uzziel Perez, Attilio Picazio, Jim Pivarski, Harrison Prosper, Fernanda Psihas, Alexander Radovic, Ryan Reece, Aurelius Rinkevicius, Eduardo Rodrigues, Jamal Rorie, David Rousseau, Aaron Sauers, Steven Schramm, Ariel Schwartzman, Horst Severini, Paul Seyfert, Filip Siroky, Konstantin Skazytkin, Mike Sokoloff, Graeme Stewart, Bob Stienen, Ian Stockdale, Giles Strong, Wei Sun, Savannah Thais, Karen Tomko, Eli Upfal, Emanuele Usai, Andrey Ustyuzhanin, Martin Vala, Justin Vasel, Sofia Vallecorsa, Mauro Verzetti, Xavier Vilasís-Cardona, Jean-Roch Vlimant, Ilija Vukotic, Sean-Jiun Wang, Gordon Watts,
Michael Williams, Wenjing Wu, Stefan Wunsch, Kun Yang, Omar Zapata

In this document we discuss promising future research and development areas for machine learning in particle physics.


Backdrop: Stochastic Backpropagation

1 code implementation · ICLR 2019 · Siavash Golkar, Kyle Cranmer

We introduce backdrop, a flexible and simple-to-implement method, intuitively described as dropout acting only along the backpropagation pipeline.
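The "dropout on the backward pass only" idea can be sketched with a hand-computed gradient. This is a hedged illustration of the concept, not the authors' implementation: for a mean loss, backdrop leaves the forward pass intact but masks a random subset of per-sample gradient contributions, rescaling so the gradient stays unbiased:

```python
import random

# Linear model y ≈ w * x with mean-squared-error loss.  The forward pass
# uses every sample; the backward pass keeps each sample's gradient term
# with probability keep_prob and rescales by 1 / keep_prob.

def loss_and_backdrop_grad(w, xs, ys, keep_prob=0.5, rng=random.Random(0)):
    n = len(xs)
    # forward: full loss, untouched by the mask
    residuals = [w * x - y for x, y in zip(xs, ys)]
    loss = sum(r * r for r in residuals) / n
    # backward: stochastic, masked gradient
    grad = 0.0
    for x, r in zip(xs, residuals):
        if rng.random() < keep_prob:
            grad += (2.0 * r * x) / (n * keep_prob)
    return loss, grad
```

With `keep_prob=1.0` this reduces to the ordinary exact gradient, which makes the masking easy to test in isolation.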

A Guide to Constraining Effective Field Theories with Machine Learning

2 code implementations · 30 Apr 2018 · Johann Brehmer, Kyle Cranmer, Gilles Louppe, Juan Pavez

We develop, discuss, and compare several inference techniques to constrain theory parameters in collider experiments.


Constraining Effective Field Theories with Machine Learning

1 code implementation · 30 Apr 2018 · Johann Brehmer, Kyle Cranmer, Gilles Louppe, Juan Pavez

We present powerful new analysis techniques to constrain effective field theories at the LHC.


A Roadmap for HEP Software and Computing R&D for the 2020s

1 code implementation · 18 Dec 2017 · Johannes Albrecht, Antonio Augusto Alves Jr, Guilherme Amadio, Giuseppe Andronico, Nguyen Anh-Ky, Laurent Aphecetche, John Apostolakis, Makoto Asai, Luca Atzori, Marian Babik, Giuseppe Bagliesi, Marilena Bandieramonte, Sunanda Banerjee, Martin Barisits, Lothar A. T. Bauerdick, Stefano Belforte, Douglas Benjamin, Catrin Bernius, Wahid Bhimji, Riccardo Maria Bianchi, Ian Bird, Catherine Biscarat, Jakob Blomer, Kenneth Bloom, Tommaso Boccali, Brian Bockelman, Tomasz Bold, Daniele Bonacorsi, Antonio Boveia, Concezio Bozzi, Marko Bracko, David Britton, Andy Buckley, Predrag Buncic, Paolo Calafiura, Simone Campana, Philippe Canal, Luca Canali, Gianpaolo Carlino, Nuno Castro, Marco Cattaneo, Gianluca Cerminara, Javier Cervantes Villanueva, Philip Chang, John Chapman, Gang Chen, Taylor Childers, Peter Clarke, Marco Clemencic, Eric Cogneras, Jeremy Coles, Ian Collier, David Colling, Gloria Corti, Gabriele Cosmo, Davide Costanzo, Ben Couturier, Kyle Cranmer, Jack Cranshaw, Leonardo Cristella, David Crooks, Sabine Crépé-Renaudin, Robert Currie, Sünje Dallmeier-Tiessen, Kaushik De, Michel De Cian, Albert De Roeck, Antonio Delgado Peris, Frédéric Derue, Alessandro Di Girolamo, Salvatore Di Guida, Gancho Dimitrov, Caterina Doglioni, Andrea Dotti, Dirk Duellmann, Laurent Duflot, Dave Dykstra, Katarzyna Dziedziniewicz-Wojcik, Agnieszka Dziurda, Ulrik Egede, Peter Elmer, Johannes Elmsheuser, V. Daniel Elvira, Giulio Eulisse, Steven Farrell, Torben Ferber, Andrej Filipcic, Ian Fisk, Conor Fitzpatrick, José Flix, Andrea Formica, Alessandra Forti, Giovanni Franzoni, James Frost, Stu Fuess, Frank Gaede, Gerardo Ganis, Robert Gardner, Vincent Garonne, Andreas Gellrich, Krzysztof Genser, Simon George, Frank Geurts, Andrei Gheata, Mihaela Gheata, Francesco Giacomini, Stefano Giagu, Manuel Giffels, Douglas Gingrich, Maria Girone, Vladimir V. Gligorov, Ivan Glushkov, Wesley Gohn, Jose Benito Gonzalez Lopez, Isidro González Caballero, Juan R.
González Fernández, Giacomo Govi, Claudio Grandi, Hadrien Grasland, Heather Gray, Lucia Grillo, Wen Guan, Oliver Gutsche, Vardan Gyurjyan, Andrew Hanushevsky, Farah Hariri, Thomas Hartmann, John Harvey, Thomas Hauth, Benedikt Hegner, Beate Heinemann, Lukas Heinrich, Andreas Heiss, José M. Hernández, Michael Hildreth, Mark Hodgkinson, Stefan Hoeche, Burt Holzman, Peter Hristov, Xingtao Huang, Vladimir N. Ivanchenko, Todor Ivanov, Jan Iven, Brij Jashal, Bodhitha Jayatilaka, Roger Jones, Michel Jouvin, Soon Yung Jun, Michael Kagan, Charles William Kalderon, Meghan Kane, Edward Karavakis, Daniel S. Katz, Dorian Kcira, Oliver Keeble, Borut Paul Kersevan, Michael Kirby, Alexei Klimentov, Markus Klute, Ilya Komarov, Dmitri Konstantinov, Patrick Koppenburg, Jim Kowalkowski, Luke Kreczko, Thomas Kuhr, Robert Kutschke, Valentin Kuznetsov, Walter Lampl, Eric Lancon, David Lange, Mario Lassnig, Paul Laycock, Charles Leggett, James Letts, Birgit Lewendel, Teng Li, Guilherme Lima, Jacob Linacre, Tomas Linden, Miron Livny, Giuseppe Lo Presti, Sebastian Lopienski, Peter Love, Adam Lyon, Nicolò Magini, Zachary L. Marshall, Edoardo Martelli, Stewart Martin-Haugh, Pere Mato, Kajari Mazumdar, Thomas McCauley, Josh McFayden, Shawn McKee, Andrew McNab, Rashid Mehdiyev, Helge Meinhard, Dario Menasce, Patricia Mendez Lorenzo, Alaettin Serhan Mete, Michele Michelotto, Jovan Mitrevski, Lorenzo Moneta, Ben Morgan, Richard Mount, Edward Moyse, Sean Murray, Armin Nairz, Mark S. Neubauer, Andrew Norman, Sérgio Novaes, Mihaly Novak, Arantza Oyanguren, Nurcan Ozturk, Andres Pacheco Pages, Michela Paganini, Jerome Pansanel, Vincent R. 
Pascuzzi, Glenn Patrick, Alex Pearce, Ben Pearson, Kevin Pedro, Gabriel Perdue, Antonio Perez-Calero Yzquierdo, Luca Perrozzi, Troels Petersen, Marko Petric, Andreas Petzold, Jónatan Piedra, Leo Piilonen, Danilo Piparo, Jim Pivarski, Witold Pokorski, Francesco Polci, Karolos Potamianos, Fernanda Psihas, Albert Puig Navarro, Günter Quast, Gerhard Raven, Jürgen Reuter, Alberto Ribon, Lorenzo Rinaldi, Martin Ritter, James Robinson, Eduardo Rodrigues, Stefan Roiser, David Rousseau, Gareth Roy, Grigori Rybkine, Andre Sailer, Tai Sakuma, Renato Santana, Andrea Sartirana, Heidi Schellman, Jaroslava Schovancová, Steven Schramm, Markus Schulz, Andrea Sciabà, Sally Seidel, Sezen Sekmen, Cedric Serfon, Horst Severini, Elizabeth Sexton-Kennedy, Michael Seymour, Davide Sgalaberna, Illya Shapoval, Jamie Shiers, Jing-Ge Shiu, Hannah Short, Gian Piero Siroli, Sam Skipsey, Tim Smith, Scott Snyder, Michael D. Sokoloff, Panagiotis Spentzouris, Hartmut Stadie, Giordon Stark, Gordon Stewart, Graeme A. Stewart, Arturo Sánchez, Alberto Sánchez-Hernández, Anyes Taffard, Umberto Tamponi, Jeff Templon, Giacomo Tenaglia, Vakhtang Tsulaia, Christopher Tunnell, Eric Vaandering, Andrea Valassi, Sofia Vallecorsa, Liviu Valsan, Peter Van Gemmeren, Renaud Vernet, Brett Viren, Jean-Roch Vlimant, Christian Voss, Margaret Votava, Carl Vuosalo, Carlos Vázquez Sierra, Romain Wartel, Gordon T. Watts, Torre Wenaus, Sandro Wenzel, Mike Williams, Frank Winklmeier, Christoph Wissing, Frank Wuerthwein, Benjamin Wynne, Zhang Xiaomei, Wei Yang, Efe Yazgan

Particle physics has an ambitious and broad experimental programme for the coming decades.

Computational Physics · High Energy Physics - Experiment

Modeling Smooth Backgrounds and Generic Localized Signals with Gaussian Processes

no code implementations · 17 Sep 2017 · Meghan Frate, Kyle Cranmer, Saarik Kalia, Alexander Vandenberg-Rodes, Daniel Whiteson

We demonstrate the application of this approach to modeling the background to searches for dijet resonances at the Large Hadron Collider and describe how the approach can be used in the search for generic localized signals.

Data Analysis, Statistics and Probability · High Energy Physics - Experiment · High Energy Physics - Phenomenology

Adversarial Variational Optimization of Non-Differentiable Simulators

2 code implementations · 22 Jul 2017 · Gilles Louppe, Joeri Hermans, Kyle Cranmer

We adapt the training procedure of generative adversarial networks by replacing the differentiable generative network with a domain-specific simulator.

QCD-Aware Recursive Neural Networks for Jet Physics

5 code implementations · 2 Feb 2017 · Gilles Louppe, Kyunghyun Cho, Cyril Becot, Kyle Cranmer

Recent progress in applying machine learning for jet physics has been built upon an analogy between calorimeters and images.


Learning to Pivot with Adversarial Networks

5 code implementations · NeurIPS 2017 · Gilles Louppe, Michael Kagan, Kyle Cranmer

Several techniques for domain adaptation have been proposed to account for differences in the distribution of the data used for training and testing.

Domain Adaptation · Fairness

Parameterized Machine Learning for High-Energy Physics

2 code implementations · 28 Jan 2016 · Pierre Baldi, Kyle Cranmer, Taylor Faucett, Peter Sadowski, Daniel Whiteson

We investigate a new structure for machine learning classifiers applied to problems in high-energy physics by expanding the inputs to include not only measured features but also physics parameters.


Approximating Likelihood Ratios with Calibrated Discriminative Classifiers

2 code implementations · 6 Jun 2015 · Kyle Cranmer, Juan Pavez, Gilles Louppe

This leads to a new machine learning-based approach to likelihood-free inference that is complementary to Approximate Bayesian Computation, and which does not require a prior on the model parameters.

Dimensionality Reduction
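The key identity behind this paper's approach: a classifier trained to distinguish samples from p(x|θ₀) (label 1) versus p(x|θ₁) (label 0) has optimal output s(x) = p₀(x) / (p₀(x) + p₁(x)), so the likelihood ratio is recovered as r(x) = s(x) / (1 − s(x)). A sketch with exact 1D Gaussian densities standing in for the (usually intractable) simulator distributions; the means and widths here are arbitrary:

```python
import math

def gauss(x, mu, sigma=1.0):
    # exact 1D normal density, playing the role of an intractable model
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

def optimal_classifier(x, mu0=1.0, mu1=-1.0):
    # the Bayes-optimal discriminator between the two hypotheses
    p0, p1 = gauss(x, mu0), gauss(x, mu1)
    return p0 / (p0 + p1)

def ratio_from_classifier(x):
    # the classifier output monotonically encodes the likelihood ratio
    s = optimal_classifier(x)
    return s / (1.0 - s)
```

In practice s(x) is a trained classifier whose output is then *calibrated*, which is the step that makes the recovered ratio quantitatively reliable.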

RECAST: Extending the Impact of Existing Analyses

1 code implementation · 12 Oct 2010 · Kyle Cranmer, Itay Yavin

Searches for new physics by experimental collaborations represent a significant investment in time and resources.

High Energy Physics - Experiment High Energy Physics - Phenomenology Data Analysis, Statistics and Probability

Asymptotic formulae for likelihood-based tests of new physics

9 code implementations · 10 Jul 2010 · Glen Cowan, Kyle Cranmer, Eilam Gross, Ofer Vitells

We describe likelihood-based statistical tests for use in high energy physics for the discovery of new phenomena and for construction of confidence intervals on model parameters.

Data Analysis, Statistics and Probability · High Energy Physics - Experiment
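One widely used result from this paper is the median ("Asimov") discovery significance for counting s signal events on top of b expected background events, derived from the asymptotic distribution of the profile likelihood ratio:

```python
import math

def asimov_significance(s, b):
    # Z = sqrt(2 * ((s + b) * ln(1 + s/b) - s));
    # for s << b this reduces to the familiar approximation s / sqrt(b)
    return math.sqrt(2.0 * ((s + b) * math.log(1.0 + s / b) - s))
```

For example, `asimov_significance(10, 100)` gives about 0.98, slightly below the naive s / sqrt(b) = 1.0, as expected for a non-negligible s/b.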
