1 code implementation • 1 Nov 2024 • Johann Brehmer, Víctor Bresó, Pim de Haan, Tilman Plehn, Huilin Qu, Jonas Spinner, Jesse Thaler
We show that the Lorentz-Equivariant Geometric Algebra Transformer (L-GATr) yields state-of-the-art performance for a wide range of machine learning tasks at the Large Hadron Collider.
no code implementations • 30 Oct 2024 • Johann Brehmer, Sönke Behrends, Pim de Haan, Taco Cohen
Given large data sets and sufficient compute, is it beneficial to design neural architectures for the structure and symmetries of each problem?
no code implementations • 21 Jun 2024 • Thomas Hehn, Markus Peschl, Tribhuvanesh Orekondy, Arash Behboodi, Johann Brehmer
Modelling the propagation of electromagnetic wireless signals is critical for designing modern communication systems.
1 code implementation • 23 May 2024 • Jonas Spinner, Víctor Bresó, Pim de Haan, Tilman Plehn, Jesse Thaler, Johann Brehmer
We propose the Lorentz Geometric Algebra Transformer (L-GATr), a new multi-purpose architecture for high-energy physics.
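The symmetry L-GATr is built around is the Lorentz group, whose defining property is that it preserves the Minkowski inner product between four-momenta. A minimal numpy sketch (illustrative only, not from the paper's code) verifying this invariance for a boost along the x-axis:

```python
import numpy as np

# Minkowski metric with signature (+, -, -, -)
eta = np.diag([1.0, -1.0, -1.0, -1.0])

def boost_x(beta):
    """Lorentz boost along the x-axis with velocity beta (in units of c)."""
    gamma = 1.0 / np.sqrt(1.0 - beta**2)
    L = np.eye(4)
    L[0, 0] = L[1, 1] = gamma
    L[0, 1] = L[1, 0] = -gamma * beta
    return L

def minkowski_inner(p, q):
    """Invariant inner product of two four-vectors (E, px, py, pz)."""
    return p @ eta @ q

p = np.array([5.0, 1.0, 2.0, 3.0])
q = np.array([4.0, 0.5, 1.0, 2.0])
L = boost_x(0.6)

print(minkowski_inner(p, q))           # 11.5
print(minkowski_inner(L @ p, L @ q))   # 11.5 — unchanged under the boost
```

An equivariant architecture guarantees that quantities like this are respected by construction rather than learned from data.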
no code implementations • 6 Dec 2023 • Ekdeep Singh Lubana, Johann Brehmer, Pim de Haan, Taco Cohen
We explore the viability of casting foundation models as generic reward functions for reinforcement learning.
1 code implementation • 8 Nov 2023 • Pim de Haan, Taco Cohen, Johann Brehmer
The Geometric Algebra Transformer (GATr) is a versatile architecture for geometric deep learning based on projective geometric algebra.
2 code implementations • NeurIPS 2023 • Johann Brehmer, Pim de Haan, Sönke Behrends, Taco Cohen
In this paper we introduce the Geometric Algebra Transformer (GATr), a general-purpose architecture for geometric data.
1 code implementation • 4 Nov 2022 • Risto Vuorio, Pim de Haan, Johann Brehmer, Hanno Ackermann, Daniel Dijkman, Taco Cohen
Standard imitation learning can fail when the expert demonstrators have different sensory inputs than the imitating agent.
no code implementations • 30 Mar 2022 • Johann Brehmer, Pim de Haan, Phillip Lippe, Taco Cohen
Learning high-level causal representations together with a causal model from unstructured low-level data such as pixels is impossible from observational data alone.
no code implementations • 6 Dec 2021 • Alexander Lavin, David Krakauer, Hector Zenil, Justin Gottschlich, Tim Mattson, Johann Brehmer, Anima Anandkumar, Sanjay Choudry, Kamil Rocki, Atılım Güneş Baydin, Carina Prunkl, Brooks Paige, Olexandr Isayev, Erik Peterson, Peter L. McMahon, Jakob Macke, Kyle Cranmer, Jiaxin Zhang, Haruko Wainwright, Adi Hanuka, Manuela Veloso, Samuel Assefa, Stephan Zheng, Avi Pfeffer
We present the "Nine Motifs of Simulation Intelligence", a roadmap for the development and integration of the essential algorithms necessary for a merger of scientific computing, scientific simulation, and artificial intelligence.
no code implementations • 19 Nov 2021 • Ties van Rozendaal, Johann Brehmer, Yunfan Zhang, Reza Pourreza, Auke Wiggers, Taco S. Cohen
We introduce a video compression algorithm based on instance-adaptive learning.
1 code implementation • 16 Nov 2020 • Johann Brehmer, Sebastian Macaluso, Duccio Pappadopulo, Kyle Cranmer
Particle physics experiments often require the reconstruction of decay patterns through a hierarchical clustering of the observed final-state particles.
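The paper studies such hierarchical clusterings probabilistically; as context, here is a hedged toy sketch (not the paper's algorithm) of greedy agglomerative clustering of four-momenta, repeatedly merging the pair with the smallest combined invariant mass and recording the resulting binary tree:

```python
import numpy as np

def inv_mass2(p):
    """Squared invariant mass of a four-momentum (E, px, py, pz)."""
    return p[0]**2 - np.sum(p[1:]**2)

def greedy_cluster(momenta):
    """Greedily merge the pair of objects whose combined invariant
    mass is smallest, until one object remains. Returns the total
    four-momentum and the list of merges (a binary clustering tree)."""
    objs = [np.asarray(p, dtype=float) for p in momenta]
    history = []
    while len(objs) > 1:
        i, j = min(
            ((i, j) for i in range(len(objs)) for j in range(i + 1, len(objs))),
            key=lambda ij: inv_mass2(objs[ij[0]] + objs[ij[1]]),
        )
        merged = objs[i] + objs[j]
        history.append((i, j, np.sqrt(max(inv_mass2(merged), 0.0))))
        objs = [p for k, p in enumerate(objs) if k not in (i, j)] + [merged]
    return objs[0], history

# example final-state four-momenta (E, px, py, pz)
particles = [[1.0, 1.0, 0.0, 0.0],
             [2.0, 0.0, 2.0, 0.0],
             [1.5, 0.0, 0.0, 1.5],
             [1.0, -1.0, 0.0, 0.0]]
root, merges = greedy_cluster(particles)
print(root)   # total four-momentum [5.5, 0.0, 2.0, 1.5]
```

Real jet algorithms use different distance measures (e.g. the kt family), and the paper's point is precisely that a single greedy tree like this one is only one of many possible reconstructions.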
no code implementations • 13 Oct 2020 • Johann Brehmer, Kyle Cranmer
Our predictions for particle physics processes are realized in a chain of complex simulators.
2 code implementations • NeurIPS 2020 • Johann Brehmer, Kyle Cranmer
We introduce manifold-learning flows (M-flows), a new class of generative models that simultaneously learn the data manifold as well as a tractable probability density on that manifold.
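The core idea can be illustrated without any learning: a chart maps a low-dimensional latent onto a manifold embedded in a higher-dimensional space, and the density on the manifold follows from the injective change-of-variables formula with the Jacobian's Gram determinant. A hand-built numpy toy (an ellipse in R^2 with a 1-d Gaussian latent; the chart here is fixed, whereas an M-flow learns it):

```python
import numpy as np

a, b = 2.0, 1.0  # semi-axes of the ellipse: a 1-d "manifold" embedded in R^2

def chart(u):
    """Map the 1-d latent coordinate u onto the ellipse in R^2."""
    return np.array([a * np.cos(u), b * np.sin(u)])

def manifold_density(u):
    """Density induced on the manifold by a standard normal latent,
    via p(x) = p_u(u) / sqrt(det(J^T J)), J the Jacobian of the chart."""
    p_u = np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)
    J = np.array([-a * np.sin(u), b * np.cos(u)])  # Jacobian, shape (2,)
    gram = np.sqrt(J @ J)  # sqrt(det(J^T J)) for a 1-d latent
    return p_u / gram

u = 0.3
print(chart(u), manifold_density(u))
```

An M-flow parameterizes both the chart and the latent density with invertible networks, so the manifold and the density on it are learned jointly.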
no code implementations • 4 Nov 2019 • Kyle Cranmer, Johann Brehmer, Gilles Louppe
Many domains of science have developed complex simulations to describe phenomena of interest.
3 code implementations • 4 Sep 2019 • Johann Brehmer, Siddharth Mishra-Sharma, Joeri Hermans, Gilles Louppe, Kyle Cranmer
The subtle and unique imprint of dark matter substructure on extended arcs in strong lensing systems contains a wealth of information about the properties and distribution of dark matter on small scales and, consequently, about the underlying particle physics.
5 code implementations • 24 Jul 2019 • Johann Brehmer, Felix Kling, Irina Espejo, Kyle Cranmer
Precision measurements at the LHC often require analyzing high-dimensional event data for subtle kinematic signatures, which is challenging for established analysis methods.
no code implementations • 4 Jun 2019 • Johann Brehmer, Kyle Cranmer, Irina Espejo, Felix Kling, Gilles Louppe, Juan Pavez
One major challenge for the legacy measurements at the LHC is that the likelihood function is not tractable when the collected data is high-dimensional and the detector response has to be modeled.
no code implementations • 2 Aug 2018 • Markus Stoye, Johann Brehmer, Gilles Louppe, Juan Pavez, Kyle Cranmer
We extend recent work (Brehmer et al.).

5 code implementations • 30 May 2018 • Johann Brehmer, Gilles Louppe, Juan Pavez, Kyle Cranmer
Simulators often provide the best description of real-world phenomena.
2 code implementations • 30 Apr 2018 • Johann Brehmer, Kyle Cranmer, Gilles Louppe, Juan Pavez
We develop, discuss, and compare several inference techniques to constrain theory parameters in collider experiments.
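A building block common to this line of work is the likelihood-ratio trick: a classifier trained to separate samples generated at two parameter points can be converted into an estimate of the intractable likelihood ratio via r(x) ≈ s(x)/(1 − s(x)). A minimal self-contained sketch on toy Gaussians (logistic regression by gradient descent; illustrative only, not the paper's pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "simulator": unit-variance Gaussians at two parameter points
x0 = rng.normal(0.0, 1.0, size=20000)  # samples from theta_0
x1 = rng.normal(0.5, 1.0, size=20000)  # samples from theta_1

X = np.concatenate([x0, x1])
y = np.concatenate([np.zeros_like(x0), np.ones_like(x1)])

# Logistic regression s(x) = sigmoid(w*x + c), fit by gradient descent
w, c = 0.0, 0.0
for _ in range(2000):
    s = 1.0 / (1.0 + np.exp(-(w * X + c)))
    w -= 0.1 * np.mean((s - y) * X)
    c -= 0.1 * np.mean(s - y)

def log_ratio(x):
    """Estimated log p(x|theta_1)/p(x|theta_0) via the ratio trick."""
    s = 1.0 / (1.0 + np.exp(-(w * x + c)))
    return np.log(s / (1.0 - s))

# For these Gaussians the exact log ratio is 0.5*x - 0.125
print(log_ratio(1.0))  # close to 0.375
```

With balanced classes the optimal classifier output is s(x) = p(x|θ1)/(p(x|θ0) + p(x|θ1)), so s/(1 − s) recovers the ratio exactly; the papers' contribution is, in part, exploiting additional latent information from the simulator to train such estimators more efficiently.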
1 code implementation • 30 Apr 2018 • Johann Brehmer, Kyle Cranmer, Gilles Louppe, Juan Pavez
We present powerful new analysis techniques to constrain effective field theories at the LHC.