Search Results for author: Johann Brehmer

Found 22 papers, 12 papers with code

A Lorentz-Equivariant Transformer for All of the LHC

1 code implementation • 1 Nov 2024 • Johann Brehmer, Víctor Bresó, Pim de Haan, Tilman Plehn, Huilin Qu, Jonas Spinner, Jesse Thaler

We show that the Lorentz-Equivariant Geometric Algebra Transformer (L-GATr) yields state-of-the-art performance for a wide range of machine learning tasks at the Large Hadron Collider.
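
As background on what Lorentz equivariance buys here: the network's outputs transform consistently under boosts and rotations, so quantities built from Minkowski inner products of four-momenta are preserved. A minimal NumPy sketch of that invariant (the metric signature (+, -, -, -) is the usual convention, not taken from this listing):

```python
import numpy as np

# Minkowski metric with signature (+, -, -, -)
ETA = np.diag([1.0, -1.0, -1.0, -1.0])

def minkowski_inner(p, q):
    """Lorentz-invariant inner product of two four-momenta (E, px, py, pz)."""
    return p @ ETA @ q

p1 = np.array([50.0, 10.0, 20.0, 40.0])
p2 = np.array([30.0, -5.0, 15.0, 20.0])

print(minkowski_inner(p1, p2))  # unchanged under any Lorentz transformation
print(minkowski_inner(p1, p1))  # invariant mass squared of p1
```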

Does equivariance matter at scale?

no code implementations • 30 Oct 2024 • Johann Brehmer, Sönke Behrends, Pim de Haan, Taco Cohen

Given large data sets and sufficient compute, is it beneficial to design neural architectures for the structure and symmetries of each problem?

Data Augmentation
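
The usual non-equivariant baseline in such comparisons is a generic network trained with symmetry data augmentation. A minimal illustrative sketch for 3D rotations (SciPy's `Rotation.random` is a real API; its use as a stand-in for the paper's actual setup is an assumption):

```python
import numpy as np
from scipy.spatial.transform import Rotation

def augment_batch(points: np.ndarray) -> np.ndarray:
    """Apply an independent random global rotation to each point cloud
    in a batch of shape (batch, num_points, 3)."""
    out = np.empty_like(points)
    for i, cloud in enumerate(points):
        R = Rotation.random().as_matrix()  # uniformly sampled rotation matrix
        out[i] = cloud @ R.T
    return out
```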

Differentiable and Learnable Wireless Simulation with Geometric Transformers

no code implementations • 21 Jun 2024 • Thomas Hehn, Markus Peschl, Tribhuvanesh Orekondy, Arash Behboodi, Johann Brehmer

Modelling the propagation of electromagnetic wireless signals is critical for designing modern communication systems.

3D geometry

Lorentz-Equivariant Geometric Algebra Transformers for High-Energy Physics

1 code implementation • 23 May 2024 • Jonas Spinner, Víctor Bresó, Pim de Haan, Tilman Plehn, Jesse Thaler, Johann Brehmer

We propose the Lorentz Geometric Algebra Transformer (L-GATr), a new multi-purpose architecture for high-energy physics.

FoMo Rewards: Can we cast foundation models as reward functions?

no code implementations • 6 Dec 2023 • Ekdeep Singh Lubana, Johann Brehmer, Pim de Haan, Taco Cohen

We explore the viability of casting foundation models as generic reward functions for reinforcement learning.

Language Modeling • Language Modelling +1

Euclidean, Projective, Conformal: Choosing a Geometric Algebra for Equivariant Transformers

1 code implementation • 8 Nov 2023 • Pim de Haan, Taco Cohen, Johann Brehmer

The Geometric Algebra Transformer (GATr) is a versatile architecture for geometric deep learning based on projective geometric algebra.

Geometric Algebra Transformer

2 code implementations • NeurIPS 2023 • Johann Brehmer, Pim de Haan, Sönke Behrends, Taco Cohen

In this paper we introduce the Geometric Algebra Transformer (GATr), a general-purpose architecture for geometric data.

Motion Planning
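
For context, GATr operates on the 16-dimensional multivectors of the projective geometric algebra G(3,0,1), with 3D points placed in the trivector grade. A toy sketch of such an embedding (the basis ordering and component layout here are illustrative assumptions, not the paper's exact convention):

```python
import numpy as np

# Grade structure of G(3,0,1): 1 scalar + 4 vector + 6 bivector
# + 4 trivector + 1 pseudoscalar component = 16 numbers per multivector.

def embed_point(p):
    """Embed a 3D point as a multivector. Illustrative layout only:
    indices 11-13 hold the coordinates, index 14 the homogeneous
    trivector component."""
    mv = np.zeros(16)
    mv[11:14] = p
    mv[14] = 1.0
    return mv

print(embed_point([1.0, 2.0, 3.0]))
```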

Deconfounding Imitation Learning with Variational Inference

1 code implementation • 4 Nov 2022 • Risto Vuorio, Pim de Haan, Johann Brehmer, Hanno Ackermann, Daniel Dijkman, Taco Cohen

Standard imitation learning can fail when the expert demonstrators have different sensory inputs than the imitating agent.

Imitation Learning • Variational Inference

Weakly supervised causal representation learning

no code implementations • 30 Mar 2022 • Johann Brehmer, Pim de Haan, Phillip Lippe, Taco Cohen

Learning high-level causal representations together with a causal model from unstructured low-level data such as pixels is impossible from observational data alone.

Representation Learning

Hierarchical clustering in particle physics through reinforcement learning

1 code implementation • 16 Nov 2020 • Johann Brehmer, Sebastian Macaluso, Duccio Pappadopulo, Kyle Cranmer

Particle physics experiments often require the reconstruction of decay patterns through a hierarchical clustering of the observed final-state particles.

Clustering • reinforcement-learning +2
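
The clustering the paper frames as a reinforcement learning problem is a sequence of pairwise merges of four-momenta. A toy greedy baseline sketch of that action space (the smallest-invariant-mass merge criterion is an illustrative stand-in for a learned policy):

```python
import itertools
import numpy as np

def invariant_mass2(p):
    """Squared invariant mass of a four-momentum (E, px, py, pz)."""
    return p[0] ** 2 - np.sum(p[1:] ** 2)

def greedy_cluster(momenta):
    """Repeatedly merge the pair with the smallest combined invariant mass,
    the same kind of sequential decision an RL agent would optimize."""
    momenta = [np.asarray(p, dtype=float) for p in momenta]
    while len(momenta) > 1:
        i, j = min(
            itertools.combinations(range(len(momenta)), 2),
            key=lambda ij: invariant_mass2(momenta[ij[0]] + momenta[ij[1]]),
        )
        merged = momenta[i] + momenta[j]
        momenta = [p for k, p in enumerate(momenta) if k not in (i, j)]
        momenta.append(merged)
    return momenta[0]
```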

Simulation-based inference methods for particle physics

no code implementations • 13 Oct 2020 • Johann Brehmer, Kyle Cranmer

Our predictions for particle physics processes are realized in a chain of complex simulators.

Probabilistic Programming

Flows for simultaneous manifold learning and density estimation

2 code implementations • NeurIPS 2020 • Johann Brehmer, Kyle Cranmer

We introduce manifold-learning flows (M-flows), a new class of generative models that simultaneously learn the data manifold as well as a tractable probability density on that manifold.

Denoising • Density Estimation +2
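
The tractable density on the manifold comes from the injective change-of-variables formula: for a decoder $g$ mapping latent coordinates $u \in \mathbb{R}^n$ to data space $\mathbb{R}^d$, the standard identity such models build on reads (a sketch, not the paper's full construction):

```latex
p_X(x) \;=\; p_U(u)\,\det\!\big(J_g(u)^{\top} J_g(u)\big)^{-1/2},
\qquad x = g(u), \quad J_g = \frac{\partial g}{\partial u}.
```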

The frontier of simulation-based inference

no code implementations • 4 Nov 2019 • Kyle Cranmer, Johann Brehmer, Gilles Louppe

Many domains of science have developed complex simulations to describe phenomena of interest.

Mining for Dark Matter Substructure: Inferring subhalo population properties from strong lenses with machine learning

3 code implementations • 4 Sep 2019 • Johann Brehmer, Siddharth Mishra-Sharma, Joeri Hermans, Gilles Louppe, Kyle Cranmer

The subtle and unique imprint of dark matter substructure on extended arcs in strong lensing systems contains a wealth of information about the properties and distribution of dark matter on small scales and, consequently, about the underlying particle physics.

BIG-bench Machine Learning

MadMiner: Machine learning-based inference for particle physics

5 code implementations • 24 Jul 2019 • Johann Brehmer, Felix Kling, Irina Espejo, Kyle Cranmer

Precision measurements at the LHC often require analyzing high-dimensional event data for subtle kinematic signatures, which is challenging for established analysis methods.

BIG-bench Machine Learning

Effective LHC measurements with matrix elements and machine learning

no code implementations • 4 Jun 2019 • Johann Brehmer, Kyle Cranmer, Irina Espejo, Felix Kling, Gilles Louppe, Juan Pavez

One major challenge for the legacy measurements at the LHC is that the likelihood function is not tractable when the collected data is high-dimensional and the detector response has to be modeled.

BIG-bench Machine Learning • Density Estimation
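
These papers sidestep the intractable likelihood with the classifier-based likelihood-ratio trick: a classifier trained to separate samples simulated under two parameter points recovers their density ratio. A toy, self-contained sketch (the Gaussian "simulators" and scikit-learn classifier are illustrative stand-ins for the actual event generators):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy stand-ins for two simulators: samples under hypotheses theta0, theta1
x0 = rng.normal(loc=0.0, scale=1.0, size=(5000, 1))  # label 1 (theta0)
x1 = rng.normal(loc=0.5, scale=1.0, size=(5000, 1))  # label 0 (theta1)

X = np.vstack([x0, x1])
y = np.concatenate([np.ones(5000), np.zeros(5000)])

clf = LogisticRegression().fit(X, y)

def likelihood_ratio(x):
    """With balanced classes, s(x) = p(y=1|x) implies
    s / (1 - s) = p(x|theta0) / p(x|theta1)."""
    s = clf.predict_proba(np.atleast_2d(x))[:, 1]
    return s / (1.0 - s)

print(likelihood_ratio([[0.0]]))
```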

A Guide to Constraining Effective Field Theories with Machine Learning

2 code implementations • 30 Apr 2018 • Johann Brehmer, Kyle Cranmer, Gilles Louppe, Juan Pavez

We develop, discuss, and compare several inference techniques to constrain theory parameters in collider experiments.

BIG-bench Machine Learning

Constraining Effective Field Theories with Machine Learning

1 code implementation • 30 Apr 2018 • Johann Brehmer, Kyle Cranmer, Gilles Louppe, Juan Pavez

We present powerful new analysis techniques to constrain effective field theories at the LHC.

BIG-bench Machine Learning
