Search Results for author: Johannes Brandstetter

Found 35 papers, 26 papers with code

Vision-LSTM: xLSTM as Generic Vision Backbone

1 code implementation • 6 Jun 2024 • Benedikt Alkin, Maximilian Beck, Korbinian Pöppel, Sepp Hochreiter, Johannes Brandstetter

Transformers are widely used as generic backbones in computer vision, despite being initially introduced for natural language processing.

Aurora: A Foundation Model of the Atmosphere

no code implementations • 20 May 2024 • Cristian Bodnar, Wessel P. Bruinsma, Ana Lucic, Megan Stanley, Johannes Brandstetter, Patrick Garvan, Maik Riechert, Jonathan Weyn, Haiyu Dong, Anna Vaughan, Jayesh K. Gupta, Kit Tambiratnam, Alex Archibald, Elizabeth Heider, Max Welling, Richard E. Turner, Paris Perdikaris

Deep learning foundation models are revolutionizing many facets of science by leveraging vast amounts of data to learn general-purpose representations that can be adapted to tackle diverse downstream tasks.

xLSTM: Extended Long Short-Term Memory

1 code implementation • 7 May 2024 • Maximilian Beck, Korbinian Pöppel, Markus Spanring, Andreas Auer, Oleksandra Prudnikova, Michael Kopp, Günter Klambauer, Johannes Brandstetter, Sepp Hochreiter

In the 1990s, the constant error carousel and gating were introduced as the central ideas of the Long Short-Term Memory (LSTM).

Language Modelling
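
The "constant error carousel" mentioned in the abstract refers to the additive LSTM cell-state update, which lets gradients flow through time largely unchanged when the forget gate is near one. A minimal numpy sketch of that classic update (not the xLSTM extensions such as exponential gating or the matrix memory):

```python
import numpy as np

def lstm_cell_step(c_prev, z, i_gate, f_gate):
    """One step of the classic LSTM cell-state update.

    The additive form c = f * c_prev + i * z is the 'constant error
    carousel': with f close to 1, the cell state (and its gradient)
    is carried across time steps largely unchanged.
    """
    return f_gate * c_prev + i_gate * z

# With forget gate = 1 and input gate = 0, the state is preserved exactly.
c = np.array([0.5, -1.0])
c_next = lstm_cell_step(c, z=np.zeros(2), i_gate=0.0, f_gate=1.0)
print(np.allclose(c_next, c))  # True
```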

VN-EGNN: E(3)-Equivariant Graph Neural Networks with Virtual Nodes Enhance Protein Binding Site Identification

1 code implementation • 10 Apr 2024 • Florian Sestak, Lisa Schneckenreiter, Johannes Brandstetter, Sepp Hochreiter, Andreas Mayr, Günter Klambauer

However, the performance of GNNs at binding site identification is still limited, potentially due to the lack of dedicated nodes that model hidden geometric entities, such as binding pockets.

Trajectory Prediction

JAX-SPH: A Differentiable Smoothed Particle Hydrodynamics Framework

1 code implementation • 7 Mar 2024 • Artur P. Toshev, Harish Ramachandran, Jonas A. Erbesdobler, Gianluca Galletti, Johannes Brandstetter, Nikolaus A. Adams

Particle-based fluid simulations have emerged as a powerful tool for solving the Navier-Stokes equations, especially in cases that include intricate physics and free surfaces.
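
The core of any SPH method is a kernel-weighted summation over neighboring particles; for instance, the density at particle i is estimated as rho_i = sum_j m_j W(|x_i - x_j|, h). A minimal numpy sketch of that summation with a Gaussian kernel (JAX-SPH itself is JAX-based and may use different kernels and neighbor search; this only illustrates the general SPH idea):

```python
import numpy as np

def gaussian_kernel(r, h):
    """Gaussian smoothing kernel in 2D, one common SPH kernel choice."""
    return np.exp(-((r / h) ** 2)) / (np.pi * h ** 2)

def sph_density(positions, masses, h):
    """SPH density estimate: rho_i = sum_j m_j * W(|x_i - x_j|, h)."""
    diff = positions[:, None, :] - positions[None, :, :]  # (n, n, 2)
    r = np.linalg.norm(diff, axis=-1)                     # pairwise distances
    return (masses[None, :] * gaussian_kernel(r, h)).sum(axis=1)

rng = np.random.default_rng(0)
positions = rng.random((100, 2))        # 100 particles in the unit square
masses = np.full(100, 0.01)
rho = sph_density(positions, masses, h=0.2)
```

Real SPH codes avoid the O(n²) all-pairs distance matrix with a neighbor list, but the kernel summation itself is unchanged.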

GNN-VPA: A Variance-Preserving Aggregation Strategy for Graph Neural Networks

1 code implementation • 7 Mar 2024 • Lisa Schneckenreiter, Richard Freinschlag, Florian Sestak, Johannes Brandstetter, Günter Klambauer, Andreas Mayr

Graph neural networks (GNNs), and especially message-passing neural networks, excel in various domains such as physics, drug discovery, and molecular modeling.

Drug Discovery

Clifford-Steerable Convolutional Neural Networks

1 code implementation • 22 Feb 2024 • Maksim Zhdanov, David Ruhe, Maurice Weiler, Ana Lucic, Johannes Brandstetter, Patrick Forré

We present Clifford-Steerable Convolutional Neural Networks (CS-CNNs), a novel class of $\mathrm{E}(p, q)$-equivariant CNNs.

Geometry-Informed Neural Networks

no code implementations • 21 Feb 2024 • Arturs Berzins, Andreas Radler, Sebastian Sanokowski, Sepp Hochreiter, Johannes Brandstetter

To this end, we introduce geometry-informed neural networks (GINNs) to train shape generative models \emph{without any data}.

Universal Physics Transformers: A Framework For Efficiently Scaling Neural Operators

1 code implementation • 19 Feb 2024 • Benedikt Alkin, Andreas Fürst, Simon Schmid, Lukas Gruber, Markus Holzleitner, Johannes Brandstetter

This is of special interest since, akin to their numerical counterparts, different techniques are used across applications, even if the underlying dynamics of the systems are similar.

Neural SPH: Improved Neural Modeling of Lagrangian Fluid Dynamics

2 code implementations • 9 Feb 2024 • Artur P. Toshev, Jonas A. Erbesdobler, Nikolaus A. Adams, Johannes Brandstetter

Smoothed particle hydrodynamics (SPH) is omnipresent in modern engineering and scientific disciplines.

Lie Point Symmetry and Physics Informed Networks

no code implementations • 7 Nov 2023 • Tara Akhound-Sadegh, Laurence Perreault-Levasseur, Johannes Brandstetter, Max Welling, Siamak Ravanbakhsh

Symmetries have been leveraged to improve the generalization of neural networks through different mechanisms from data augmentation to equivariant architectures.

Data Augmentation Inductive Bias

Learning Lagrangian Fluid Mechanics with E($3$)-Equivariant Graph Neural Networks

2 code implementations • 24 May 2023 • Artur P. Toshev, Gianluca Galletti, Johannes Brandstetter, Stefan Adami, Nikolaus A. Adams

We contribute to the vastly growing field of machine learning for engineering systems by demonstrating that equivariant graph neural networks have the potential to learn more accurate dynamic-interaction models than their non-equivariant counterparts.

Clifford Group Equivariant Neural Networks

1 code implementation • NeurIPS 2023 • David Ruhe, Johannes Brandstetter, Patrick Forré

We introduce Clifford Group Equivariant Neural Networks: a novel approach for constructing $\mathrm{O}(n)$- and $\mathrm{E}(n)$-equivariant models.

E($3$) Equivariant Graph Neural Networks for Particle-Based Fluid Mechanics

no code implementations • 31 Mar 2023 • Artur P. Toshev, Gianluca Galletti, Johannes Brandstetter, Stefan Adami, Nikolaus A. Adams

We contribute to the vastly growing field of machine learning for engineering systems by demonstrating that equivariant graph neural networks have the potential to learn more accurate dynamic-interaction models than their non-equivariant counterparts.

G-Signatures: Global Graph Propagation With Randomized Signatures

no code implementations • 17 Feb 2023 • Bernhard Schäfl, Lukas Gruber, Johannes Brandstetter, Sepp Hochreiter

Graph neural networks (GNNs) have evolved into one of the most popular deep learning architectures.

Graph Learning

Geometric Clifford Algebra Networks

2 code implementations • 13 Feb 2023 • David Ruhe, Jayesh K. Gupta, Steven de Keninck, Max Welling, Johannes Brandstetter

GCANs are based on symmetry group transformations using geometric (Clifford) algebras.

ClimaX: A foundation model for weather and climate

1 code implementation • 24 Jan 2023 • Tung Nguyen, Johannes Brandstetter, Ashish Kapoor, Jayesh K. Gupta, Aditya Grover

We develop and demonstrate ClimaX, a flexible and generalizable deep learning model for weather and climate science that can be trained using heterogeneous datasets spanning different variables, spatio-temporal coverage, and physical groundings.

Self-Supervised Learning Weather Forecasting

Towards Multi-spatiotemporal-scale Generalized PDE Modeling

2 code implementations • 30 Sep 2022 • Jayesh K. Gupta, Johannes Brandstetter

Finally, we show promising results on generalization to different PDE parameters and time-scales with a single surrogate model.

PDE Surrogate Modeling

Clifford Neural Layers for PDE Modeling

2 code implementations • 8 Sep 2022 • Johannes Brandstetter, Rianne van den Berg, Max Welling, Jayesh K. Gupta

We empirically evaluate the benefit of Clifford neural layers by replacing convolution and Fourier operations in common neural PDE surrogates by their Clifford counterparts on 2D Navier-Stokes and weather modeling tasks, as well as 3D Maxwell equations.

Weather Forecasting

Few-Shot Learning by Dimensionality Reduction in Gradient Space

1 code implementation • 7 Jun 2022 • Martin Gauch, Maximilian Beck, Thomas Adler, Dmytro Kotsur, Stefan Fiel, Hamid Eghbal-zadeh, Johannes Brandstetter, Johannes Kofler, Markus Holzleitner, Werner Zellinger, Daniel Klotz, Sepp Hochreiter, Sebastian Lehner

We introduce SubGD, a novel few-shot learning method which is based on the recent finding that stochastic gradient descent updates tend to live in a low-dimensional parameter subspace.

Dimensionality Reduction Few-Shot Learning
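
The finding the abstract cites — that SGD updates tend to live in a low-dimensional parameter subspace — can be illustrated by collecting update vectors from previous tasks, extracting their dominant directions, and restricting a new task's gradient to that subspace. A hedged numpy sketch of this core idea (SubGD's actual preconditioning scheme may differ; `update_subspace` and `project_gradient` are hypothetical helper names):

```python
import numpy as np

def update_subspace(task_updates, k):
    """SVD of parameter-update vectors collected on previous tasks;
    returns an orthonormal basis of the top-k update directions."""
    _, _, Vt = np.linalg.svd(task_updates, full_matrices=False)
    return Vt[:k]                      # shape (k, n_params)

def project_gradient(grad, basis):
    """Restrict a new task's gradient to the learned subspace."""
    return basis.T @ (basis @ grad)

rng = np.random.default_rng(1)
# Synthetic 'previous task' updates that live mostly in 3 directions.
latent = rng.standard_normal((50, 3))
mixing = rng.standard_normal((3, 200))
updates = latent @ mixing + 0.01 * rng.standard_normal((50, 200))

basis = update_subspace(updates, k=3)
g = rng.standard_normal(200)
g_low = project_gradient(g, basis)     # low-dimensional part of the gradient
```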

Lie Point Symmetry Data Augmentation for Neural PDE Solvers

1 code implementation • 15 Feb 2022 • Johannes Brandstetter, Max Welling, Daniel E. Worrall

In this paper, we present a method that can partially alleviate this problem by improving neural PDE solver sample complexity: Lie point symmetry data augmentation (LPSDA).

Data Augmentation
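
The simplest Lie point symmetry usable for augmentation is spatial translation: for a translation-invariant PDE with periodic boundary conditions (e.g. Burgers), shifting a solution field in space yields another valid solution for free. A minimal numpy sketch of that one symmetry (LPSDA also exploits richer symmetries such as Galilean boosts and scalings):

```python
import numpy as np

def space_translation(u, shift):
    """Shift a periodic space-time solution field u[t, x] in space.

    For a translation-invariant PDE with periodic boundary conditions,
    the shifted field is again a valid solution, so each trajectory
    yields many augmented training samples.
    """
    return np.roll(u, shift, axis=1)

rng = np.random.default_rng(0)
u = rng.random((32, 64))               # 32 time steps, 64 spatial points
u_aug = space_translation(u, shift=10)  # a 'new' solution at no extra cost
```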

Message Passing Neural PDE Solvers

1 code implementation • ICLR 2022 • Johannes Brandstetter, Daniel Worrall, Max Welling

The numerical solution of partial differential equations (PDEs) is difficult, having led to a century of research so far.

Domain Adaptation

Geometric and Physical Quantities Improve E(3) Equivariant Message Passing

2 code implementations • ICLR 2022 • Johannes Brandstetter, Rob Hesselink, Elise van der Pol, Erik J Bekkers, Max Welling

Including covariant information, such as position, force, velocity or spin is important in many tasks in computational physics and chemistry.
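
The basic recipe behind E(3)-equivariant message passing is to build messages from invariant quantities (distances) and update geometric features along relative vectors, so that rotating the input rotates the output identically. A hedged numpy sketch of an EGNN-style position update (the paper's steerable architecture is considerably more general):

```python
import numpy as np

def equivariant_position_update(x, phi=np.tanh):
    """EGNN-style update: move each position along relative vectors,
    weighted by an invariant function of pairwise distances.

    Because the weights depend only on distances, rotating the input
    rotates the output by the same rotation (O(3)-equivariance).
    """
    diff = x[:, None, :] - x[None, :, :]                # (n, n, 3)
    dist = np.linalg.norm(diff, axis=-1, keepdims=True)  # invariant
    return x + (diff * phi(dist)).sum(axis=1)

rng = np.random.default_rng(0)
x = rng.standard_normal((5, 3))
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # random orthogonal matrix

# Equivariance check: f(x Q^T) == f(x) Q^T
print(np.allclose(equivariant_position_update(x @ Q.T),
                  equivariant_position_update(x) @ Q.T))  # True
```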

Boundary Graph Neural Networks for 3D Simulations

1 code implementation • 21 Jun 2021 • Andreas Mayr, Sebastian Lehner, Arno Mayrhofer, Christoph Kloss, Sepp Hochreiter, Johannes Brandstetter

However, it is notoriously difficult to integrate them into machine learning approaches due to their heterogeneity with respect to size and orientation.

Computational Efficiency

Learning 3D Granular Flow Simulations

1 code implementation • 4 May 2021 • Andreas Mayr, Sebastian Lehner, Arno Mayrhofer, Christoph Kloss, Sepp Hochreiter, Johannes Brandstetter

Recently, the application of machine learning models has gained momentum in natural sciences and engineering, which is a natural fit due to the abundance of data in these fields.

BIG-bench Machine Learning

Convergence Proof for Actor-Critic Methods Applied to PPO and RUDDER

no code implementations • 2 Dec 2020 • Markus Holzleitner, Lukas Gruber, José Arjona-Medina, Johannes Brandstetter, Sepp Hochreiter

We prove under commonly used assumptions the convergence of actor-critic reinforcement learning algorithms, which simultaneously learn a policy function, the actor, and a value function, the critic.

Reinforcement Learning (RL)

Cross-Domain Few-Shot Learning by Representation Fusion

2 code implementations • 13 Oct 2020 • Thomas Adler, Johannes Brandstetter, Michael Widrich, Andreas Mayr, David Kreil, Michael Kopp, Günter Klambauer, Sepp Hochreiter

On the few-shot datasets miniImagenet and tieredImagenet with small domain shifts, CHEF is competitive with state-of-the-art methods.

Cross-Domain Few-Shot Learning Drug Discovery

A GAN based solver of black-box inverse problems

no code implementations • NeurIPS Workshop Deep_Invers 2019 • Michael Gillhofer, Hubert Ramsauer, Johannes Brandstetter, Bernhard Schäfl, Sepp Hochreiter

We propose a GAN based approach to solve inverse problems which have non-differential or non-continuous forward relations.
