1 code implementation • 7 Mar 2024 • Lisa Schneckenreiter, Richard Freinschlag, Florian Sestak, Johannes Brandstetter, Günter Klambauer, Andreas Mayr
Graph neural networks (GNNs), and especially message-passing neural networks, excel in various domains such as physics, drug discovery, and molecular modeling.
1 code implementation • 7 Mar 2024 • Artur P. Toshev, Harish Ramachandran, Jonas A. Erbesdobler, Gianluca Galletti, Johannes Brandstetter, Nikolaus A. Adams
Particle-based fluid simulations have emerged as a powerful tool for solving the Navier-Stokes equations, especially in cases that include intricate physics and free surfaces.
1 code implementation • 22 Feb 2024 • Maksim Zhdanov, David Ruhe, Maurice Weiler, Ana Lucic, Johannes Brandstetter, Patrick Forré
We present Clifford-Steerable Convolutional Neural Networks (CS-CNNs), a novel class of $\mathrm{E}(p, q)$-equivariant CNNs.
no code implementations • 21 Feb 2024 • Arturs Berzins, Andreas Radler, Sebastian Sanokowski, Sepp Hochreiter, Johannes Brandstetter
We introduce the concept of geometry-informed neural networks (GINNs), which encompass (i) learning under geometric constraints, (ii) neural fields as a suitable representation, and (iii) generating diverse solutions to under-determined systems often encountered in geometric tasks.
no code implementations • 19 Feb 2024 • Benedikt Alkin, Andreas Fürst, Simon Schmid, Lukas Gruber, Markus Holzleitner, Johannes Brandstetter
Deep neural network based surrogates for partial differential equations have recently gained increased interest.
1 code implementation • 15 Feb 2024 • Benedikt Alkin, Lukas Miklautz, Sepp Hochreiter, Johannes Brandstetter
The motivation behind MIM-Refiner is rooted in the insight that optimal representations within MIM models generally reside in intermediate layers.
Ranked #1 on Image Clustering on ImageNet
2 code implementations • 9 Feb 2024 • Artur P. Toshev, Jonas A. Erbesdobler, Nikolaus A. Adams, Johannes Brandstetter
Smoothed particle hydrodynamics (SPH) is omnipresent in modern engineering and scientific disciplines.
no code implementations • 7 Nov 2023 • Tara Akhound-Sadegh, Laurence Perreault-Levasseur, Johannes Brandstetter, Max Welling, Siamak Ravanbakhsh
Symmetries have been leveraged to improve the generalization of neural networks through different mechanisms from data augmentation to equivariant architectures.
2 code implementations • 24 May 2023 • Artur P. Toshev, Gianluca Galletti, Johannes Brandstetter, Stefan Adami, Nikolaus A. Adams
We contribute to the rapidly growing field of machine learning for engineering systems by demonstrating that equivariant graph neural networks have the potential to learn more accurate dynamic-interaction models than their non-equivariant counterparts.
no code implementations • 31 Mar 2023 • Artur P. Toshev, Gianluca Galletti, Johannes Brandstetter, Stefan Adami, Nikolaus A. Adams
We contribute to the rapidly growing field of machine learning for engineering systems by demonstrating that equivariant graph neural networks have the potential to learn more accurate dynamic-interaction models than their non-equivariant counterparts.
no code implementations • 17 Feb 2023 • Bernhard Schäfl, Lukas Gruber, Johannes Brandstetter, Sepp Hochreiter
Graph neural networks (GNNs) have evolved into one of the most popular deep learning architectures.
1 code implementation • 13 Feb 2023 • David Ruhe, Jayesh K. Gupta, Steven de Keninck, Max Welling, Johannes Brandstetter
GCANs are based on symmetry group transformations using geometric (Clifford) algebras.
1 code implementation • 24 Jan 2023 • Tung Nguyen, Johannes Brandstetter, Ashish Kapoor, Jayesh K. Gupta, Aditya Grover
We develop and demonstrate ClimaX, a flexible and generalizable deep learning model for weather and climate science that can be trained using heterogeneous datasets spanning different variables, spatio-temporal coverage, and physical groundings.
1 code implementation • 30 Sep 2022 • Jayesh K. Gupta, Johannes Brandstetter
Finally, we show promising results on generalization to different PDE parameters and time-scales with a single surrogate model.
1 code implementation • 8 Sep 2022 • Johannes Brandstetter, Rianne van den Berg, Max Welling, Jayesh K. Gupta
We empirically evaluate the benefit of Clifford neural layers by replacing convolution and Fourier operations in common neural PDE surrogates with their Clifford counterparts on 2D Navier-Stokes and weather-modeling tasks, as well as the 3D Maxwell equations.
1 code implementation • 7 Jun 2022 • Martin Gauch, Maximilian Beck, Thomas Adler, Dmytro Kotsur, Stefan Fiel, Hamid Eghbal-zadeh, Johannes Brandstetter, Johannes Kofler, Markus Holzleitner, Werner Zellinger, Daniel Klotz, Sepp Hochreiter, Sebastian Lehner
We introduce SubGD, a novel few-shot learning method which is based on the recent finding that stochastic gradient descent updates tend to live in a low-dimensional parameter subspace.
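The observation behind SubGD can be illustrated with a toy experiment (a hedged sketch with synthetic data, not the paper's method): stack parameter-update vectors collected during training and inspect their singular values; if the updates live in a low-dimensional subspace, a few directions explain almost all of the variance.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_steps, k = 50, 200, 3

# Simulate parameter updates that, by construction, lie mostly in a
# k-dimensional subspace plus small noise -- a toy stand-in for the
# SGD updates observed during training.
basis = np.linalg.qr(rng.normal(size=(d, k)))[0]          # (d, k) orthonormal
updates = (rng.normal(size=(n_steps, k)) @ basis.T
           + 0.01 * rng.normal(size=(n_steps, d)))

# The SVD of the stacked updates reveals the dominant subspace.
s = np.linalg.svd(updates, compute_uv=False)
explained = (s[:k] ** 2).sum() / (s ** 2).sum()
assert explained > 0.95   # top-k directions capture nearly all variance
```

In the few-shot setting, restricting adaptation to such a subspace reduces the number of effective parameters that must be fit from scarce data.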
1 code implementation • 15 Feb 2022 • Johannes Brandstetter, Max Welling, Daniel E. Worrall
In this paper, we present a method that partially alleviates this problem by improving the sample complexity of neural PDE solvers: Lie point symmetry data augmentation (LPSDA).
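A minimal illustration of symmetry-based augmentation in this spirit (an assumption-laden sketch, not the paper's LPSDA implementation): spatial translation is a Lie point symmetry of many PDEs with periodic boundaries, so a shifted solution trajectory is again a valid solution and can serve as an extra training sample.

```python
import numpy as np

def translate_augment(u, rng):
    """Shift a PDE trajectory u[t, x] by a random periodic offset.

    Spatial translation is a Lie point symmetry of e.g. the heat and
    Burgers equations with periodic boundaries, so the shifted
    trajectory is again a valid solution.  Illustrative sketch only.
    """
    shift = rng.integers(u.shape[1])
    return np.roll(u, shift, axis=1)

rng = np.random.default_rng(0)
# Toy trajectory: 10 time steps of a sine profile on 64 grid points.
u = np.sin(np.linspace(0, 2 * np.pi, 64))[None, :] * np.ones((10, 1))
u_aug = translate_augment(u, rng)
assert u_aug.shape == u.shape
```

Applying the same shift to inputs and targets preserves the input-output relation the solver must learn, which is what makes the augmented pair a legitimate training sample.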
1 code implementation • ICLR 2022 • Johannes Brandstetter, Daniel Worrall, Max Welling
The numerical solution of partial differential equations (PDEs) is difficult and has driven a century of research.
2 code implementations • ICLR 2022 • Johannes Brandstetter, Rob Hesselink, Elise van der Pol, Erik J Bekkers, Max Welling
Including covariant information, such as position, force, velocity, or spin, is important in many tasks in computational physics and chemistry.
1 code implementation • 21 Jun 2021 • Andreas Mayr, Sebastian Lehner, Arno Mayrhofer, Christoph Kloss, Sepp Hochreiter, Johannes Brandstetter
However, it is notoriously difficult to integrate them into machine learning approaches due to their heterogeneity with respect to size and orientation.
1 code implementation • 4 May 2021 • Andreas Mayr, Sebastian Lehner, Arno Mayrhofer, Christoph Kloss, Sepp Hochreiter, Johannes Brandstetter
Recently, the application of machine learning models has gained momentum in the natural sciences and engineering, a natural fit given the abundance of data in these fields.
no code implementations • 2 Dec 2020 • Markus Holzleitner, Lukas Gruber, José Arjona-Medina, Johannes Brandstetter, Sepp Hochreiter
We prove under commonly used assumptions the convergence of actor-critic reinforcement learning algorithms, which simultaneously learn a policy function, the actor, and a value function, the critic.
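The actor/critic structure analyzed here can be sketched on a toy problem (a hypothetical two-armed bandit, not the algorithm the paper studies): the critic tracks a value estimate, and the actor follows a policy gradient weighted by the critic's TD error.

```python
import numpy as np

# Minimal one-step actor-critic on a two-armed bandit, illustrating the
# simultaneous policy (actor) and value (critic) updates.  Sketch only.
rng = np.random.default_rng(0)
theta = np.zeros(2)            # actor: policy logits
v = 0.0                        # critic: value estimate of the single state
true_means = np.array([0.0, 1.0])
alpha, beta = 0.1, 0.1

for _ in range(2000):
    p = np.exp(theta - theta.max()); p /= p.sum()   # softmax policy
    a = rng.choice(2, p=p)
    r = true_means[a] + rng.normal(scale=0.1)
    td = r - v                     # TD error (terminal step, no bootstrap)
    v += beta * td                 # critic update
    grad = -p; grad[a] += 1.0      # gradient of log pi(a)
    theta += alpha * td * grad     # actor update

assert p[1] > p[0]   # policy shifts toward the better arm
```

The convergence question the paper addresses is precisely whether such coupled updates reach a fixed point under standard assumptions.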
2 code implementations • 13 Oct 2020 • Thomas Adler, Johannes Brandstetter, Michael Widrich, Andreas Mayr, David Kreil, Michael Kopp, Günter Klambauer, Sepp Hochreiter
On the few-shot datasets miniImagenet and tieredImagenet with small domain shifts, CHEF is competitive with state-of-the-art methods.
1 code implementation • 29 Sep 2020 • Vihang P. Patil, Markus Hofmarcher, Marius-Constantin Dinu, Matthias Dorfer, Patrick M. Blies, Johannes Brandstetter, Jose A. Arjona-Medina, Sepp Hochreiter
For such complex tasks, the recently proposed RUDDER uses reward redistribution to leverage steps in the Q-function that are associated with accomplishing sub-tasks.
Tasks: General Reinforcement Learning, Multiple Sequence Alignment, +1 more
2 code implementations • ICLR 2021 • Hubert Ramsauer, Bernhard Schäfl, Johannes Lehner, Philipp Seidl, Michael Widrich, Thomas Adler, Lukas Gruber, Markus Holzleitner, Milena Pavlović, Geir Kjetil Sandve, Victor Greiff, David Kreil, Michael Kopp, Günter Klambauer, Johannes Brandstetter, Sepp Hochreiter
The new update rule is equivalent to the attention mechanism used in transformers.
Tasks: Immune Repertoire Classification, Multiple Instance Learning, +1 more
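The stated equivalence between the Hopfield update rule and transformer attention can be written in a few lines: with stored patterns as the columns of $X$, one retrieval step is $\xi^{\text{new}} = X\,\mathrm{softmax}(\beta X^\top \xi)$, which is single-query softmax attention with keys and values both equal to the stored patterns and $\beta = 1/\sqrt{d}$. A small numpy check (illustrative, not the paper's code):

```python
import numpy as np

def hopfield_update(X, xi, beta):
    """One modern-Hopfield retrieval step: xi_new = X softmax(beta X^T xi).

    X:  (d, N) matrix whose columns are the stored patterns.
    xi: (d,)   state / query vector.
    """
    a = beta * X.T @ xi                     # (N,) pattern similarities
    p = np.exp(a - a.max()); p /= p.sum()   # softmax over stored patterns
    return X @ p

def attention(q, K, V, beta):
    """Single-query softmax attention: softmax(beta q K^T) V."""
    a = beta * K @ q
    p = np.exp(a - a.max()); p /= p.sum()
    return V.T @ p

rng = np.random.default_rng(0)
d, N = 8, 5
X = rng.normal(size=(d, N))
xi = rng.normal(size=d)
beta = 1.0 / np.sqrt(d)

# With keys = values = stored patterns, both computations coincide.
assert np.allclose(hopfield_update(X, xi, beta),
                   attention(xi, X.T, X.T, beta))
```

Iterating the update converges toward a stored pattern near the query, which is what gives the network its exponential storage capacity.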
1 code implementation • NeurIPS 2020 • Michael Widrich, Bernhard Schäfl, Hubert Ramsauer, Milena Pavlović, Lukas Gruber, Markus Holzleitner, Johannes Brandstetter, Geir Kjetil Sandve, Victor Greiff, Sepp Hochreiter, Günter Klambauer
We show that the attention mechanism of transformer architectures is actually the update rule of modern Hopfield networks that can store exponentially many patterns.
no code implementations • 10 Nov 2019 • Frederik Kratzert, Daniel Klotz, Johannes Brandstetter, Pieter-Jan Hoedt, Grey Nearing, Sepp Hochreiter
Climate change affects occurrences of floods and droughts worldwide.
no code implementations • 30 Oct 2019 • Thomas Adler, Manuel Erhard, Mario Krenn, Johannes Brandstetter, Johannes Kofler, Sepp Hochreiter
In this work, we show that machine learning models can provide significant improvement over random search.
no code implementations • NeurIPS Workshop Deep_Invers 2019 • Michael Gillhofer, Hubert Ramsauer, Johannes Brandstetter, Bernhard Schäfl, Sepp Hochreiter
We propose a GAN-based approach to solve inverse problems with non-differentiable or discontinuous forward relations.
2 code implementations • NeurIPS 2019 • Jose A. Arjona-Medina, Michael Gillhofer, Michael Widrich, Thomas Unterthiner, Johannes Brandstetter, Sepp Hochreiter
In MDPs, the Q-values are equal to the expected immediate reward plus the expected future rewards.
Ranked #9 on Atari Games on Atari 2600 Bowling
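The Q-value identity quoted above is the Bellman decomposition $Q(s_t, a_t) = \mathbb{E}[r_t] + \mathbb{E}[\text{future rewards}]$, which a toy finite-horizon chain makes concrete (an illustrative sketch, not the RUDDER implementation):

```python
# Toy deterministic, finite-horizon MDP: a single action chain with
# one reward per step; the episode ends after the last step.
rewards = [1.0, 2.0, 3.0]   # immediate rewards at steps t = 0, 1, 2

def q_value(t):
    """Q-value of the step at time t: immediate reward plus the sum of
    all future rewards (undiscounted, as in RUDDER's delayed-reward
    setting)."""
    return rewards[t] + sum(rewards[t + 1:])

assert q_value(0) == 1.0 + (2.0 + 3.0)   # immediate + future
assert q_value(2) == 3.0                 # last step: no future reward
```

Reward redistribution rewrites such delayed future rewards into immediate ones so that the return-equivalent Q-values become easier to learn.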