no code implementations • 26 Jun 2023 • Sergey Oladyshkin, Timothy Praditia, Ilja Kröker, Farid Mohammadi, Wolfgang Nowak, Sebastian Otte
However, for the majority of deep learning approaches based on DANNs, the kernel structure of neural signal processing remains the same: the node response is encoded as a linear superposition of incoming neural activity, while the non-linearity is introduced by the activation functions.
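The kernel structure described above can be sketched in a few lines; this is a generic illustration of the classic neuron model (weighted sum plus activation), not the paper's own architecture, and all values are made up:

```python
import numpy as np

def node_response(x, w, b):
    """Classic DANN node: linear superposition of incoming neural
    activity (w @ x + b), with non-linearity from a fixed activation
    function (tanh here, purely for illustration)."""
    return np.tanh(w @ x + b)

x = np.array([0.5, -1.0, 2.0])   # incoming activity (illustrative values)
w = np.array([0.1, 0.4, -0.2])   # synaptic weights (illustrative values)
print(node_response(x, w, 0.05))
```

The linear superposition is the `w @ x + b` term; swapping `tanh` for another activation changes the non-linearity but not this kernel structure.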
1 code implementation • 12 Apr 2022 • Paul-Christian Bürkner, Ilja Kröker, Sergey Oladyshkin, Wolfgang Nowak
Polynomial chaos expansion (PCE) is a versatile tool widely used in uncertainty quantification and machine learning, but its successful application depends strongly on the accuracy and reliability of the resulting PCE-based response surface.
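A minimal sketch of a PCE response surface, assuming a single standard-normal input and probabilists' Hermite polynomials as the orthogonal basis (a generic textbook setup, not the paper's method): the model response is regressed onto the polynomial basis, and the fitted coefficients define a cheap surrogate.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander

# Toy "model": f(x) = exp(x) of a standard-normal input x.
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)        # samples of the random input
y = np.exp(x)                        # model responses

# Degree-4 PCE: evaluate the Hermite basis and fit coefficients
# by least squares (non-intrusive regression approach).
Psi = hermevander(x, 4)              # basis matrix, shape (2000, 5)
coef, *_ = np.linalg.lstsq(Psi, y, rcond=None)

# The fitted expansion acts as a response surface at new inputs.
y_hat = hermevander(np.array([0.3]), 4) @ coef
```

For this toy case the exact mean coefficient is known analytically (`exp(0.5)`), which gives a quick sanity check on the fit; the accuracy-and-reliability concern raised in the abstract is exactly about how trustworthy such fitted coefficients are.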
1 code implementation • 23 Nov 2021 • Matthias Karlbauer, Timothy Praditia, Sebastian Otte, Sergey Oladyshkin, Wolfgang Nowak, Martin V. Butz
We introduce a compositional physics-aware FInite volume Neural Network (FINN) for learning spatiotemporal advection-diffusion processes.
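The finite-volume structure such a network builds on can be sketched with one explicit update step for the 1-D advection-diffusion equation, du/dt = -v du/dx + D d²u/dx², where each cell is updated from the flux balance across its faces. This is a generic illustrative scheme (upwind advection, central diffusion), not FINN itself, and all parameter values are made up:

```python
import numpy as np

def fv_step(u, v=0.5, D=0.1, dx=0.1, dt=0.01):
    """One explicit finite-volume step on cell averages u."""
    adv = v * u                       # upwind advective flux (assumes v > 0)
    diff = -D * np.diff(u) / dx       # diffusive flux from inter-cell gradients
    flux = adv[:-1] + diff            # total flux across interior faces
    du = np.zeros_like(u)
    du[1:-1] = -(flux[1:] - flux[:-1]) / dx  # flux balance per cell
    return u + dt * du                # boundary cells held fixed

# Gaussian pulse advected and diffused across a unit interval.
u0 = np.exp(-((np.linspace(0, 1, 51) - 0.3) ** 2) / 0.005)
u1 = fv_step(u0)
```

Because the update is written in flux form, mass leaves a cell only by entering its neighbour, which is the conservation property that makes the finite-volume viewpoint attractive for learning such processes.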
1 code implementation • 13 Apr 2021 • Timothy Praditia, Matthias Karlbauer, Sebastian Otte, Sergey Oladyshkin, Martin V. Butz, Wolfgang Nowak
To tackle this issue, we introduce a new approach called the Finite Volume Neural Network (FINN).