no code implementations • 8 Sep 2023 • Daniel Scheliga, Patrick Mäder, Marco Seeland
Our analysis reveals that variational modeling must be placed early in the network to preserve the privacy-preserving effect of PRECODE.
no code implementations • 17 Jul 2023 • Sambit Mohapatra, Senthil Yogamani, Varun Ravi Kumar, Stefan Milz, Heinrich Gotzig, Patrick Mäder
We achieve state-of-the-art results for two tasks, semantic and motion segmentation, and close to state-of-the-art performance for 3D object detection.
1 code implementation • 17 Oct 2022 • Philipp Teutsch, Patrick Mäder
Although scheduled sampling appears to be a convincing alternative to FR and TF, we found that, even when parameterized carefully, scheduled sampling may lead to premature termination of training when applied to time series forecasting.
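The entry above contrasts free running (FR), teacher forcing (TF), and scheduled sampling for recurrent forecasters. A minimal sketch of scheduled sampling follows: at each unrolling step, the ground-truth value is fed with a decaying probability, otherwise the model's own prediction is fed back. The linear decay schedule and the one-step `model_step` interface here are hypothetical illustrations, not the parametrization studied in the paper.

```python
import random

def teacher_forcing_prob(step, decay=0.01):
    # Simple linear decay of the teacher-forcing probability
    # (a hypothetical schedule; inverse-sigmoid decay is also common).
    return max(0.0, 1.0 - decay * step)

def rollout(model_step, series, decay=0.01, seed=0):
    """Unroll a one-step forecaster over `series`, mixing teacher forcing
    (feed the ground truth) with free running (feed the model's own output)."""
    rng = random.Random(seed)
    preds = []
    prev = series[0]
    for t in range(1, len(series)):
        pred = model_step(prev)
        preds.append(pred)
        # Scheduled sampling: with decaying probability use the true value,
        # otherwise feed the prediction back in.
        if rng.random() < teacher_forcing_prob(t, decay):
            prev = series[t]
        else:
            prev = pred
    return preds
```

As the decay drives the probability toward zero, the rollout degenerates to pure free running, which is one mechanism by which an aggressive schedule can destabilize training.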
no code implementations • 26 Aug 2022 • Tim Sonnekalb, Bernd Gruner, Clemens-Alexander Brust, Patrick Mäder
Transformer networks such as CodeBERT already achieve outstanding results for code clone detection in benchmark datasets, so one could assume that this task has already been solved.
1 code implementation • 12 Aug 2022 • Daniel Scheliga, Patrick Mäder, Marco Seeland
We find that state-of-the-art attacks are not able to reconstruct the client data due to the stochasticity induced by dropout during model training.
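The defense described above rests on dropout resampling a fresh mask at every forward pass, so the gradients a curious server observes vary stochastically even for identical client data. A toy sketch of that effect on a single linear layer (my own illustration, not the paper's training setup):

```python
import random

def dropout_mask(n, p, rng):
    # Bernoulli keep-mask with inverted scaling, as in standard dropout.
    return [0.0 if rng.random() < p else 1.0 / (1.0 - p) for _ in range(n)]

def noisy_gradient(x, w, y, p, rng):
    """Gradient of a one-layer least-squares model with dropout on the input.
    Each call samples a fresh mask, so repeated gradients for the same
    (x, y) pair differ whenever p > 0."""
    m = dropout_mask(len(x), p, rng)
    xd = [xi * mi for xi, mi in zip(x, m)]
    pred = sum(wi * xi for wi, xi in zip(w, xd))
    err = pred - y
    return [2.0 * err * xi for xi in xd]
```

With `p = 0` the gradient is deterministic and gradient-inversion attacks face no mask noise; with `p > 0` the server sees a different gradient realization each round, which is the stochasticity the paper exploits.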
no code implementations • 9 Aug 2022 • Daniel Scheliga, Patrick Mäder, Marco Seeland
As a result, we show that our approach requires less gradient perturbation to effectively preserve privacy without harming model performance.
no code implementations • 26 Feb 2022 • Sandeep Pandey, Philipp Teutsch, Patrick Mäder, Jörg Schumacher
A combined convolutional autoencoder-recurrent neural network machine learning model is presented to analyse and forecast the dynamics and low-order statistics of the local convective heat flux field in a two-dimensional turbulent Rayleigh-Bénard convection flow at Prandtl number Pr = 7 and Rayleigh number Ra = 10^7.
no code implementations • 29 Sep 2021 • Martin Hofmann, Moritz F. P. Becker, Christian Tetzlaff, Patrick Mäder
Various advancements in artificial neural networks (ANNs) are inspired by biological concepts, e.g., the artificial neuron, an efficient model of biological nerve cells that demonstrates learning capabilities on large amounts of data.
1 code implementation • 10 Aug 2021 • Daniel Scheliga, Patrick Mäder, Marco Seeland
We propose a simple yet effective realization of PRECODE using variational modeling.
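The core building block of a variational realization like the one proposed above is a stochastic bottleneck: features are mapped to a mean and log-variance, a latent code is drawn via the reparameterization trick, and a KL term regularizes the code toward a standard normal prior. The sketch below shows that mechanism in isolation; it is my own minimal illustration of a variational bottleneck, not the authors' PRECODE implementation.

```python
import math
import random

def reparameterize(mu, log_var, rng):
    """Reparameterization trick: z = mu + sigma * eps, with eps ~ N(0, 1).
    The sampling step makes the layer's output stochastic, which disrupts
    deterministic gradient-inversion reconstruction."""
    return [m + math.exp(0.5 * lv) * rng.gauss(0.0, 1.0)
            for m, lv in zip(mu, log_var)]

def kl_to_standard_normal(mu, log_var):
    # KL(q(z|x) || N(0, I)) for a diagonal Gaussian: the regularizer that
    # keeps the bottleneck distribution close to its prior.
    return -0.5 * sum(1.0 + lv - m * m - math.exp(lv)
                      for m, lv in zip(mu, log_var))
```

Inserted between two layers of a network, such a bottleneck leaves the forward interface unchanged (features in, features out) while adding the sampling noise that provides the privacy effect.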
no code implementations • 9 Apr 2021 • Varun Ravi Kumar, Marvin Klingner, Senthil Yogamani, Markus Bach, Stefan Milz, Tim Fingscheidt, Patrick Mäder
We evaluate our approach on the Fisheye WoodScape surround-view dataset, significantly improving over previous approaches.
1 code implementation • 15 Feb 2021 • Varun Ravi Kumar, Senthil Yogamani, Hazem Rashed, Ganesh Sistu, Christian Witt, Isabelle Leang, Stefan Milz, Patrick Mäder
We obtain state-of-the-art results on KITTI for the depth estimation and pose estimation tasks and competitive performance on the other tasks.