Search Results for author: Stephan Günnemann

Found 129 papers, 68 papers with code

Intensity-Free Learning of Temporal Point Processes

3 code implementations ICLR 2020 Oleksandr Shchur, Marin Biloš, Stephan Günnemann

The standard way of learning in such models is by estimating the conditional intensity function.

Point Processes
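
For context, the conditional-intensity approach that this paper moves away from is usually summarized by the standard TPP log-likelihood below (textbook form, not taken from the paper itself); intensity-free learning instead models the inter-event time distribution directly and avoids the integral term.

```latex
% Standard temporal point process log-likelihood in terms of the
% conditional intensity \lambda^*(t); intensity-based learning must
% evaluate (or approximate) the integral (compensator) term.
\log p\big(\{t_1, \dots, t_n\}\big)
  = \sum_{i=1}^{n} \log \lambda^*(t_i) \;-\; \int_{0}^{T} \lambda^*(t)\, dt
```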

Pitfalls of Graph Neural Network Evaluation

2 code implementations 14 Nov 2018 Oleksandr Shchur, Maximilian Mumme, Aleksandar Bojchevski, Stephan Günnemann

We perform a thorough empirical evaluation of four prominent GNN models and show that considering different splits of the data leads to dramatically different rankings of models.

Graph Mining Node Classification

Graph Data Augmentation for Graph Machine Learning: A Survey

1 code implementation 17 Feb 2022 Tong Zhao, Wei Jin, Yozen Liu, Yingheng Wang, Gang Liu, Stephan Günnemann, Neil Shah, Meng Jiang

Overall, our work aims to clarify the landscape of existing literature in graph data augmentation and motivates additional work in this area, providing a helpful resource for researchers and practitioners in the broader graph machine learning domain.

BIG-bench Machine Learning Data Augmentation

Diffusion Improves Graph Learning

3 code implementations NeurIPS 2019 Johannes Gasteiger, Stefan Weißenberger, Stephan Günnemann

In this work, we remove the restriction of using only the direct neighbors by introducing a powerful, yet spatially localized graph convolution: Graph diffusion convolution (GDC).

Clustering Graph Learning +1
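
As a rough sketch of the graph diffusion convolution (GDC) preprocessing step, the snippet below builds a personalized-PageRank diffusion matrix in closed form and sparsifies it per column; the normalization, teleport probability `alpha`, and top-`k` truncation are illustrative assumptions rather than the paper's exact configuration.

```python
import numpy as np

def ppr_diffusion(A, alpha=0.15, k=64):
    """Closed-form personalized-PageRank diffusion with top-k sparsification.

    A: symmetric adjacency matrix as a dense float numpy array. Parameter
    choices here are illustrative assumptions, not the paper's exact recipe.
    """
    n = A.shape[0]
    deg = A.sum(axis=1)
    deg[deg == 0] = 1.0  # avoid division by zero for isolated nodes
    d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
    T = d_inv_sqrt @ A @ d_inv_sqrt                           # symmetric transition matrix
    S = alpha * np.linalg.inv(np.eye(n) - (1.0 - alpha) * T)  # PPR diffusion matrix
    # Keep only the k largest entries per column to restore spatial locality.
    S_sparse = np.zeros_like(S)
    top = np.argsort(-S, axis=0)[:k]
    for col in range(n):
        S_sparse[top[:, col], col] = S[top[:, col], col]
    return S_sparse
```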

Adversarial Attacks on Neural Networks for Graph Data

1 code implementation 21 May 2018 Daniel Zügner, Amir Akbarnejad, Stephan Günnemann

Even more, our attacks are transferable: the learned attacks generalize to other state-of-the-art node classification models and unsupervised approaches, and likewise are successful even when only limited knowledge about the graph is given.

General Classification Node Classification

GemNet: Universal Directional Graph Neural Networks for Molecules

4 code implementations NeurIPS 2021 Johannes Gasteiger, Florian Becker, Stephan Günnemann

Effectively predicting molecular interactions has the potential to accelerate molecular dynamics by multiple orders of magnitude and thus revolutionize chemical simulations.

Translation

NetGAN: Generating Graphs via Random Walks

2 code implementations ICML 2018 Aleksandar Bojchevski, Oleksandr Shchur, Daniel Zügner, Stephan Günnemann

NetGAN is able to produce graphs that exhibit well-known network patterns without explicitly specifying them in the model definition.

Graph Generation Link Prediction

torchode: A Parallel ODE Solver for PyTorch

1 code implementation 22 Oct 2022 Marten Lienen, Stephan Günnemann

We introduce an ODE solver for the PyTorch ecosystem that can solve multiple ODEs in parallel independently from each other while achieving significant performance gains.
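
To illustrate what solving multiple ODEs in parallel means in practice, here is a minimal batched fixed-step RK4 loop in plain PyTorch; it is only a sketch of the batch-parallel idea and deliberately does not use the torchode API, which additionally provides adaptive step-size control per problem instance.

```python
import torch

def rk4_batch(f, y0, t0, t1, steps=100):
    """Integrate a batch of independent ODEs dy/dt = f(t, y) in parallel.

    y0: tensor of shape (batch, dim); each batch element is an independent
    initial-value problem, advanced jointly in one vectorized pass. This is a
    plain fixed-step RK4 sketch, not the torchode solver.
    """
    h = (t1 - t0) / steps
    y, t = y0, t0
    for _ in range(steps):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h / 2, y + h / 2 * k2)
        k4 = f(t + h, y + h * k3)
        y = y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t = t + h
    return y

# Example: 1024 independent linear decay ODEs solved in one vectorized pass.
decay = torch.linspace(0.1, 2.0, 1024).unsqueeze(-1)
y_final = rk4_batch(lambda t, y: -decay * y, torch.ones(1024, 1), 0.0, 1.0)
```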

Deep Gaussian Embedding of Graphs: Unsupervised Inductive Learning via Ranking

1 code implementation ICLR 2018 Aleksandar Bojchevski, Stephan Günnemann

We propose Graph2Gauss - an approach that can efficiently learn versatile node embeddings on large scale (attributed) graphs that show strong performance on tasks such as link prediction and node classification.

Link Prediction Network Embedding +1
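
A compact way to express the ranking idea (our notation, a hedged reconstruction rather than the paper's exact objective): each node i is embedded as a Gaussian, dissimilarity is an asymmetric KL-based energy, and nodes at smaller hop distance from i are ranked as more similar via a square-exponential loss over triplets.

```latex
% E_{ij}: energy between node i and node j; (i, j, k) ranges over triplets
% where j is closer to i (in hop distance) than k is.
E_{ij} = D_{\mathrm{KL}}\!\left(\mathcal{N}_j \,\|\, \mathcal{N}_i\right), \qquad
\mathcal{L} = \sum_{(i,\,j,\,k)} \Big( E_{ij}^{2} + e^{-E_{ik}} \Big)
```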

Transformers Meet Directed Graphs

1 code implementation 31 Jan 2023 Simon Geisler, Yujia Li, Daniel Mankowitz, Ali Taylan Cemgil, Stephan Günnemann, Cosmin Paduraru

Transformers were originally proposed as a sequence-to-sequence model for text but have become vital for a wide range of modalities, including images, audio, video, and undirected graphs.

graph construction Graph Property Prediction

Whole Brain Vessel Graphs: A Dataset and Benchmark for Graph Learning and Neuroscience (VesselGraph)

1 code implementation 30 Aug 2021 Johannes C. Paetzold, Julian McGinnis, Suprosanna Shit, Ivan Ezhov, Paul Büschl, Chinmay Prabhakar, Mihail I. Todorov, Anjany Sekuboyina, Georgios Kaissis, Ali Ertürk, Stephan Günnemann, Bjoern H. Menze

Moreover, we benchmark numerous state-of-the-art graph learning algorithms on the biologically relevant tasks of vessel prediction and vessel classification using the introduced vessel graph dataset.

Graph Learning

Predicting Cellular Responses to Novel Drug Perturbations at a Single-Cell Resolution

1 code implementation 28 Apr 2022 Leon Hetzel, Simon Böhm, Niki Kilbertus, Stephan Günnemann, Mohammad Lotfollahi, Fabian Theis

Single-cell transcriptomics enabled the study of cellular heterogeneity in response to perturbations at the resolution of individual cells.

Drug Discovery Transfer Learning

Posterior Network: Uncertainty Estimation without OOD Samples via Density-Based Pseudo-Counts

1 code implementation NeurIPS 2020 Bertrand Charpentier, Daniel Zügner, Stephan Günnemann

The posterior distributions learned by PostNet accurately reflect uncertainty for in- and out-of-distribution data -- without requiring access to OOD data at training time.

Out of Distribution (OOD) Detection
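
Schematically, and in our own notation, the density-based pseudo-count idea amounts to scaling per-class Dirichlet concentration parameters by a learned latent density, so uncertainty grows automatically in regions with little training data; the exact parametrization in the paper may differ.

```latex
% Schematic pseudo-count update: N_c = number of training points of class c,
% z(x) = latent encoding of input x, p(z | c; \phi) = class-conditional
% density estimated by a normalizing flow.
\alpha^{(c)}(x) \;=\; \beta^{(c)}_{\text{prior}} \;+\; N_c \, p\big(z(x) \mid c;\, \phi\big)
```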

Adversarial Attacks on Node Embeddings via Graph Poisoning

1 code implementation ICLR 2019 Aleksandar Bojchevski, Stephan Günnemann

The goal of network representation learning is to learn low-dimensional node embeddings that capture the graph structure and are useful for solving downstream tasks.

Representation Learning

Certifiable Robustness and Robust Training for Graph Convolutional Networks

1 code implementation 28 Jun 2019 Daniel Zügner, Stephan Günnemann

Recent works show that Graph Neural Networks (GNNs) are highly non-robust with respect to adversarial attacks on both the graph structure and the node attributes, making their outcomes unreliable.

Node Classification

OODformer: Out-Of-Distribution Detection Transformer

1 code implementation 19 Jul 2021 Rajat Koner, Poulami Sinhamahapatra, Karsten Roscher, Stephan Günnemann, Volker Tresp

A serious problem in image classification is that a trained model might perform well for input data that originates from the same distribution as the data available for model training, but performs much worse for out-of-distribution (OOD) samples.

Contrastive Learning Out-of-Distribution Detection +1

Efficient Robustness Certificates for Discrete Data: Sparsity-Aware Randomized Smoothing for Graphs, Images and More

1 code implementation ICML 2020 Aleksandar Bojchevski, Johannes Gasteiger, Stephan Günnemann

Existing techniques for certifying the robustness of models for discrete data either work only for a small class of models or are general at the expense of efficiency or tightness.

Differentiable DAG Sampling

1 code implementation ICLR 2022 Bertrand Charpentier, Simon Kibler, Stephan Günnemann

To this end, DP-DAG samples a DAG by successively (1) sampling a linear ordering of the nodes and (2) sampling edges consistent with the sampled linear ordering.

valid Variational Inference
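
The two-step sampling procedure described above can be illustrated with a plain (non-differentiable) NumPy sketch; DP-DAG's actual contribution is making both steps differentiable so the sampler can be trained with variational inference, which is not shown here.

```python
import numpy as np

def sample_dag(n, edge_prob=0.3, rng=None):
    """Sample a DAG by (1) drawing a linear ordering of the nodes and
    (2) drawing edges only from earlier to later nodes in that ordering.

    A non-differentiable illustration of the two-step procedure; the
    parameters are illustrative assumptions.
    """
    rng = np.random.default_rng() if rng is None else rng
    order = rng.permutation(n)                    # step 1: linear ordering
    adj = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < edge_prob:          # step 2: order-consistent edges
                adj[order[i], order[j]] = 1       # always earlier -> later, hence acyclic
    return adj
```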

Graph Hawkes Neural Network for Forecasting on Temporal Knowledge Graphs

1 code implementation AKBC 2020 Zhen Han, Yunpu Ma, Yuyi Wang, Stephan Günnemann, Volker Tresp

The Hawkes process has become a standard method for modeling self-exciting event sequences with different event types.

Knowledge Graphs
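
For background, the classical self-exciting Hawkes intensity with an exponential kernel is shown below; this is the textbook form that the Graph Hawkes Neural Network generalizes with neural, graph-conditioned dynamics, not the paper's model itself.

```latex
% Each past event at time t_i raises the future event rate and then decays.
\lambda(t) \;=\; \mu \;+\; \sum_{t_i < t} \alpha\, e^{-\beta\,(t - t_i)}, \qquad \mu, \alpha, \beta > 0
```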

Oktoberfest Food Dataset

1 code implementation 22 Nov 2019 Alexander Ziller, Julius Hansjakob, Vitalii Rusinov, Daniel Zügner, Peter Vogel, Stephan Günnemann

We release a realistic, diverse, and challenging dataset for object detection on images.

Object object-detection +1

Sampling-free Inference for Ab-Initio Potential Energy Surface Networks

1 code implementation 30 May 2022 Nicholas Gao, Stephan Günnemann

In this work, we address the inference shortcomings by proposing the Potential learning from ab-initio Networks (PlaNet) framework, in which we simultaneously train a surrogate model in addition to the neural wave function.

Inductive Bias Numerical Integration

Ewald-based Long-Range Message Passing for Molecular Graphs

1 code implementation 8 Mar 2023 Arthur Kosmala, Johannes Gasteiger, Nicholas Gao, Stephan Günnemann

Neural architectures that learn potential energy surfaces from molecular data have undergone fast improvement in recent years.

Inductive Bias

From Things' Modeling Language (ThingML) to Things' Machine Learning (ThingML2)

1 code implementation 22 Sep 2020 Armin Moin, Stephan Rössler, Marouane Sayih, Stephan Günnemann

In this paper, we illustrate how to enhance an existing state-of-the-art modeling language and tool for the Internet of Things (IoT), called ThingML, to support machine learning on the modeling level.

BIG-bench Machine Learning Code Generation

ThingML+ Augmenting Model-Driven Software Engineering for the Internet of Things with Machine Learning

1 code implementation 22 Sep 2020 Armin Moin, Stephan Rössler, Stephan Günnemann

In this paper, we present the current position of the research project ML-Quadrat, which aims to extend the methodology, modeling language and tool support of ThingML - an open source modeling tool for IoT/CPS - to address Machine Learning needs for the IoT applications.

BIG-bench Machine Learning Code Generation

ML-Quadrat & DriotData: A Model-Driven Engineering Tool and a Low-Code Platform for Smart IoT Services

1 code implementation 6 Jul 2021 Armin Moin, Andrei Mituca, Moharram Challenger, Atta Badii, Stephan Günnemann

In this paper, we present ML-Quadrat, an open-source research prototype that is based on the Eclipse Modeling Framework (EMF) and the state of the art in the literature of Model-Driven Software Engineering (MDSE) for smart Cyber-Physical Systems (CPS) and the Internet of Things (IoT).

A Model-Driven Approach to Machine Learning and Software Modeling for the IoT

1 code implementation 6 Jul 2021 Armin Moin, Moharram Challenger, Atta Badii, Stephan Günnemann

In particular, we implement the proposed approach, called ML-Quadrat, based on ThingML, and validate it using a case study from the IoT domain, as well as through an empirical user evaluation.

BIG-bench Machine Learning Decision Making

Robustness of Graph Neural Networks at Scale

2 code implementations NeurIPS 2021 Simon Geisler, Tobias Schmidt, Hakan Şirin, Daniel Zügner, Aleksandar Bojchevski, Stephan Günnemann

Graph Neural Networks (GNNs) are increasingly important given their popularity and the diversity of applications.

Fast and Flexible Temporal Point Processes with Triangular Maps

1 code implementation NeurIPS 2020 Oleksandr Shchur, Nicholas Gao, Marin Biloš, Stephan Günnemann

Temporal point process (TPP) models combined with recurrent neural networks provide a powerful framework for modeling continuous-time event data.

Point Processes Variational Inference

On Out-of-distribution Detection with Energy-based Models

1 code implementation 3 Jul 2021 Sven Elflein, Bertrand Charpentier, Daniel Zügner, Stephan Günnemann

Several density estimation methods have been shown to fail at detecting out-of-distribution (OOD) samples by assigning higher likelihoods to anomalous data.

Density Estimation Out-of-Distribution Detection +1

Reliable Graph Neural Networks via Robust Aggregation

1 code implementation NeurIPS 2020 Simon Geisler, Daniel Zügner, Stephan Günnemann

Perturbations targeting the graph structure have proven to be extremely effective in reducing the performance of Graph Neural Networks (GNNs), and traditional defenses such as adversarial training do not seem to be able to improve robustness.

Graphhopper: Multi-Hop Scene Graph Reasoning for Visual Question Answering

1 code implementation 13 Jul 2021 Rajat Koner, Hang Li, Marcel Hildebrandt, Deepan Das, Volker Tresp, Stephan Günnemann

We conduct an experimental study on the challenging dataset GQA, based on both manually curated and automatically generated scene graphs.

Navigate Question Answering +1

Certifiable Robustness to Graph Perturbations

1 code implementation NeurIPS 2019 Aleksandar Bojchevski, Stephan Günnemann

Despite the exploding interest in graph neural networks there has been little effort to verify and improve their robustness.

MAGNet: Motif-Agnostic Generation of Molecules from Shapes

1 code implementation 30 May 2023 Leon Hetzel, Johanna Sommer, Bastian Rieck, Fabian Theis, Stephan Günnemann

Recent advances in machine learning for molecules exhibit great potential for facilitating drug discovery from in silico predictions.

Drug Discovery

Adversarial Attacks and Defenses in Large Language Models: Old and New Threats

1 code implementation 30 Oct 2023 Leo Schwinn, David Dobre, Stephan Günnemann, Gauthier Gidel

Here, one major impediment has been the overestimation of the robustness of new defense approaches due to faulty defense evaluations.

Training, Architecture, and Prior for Deterministic Uncertainty Methods

1 code implementation 10 Mar 2023 Bertrand Charpentier, Chenxiang Zhang, Stephan Günnemann

Accurate and efficient uncertainty estimation is crucial for building reliable Machine Learning (ML) models capable of providing calibrated uncertainty estimates, generalizing, and detecting Out-Of-Distribution (OOD) datasets.

Linearized and Single-Pass Belief Propagation

1 code implementation 27 Jun 2014 Wolfgang Gatterbauer, Stephan Günnemann, Danai Koutra, Christos Faloutsos

Often, we can answer such questions and label nodes in a network based on the labels of their neighbors and appropriate assumptions of homophily ("birds of a feather flock together") or heterophily ("opposites attract").

From Zero to Turbulence: Generative Modeling for 3D Flow Simulation

1 code implementation 29 May 2023 Marten Lienen, David Lüdke, Jan Hansen-Palmus, Stephan Günnemann

On this dataset, we show that our generative model captures the distribution of turbulent flows caused by unseen objects and generates high-quality, realistic samples amenable for downstream applications without access to any initial state.

United States Politicians' Tone Became More Negative with 2016 Primary Campaigns

1 code implementation 17 Jul 2022 Jonathan Külz, Andreas Spitz, Ahmad Abu-Akel, Stephan Günnemann, Robert West

There is a widespread belief that the tone of US political language has become more negative recently, in particular when Donald Trump entered politics.

Randomized Message-Interception Smoothing: Gray-box Certificates for Graph Neural Networks

1 code implementation 5 Jan 2023 Yan Scholten, Jan Schuchardt, Simon Geisler, Aleksandar Bojchevski, Stephan Günnemann

To remedy this, we propose novel gray-box certificates that exploit the message-passing principle of GNNs: We randomly intercept messages and carefully analyze the probability that messages from adversarially controlled nodes reach their target nodes.

Adversarial Robustness

Density-based Feasibility Learning with Normalizing Flows for Introspective Robotic Assembly

2 code implementations 3 Jul 2023 Jianxiang Feng, Matan Atad, Ismael Rodríguez, Maximilian Durner, Stephan Günnemann, Rudolph Triebel

Machine Learning (ML) models in Robotic Assembly Sequence Planning (RASP) need to be introspective on the predicted solutions, i.e. whether they are feasible or not, to circumvent potential efficiency degradation.

Out of Distribution (OOD) Detection

Preventing Errors in Person Detection: A Part-Based Self-Monitoring Framework

1 code implementation 10 Jul 2023 Franziska Schwaiger, Andrea Matic, Karsten Roscher, Stephan Günnemann

The ability to detect learned objects regardless of their appearance is crucial for autonomous systems in real-world applications.

Human Detection object-detection +1

Transition Path Sampling with Boltzmann Generator-based MCMC Moves

1 code implementation 8 Dec 2023 Michael Plainer, Hannes Stärk, Charlotte Bunne, Stephan Günnemann

Sampling all possible transition paths between two 3D states of a molecular system has various applications ranging from catalyst design to drug discovery.

Drug Discovery

Generalizing Neural Wave Functions

1 code implementation 8 Feb 2023 Nicholas Gao, Stephan Günnemann

To overcome this limitation, we present Graph-learned orbital embeddings (Globe), a neural network-based reparametrization method that can adapt neural wave functions to different molecules.

Dual-Primal Graph Convolutional Networks

no code implementations 3 Jun 2018 Federico Monti, Oleksandr Shchur, Aleksandar Bojchevski, Or Litany, Stephan Günnemann, Michael M. Bronstein

In recent years, there has been a surge of interest in developing deep learning methods for non-Euclidean structured data such as graphs.

Graph Attention Recommendation Systems

Mining Contrasting Quasi-Clique Patterns

no code implementations 3 Oct 2018 Roberto Alonso, Stephan Günnemann

Mining dense quasi-cliques is a well-known clustering task with applications ranging from social networks over collaboration graphs to document analysis.

Clustering

Multi-Source Neural Variational Inference

no code implementations 11 Nov 2018 Richard Kurle, Stephan Günnemann, Patrick van der Smagt

Learning from multiple sources of information is an important problem in machine-learning research.

Variational Inference

Adversarial Attacks on Node Embeddings

no code implementations ICLR 2019 Aleksandar Bojchevski, Stephan Günnemann

The goal of network representation learning is to learn low-dimensional node embeddings that capture the graph structure and are useful for solving downstream tasks.

Representation Learning

Scene Graph Reasoning for Visual Question Answering

no code implementations 2 Jul 2020 Marcel Hildebrandt, Hang Li, Rajat Koner, Volker Tresp, Stephan Günnemann

We propose a novel method that approaches the task by performing context-driven, sequential reasoning based on the objects and their semantic and spatial relationships present in the scene.

Navigate Question Answering +1

Deep Representation Learning and Clustering of Traffic Scenarios

no code implementations 15 Jul 2020 Nick Harmening, Marin Biloš, Stephan Günnemann

Determining the traffic scenario space is a major challenge for the homologation and coverage assessment of automated driving functions.

Clustering Representation Learning +1

Continual Learning with Bayesian Neural Networks for Non-Stationary Data

no code implementations ICLR 2020 Richard Kurle, Botond Cseke, Alexej Klushyn, Patrick van der Smagt, Stephan Günnemann

We represent the posterior approximation of the network weights by a diagonal Gaussian distribution and a complementary memory of raw data.

Continual Learning

Collective Robustness Certificates

no code implementations ICLR 2021 Jan Schuchardt, Aleksandar Bojchevski, Johannes Klicpera, Stephan Günnemann

In tasks like node classification, image segmentation, and named-entity recognition, we have a classifier that simultaneously outputs multiple predictions (a vector of labels) based on a single input, i.e. a single graph, image, or document, respectively.

Adversarial Robustness Image Segmentation +5

Scalable Normalizing Flows for Permutation Invariant Densities

no code implementations 7 Oct 2020 Marin Biloš, Stephan Günnemann

Modeling sets is an important problem in machine learning since this type of data can be found in many domains.

Point Processes

Deep Rao-Blackwellised Particle Filters for Time Series Forecasting

no code implementations NeurIPS 2020 Richard Kurle, Syama Sundar Rangapuram, Emmanuel de Bézenac, Stephan Günnemann, Jan Gasthaus

We propose a Monte Carlo objective that leverages the conditional linearity by computing the corresponding conditional expectations in closed-form and a suitable proposal distribution that is factorised similarly to the optimal proposal distribution.

Time Series Time Series Forecasting

Neural Temporal Point Processes: A Review

no code implementations 8 Apr 2021 Oleksandr Shchur, Ali Caner Türkmen, Tim Januschowski, Stephan Günnemann

Temporal point processes (TPP) are probabilistic generative models for continuous-time event sequences.

Point Processes

Supporting AI Engineering on the IoT Edge through Model-Driven TinyML

no code implementations 6 Jul 2021 Armin Moin, Moharram Challenger, Atta Badii, Stephan Günnemann

We focus on a sub-discipline of AI, namely Machine Learning (ML), and propose the delegation of data analytics and ML to the IoT edge.

MDE4QAI: Towards Model-Driven Engineering for Quantum Artificial Intelligence

no code implementations 14 Jul 2021 Armin Moin, Moharram Challenger, Atta Badii, Stephan Günnemann

Over the past decade, Artificial Intelligence (AI) has provided enormous new possibilities and opportunities, but also new demands and requirements for software systems.

Code Generation

Scalable Optimal Transport in High Dimensions for Graph Distances, Embedding Alignment, and More

no code implementations 14 Jul 2021 Johannes Gasteiger, Marten Lienen, Stephan Günnemann

The current best practice for computing optimal transport (OT) is via entropy regularization and Sinkhorn iterations.

Distance regression
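
The "current best practice" referenced above is the standard entropy-regularized Sinkhorn algorithm, sketched below in NumPy; the paper's contribution (sparse approximations that scale to high dimensions) is not reproduced here.

```python
import numpy as np

def sinkhorn(a, b, C, reg=0.1, n_iters=200):
    """Entropy-regularized optimal transport via Sinkhorn iterations.

    a, b: source and target marginals (1-D arrays summing to 1); C: cost matrix.
    This is the textbook dense algorithm, whose quadratic memory footprint is
    exactly what motivates sparse alternatives; it is not the paper's method.
    """
    K = np.exp(-C / reg)                  # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)                 # alternating scaling updates
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]    # transport plan diag(u) K diag(v)
```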

On Second-order Optimization Methods for Federated Learning

no code implementations 6 Sep 2021 Sebastian Bischoff, Stephan Günnemann, Martin Jaggi, Sebastian U. Stich

We consider federated learning (FL), where the training data is distributed across a large number of clients.

Federated Learning Specificity

A Study of Joint Graph Inference and Forecasting

no code implementations 10 Sep 2021 Daniel Zügner, François-Xavier Aubet, Victor Garcia Satorras, Tim Januschowski, Stephan Günnemann, Jan Gasthaus

We study a recent class of models which uses graph neural networks (GNNs) to improve forecasting in multivariate time series.

Graph Learning Time Series +1

Locality-Based Mini Batching for Graph Neural Networks

no code implementations 29 Sep 2021 Johannes Klicpera, Chendi Qian, Stephan Günnemann

Training graph neural networks on large graphs is challenging since there is no clear way to extract mini-batches from connected data.

Provably Robust Transfer

no code implementations 29 Sep 2021 Anna-Kathrin Kopetzki, Jana Obernosterer, Aleksandar Bojchevski, Stephan Günnemann

Our experiments show how adversarial training on the source domain affects robustness on source and target domain, and we propose the first provably robust transfer learning models.

Adversarial Robustness Transfer Learning

Intriguing Properties of Input-dependent Randomized Smoothing

no code implementations 11 Oct 2021 Peter Súkeník, Aleksei Kuvshinov, Stephan Günnemann

We show that in general, the input-dependent smoothing suffers from the curse of dimensionality, forcing the variance function to have low semi-elasticity.

Fairness

Directional Message Passing on Molecular Graphs via Synthetic Coordinates

no code implementations NeurIPS 2021 Johannes Gasteiger, Chandan Yeshwanth, Stephan Günnemann

We furthermore set the state of the art on ZINC and coordinate-free QM9 by incorporating synthetic coordinates in the SMP and DimeNet++ models.

Molecular Property Prediction Property Prediction

Equivariant Normalizing Flows for Point Processes and Sets

no code implementations 28 Sep 2020 Marin Biloš, Stephan Günnemann

To model this behavior, it is enough to transform the samples from the uniform process with a sufficiently complex equivariant function.

Point Processes

Multi-Objective Model Selection for Time Series Forecasting

no code implementations 17 Feb 2022 Oliver Borchert, David Salinas, Valentin Flunkert, Tim Januschowski, Stephan Günnemann

By learning a mapping from forecasting models to performance metrics, we show that our method PARETOSELECT is able to accurately select models from the Pareto front -- alleviating the need to train or evaluate many forecasting models for model selection.

Model Selection Time Series +1

Is it all a cluster game? -- Exploring Out-of-Distribution Detection based on Clustering in the Embedding Space

no code implementations 16 Mar 2022 Poulami Sinhamahapatra, Rajat Koner, Karsten Roscher, Stephan Günnemann

It is essential for safety-critical applications of deep neural networks to determine when new inputs are significantly different from the training distribution.

Contrastive Learning Out-of-Distribution Detection +1

Disentangling Epistemic and Aleatoric Uncertainty in Reinforcement Learning

no code implementations 3 Jun 2022 Bertrand Charpentier, Ransalu Senanayake, Mykel Kochenderfer, Stephan Günnemann

Characterizing aleatoric and epistemic uncertainty can be used to speed up learning in a training environment, improve generalization to similar testing environments, and flag unfamiliar behavior in anomalous testing environments.

reinforcement-learning Reinforcement Learning (RL)

On the Robustness and Anomaly Detection of Sparse Neural Networks

no code implementations 9 Jul 2022 Morgane Ayle, Bertrand Charpentier, John Rachwan, Daniel Zügner, Simon Geisler, Stephan Günnemann

The robustness and anomaly detection capability of neural networks are crucial topics for their safe adoption in the real-world.

Anomaly Detection

MDE for Machine Learning-Enabled Software Systems: A Case Study and Comparison of MontiAnna & ML-Quadrat

no code implementations 15 Sep 2022 Jörg Christian Kirchhof, Evgeny Kusmenko, Jonas Ritz, Bernhard Rumpe, Armin Moin, Atta Badii, Stephan Günnemann, Moharram Challenger

In this paper, we propose to adopt the MDE paradigm for the development of Machine Learning (ML)-enabled software systems with a focus on the Internet of Things (IoT) domain.

AutoML

A Systematic Evaluation of Node Embedding Robustness

1 code implementation 16 Sep 2022 Alexandru Mara, Jefrey Lijffijt, Stephan Günnemann, Tijl De Bie

We find that node classification results are impacted more than network reconstruction ones, that degree-based and label-based attacks are on average the most damaging and that label heterophily can strongly influence attack performance.

Classification Node Classification

Unveiling the Sampling Density in Non-Uniform Geometric Graphs

no code implementations 15 Oct 2022 Raffaele Paolino, Aleksandar Bojchevski, Stephan Günnemann, Gitta Kutyniok, Ron Levie

A powerful framework for studying graphs is to consider them as geometric graphs: nodes are randomly sampled from an underlying metric space, and any pair of nodes is connected if their distance is less than a specified neighborhood radius.
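
A minimal sketch of the geometric-graph construction described above, assuming uniform sampling in the unit hypercube purely for illustration; the paper studies exactly the setting where the underlying sampling density is non-uniform and must be recovered.

```python
import numpy as np

def random_geometric_graph(n, radius, dim=2, rng=None):
    """Sample a geometric graph: nodes are points in [0, 1]^dim and two nodes
    are connected iff their Euclidean distance is below `radius`.

    Uniform sampling is an assumption made here for simplicity only.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = rng.random((n, dim))                                       # node positions
    dist = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)  # pairwise distances
    adj = (dist < radius) & ~np.eye(n, dtype=bool)                 # drop self-loops
    return x, adj.astype(int)
```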

Irregularly-Sampled Time Series Modeling with Spline Networks

no code implementations 19 Oct 2022 Marin Biloš, Emanuel Ramneantu, Stephan Günnemann

Observations made in continuous time are often irregular and contain missing values across different channels.

Time Series Time Series Analysis

Localized Randomized Smoothing for Collective Robustness Certification

no code implementations 28 Oct 2022 Jan Schuchardt, Tom Wollschläger, Aleksandar Bojchevski, Stephan Günnemann

We further show that this approach is beneficial for the larger class of softly local models, where each output is dependent on the entire input but assigns different levels of importance to different input regions (e.g. based on their proximity in the image).

Image Segmentation Node Classification +1

Invariance-Aware Randomized Smoothing Certificates

no code implementations 25 Nov 2022 Jan Schuchardt, Stephan Günnemann

Building models that comply with the invariances inherent to different domains, such as invariance under translation or rotation, is a key aspect of applying machine learning to real world problems like molecular property prediction, medical imaging, protein folding or LiDAR classification.

Molecular Property Prediction Property Prediction +1

Influence-Based Mini-Batching for Graph Neural Networks

no code implementations 18 Dec 2022 Johannes Gasteiger, Chendi Qian, Stephan Günnemann

Using graph neural networks for large graphs is challenging since there is no clear way of constructing mini-batches.

Graph Clustering

Are Defenses for Graph Neural Networks Robust?

no code implementations 31 Jan 2023 Felix Mujkanovic, Simon Geisler, Stephan Günnemann, Aleksandar Bojchevski

A cursory reading of the literature suggests that we have made a lot of progress in designing effective adversarial defenses for Graph Neural Networks (GNNs).

Collective Robustness Certificates: Exploiting Interdependence in Graph Neural Networks

no code implementations 6 Feb 2023 Jan Schuchardt, Aleksandar Bojchevski, Johannes Gasteiger, Stephan Günnemann

In tasks like node classification, image segmentation, and named-entity recognition, we have a classifier that simultaneously outputs multiple predictions (a vector of labels) based on a single input, i.e. a single graph, image, or document, respectively.

Adversarial Robustness Image Segmentation +5

Accuracy is not the only Metric that matters: Estimating the Energy Consumption of Deep Learning Models

no code implementations 3 Apr 2023 Johannes Getzner, Bertrand Charpentier, Stephan Günnemann

Modern machine learning models have started to consume incredible amounts of energy, thus incurring large carbon footprints (Strubell et al., 2019).

Towards Efficient MCMC Sampling in Bayesian Neural Networks by Exploiting Symmetry

no code implementations 6 Apr 2023 Jonas Gregor Wiese, Lisa Wimmer, Theodore Papamarkou, Bernd Bischl, Stephan Günnemann, David Rügamer

Bayesian inference in deep neural networks is challenging due to the high-dimensional, strongly multi-modal parameter posterior density landscape.

Bayesian Inference Uncertainty Quantification

Revisiting Robustness in Graph Machine Learning

no code implementations 1 May 2023 Lukas Gosch, Daniel Sturm, Simon Geisler, Stephan Günnemann

Many works show that node-level predictions of Graph Neural Networks (GNNs) are unrobust to small, often termed adversarial, changes to the graph structure.

Adversarial Robustness

Efficient MILP Decomposition in Quantum Computing for ReLU Network Robustness

1 code implementation 30 Apr 2023 Nicola Franco, Tom Wollschläger, Benedikt Poggel, Stephan Günnemann, Jeanette Miriam Lorenz

We conduct a detailed analysis for the decomposition of MILP with Benders and Dantzig-Wolfe methods.

Adversarial Training for Graph Neural Networks: Pitfalls, Solutions, and New Directions

no code implementations NeurIPS 2023 Lukas Gosch, Simon Geisler, Daniel Sturm, Bertrand Charpentier, Daniel Zügner, Stephan Günnemann

Including these contributions, we demonstrate that adversarial training is a state-of-the-art defense against adversarial structure perturbations.

Graph Learning

Uncertainty Estimation for Molecules: Desiderata and Methods

no code implementations 20 Jun 2023 Tom Wollschläger, Nicholas Gao, Bertrand Charpentier, Mohamed Amine Ketata, Stephan Günnemann

Graph Neural Networks (GNNs) are promising surrogates for quantum mechanical calculations as they establish unprecedented low errors on collections of molecular dynamics (MD) trajectories.

The power of motifs as inductive bias for learning molecular distributions

no code implementations 4 Apr 2023 Johanna Sommer, Leon Hetzel, David Lüdke, Fabian Theis, Stephan Günnemann

Machine learning for molecules holds great potential for efficiently exploring the vast chemical space and thus streamlining the drug discovery process by facilitating the design of new therapeutic molecules.

Drug Discovery Inductive Bias

AI-Enabled Software and System Architecture Frameworks: Focusing on smart Cyber-Physical Systems (CPS)

no code implementations 9 Aug 2023 Armin Moin, Atta Badii, Stephan Günnemann, Moharram Challenger

Therefore, they failed to address the architecture viewpoints and views responsive to the concerns of the data science community.

Benchmarking

Expressivity of Graph Neural Networks Through the Lens of Adversarial Robustness

1 code implementation 16 Aug 2023 Francesco Campi, Lukas Gosch, Tom Wollschläger, Yan Scholten, Stephan Günnemann

We perform the first adversarial robustness study into Graph Neural Networks (GNNs) that are provably more powerful than traditional Message Passing Neural Networks (MPNNs).

Adversarial Robustness Subgraph Counting

Stream-based Active Learning by Exploiting Temporal Properties in Perception with Temporal Predicted Loss

no code implementations 11 Sep 2023 Sebastian Schmidt, Stephan Günnemann

In our work, we exploit the temporal properties of such image streams and propose the novel temporal predicted loss (TPL) method.

Active Learning

Assessing Robustness via Score-Based Adversarial Image Generation

no code implementations 6 Oct 2023 Marcel Kollovieh, Lukas Gosch, Yan Scholten, Marten Lienen, Stephan Günnemann

In this work, we introduce Score-Based Adversarial Generation (ScoreAG), a novel framework that leverages the advancements in score-based generative models to generate adversarial examples beyond $\ell_p$-norm constraints, so-called unrestricted adversarial examples, overcoming their limitations.

Image Generation

Hierarchical Randomized Smoothing

no code implementations NeurIPS 2023 Yan Scholten, Jan Schuchardt, Aleksandar Bojchevski, Stephan Günnemann

Randomized smoothing is a powerful framework for making models provably robust against small changes to their inputs - by guaranteeing robustness of the majority vote when randomly adding noise before classification.

Node Classification
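
For background, the standard (non-hierarchical) Gaussian smoothing certificate of Cohen et al. (2019) is reproduced below; the paper builds a hierarchical variant on top of this majority-vote construction.

```latex
% g: smoothed classifier; r: certified l2 radius around x, where
% \underline{p_A} and \overline{p_B} are lower/upper bounds on the
% probabilities of the top two classes under the noise distribution.
g(x) = \arg\max_{c}\; \Pr_{\epsilon \sim \mathcal{N}(0,\, \sigma^{2} I)}\big[f(x + \epsilon) = c\big],
\qquad
r = \frac{\sigma}{2}\Big(\Phi^{-1}\big(\underline{p_A}\big) - \Phi^{-1}\big(\overline{p_B}\big)\Big)
```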

Add and Thin: Diffusion for Temporal Point Processes

no code implementations NeurIPS 2023 David Lüdke, Marin Biloš, Oleksandr Shchur, Marten Lienen, Stephan Günnemann

Autoregressive neural networks within the temporal point process (TPP) framework have become the standard for modeling continuous-time event data.

Denoising Density Estimation +1

On the Adversarial Robustness of Graph Contrastive Learning Methods

no code implementations 29 Nov 2023 Filippo Guerranti, Zinuo Yi, Anna Starovoit, Rafiq Kamel, Simon Geisler, Stephan Günnemann

Contrastive learning (CL) has emerged as a powerful framework for learning representations of images and text in a self-supervised manner while enhancing model robustness against adversarial attacks.

Adversarial Robustness Contrastive Learning +2

Attacking Large Language Models with Projected Gradient Descent

no code implementations 14 Feb 2024 Simon Geisler, Tom Wollschläger, M. H. I. Abdalla, Johannes Gasteiger, Stephan Günnemann

Current LLM alignment methods are readily broken through specifically crafted adversarial prompts.

Shaving Weights with Occam's Razor: Bayesian Sparsification for Neural Networks Using the Marginal Likelihood

no code implementations 25 Feb 2024 Rayen Dhahri, Alexander Immer, Bertrand Charpentier, Stephan Günnemann, Vincent Fortuin

Neural network sparsification is a promising avenue to save computational time and memory costs, especially in an age where many successful AI models are becoming too large to naïvely deploy on consumer hardware.

On Representing Electronic Wave Functions with Sign Equivariant Neural Networks

no code implementations 8 Mar 2024 Nicholas Gao, Stephan Günnemann

Recent neural networks demonstrated impressively accurate approximations of electronic ground-state wave functions.

Group Privacy Amplification and Unified Amplification by Subsampling for Rényi Differential Privacy

no code implementations 7 Mar 2024 Jan Schuchardt, Mihail Stoian, Arthur Kosmala, Stephan Günnemann

Differential privacy (DP) has various desirable properties, such as robustness to post-processing, group privacy, and amplification by subsampling, which can be derived independently of each other.
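
For orientation, the classical amplification-by-subsampling statement for approximate DP is shown below (for Poisson subsampling, where each record is included independently with probability q); the paper's unified analysis is carried out for Rényi DP and is not captured by this formula.

```latex
\text{If } \mathcal{M} \text{ is } (\epsilon, \delta)\text{-DP, then }
\mathcal{M} \circ \mathrm{Subsample}_{q} \text{ is }
\Big(\log\!\big(1 + q\,(e^{\epsilon} - 1)\big),\; q\,\delta\Big)\text{-DP.}
```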
