Search Results for author: Stephan Günnemann

Found 117 papers, 62 papers with code

Stream-based Active Learning by Exploiting Temporal Properties in Perception with Temporal Predicted Loss

no code implementations 11 Sep 2023 Sebastian Schmidt, Stephan Günnemann

In our work, we exploit the temporal properties of such image streams and propose the novel temporal predicted loss (TPL) method.

Active Learning

Expressivity of Graph Neural Networks Through the Lens of Adversarial Robustness

1 code implementation 16 Aug 2023 Francesco Campi, Lukas Gosch, Tom Wollschläger, Yan Scholten, Stephan Günnemann

We perform the first adversarial robustness study of Graph Neural Networks (GNNs) that are provably more powerful than traditional Message Passing Neural Networks (MPNNs).

Adversarial Robustness Subgraph Counting

AI-Enabled Software and System Architecture Frameworks: Focusing on smart Cyber-Physical Systems (CPS)

no code implementations 9 Aug 2023 Armin Moin, Atta Badii, Stephan Günnemann, Moharram Challenger

Therefore, they fail to address the architecture viewpoints and views responsive to the concerns of the data science community.


Preventing Errors in Person Detection: A Part-Based Self-Monitoring Framework

1 code implementation 10 Jul 2023 Franziska Schwaiger, Andrea Matic, Karsten Roscher, Stephan Günnemann

The ability to detect learned objects regardless of their appearance is crucial for autonomous systems in real-world applications.

Human Detection object-detection +1

Density-based Feasibility Learning with Normalizing Flows for Introspective Robotic Assembly

1 code implementation 3 Jul 2023 Jianxiang Feng, Matan Atad, Ismael Rodríguez, Maximilian Durner, Stephan Günnemann, Rudolph Triebel

Machine Learning (ML) models in Robotic Assembly Sequence Planning (RASP) need to be introspective about the predicted solutions, i.e., whether they are feasible or not, to circumvent potential efficiency degradation.

Out of Distribution (OOD) Detection

Adversarial Training for Graph Neural Networks

no code implementations 27 Jun 2023 Lukas Gosch, Simon Geisler, Daniel Sturm, Bertrand Charpentier, Daniel Zügner, Stephan Günnemann

Including these contributions, we demonstrate that adversarial training is a state-of-the-art defense against adversarial structure perturbations.

Graph Learning

Uncertainty Estimation for Molecules: Desiderata and Methods

no code implementations 20 Jun 2023 Tom Wollschläger, Nicholas Gao, Bertrand Charpentier, Mohamed Amine Ketata, Stephan Günnemann

Graph Neural Networks (GNNs) are promising surrogates for quantum mechanical calculations as they establish unprecedented low errors on collections of molecular dynamics (MD) trajectories.

MAGNet: Motif-Agnostic Generation of Molecules from Shapes

no code implementations 30 May 2023 Leon Hetzel, Johanna Sommer, Bastian Rieck, Fabian Theis, Stephan Günnemann

To this end, we introduce a novel factorisation of the molecules' data distribution that accounts for the molecules' global context and facilitates learning adequate assignments of atoms and bonds onto shapes.

Drug Discovery

Generative Diffusion for 3D Turbulent Flows

no code implementations 29 May 2023 Marten Lienen, Jan Hansen-Palmus, David Lüdke, Stephan Günnemann

Turbulent flows are well known to be chaotic and hard to predict; however, their dynamics differ between two and three dimensions.

Revisiting Robustness in Graph Machine Learning

no code implementations 1 May 2023 Lukas Gosch, Daniel Sturm, Simon Geisler, Stephan Günnemann

Many works show that node-level predictions of Graph Neural Networks (GNNs) are not robust to small, often termed adversarial, changes to the graph structure.

Adversarial Robustness

Towards Efficient MCMC Sampling in Bayesian Neural Networks by Exploiting Symmetry

no code implementations 6 Apr 2023 Jonas Gregor Wiese, Lisa Wimmer, Theodore Papamarkou, Bernd Bischl, Stephan Günnemann, David Rügamer

Bayesian inference in deep neural networks is challenging due to the high-dimensional, strongly multi-modal parameter posterior density landscape.

Bayesian Inference

The power of motifs as inductive bias for learning molecular distributions

no code implementations 4 Apr 2023 Johanna Sommer, Leon Hetzel, David Lüdke, Fabian Theis, Stephan Günnemann

Machine learning for molecules holds great potential for efficiently exploring the vast chemical space and thus streamlining the drug discovery process by facilitating the design of new therapeutic molecules.

Drug Discovery Inductive Bias

Accuracy is not the only Metric that matters: Estimating the Energy Consumption of Deep Learning Models

no code implementations 3 Apr 2023 Johannes Getzner, Bertrand Charpentier, Stephan Günnemann

Modern machine learning models have started to consume incredible amounts of energy, thus incurring large carbon footprints (Strubell et al., 2019).

Training, Architecture, and Prior for Deterministic Uncertainty Methods

1 code implementation 10 Mar 2023 Bertrand Charpentier, Chenxiang Zhang, Stephan Günnemann

Accurate and efficient uncertainty estimation is crucial for building reliable Machine Learning (ML) models capable of providing calibrated uncertainty estimates, generalizing, and detecting Out-Of-Distribution (OOD) datasets.

Ewald-based Long-Range Message Passing for Molecular Graphs

1 code implementation 8 Mar 2023 Arthur Kosmala, Johannes Gasteiger, Nicholas Gao, Stephan Günnemann

Neural architectures that learn potential energy surfaces from molecular data have undergone fast improvement in recent years.

Inductive Bias

Generalizing Neural Wave Functions

1 code implementation 8 Feb 2023 Nicholas Gao, Stephan Günnemann

To overcome this limitation, we present Graph-learned orbital embeddings (Globe), a neural network-based reparametrization method that can adapt neural wave functions to different molecules.

Collective Robustness Certificates: Exploiting Interdependence in Graph Neural Networks

no code implementations 6 Feb 2023 Jan Schuchardt, Aleksandar Bojchevski, Johannes Gasteiger, Stephan Günnemann

In tasks like node classification, image segmentation, and named-entity recognition we have a classifier that simultaneously outputs multiple predictions (a vector of labels) based on a single input, i.e., a single graph, image, or document respectively.

Adversarial Robustness Image Segmentation +5

Are Defenses for Graph Neural Networks Robust?

no code implementations 31 Jan 2023 Felix Mujkanovic, Simon Geisler, Stephan Günnemann, Aleksandar Bojchevski

A cursory reading of the literature suggests that we have made a lot of progress in designing effective adversarial defenses for Graph Neural Networks (GNNs).

Transformers Meet Directed Graphs

1 code implementation 31 Jan 2023 Simon Geisler, Yujia Li, Daniel Mankowitz, Ali Taylan Cemgil, Stephan Günnemann, Cosmin Paduraru

Transformers were originally proposed as a sequence-to-sequence model for text but have become vital for a wide range of modalities, including images, audio, video, and undirected graphs.

graph construction Graph Property Prediction

Randomized Message-Interception Smoothing: Gray-box Certificates for Graph Neural Networks

1 code implementation 5 Jan 2023 Yan Scholten, Jan Schuchardt, Simon Geisler, Aleksandar Bojchevski, Stephan Günnemann

To remedy this, we propose novel gray-box certificates that exploit the message-passing principle of GNNs: We randomly intercept messages and carefully analyze the probability that messages from adversarially controlled nodes reach their target nodes.

Adversarial Robustness
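As a rough, hypothetical illustration of the core quantity behind such interception-based certificates (not the paper's actual analysis): if each forwarded message is independently deleted with probability p at every hop, a single message from a node d hops away survives with probability (1 - p)^d. A minimal sketch with illustrative function names, checked against a Monte Carlo estimate:

```python
import random

def survival_prob(p_delete: float, hops: int) -> float:
    """Probability that one message survives `hops` independent interceptions."""
    return (1.0 - p_delete) ** hops

def monte_carlo_survival(p_delete: float, hops: int, trials: int = 200_000) -> float:
    """Empirical estimate: a message survives iff it passes every hop."""
    rng = random.Random(0)
    survived = sum(
        all(rng.random() >= p_delete for _ in range(trials and hops))
        for _ in range(trials)
    )
    return survived / trials

analytic = survival_prob(0.3, hops=3)        # (1 - 0.3)^3
estimate = monte_carlo_survival(0.3, hops=3)
assert abs(analytic - estimate) < 0.01
```

The farther an adversarially controlled node is from the target, the less likely its messages reach it, which is what makes the smoothed classifier certifiable.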

Influence-Based Mini-Batching for Graph Neural Networks

no code implementations 18 Dec 2022 Johannes Gasteiger, Chendi Qian, Stephan Günnemann

Using graph neural networks for large graphs is challenging since there is no clear way of constructing mini-batches.

Graph Clustering

Invariance-Aware Randomized Smoothing Certificates

no code implementations 25 Nov 2022 Jan Schuchardt, Stephan Günnemann

Building models that comply with the invariances inherent to different domains, such as invariance under translation or rotation, is a key aspect of applying machine learning to real world problems like molecular property prediction, medical imaging, protein folding or LiDAR classification.

Molecular Property Prediction Property Prediction +1

Localized Randomized Smoothing for Collective Robustness Certification

no code implementations 28 Oct 2022 Jan Schuchardt, Tom Wollschläger, Aleksandar Bojchevski, Stephan Günnemann

We further show that this approach is beneficial for the larger class of softly local models, where each output is dependent on the entire input but assigns different levels of importance to different input regions (e.g., based on their proximity in the image).

Image Segmentation Node Classification +1

torchode: A Parallel ODE Solver for PyTorch

1 code implementation 22 Oct 2022 Marten Lienen, Stephan Günnemann

We introduce an ODE solver for the PyTorch ecosystem that can solve multiple ODEs in parallel independently from each other while achieving significant performance gains.
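torchode's actual API and adaptive solvers differ; purely as a conceptual sketch of stepping several independent ODEs in lockstep, here is a batched explicit Euler loop with illustrative names:

```python
import math

def euler_solve_batch(fs, y0s, t0, t1, n_steps):
    """Integrate a batch of independent scalar ODEs y' = f(t, y) in lockstep
    with explicit Euler. fs: list of right-hand sides, y0s: initial values."""
    h = (t1 - t0) / n_steps
    ys = list(y0s)
    t = t0
    for _ in range(n_steps):
        # One Euler step for every ODE in the batch simultaneously.
        ys = [y + h * f(t, y) for f, y in zip(fs, ys)]
        t += h
    return ys

# Two ODEs solved "in parallel": y' = y (growth) and y' = -y (decay).
sol = euler_solve_batch([lambda t, y: y, lambda t, y: -y], [1.0, 1.0], 0.0, 1.0, 10_000)
assert abs(sol[0] - math.e) < 1e-3
assert abs(sol[1] - math.exp(-1.0)) < 1e-3
```

In a real parallel solver each instance additionally gets its own adaptive step size and termination, which is where the performance gains come from.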

Irregularly-Sampled Time Series Modeling with Spline Networks

no code implementations 19 Oct 2022 Marin Biloš, Emanuel Ramneantu, Stephan Günnemann

Observations made in continuous time are often irregular and contain missing values across different channels.

Time Series Time Series Analysis

Unveiling the Sampling Density in Non-Uniform Geometric Graphs

no code implementations 15 Oct 2022 Raffaele Paolino, Aleksandar Bojchevski, Stephan Günnemann, Gitta Kutyniok, Ron Levie

A powerful framework for studying graphs is to consider them as geometric graphs: nodes are randomly sampled from an underlying metric space, and any pair of nodes is connected if their distance is less than a specified neighborhood radius.
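The construction quoted above is directly implementable; a minimal sketch (function name and the unit-square metric space are illustrative assumptions):

```python
import math
import random

def random_geometric_graph(n: int, radius: float, seed: int = 0):
    """Sample n nodes uniformly in the unit square and connect every pair
    whose Euclidean distance is below the neighborhood radius."""
    rng = random.Random(seed)
    pos = [(rng.random(), rng.random()) for _ in range(n)]
    edges = [
        (i, j)
        for i in range(n)
        for j in range(i + 1, n)
        if math.dist(pos[i], pos[j]) < radius
    ]
    return pos, edges

pos, edges = random_geometric_graph(100, radius=0.2)
# Every listed edge respects the neighborhood radius by construction.
assert all(math.dist(pos[i], pos[j]) < 0.2 for i, j in edges)
```

Non-uniform sampling densities, the paper's subject, would replace the uniform draw with an unknown density that one then tries to recover from the observed graph.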

A Systematic Evaluation of Node Embedding Robustness

1 code implementation 16 Sep 2022 Alexandru Mara, Jefrey Lijffijt, Stephan Günnemann, Tijl De Bie

We find that node classification results are impacted more than network reconstruction ones, that degree-based and label-based attacks are on average the most damaging and that label heterophily can strongly influence attack performance.

Classification Node Classification

MDE for Machine Learning-Enabled Software Systems: A Case Study and Comparison of MontiAnna & ML-Quadrat

no code implementations 15 Sep 2022 Jörg Christian Kirchhof, Evgeny Kusmenko, Jonas Ritz, Bernhard Rumpe, Armin Moin, Atta Badii, Stephan Günnemann, Moharram Challenger

In this paper, we propose to adopt the MDE paradigm for the development of Machine Learning (ML)-enabled software systems with a focus on the Internet of Things (IoT) domain.


United States Politicians' Tone Became More Negative with 2016 Primary Campaigns

1 code implementation 17 Jul 2022 Jonathan Külz, Andreas Spitz, Ahmad Abu-Akel, Stephan Günnemann, Robert West

There is a widespread belief that the tone of US political language has become more negative recently, in particular when Donald Trump entered politics.

On the Robustness and Anomaly Detection of Sparse Neural Networks

no code implementations 9 Jul 2022 Morgane Ayle, Bertrand Charpentier, John Rachwan, Daniel Zügner, Simon Geisler, Stephan Günnemann

The robustness and anomaly detection capability of neural networks are crucial topics for their safe adoption in the real-world.

Anomaly Detection

Disentangling Epistemic and Aleatoric Uncertainty in Reinforcement Learning

no code implementations 3 Jun 2022 Bertrand Charpentier, Ransalu Senanayake, Mykel Kochenderfer, Stephan Günnemann

Characterizing aleatoric and epistemic uncertainty can be used to speed up learning in a training environment, improve generalization to similar testing environments, and flag unfamiliar behavior in anomalous testing environments.

reinforcement-learning Reinforcement Learning (RL)

Sampling-free Inference for Ab-Initio Potential Energy Surface Networks

1 code implementation 30 May 2022 Nicholas Gao, Stephan Günnemann

In this work, we address the inference shortcomings by proposing the Potential learning from ab-initio Networks (PlaNet) framework, in which we simultaneously train a surrogate model in addition to the neural wave function.

Inductive Bias Numerical Integration

Predicting Cellular Responses to Novel Drug Perturbations at a Single-Cell Resolution

1 code implementation 28 Apr 2022 Leon Hetzel, Simon Böhm, Niki Kilbertus, Stephan Günnemann, Mohammad Lotfollahi, Fabian Theis

Single-cell transcriptomics enabled the study of cellular heterogeneity in response to perturbations at the resolution of individual cells.

Drug Discovery Transfer Learning

Is it all a cluster game? -- Exploring Out-of-Distribution Detection based on Clustering in the Embedding Space

no code implementations 16 Mar 2022 Poulami Sinhamahapatra, Rajat Koner, Karsten Roscher, Stephan Günnemann

It is essential for safety-critical applications of deep neural networks to determine when new inputs are significantly different from the training distribution.

Contrastive Learning Out-of-Distribution Detection +1

Differentiable DAG Sampling

1 code implementation ICLR 2022 Bertrand Charpentier, Simon Kibler, Stephan Günnemann

To this end, DP-DAG samples a DAG by successively (1) sampling a linear ordering of the nodes and (2) sampling edges consistent with the sampled linear ordering.

Variational Inference
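The two-step procedure from the abstract can be sketched directly; this is a plain (non-differentiable) version with an assumed Bernoulli edge model and illustrative names, not DP-DAG's differentiable relaxation:

```python
import random

def sample_dag(n: int, edge_prob: float = 0.5, seed: int = 0):
    """Sample a DAG by (1) drawing a random linear ordering of the nodes and
    (2) keeping each ordering-consistent edge independently with edge_prob."""
    rng = random.Random(seed)
    order = list(range(n))
    rng.shuffle(order)                       # step 1: linear ordering
    rank = {v: i for i, v in enumerate(order)}
    edges = [
        (u, v)
        for u in range(n)
        for v in range(n)
        if rank[u] < rank[v] and rng.random() < edge_prob  # step 2
    ]
    return order, edges

order, edges = sample_dag(6)
rank = {v: i for i, v in enumerate(order)}
# Acyclicity holds by construction: every edge points forward in the ordering.
assert all(rank[u] < rank[v] for u, v in edges)
```

Because every edge respects the sampled ordering, the result is acyclic by construction, which is exactly why the factorization into ordering and edges is attractive.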

Multi-Objective Model Selection for Time Series Forecasting

no code implementations 17 Feb 2022 Oliver Borchert, David Salinas, Valentin Flunkert, Tim Januschowski, Stephan Günnemann

By learning a mapping from forecasting models to performance metrics, we show that our method PARETOSELECT is able to accurately select models from the Pareto front -- alleviating the need to train or evaluate many forecasting models for model selection.

Model Selection Time Series +1

Graph Data Augmentation for Graph Machine Learning: A Survey

1 code implementation 17 Feb 2022 Tong Zhao, Wei Jin, Yozen Liu, Yingheng Wang, Gang Liu, Stephan Günnemann, Neil Shah, Meng Jiang

Overall, our work aims to clarify the landscape of existing literature in graph data augmentation and motivates additional work in this area, providing a helpful resource for researchers and practitioners in the broader graph machine learning domain.

BIG-bench Machine Learning Data Augmentation

Directional Message Passing on Molecular Graphs via Synthetic Coordinates

no code implementations NeurIPS 2021 Johannes Gasteiger, Chandan Yeshwanth, Stephan Günnemann

We furthermore set the state of the art on ZINC and coordinate-free QM9 by incorporating synthetic coordinates in the SMP and DimeNet++ models.

Molecular Property Prediction Property Prediction

Robustness of Graph Neural Networks at Scale

1 code implementation NeurIPS 2021 Simon Geisler, Tobias Schmidt, Hakan Şirin, Daniel Zügner, Aleksandar Bojchevski, Stephan Günnemann

Graph Neural Networks (GNNs) are increasingly important given their popularity and the diversity of applications.

Intriguing Properties of Input-dependent Randomized Smoothing

no code implementations 11 Oct 2021 Peter Súkeník, Aleksei Kuvshinov, Stephan Günnemann

We show that in general, the input-dependent smoothing suffers from the curse of dimensionality, forcing the variance function to have low semi-elasticity.


Provably Robust Transfer

no code implementations 29 Sep 2021 Anna-Kathrin Kopetzki, Jana Obernosterer, Aleksandar Bojchevski, Stephan Günnemann

Our experiments show how adversarial training on the source domain affects robustness on source and target domain, and we propose the first provably robust transfer learning models.

Adversarial Robustness Transfer Learning

Locality-Based Mini Batching for Graph Neural Networks

no code implementations 29 Sep 2021 Johannes Klicpera, Chendi Qian, Stephan Günnemann

Training graph neural networks on large graphs is challenging since there is no clear way to extract mini-batches from connected data.

A Study of Joint Graph Inference and Forecasting

no code implementations 10 Sep 2021 Daniel Zügner, François-Xavier Aubet, Victor Garcia Satorras, Tim Januschowski, Stephan Günnemann, Jan Gasthaus

We study a recent class of models which uses graph neural networks (GNNs) to improve forecasting in multivariate time series.

Graph Learning Time Series +1

On Second-order Optimization Methods for Federated Learning

no code implementations 6 Sep 2021 Sebastian Bischoff, Stephan Günnemann, Martin Jaggi, Sebastian U. Stich

We consider federated learning (FL), where the training data is distributed across a large number of clients.

Federated Learning Specificity

Whole Brain Vessel Graphs: A Dataset and Benchmark for Graph Learning and Neuroscience (VesselGraph)

1 code implementation 30 Aug 2021 Johannes C. Paetzold, Julian McGinnis, Suprosanna Shit, Ivan Ezhov, Paul Büschl, Chinmay Prabhakar, Mihail I. Todorov, Anjany Sekuboyina, Georgios Kaissis, Ali Ertürk, Stephan Günnemann, Bjoern H. Menze

Moreover, we benchmark numerous state-of-the-art graph learning algorithms on the biologically relevant tasks of vessel prediction and vessel classification using the introduced vessel graph dataset.

Graph Learning

OODformer: Out-Of-Distribution Detection Transformer

1 code implementation 19 Jul 2021 Rajat Koner, Poulami Sinhamahapatra, Karsten Roscher, Stephan Günnemann, Volker Tresp

A serious problem in image classification is that a trained model might perform well for input data that originates from the same distribution as the data available for model training, but performs much worse for out-of-distribution (OOD) samples.

Contrastive Learning Out-of-Distribution Detection +1

Scalable Optimal Transport in High Dimensions for Graph Distances, Embedding Alignment, and More

no code implementations 14 Jul 2021 Johannes Gasteiger, Marten Lienen, Stephan Günnemann

The current best practice for computing optimal transport (OT) is via entropy regularization and Sinkhorn iterations.

MDE4QAI: Towards Model-Driven Engineering for Quantum Artificial Intelligence

no code implementations 14 Jul 2021 Armin Moin, Moharram Challenger, Atta Badii, Stephan Günnemann

Over the past decade, Artificial Intelligence (AI) has provided enormous new possibilities and opportunities, but also new demands and requirements for software systems.

Code Generation

Graphhopper: Multi-Hop Scene Graph Reasoning for Visual Question Answering

1 code implementation 13 Jul 2021 Rajat Koner, Hang Li, Marcel Hildebrandt, Deepan Das, Volker Tresp, Stephan Günnemann

We conduct an experimental study on the challenging dataset GQA, based on both manually curated and automatically generated scene graphs.

Navigate Question Answering +1

ML-Quadrat & DriotData: A Model-Driven Engineering Tool and a Low-Code Platform for Smart IoT Services

1 code implementation 6 Jul 2021 Armin Moin, Andrei Mituca, Moharram Challenger, Atta Badii, Stephan Günnemann

In this paper, we present ML-Quadrat, an open-source research prototype that is based on the Eclipse Modeling Framework (EMF) and the state of the art in the literature of Model-Driven Software Engineering (MDSE) for smart Cyber-Physical Systems (CPS) and the Internet of Things (IoT).

A Model-Driven Approach to Machine Learning and Software Modeling for the IoT

1 code implementation 6 Jul 2021 Armin Moin, Moharram Challenger, Atta Badii, Stephan Günnemann

In particular, we implement the proposed approach, called ML-Quadrat, based on ThingML, and validate it using a case study from the IoT domain, as well as through an empirical user evaluation.

BIG-bench Machine Learning Decision Making

Supporting AI Engineering on the IoT Edge through Model-Driven TinyML

no code implementations 6 Jul 2021 Armin Moin, Moharram Challenger, Atta Badii, Stephan Günnemann

We focus on a sub-discipline of AI, namely Machine Learning (ML) and propose the delegation of data analytics and ML to the IoT edge.

On Out-of-distribution Detection with Energy-based Models

1 code implementation 3 Jul 2021 Sven Elflein, Bertrand Charpentier, Daniel Zügner, Stephan Günnemann

Several density estimation methods have shown to fail to detect out-of-distribution (OOD) samples by assigning higher likelihoods to anomalous data.

Density Estimation Out-of-Distribution Detection +1

GemNet: Universal Directional Graph Neural Networks for Molecules

3 code implementations NeurIPS 2021 Johannes Gasteiger, Florian Becker, Stephan Günnemann

Effectively predicting molecular interactions has the potential to accelerate molecular dynamics by multiple orders of magnitude and thus revolutionize chemical simulations.


Neural Temporal Point Processes: A Review

no code implementations 8 Apr 2021 Oleksandr Shchur, Ali Caner Türkmen, Tim Januschowski, Stephan Günnemann

Temporal point processes (TPP) are probabilistic generative models for continuous-time event sequences.

Point Processes
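The simplest non-neural TPP, a homogeneous Poisson process, already shows the generative structure these models build on: event times are accumulated i.i.d. exponential inter-event times. A minimal sketch (function name is illustrative):

```python
import random

def sample_poisson_process(rate: float, t_max: float, seed: int = 0):
    """Sample event times of a homogeneous Poisson process on [0, t_max]
    by accumulating i.i.d. Exponential(rate) inter-event times."""
    rng = random.Random(seed)
    times, t = [], 0.0
    while True:
        t += rng.expovariate(rate)   # next inter-event gap
        if t > t_max:
            return times
        times.append(t)

events = sample_poisson_process(rate=2.0, t_max=1000.0)
# With rate 2 over 1000 time units we expect roughly 2000 events.
assert 1800 < len(events) < 2200
assert all(a < b for a, b in zip(events, events[1:]))
```

Neural TPPs replace the constant rate with a history-dependent conditional intensity (or model the inter-event distribution directly), but sampling still proceeds event by event in continuous time.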

Collective Robustness Certificates

no code implementations ICLR 2021 Jan Schuchardt, Aleksandar Bojchevski, Johannes Klicpera, Stephan Günnemann

In tasks like node classification, image segmentation, and named-entity recognition we have a classifier that simultaneously outputs multiple predictions (a vector of labels) based on a single input, i.e., a single graph, image, or document respectively.

Adversarial Robustness Image Segmentation +5

Deep Rao-Blackwellised Particle Filters for Time Series Forecasting

no code implementations NeurIPS 2020 Richard Kurle, Syama Sundar Rangapuram, Emmanuel de Bézenac, Stephan Günnemann, Jan Gasthaus

We propose a Monte Carlo objective that leverages the conditional linearity by computing the corresponding conditional expectations in closed-form and a suitable proposal distribution that is factorised similarly to the optimal proposal distribution.

Time Series Time Series Forecasting

Reliable Graph Neural Networks via Robust Aggregation

1 code implementation NeurIPS 2020 Simon Geisler, Daniel Zügner, Stephan Günnemann

Perturbations targeting the graph structure have proven to be extremely effective in reducing the performance of Graph Neural Networks (GNNs), and traditional defenses such as adversarial training do not seem to be able to improve robustness.

Scalable Normalizing Flows for Permutation Invariant Densities

no code implementations 7 Oct 2020 Marin Biloš, Stephan Günnemann

Modeling sets is an important problem in machine learning since this type of data can be found in many domains.

Point Processes

Equivariant Normalizing Flows for Point Processes and Sets

no code implementations 28 Sep 2020 Marin Biloš, Stephan Günnemann

To model this behavior, it is enough to transform the samples from the uniform process with a sufficiently complex equivariant function.

Point Processes

ThingML+: Augmenting Model-Driven Software Engineering for the Internet of Things with Machine Learning

1 code implementation 22 Sep 2020 Armin Moin, Stephan Rössler, Stephan Günnemann

In this paper, we present the current position of the research project ML-Quadrat, which aims to extend the methodology, modeling language and tool support of ThingML - an open source modeling tool for IoT/CPS - to address Machine Learning needs for the IoT applications.

BIG-bench Machine Learning Code Generation

From Things' Modeling Language (ThingML) to Things' Machine Learning (ThingML2)

1 code implementation 22 Sep 2020 Armin Moin, Stephan Rössler, Marouane Sayih, Stephan Günnemann

In this paper, we illustrate how to enhance an existing state-of-the-art modeling language and tool for the Internet of Things (IoT), called ThingML, to support machine learning on the modeling level.

BIG-bench Machine Learning Code Generation

Efficient Robustness Certificates for Discrete Data: Sparsity-Aware Randomized Smoothing for Graphs, Images and More

no code implementations ICML 2020 Aleksandar Bojchevski, Johannes Gasteiger, Stephan Günnemann

Existing techniques for certifying the robustness of models for discrete data either work only for a small class of models or are general at the expense of efficiency or tightness.

Deep Representation Learning and Clustering of Traffic Scenarios

no code implementations 15 Jul 2020 Nick Harmening, Marin Biloš, Stephan Günnemann

Determining the traffic scenario space is a major challenge for the homologation and coverage assessment of automated driving functions.

Clustering Representation Learning +1

Scene Graph Reasoning for Visual Question Answering

no code implementations 2 Jul 2020 Marcel Hildebrandt, Hang Li, Rajat Koner, Volker Tresp, Stephan Günnemann

We propose a novel method that approaches the task by performing context-driven, sequential reasoning based on the objects and their semantic and spatial relationships present in the scene.

Navigate Question Answering +1

Fast and Flexible Temporal Point Processes with Triangular Maps

1 code implementation NeurIPS 2020 Oleksandr Shchur, Nicholas Gao, Marin Biloš, Stephan Günnemann

Temporal point process (TPP) models combined with recurrent neural networks provide a powerful framework for modeling continuous-time event data.

Point Processes Variational Inference

Posterior Network: Uncertainty Estimation without OOD Samples via Density-Based Pseudo-Counts

1 code implementation NeurIPS 2020 Bertrand Charpentier, Daniel Zügner, Stephan Günnemann

The posterior distributions learned by PostNet accurately reflect uncertainty for in- and out-of-distribution data -- without requiring access to OOD data at training time.

Out of Distribution (OOD) Detection

Continual Learning with Bayesian Neural Networks for Non-Stationary Data

no code implementations ICLR 2020 Richard Kurle, Botond Cseke, Alexej Klushyn, Patrick van der Smagt, Stephan Günnemann

We represent the posterior approximation of the network weights by a diagonal Gaussian distribution and a complementary memory of raw data.

Continual Learning

Graph Hawkes Neural Network for Forecasting on Temporal Knowledge Graphs

1 code implementation AKBC 2020 Zhen Han, Yunpu Ma, Yuyi Wang, Stephan Günnemann, Volker Tresp

The Hawkes process has become a standard method for modeling self-exciting event sequences with different event types.

Knowledge Graphs

Certifiable Robustness to Graph Perturbations

1 code implementation NeurIPS 2019 Aleksandar Bojchevski, Stephan Günnemann

Despite the exploding interest in graph neural networks there has been little effort to verify and improve their robustness.

Diffusion Improves Graph Learning

2 code implementations NeurIPS 2019 Johannes Gasteiger, Stefan Weißenberger, Stephan Günnemann

In this work, we remove the restriction of using only the direct neighbors by introducing a powerful, yet spatially localized graph convolution: Graph diffusion convolution (GDC).

Clustering Graph Learning +1
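One standard diffusion used in this line of work is personalized PageRank, S = Σ_k α(1−α)^k T^k with T a random-walk transition matrix. A truncated pure-Python sketch (row-stochastic T and the cutoff k_max are illustrative choices, not GDC's exact implementation):

```python
def matmul(a, b):
    """Naive dense matrix product for small matrices."""
    n, m, p = len(a), len(b), len(b[0])
    return [[sum(a[i][k] * b[k][j] for k in range(m)) for j in range(p)] for i in range(n)]

def ppr_diffusion(adj, alpha=0.15, k_max=100):
    """Truncated PPR diffusion S = sum_{k>=0} alpha*(1-alpha)^k * T^k,
    where T is the row-normalized random-walk transition matrix of adj."""
    n = len(adj)
    deg = [sum(row) for row in adj]
    T = [[adj[i][j] / deg[i] for j in range(n)] for i in range(n)]
    S = [[0.0] * n for _ in range(n)]
    Tk = [[float(i == j) for j in range(n)] for i in range(n)]  # T^0 = I
    for k in range(k_max):
        coeff = alpha * (1.0 - alpha) ** k
        for i in range(n):
            for j in range(n):
                S[i][j] += coeff * Tk[i][j]
        Tk = matmul(Tk, T)
    return S

# Triangle graph: each row of S is a probability distribution over nodes.
adj = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
S = ppr_diffusion(adj)
assert all(abs(sum(row) - 1.0) < 1e-4 for row in S)
```

The dense diffusion matrix S then replaces the sparse adjacency in message passing, letting each convolution aggregate beyond direct neighbors.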

Intensity-Free Learning of Temporal Point Processes

3 code implementations ICLR 2020 Oleksandr Shchur, Marin Biloš, Stephan Günnemann

The standard way of learning in such models is by estimating the conditional intensity function.

Point Processes

Certifiable Robustness and Robust Training for Graph Convolutional Networks

1 code implementation 28 Jun 2019 Daniel Zügner, Stephan Günnemann

Recent works show that Graph Neural Networks (GNNs) are highly non-robust with respect to adversarial attacks on both the graph structure and the node attributes, making their outcomes unreliable.

Node Classification

Adversarial Attacks on Node Embeddings

no code implementations ICLR 2019 Aleksandar Bojchevski, Stephan Günnemann

The goal of network representation learning is to learn low-dimensional node embeddings that capture the graph structure and are useful for solving downstream tasks.

Representation Learning

Pitfalls of Graph Neural Network Evaluation

2 code implementations 14 Nov 2018 Oleksandr Shchur, Maximilian Mumme, Aleksandar Bojchevski, Stephan Günnemann

We perform a thorough empirical evaluation of four prominent GNN models and show that considering different splits of the data leads to dramatically different rankings of models.

Graph Mining Node Classification

Multi-Source Neural Variational Inference

no code implementations 11 Nov 2018 Richard Kurle, Stephan Günnemann, Patrick van der Smagt

Learning from multiple sources of information is an important problem in machine-learning research.

Variational Inference

Mining Contrasting Quasi-Clique Patterns

no code implementations 3 Oct 2018 Roberto Alonso, Stephan Günnemann

Mining dense quasi-cliques is a well-known clustering task with applications ranging from social networks over collaboration graphs to document analysis.


Adversarial Attacks on Node Embeddings via Graph Poisoning

1 code implementation ICLR 2019 Aleksandar Bojchevski, Stephan Günnemann

The goal of network representation learning is to learn low-dimensional node embeddings that capture the graph structure and are useful for solving downstream tasks.

Representation Learning

Dual-Primal Graph Convolutional Networks

no code implementations 3 Jun 2018 Federico Monti, Oleksandr Shchur, Aleksandar Bojchevski, Or Litany, Stephan Günnemann, Michael M. Bronstein

In recent years, there has been a surge of interest in developing deep learning methods for non-Euclidean structured data such as graphs.

Graph Attention Recommendation Systems

Adversarial Attacks on Neural Networks for Graph Data

1 code implementation 21 May 2018 Daniel Zügner, Amir Akbarnejad, Stephan Günnemann

Even more, our attacks are transferable: the learned attacks generalize to other state-of-the-art node classification models and unsupervised approaches, and likewise are successful even when only limited knowledge about the graph is given.

General Classification Node Classification

NetGAN: Generating Graphs via Random Walks

1 code implementation ICML 2018 Aleksandar Bojchevski, Oleksandr Shchur, Daniel Zügner, Stephan Günnemann

NetGAN is able to produce graphs that exhibit well-known network patterns without explicitly specifying them in the model definition.

Graph Generation Link Prediction

Deep Gaussian Embedding of Graphs: Unsupervised Inductive Learning via Ranking

1 code implementation ICLR 2018 Aleksandar Bojchevski, Stephan Günnemann

We propose Graph2Gauss - an approach that can efficiently learn versatile node embeddings on large scale (attributed) graphs that show strong performance on tasks such as link prediction and node classification.

Link Prediction Network Embedding +1

Linearized and Single-Pass Belief Propagation

1 code implementation 27 Jun 2014 Wolfgang Gatterbauer, Stephan Günnemann, Danai Koutra, Christos Faloutsos

Often, we can answer such questions and label nodes in a network based on the labels of their neighbors and appropriate assumptions of homophily ("birds of a feather flock together") or heterophily ("opposites attract").
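The homophily/heterophily idea can be sketched as a toy linearized propagation (a simplification, not the paper's actual single-pass update; names and the coupling constant are illustrative): each node's belief is its prior plus a coupling-weighted sum of its neighbors' beliefs, with the coupling's sign encoding the assumption.

```python
def propagate(adj, prior, coupling=0.1, n_iter=50):
    """Toy linearized propagation: b <- prior + coupling * A @ b, iterated.
    coupling > 0 encodes homophily ("birds of a feather flock together"),
    coupling < 0 encodes heterophily ("opposites attract")."""
    n = len(prior)
    b = list(prior)
    for _ in range(n_iter):
        b = [prior[i] + coupling * sum(adj[i][j] * b[j] for j in range(n))
             for i in range(n)]
    return b

# Path graph 0 - 1 - 2 with only node 0 labeled +1.
adj = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
beliefs = propagate(adj, prior=[1.0, 0.0, 0.0])
assert beliefs[1] > 0 and beliefs[2] > 0        # homophily: neighbors agree

beliefs_het = propagate(adj, prior=[1.0, 0.0, 0.0], coupling=-0.1)
assert beliefs_het[1] < 0 and beliefs_het[2] > 0  # opposite of my opposite agrees
```

With a small coupling the iteration converges, and the heterophily case shows the characteristic alternation: the direct neighbor disagrees with the labeled node while the two-hop neighbor agrees.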
