no code implementations • 21 Jun 2022 • John Rachwan, Daniel Zügner, Bertrand Charpentier, Simon Geisler, Morgane Ayle, Stephan Günnemann
Pruning, the task of sparsifying deep neural networks, has received increasing attention recently.
no code implementations • 3 Jun 2022 • Bertrand Charpentier, Ransalu Senanayake, Mykel Kochenderfer, Stephan Günnemann
Characterizing aleatoric and epistemic uncertainty can be used to speed up learning in a training environment, improve generalization to similar testing environments, and flag unfamiliar behavior in anomalous testing environments.
no code implementations • 30 May 2022 • Nicholas Gao, Stephan Günnemann
In recent work, the potential energy surface network (PESNet) has been proposed to reduce training time by solving the Schrödinger equation for many geometries simultaneously.
1 code implementation • 28 Apr 2022 • Leon Hetzel, Simon Böhm, Niki Kilbertus, Stephan Günnemann, Mohammad Lotfollahi, Fabian Theis
Single-cell transcriptomics has enabled the study of cellular heterogeneity in response to perturbations at the resolution of individual cells.
no code implementations • 6 Apr 2022 • Johannes Gasteiger, Muhammed Shuaibi, Anuroop Sriram, Stephan Günnemann, Zachary Ulissi, C. Lawrence Zitnick, Abhishek Das
Based on this analysis, we identify a smaller dataset that correlates well with the full OC20 dataset, and propose the GemNet-OC model, which outperforms the previous state-of-the-art on OC20 by 16%, while reducing training time by a factor of 10.
Ranked #1 on Initial Structure to Relaxed Energy (IS2RE) on OC20
1 code implementation • ICLR 2022 • Marten Lienen, Stephan Günnemann
We propose a new method for spatio-temporal forecasting on arbitrarily distributed points.
no code implementations • ICLR 2022 • Bertrand Charpentier, Simon Kibler, Stephan Günnemann
To this end, DP-DAG samples a DAG by successively (1) sampling a linear ordering of the nodes and (2) sampling edges consistent with the sampled linear ordering.
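As a rough illustration of this two-step scheme (a non-differentiable analogue only, not the DP-DAG parameterization itself, which uses differentiable distributions for both steps):

```python
import numpy as np

def sample_dag(num_nodes, edge_prob=0.5, rng=None):
    """Sample a random DAG: first a linear ordering, then edges consistent with it."""
    rng = np.random.default_rng() if rng is None else rng
    order = rng.permutation(num_nodes)           # step (1): linear ordering of the nodes
    adj = np.zeros((num_nodes, num_nodes), dtype=int)
    for i in range(num_nodes):
        for j in range(i + 1, num_nodes):
            # step (2): edges may only go from earlier to later nodes in the ordering,
            # which guarantees acyclicity
            if rng.random() < edge_prob:
                adj[order[i], order[j]] = 1
    return adj

print(sample_dag(5))
```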
no code implementations • 16 Mar 2022 • Poulami Sinhamahapatra, Rajat Koner, Karsten Roscher, Stephan Günnemann
It is essential for safety-critical applications of deep neural networks to determine when new inputs are significantly different from the training distribution.
no code implementations • 6 Mar 2022 • Armin Moin, Ukrit Wattanavaekin, Alexandra Lungu, Moharram Challenger, Atta Badii, Stephan Günnemann
Developing smart software services requires both Software Engineering and Artificial Intelligence (AI) skills.
1 code implementation • 17 Feb 2022 • Tong Zhao, Gang Liu, Stephan Günnemann, Meng Jiang
In this paper, we present a comprehensive and systematic survey of graph data augmentation that summarizes the literature in a structured manner.
no code implementations • 17 Feb 2022 • Oliver Borchert, David Salinas, Valentin Flunkert, Tim Januschowski, Stephan Günnemann
By learning a mapping from forecasting models to performance metrics, we show that our method PARETOSELECT is able to accurately select models from the Pareto front -- alleviating the need to train or evaluate many forecasting models for model selection.
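For the selection step, a minimal sketch of extracting the non-dominated (Pareto-optimal) models from a table of predicted metrics, assuming lower is better for every metric; names and numbers are illustrative and not from the paper:

```python
import numpy as np

def pareto_front(metrics):
    """Return indices of non-dominated rows; metrics has shape (n_models, n_metrics), lower is better."""
    n = metrics.shape[0]
    keep = []
    for i in range(n):
        dominated = np.any(
            np.all(metrics <= metrics[i], axis=1) & np.any(metrics < metrics[i], axis=1)
        )
        if not dominated:
            keep.append(i)
    return keep

scores = np.array([[0.9, 12.0], [1.1, 8.0], [1.2, 15.0]])  # e.g. (error, latency) per model
print(pareto_front(scores))  # -> [0, 1]; the third model is dominated by the second
```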
no code implementations • NeurIPS 2021 • Johannes Gasteiger, Chandan Yeshwanth, Stephan Günnemann
We furthermore set the state of the art on ZINC and coordinate-free QM9 by incorporating synthetic coordinates in the SMP and DimeNet++ models.
1 code implementation • NeurIPS 2021 • Simon Geisler, Tobias Schmidt, Hakan Şirin, Daniel Zügner, Aleksandar Bojchevski, Stephan Günnemann
Graph Neural Networks (GNNs) are increasingly important given their popularity and the diversity of applications.
1 code implementation • NeurIPS 2021 • Maximilian Stadler, Bertrand Charpentier, Simon Geisler, Daniel Zügner, Stephan Günnemann
GPN outperforms existing approaches for uncertainty estimation in the experiments.
1 code implementation • NeurIPS 2021 • Marin Biloš, Johanna Sommer, Syama Sundar Rangapuram, Tim Januschowski, Stephan Günnemann
Neural ordinary differential equations describe how values change in time.
no code implementations • ICLR 2022 • Simon Geisler, Johanna Sommer, Jan Schuchardt, Aleksandar Bojchevski, Stephan Günnemann
Specifically, most datasets only capture a simpler subproblem and likely suffer from spurious features.
no code implementations • 11 Oct 2021 • Peter Súkeník, Aleksei Kuvshinov, Stephan Günnemann
We show that in general, the input-dependent smoothing suffers from the curse of dimensionality, forcing the variance function to have low semi-elasticity.
1 code implementation • ICLR 2022 • Nicholas Gao, Stephan Günnemann
Solving the Schrödinger equation is key to many quantum mechanical properties.
1 code implementation • NeurIPS Workshop AI4Science 2021 • Hannes Stärk, Dominique Beaini, Gabriele Corso, Prudencio Tossou, Christian Dallago, Stephan Günnemann, Pietro Liò
Molecular property prediction is one of the fastest-growing applications of deep learning with critical real-world impacts.
no code implementations • 29 Sep 2021 • Jan Schuchardt, Tom Wollschläger, Aleksandar Bojchevski, Stephan Günnemann
Models for image segmentation, node classification and many other tasks map a single input to multiple labels.
no code implementations • 29 Sep 2021 • Hannes Stärk, Dominique Beaini, Gabriele Corso, Prudencio Tossou, Christian Dallago, Stephan Günnemann, Pietro Lio
Molecular property prediction is one of the fastest-growing applications of deep learning with critical real-world impacts.
no code implementations • ICLR 2022 • Daniel Zügner, Bertrand Charpentier, Morgane Ayle, Sascha Geringer, Stephan Günnemann
We propose a novel probabilistic model over hierarchies on graphs obtained by continuous relaxation of tree-based hierarchies.
no code implementations • 29 Sep 2021 • Johannes Klicpera, Chendi Qian, Stephan Günnemann
Training graph neural networks on large graphs is challenging since there is no clear way to extract mini-batches from connected data.
no code implementations • 29 Sep 2021 • Anna-Kathrin Kopetzki, Jana Obernosterer, Aleksandar Bojchevski, Stephan Günnemann
Our experiments show how adversarial training on the source domain affects robustness on source and target domain, and we propose the first provably robust transfer learning models.
no code implementations • 10 Sep 2021 • Daniel Zügner, François-Xavier Aubet, Victor Garcia Satorras, Tim Januschowski, Stephan Günnemann, Jan Gasthaus
We study a recent class of models which uses graph neural networks (GNNs) to improve forecasting in multivariate time series.
no code implementations • 6 Sep 2021 • Sebastian Bischoff, Stephan Günnemann, Martin Jaggi, Sebastian U. Stich
We consider federated learning (FL), where the training data is distributed across a large number of clients.
1 code implementation • 30 Aug 2021 • Johannes C. Paetzold, Julian McGinnis, Suprosanna Shit, Ivan Ezhov, Paul Büschl, Chinmay Prabhakar, Mihail I. Todorov, Anjany Sekuboyina, Georgios Kaissis, Ali Ertürk, Stephan Günnemann, Bjoern H. Menze
Moreover, we benchmark numerous state-of-the-art graph learning algorithms on the biologically relevant tasks of vessel prediction and vessel classification using the introduced vessel graph dataset.
1 code implementation • 19 Jul 2021 • Rajat Koner, Poulami Sinhamahapatra, Karsten Roscher, Stephan Günnemann, Volker Tresp
A serious problem in image classification is that a trained model might perform well on input data that originates from the same distribution as the data available for model training, but much worse on out-of-distribution (OOD) samples.
no code implementations • 14 Jul 2021 • Johannes Gasteiger, Marten Lienen, Stephan Günnemann
The current best practice for computing optimal transport (OT) is via entropy regularization and Sinkhorn iterations.
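For reference, a minimal NumPy-only sketch of this standard approach, entropy-regularized OT solved by Sinkhorn iterations (not the method proposed in the paper):

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iters=200):
    """Entropy-regularized OT between histograms a, b with cost matrix C."""
    K = np.exp(-C / eps)                  # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iters):              # alternating scaling (Sinkhorn) updates
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]       # transport plan
    return P, np.sum(P * C)               # plan and transport cost

a = np.array([0.5, 0.5]); b = np.array([0.5, 0.5])
C = np.array([[0.0, 1.0], [1.0, 0.0]])
print(sinkhorn(a, b, C)[1])               # close to 0 for this cost matrix
```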
no code implementations • 14 Jul 2021 • Armin Moin, Moharram Challenger, Atta Badii, Stephan Günnemann
Over the past decade, Artificial Intelligence (AI) has provided enormous new possibilities and opportunities, but also new demands and requirements for software systems.
1 code implementation • 13 Jul 2021 • Rajat Koner, Hang Li, Marcel Hildebrandt, Deepan Das, Volker Tresp, Stephan Günnemann
We conduct an experimental study on the challenging dataset GQA, based on both manually curated and automatically generated scene graphs.
no code implementations • 6 Jul 2021 • Armin Moin, Moharram Challenger, Atta Badii, Stephan Günnemann
We focus on a sub-discipline of AI, namely Machine Learning (ML), and propose the delegation of data analytics and ML to the IoT edge.
no code implementations • 6 Jul 2021 • Armin Moin, Andrei Mituca, Moharram Challenger, Atta Badii, Stephan Günnemann
In this paper, we present ML-Quadrat, an open-source research prototype that is based on the Eclipse Modeling Framework (EMF) and the state of the art in the literature of Model-Driven Software Engineering (MDSE) for smart Cyber-Physical Systems (CPS) and the Internet of Things (IoT).
1 code implementation • 6 Jul 2021 • Armin Moin, Moharram Challenger, Atta Badii, Stephan Günnemann
In particular, we implement the proposed approach, called ML-Quadrat, based on ThingML, and validate it using a case study from the IoT domain, as well as through an empirical user evaluation.
1 code implementation • 3 Jul 2021 • Sven Elflein, Bertrand Charpentier, Daniel Zügner, Stephan Günnemann
Several density estimation methods have been shown to fail to detect out-of-distribution (OOD) samples by assigning higher likelihoods to anomalous data.
no code implementations • NeurIPS 2021 • Oleksandr Shchur, Ali Caner Türkmen, Tim Januschowski, Jan Gasthaus, Stephan Günnemann
Automatically detecting anomalies in event data can provide substantial value in domains such as healthcare, DevOps, and information security.
2 code implementations • NeurIPS 2021 • Johannes Gasteiger, Florian Becker, Stephan Günnemann
Effectively predicting molecular interactions has the potential to accelerate molecular dynamics by multiple orders of magnitude and thus revolutionize chemical simulations.
1 code implementation • ICLR 2022 • Bertrand Charpentier, Oliver Borchert, Daniel Zügner, Simon Geisler, Stephan Günnemann
Uncertainty awareness is crucial to develop reliable machine learning models.
no code implementations • 8 Apr 2021 • Oleksandr Shchur, Ali Caner Türkmen, Tim Januschowski, Stephan Günnemann
Temporal point processes (TPP) are probabilistic generative models for continuous-time event sequences.
2 code implementations • ICLR 2021 • Daniel Zügner, Tobias Kirschstein, Michele Catasta, Jure Leskovec, Stephan Günnemann
Source code (Context) and its parsed abstract syntax tree (AST; Structure) are two complementary representations of the same computer program.
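As a simple illustration of these two views (not tied to the paper's model), Python's built-in ast module parses source text into its abstract syntax tree:

```python
import ast

source = "def add(a, b):\n    return a + b\n"   # the Context view: raw source code
tree = ast.parse(source)                          # the Structure view: its abstract syntax tree
print(ast.dump(tree, indent=2))                   # Python 3.9+ for indent; shows FunctionDef -> Return -> BinOp(Add)
```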
no code implementations • ICLR 2021 • Jan Schuchardt, Aleksandar Bojchevski, Johannes Klicpera, Stephan Günnemann
In tasks like node classification, image segmentation, and named-entity recognition we have a classifier that simultaneously outputs multiple predictions (a vector of labels) based on a single input, i.e., a single graph, image, or document, respectively.
no code implementations • 1 Jan 2021 • Johannes Klicpera, Marten Lienen, Stephan Günnemann
Optimal transport (OT) is a cornerstone of many machine learning tasks.
no code implementations • NeurIPS 2020 • Richard Kurle, Syama Sundar Rangapuram, Emmanuel de Bézenac, Stephan Günnemann, Jan Gasthaus
We propose a Monte Carlo objective that leverages the conditional linearity by computing the corresponding conditional expectations in closed form, together with a suitable proposal distribution that is factorised similarly to the optimal proposal distribution.
3 code implementations • 28 Nov 2020 • Johannes Gasteiger, Shankari Giri, Johannes T. Margraf, Stephan Günnemann
Many important tasks in chemistry revolve around molecules during reactions.
1 code implementation • NeurIPS 2020 • Simon Geisler, Daniel Zügner, Stephan Günnemann
Perturbations targeting the graph structure have proven to be extremely effective in reducing the performance of Graph Neural Networks (GNNs), and traditional defenses such as adversarial training do not seem to be able to improve robustness.
1 code implementation • 28 Oct 2020 • Anna-Kathrin Kopetzki, Bertrand Charpentier, Daniel Zügner, Sandhya Giri, Stephan Günnemann
Dirichlet-based uncertainty (DBU) models are a recent and promising class of uncertainty-aware models.
no code implementations • 7 Oct 2020 • Marin Biloš, Stephan Günnemann
Modeling sets is an important problem in machine learning since this type of data can be found in many domains.
no code implementations • 28 Sep 2020 • Marin Biloš, Stephan Günnemann
To model this behavior, it is enough to transform the samples from the uniform process with a sufficiently complex equivariant function.
1 code implementation • 22 Sep 2020 • Armin Moin, Stephan Rössler, Marouane Sayih, Stephan Günnemann
In this paper, we illustrate how to enhance an existing state-of-the-art modeling language and tool for the Internet of Things (IoT), called ThingML, to support machine learning on the modeling level.
1 code implementation • 22 Sep 2020 • Armin Moin, Stephan Rössler, Stephan Günnemann
In this paper, we present the current position of the research project ML-Quadrat, which aims to extend the methodology, modeling language and tool support of ThingML - an open source modeling tool for IoT/CPS - to address the Machine Learning needs of IoT applications.
no code implementations • ICML 2020 • Aleksandar Bojchevski, Johannes Klicpera, Stephan Günnemann
Existing techniques for certifying the robustness of models for discrete data either work only for a small class of models or are general at the expense of efficiency or tightness.
no code implementations • 28 Jul 2020 • Anna-Kathrin Kopetzki, Stephan Günnemann
This principle is highly versatile, as we show.
no code implementations • 15 Jul 2020 • Nick Harmening, Marin Biloš, Stephan Günnemann
Determining the traffic scenario space is a major challenge for the homologation and coverage assessment of automated driving functions.
2 code implementations • 3 Jul 2020 • Aleksandar Bojchevski, Johannes Gasteiger, Bryan Perozzi, Amol Kapoor, Martin Blais, Benedek Rózemberczki, Michal Lukasik, Stephan Günnemann
Graph neural networks (GNNs) have emerged as a powerful approach for solving many network mining tasks.
no code implementations • 2 Jul 2020 • Marcel Hildebrandt, Hang Li, Rajat Koner, Volker Tresp, Stephan Günnemann
We propose a novel method that approaches the task by performing context-driven, sequential reasoning based on the objects and their semantic and spatial relationships present in the scene.
1 code implementation • NeurIPS 2020 • Oleksandr Shchur, Nicholas Gao, Marin Biloš, Stephan Günnemann
Temporal point process (TPP) models combined with recurrent neural networks provide a powerful framework for modeling continuous-time event data.
1 code implementation • NeurIPS 2020 • Bertrand Charpentier, Daniel Zügner, Stephan Günnemann
The posterior distributions learned by PostNet accurately reflect uncertainty for in- and out-of-distribution data -- without requiring access to OOD data at training time.
no code implementations • ICLR 2020 • Richard Kurle, Botond Cseke, Alexej Klushyn, Patrick van der Smagt, Stephan Günnemann
We represent the posterior approximation of the network weights by a diagonal Gaussian distribution and a complementary memory of raw data.
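A minimal sketch of such a diagonal Gaussian approximate posterior over a weight vector, with a reparameterized sample; illustrative only, and the complementary memory of raw data is not shown:

```python
import torch

class DiagonalGaussianPosterior(torch.nn.Module):
    """q(w) = N(mu, diag(sigma^2)) with a reparameterized sample."""
    def __init__(self, num_weights):
        super().__init__()
        self.mu = torch.nn.Parameter(torch.zeros(num_weights))
        self.log_sigma = torch.nn.Parameter(torch.full((num_weights,), -3.0))

    def rsample(self):
        eps = torch.randn_like(self.mu)             # reparameterization trick
        return self.mu + torch.exp(self.log_sigma) * eps

q = DiagonalGaussianPosterior(num_weights=10)
w = q.rsample()                                     # a weight sample usable in the forward pass
print(w.shape)
```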
1 code implementation • AKBC 2020 • Zhen Han, Yunpu Ma, Yuyi Wang, Stephan Günnemann, Volker Tresp
The Hawkes process has become a standard method for modeling self-exciting event sequences with different event types.
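For context, the textbook univariate Hawkes process with exponential kernel has conditional intensity lambda(t) = mu + sum over past events t_i < t of alpha * exp(-beta * (t - t_i)); a minimal sketch of evaluating it (standard background, not the model introduced in the paper):

```python
import numpy as np

def hawkes_intensity(t, history, mu=0.2, alpha=0.8, beta=1.0):
    """lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i))."""
    history = np.asarray(history)
    past = history[history < t]
    return mu + np.sum(alpha * np.exp(-beta * (t - past)))

print(hawkes_intensity(5.0, history=[1.0, 2.5, 4.8]))  # self-excitation from recent events
```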
4 code implementations • ICLR 2020 • Johannes Gasteiger, Janek Groß, Stephan Günnemann
Each message is associated with a direction in coordinate space.
Ranked #2 on Drug Discovery on QM9
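As a loose illustration of associating messages with directions in coordinate space (not the DimeNet implementation), one can compute unit direction vectors between atom positions and the angles between them:

```python
import numpy as np

positions = np.array([[0.0, 0.0, 0.0],    # toy atom coordinates
                      [1.0, 0.0, 0.0],
                      [1.0, 1.0, 0.0]])

def direction(i, j):
    """Unit vector pointing from atom i to atom j."""
    d = positions[j] - positions[i]
    return d / np.linalg.norm(d)

# A message j -> i carries the direction from i to j; angles between such
# directions (here at atom 1, between neighbors 0 and 2) give angular features.
d_10, d_12 = direction(1, 0), direction(1, 2)
angle = np.arccos(np.clip(d_10 @ d_12, -1.0, 1.0))
print(np.degrees(angle))   # 90 degrees in this toy geometry
```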
1 code implementation • 22 Nov 2019 • Alexander Ziller, Julius Hansjakob, Vitalii Rusinov, Daniel Zügner, Peter Vogel, Stephan Günnemann
We release a realistic, diverse, and challenging dataset for object detection on images.
1 code implementation • NeurIPS 2019 • Marin Biloš, Bertrand Charpentier, Stephan Günnemann
Asynchronous event sequences are the basis of many applications throughout different industries.
1 code implementation • NeurIPS 2019 • Aleksandar Bojchevski, Stephan Günnemann
Despite the exploding interest in graph neural networks, there has been little effort to verify and improve their robustness.
2 code implementations • NeurIPS 2019 • Johannes Gasteiger, Stefan Weißenberger, Stephan Günnemann
In this work, we remove the restriction of using only the direct neighbors by introducing a powerful, yet spatially localized graph convolution: Graph diffusion convolution (GDC).
Ranked #1 on Node Classification on AMZ Comp
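A minimal sketch of one instance of graph diffusion, personalized PageRank (PPR), computed densely and then sparsified by keeping the top-k entries per column; assumptions: a small graph and a column-stochastic transition matrix, illustrative rather than the reference GDC code:

```python
import numpy as np

def ppr_diffusion(adj, alpha=0.15, k=2):
    """Dense PPR diffusion S = alpha * (I - (1 - alpha) * T)^-1, sparsified to top-k per column."""
    deg = adj.sum(0)
    T = adj / deg                                    # column-stochastic transition matrix
    n = adj.shape[0]
    S = alpha * np.linalg.inv(np.eye(n) - (1 - alpha) * T)
    for col in range(n):                             # keep only the k largest entries per column
        idx = np.argsort(S[:, col])[:-k]
        S[idx, col] = 0.0
    return S

adj = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)
print(ppr_diffusion(adj))
```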
1 code implementation • ICLR 2019 • Oleksandr Shchur, Stephan Günnemann
Community detection is a fundamental problem in machine learning.
2 code implementations • ICLR 2020 • Oleksandr Shchur, Marin Biloš, Stephan Günnemann
The standard way of learning in such models is by estimating the conditional intensity function.
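For intuition about this standard intensity-based approach (a textbook construction, not the intensity-free alternative the paper proposes), the log-likelihood of an event sequence is sum_i log lambda(t_i) minus the integral of lambda over the observation window; for a constant intensity this reduces to:

```python
import numpy as np

def constant_intensity_log_likelihood(event_times, lam, t_end):
    """log p(events) = sum_i log(lambda) - integral_0^T lambda dt, for a homogeneous Poisson process."""
    event_times = np.asarray(event_times)
    return len(event_times) * np.log(lam) - lam * t_end

print(constant_intensity_log_likelihood([0.5, 1.2, 3.0], lam=1.0, t_end=4.0))  # -> -4.0
```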
1 code implementation • 28 Jun 2019 • Daniel Zügner, Stephan Günnemann
Recent works show that Graph Neural Networks (GNNs) are highly non-robust with respect to adversarial attacks on both the graph structure and the node attributes, making their outcomes unreliable.
Ranked #18 on Node Classification on Pubmed
no code implementations • ICLR 2019 • Aleksandar Bojchevski, Stephan Günnemann
The goal of network representation learning is to learn low-dimensional node embeddings that capture the graph structure and are useful for solving downstream tasks.
1 code implementation • ICLR 2019 • Daniel Zügner, Stephan Günnemann
Deep learning models for graphs have advanced the state of the art on many tasks.
1 code implementation • 14 Nov 2018 • Oleksandr Shchur, Maximilian Mumme, Aleksandar Bojchevski, Stephan Günnemann
We perform a thorough empirical evaluation of four prominent GNN models and show that considering different splits of the data leads to dramatically different rankings of models.
no code implementations • 11 Nov 2018 • Richard Kurle, Stephan Günnemann, Patrick van der Smagt
Learning from multiple sources of information is an important problem in machine-learning research.
1 code implementation • NeurIPS 2019 • Stephan Rabanser, Stephan Günnemann, Zachary C. Lipton
We might hope that when faced with unexpected inputs, well-designed software systems would fire off warnings.
4 code implementations • ICLR 2019 • Johannes Gasteiger, Aleksandar Bojchevski, Stephan Günnemann
We utilize this propagation procedure to construct a simple model, personalized propagation of neural predictions (PPNP), and its fast approximation, APPNP.
Ranked #1 on Node Classification on MS ACADEMIC
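A minimal NumPy sketch of APPNP-style propagation, iterating Z <- (1 - alpha) * A_hat @ Z + alpha * H, where A_hat is the symmetrically normalized adjacency with self-loops and H are the neural predictions; illustrative, not the reference implementation:

```python
import numpy as np

def appnp_propagate(adj, H, alpha=0.1, n_iters=10):
    """Approximate personalized propagation of the predictions H over the graph."""
    A = adj + np.eye(adj.shape[0])                          # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A.sum(1))
    A_hat = d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]   # symmetric normalization
    Z = H.copy()
    for _ in range(n_iters):                                # power-iteration-style propagation
        Z = (1 - alpha) * (A_hat @ Z) + alpha * H           # teleport back to the original predictions
    return Z

adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
H = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])
print(appnp_propagate(adj, H))
```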
no code implementations • 3 Oct 2018 • Roberto Alonso, Stephan Günnemann
Mining dense quasi-cliques is a well-known clustering task with applications ranging from social networks and collaboration graphs to document analysis.
1 code implementation • ICLR 2019 • Aleksandar Bojchevski, Stephan Günnemann
The goal of network representation learning is to learn low-dimensional node embeddings that capture the graph structure and are useful for solving downstream tasks.
no code implementations • 3 Jun 2018 • Federico Monti, Oleksandr Shchur, Aleksandar Bojchevski, Or Litany, Stephan Günnemann, Michael M. Bronstein
In recent years, there has been a surge of interest in developing deep learning methods for non-Euclidean structured data such as graphs.
1 code implementation • 21 May 2018 • Daniel Zügner, Amir Akbarnejad, Stephan Günnemann
Even more, our attacks are transferable: the learned attacks generalize to other state-of-the-art node classification models and unsupervised approaches, and likewise are successful even when only limited knowledge about the graph is given.
1 code implementation • ICML 2018 • Aleksandar Bojchevski, Oleksandr Shchur, Daniel Zügner, Stephan Günnemann
NetGAN is able to produce graphs that exhibit well-known network patterns without explicitly specifying them in the model definition.
no code implementations • ICLR 2018 • Aleksandar Bojchevski, Oleksandr Shchur, Daniel Zügner, Stephan Günnemann
Moreover, GraphGAN learns a semantic mapping from the latent input space to the generated graph's properties.
1 code implementation • 29 Nov 2017 • Stephan Rabanser, Oleksandr Shchur, Stephan Günnemann
Tensors are multidimensional arrays of numerical values and therefore generalize matrices to multiple dimensions.
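For instance (a simple NumPy illustration, not taken from the paper), stacking matrices along an extra axis yields a 3-way tensor:

```python
import numpy as np

# A matrix is a 2-way tensor; stacking matrices along a third axis gives a 3-way tensor.
matrix = np.arange(6).reshape(2, 3)          # shape (2, 3)
tensor = np.stack([matrix, matrix + 6])      # shape (2, 2, 3): one extra dimension
print(tensor.shape, tensor[1, 0, 2])         # -> (2, 2, 3) 8
```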
1 code implementation • ICLR 2018 • Aleksandar Bojchevski, Stephan Günnemann
We propose Graph2Gauss - an approach that can efficiently learn versatile node embeddings on large-scale (attributed) graphs that show strong performance on tasks such as link prediction and node classification.
1 code implementation • 27 Jun 2014 • Wolfgang Gatterbauer, Stephan Günnemann, Danai Koutra, Christos Faloutsos
Often, we can answer such questions and label nodes in a network based on the labels of their neighbors and appropriate assumptions of homophily ("birds of a feather flock together") or heterophily ("opposites attract").
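A minimal sketch of the homophily case (neighbors tend to share labels) via simple iterative averaging of neighbor beliefs; this is plain label propagation, not the linearized belief propagation developed in the paper:

```python
import numpy as np

def propagate_labels(adj, beliefs, labeled_mask, n_iters=20):
    """Iteratively average neighbor beliefs; labeled nodes are clamped to their known labels."""
    beliefs = beliefs.copy()
    for _ in range(n_iters):
        neighbor_avg = (adj @ beliefs) / np.maximum(adj.sum(1, keepdims=True), 1)
        beliefs = np.where(labeled_mask[:, None], beliefs, neighbor_avg)  # keep known labels fixed
    return beliefs

adj = np.array([[0, 1, 1, 0], [1, 0, 0, 1], [1, 0, 0, 1], [0, 1, 1, 0]], dtype=float)
beliefs = np.array([[1.0, 0.0], [0.0, 0.0], [0.0, 0.0], [0.0, 1.0]])   # nodes 0 and 3 are labeled
labeled = np.array([True, False, False, True])
print(propagate_labels(adj, beliefs, labeled))   # unlabeled nodes converge to mixed beliefs
```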