no code implementations • 11 Sep 2023 • Sebastian Schmidt, Stephan Günnemann
We exploit the temporal properties of such image streams and propose the novel temporal predicted loss (TPL) method.
1 code implementation • 16 Aug 2023 • Francesco Campi, Lukas Gosch, Tom Wollschläger, Yan Scholten, Stephan Günnemann
We perform the first adversarial robustness study of Graph Neural Networks (GNNs) that are provably more powerful than traditional Message Passing Neural Networks (MPNNs).
no code implementations • 9 Aug 2023 • Armin Moin, Atta Badii, Stephan Günnemann, Moharram Challenger
Therefore, they failed to address the architecture viewpoints and views responsive to the concerns of the data science community.
1 code implementation • 17 Jul 2023 • Xuan Zhang, Limei Wang, Jacob Helwig, Youzhi Luo, Cong Fu, Yaochen Xie, Meng Liu, Yuchao Lin, Zhao Xu, Keqiang Yan, Keir Adams, Maurice Weiler, Xiner Li, Tianfan Fu, Yucheng Wang, Haiyang Yu, Yuqing Xie, Xiang Fu, Alex Strasser, Shenglong Xu, Yi Liu, Yuanqi Du, Alexandra Saxton, Hongyi Ling, Hannah Lawrence, Hannes Stärk, Shurui Gui, Carl Edwards, Nicholas Gao, Adriana Ladera, Tailin Wu, Elyssa F. Hofgard, Aria Mansouri Tehrani, Rui Wang, Ameya Daigavane, Montgomery Bohde, Jerry Kurtin, Qian Huang, Tuong Phung, Minkai Xu, Chaitanya K. Joshi, Simon V. Mathis, Kamyar Azizzadenesheli, Ada Fang, Alán Aspuru-Guzik, Erik Bekkers, Michael Bronstein, Marinka Zitnik, Anima Anandkumar, Stefano Ermon, Pietro Liò, Rose Yu, Stephan Günnemann, Jure Leskovec, Heng Ji, Jimeng Sun, Regina Barzilay, Tommi Jaakkola, Connor W. Coley, Xiaoning Qian, Xiaofeng Qian, Tess Smidt, Shuiwang Ji
Advances in artificial intelligence (AI) are fueling a new paradigm of discoveries in natural sciences.
1 code implementation • 10 Jul 2023 • Franziska Schwaiger, Andrea Matic, Karsten Roscher, Stephan Günnemann
The ability to detect learned objects regardless of their appearance is crucial for autonomous systems in real-world applications.
1 code implementation • 3 Jul 2023 • Jianxiang Feng, Matan Atad, Ismael Rodríguez, Maximilian Durner, Stephan Günnemann, Rudolph Triebel
Machine Learning (ML) models in Robotic Assembly Sequence Planning (RASP) need to be introspective on the predicted solutions, i.e., whether they are feasible or not, to circumvent potential efficiency degradation.
no code implementations • 27 Jun 2023 • Lukas Gosch, Simon Geisler, Daniel Sturm, Bertrand Charpentier, Daniel Zügner, Stephan Günnemann
Including these contributions, we demonstrate that adversarial training is a state-of-the-art defense against adversarial structure perturbations.
no code implementations • 20 Jun 2023 • Tom Wollschläger, Nicholas Gao, Bertrand Charpentier, Mohamed Amine Ketata, Stephan Günnemann
Graph Neural Networks (GNNs) are promising surrogates for quantum mechanical calculations as they achieve unprecedentedly low errors on collections of molecular dynamics (MD) trajectories.
no code implementations • 30 May 2023 • Leon Hetzel, Johanna Sommer, Bastian Rieck, Fabian Theis, Stephan Günnemann
To this end, we introduce a novel factorisation of the molecules' data distribution that accounts for the molecules' global context and facilitates learning adequate assignments of atoms and bonds onto shapes.
no code implementations • 29 May 2023 • Marten Lienen, Jan Hansen-Palmus, David Lüdke, Stephan Günnemann
Turbulent flows are well known to be chaotic and hard to predict; however, their dynamics differ between two and three dimensions.
1 code implementation • 17 May 2023 • Emanuele Rossi, Bertrand Charpentier, Francesco Di Giovanni, Fabrizio Frasca, Stephan Günnemann, Michael Bronstein
Graph Neural Networks (GNNs) have become the de-facto standard tool for modeling relational data.
Ranked #1 on Node Classification on snap-patents
no code implementations • 1 May 2023 • Lukas Gosch, Daniel Sturm, Simon Geisler, Stephan Günnemann
Many works show that node-level predictions of Graph Neural Networks (GNNs) are not robust to small, often termed adversarial, changes to the graph structure.
no code implementations • 30 Apr 2023 • Nicola Franco, Tom Wollschläger, Benedikt Poggel, Stephan Günnemann, Jeanette Miriam Lorenz
We conduct a detailed analysis of the decomposition of MILPs with Benders and Dantzig-Wolfe methods.
no code implementations • 6 Apr 2023 • Jonas Gregor Wiese, Lisa Wimmer, Theodore Papamarkou, Bernd Bischl, Stephan Günnemann, David Rügamer
Bayesian inference in deep neural networks is challenging due to the high-dimensional, strongly multi-modal parameter posterior density landscape.
no code implementations • 4 Apr 2023 • Johanna Sommer, Leon Hetzel, David Lüdke, Fabian Theis, Stephan Günnemann
Machine learning for molecules holds great potential for efficiently exploring the vast chemical space and thus streamlining the drug discovery process by facilitating the design of new therapeutic molecules.
no code implementations • 3 Apr 2023 • Johannes Getzner, Bertrand Charpentier, Stephan Günnemann
Modern machine learning models have started to consume incredible amounts of energy, thus incurring large carbon footprints (Strubell et al., 2019).
1 code implementation • 10 Mar 2023 • Bertrand Charpentier, Chenxiang Zhang, Stephan Günnemann
Accurate and efficient uncertainty estimation is crucial to building reliable Machine Learning (ML) models capable of providing calibrated uncertainty estimates, generalizing, and detecting Out-Of-Distribution (OOD) datasets.
1 code implementation • 8 Mar 2023 • Arthur Kosmala, Johannes Gasteiger, Nicholas Gao, Stephan Günnemann
Neural architectures that learn potential energy surfaces from molecular data have undergone fast improvement in recent years.
1 code implementation • 8 Feb 2023 • Nicholas Gao, Stephan Günnemann
To overcome this limitation, we present Graph-learned orbital embeddings (Globe), a neural network-based reparametrization method that can adapt neural wave functions to different molecules.
no code implementations • 6 Feb 2023 • Jan Schuchardt, Aleksandar Bojchevski, Johannes Gasteiger, Stephan Günnemann
In tasks like node classification, image segmentation, and named-entity recognition we have a classifier that simultaneously outputs multiple predictions (a vector of labels) based on a single input, i.e., a single graph, image, or document respectively.
no code implementations • 31 Jan 2023 • Felix Mujkanovic, Simon Geisler, Stephan Günnemann, Aleksandar Bojchevski
A cursory reading of the literature suggests that we have made a lot of progress in designing effective adversarial defenses for Graph Neural Networks (GNNs).
1 code implementation • 31 Jan 2023 • Simon Geisler, Yujia Li, Daniel Mankowitz, Ali Taylan Cemgil, Stephan Günnemann, Cosmin Paduraru
Transformers were originally proposed as a sequence-to-sequence model for text but have become vital for a wide range of modalities, including images, audio, video, and undirected graphs.
Ranked #1 on Graph Property Prediction on ogbg-code2
1 code implementation • 5 Jan 2023 • Yan Scholten, Jan Schuchardt, Simon Geisler, Aleksandar Bojchevski, Stephan Günnemann
To remedy this, we propose novel gray-box certificates that exploit the message-passing principle of GNNs: We randomly intercept messages and carefully analyze the probability that messages from adversarially controlled nodes reach their target nodes.
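As a rough illustration of the interception idea described above (not the paper's closed-form certificates), the following sketch estimates by Monte Carlo simulation how likely a message from one node still reaches a target node within k hops when every edge transmission is independently dropped; the toy graph, deletion probability, and hop count are illustrative assumptions.

```python
import random

def reach_probability(adj, source, target, k, p_delete, n_samples=10_000, seed=0):
    """Monte Carlo estimate of the probability that a message starting at `source`
    still reaches `target` within `k` hops when each edge transmission is
    independently intercepted with probability `p_delete`."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        frontier, visited = {source}, {source}
        for _ in range(k):
            nxt = set()
            for u in frontier:
                for v in adj[u]:
                    # each message over an edge is intercepted with prob. p_delete
                    if v not in visited and rng.random() > p_delete:
                        nxt.add(v)
            visited |= nxt
            frontier = nxt
        if target in visited:
            hits += 1
    return hits / n_samples

# toy path graph 0 - 1 - 2 - 3 - 4
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(reach_probability(adj, source=0, target=2, k=2, p_delete=0.5))
```

The paper analyzes this reachability probability analytically to bound the influence of adversarially controlled nodes; the simulation only conveys the quantity being certified.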
no code implementations • 2 Jan 2023 • Morgane Ayle, Jan Schuchardt, Lukas Gosch, Daniel Zügner, Stephan Günnemann
We propose to solve this issue by training graph neural networks on disjoint subgraphs of a given training graph.
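A minimal sketch of the general idea, under the assumption that nodes are split uniformly at random: partition the nodes into disjoint groups, keep only the edges inside each group, and train one GNN per induced subgraph. The partitioning scheme and the downstream aggregation are illustrative choices, not the paper's exact procedure.

```python
import numpy as np

def disjoint_partitions(num_nodes, num_parts, seed=0):
    """Randomly assign every node to exactly one of `num_parts` disjoint subgraphs."""
    rng = np.random.default_rng(seed)
    return rng.integers(num_parts, size=num_nodes)

def induced_subgraph(edges, assignment, part):
    """Keep only edges whose endpoints both fall into partition `part`."""
    return [(u, v) for u, v in edges if assignment[u] == part and assignment[v] == part]

edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
assignment = disjoint_partitions(num_nodes=5, num_parts=2)
subgraphs = [induced_subgraph(edges, assignment, p) for p in range(2)]
# one GNN would then be trained per induced subgraph and their predictions aggregated
print(assignment, subgraphs)
```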
no code implementations • 18 Dec 2022 • Johannes Gasteiger, Chendi Qian, Stephan Günnemann
Using graph neural networks for large graphs is challenging since there is no clear way of constructing mini-batches.
no code implementations • 25 Nov 2022 • Jan Schuchardt, Stephan Günnemann
Building models that comply with the invariances inherent to different domains, such as invariance under translation or rotation, is a key aspect of applying machine learning to real-world problems like molecular property prediction, medical imaging, protein folding, or LiDAR classification.
no code implementations • 4 Nov 2022 • Marin Biloš, Kashif Rasul, Anderson Schneider, Yuriy Nevmyvaka, Stephan Günnemann
Temporal data such as time series can be viewed as discretized measurements of the underlying function.
no code implementations • 28 Oct 2022 • Jan Schuchardt, Tom Wollschläger, Aleksandar Bojchevski, Stephan Günnemann
We further show that this approach is beneficial for the larger class of softly local models, where each output is dependent on the entire input but assigns different levels of importance to different input regions (e.g., based on their proximity in the image).
1 code implementation • 22 Oct 2022 • Marten Lienen, Stephan Günnemann
We introduce an ODE solver for the PyTorch ecosystem that can solve multiple ODEs in parallel independently from each other while achieving significant performance gains.
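The sketch below illustrates the batching idea in plain NumPy rather than the library's actual API (which is not reproduced here): every problem in the batch carries its own initial value and integration interval, and all of them are advanced independently in one vectorized loop. The fixed-step Euler scheme and the toy decay ODE are illustrative assumptions.

```python
import numpy as np

def batched_euler(f, y0, t0, t1, n_steps=100):
    """Integrate dy/dt = f(t, y) for a batch of independent problems at once.
    y0, t0, t1 have shape (batch,); every problem has its own time interval."""
    y = np.asarray(y0, dtype=float)
    t = np.asarray(t0, dtype=float)
    h = (np.asarray(t1, dtype=float) - t) / n_steps   # per-problem step size
    for _ in range(n_steps):
        y = y + h * f(t, y)
        t = t + h
    return y

# three independent problems dy/dt = -k*y with different rates and horizons
k = np.array([0.5, 1.0, 2.0])
horizons = np.array([1.0, 2.0, 0.5])
y_end = batched_euler(lambda t, y: -k * y, y0=np.ones(3), t0=np.zeros(3), t1=horizons)
print(y_end, np.exp(-k * horizons))  # compare with the exact solution
```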
no code implementations • 19 Oct 2022 • Marin Biloš, Emanuel Ramneantu, Stephan Günnemann
Observations made in continuous time are often irregular and contain missing values across different channels.
no code implementations • 15 Oct 2022 • Raffaele Paolino, Aleksandar Bojchevski, Stephan Günnemann, Gitta Kutyniok, Ron Levie
A powerful framework for studying graphs is to consider them as geometric graphs: nodes are randomly sampled from an underlying metric space, and any pair of nodes is connected if their distance is less than a specified neighborhood radius.
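For concreteness, a minimal NumPy sketch of such a random geometric graph: nodes are sampled uniformly from the unit square and two nodes are connected whenever their Euclidean distance falls below the neighborhood radius. The uniform sampling distribution and the chosen radius are illustrative assumptions.

```python
import numpy as np

def random_geometric_graph(n, radius, dim=2, seed=0):
    """Sample n nodes uniformly from the unit cube and connect every pair
    whose Euclidean distance is below `radius`."""
    rng = np.random.default_rng(seed)
    pos = rng.random((n, dim))
    dists = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
    edges = [(i, j) for i in range(n) for j in range(i + 1, n) if dists[i, j] < radius]
    return pos, edges

pos, edges = random_geometric_graph(n=20, radius=0.3)
print(len(edges), "edges")
```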
1 code implementation • 16 Sep 2022 • Alexandru Mara, Jefrey Lijffijt, Stephan Günnemann, Tijl De Bie
We find that node classification results are impacted more than network reconstruction ones, that degree-based and label-based attacks are on average the most damaging, and that label heterophily can strongly influence attack performance.
no code implementations • 15 Sep 2022 • Jörg Christian Kirchhof, Evgeny Kusmenko, Jonas Ritz, Bernhard Rumpe, Armin Moin, Atta Badii, Stephan Günnemann, Moharram Challenger
In this paper, we propose to adopt the MDE paradigm for the development of Machine Learning (ML)-enabled software systems with a focus on the Internet of Things (IoT) domain.
1 code implementation • 17 Jul 2022 • Jonathan Külz, Andreas Spitz, Ahmad Abu-Akel, Stephan Günnemann, Robert West
There is a widespread belief that the tone of US political language has become more negative recently, in particular when Donald Trump entered politics.
no code implementations • 9 Jul 2022 • Morgane Ayle, Bertrand Charpentier, John Rachwan, Daniel Zügner, Simon Geisler, Stephan Günnemann
The robustness and anomaly detection capability of neural networks are crucial topics for their safe adoption in the real world.
1 code implementation • 21 Jun 2022 • John Rachwan, Daniel Zügner, Bertrand Charpentier, Simon Geisler, Morgane Ayle, Stephan Günnemann
Pruning, the task of sparsifying deep neural networks, has received increasing attention recently.
1 code implementation • CVPR 2022 Workshops • Codruţ-Andrei Diaconu, Sudipan Saha, Stephan Günnemann, Xiao Xiang Zhu
Climate change is perhaps the biggest single threat to humankind and the environment, as it severely impacts our terrestrial surface, home to most of the living species.
Ranked #2 on Earth Surface Forecasting on EarthNet2021 OOD Track
no code implementations • 3 Jun 2022 • Bertrand Charpentier, Ransalu Senanayake, Mykel Kochenderfer, Stephan Günnemann
Characterizing aleatoric and epistemic uncertainty can be used to speed up learning in a training environment, improve generalization to similar testing environments, and flag unfamiliar behavior in anomalous testing environments.
1 code implementation • 30 May 2022 • Nicholas Gao, Stephan Günnemann
In this work, we address the inference shortcomings by proposing the Potential learning from ab-initio Networks (PlaNet) framework, in which we simultaneously train a surrogate model in addition to the neural wave function.
1 code implementation • 28 Apr 2022 • Leon Hetzel, Simon Böhm, Niki Kilbertus, Stephan Günnemann, Mohammad Lotfollahi, Fabian Theis
Single-cell transcriptomics has enabled the study of cellular heterogeneity in response to perturbations at the resolution of individual cells.
1 code implementation • 6 Apr 2022 • Johannes Gasteiger, Muhammed Shuaibi, Anuroop Sriram, Stephan Günnemann, Zachary Ulissi, C. Lawrence Zitnick, Abhishek Das
This work investigates this question by first developing the GemNet-OC model based on the large Open Catalyst 2020 (OC20) dataset.
Ranked #1 on Initial Structure to Relaxed Energy (IS2RE) on OC20
no code implementations • 16 Mar 2022 • Poulami Sinhamahapatra, Rajat Koner, Karsten Roscher, Stephan Günnemann
It is essential for safety-critical applications of deep neural networks to determine when new inputs are significantly different from the training distribution.
1 code implementation • ICLR 2022 • Bertrand Charpentier, Simon Kibler, Stephan Günnemann
To this end, DP-DAG samples a DAG by successively (1) sampling a linear ordering of the nodes and (2) sampling edges consistent with the sampled linear ordering.
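A minimal sketch of this two-step sampling scheme, with a fixed edge probability standing in for the learned edge distribution (an illustrative simplification): draw a random linear ordering, then keep only edges that point forward in that ordering, which guarantees acyclicity by construction.

```python
import numpy as np

def sample_dag(num_nodes, p_edge=0.3, seed=0):
    """Sample a DAG by (1) drawing a random linear ordering of the nodes and
    (2) keeping each edge u -> v with probability p_edge only if u precedes v."""
    rng = np.random.default_rng(seed)
    order = rng.permutation(num_nodes)        # order[pos] = node placed at position pos
    rank = np.empty(num_nodes, dtype=int)
    rank[order] = np.arange(num_nodes)        # rank[node] = its position in the ordering
    adj = np.zeros((num_nodes, num_nodes), dtype=int)
    for u in range(num_nodes):
        for v in range(num_nodes):
            # edges always point forward in the ordering, so no cycle can arise
            if rank[u] < rank[v] and rng.random() < p_edge:
                adj[u, v] = 1
    return adj, order

adj, order = sample_dag(num_nodes=5)
print(order, adj, sep="\n")
```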
1 code implementation • ICLR 2022 • Marten Lienen, Stephan Günnemann
We propose a new method for spatio-temporal forecasting on arbitrarily distributed points.
Interpretability Techniques for Deep Learning, Interpretable Machine Learning
no code implementations • 6 Mar 2022 • Armin Moin, Ukrit Wattanavaekin, Alexandra Lungu, Moharram Challenger, Atta Badii, Stephan Günnemann
Developing smart software services requires both Software Engineering and Artificial Intelligence (AI) skills.
no code implementations • 17 Feb 2022 • Oliver Borchert, David Salinas, Valentin Flunkert, Tim Januschowski, Stephan Günnemann
By learning a mapping from forecasting models to performance metrics, we show that our method PARETOSELECT is able to accurately select models from the Pareto front -- alleviating the need to train or evaluate many forecasting models for model selection.
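The selection step hinges on the notion of a Pareto front over competing metrics. The sketch below computes which candidate models are non-dominated given hypothetical (error, latency) scores; the scores and the two-metric setup are illustrative assumptions, and the learned mapping from models to metrics is not modeled here.

```python
import numpy as np

def pareto_front(metrics):
    """Return the indices of non-dominated points, assuming every metric is
    to be minimized (e.g. forecast error and inference latency)."""
    metrics = np.asarray(metrics, dtype=float)
    front = []
    for i, m in enumerate(metrics):
        dominated = np.any(np.all(metrics <= m, axis=1) & np.any(metrics < m, axis=1))
        if not dominated:
            front.append(i)
    return front

# hypothetical (error, latency) scores for four candidate forecasting models
scores = [(0.20, 5.0), (0.15, 9.0), (0.25, 3.0), (0.22, 8.0)]
print(pareto_front(scores))   # model 3 is dominated by model 0
```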
1 code implementation • 17 Feb 2022 • Tong Zhao, Wei Jin, Yozen Liu, Yingheng Wang, Gang Liu, Stephan Günnemann, Neil Shah, Meng Jiang
Overall, our work aims to clarify the landscape of existing literature in graph data augmentation and motivates additional work in this area, providing a helpful resource for researchers and practitioners in the broader graph machine learning domain.
no code implementations • NeurIPS 2021 • Johannes Gasteiger, Chandan Yeshwanth, Stephan Günnemann
We furthermore set the state of the art on ZINC and coordinate-free QM9 by incorporating synthetic coordinates in the SMP and DimeNet++ models.
1 code implementation • NeurIPS 2021 • Maximilian Stadler, Bertrand Charpentier, Simon Geisler, Daniel Zügner, Stephan Günnemann
GPN outperforms existing approaches for uncertainty estimation in the experiments.
1 code implementation • NeurIPS 2021 • Simon Geisler, Tobias Schmidt, Hakan Şirin, Daniel Zügner, Aleksandar Bojchevski, Stephan Günnemann
Graph Neural Networks (GNNs) are increasingly important given their popularity and the diversity of applications.
1 code implementation • NeurIPS 2021 • Marin Biloš, Johanna Sommer, Syama Sundar Rangapuram, Tim Januschowski, Stephan Günnemann
Neural ordinary differential equations describe how values change in time.
no code implementations • ICLR 2022 • Simon Geisler, Johanna Sommer, Jan Schuchardt, Aleksandar Bojchevski, Stephan Günnemann
Specifically, most datasets only capture a simpler subproblem and likely suffer from spurious features.
1 code implementation • ICLR 2022 • Nicholas Gao, Stephan Günnemann
Solving the Schrödinger equation is key to many quantum mechanical properties.
no code implementations • 11 Oct 2021 • Peter Súkeník, Aleksei Kuvshinov, Stephan Günnemann
We show that in general, the input-dependent smoothing suffers from the curse of dimensionality, forcing the variance function to have low semi-elasticity.
1 code implementation • NeurIPS Workshop AI4Science 2021 • Hannes Stärk, Dominique Beaini, Gabriele Corso, Prudencio Tossou, Christian Dallago, Stephan Günnemann, Pietro Liò
Molecular property prediction is one of the fastest-growing applications of deep learning with critical real-world impacts.
no code implementations • 29 Sep 2021 • Anna-Kathrin Kopetzki, Jana Obernosterer, Aleksandar Bojchevski, Stephan Günnemann
Our experiments show how adversarial training on the source domain affects robustness on source and target domain, and we propose the first provably robust transfer learning models.
no code implementations • 29 Sep 2021 • Johannes Klicpera, Chendi Qian, Stephan Günnemann
Training graph neural networks on large graphs is challenging since there is no clear way to extract mini-batches from connected data.
no code implementations • 29 Sep 2021 • Hannes Stärk, Dominique Beaini, Gabriele Corso, Prudencio Tossou, Christian Dallago, Stephan Günnemann, Pietro Lio
Molecular property prediction is one of the fastest-growing applications of deep learning with critical real-world impacts.
no code implementations • ICLR 2022 • Daniel Zügner, Bertrand Charpentier, Morgane Ayle, Sascha Geringer, Stephan Günnemann
We propose a novel probabilistic model over hierarchies on graphs obtained by continuous relaxation of tree-based hierarchies.
no code implementations • 10 Sep 2021 • Daniel Zügner, François-Xavier Aubet, Victor Garcia Satorras, Tim Januschowski, Stephan Günnemann, Jan Gasthaus
We study a recent class of models which uses graph neural networks (GNNs) to improve forecasting in multivariate time series.
no code implementations • 6 Sep 2021 • Sebastian Bischoff, Stephan Günnemann, Martin Jaggi, Sebastian U. Stich
We consider federated learning (FL), where the training data is distributed across a large number of clients.
1 code implementation • 30 Aug 2021 • Johannes C. Paetzold, Julian McGinnis, Suprosanna Shit, Ivan Ezhov, Paul Büschl, Chinmay Prabhakar, Mihail I. Todorov, Anjany Sekuboyina, Georgios Kaissis, Ali Ertürk, Stephan Günnemann, Bjoern H. Menze
Moreover, we benchmark numerous state-of-the-art graph learning algorithms on the biologically relevant tasks of vessel prediction and vessel classification using the introduced vessel graph dataset.
1 code implementation • 19 Jul 2021 • Rajat Koner, Poulami Sinhamahapatra, Karsten Roscher, Stephan Günnemann, Volker Tresp
A serious problem in image classification is that a trained model might perform well for input data that originates from the same distribution as the data available for model training, but perform much worse for out-of-distribution (OOD) samples.
no code implementations • 14 Jul 2021 • Johannes Gasteiger, Marten Lienen, Stephan Günnemann
The current best practice for computing optimal transport (OT) is via entropy regularization and Sinkhorn iterations.
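For reference, a minimal NumPy implementation of the entropy-regularized Sinkhorn scheme mentioned above: alternately rescale the rows and columns of the kernel K = exp(-C/ε) until the transport plan matches both marginals. The regularization strength and iteration count are illustrative, and no log-domain stabilization is included.

```python
import numpy as np

def sinkhorn(a, b, cost, eps=0.1, n_iters=200):
    """Entropy-regularized OT between histograms a and b with cost matrix `cost`:
    alternately rescale rows and columns of K = exp(-cost / eps) until the
    transport plan has the prescribed marginals."""
    K = np.exp(-cost / eps)
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]   # transport plan

a = np.array([0.5, 0.5])
b = np.array([0.25, 0.75])
cost = np.array([[0.0, 1.0], [1.0, 0.0]])
plan = sinkhorn(a, b, cost)
print(plan, plan.sum(axis=1), plan.sum(axis=0))  # marginals approximate a and b
```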
no code implementations • 14 Jul 2021 • Armin Moin, Moharram Challenger, Atta Badii, Stephan Günnemann
Over the past decade, Artificial Intelligence (AI) has provided enormous new possibilities and opportunities, but also new demands and requirements for software systems.
1 code implementation • 13 Jul 2021 • Rajat Koner, Hang Li, Marcel Hildebrandt, Deepan Das, Volker Tresp, Stephan Günnemann
We conduct an experimental study on the challenging dataset GQA, based on both manually curated and automatically generated scene graphs.
1 code implementation • 6 Jul 2021 • Armin Moin, Andrei Mituca, Moharram Challenger, Atta Badii, Stephan Günnemann
In this paper, we present ML-Quadrat, an open-source research prototype that is based on the Eclipse Modeling Framework (EMF) and the state of the art in the literature of Model-Driven Software Engineering (MDSE) for smart Cyber-Physical Systems (CPS) and the Internet of Things (IoT).
1 code implementation • 6 Jul 2021 • Armin Moin, Moharram Challenger, Atta Badii, Stephan Günnemann
In particular, we implement the proposed approach, called ML-Quadrat, based on ThingML, and validate it using a case study from the IoT domain, as well as through an empirical user evaluation.
no code implementations • 6 Jul 2021 • Armin Moin, Moharram Challenger, Atta Badii, Stephan Günnemann
We focus on a sub-discipline of AI, namely Machine Learning (ML), and propose the delegation of data analytics and ML to the IoT edge.
1 code implementation • 3 Jul 2021 • Sven Elflein, Bertrand Charpentier, Daniel Zügner, Stephan Günnemann
Several density estimation methods have been shown to fail to detect out-of-distribution (OOD) samples by assigning higher likelihoods to anomalous data.
no code implementations • NeurIPS 2021 • Oleksandr Shchur, Ali Caner Türkmen, Tim Januschowski, Jan Gasthaus, Stephan Günnemann
Automatically detecting anomalies in event data can provide substantial value in domains such as healthcare, DevOps, and information security.
3 code implementations • NeurIPS 2021 • Johannes Gasteiger, Florian Becker, Stephan Günnemann
Effectively predicting molecular interactions has the potential to accelerate molecular dynamics by multiple orders of magnitude and thus revolutionize chemical simulations.
1 code implementation • ICLR 2022 • Bertrand Charpentier, Oliver Borchert, Daniel Zügner, Simon Geisler, Stephan Günnemann
Uncertainty awareness is crucial to develop reliable machine learning models.
no code implementations • 8 Apr 2021 • Oleksandr Shchur, Ali Caner Türkmen, Tim Januschowski, Stephan Günnemann
Temporal point processes (TPP) are probabilistic generative models for continuous-time event sequences.
1 code implementation • ICLR 2021 • Daniel Zügner, Tobias Kirschstein, Michele Catasta, Jure Leskovec, Stephan Günnemann
Source code (Context) and its parsed abstract syntax tree (AST; Structure) are two complementary representations of the same computer program.
no code implementations • 1 Jan 2021 • Johannes Klicpera, Marten Lienen, Stephan Günnemann
Optimal transport (OT) is a cornerstone of many machine learning tasks.
no code implementations • ICLR 2021 • Jan Schuchardt, Aleksandar Bojchevski, Johannes Klicpera, Stephan Günnemann
In tasks like node classification, image segmentation, and named-entity recognition we have a classifier that simultaneously outputs multiple predictions (a vector of labels) based on a single input, i.e., a single graph, image, or document respectively.
no code implementations • NeurIPS 2020 • Richard Kurle, Syama Sundar Rangapuram, Emmanuel de Bézenac, Stephan Günnemann, Jan Gasthaus
We propose a Monte Carlo objective that leverages the conditional linearity by computing the corresponding conditional expectations in closed-form and a suitable proposal distribution that is factorised similarly to the optimal proposal distribution.
4 code implementations • 28 Nov 2020 • Johannes Gasteiger, Shankari Giri, Johannes T. Margraf, Stephan Günnemann
Many important tasks in chemistry revolve around molecules during reactions.
1 code implementation • NeurIPS 2020 • Simon Geisler, Daniel Zügner, Stephan Günnemann
Perturbations targeting the graph structure have proven to be extremely effective in reducing the performance of Graph Neural Networks (GNNs), and traditional defenses such as adversarial training do not seem to be able to improve robustness.
1 code implementation • 28 Oct 2020 • Anna-Kathrin Kopetzki, Bertrand Charpentier, Daniel Zügner, Sandhya Giri, Stephan Günnemann
Dirichlet-based uncertainty (DBU) models are a recent and promising class of uncertainty-aware models.
no code implementations • 7 Oct 2020 • Marin Biloš, Stephan Günnemann
Modeling sets is an important problem in machine learning since this type of data can be found in many domains.
no code implementations • 28 Sep 2020 • Marin Biloš, Stephan Günnemann
To model this behavior, it is enough to transform the samples from the uniform process with a sufficiently complex equivariant function.
1 code implementation • 22 Sep 2020 • Armin Moin, Stephan Rössler, Stephan Günnemann
In this paper, we present the current position of the research project ML-Quadrat, which aims to extend the methodology, modeling language, and tool support of ThingML (an open-source modeling tool for IoT/CPS) to address the Machine Learning needs of IoT applications.
1 code implementation • 22 Sep 2020 • Armin Moin, Stephan Rössler, Marouane Sayih, Stephan Günnemann
In this paper, we illustrate how to enhance an existing state-of-the-art modeling language and tool for the Internet of Things (IoT), called ThingML, to support machine learning on the modeling level.
no code implementations • ICML 2020 • Aleksandar Bojchevski, Johannes Gasteiger, Stephan Günnemann
Existing techniques for certifying the robustness of models for discrete data either work only for a small class of models or are general at the expense of efficiency or tightness.
no code implementations • 28 Jul 2020 • Anna-Kathrin Kopetzki, Stephan Günnemann
This principle is highly versatile, as we show.
no code implementations • 15 Jul 2020 • Nick Harmening, Marin Biloš, Stephan Günnemann
Determining the traffic scenario space is a major challenge for the homologation and coverage assessment of automated driving functions.
2 code implementations • 3 Jul 2020 • Aleksandar Bojchevski, Johannes Gasteiger, Bryan Perozzi, Amol Kapoor, Martin Blais, Benedek Rózemberczki, Michal Lukasik, Stephan Günnemann
Graph neural networks (GNNs) have emerged as a powerful approach for solving many network mining tasks.
no code implementations • 2 Jul 2020 • Marcel Hildebrandt, Hang Li, Rajat Koner, Volker Tresp, Stephan Günnemann
We propose a novel method that approaches the task by performing context-driven, sequential reasoning based on the objects and their semantic and spatial relationships present in the scene.
1 code implementation • NeurIPS 2020 • Oleksandr Shchur, Nicholas Gao, Marin Biloš, Stephan Günnemann
Temporal point process (TPP) models combined with recurrent neural networks provide a powerful framework for modeling continuous-time event data.
1 code implementation • NeurIPS 2020 • Bertrand Charpentier, Daniel Zügner, Stephan Günnemann
The posterior distributions learned by PostNet accurately reflect uncertainty for in- and out-of-distribution data -- without requiring access to OOD data at training time.
no code implementations • ICLR 2020 • Richard Kurle, Botond Cseke, Alexej Klushyn, Patrick van der Smagt, Stephan Günnemann
We represent the posterior approximation of the network weights by a diagonal Gaussian distribution and a complementary memory of raw data.
1 code implementation • AKBC 2020 • Zhen Han, Yunpu Ma, Yuyi Wang, Stephan Günnemann, Volker Tresp
The Hawkes process has become a standard method for modeling self-exciting event sequences with different event types.
4 code implementations • ICLR 2020 • Johannes Gasteiger, Janek Groß, Stephan Günnemann
Each message is associated with a direction in coordinate space; see the sketch below.
Ranked #2 on Drug Discovery on QM9
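To make the directional aspect concrete, the sketch below computes the geometric quantity such directional messages are conditioned on: the angle at an atom j between the directions towards a neighbor k and a target atom i. The water-like coordinates are illustrative values, and the model's learned basis functions and message updates are omitted.

```python
import numpy as np

def edge_direction(pos, i, j):
    """Unit vector pointing from atom i to atom j."""
    d = pos[j] - pos[i]
    return d / np.linalg.norm(d)

def message_angle(pos, k, j, i):
    """Angle at atom j between the directions j -> k and j -> i; directional
    message passing conditions the message from j to i on this angle for
    every neighbor k of j."""
    d1, d2 = edge_direction(pos, j, k), edge_direction(pos, j, i)
    return np.arccos(np.clip(d1 @ d2, -1.0, 1.0))

# water-like geometry (coordinates in Angstrom, illustrative values only)
pos = np.array([[0.00, 0.00, 0.0],    # O
                [0.96, 0.00, 0.0],    # H1
                [-0.24, 0.93, 0.0]])  # H2
print(np.degrees(message_angle(pos, k=1, j=0, i=2)))   # roughly 104 degrees
```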
1 code implementation • 22 Nov 2019 • Alexander Ziller, Julius Hansjakob, Vitalii Rusinov, Daniel Zügner, Peter Vogel, Stephan Günnemann
We release a realistic, diverse, and challenging dataset for object detection on images.
1 code implementation • NeurIPS 2019 • Marin Biloš, Bertrand Charpentier, Stephan Günnemann
Asynchronous event sequences are the basis of many applications throughout different industries.
1 code implementation • NeurIPS 2019 • Aleksandar Bojchevski, Stephan Günnemann
Despite the exploding interest in graph neural networks there has been little effort to verify and improve their robustness.
2 code implementations • NeurIPS 2019 • Johannes Gasteiger, Stefan Weißenberger, Stephan Günnemann
In this work, we remove the restriction of using only the direct neighbors by introducing a powerful, yet spatially localized graph convolution: Graph diffusion convolution (GDC); see the sketch below.
Ranked #3 on Node Classification on AMZ Comp
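A minimal sketch of one common GDC variant: replace the adjacency matrix with a personalized-PageRank diffusion matrix S = α(I − (1 − α)T)⁻¹ and threshold small entries to keep it sparse. The random-walk transition matrix, teleport probability α, and threshold ε are illustrative choices.

```python
import numpy as np

def gdc_ppr(adj, alpha=0.15, eps=1e-3):
    """Graph diffusion with personalized PageRank: S = alpha * (I - (1 - alpha) * T)^-1,
    followed by thresholding small entries so the new 'adjacency' stays sparse."""
    deg = adj.sum(axis=1, keepdims=True)
    T = adj / np.clip(deg, 1, None)               # row-stochastic transition matrix
    n = adj.shape[0]
    S = alpha * np.linalg.inv(np.eye(n) - (1 - alpha) * T)
    S[S < eps] = 0.0                              # sparsification step
    return S

adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
print(gdc_ppr(adj).round(3))
```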
1 code implementation • ICLR 2019 • Oleksandr Shchur, Stephan Günnemann
Community detection is a fundamental problem in machine learning.
3 code implementations • ICLR 2020 • Oleksandr Shchur, Marin Biloš, Stephan Günnemann
The standard way of learning in such models is by estimating the conditional intensity function.
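To spell out what estimating the conditional intensity function entails, the sketch below evaluates the standard point-process negative log-likelihood, -Σᵢ log λ(tᵢ) + ∫₀ᵀ λ(t) dt, for an arbitrary intensity function, approximating the integral with a midpoint rule; the constant-intensity example and grid size are illustrative assumptions.

```python
import numpy as np

def tpp_nll(event_times, intensity, t_end, n_grid=1000):
    """Negative log-likelihood of a temporal point process observed on [0, t_end]:
    NLL = -sum_i log(lambda(t_i)) + integral_0^t_end lambda(t) dt,
    with the integral (the compensator) approximated by a midpoint rule."""
    event_times = np.asarray(event_times, dtype=float)
    dt = t_end / n_grid
    grid = (np.arange(n_grid) + 0.5) * dt
    compensator = np.sum(intensity(grid)) * dt
    return -np.sum(np.log(intensity(event_times))) + compensator

# constant intensity lambda(t) = 2 as the simplest instance of the standard approach
constant_rate = lambda t: 2.0 * np.ones_like(t)
events = [0.3, 1.1, 2.4, 2.9]
print(tpp_nll(events, constant_rate, t_end=3.0))
```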
1 code implementation • 28 Jun 2019 • Daniel Zügner, Stephan Günnemann
Recent works show that Graph Neural Networks (GNNs) are highly non-robust with respect to adversarial attacks on both the graph structure and the node attributes, making their outcomes unreliable.
Ranked #20 on Node Classification on Pubmed
no code implementations • ICLR 2019 • Aleksandar Bojchevski, Stephan Günnemann
The goal of network representation learning is to learn low-dimensional node embeddings that capture the graph structure and are useful for solving downstream tasks.
1 code implementation • ICLR 2019 • Daniel Zügner, Stephan Günnemann
Deep learning models for graphs have advanced the state of the art on many tasks.
2 code implementations • 14 Nov 2018 • Oleksandr Shchur, Maximilian Mumme, Aleksandar Bojchevski, Stephan Günnemann
We perform a thorough empirical evaluation of four prominent GNN models and show that considering different splits of the data leads to dramatically different rankings of models.
no code implementations • 11 Nov 2018 • Richard Kurle, Stephan Günnemann, Patrick van der Smagt
Learning from multiple sources of information is an important problem in machine-learning research.
1 code implementation • NeurIPS 2019 • Stephan Rabanser, Stephan Günnemann, Zachary C. Lipton
We might hope that when faced with unexpected inputs, well-designed software systems would fire off warnings.
5 code implementations • ICLR 2019 • Johannes Gasteiger, Aleksandar Bojchevski, Stephan Günnemann
We utilize this propagation procedure to construct a simple model, personalized propagation of neural predictions (PPNP), and its fast approximation, APPNP; see the sketch below.
Ranked #1 on Node Classification on MS ACADEMIC
General Classification, Node Classification on Non-Homophilic (Heterophilic) Graphs
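A minimal NumPy sketch of the APPNP propagation step: per-node predictions from a neural network are repeatedly diffused over the symmetrically normalized adjacency matrix with self-loops while a fraction α of the original prediction is mixed back in. The toy graph, logits, teleport probability, and iteration count are illustrative values.

```python
import numpy as np

def appnp_propagate(adj, local_preds, alpha=0.1, n_iters=10):
    """APPNP propagation: start from per-node predictions H of a neural network
    and iterate Z <- (1 - alpha) * A_hat @ Z + alpha * H, where A_hat is the
    symmetrically normalized adjacency matrix with self-loops."""
    a_tilde = adj + np.eye(adj.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(a_tilde.sum(axis=1))
    a_hat = d_inv_sqrt[:, None] * a_tilde * d_inv_sqrt[None, :]
    z = local_preds.copy()
    for _ in range(n_iters):
        z = (1 - alpha) * a_hat @ z + alpha * local_preds
    return z

adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
local_preds = np.array([[2.0, 0.0], [0.5, 0.5], [0.0, 2.0]])  # e.g. MLP logits per node
print(appnp_propagate(adj, local_preds).round(3))
```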
no code implementations • 3 Oct 2018 • Roberto Alonso, Stephan Günnemann
Mining dense quasi-cliques is a well-known clustering task with applications ranging from social networks over collaboration graphs to document analysis.
1 code implementation • ICLR 2019 • Aleksandar Bojchevski, Stephan Günnemann
The goal of network representation learning is to learn low-dimensional node embeddings that capture the graph structure and are useful for solving downstream tasks.
no code implementations • 3 Jun 2018 • Federico Monti, Oleksandr Shchur, Aleksandar Bojchevski, Or Litany, Stephan Günnemann, Michael M. Bronstein
In recent years, there has been a surge of interest in developing deep learning methods for non-Euclidean structured data such as graphs.
1 code implementation • 21 May 2018 • Daniel Zügner, Amir Akbarnejad, Stephan Günnemann
Even more, our attacks are transferable: the learned attacks generalize to other state-of-the-art node classification models and unsupervised approaches, and likewise are successful even when only limited knowledge about the graph is given.
1 code implementation • ICML 2018 • Aleksandar Bojchevski, Oleksandr Shchur, Daniel Zügner, Stephan Günnemann
NetGAN is able to produce graphs that exhibit well-known network patterns without explicitly specifying them in the model definition.
no code implementations • ICLR 2018 • Aleksandar Bojchevski, Oleksandr Shchur, Daniel Zügner, Stephan Günnemann
Moreover, GraphGAN learns a semantic mapping from the latent input space to the generated graph's properties.
1 code implementation • 29 Nov 2017 • Stephan Rabanser, Oleksandr Shchur, Stephan Günnemann
Tensors are multidimensional arrays of numerical values and therefore generalize matrices to multiple dimensions.
1 code implementation • ICLR 2018 • Aleksandar Bojchevski, Stephan Günnemann
We propose Graph2Gauss - an approach that can efficiently learn versatile node embeddings on large scale (attributed) graphs that show strong performance on tasks such as link prediction and node classification.
1 code implementation • 27 Jun 2014 • Wolfgang Gatterbauer, Stephan Günnemann, Danai Koutra, Christos Faloutsos
Often, we can answer such questions and label nodes in a network based on the labels of their neighbors and appropriate assumptions of homophily ("birds of a feather flock together") or heterophily ("opposites attract").
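A simplified sketch of this kind of propagation (not the paper's exact linearized formulation): each node repeatedly mixes its prior with its neighbors' beliefs through a compatibility matrix whose sign structure encodes homophily or heterophily. The toy graph, priors, damping factor, and compatibility matrices are illustrative assumptions.

```python
import numpy as np

def propagate_beliefs(adj, prior, H, eps=0.1, n_iters=30):
    """Simplified linearized belief propagation: repeatedly mix each node's prior
    with its neighbors' beliefs through a compatibility matrix H. A diagonal-heavy
    H encodes homophily, an off-diagonal-heavy H encodes heterophily."""
    beliefs = prior.copy()
    for _ in range(n_iters):
        beliefs = prior + eps * adj @ beliefs @ H
    return beliefs

# 4-node path graph, two classes, labels known only for the end nodes
adj = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], dtype=float)
prior = np.array([[1.0, -1.0], [0.0, 0.0], [0.0, 0.0], [-1.0, 1.0]])  # centered priors
homophily = np.array([[1.0, -1.0], [-1.0, 1.0]])     # neighbors tend to share labels
heterophily = -homophily                             # neighbors tend to have opposite labels
print(propagate_beliefs(adj, prior, homophily).round(2))
print(propagate_beliefs(adj, prior, heterophily).round(2))
```

With the homophilous compatibility matrix the unlabeled middle nodes drift towards the class of their nearest labeled endpoint, while the heterophilous matrix makes adjacent nodes prefer opposite classes.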