Graph Property Prediction

47 papers with code • 5 benchmarks • 3 datasets

Graph property prediction is the task of predicting a label or target value for an entire graph (for example, a property of a molecule represented as a graph), as opposed to predicting properties of individual nodes or edges.

Most implemented papers

Generative Adversarial Networks

goodfeli/adversarial NeurIPS 2014

We propose a new framework for estimating generative models via an adversarial process, in which we simultaneously train two models: a generative model G that captures the data distribution, and a discriminative model D that estimates the probability that a sample came from the training data rather than G. The training procedure for G is to maximize the probability of D making a mistake.
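
A useful consequence of this objective (a sketch, not code from the paper's repository): for a fixed generator, the discriminator that best tells real from generated data is `D*(x) = p_data(x) / (p_data(x) + p_g(x))`. The toy densities and function names below are our own illustration, using two known Gaussians so `D*` has a closed form.

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    # Density of a 1D Gaussian N(mu, sigma^2).
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def optimal_discriminator(x, mu_data=0.0, mu_gen=2.0, sigma=1.0):
    # Optimal D for fixed G: ratio of the data density to the mixture.
    # (Hypothetical parameter choices; the identity itself is from GAN theory.)
    p_data = gaussian_pdf(x, mu_data, sigma)
    p_gen = gaussian_pdf(x, mu_gen, sigma)
    return p_data / (p_data + p_gen)

# Near the data mode D* leans "real"; at the midpoint (equal densities) it is
# exactly 0.5, which is also its value everywhere once G matches the data.
d_mid = optimal_discriminator(1.0)
d_data = optimal_discriminator(0.0)
d_gen = optimal_discriminator(2.0)
```

When the generator distribution equals the data distribution, `D*` is 0.5 everywhere — the discriminator can do no better than chance, which is the fixed point the adversarial game drives toward.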

How Attentive are Graph Attention Networks?

tech-srl/how_attentive_are_gats ICLR 2022

Because GATs use a static attention mechanism, there are simple graph problems that GAT cannot express: in a controlled problem, we show that static attention hinders GAT from even fitting the training data.
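
The "static attention" limitation can be demonstrated numerically. In GAT the score is `LeakyReLU(a1·Wh_i + a2·Wh_j)`; since LeakyReLU is monotone, every query node ranks the keys identically. GATv2 moves the nonlinearity inside the scoring function, so rankings can depend on the query. The weights below are random stand-ins, not trained parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 4
H = rng.normal(size=(n, d))      # node features
W = rng.normal(size=(d, d))
a1 = rng.normal(size=d)          # "query" half of the attention vector
a2 = rng.normal(size=d)          # "key" half

def leaky(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

# GAT (static): e_ij = LeakyReLU(a1·Wh_i + a2·Wh_j).
# The i-term is a per-row constant, so argmax_j is the same for every i.
scores_gat = leaky((H @ W @ a1)[:, None] + (H @ W @ a2)[None, :])
best_keys_gat = scores_gat.argmax(axis=1)

# GATv2 (dynamic): e_ij = a · LeakyReLU(W2 [h_i || h_j]) — the nonlinearity
# sits between concatenation and scoring, so rankings may differ per query.
W2 = rng.normal(size=(2 * d, d))
a = rng.normal(size=d)
Hcat = np.concatenate([np.repeat(H, n, axis=0), np.tile(H, (n, 1))], axis=1)
scores_gatv2 = (leaky(Hcat @ W2) @ a).reshape(n, n)
best_keys_v2 = scores_gatv2.argmax(axis=1)
```

`best_keys_gat` is provably constant across queries (monotonicity of LeakyReLU), which is exactly why GAT cannot fit problems where different nodes should attend to different neighbors.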

E(n) Equivariant Graph Neural Networks

vgsatorras/egnn 19 Feb 2021

This paper introduces a new model to learn graph neural networks equivariant to rotations, translations, reflections and permutations called E(n)-Equivariant Graph Neural Networks (EGNNs).
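
The EGNN layer achieves equivariance by building messages only from invariants (features and squared distances) and updating coordinates along relative-position vectors. The sketch below uses random matrices in place of the paper's learned MLPs `phi_e`, `phi_x`, `phi_h` (names from the paper; the weights and sizes are our assumptions) and checks equivariance under a random rotation plus translation.

```python
import numpy as np

rng = np.random.default_rng(1)
n, dh = 6, 8
h = rng.normal(size=(n, dh))     # invariant node features
x = rng.normal(size=(n, 3))      # 3D coordinates

# Hypothetical linear stand-ins for the learned MLPs.
We = rng.normal(size=(2 * dh + 1, dh)) * 0.1
Wx = rng.normal(size=(dh, 1)) * 0.1
Wh = rng.normal(size=(2 * dh, dh)) * 0.1

def egnn_layer(h, x):
    n = h.shape[0]
    diff = x[:, None, :] - x[None, :, :]           # x_i - x_j (equivariant)
    dist2 = (diff ** 2).sum(-1, keepdims=True)     # ||x_i - x_j||^2 (invariant)
    pair = np.concatenate(
        [np.repeat(h[:, None], n, 1), np.repeat(h[None, :], n, 0), dist2], axis=-1)
    m = np.tanh(pair @ We)                         # messages m_ij (invariant)
    mask = 1.0 - np.eye(n)[..., None]              # drop self-messages
    x_new = x + (diff * (m @ Wx) * mask).sum(axis=1)   # equivariant update
    h_new = np.tanh(np.concatenate([h, (m * mask).sum(axis=1)], axis=-1) @ Wh)
    return h_new, x_new

# Apply a random orthogonal transform Q and translation t to the inputs.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
t = rng.normal(size=3)
h1, x1 = egnn_layer(h, x)
h2, x2 = egnn_layer(h, x @ Q.T + t)
```

Because messages depend on coordinates only through distances, `h2` matches `h1` exactly, and `x2` equals `x1` transformed by the same `Q` and `t` — the defining E(n)-equivariance property.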

Do Transformers Really Perform Bad for Graph Representation?

Microsoft/Graphormer 9 Jun 2021

Our key insight to utilizing Transformer in the graph is the necessity of effectively encoding the structural information of a graph into the model.
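
One concrete instance of this idea is Graphormer's spatial encoding: a bias indexed by shortest-path distance is added to the raw attention logits, so attention between two nodes depends on their graph-structural relation. The sketch below (our simplification; the bias values are random stand-ins for learned parameters) computes all-pairs shortest-path distances by BFS and injects them into a softmax attention matrix.

```python
import numpy as np
from collections import deque

def spd_matrix(adj):
    # All-pairs shortest-path distances on an unweighted graph via BFS.
    n = len(adj)
    D = np.full((n, n), n, dtype=int)  # n acts as an "unreachable" sentinel
    for s in range(n):
        D[s, s] = 0
        q = deque([s])
        while q:
            u = q.popleft()
            for v in range(n):
                if adj[u][v] and D[s, v] > D[s, u] + 1:
                    D[s, v] = D[s, u] + 1
                    q.append(v)
    return D

# 4-node path graph 0-1-2-3.
adj = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]])
D = spd_matrix(adj)

# Hypothetical learnable bias b[d], one scalar per distance, added to the
# attention logits before the softmax — the simplified Graphormer idea.
rng = np.random.default_rng(0)
bias = rng.normal(size=5)
logits = rng.normal(size=(4, 4)) + bias[D]
attn = np.exp(logits - logits.max(axis=1, keepdims=True))
attn = attn / attn.sum(axis=1, keepdims=True)
```

Because the bias is shared across all node pairs at the same distance, the model can learn, for example, to up-weight immediate neighbors or down-weight distant nodes without any change to the Transformer backbone.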

Large-scale Robust Deep AUC Maximization: A New Surrogate Loss and Empirical Studies on Medical Image Classification

Optimization-AI/LibAUC ICCV 2021

Our studies demonstrate that the proposed DAM method improves the performance of optimizing cross-entropy loss by a large margin, and also achieves better performance than optimizing the existing AUC square loss on these medical image classification tasks.
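
The "AUC square loss" mentioned here is a pairwise surrogate: for every (positive, negative) pair it penalizes the squared deviation of the score margin from 1. The function below is our simplified stand-in for the surrogate the paper optimizes at scale (the real DAM method adds a min-max reformulation for robustness and efficiency).

```python
import numpy as np

def auc_square_loss(scores, labels):
    # Mean over all (positive, negative) pairs of (1 - (s_pos - s_neg))^2.
    # Simplified illustration; not the paper's full min-max formulation.
    s = np.asarray(scores, dtype=float)
    y = np.asarray(labels)
    pos, neg = s[y == 1], s[y == 0]
    margins = pos[:, None] - neg[None, :]
    return ((1.0 - margins) ** 2).mean()

# Well-separated scores (every positive beats every negative by ~1) incur
# near-zero loss; reversed scores incur a large loss.
good = auc_square_loss([2.0, 2.1, 1.0, 1.1], [1, 1, 0, 0])
bad = auc_square_loss([1.0, 1.1, 2.0, 2.1], [1, 1, 0, 0])
```

Unlike cross-entropy, this loss depends only on relative ranking of positives versus negatives, which is why it aligns better with AUC on heavily imbalanced medical-imaging labels.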

Recipe for a General, Powerful, Scalable Graph Transformer

rampasek/GraphGPS 25 May 2022

We propose a recipe for building a general, powerful, scalable (GPS) graph Transformer with linear complexity and state-of-the-art results on a diverse set of benchmarks.
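
The core of the recipe is a hybrid layer: a local message-passing branch and a global attention branch run in parallel and are combined with a residual connection. The sketch below uses full softmax attention for the global branch (GraphGPS can substitute a linear-complexity attention there); all weights and sizes are our illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 6, 8
H = rng.normal(size=(n, d))                 # node features
A = (rng.random((n, n)) < 0.3).astype(float)
A = np.triu(A, 1)
A = A + A.T                                 # symmetric adjacency, no self-loops

Wm = rng.normal(size=(d, d)) * 0.1          # hypothetical branch weights
Wq = rng.normal(size=(d, d)) * 0.1
Wk = rng.normal(size=(d, d)) * 0.1
Wv = rng.normal(size=(d, d)) * 0.1

def mpnn(H, A):
    # Local branch: mean aggregation over graph neighbours.
    deg = np.maximum(A.sum(axis=1, keepdims=True), 1.0)
    return np.tanh(((A @ H) / deg) @ Wm)

def global_attention(H):
    # Global branch: softmax self-attention over all node pairs.
    S = (H @ Wq) @ (H @ Wk).T / np.sqrt(H.shape[1])
    P = np.exp(S - S.max(axis=1, keepdims=True))
    P = P / P.sum(axis=1, keepdims=True)
    return P @ (H @ Wv)

def gps_layer(H, A):
    # Sum the two branches with a residual connection.
    return H + mpnn(H, A) + global_attention(H)

out = gps_layer(H, A)
```

Keeping the two branches separate is what makes the recipe modular: the local module supplies graph inductive bias while the global module provides long-range information flow, and either can be swapped independently.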

Equiformer: Equivariant Graph Attention Transformer for 3D Atomistic Graphs

atomicarchitects/equiformer 23 Jun 2022

Despite their widespread success in various domains, Transformer networks have yet to perform well across datasets in the domain of 3D atomistic graphs such as molecules even when 3D-related inductive biases like translational invariance and rotational equivariance are considered.

DeeperGCN: All You Need to Train Deeper GCNs

dmlc/dgl 13 Jun 2020

Graph Convolutional Networks (GCNs) have been drawing significant attention with the power of representation learning on graphs.

Global Self-Attention as a Replacement for Graph Convolution

shamim-hussain/egt_pytorch 7 Aug 2021

The resultant framework - which we call Edge-augmented Graph Transformer (EGT) - can directly accept, process and output structural information of arbitrary form, which is important for effective learning on graph-structured data.

Triplet Interaction Improves Graph Transformers: Accurate Molecular Graph Learning with Triplet Graph Transformers

shamim-hussain/tgt 7 Feb 2024

We also obtain SOTA results on QM9, MOLPCBA, and LIT-PCBA molecular property prediction benchmarks via transfer learning.