Graph Attention
409 papers with code • 0 benchmarks • 1 dataset
Most implemented papers
Graph Attention Networks
We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior methods based on graph convolutions or their approximations.
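A minimal NumPy sketch of one masked self-attention head in the GAT style: pairwise scores are computed with a shared linear map and attention vector, restricted to each node's neighbourhood, and normalized with a softmax. This is a single head with no multi-head concatenation, and the function name is illustrative, not from the paper's code.

```python
import numpy as np

def gat_layer(H, A, W, a, alpha=0.2):
    """One masked self-attention head in the GAT style (illustrative sketch).

    H: (N, F) node features; A: (N, N) adjacency (1 = edge; include self-loops);
    W: (F, Fp) shared linear map; a: (2*Fp,) attention vector.
    """
    Z = H @ W                                   # (N, Fp) transformed features
    Fp = Z.shape[1]
    # e_ij = LeakyReLU(a^T [W h_i || W h_j]) decomposes into a source
    # term and a destination term before the nonlinearity:
    src = Z @ a[:Fp]                            # (N,) contribution of node i
    dst = Z @ a[Fp:]                            # (N,) contribution of node j
    e = src[:, None] + dst[None, :]             # (N, N) raw scores
    e = np.where(e > 0, e, alpha * e)           # LeakyReLU
    # Masking: attend only over each node's neighbourhood
    e = np.where(A > 0, e, -np.inf)
    # Softmax over neighbours, then aggregate neighbour features
    att = np.exp(e - e.max(axis=1, keepdims=True))
    att = att / att.sum(axis=1, keepdims=True)
    return att @ Z                              # (N, Fp) output features
```

Because the mask sets non-edges to negative infinity before the softmax, each node attends only to its neighbours, which is what makes the layer applicable to arbitrary graph structure.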
GraphSAINT: Graph Sampling Based Inductive Learning Method
Graph Convolutional Networks (GCNs) are powerful models for learning representations of attributed graphs.
How Attentive are Graph Attention Networks?
Because GATs use a static attention mechanism, there are simple graph problems that GAT cannot express: in a controlled problem, we show that static attention hinders GAT from even fitting the training data.
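The static/dynamic distinction can be seen directly in the score formulas. In GAT the attention vector is applied before the nonlinearity, so scores decompose into a query term plus a key term and every node ranks its neighbours identically; GATv2 applies the nonlinearity first, letting the ranking depend on the query. A NumPy sketch under those formulas (function names are illustrative):

```python
import numpy as np

def leaky_relu(x, alpha=0.2):
    return np.where(x > 0, x, alpha * x)

def gat_scores(Z, a_src, a_dst):
    # Original GAT: e_ij = LeakyReLU(a_src . z_i + a_dst . z_j).
    # The attention vector acts *before* the nonlinearity, so the score
    # splits into a per-query term plus a per-key term (static attention).
    return leaky_relu((Z @ a_src)[:, None] + (Z @ a_dst)[None, :])

def gatv2_scores(Z, a):
    # GATv2: e_ij = a . LeakyReLU(z_i + z_j).
    # The nonlinearity acts first, so which key scores highest can
    # differ per query (dynamic attention).
    return leaky_relu(Z[:, None, :] + Z[None, :, :]) @ a
```

In `gat_scores`, since LeakyReLU is strictly increasing, the highest-scoring key `j` is the one maximizing `a_dst . z_j`, independent of the query `i`; that query-independence is exactly the limitation the paper identifies.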
Representation Learning on Graphs with Jumping Knowledge Networks
Furthermore, combining the JK framework with models like Graph Convolutional Networks, GraphSAGE and Graph Attention Networks consistently improves those models' performance.
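The JK idea is a final aggregation over the node representations produced at every layer, rather than using only the last layer's output. A small sketch of two of the aggregators discussed in the paper (concatenation and element-wise max pooling), assuming the per-layer representations are already computed:

```python
import numpy as np

def jk_aggregate(layer_outputs, mode="concat"):
    """Combine per-layer node representations, Jumping-Knowledge style.

    layer_outputs: list of (N, F) arrays, one per GNN layer.
    """
    if mode == "concat":
        # Keep every layer's view of each node: (N, L*F)
        return np.concatenate(layer_outputs, axis=1)
    if mode == "max":
        # Element-wise max across layers, an adaptive depth per feature: (N, F)
        return np.stack(layer_outputs).max(axis=0)
    raise ValueError(f"unknown mode: {mode}")
```

Because the aggregator sits on top of the stacked layers, it can be attached to GCN, GraphSAGE, or GAT backbones without changing the layers themselves, which is how the paper combines JK with those models.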
Graph Neural Networks: A Review of Methods and Applications
Many learning tasks require dealing with graph data, which contains rich relational information among elements.
Inductive Representation Learning on Temporal Graphs
Moreover, node and topological features can themselves be temporal, and node embeddings should capture their patterns as well.
Visual-Semantic Graph Attention Networks for Human-Object Interaction Detection
Few works have studied the disambiguating contribution of subsidiary relations made available via graph networks.
Graph Neural Network for Traffic Forecasting: A Survey
In recent years, to model the graph structures in transportation systems as well as contextual information, graph neural networks have been introduced and have achieved state-of-the-art performance in a series of traffic forecasting problems.
Spatial Graph Attention and Curiosity-driven Policy for Antiviral Drug Discovery
We developed Distilled Graph Attention Policy Network (DGAPN), a reinforcement learning model to generate novel graph-structured chemical representations that optimize user-defined objectives by efficiently navigating a physically constrained domain.
Equiformer: Equivariant Graph Attention Transformer for 3D Atomistic Graphs
Despite their widespread success in various domains, Transformer networks have yet to perform well across datasets in the domain of 3D atomistic graphs such as molecules, even when 3D-related inductive biases like translational invariance and rotational equivariance are considered.