We propose to theoretically and empirically examine the effect of incorporating weighting schemes into walk-aggregating GNNs. To this end, we propose a simple, interpretable, and end-to-end supervised GNN model, called AWARE (Attentive Walk-Aggregating GRaph Neural NEtwork), for graph-level prediction. AWARE aggregates walk information by means of weighting schemes at distinct levels (vertex-, walk-, and graph-level) in a principled manner. By virtue of the weighting schemes incorporated at these different levels, AWARE can emphasize information important for prediction while downweighting irrelevant information—leading to representations that can improve learning performance.
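The multi-level weighting idea can be illustrated with a minimal sketch. This is not the paper's exact formulation: the walk enumeration, the mean-pooled walk embedding, the attention vectors `a_v` and `a_w`, and the sum readout are all simplifying assumptions made here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def enumerate_walks(adj, steps):
    """All walks with the given number of steps, as lists of vertex indices."""
    n = adj.shape[0]
    walks = [[v] for v in range(n)]
    for _ in range(steps):
        walks = [w + [u] for w in walks for u in range(n) if adj[w[-1], u]]
    return walks

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attentive_walk_aggregation(adj, X, steps=2):
    """Hypothetical attentive walk aggregation in the spirit of AWARE:
    vertex-level weights rescale node features, walk-level weights rescale
    walk embeddings, and a graph-level sum produces one graph vector."""
    d = X.shape[1]
    # Hypothetical learned attention parameters, randomly initialized here.
    a_v = rng.normal(size=d)  # vertex-level attention vector
    a_w = rng.normal(size=d)  # walk-level attention vector

    # Vertex-level weighting: one softmax weight per vertex.
    vertex_w = softmax(X @ a_v)
    H = X * vertex_w[:, None]

    # Walk embeddings: mean of the reweighted vertex features along each walk.
    walks = enumerate_walks(adj, steps)
    E = np.stack([H[w].mean(axis=0) for w in walks])

    # Walk-level weighting, then graph-level sum readout.
    walk_w = softmax(E @ a_w)
    return (E * walk_w[:, None]).sum(axis=0)

# Toy graph: a triangle with 4-dimensional node features.
adj = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]])
X = rng.normal(size=(3, 4))
g = attentive_walk_aggregation(adj, X)
print(g.shape)  # (4,)
```

The graph embedding `g` could then feed a downstream predictor; in a trained model the attention vectors would be learned end-to-end with the prediction loss rather than drawn at random.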
Task | Papers | Share |
---|---|---|
Language Modelling | 30 | 3.68% |
Language Modeling | 21 | 2.58% |
Quantization | 18 | 2.21% |
Retrieval | 17 | 2.09% |
Semantic Segmentation | 17 | 2.09% |
Autonomous Driving | 15 | 1.84% |
Diversity | 14 | 1.72% |
Decision Making | 14 | 1.72% |
Large Language Model | 13 | 1.60% |