Graph Sampling
41 papers with code • 0 benchmarks • 3 datasets
Training graph neural networks (GNNs) or generating graph embeddings on large graphs typically requires sampling smaller, representative subgraphs.
Benchmarks
These leaderboards are used to track progress in Graph Sampling
Libraries
Use these libraries to find Graph Sampling models and implementations
Most implemented papers
Reducing Large Internet Topologies for Faster Simulations
In this paper, we develop methods to “sample” a small realistic graph from a large real network.
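A minimal sketch of the idea behind sampling a small graph from a large one, using uniform node sampling with induced edges. This is an illustrative baseline, not the method from the paper above; the adjacency-dict representation and function name are assumptions for the example.

```python
import random

def random_node_subgraph(adj, k, seed=0):
    """Sample k nodes uniformly at random and return the induced
    subgraph: the kept nodes plus only the edges among them.
    `adj` is a dict mapping each node to a list of its neighbors."""
    rng = random.Random(seed)
    kept = set(rng.sample(sorted(adj), k))
    # Keep only edges whose both endpoints survived the sampling.
    return {u: [v for v in adj[u] if v in kept] for u in kept}
```

Uniform node sampling is simple but tends to under-sample hubs and fragment the graph, which is why more structure-aware samplers (like the random-walk family below) are often preferred.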
On Random Walk Based Graph Sampling
In this paper, we first present a comprehensive analysis of the drawbacks of three widely used random-walk-based graph sampling algorithms: the re-weighted random walk (RW), Metropolis-Hastings random walk (MH), and maximum-degree random walk (MD) algorithms.
Little Ball of Fur: A Python Library for Graph Sampling
In this paper, we describe Little Ball of Fur a Python library that includes more than twenty graph sampling algorithms.
C-SAW: A Framework for Graph Sampling and Random Walk on GPUs
In this paper, we propose, to the best of our knowledge, the first GPU-based framework for graph sampling/random walk.
Scalable Graph Neural Networks via Bidirectional Propagation
Most notably, GBP can deliver superior performance on a graph with over 60 million nodes and 1.8 billion edges in less than half an hour on a single machine.
Efficient Neural Architecture Search for End-to-end Speech Recognition via Straight-Through Gradients
Using ST gradients to support sub-graph sampling is a core element to achieve efficient NAS beyond DARTS and SNAS.
Generalize a Small Pre-trained Model to Arbitrarily Large TSP Instances
For the traveling salesman problem (TSP), the existing supervised learning based algorithms suffer seriously from the lack of generalization ability.
Efficient Graph Deep Learning in TensorFlow with tf_geometric
We introduce tf_geometric, an efficient and friendly library for graph deep learning, which is compatible with both TensorFlow 1.x and 2.x.
GIST: Distributed Training for Large-Scale Graph Convolutional Networks
The graph convolutional network (GCN) is a go-to solution for machine learning on graphs, but its training is notoriously difficult to scale both in terms of graph size and the number of model parameters.
Pre-Training on Dynamic Graph Neural Networks
This paper proposes a pre-training method on dynamic graph neural networks (PT-DGNN), which uses dynamic attributed graph generation tasks to simultaneously learn the structure, semantics, and evolution features of the graph.