We use these metrics to compare GraphRNN and GRAN, two well-known generative models for graphs, and unveil the influence of node orderings.
deepstruct connects deep learning models with graph theory, so that different graph structures can be imposed on neural networks, or graph structures can be extracted from trained neural network models.
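One direction of this connection can be illustrated generically: a pruned weight matrix induces a bipartite graph between consecutive layers. The sketch below is a minimal, hypothetical illustration in NumPy (it does not use deepstruct's actual API); the function name `weights_to_adjacency` and the threshold convention are assumptions for demonstration.

```python
import numpy as np

def weights_to_adjacency(weight, threshold=0.0):
    """Interpret a (pruned) weight matrix as a bipartite adjacency matrix:
    an edge connects input unit j to output unit i iff |w[i, j]| > threshold.
    This is a generic illustration, not deepstruct's API."""
    return (np.abs(weight) > threshold).astype(int)

# A 3x4 weight matrix in which some entries have been pruned to zero.
w = np.array([
    [0.5, 0.0, -0.2, 0.0],
    [0.0, 0.0,  0.0, 0.1],
    [0.3, 0.7,  0.0, 0.0],
])
adj = weights_to_adjacency(w)
print(adj)        # 1 where a connection survives, 0 where it was pruned
print(adj.sum())  # number of remaining edges: 5
```

The resulting 0/1 matrix can be handed to any graph library for structural analysis, which is the kind of extraction step the tool automates.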
Sparsity in the structure of neural networks can lead to lower energy consumption, lower memory usage, faster computation on suitable hardware, and automated machine learning.
Learning distributions of graphs can be used for automatic drug discovery, molecular design, complex network analysis, and much more.
Sparse neural networks have regained attention due to their potential mathematical and computational advantages.
In previous work, heuristics that use a neuron's weight distribution as a contribution measure have shown some success, but they lack a proper theoretical foundation.
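A common instance of such a heuristic is magnitude-based pruning: entries with the smallest absolute weights are scored as contributing least and are removed. The sketch below is a generic illustration of that idea, not the specific method from any of the papers above; the function name and the pruning fraction are assumptions.

```python
import numpy as np

def magnitude_prune(weight, fraction=0.5):
    """Zero out the `fraction` of entries with the smallest magnitudes.
    Illustrates a weight-distribution heuristic; hypothetical, not any
    paper's exact method."""
    k = int(weight.size * fraction)
    if k == 0:
        return weight.copy()
    # Threshold at the k-th smallest absolute value across all entries.
    threshold = np.sort(np.abs(weight), axis=None)[k - 1]
    mask = np.abs(weight) > threshold
    return weight * mask

w = np.array([[0.9, -0.05],
              [0.02, -0.6]])
print(magnitude_prune(w, fraction=0.5))  # keeps 0.9 and -0.6, zeros the rest
```

Heuristics like this are cheap and often effective in practice, which is precisely why a theoretical account of when and why they work is of interest.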