
1 code implementation • 13 Jul 2023 • Ousmane Touat, Julian Stier, Pierre-Edouard Portier, Michael Granitzer

We use these metrics to compare GraphRNN and GRAN, two well-known generative models for graphs, and unveil the influence of node orderings.
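Sequence-based generators such as GraphRNN and GRAN consume a graph one node at a time, so the chosen node ordering changes the training data they see. A minimal sketch of why (the helper below is illustrative, not code from the paper): the same undirected graph produces different adjacency sequences under different orderings.

```python
def adjacency_sequence(edges, order):
    """For each node (taken in `order`), emit a 0/1 vector marking which
    earlier nodes it connects to -- the flattened adjacency representation
    that sequential graph generators consume."""
    edge_set = {frozenset(e) for e in edges}
    seq = []
    for i, node in enumerate(order):
        seq.append([1 if frozenset((node, order[j])) in edge_set else 0
                    for j in range(i)])
    return seq

# The path graph 0-1-2 under two orderings yields two distinct sequences,
# so a model trained on one ordering sees different data than on another.
path = [(0, 1), (1, 2)]
seq_a = adjacency_sequence(path, [0, 1, 2])  # [[], [1], [0, 1]]
seq_b = adjacency_sequence(path, [1, 0, 2])  # [[], [1], [1, 0]]
```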

no code implementations • 12 Nov 2021 • Julian Stier, Michael Granitzer

deepstruct connects deep learning models and graph theory such that different graph structures can be imposed on neural networks or graph structures can be extracted from trained neural network models.
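The extraction direction can be sketched as follows. This is a hedged illustration of the underlying idea, not deepstruct's actual API: neurons become nodes, and a weight whose magnitude exceeds a threshold becomes an edge.

```python
def extract_graph(layers, threshold=0.01):
    """Illustrative sketch (not deepstruct's API): turn a trained MLP into
    a graph. `layers` is a list of weight matrices, where layers[l][j][i]
    connects input neuron i of layer l to its output neuron j. An edge is
    kept whenever the absolute weight exceeds `threshold`."""
    # Assign consecutive ids: inputs of layer 0 first, then each layer's outputs.
    sizes = [len(layers[0][0])] + [len(w) for w in layers]
    offsets = [0]
    for s in sizes:
        offsets.append(offsets[-1] + s)
    nodes = list(range(offsets[-1]))
    edges = []
    for l, w in enumerate(layers):
        for j, row in enumerate(w):
            for i, weight in enumerate(row):
                if abs(weight) > threshold:
                    edges.append((offsets[l] + i, offsets[l + 1] + j))
    return nodes, edges
```

Pruned (near-zero) weights simply produce no edge, so the sparser the trained network, the sparser the extracted graph.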

1 code implementation • 27 Jul 2021 • Julian Stier, Harshil Darji, Michael Granitzer

Sparsity in the structure of Neural Networks can lead to less energy consumption, less memory usage, faster computation on suitable hardware, and automated machine learning.
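The memory and compute savings come from skipping zero weights entirely. A minimal sketch (coordinate-list storage, an illustrative choice rather than anything from the paper): a mostly-zero weight matrix can be stored and multiplied while touching only its nonzero entries.

```python
def to_sparse(dense):
    """Store only the nonzero entries of a dense matrix (list of rows)
    as (row, col, value) triples, saving memory when most weights are 0."""
    return [(r, c, v) for r, row in enumerate(dense)
            for c, v in enumerate(row) if v != 0.0]

def sparse_matvec(triples, x, n_rows):
    """Multiply the sparse matrix by vector x; zero weights cost nothing."""
    y = [0.0] * n_rows
    for r, c, v in triples:
        y[r] += v * x[c]
    return y
```

On real hardware the same principle is what formats like CSR exploit, provided the sparsity pattern is regular enough for the accelerator.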

1 code implementation • 7 Jun 2020 • Julian Stier, Michael Granitzer

Learning distributions of graphs can be used for automatic drug discovery, molecular design, complex network analysis, and much more.

Ranked #1 on Graph Embedding on Barabasi-Albert

no code implementations • 16 Oct 2019 • Julian Stier, Michael Granitzer

Sparse Neural Networks regained attention due to their potential mathematical and computational advantages.

Ranked #1 on Neural Architecture Search on MNIST

no code implementations • 17 Apr 2019 • Julian Stier, Gabriele Gianini, Michael Granitzer, Konstantin Ziegler

In previous work, heuristics that use the weight distribution of a neuron as a contribution measure have shown some success, but they lack a proper theoretical grounding.
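One such heuristic can be sketched as follows. The scoring function here is an illustrative choice (an L1 proxy over outgoing weights), not the paper's exact measure: neurons whose outgoing weights are small in magnitude are assumed to contribute little and are pruned first.

```python
def neuron_scores(outgoing):
    """outgoing[i] holds the weights leaving neuron i; the contribution
    score is the sum of their absolute values (an L1 magnitude proxy)."""
    return [sum(abs(w) for w in ws) for ws in outgoing]

def prune(outgoing, keep):
    """Return the indices of the `keep` highest-scoring neurons;
    the rest would be removed from the layer."""
    scores = neuron_scores(outgoing)
    ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return sorted(ranked[:keep])
```

Such magnitude-based rules are cheap to compute after training, which explains their popularity despite the missing theoretical justification the paper points to.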

Ranked #1 on Network Pruning on MNIST
