Search Results for author: Lukas Faber

Found 9 papers, 2 papers with code

Asynchronous Neural Networks for Learning in Graphs

no code implementations • 24 May 2022 • Lukas Faber, Roger Wattenhofer

This paper studies asynchronous message passing (AMP), a new paradigm for applying neural-network-based learning to graphs.

Distributed Computing • Graph Classification

DropGNN: Random Dropouts Increase the Expressiveness of Graph Neural Networks

1 code implementation • NeurIPS 2021 • Pál András Papp, Karolis Martinkus, Lukas Faber, Roger Wattenhofer

In DropGNNs, we execute multiple runs of a GNN on the input graph, with some of the nodes randomly and independently dropped in each of these runs (a rough sketch of this run-and-aggregate scheme follows the task tags below).

Graph Classification • Graph Regression
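
A minimal PyTorch sketch of the run-and-aggregate idea described above. The mean-aggregation layer, the drop probability p_drop, and the number of runs are illustrative assumptions, not the authors' released implementation (that code is linked from the paper entry):

    # Illustrative sketch of the DropGNN idea, not the authors' code:
    # run the same GNN several times, independently dropping nodes at
    # random in each run, then average the per-run node embeddings.
    import torch

    def gnn_layer(x, adj, weight):
        # Mean-aggregation message passing: h_v = ReLU(W * mean of neighbour features)
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        return torch.relu((adj @ x / deg) @ weight)

    def dropgnn_forward(x, adj, weight, num_runs=10, p_drop=0.1):
        outputs = []
        for _ in range(num_runs):
            # Drop each node independently with probability p_drop
            keep = (torch.rand(x.size(0), 1) > p_drop).float()
            x_run = x * keep                  # zero out features of dropped nodes
            adj_run = adj * keep * keep.t()   # remove edges touching dropped nodes
            outputs.append(gnn_layer(x_run, adj_run, weight))
        # Aggregate the runs, here by averaging the node embeddings
        return torch.stack(outputs).mean(dim=0)

    # Toy usage: a 4-node cycle graph with 3-dimensional node features
    adj = torch.tensor([[0., 1., 0., 1.],
                        [1., 0., 1., 0.],
                        [0., 1., 0., 1.],
                        [1., 0., 1., 0.]])
    x = torch.randn(4, 3)
    weight = torch.randn(3, 8)
    print(dropgnn_forward(x, adj, weight).shape)  # torch.Size([4, 8])

Averaging is only one way to combine the runs; the point of the multiple randomized runs, per the paper's title, is that the random dropouts increase the expressiveness of the underlying GNN.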

Contrastive Graph Neural Network Explanation

1 code implementation • 26 Oct 2020 • Lukas Faber, Amin K. Moghaddam, Roger Wattenhofer

Graph Neural Networks achieve remarkable results on problems with structured data, but they act as black-box predictors.

Learning Lower Bounds for Graph Exploration With Reinforcement Learning

no code implementations • NeurIPS Workshop LMCA 2020 • Jorel Elmiger, Lukas Faber, Pankaj Khanchandani, Oliver Paul Richter, Roger Wattenhofer

Since a graph has quadratically many possible edges and every subset of edges is a possible solution, the search space becomes infeasibly large even for small numbers of nodes (a back-of-the-envelope count follows the task tags below).

Reinforcement Learning (RL)
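
As a rough illustration of the scale implied above (the concrete numbers are our own, not taken from the paper):

    # Back-of-the-envelope size of the edge-subset search space for n = 10 nodes
    n = 10
    num_edges = n * (n - 1) // 2      # 45 possible edges
    num_subsets = 2 ** num_edges      # 35,184,372,088,832 candidate edge sets
    print(num_edges, num_subsets)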

Medley2K: A Dataset of Medley Transitions

no code implementations • 25 Aug 2020 • Lukas Faber, Sandro Luck, Damian Pascual, Andreas Roth, Gino Brunner, Roger Wattenhofer

The automatic generation of medleys, i.e., musical pieces formed by concatenating different songs via smooth transitions, is not well studied in the current literature.

Neural Status Registers

no code implementations • 15 Apr 2020 • Lukas Faber, Roger Wattenhofer

Standard neural networks can learn mathematical operations, but they do not extrapolate beyond the range of values seen during training.
