1 code implementation • 5 Feb 2024 • Nicolò Penzo, Antonio Longa, Bruno Lepri, Sara Tonelli, Marco Guerini
We also experiment with different amounts of training data and analyse the topology of local discussion networks in a privacy-compliant way.
1 code implementation • 29 Sep 2023 • Francesco Ferrini, Antonio Longa, Andrea Passerini, Manfred Jaeger
Existing multi-relational graph neural networks use one of two strategies for identifying informative relations: either they reduce this problem to low-level weight learning, or they rely on handcrafted chains of relational dependencies, called meta-paths.
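To make the meta-path notion concrete: a meta-path is a chain of relation types, and its instances are node sequences that follow that chain. The toy graph and helper names below (`edges`, `meta_path_instances`) are purely illustrative and not from the paper.

```python
# A toy multi-relational graph as (head, relation, tail) triples.
edges = [
    ("alice", "writes", "paper1"),
    ("bob", "writes", "paper2"),
    ("paper1", "published_in", "ICML"),
    ("paper2", "published_in", "NeurIPS"),
]

def follow(node, relation):
    """Return all neighbours of `node` reachable via `relation`."""
    return [t for h, r, t in edges if h == node and r == relation]

def meta_path_instances(start, meta_path):
    """Enumerate node sequences realising a chain of relation types."""
    paths = [[start]]
    for relation in meta_path:
        paths = [p + [n] for p in paths for n in follow(p[-1], relation)]
    return paths

# The meta-path "writes -> published_in" links authors to venues.
print(meta_path_instances("alice", ["writes", "published_in"]))
# → [['alice', 'paper1', 'ICML']]
```

A handcrafted meta-path like this encodes a relational dependency explicitly, which is exactly the kind of prior knowledge the weight-learning alternative tries to avoid requiring.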
no code implementations • 6 Apr 2023 • Peter Samoaa, Linus Aronsson, Antonio Longa, Philipp Leitner, Morteza Haghir Chehreghani
Then, we convert the tree representation of the source code to a Flow Augmented-AST graph (FA-AST) representation.
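As a rough illustration of the idea behind flow augmentation: take the AST's parent-child edges and add flow-style edges on top, here simply "next statement" edges between consecutive statements. The actual FA-AST construction targets Java and uses richer flow edges; everything below is a simplified sketch using Python's stdlib `ast` module.

```python
import ast

def fa_ast_edges(source: str):
    """Collect AST child edges plus 'next statement' flow edges (a toy FA-AST)."""
    tree = ast.parse(source)
    ast_edges, flow_edges = [], []
    for node in ast.walk(tree):
        for child in ast.iter_child_nodes(node):
            ast_edges.append((type(node).__name__, type(child).__name__))
        # Augment with sequential-flow edges inside any statement body.
        body = getattr(node, "body", [])
        if isinstance(body, list):
            for a, b in zip(body, body[1:]):
                flow_edges.append((type(a).__name__, type(b).__name__))
    return ast_edges, flow_edges

src = "x = 1\nif x:\n    y = x + 1\n    print(y)\n"
syntax, flow = fa_ast_edges(src)
print(flow)  # → [('Assign', 'If'), ('Assign', 'Expr')]
```

The resulting graph keeps the syntactic structure (the AST edges) while the added flow edges let a GNN propagate information along execution order rather than only up and down the tree.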
no code implementations • 2 Feb 2023 • Antonio Longa, Veronica Lachi, Gabriele Santin, Monica Bianchini, Bruno Lepri, Pietro Liò, Franco Scarselli, Andrea Passerini
Graph Neural Networks (GNNs) have become the leading paradigm for learning on (static) graph-structured data.
2 code implementations • 27 Oct 2022 • Antonio Longa, Steve Azzolin, Gabriele Santin, Giulia Cencetti, Pietro Liò, Bruno Lepri, Andrea Passerini
Following a fast initial breakthrough in graph-based learning, Graph Neural Networks (GNNs) have reached widespread application in many science and engineering fields, prompting the need for methods to understand their decision process.
1 code implementation • 13 Oct 2022 • Steve Azzolin, Antonio Longa, Pietro Barbiero, Pietro Liò, Andrea Passerini
While instance-level explanation of GNNs is a well-studied problem with plenty of approaches available, providing a global explanation of a GNN's behaviour is much less explored, despite its potential for interpretability and debugging.
no code implementations • 25 Aug 2022 • Hazem Peter Samoaa, Antonio Longa, Mazen Mohamad, Morteza Haghir Chehreghani, Philipp Leitner
TEP-GNN uses flow-augmented ASTs (FA-ASTs) as a graph-based code representation and predicts test execution times with a powerful graph neural network (GNN) deep learning model.
no code implementations • 2 Jul 2022 • Anna Nguyen, Antonio Longa, Massimiliano Luca, Joe Kaul, Gabriel Lopez
State-of-the-art Graph Neural Networks (GNNs) are used to extract information from the Tweet-MLN and make predictions based on the extracted graph features.
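The feature-extraction step can be pictured as message passing over the network. Below is a minimal single GCN-style layer in NumPy, only to illustrate what "extracting graph features" means mechanically; the paper uses standard GNN architectures, and nothing here reflects their exact implementation.

```python
import numpy as np

def gcn_layer(adj, feats, weight):
    """One graph-convolution step: add self-loops, normalise, aggregate, transform."""
    a_hat = adj + np.eye(adj.shape[0])           # self-loops
    d_inv_sqrt = np.diag(a_hat.sum(axis=1) ** -0.5)  # symmetric normalisation
    return np.maximum(d_inv_sqrt @ a_hat @ d_inv_sqrt @ feats @ weight, 0)  # ReLU

adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)  # 3-node path graph
feats = np.eye(3)                  # one-hot node features
weight = np.ones((3, 2))           # toy weight matrix
print(gcn_layer(adj, feats, weight).shape)  # → (3, 2)
```

Each output row is a node embedding that mixes the node's own features with those of its neighbours; stacking such layers widens the neighbourhood each embedding summarises.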
no code implementations • 22 Feb 2022 • Giovanni Mauro, Massimiliano Luca, Antonio Longa, Bruno Lepri, Luca Pappalardo
We conduct extensive experiments on public datasets of bike and taxi rides to show that MoGAN outperforms the classical Gravity and Radiation models regarding the realism of the generated networks.
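For context, the classical Gravity baseline mentioned above estimates the flow between two zones as proportional to their "masses" (e.g. populations) and a decaying power of their distance, T_ij ∝ m_i · m_j / d_ij^β. A minimal sketch, with made-up zone data and constants:

```python
import numpy as np

def gravity_flows(masses, dists, beta=2.0, k=1.0):
    """Pairwise flow estimates T_ij = k * m_i * m_j / d_ij**beta."""
    m = np.asarray(masses, dtype=float)
    flows = k * np.outer(m, m) / np.asarray(dists, dtype=float) ** beta
    np.fill_diagonal(flows, 0.0)   # no self-flows
    return flows

masses = [1000, 500, 2000]                 # toy zone populations
dists = [[1, 2, 5], [2, 1, 4], [5, 4, 1]]  # toy symmetric distance matrix (diagonal ignored)
print(gravity_flows(masses, dists))
```

Unlike this closed-form baseline, a generative model such as MoGAN learns the distribution of whole mobility networks from data, which is what the realism comparison in the experiments evaluates.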