
1 code implementation • 16 Oct 2021 • Domenico Tortorella, Alessio Micheli

Dynamic temporal graphs represent evolving relations between entities, e.g. interactions between social network users or infection spreading.

no code implementations • 14 Jul 2021 • Davide Bacciu, Siranush Akarmazyan, Eric Armengaud, Manlio Bacco, George Bravos, Calogero Calandra, Emanuele Carlini, Antonio Carta, Pietro Cassara, Massimo Coppola, Charalampos Davalas, Patrizio Dazzi, Maria Carmela Degennaro, Daniele Di Sarli, Jürgen Dobaj, Claudio Gallicchio, Sylvain Girbal, Alberto Gotta, Riccardo Groppo, Vincenzo Lomonaco, Georg Macher, Daniele Mazzei, Gabriele Mencagli, Dimitrios Michail, Alessio Micheli, Roberta Peroglio, Salvatore Petroni, Rosaria Potenza, Farank Pourdanesh, Christos Sardianos, Konstantinos Tserpes, Fulvio Tagliabò, Jakob Valtl, Iraklis Varlamis, Omar Veledar

This paper discusses the perspective of the H2020 TEACHING project on the next generation of autonomous applications running in a distributed and highly heterogeneous environment comprising both virtual and physical resources spanning the edge-cloud continuum.

1 code implementation • 20 Apr 2021 • Claudio Gallicchio, Alessio Micheli, Luca Silvestri

Artificial Recurrent Neural Networks are a powerful information processing abstraction, and Reservoir Computing provides an efficient strategy to build robust implementations by projecting external inputs into high dimensional dynamical system trajectories.
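The reservoir idea described above can be sketched concretely: a fixed random recurrent network maps an input sequence into a trajectory of high-dimensional states, with no training of the recurrent weights. This is a minimal illustrative sketch, not the paper's model; the reservoir size, input scaling, and spectral-radius value are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_res, T = 1, 100, 50                  # input dim, reservoir dim, sequence length
W_in = rng.uniform(-0.1, 0.1, (n_res, n_in))  # fixed random input weights
W = rng.uniform(-1.0, 1.0, (n_res, n_res))    # fixed random recurrent weights
W *= 0.9 / max(abs(np.linalg.eigvals(W)))     # rescale spectral radius below 1

def reservoir_states(u):
    """Run the reservoir update x(t) = tanh(W_in u(t) + W x(t-1))."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
        states.append(x)
    return np.stack(states)                   # shape (T, n_res)

u = np.sin(np.linspace(0, 4 * np.pi, T))      # toy input signal
X = reservoir_states(u)                       # high-dimensional state trajectory
```

Only a linear readout on the collected states would be trained; the projection itself stays fixed, which is what makes the approach efficient.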

no code implementations • 10 Apr 2021 • Filippo Maria Bianchi, Claudio Gallicchio, Alessio Micheli

We propose a deep Graph Neural Network (GNN) model that alternates two types of layers.
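As one illustration of alternating two layer types in a deep GNN, the sketch below interleaves a mean-aggregation message-passing layer with a graph-coarsening (pooling) layer. This is a generic sketch under those assumptions, not the specific architecture of the paper.

```python
import numpy as np

def conv_layer(A, X, W):
    # Message passing: each node averages its neighbours (plus itself),
    # then applies a linear map followed by ReLU.
    A_hat = A + np.eye(A.shape[0])
    D_inv = np.diag(1.0 / A_hat.sum(axis=1))
    return np.maximum(D_inv @ A_hat @ X @ W, 0.0)

def pool_layer(A, X, S):
    # Coarsening: merge node groups given by the assignment matrix S
    # (nodes x clusters), producing a smaller graph and its features.
    return S.T @ A @ S, S.T @ X

rng = np.random.default_rng(1)
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)      # toy 4-node graph
X = rng.normal(size=(4, 3))                    # node features
W = rng.normal(size=(3, 3))

H = conv_layer(A, X, W)                        # message passing on 4 nodes
S = np.array([[1, 0], [1, 0], [0, 1], [0, 1]], dtype=float)
A2, H2 = pool_layer(A, H, S)                   # coarsened to 2 super-nodes
```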

1 code implementation • 5 Dec 2020 • Federico Errica, Davide Bacciu, Alessio Micheli

We introduce the Graph Mixture Density Networks, a new family of machine learning models that can fit multimodal output distributions conditioned on graphs of arbitrary topology.
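The core of a mixture density output can be sketched as follows: an embedding vector (here a stand-in for a learned graph-level representation) is mapped to the mixing weights, means, and standard deviations of a Gaussian mixture, which can represent multimodal targets. The weight matrices and dimensions are illustrative assumptions, not the paper's parametrisation.

```python
import numpy as np

def mdn_head(h, W_pi, W_mu, W_sigma):
    # Mixture-density head: from embedding h, produce K mixing weights,
    # means, and std devs of a one-dimensional Gaussian mixture.
    pi = np.exp(W_pi @ h)
    pi /= pi.sum()                              # softmax mixing weights
    mu = W_mu @ h
    sigma = np.exp(W_sigma @ h)                 # positive std devs
    return pi, mu, sigma

def mixture_pdf(y, pi, mu, sigma):
    # Density of the mixture at a scalar point y.
    comp = np.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return float(pi @ comp)

rng = np.random.default_rng(2)
h = rng.normal(size=8)                          # stand-in for a graph embedding
K = 3
pi, mu, sigma = mdn_head(h,
                         rng.normal(size=(K, 8)),
                         rng.normal(size=(K, 8)),
                         rng.normal(size=(K, 8)) * 0.1)
p = mixture_pdf(0.0, pi, mu, sigma)             # mixture density at y = 0
```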

no code implementations • 14 Jul 2020 • Federico Errica, Marco Giulini, Davide Bacciu, Roberto Menichetti, Alessio Micheli, Raffaello Potestio

The method relies on deep graph networks, which provide extreme flexibility in the input format.

no code implementations • 11 May 2020 • Claudio Gallicchio, Alessio Micheli

Machine Learning for graphs is nowadays a research topic of consolidated relevance.

1 code implementation • 28 Feb 2020 • Marco Podda, Davide Bacciu, Alessio Micheli

Molecule generation is a challenging open problem in cheminformatics.

no code implementations • 10 Feb 2020 • Rita Pucci, Alessio Micheli, Stefano Chessa, Jane Hunter

Wearable devices with onboard sensors are widely used to collect data on human and animal activities, with the goal of performing automatic classification of these data directly on-board.

1 code implementation • 31 Jan 2020 • Davide Bacciu, Alessio Micheli, Marco Podda

Graph generation with Machine Learning is an open problem with applications in various research fields.

no code implementations • 24 Jan 2020 • Federico Errica, Davide Bacciu, Alessio Micheli

We propose a new Graph Neural Network that combines recent advancements in the field.

2 code implementations • 29 Dec 2019 • Davide Bacciu, Federico Errica, Alessio Micheli, Marco Podda

The adaptive processing of graph data is a long-standing research topic which has lately been consolidated as a theme of major interest in the deep learning community.

1 code implementation • ICLR 2020 • Federico Errica, Marco Podda, Davide Bacciu, Alessio Micheli

We believe that this work can contribute to the development of the graph learning field, by providing a much needed grounding for rigorous evaluations of graph classification models.

Ranked #1 on Graph Classification on REDDIT-MULTI-5k

no code implementations • 20 Nov 2019 • Claudio Gallicchio, Alessio Micheli

We address the efficiency issue for the construction of a deep graph neural network (GNN).

no code implementations • 24 Sep 2019 • Claudio Gallicchio, Alessio Micheli

Deep Echo State Networks (DeepESNs) recently extended the applicability of Reservoir Computing (RC) methods towards the field of deep learning.

no code implementations • 15 May 2019 • Benjamin Paaßen, Claudio Gallicchio, Alessio Micheli, Alessandro Sperduti

Performing machine learning on structured data is complicated by the fact that such data does not have vectorial form.

no code implementations • 12 Mar 2019 • Claudio Gallicchio, Alessio Micheli

Reservoir Computing (RC) is a popular methodology for the efficient design of Recurrent Neural Networks (RNNs).
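What makes RC efficient is that only a linear readout on the reservoir states is trained, typically in closed form via ridge regression. A minimal sketch of that readout step, with toy data standing in for actual reservoir states:

```python
import numpy as np

def train_readout(X, y, reg=1e-6):
    # Closed-form ridge regression: w = (X^T X + reg I)^{-1} X^T y.
    N = X.shape[1]
    return np.linalg.solve(X.T @ X + reg * np.eye(N), X.T @ y)

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 20))    # stand-in for a reservoir state trajectory
w_true = rng.normal(size=20)
y = X @ w_true                    # noiseless toy targets
w = train_readout(X, y)           # recovered readout weights
```

The regularisation constant is an arbitrary illustrative choice; in practice it is tuned by validation.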

no code implementations • 30 Dec 2018 • Claudio Gallicchio, Alessio Micheli, Luca Pedrelli

The analysis is performed in terms of efficiency and prediction accuracy on 4 polyphonic music tasks.

no code implementations • ICML 2018 • Benjamin Paaßen, Claudio Gallicchio, Alessio Micheli, Barbara Hammer

Metric learning aims to improve classification accuracy by learning a distance measure that brings data points of the same class closer together and pushes data points of different classes further apart.
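A common way to make that idea concrete is a Mahalanobis-style distance d_L(x, y) = ||L(x − y)||, where the matrix L is the learned parameter; with L = I it reduces to the Euclidean distance. This is a generic sketch of the metric-learning setup, not the specific method of the paper (which targets structured data).

```python
import numpy as np

def learned_distance(L, x, y):
    # Mahalanobis-style distance parametrised by a learnable matrix L.
    diff = L @ (x - y)
    return float(np.sqrt(diff @ diff))

x = np.array([1.0, 0.0])
y = np.array([0.0, 1.0])

d_eucl = learned_distance(np.eye(2), x, y)   # L = I: plain Euclidean distance
L = np.diag([2.0, 0.5])                      # stretch axis 0, shrink axis 1
d_learn = learned_distance(L, x, y)          # distance under the learned metric
```

Training would adjust L so that same-class pairs shrink under d_L while different-class pairs grow.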

1 code implementation • ICML 2018 • Davide Bacciu, Federico Errica, Alessio Micheli

We introduce the Contextual Graph Markov Model, an approach combining ideas from generative models and neural networks for the processing of graph data.

no code implementations • 19 Feb 2018 • Claudio Gallicchio, Alessio Micheli, Luca Pedrelli

In this paper, we introduce a novel approach for diagnosis of Parkinson's Disease (PD) based on deep Echo State Networks (ESNs).

4 code implementations • 12 Dec 2017 • Claudio Gallicchio, Alessio Micheli

The study of deep recurrent neural networks (RNNs) and, in particular, of deep Reservoir Computing (RC) is gaining increasing research attention in the neural networks community.

no code implementations • 16 May 2017 • Claudio Gallicchio, Alessio Micheli, Luca Pedrelli

Recently, studies on deep Reservoir Computing (RC) highlighted the role of layering in deep recurrent neural networks (RNNs).

Papers With Code is a free resource with all data licensed under CC-BY-SA.