Search Results for author: Claudio Battiloro

Found 21 papers, 9 papers with code

Quantum Simplicial Neural Networks

no code implementations • 9 Jan 2025 • Simone Piperno, Claudio Battiloro, Andrea Ceschini, Francesca Dominici, Paolo Di Lorenzo, Massimo Panella

Topological Deep Learning (TDL) has allowed for systematic modeling of hierarchical higher-order interactions by relying on combinatorial topological spaces such as simplicial complexes.

Deep Learning
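
The combinatorial objects behind this line of work can be made concrete with a toy example. The sketch below (plain NumPy, a hypothetical 4-node complex; not code from the paper) builds the boundary matrices of a small simplicial complex and checks the chain-complex identity B1·B2 = 0, which underpins the Hodge Laplacians that simplicial neural networks typically operate with.

```python
import numpy as np

# Hypothetical toy complex: nodes {0,1,2,3}, edges [(0,1),(0,2),(1,2),(2,3)],
# and one filled triangle (0,1,2). Orientation: edge (i,j) with i<j points i->j.
B1 = np.array([  # node-to-edge incidence (boundary of 1-simplices)
    [-1, -1,  0,  0],
    [ 1,  0, -1,  0],
    [ 0,  1,  1, -1],
    [ 0,  0,  0,  1],
])
B2 = np.array([[1], [-1], [1], [0]])  # edge-to-triangle incidence (boundary of 2-simplices)

# Defining identity of a chain complex: the boundary of a boundary vanishes.
assert np.allclose(B1 @ B2, 0)

# Hodge 1-Laplacian: the shift operator simplicial architectures commonly build on.
L1 = B1.T @ B1 + B2 @ B2.T
print(L1.shape)  # (4, 4): one row/column per edge
```

The upper part `B2 @ B2.T` couples edges through the triangles they bound, which is exactly the "higher-order interaction" a graph Laplacian cannot express.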

Towards a Health-Based Power Grid Optimization in the Artificial Intelligence Era

no code implementations • 11 Oct 2024 • Claudio Battiloro, Gianluca Guidi, Falco J. Bargagli-Stoffi, Francesca Dominici

The electric power sector is one of the largest contributors to greenhouse gas emissions in the world.

TopoTune: A Framework for Generalized Combinatorial Complex Neural Networks

2 code implementations • 9 Oct 2024 • Mathilde Papillon, Guillermo Bernárdez, Claudio Battiloro, Nina Miolane

Combinatorial Complex Neural Networks (CCNNs), fairly general TDL models, have been shown to be more expressive and better performing than GNNs.

Graph Neural Network

Higher-Order Topological Directionality and Directed Simplicial Neural Networks

2 code implementations • 12 Sep 2024 • Manuel Lecha, Andrea Cavallo, Francesca Dominici, Elvin Isufi, Claudio Battiloro

In this paper, we first introduce a novel notion of higher-order directionality and we then design Directed Simplicial Neural Networks (Dir-SNNs) based on it.

ICML Topological Deep Learning Challenge 2024: Beyond the Graph Domain

no code implementations • 8 Sep 2024 • Guillermo Bernárdez, Lev Telyatnikov, Marco Montagna, Federica Baccini, Mathilde Papillon, Miquel Ferriol-Galmés, Mustafa Hajij, Theodore Papamarkou, Maria Sofia Bucarelli, Olga Zaghen, Johan Mathe, Audun Myers, Scott Mahan, Hansen Lillemark, Sharvaree Vadgama, Erik Bekkers, Tim Doster, Tegan Emerson, Henry Kvinge, Katrina Agate, Nesreen K Ahmed, Pengfei Bai, Michael Banf, Claudio Battiloro, Maxim Beketov, Paul Bogdan, Martin Carrasco, Andrea Cavallo, Yun Young Choi, George Dasoulas, Matouš Elphick, Giordan Escalona, Dominik Filipiak, Halley Fritze, Thomas Gebhart, Manel Gil-Sorribes, Salvish Goomanee, Victor Guallar, Liliya Imasheva, Andrei Irimia, Hongwei Jin, Graham Johnson, Nikos Kanakaris, Boshko Koloski, Veljko Kovač, Manuel Lecha, Minho Lee, Pierrick Leroy, Theodore Long, German Magai, Alvaro Martinez, Marissa Masden, Sebastian Mežnar, Bertran Miquel-Oliver, Alexis Molina, Alexander Nikitin, Marco Nurisso, Matt Piekenbrock, Yu Qin, Patryk Rygiel, Alessandro Salatiello, Max Schattauer, Pavel Snopov, Julian Suk, Valentina Sánchez, Mauricio Tec, Francesco Vaccarino, Jonas Verhellen, Frederic Wantiez, Alexander Weers, Patrik Zajec, Blaž Škrlj, Nina Miolane

This paper describes the 2nd edition of the ICML Topological Deep Learning Challenge that was hosted within the ICML 2024 ELLIS Workshop on Geometry-grounded Representation Learning and Generative Modeling (GRaM).

Deep Learning • Representation Learning

E(n) Equivariant Topological Neural Networks

1 code implementation • 24 May 2024 • Claudio Battiloro, Ege Karaismailoğlu, Mauricio Tec, George Dasoulas, Michelle Audirac, Francesca Dominici

This paper introduces E(n)-Equivariant Topological Neural Networks (ETNNs), which are E(n)-equivariant message-passing networks operating on combinatorial complexes, formal objects unifying graphs, hypergraphs, simplicial, path, and cell complexes.

Inductive Bias • Molecular Property Prediction +1
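
The E(n)-equivariant message passing that ETNNs generalize to combinatorial complexes can be illustrated on its plain-graph special case. The snippet below is an EGNN-style sketch with toy random weights, not the paper's architecture; the names `egnn_layer`, `W_m`, and `w_x` are invented for illustration. It checks the defining property: invariant features are unchanged and coordinates co-rotate when the input is rotated.

```python
import numpy as np

rng = np.random.default_rng(0)
N, F = 5, 4
W_m = rng.standard_normal((2 * F + 1, F)) * 0.1  # toy message weights (stand-in for an MLP)
w_x = rng.standard_normal(F) * 0.1               # scalar coordinate gate

def egnn_layer(h, x, edges):
    """One E(n)-equivariant message-passing step (EGNN-style sketch, not the
    exact ETNN layer). h: (N, F) invariant features, x: (N, 3) coordinates."""
    h_new, x_new = h.copy(), x.copy()
    for i, j in edges:
        d2 = np.sum((x[i] - x[j]) ** 2)  # squared distance: rotation/translation invariant
        m = np.tanh(np.concatenate([h[i], h[j], [d2]]) @ W_m)
        h_new[i] = h_new[i] + m                          # invariant feature update
        x_new[i] = x_new[i] + (x[i] - x[j]) * (m @ w_x)  # equivariant coordinate update
    return h_new, x_new

h, x = rng.standard_normal((N, F)), rng.standard_normal((N, 3))
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # random orthogonal transform
h1, x1 = egnn_layer(h, x, edges)
h2, x2 = egnn_layer(h, x @ Q.T, edges)
assert np.allclose(h1, h2)        # features are E(3)-invariant
assert np.allclose(x1 @ Q.T, x2)  # coordinates are E(3)-equivariant
```

Because messages depend on coordinates only through squared distances, the feature channel never "sees" the global pose of the input; the coordinate channel moves along relative positions, which rotate with the input by construction.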

Attending to Topological Spaces: The Cellular Transformer

no code implementations • 23 May 2024 • Rubén Ballester, Pablo Hernández-García, Mathilde Papillon, Claudio Battiloro, Nina Miolane, Tolga Birdal, Carles Casacuberta, Sergio Escalera, Mustafa Hajij

Topological Deep Learning seeks to enhance the predictive performance of neural network models by harnessing topological structures in input data.

Dynamic Relative Representations for Goal-Oriented Semantic Communications

no code implementations • 25 Mar 2024 • Simone Fiorellino, Claudio Battiloro, Emilio Calvanese Strinati, Paolo Di Lorenzo

This paper presents a novel framework for goal-oriented semantic communication, leveraging relative representations to mitigate semantic mismatches via latent space alignment.

Semantic Communication
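
Relative representations, which this framework leverages for latent space alignment, encode each latent vector by its similarities to a set of anchor latents. A minimal sketch (assuming cosine similarity as the relation function; `relative_representation` is a name invented here, not this paper's API) shows the key property: two latent spaces that differ by an orthogonal transformation yield identical relative representations.

```python
import numpy as np

def relative_representation(z, anchors):
    """Cosine similarity of each latent z (N, D) to anchor latents (K, D)."""
    zn = z / np.linalg.norm(z, axis=1, keepdims=True)
    an = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    return zn @ an.T  # (N, K) relative representation

rng = np.random.default_rng(1)
z, anchors = rng.standard_normal((6, 8)), rng.standard_normal((4, 8))
Q, _ = np.linalg.qr(rng.standard_normal((8, 8)))  # an angle-preserving latent mismatch

r1 = relative_representation(z, anchors)
r2 = relative_representation(z @ Q, anchors @ Q)  # "other encoder": rotated latent space
assert np.allclose(r1, r2)  # the two encoders agree in relative coordinates
```

Since cosine similarity depends only on angles, any rotation or reflection of the latent space cancels out, which is what lets mismatched transmitter/receiver latent spaces communicate through a shared set of anchors.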

Stability of Graph Convolutional Neural Networks through the lens of small perturbation analysis

no code implementations • 20 Dec 2023 • Lucia Testa, Claudio Battiloro, Stefania Sardellitti, Sergio Barbarossa

In this work, we study the stability of Graph Convolutional Neural Networks (GCNs) under random small perturbations of the underlying graph topology, i.e., under a limited number of insertions or deletions of edges.
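
The kind of perturbation studied here can be probed empirically: delete a single edge from a random graph and measure how much the output of a symmetrically normalized graph-convolution layer changes. The snippet below is an illustrative experiment with toy random weights, not the paper's analysis.

```python
import numpy as np

def gcn_layer(A, X, W):
    """One graph-convolution layer with self-loops and symmetric normalization."""
    A_hat = A + np.eye(A.shape[0])
    D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.tanh(D_inv_sqrt @ A_hat @ D_inv_sqrt @ X @ W)

rng = np.random.default_rng(2)
n = 20
A = (rng.random((n, n)) < 0.3).astype(float)
A = np.triu(A, 1); A = A + A.T                  # random undirected graph
X, W = rng.standard_normal((n, 8)), rng.standard_normal((8, 4))

A_pert = A.copy()
i, j = map(int, np.argwhere(np.triu(A, 1))[0])  # pick one existing edge...
A_pert[i, j] = A_pert[j, i] = 0.0               # ...and delete it

diff = np.linalg.norm(gcn_layer(A, X, W) - gcn_layer(A_pert, X, W))
print(f"output change after one edge deletion: {diff:.4f}")
```

Stability results of the type studied in the paper bound such output deviations in terms of the number of perturbed edges and properties of the filters.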

Goal-oriented Communications for the IoT: System Design and Adaptive Resource Optimization

no code implementations • 21 Oct 2023 • Paolo Di Lorenzo, Mattia Merluzzi, Francesco Binucci, Claudio Battiloro, Paolo Banelli, Emilio Calvanese Strinati, Sergio Barbarossa

Internet of Things (IoT) applications combine sensing, wireless communication, intelligence, and actuation, enabling the interaction among heterogeneous devices that collect and process considerable amounts of data.

Federated Learning

From Latent Graph to Latent Topology Inference: Differentiable Cell Complex Module

no code implementations • 25 May 2023 • Claudio Battiloro, Indro Spinelli, Lev Telyatnikov, Michael Bronstein, Simone Scardapane, Paolo Di Lorenzo

Latent Graph Inference (LGI) relaxed the reliance of Graph Neural Networks (GNNs) on a given graph topology by dynamically learning it.

Tangent Bundle Convolutional Learning: from Manifolds to Cellular Sheaves and Back

no code implementations • 20 Mar 2023 • Claudio Battiloro, Zhiyang Wang, Hans Riess, Paolo Di Lorenzo, Alejandro Ribeiro

We define tangent bundle filters and tangent bundle neural networks (TNNs) based on this convolution operation, which are novel continuous architectures operating on tangent bundle signals, i.e., vector fields over the manifolds.

Topological Signal Processing over Weighted Simplicial Complexes

no code implementations • 16 Feb 2023 • Claudio Battiloro, Stefania Sardellitti, Sergio Barbarossa, Paolo Di Lorenzo

Weighing the topological domain over which data can be represented and analysed is a key strategy in many signal processing and machine learning applications, enabling the extraction and exploitation of meaningful data features and their (higher order) relationships.

Tangent Bundle Filters and Neural Networks: from Manifolds to Cellular Sheaves and Back

no code implementations • 26 Oct 2022 • Claudio Battiloro, Zhiyang Wang, Hans Riess, Paolo Di Lorenzo, Alejandro Ribeiro

In this work we introduce a convolution operation over the tangent bundle of Riemannian manifolds exploiting the Connection Laplacian operator.

Denoising

Topological Slepians: Maximally Localized Representations of Signals over Simplicial Complexes

1 code implementation • 26 Oct 2022 • Claudio Battiloro, Paolo Di Lorenzo, Sergio Barbarossa

This paper introduces topological Slepians, i.e., a novel class of signals defined over topological spaces (e.g., simplicial complexes) that are maximally concentrated on the topological domain (e.g., over a set of nodes, edges, triangles, etc.).

Denoising
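
The concentration/bandlimiting trade-off behind Slepian design can be sketched in a node-domain analogue (the paper works over simplicial complexes, where a Hodge Laplacian would replace the graph Laplacian used here; all names and sizes below are illustrative). The most concentrated bandlimited signal is the top eigenvector of the concentration operator P_F · D_S · P_F.

```python
import numpy as np

n, bandwidth = 16, 5
L = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
L[0, -1] = L[-1, 0] = -1.0                      # Laplacian of a 16-node ring graph

eigvals, U = np.linalg.eigh(L)                  # ascending frequencies
P_band = U[:, :bandwidth] @ U[:, :bandwidth].T  # projector onto low frequencies

S = np.zeros(n); S[:5] = 1.0                    # target vertex set (first 5 nodes)
C = P_band @ np.diag(S) @ P_band                # concentration operator
mu, V = np.linalg.eigh(C)
slepian = V[:, -1]                              # most concentrated bandlimited signal

concentration = np.sum(slepian[:5] ** 2)        # energy captured on the target set
print(f"energy on target set: {concentration:.3f}")
```

By construction the top eigenvalue of C equals the fraction of the Slepian's energy falling on the target set, and the eigenvector itself is exactly bandlimited; on a simplicial complex the same recipe yields signals concentrated over chosen sets of edges or triangles.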

Pooling Strategies for Simplicial Convolutional Networks

1 code implementation • 11 Oct 2022 • Domenico Mattia Cinque, Claudio Battiloro, Paolo Di Lorenzo

The goal of this paper is to introduce pooling strategies for simplicial convolutional neural networks.

Graph Classification

Cell Attention Networks

1 code implementation • 16 Sep 2022 • Lorenzo Giusti, Claudio Battiloro, Lucia Testa, Paolo Di Lorenzo, Stefania Sardellitti, Sergio Barbarossa

In this paper, we introduce Cell Attention Networks (CANs), a neural architecture operating on data defined over the vertices of a graph, representing the graph as the 1-skeleton of a cell complex introduced to capture higher-order interactions.

Graph Attention • Graph Classification +1

Energy-Efficient Classification at the Wireless Edge with Reliability Guarantees

no code implementations • 21 Apr 2022 • Mattia Merluzzi, Claudio Battiloro, Paolo Di Lorenzo, Emilio Calvanese Strinati

Learning at the edge is a challenging task from several perspectives, since data must be collected by end devices (e.g., sensors), possibly pre-processed (e.g., data compression), and finally processed remotely to output the result of training and/or inference phases.

Data Compression • Image Classification
