no code implementations • 17 Jan 2024 • Marco Pacini, Xiaowen Dong, Bruno Lepri, Gabriele Santin
Equivariant neural networks have shown improved performance, expressiveness and sample complexity on symmetrical domains.
no code implementations • 15 Sep 2023 • Laura Ferrarotti, Massimiliano Luca, Gabriele Santin, Giorgio Previati, Gianpiero Mastinu, Massimiliano Gobbi, Elena Campi, Lorenzo Uccello, Antonino Albanese, Praveen Zalaya, Alessandro Roccasalva, Bruno Lepri
To gauge the practicality and acceptability of the policy, we conduct evaluations with human participants using the simulator, focusing on a range of metrics like traffic smoothness and safety perception.
no code implementations • 2 Feb 2023 • Antonio Longa, Veronica Lachi, Gabriele Santin, Monica Bianchini, Bruno Lepri, Pietro Lio, Franco Scarselli, Andrea Passerini
Graph Neural Networks (GNNs) have become the leading paradigm for learning on (static) graph-structured data.
no code implementations • 15 Dec 2022 • Giacomo Elefante, Wolfgang Erb, Francesco Marchetti, Emma Perracchione, Davide Poggiali, Gabriele Santin
We will then study the Reproducing Kernel Hilbert Spaces (or native spaces) of these kernels and their norms, and provide inclusion relations between spaces corresponding to different kernel parameters.
2 code implementations • 27 Oct 2022 • Antonio Longa, Steve Azzolin, Gabriele Santin, Giulia Cencetti, Pietro Liò, Bruno Lepri, Andrea Passerini
Following a fast initial breakthrough in graph-based learning, Graph Neural Networks (GNNs) have reached widespread application in many science and engineering fields, prompting the need for methods to understand their decision process.
no code implementations • 15 Mar 2022 • Gabriele Santin, Inna Skarbovsky, Fabiana Fournier, Bruno Lepri
Federated Learning is an emerging approach to manage cooperation between a group of agents solving Machine Learning tasks, with the goal of improving each agent's performance without disclosing any data.
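The cooperation scheme described above is often realized via federated averaging: each agent trains on its private data and only model parameters are exchanged. A minimal illustrative sketch (all names, the linear model, and the learning rate are assumptions, not the paper's setup):

```python
import numpy as np

def local_step(w, X, y, lr=0.1):
    # One gradient step of linear least squares on an agent's private data
    grad = 2 * X.T @ (X @ w - y) / len(y)
    return w - lr * grad

rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0])

# Three agents, each holding private data that is never shared
agents = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ w_true + 0.01 * rng.normal(size=50)
    agents.append((X, y))

w = np.zeros(2)
for _ in range(200):  # communication rounds
    # Each agent updates the shared model locally; only weights travel
    local_models = [local_step(w.copy(), X, y) for X, y in agents]
    w = np.mean(local_models, axis=0)  # server averages the updates

print(np.round(w, 2))  # close to w_true, learned without pooling data
```

Only the averaged weight vector ever leaves an agent, which is the sense in which performance improves "without disclosing any data".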
no code implementations • 11 Mar 2022 • Beatrice Nobile, Gabriele Santin, Bruno Lepri, Pierpaolo Brutti
Fairness-aware GANs (FairGANs) exploit the mechanisms of Generative Adversarial Networks (GANs) to impose fairness on the generated data, freeing them from both disparate impact and disparate treatment.
no code implementations • 15 May 2021 • Tizian Wenzel, Gabriele Santin, Bernard Haasdonk
In particular, we show that the use of special types of kernels yields models reminiscent of neural networks, which are founded in the same theoretical framework as classical kernel methods while enjoying many computational properties of deep neural networks.
no code implementations • 25 Mar 2021 • Tizian Wenzel, Marius Kurz, Andrea Beck, Gabriele Santin, Bernard Haasdonk
Standard kernel methods for machine learning usually struggle when dealing with large datasets.
1 code implementation • 2 Mar 2021 • Salvatore Cuomo, Wolfgang Erb, Gabriele Santin
The inference of novel knowledge, the discovery of hidden patterns, and the uncovering of insights from large amounts of data from a multitude of sources make Data Science (DS) an art rather than a mere scientific discipline.
no code implementations • 1 Dec 2020 • Bernard Haasdonk, Boumediene Hamzi, Gabriele Santin, Dominik Wittwar
We then use an apposite data-based kernel method to construct a suitable approximation of the manifold close to the equilibrium, which is compatible with our general error theory.
no code implementations • 27 Apr 2020 • Bernard Haasdonk, Tizian Wenzel, Gabriele Santin, Syn Schmitt
Greedy kernel approximation algorithms are successful techniques for sparse and accurate data-based modelling and function approximation.
1 code implementation • 11 Nov 2019 • Tizian Wenzel, Gabriele Santin, Bernard Haasdonk
Since the computation of an optimal selection of sampling points may be an infeasible task, one promising option is to use greedy methods.
Numerical Analysis
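A standard greedy strategy of this kind is P-greedy selection, which iteratively picks the candidate point where the power function is largest. A minimal sketch, assuming a Gaussian kernel (function names and parameters are illustrative):

```python
import numpy as np

def gauss_kernel(X, Y, eps=1.0):
    # Gaussian (RBF) kernel matrix between point sets X and Y
    d2 = np.sum((X[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    return np.exp(-eps * d2)

def p_greedy(X, n_points):
    """Select n_points from the candidate set X by maximizing the power function."""
    selected = []
    # Squared power function starts at k(x, x) = 1 for the Gaussian kernel
    power2 = np.ones(len(X))
    for _ in range(n_points):
        i = int(np.argmax(power2))
        selected.append(i)
        Xs = X[selected]
        # P^2(x) = k(x, x) - k(x, Xs) K^{-1} k(Xs, x), recomputed naively
        K = gauss_kernel(Xs, Xs)
        kx = gauss_kernel(X, Xs)
        power2 = 1.0 - np.einsum("ij,ij->i", kx @ np.linalg.inv(K), kx)
        power2 = np.clip(power2, 0.0, None)  # guard tiny negative round-off
    return selected

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
idx = p_greedy(X, 10)
print(len(idx))  # 10 selected centers
```

The power function vanishes at already-selected points, so each iteration picks a genuinely new center; efficient implementations update it incrementally instead of recomputing with a matrix inverse as done here for clarity.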
1 code implementation • 24 Jul 2019 • Gabriele Santin, Bernard Haasdonk
Second, if a function is available only via measurements or a few function evaluation samples, kernel approximation techniques can provide function surrogates that allow global evaluation.
Numerical Analysis
1 code implementation • 25 Jul 2018 • Gabriele Santin, Dominik Wittwar, Bernard Haasdonk
Kernel based regularized interpolation is a well known technique to approximate a continuous multivariate function using a set of scattered data points and the corresponding function evaluations, or data values.
Numerical Analysis
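The regularized interpolation technique described above amounts to solving a linear system in the kernel matrix. A minimal sketch on scattered 2D data, assuming a Gaussian kernel (the target function, shape parameter, and regularization value are illustrative):

```python
import numpy as np

def gauss_kernel(X, Y, eps=2.0):
    # Gaussian (RBF) kernel matrix between point sets X and Y
    d2 = np.sum((X[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    return np.exp(-eps * d2)

# Scattered data points and noisy values of an example target function
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(100, 2))
y = np.sin(np.pi * X[:, 0]) * np.cos(np.pi * X[:, 1]) + 0.01 * rng.normal(size=100)

# Regularized interpolation: solve (K + lam*I) c = y for the coefficients
lam = 1e-6
K = gauss_kernel(X, X)
c = np.linalg.solve(K + lam * np.eye(len(X)), y)

# Evaluate the surrogate s(x) = sum_j c_j k(x, x_j) at new points
X_test = rng.uniform(-1, 1, size=(5, 2))
s = gauss_kernel(X_test, X) @ c
print(s.shape)  # (5,)
```

With lam = 0 this reduces to plain kernel interpolation; a positive lam trades exact interpolation for stability and noise robustness.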