Search Results for author: Gabriele Santin

Found 15 papers, 5 papers with code

A Characterization Theorem for Equivariant Networks with Point-wise Activations

no code implementations17 Jan 2024 Marco Pacini, Xiaowen Dong, Bruno Lepri, Gabriele Santin

Equivariant neural networks have shown improved performance, expressiveness and sample complexity on symmetrical domains.

Interpolation with the polynomial kernels

no code implementations15 Dec 2022 Giacomo Elefante, Wolfgang Erb, Francesco Marchetti, Emma Perracchione, Davide Poggiali, Gabriele Santin

We will then study the Reproducing Kernel Hilbert Spaces (or native spaces) of these kernels and their norms, and provide inclusion relations between spaces corresponding to different kernel parameters.
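
For context, the polynomial kernels studied here are usually defined (up to the paper's specific parametrization) as

\[
  k_{d,c}(x, y) = \left( \langle x, y \rangle + c \right)^{d}, \qquad x, y \in \mathbb{R}^{n},\ c \ge 0,\ d \in \mathbb{N},
\]

so the kernel parameters whose native spaces are compared are the degree d and the offset c.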

Explaining the Explainers in Graph Neural Networks: a Comparative Study

2 code implementations27 Oct 2022 Antonio Longa, Steve Azzolin, Gabriele Santin, Giulia Cencetti, Pietro Liò, Bruno Lepri, Andrea Passerini

Following a fast initial breakthrough in graph-based learning, Graph Neural Networks (GNNs) have reached widespread application in many science and engineering fields, prompting the need for methods to understand their decision process.

Node Classification

A Framework for Verifiable and Auditable Federated Anomaly Detection

no code implementations15 Mar 2022 Gabriele Santin, Inna Skarbovsky, Fabiana Fournier, Bruno Lepri

Federated Learning is an emerging approach to manage cooperation between a group of agents for the solution of Machine Learning tasks, with the goal of improving each agent's performance without disclosing any data.

Anomaly Detection, Ensemble Learning

Reprogramming FairGANs with Variational Auto-Encoders: A New Transfer Learning Model

no code implementations11 Mar 2022 Beatrice Nobile, Gabriele Santin, Bruno Lepri, Pierpaolo Brutti

Fairness-aware GANs (FairGANs) exploit the mechanisms of Generative Adversarial Networks (GANs) to impose fairness on the generated data, freeing them from both disparate impact and disparate treatment.

Fairness, Transfer Learning

Universality and Optimality of Structured Deep Kernel Networks

no code implementations15 May 2021 Tizian Wenzel, Gabriele Santin, Bernard Haasdonk

In particular, we show that the use of special types of kernels yields models reminiscent of neural networks that are founded on the same theoretical framework as classical kernel methods, while enjoying many computational properties of deep neural networks.

Kernel-Based Models for Influence Maximization on Graphs based on Gaussian Process Variance Minimization

1 code implementation2 Mar 2021 Salvatore Cuomo, Wolfgang Erb, Gabriele Santin

The inference of novel knowledge, the discovery of hidden patterns, and the uncovering of insights from large amounts of data from a multitude of sources make Data Science (DS) an art rather than a mere scientific discipline.

Kernel methods for center manifold approximation and a data-based version of the Center Manifold Theorem

no code implementations1 Dec 2020 Bernard Haasdonk, Boumediene Hamzi, Gabriele Santin, Dominik Wittwar

We then use an apposite data-based kernel method to construct a suitable approximation of the manifold close to the equilibrium, which is compatible with our general error theory.

Biomechanical surrogate modelling using stabilized vectorial greedy kernel methods

no code implementations27 Apr 2020 Bernard Haasdonk, Tizian Wenzel, Gabriele Santin, Syn Schmitt

Greedy kernel approximation algorithms are successful techniques for sparse and accurate data-based modelling and function approximation.

A novel class of stabilized greedy kernel approximation algorithms: Convergence, stability & uniform point distribution

1 code implementation11 Nov 2019 Tizian Wenzel, Gabriele Santin, Bernard Haasdonk

Since the computation of an optimal selection of sampling points may be an infeasible task, one promising option is to use greedy methods.
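
As an illustration of the general idea (not the stabilized algorithm proposed in the paper), a basic greedy point selection for kernel interpolation can be sketched as follows; the Gaussian kernel and the P-greedy criterion of maximizing the power function are assumptions made here for concreteness.

```python
import numpy as np

def gaussian_kernel(X, Y, eps=1.0):
    # Gaussian (RBF) kernel matrix between point sets X (N x d) and Y (M x d)
    d2 = np.sum((X[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    return np.exp(-(eps ** 2) * d2)

def p_greedy(X, n_points, eps=1.0):
    # Greedily pick n_points from the candidate set X by maximizing the
    # power function (P-greedy); illustrative only, not the stabilized variant.
    N = X.shape[0]
    selected = []
    power2 = np.ones(N)              # squared power function; k(x, x) = 1 for the Gaussian
    V = np.zeros((N, n_points))      # Newton basis evaluated at the candidates
    for j in range(n_points):
        i = int(np.argmax(power2))   # next center: largest power-function value
        selected.append(i)
        v = gaussian_kernel(X, X[i:i + 1], eps)[:, 0] - V[:, :j] @ V[i, :j]
        V[:, j] = v / np.sqrt(power2[i])
        power2 = np.maximum(power2 - V[:, j] ** 2, 0.0)
    return selected

# Example: pick 10 centers out of 200 random points in the unit square
pts = np.random.rand(200, 2)
centers = p_greedy(pts, 10)
```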

Numerical Analysis

Kernel Methods for Surrogate Modeling

1 code implementation24 Jul 2019 Gabriele Santin, Bernard Haasdonk

Second, if a function is available only via measurements or a few function evaluation samples, kernel approximation techniques can provide function surrogates that allow global evaluation.
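
In the standard kernel-interpolation setting (notation chosen here, not taken from the survey), such a surrogate built from samples f(x_1), ..., f(x_N) takes the form

\[
  s(x) = \sum_{j=1}^{N} \alpha_j\, k(x, x_j), \qquad A \alpha = \big(f(x_1), \dots, f(x_N)\big)^{\top}, \quad A_{ij} = k(x_i, x_j),
\]

so a single linear solve yields a surrogate that can be evaluated anywhere in the domain.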

Numerical Analysis

Greedy regularized kernel interpolation

1 code implementation25 Jul 2018 Gabriele Santin, Dominik Wittwar, Bernard Haasdonk

Kernel based regularized interpolation is a well known technique to approximate a continuous multivariate function using a set of scattered data points and the corresponding function evaluations, or data values.
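
A minimal sketch of plain (non-greedy) regularized kernel interpolation, assuming a Gaussian kernel and a ridge parameter lam; the greedy algorithms studied in the paper instead build up the set of centers incrementally:

```python
import numpy as np

def gaussian_kernel(X, Y, eps=1.0):
    # Gaussian (RBF) kernel matrix between point sets X and Y
    d2 = np.sum((X[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    return np.exp(-(eps ** 2) * d2)

def fit_regularized_interpolant(X, y, eps=1.0, lam=1e-8):
    # Solve (A + lam * I) alpha = y with A_ij = k(x_i, x_j), then return
    # the evaluator s(x) = sum_j alpha_j k(x, x_j)
    A = gaussian_kernel(X, X, eps)
    alpha = np.linalg.solve(A + lam * np.eye(len(X)), y)
    return lambda X_new: gaussian_kernel(X_new, X, eps) @ alpha

# Example: approximate a 2D test function from 100 scattered samples
X = np.random.rand(100, 2)
y = np.sin(4 * X[:, 0]) * np.cos(3 * X[:, 1])
s = fit_regularized_interpolant(X, y, eps=2.0, lam=1e-10)
print(s(np.array([[0.5, 0.5]])))
```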

Numerical Analysis
