1 code implementation • 3 Aug 2022 • Jary Pomponi, Simone Scardapane, Aurelio Uncini
In this paper, we propose a novel regularization method, called Centroids Matching, that, inspired by meta-learning approaches, fights catastrophic forgetting (CF) by operating in the feature space produced by the neural network, achieving good results while requiring a small memory footprint.
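As a rough illustration of the general idea of classifying via class centroids in feature space (a minimal sketch, not the authors' exact formulation; the distance measure and the assumption that every class appears in the batch are illustrative):

    import torch
    import torch.nn.functional as F

    def class_centroids(features, labels, num_classes):
        """Mean feature vector (centroid) per class; assumes every class
        is present in the current batch or a small memory buffer."""
        return torch.stack([features[labels == c].mean(dim=0)
                            for c in range(num_classes)])

    def centroid_loss(features, labels, centroids):
        """Classify by distance to each centroid: closer centroid means
        higher probability, so samples are pulled toward their class."""
        dists = torch.cdist(features, centroids)   # (batch, num_classes)
        log_p = F.log_softmax(-dists, dim=1)
        return F.nll_loss(log_p, labels)

At test time a sample would be assigned to its nearest centroid; in a continual-learning setting, an additional regularizer can penalize drift of the centroids of old tasks as new tasks arrive.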
1 code implementation • 11 Feb 2022 • Jary Pomponi, Simone Scardapane, Aurelio Uncini
We show that our method performs favorably with respect to state-of-the-art approaches in the literature, with bounded computational power and memory overheads.
1 code implementation • 4 Feb 2022 • Jary Pomponi, Simone Scardapane, Aurelio Uncini
Recent research has found that neural networks are vulnerable to several types of adversarial attacks, in which input samples are modified so that the model misclassifies them.
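For illustration, a minimal fast gradient sign method (FGSM) attack, one of the simplest ways to craft such adversarial samples (the epsilon value and the pixel range are placeholder assumptions):

    import torch
    import torch.nn.functional as F

    def fgsm_attack(model, x, y, epsilon=0.03):
        """Perturb x in the direction of the sign of the loss gradient,
        pushing the model toward misclassifying the result."""
        x = x.clone().detach().requires_grad_(True)
        loss = F.cross_entropy(model(x), y)
        loss.backward()
        x_adv = x + epsilon * x.grad.sign()
        return x_adv.clamp(0.0, 1.0).detach()  # keep inputs in valid range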
2 code implementations • 6 May 2021 • Jary Pomponi, Simone Scardapane, Aurelio Uncini
In this paper, we propose a novel ensembling technique for deep neural networks that drastically reduces the required memory compared to alternative approaches.
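The excerpt does not detail the mechanism, but as a generic illustration of why naive ensembles are memory-hungry and how parameter sharing shrinks the footprint, here is a common shared-backbone pattern (a sketch under stated assumptions, not necessarily the method proposed in the paper):

    import torch
    import torch.nn as nn

    class SharedBackboneEnsemble(nn.Module):
        """K classifiers sharing one feature extractor: memory grows only
        with the small per-member heads instead of K full networks."""
        def __init__(self, backbone, feat_dim, num_classes, k=5):
            super().__init__()
            self.backbone = backbone
            self.heads = nn.ModuleList(
                [nn.Linear(feat_dim, num_classes) for _ in range(k)])

        def forward(self, x):
            feats = self.backbone(x)
            logits = torch.stack([h(feats) for h in self.heads])
            return logits.mean(dim=0)  # average member predictions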
4 code implementations • 1 Apr 2021 • Vincenzo Lomonaco, Lorenzo Pellegrini, Andrea Cossu, Antonio Carta, Gabriele Graffieti, Tyler L. Hayes, Matthias De Lange, Marc Masana, Jary Pomponi, Gido van de Ven, Martin Mundt, Qi She, Keiland Cooper, Jeremy Forest, Eden Belouadah, Simone Calderara, German I. Parisi, Fabio Cuzzolin, Andreas Tolias, Simone Scardapane, Luca Antiga, Subutai Ahmad, Adrian Popescu, Christopher Kanan, Joost van de Weijer, Tinne Tuytelaars, Davide Bacciu, Davide Maltoni
Learning continually from non-stationary data streams is a long-standing goal and a challenging problem in machine learning.
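Schematically, the setting targeted here looks like the loop below: a model sees a stream of tasks one at a time and is re-evaluated on earlier tasks to expose forgetting (a minimal sketch; the task list, model, and optimizer are placeholders):

    import torch

    def evaluate(model, loader):
        """Plain accuracy over one task's test loader."""
        model.eval()
        correct = total = 0
        with torch.no_grad():
            for x, y in loader:
                correct += (model(x).argmax(dim=1) == y).sum().item()
                total += y.numel()
        return correct / total

    def train_on_stream(model, optimizer, criterion, tasks, epochs=1):
        """tasks: list of (train_loader, test_loader). The data is
        non-stationary, so naive sequential training like this
        typically forgets earlier tasks."""
        for t, (train_loader, _) in enumerate(tasks):
            model.train()
            for _ in range(epochs):
                for x, y in train_loader:
                    optimizer.zero_grad()
                    criterion(model(x), y).backward()
                    optimizer.step()
            # re-test all tasks seen so far to measure forgetting
            accs = [evaluate(model, test) for _, test in tasks[:t + 1]]
            print(f"after task {t}: accuracies {accs}")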
1 code implementation • ICML Workshop LifelongML 2020 • Jary Pomponi, Simone Scardapane, Aurelio Uncini
We show that our method performs favorably with respect to state-of-the-art approaches in the literature, with bounded computational power and memory overheads.
4 code implementations • 2 Mar 2020 • Jary Pomponi, Simone Scardapane, Aurelio Uncini
Bayesian Neural Networks (BNNs) are trained to optimize an entire distribution over their weights instead of a single set of values, with significant advantages in terms of, e.g., interpretability, multi-task learning, and calibration.
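Concretely, in a mean-field variational BNN each weight gets a learned mean and scale, and weights are sampled afresh on every forward pass. A minimal sketch of such a layer in the Bayes-by-Backprop style (the Gaussian posterior, softplus parameterization, and initialization are illustrative assumptions):

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class BayesianLinear(nn.Module):
        """Linear layer with a Gaussian distribution over each weight,
        sampled via the reparameterization trick."""
        def __init__(self, in_features, out_features):
            super().__init__()
            self.w_mu = nn.Parameter(torch.zeros(out_features, in_features))
            self.w_rho = nn.Parameter(torch.full((out_features, in_features), -4.0))
            self.b_mu = nn.Parameter(torch.zeros(out_features))
            self.b_rho = nn.Parameter(torch.full((out_features,), -4.0))

        def forward(self, x):
            w_std = F.softplus(self.w_rho)  # map rho to a positive std
            b_std = F.softplus(self.b_rho)
            w = self.w_mu + w_std * torch.randn_like(w_std)  # sample weights
            b = self.b_mu + b_std * torch.randn_like(b_std)
            return F.linear(x, w, b)

Training would add a KL term between this posterior and a prior to the data loss; averaging several stochastic forward passes at test time yields the calibrated predictive uncertainty mentioned above.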
no code implementations • 26 Nov 2019 • Cristiano Fanelli, Jary Pomponi
Imaging Cherenkov detectors are widely used for particle identification (PID) in nuclear and particle physics experiments, where developing fast reconstruction algorithms is becoming of paramount importance to allow for near real-time calibration and data quality control, as well as to speed up offline analysis of large amounts of data.
1 code implementation • 9 Sep 2019 • Jary Pomponi, Simone Scardapane, Vincenzo Lomonaco, Aurelio Uncini
Continual learning of deep neural networks is a key requirement for scaling them up to more complex application scenarios and for achieving true lifelong learning with these architectures.