no code implementations • 20 Nov 2023 • Eli Verwimp, Rahaf Aljundi, Shai Ben-David, Matthias Bethge, Andrea Cossu, Alexander Gepperth, Tyler L. Hayes, Eyke Hüllermeier, Christopher Kanan, Dhireesha Kudithipudi, Christoph H. Lampert, Martin Mundt, Razvan Pascanu, Adrian Popescu, Andreas S. Tolias, Joost Van de Weijer, Bing Liu, Vincenzo Lomonaco, Tinne Tuytelaars, Gido M. van de Ven
Continual learning is a sub-field of machine learning that aims to allow models to learn continuously from new data, accumulating knowledge without forgetting what was learned in the past.
2 code implementations • 20 Aug 2023 • Albin Soutif--Cormerais, Antonio Carta, Andrea Cossu, Julio Hurtado, Hamed Hemati, Vincenzo Lomonaco, Joost Van de Weijer
Online continual learning aims to get closer to a live learning experience by learning directly on a stream of data with temporally shifting distribution and by storing a minimum amount of data from that stream.
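The idea of learning directly on a stream with a temporally shifting distribution, while storing only a constant amount of state, can be illustrated with a minimal sketch (not from the paper): an exponential moving average that tracks the drifting mean of a stream using O(1) memory. The function name and `alpha` parameter are illustrative choices, not part of the work above.

```python
def online_mean(stream, alpha=0.1):
    """Track the mean of a drifting stream with O(1) memory.

    Each new observation nudges the estimate, so the tracker
    forgets old statistics gracefully as the distribution shifts.
    """
    est = None
    for x in stream:
        # First observation initializes the estimate; afterwards,
        # blend the old estimate with the new sample.
        est = x if est is None else (1 - alpha) * est + alpha * x
    return est
```

On a stream whose mean jumps from 0 to 10 halfway through, the estimate converges toward the new regime without ever revisiting past data, which is the essence of the live-learning constraint.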
1 code implementation • 12 Jun 2023 • Andrea Cossu, Francesco Spinnato, Riccardo Guidotti, Davide Bacciu
Continual Learning trains models on a stream of data, with the aim of learning new information without forgetting previous knowledge.
1 code implementation • 28 Mar 2023 • Antonio Carta, Andrea Cossu, Vincenzo Lomonaco, Davide Bacciu, Joost Van de Weijer
We formalize this problem as a Distributed Continual Learning scenario, where SCDs adapt to local tasks and a CL model consolidates the knowledge from the resulting stream of models without looking at the SCDs' private data.
1 code implementation • 2 Feb 2023 • Antonio Carta, Lorenzo Pellegrini, Andrea Cossu, Hamed Hemati, Vincenzo Lomonaco
Continual learning is the problem of learning from a nonstationary stream of data, a fundamental issue for sustainable and efficient training of deep neural networks over time.
1 code implementation • 26 Jan 2023 • Hamed Hemati, Andrea Cossu, Antonio Carta, Julio Hurtado, Lorenzo Pellegrini, Davide Bacciu, Vincenzo Lomonaco, Damian Borth
We propose two stochastic stream generators that produce a wide range of CIR streams starting from a single dataset and a few interpretable control parameters.
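A stream generator of this kind can be sketched roughly as follows. This is a hypothetical simplification, not the generators from the paper: the function name, the `p_repeat` control parameter, and the sampling scheme are illustrative assumptions about how class repetition might be controlled.

```python
import random

def make_cir_stream(classes, n_experiences, p_repeat, classes_per_exp, seed=0):
    """Sample a class-incremental-with-repetition (CIR) stream.

    With probability p_repeat, a slot in an experience reuses an
    already-seen class; otherwise a class is drawn from the full pool.
    """
    rng = random.Random(seed)
    seen, stream = [], []
    for _ in range(n_experiences):
        exp = []
        for _ in range(classes_per_exp):
            if seen and rng.random() < p_repeat:
                # Repetition: revisit a previously encountered class.
                exp.append(rng.choice(seen))
            else:
                c = rng.choice(classes)
                exp.append(c)
                if c not in seen:
                    seen.append(c)
        stream.append(exp)
    return stream
```

Varying `p_repeat` between 0 and 1 moves the stream between a strictly class-incremental regime and one with heavy repetition, which mirrors the role of interpretable control parameters described above.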
1 code implementation • 29 Jun 2022 • Federico Matteoni, Andrea Cossu, Claudio Gallicchio, Vincenzo Lomonaco, Davide Bacciu
Continual Learning (CL) on time series data represents a promising but under-studied avenue for real-world applications.
1 code implementation • 23 Jun 2022 • Mattia Sangermano, Antonio Carta, Andrea Cossu, Davide Bacciu
A popular solution in these scenarios is to use a small memory to retain old data and rehearse it over time.
1 code implementation • 19 May 2022 • Andrea Cossu, Tinne Tuytelaars, Antonio Carta, Lucia Passaro, Vincenzo Lomonaco, Davide Bacciu
We formalize and investigate the characteristics of the continual pre-training scenario in both language and vision environments, where a model is continually pre-trained on a stream of incoming data and only later fine-tuned to different downstream tasks.
no code implementations • 19 Mar 2022 • Gabriele Merlin, Vincenzo Lomonaco, Andrea Cossu, Antonio Carta, Davide Bacciu
Continual Learning requires the model to learn from a stream of dynamic, non-stationary data without forgetting previous knowledge.
1 code implementation • 13 Dec 2021 • Antonio Carta, Andrea Cossu, Vincenzo Lomonaco, Davide Bacciu
Learning continually from non-stationary data streams is a challenging research topic that has grown in popularity over the last few years.
no code implementations • 6 Dec 2021 • Andrea Cossu, Gabriele Graffieti, Lorenzo Pellegrini, Davide Maltoni, Davide Bacciu, Antonio Carta, Vincenzo Lomonaco
The ability of a model to learn continually can be empirically assessed in different continual learning scenarios.
no code implementations • 17 Nov 2021 • Andrea Cossu, Marta Ziosi, Vincenzo Lomonaco
The increasing attention on Artificial Intelligence (AI) regulation has led to the definition of a set of ethical principles grouped into the Sustainable AI framework.
1 code implementation • 17 May 2021 • Andrea Cossu, Davide Bacciu, Antonio Carta, Claudio Gallicchio, Vincenzo Lomonaco
Continual Learning (CL) refers to a learning setup where data is non-stationary and the model has to learn without forgetting existing knowledge.
4 code implementations • 1 Apr 2021 • Vincenzo Lomonaco, Lorenzo Pellegrini, Andrea Cossu, Antonio Carta, Gabriele Graffieti, Tyler L. Hayes, Matthias De Lange, Marc Masana, Jary Pomponi, Gido van de Ven, Martin Mundt, Qi She, Keiland Cooper, Jeremy Forest, Eden Belouadah, Simone Calderara, German I. Parisi, Fabio Cuzzolin, Andreas Tolias, Simone Scardapane, Luca Antiga, Subutai Ahmad, Adrian Popescu, Christopher Kanan, Joost Van de Weijer, Tinne Tuytelaars, Davide Bacciu, Davide Maltoni
Learning continually from non-stationary data streams is a long-standing goal and a challenging problem in machine learning.
2 code implementations • 29 Mar 2021 • Andrea Rosasco, Antonio Carta, Andrea Cossu, Vincenzo Lomonaco, Davide Bacciu
Replay strategies are Continual Learning techniques which mitigate catastrophic forgetting by keeping a buffer of patterns from previous experiences, which are interleaved with new data during training.
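The buffer-and-interleave mechanism described here can be sketched with reservoir sampling, a standard way to keep a fixed-size uniform sample of a stream. This is a generic sketch of replay, not the specific strategy proposed in the paper; the function names and `capacity`/`replay_size` parameters are illustrative.

```python
import random

def reservoir_update(buffer, item, n_seen, capacity):
    """Keep a fixed-size buffer that is a uniform sample of the stream.

    n_seen is the number of items observed before this one.
    """
    if len(buffer) < capacity:
        buffer.append(item)
    else:
        # Replace a stored pattern with probability capacity / (n_seen + 1).
        j = random.randrange(n_seen + 1)
        if j < capacity:
            buffer[j] = item

def make_minibatch(buffer, new_batch, replay_size):
    """Interleave stored patterns with the incoming batch for training."""
    replay = random.sample(buffer, min(replay_size, len(buffer)))
    return new_batch + replay
```

Each training step then operates on a mix of fresh and rehearsed patterns, which is what mitigates catastrophic forgetting: the gradient always sees a sample of past experiences alongside the new data.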
1 code implementation • 22 Mar 2021 • Antonio Carta, Andrea Cossu, Federico Errica, Davide Bacciu
In this work, we study the phenomenon of catastrophic forgetting in the graph representation learning scenario.
no code implementations • 12 Mar 2021 • Andrea Cossu, Antonio Carta, Vincenzo Lomonaco, Davide Bacciu
We propose two new benchmarks for CL with sequential data based on existing datasets, whose characteristics resemble real-world applications.
1 code implementation • 8 Apr 2020 • Andrea Cossu, Antonio Carta, Davide Bacciu
The ability to learn in dynamic, nonstationary environments without forgetting previous knowledge, also known as Continual Learning (CL), is a key enabler for scalable and trustworthy deployments of adaptive solutions.