1 code implementation • 30 Apr 2022 • Oleksiy Ostapenko, Timothee Lesort, Pau Rodríguez, Md Rifat Arefin, Arthur Douillard, Irina Rish, Laurent Charlin
Motivated by this, we study the efficacy of pre-trained vision models as a foundation for downstream continual learning (CL) scenarios.
1 code implementation • 25 Apr 2022 • Antoine Saporta, Arthur Douillard, Tuan-Hung Vu, Patrick Pérez, Matthieu Cord
Unsupervised Domain Adaptation (UDA) is a transfer learning task that aims to train a model on an unlabeled target domain by leveraging a labeled source domain.
1 code implementation • 22 Nov 2021 • Arthur Douillard, Alexandre Ramé, Guillaume Couairon, Matthieu Cord
Our strategy scales to a large number of tasks while incurring negligible memory and time overheads, thanks to a strict control of parameter expansion.
Ranked #1 on Incremental Learning on ImageNet - 10 steps
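The strict control of parameter expansion mentioned above means each new task adds only a small, fixed set of task-specific parameters on top of a shared encoder. The snippet below is a minimal sketch of that idea in PyTorch; the class name, the `encoder` argument, and the per-task token/head layout are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn


class ExpandableClassifier(nn.Module):
    """Sketch: a shared encoder plus one small learnable task token and
    linear head per task, so parameter growth per task stays negligible."""

    def __init__(self, encoder: nn.Module, embed_dim: int):
        super().__init__()
        self.encoder = encoder                 # shared across all tasks
        self.embed_dim = embed_dim
        self.task_tokens = nn.ParameterList()  # one small vector per task
        self.heads = nn.ModuleList()           # one linear head per task

    def add_task(self, num_new_classes: int):
        # Each new task adds only embed_dim + embed_dim * num_new_classes
        # parameters: negligible next to the shared encoder.
        self.task_tokens.append(nn.Parameter(torch.zeros(self.embed_dim)))
        self.heads.append(nn.Linear(self.embed_dim, num_new_classes))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = self.encoder(x)             # (batch, embed_dim)
        logits = [head(features + token)       # task-conditioned features
                  for token, head in zip(self.task_tokens, self.heads)]
        return torch.cat(logits, dim=1)        # all classes seen so far
```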
1 code implementation • 29 Jun 2021 • Arthur Douillard, Yifu Chen, Arnaud Dapogny, Matthieu Cord
In addition, we design an entropy-based pseudo-labeling of the background w.r.t. classes predicted by the old model to deal with background shift and avoid catastrophic forgetting of the old classes.
Ranked #3 on Overlapped 15-1 on PASCAL VOC 2012 (class-incremental learning, Continual Semantic Segmentation)
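The entropy-based pseudo-labeling described above can be sketched as follows: background pixels for which the previous model is confident (low entropy) inherit that model's predicted class, while the rest keep the current ground truth. The function name, tensor shapes, `background_id`, and `entropy_threshold` are assumptions for illustration, not the authors' code.

```python
import torch


def pseudo_label_background(old_logits, labels, background_id=0,
                            entropy_threshold=1.0):
    """Sketch: relabel background pixels with the previous model's
    confident predictions to counter background shift.

    old_logits: (B, C_old, H, W) logits from the frozen previous-step model
    labels:     (B, H, W) ground-truth labels of the current step
    """
    probs = torch.softmax(old_logits, dim=1)
    entropy = -(probs * torch.log(probs + 1e-8)).sum(dim=1)  # (B, H, W)
    old_pred = probs.argmax(dim=1)                           # (B, H, W)

    is_background = labels == background_id
    is_confident = entropy < entropy_threshold

    # Confident background pixels inherit the old model's prediction.
    new_labels = labels.clone()
    mask = is_background & is_confident
    new_labels[mask] = old_pred[mask]
    return new_labels
```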
1 code implementation • 11 Feb 2021 • Arthur Douillard, Timothée Lesort
Those drifts might cause interference in the trained model, and knowledge learned on previous states of the data distribution might be forgotten.
1 code implementation • CVPR 2021 • Arthur Douillard, Yifu Chen, Arnaud Dapogny, Matthieu Cord
In addition, we design an entropy-based pseudo-labeling of the background w.r.t. classes predicted by the old model to deal with background shift and avoid catastrophic forgetting of the old classes.
Ranked #1 on Domain 1-1 on Cityscapes val (class-incremental learning, Continual Semantic Segmentation)
no code implementations • 6 Oct 2020 • Alexandre Rame, Arthur Douillard, Charles Ollion
The second stage combines a color-name attention (dependent on the detected color) with an object attention (dependent on the clothing category), and finally uses the result to weight a spatial pooling over the image pixels' RGB values.
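As a rough illustration of that second stage, the sketch below fuses the two attention maps and uses them as weights for a spatial pooling of the raw RGB values; the function name, tensor shapes, and multiplicative fusion are assumptions for illustration, not the paper's exact design.

```python
import torch


def attended_rgb_pooling(color_attention, object_attention, image_rgb):
    """Sketch: fuse a color-name attention map with an object (clothing
    category) attention map, then spatially pool the RGB pixel values
    with the fused weights to estimate one color per image.

    color_attention:  (B, H, W) attention conditioned on the detected color name
    object_attention: (B, H, W) attention conditioned on the clothing category
    image_rgb:        (B, 3, H, W) input image in RGB
    """
    attention = color_attention * object_attention                   # (B, H, W)
    attention = attention / attention.sum(dim=(1, 2),
                                          keepdim=True).clamp(min=1e-8)

    # Weighted average of RGB values -> one estimated color per image.
    pooled = (image_rgb * attention.unsqueeze(1)).sum(dim=(2, 3))    # (B, 3)
    return pooled
```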
1 code implementation • 24 Jun 2020 • Arthur Douillard, Eduardo Valle, Charles Ollion, Thomas Robert, Matthieu Cord
Continual learning aims to learn tasks sequentially, with (often severe) constraints on the storage of old learning samples, without suffering from catastrophic forgetting.
1 code implementation • ECCV 2020 • Arthur Douillard, Matthieu Cord, Charles Ollion, Thomas Robert, Eduardo Valle
Lifelong learning has attracted much attention, but existing works still struggle to fight catastrophic forgetting and accumulate knowledge over long stretches of incremental learning.