Learning without Forgetting

29 Jun 2016 · Zhizhong Li, Derek Hoiem

When building a unified vision system or gradually adding new capabilities to a system, the usual assumption is that training data for all tasks is always available. However, as the number of tasks grows, storing and retraining on such data becomes infeasible. A new problem arises when we add new capabilities to a Convolutional Neural Network (CNN) but the training data for its existing capabilities are unavailable. We propose our Learning without Forgetting method, which uses only new task data to train the network while preserving its original capabilities. Our method performs favorably compared to the commonly used feature extraction and fine-tuning adaptation techniques, and performs similarly to multitask learning that uses the original task data we assume to be unavailable. A more surprising observation is that, when the old and new task datasets are similar, Learning without Forgetting may be able to replace fine-tuning for improved new task performance.
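
In practice, the method trains a joint loss: a standard cross-entropy on the new task labels plus a knowledge-distillation term that keeps the updated network's old-task outputs close to the original network's recorded responses on the new-task images. The sketch below is a minimal PyTorch-style illustration, not the authors' code: the two-head `model` interface, the `frozen_model` copy, the `lambda_old` balancing weight, and the helper names are illustrative assumptions; the temperature T = 2 follows the paper.

```python
import torch
import torch.nn.functional as F

def lwf_distillation_loss(recorded_logits, current_logits, T=2.0):
    # Modified cross-entropy with temperature T (the paper uses T = 2):
    # the softened recorded responses serve as soft targets for the
    # old-task head of the network being trained.
    soft_targets = F.softmax(recorded_logits / T, dim=1)
    log_probs = F.log_softmax(current_logits / T, dim=1)
    return -(soft_targets * log_probs).sum(dim=1).mean()

def lwf_train_step(model, frozen_model, x, y_new, optimizer, lambda_old=1.0):
    # Old-task responses of the frozen original network on new-task images.
    # (The paper records these once before training; recomputing them here
    # yields the same values because frozen_model never changes.)
    with torch.no_grad():
        old_targets = frozen_model(x)

    # Assumed interface: the adapted network returns logits for both
    # the old-task head and the newly added task head.
    old_logits, new_logits = model(x)
    loss = F.cross_entropy(new_logits, y_new) \
           + lambda_old * lwf_distillation_loss(old_targets, old_logits)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Note that no old-task data or labels appear anywhere in the step: the old network's own outputs on the new-task batch stand in for them, which is what distinguishes this setup from joint multitask training.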


Results from the Paper


| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Domain 11-1 | Cityscapes | LwF | mIoU | 57.3 | #5 |
| Domain 11-5 | Cityscapes | LwF | mIoU | 59.7 | #4 |
| Domain 1-1 | Cityscapes | LwF | mIoU | 33.0 | #4 |
| Disjoint 10-1 | PASCAL VOC 2012 | LwF | mIoU | 4.3 | #8 |
| Disjoint 15-1 | PASCAL VOC 2012 | LwF | mIoU | 5.3 | #8 |
| Overlapped 15-5 | PASCAL VOC 2012 | LwF | mIoU (val) | 55.0 | #12 |
| Overlapped 15-1 | PASCAL VOC 2012 | LwF | mIoU | 5.5 | #12 |
| Disjoint 15-5 | PASCAL VOC 2012 | LwF | mIoU | 54.9 | #8 |
| Overlapped 10-1 | PASCAL VOC 2012 | LwF | mIoU | 4.8 | #12 |
| Continual Learning | Visual Domain Decathlon (10 tasks) | LwF | Decathlon discipline (Score) | 2515 | #11 |
| Continual Learning | Visual Domain Decathlon (10 tasks) | LwF | Avg. Accuracy | 76.93 | #4 |

Methods


No methods listed for this paper.