Split-CIFAR-10
5 papers with code • 0 benchmarks • 0 datasets
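Split-CIFAR-10 is a protocol rather than a standalone dataset: the ten CIFAR-10 classes are partitioned into a sequence of tasks (most commonly five two-class tasks) that a model learns one after another, so catastrophic forgetting can be measured. A minimal sketch of the common five-task construction, assuming PyTorch/torchvision; the class ordering and the number of classes per task vary across papers:

```python
import torch
from torch.utils.data import Subset
from torchvision import datasets, transforms

def make_split_cifar10(root="./data", classes_per_task=2):
    # Load the full CIFAR-10 training set once.
    train = datasets.CIFAR10(root, train=True, download=True,
                             transform=transforms.ToTensor())
    targets = torch.tensor(train.targets)

    # Partition the 10 classes into consecutive groups, one Subset per task.
    tasks = []
    for t in range(10 // classes_per_task):
        task_classes = torch.arange(t * classes_per_task,
                                    (t + 1) * classes_per_task)
        idx = torch.isin(targets, task_classes).nonzero(as_tuple=True)[0]
        tasks.append(Subset(train, idx.tolist()))
    return tasks  # tasks[0] holds classes {0, 1}, tasks[1] holds {2, 3}, ...
```

A continual learner then trains on `tasks[0]`, `tasks[1]`, ... in order, without revisiting earlier tasks' data.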
Most implemented papers
Self-Attention Meta-Learner for Continual Learning
In this paper, we propose a new method, named Self-Attention Meta-Learner (SAM), which learns prior knowledge for continual learning that permits learning a sequence of tasks while avoiding catastrophic forgetting.
Mixture-of-Variational-Experts for Continual Learning
A key weakness of machine learning models is their poor ability to solve new problems without forgetting previously acquired knowledge.
Overcoming Recency Bias of Normalization Statistics in Continual Learning: Balance and Adaptation
Continual learning entails learning a sequence of tasks and balancing their knowledge appropriately.
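The recency bias at issue here comes from how batch normalization maintains running statistics: an exponential moving average weights recent batches geometrically more, so after training on a new task the stored mean and variance reflect the latest task rather than a balance over all tasks seen. A toy illustration of that drift (plain PyTorch; this is the problem setup, not the paper's proposed balancing method):

```python
import torch

torch.manual_seed(0)
momentum = 0.1                 # PyTorch's default BatchNorm momentum
running_mean = torch.zeros(1)

# Task A data centred at 0, then task B data centred at 5.
for task_mean in (0.0, 5.0):
    for _ in range(100):       # 100 mini-batches per task
        batch = torch.randn(64) + task_mean
        running_mean = (1 - momentum) * running_mean + momentum * batch.mean()

print(running_mean)  # ~5.0: task A's contribution has decayed to (0.9)**100
```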
Negotiated Representations to Prevent Forgetting in Machine Learning Applications
By evaluating our method on challenging benchmarks, including Split-CIFAR-10, we aim to showcase its potential for mitigating catastrophic forgetting and improving neural network performance in continual learning settings.
Enhancing Robustness in Incremental Learning with Adversarial Training
In this study, we investigate Adversarially Robust Class Incremental Learning (ARCIL), which addresses adversarial robustness in the class-incremental setting.
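Adversarial training augments or replaces clean mini-batches with inputs perturbed to maximize the loss; in the incremental setting this inner step runs inside each task's training loop. A minimal FGSM-based sketch in PyTorch; `model`, `optimizer`, and the epsilon value are placeholders, and this is generic adversarial training rather than the ARCIL method itself:

```python
import torch
import torch.nn.functional as F

def adversarial_training_step(model, optimizer, x, y, epsilon=8 / 255):
    # FGSM inner step: perturb inputs in the direction that increases the loss.
    x_adv = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x_adv), y)
    grad = torch.autograd.grad(loss, x_adv)[0]
    x_adv = (x_adv + epsilon * grad.sign()).clamp(0, 1).detach()  # inputs in [0, 1]

    # Outer step: update the model on the adversarial examples.
    optimizer.zero_grad()
    F.cross_entropy(model(x_adv), y).backward()
    optimizer.step()
```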