Split-CIFAR-10

4 papers with code • 0 benchmarks • 0 datasets

Split-CIFAR-10 is a continual learning benchmark in which the 10 classes of CIFAR-10 are partitioned into a sequence of disjoint tasks, most commonly five tasks of two classes each. A model is trained on the tasks one after another and evaluated on how well it retains earlier tasks, i.e. how much it suffers from catastrophic forgetting.
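
For reference, below is a minimal sketch of how such a split is typically constructed, assuming PyTorch and torchvision are available. The helper name make_split_cifar10, the data root, and the in-order class pairing (0-1, 2-3, ...) are illustrative assumptions, not a protocol fixed by the papers listed on this page.

```python
import numpy as np
from torch.utils.data import Subset
from torchvision import datasets, transforms

def make_split_cifar10(root="./data", train=True, n_tasks=5):
    """Partition CIFAR-10 into n_tasks disjoint class-incremental tasks."""
    full = datasets.CIFAR10(root=root, train=train, download=True,
                            transform=transforms.ToTensor())
    targets = np.asarray(full.targets)
    classes_per_task = 10 // n_tasks
    tasks = []
    for t in range(n_tasks):
        # Classes assigned to task t, e.g. (0, 1), (2, 3), ... for the
        # assumed in-order split; papers may use other class orderings.
        cls = list(range(t * classes_per_task, (t + 1) * classes_per_task))
        idx = np.flatnonzero(np.isin(targets, cls)).tolist()
        tasks.append(Subset(full, idx))
    return tasks  # one dataset per sequential task

train_tasks = make_split_cifar10(train=True)  # five 2-class tasks by default
```

The tasks are then presented to the learner sequentially, without revisiting earlier tasks' data except through whatever replay or regularization mechanism the method under evaluation provides.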

Most implemented papers

Self-Attention Meta-Learner for Continual Learning

GhadaSokar/Self-Attention-Meta-Learner-for-Continual-Learning • 28 Jan 2021

In this paper, we propose a new method, named Self-Attention Meta-Learner (SAM), which learns prior knowledge for continual learning that permits learning a sequence of tasks while avoiding catastrophic forgetting.

Mixture-of-Variational-Experts for Continual Learning

hhihn/HVCL • 25 Oct 2021

One weakness of machine learning models is their poor ability to solve new problems without forgetting previously acquired knowledge.

Overcoming Recency Bias of Normalization Statistics in Continual Learning: Balance and Adaptation

lvyilin/adab2n • NeurIPS 2023

Continual learning entails learning a sequence of tasks and balancing their knowledge appropriately.

Negotiated Representations to Prevent Forgetting in Machine Learning Applications

nurikorhan/negotiated-representations-for-continual-learning • 30 Nov 2023

By evaluating our method on challenging benchmarks such as Split-CIFAR-10, we aim to showcase its potential for addressing catastrophic forgetting and improving the performance of neural networks in continual learning settings.