Search Results for author: Lapo Frati

Found 5 papers, 2 papers with code

Reset It and Forget It: Relearning Last-Layer Weights Improves Continual and Transfer Learning

no code implementations • 12 Oct 2023 • Lapo Frati, Neil Traft, Jeff Clune, Nick Cheney

We show that our zapping procedure results in improved transfer accuracy and/or more rapid adaptation in both standard fine-tuning and continual learning settings, while being simple to implement and computationally efficient.

Continual Learning • Meta-Learning • +1
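
The page does not detail the zapping procedure itself; the sketch below is one plausible reading under stated assumptions: re-initializing ("zapping") a network's last-layer weights before fine-tuning, so only the backbone's learned features carry over to the new task. The SmallNet module, the zap_head helper, the layer sizes, and the optimizer settings are illustrative placeholders, not the authors' implementation.

# Hedged sketch of the "zapping" idea: reset the last-layer weights of a
# trained network before adapting it to a new task, so the head is relearned
# from scratch while the backbone's features are kept.
import torch
import torch.nn as nn

class SmallNet(nn.Module):
    def __init__(self, in_dim: int = 784, hidden: int = 256, num_classes: int = 10):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.head = nn.Linear(hidden, num_classes)  # the "last layer"

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.backbone(x))

def zap_head(model: SmallNet, num_classes: int) -> None:
    """Replace the final layer with a freshly initialized one (the "zap")."""
    in_features = model.head.in_features
    model.head = nn.Linear(in_features, num_classes)

# Pretrain the model (omitted), then zap the head before fine-tuning on a new task.
model = SmallNet()
zap_head(model, num_classes=10)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2, momentum=0.9)

The only moving part is the re-initialized head; everything else in the sketch is ordinary fine-tuning boilerplate.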

Coping with seasons: evolutionary dynamics of gene networks in a changing environment

1 code implementation • 4 Jul 2023 • Csenge Petak, Lapo Frati, Melissa H. Pespeni, Nick Cheney

In contrast, conservative bet-hedgers produce a set of offspring that all have an intermediate phenotype compared to the specialists.

A good body is all you need: avoiding catastrophic interference via agent architecture search

no code implementations • 18 Aug 2021 • Joshua Powers, Ryan Grindle, Lapo Frati, Josh Bongard

Efforts to combat catastrophic interference to date focus on novel neural architectures or training methods, with a recent emphasis on policies with good initial settings that facilitate training in new environments.

Learning to Continually Learn

5 code implementations • 21 Feb 2020 • Shawn Beaulieu, Lapo Frati, Thomas Miconi, Joel Lehman, Kenneth O. Stanley, Jeff Clune, Nick Cheney

Continual lifelong learning requires an agent or model to learn many sequentially ordered tasks, building on previous knowledge without catastrophically forgetting it.

Continual Learning • Meta-Learning

Embodiment dictates learnability in neural controllers

no code implementations • 15 Oct 2019 • Joshua Powers, Ryan Grindle, Sam Kriegman, Lapo Frati, Nick Cheney, Josh Bongard

Catastrophic forgetting continues to severely restrict the learnability of controllers suitable for multiple task environments.
