1 code implementation • 28 May 2025 • Eleni Nisioti, Joachim Winther Pedersen, Erwan Plantec, Milton L. Montero, Sebastian Risi
The ability to continuously and efficiently transfer skills across tasks is a hallmark of biological intelligence and a long-standing goal in artificial systems.
no code implementations • 16 Mar 2025 • Binggwong Leung, Worasuchad Haomachai, Joachim Winther Pedersen, Sebastian Risi, Poramate Manoonpong
In this work, we improve the Hebbian network with a weight normalization mechanism to prevent weight divergence, analyze the principal components of the Hebbian network's weights, and perform a thorough evaluation of network performance in locomotion control for real 18-DOF dung beetle-like and 16-DOF gecko-like robots.
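The core idea of bounding Hebbian weight growth can be sketched as follows. This is a minimal illustration, not the paper's exact mechanism: it assumes a plain Hebbian co-activity term followed by a per-neuron norm clamp; the function name and the unit-norm threshold are illustrative choices.

```python
import numpy as np

def hebbian_update(W, pre, post, lr=0.01):
    """One Hebbian step with weight normalization (illustrative sketch).

    Strengthens weights where pre- and post-synaptic activity co-occur,
    then rescales each neuron's incoming weight vector so its norm
    cannot grow without bound (preventing weight divergence).
    """
    W = W + lr * np.outer(post, pre)              # Hebbian co-activity term
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    W = W / np.maximum(norms, 1.0)                # clamp rows exceeding unit norm
    return W
```

Without the normalization step, repeated correlated activity makes the weights grow unboundedly; with it, each row's norm stays at or below one no matter how many updates are applied.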
no code implementations • 14 Jun 2024 • Eleni Nisioti, Claire Glanois, Elias Najarro, Andrew Dai, Elliot Meyerson, Joachim Winther Pedersen, Laetitia Teodorescu, Conor F. Hayes, Shyam Sudhakaran, Sebastian Risi
Large Language Models (LLMs) have taken the field of AI by storm, but their adoption in the field of Artificial Life (ALife) has been, so far, relatively reserved.
1 code implementation • 14 May 2024 • Eleni Nisioti, Erwan Plantec, Milton Montero, Joachim Winther Pedersen, Sebastian Risi
Artificial neural networks (ANNs) are traditionally optimized in the space of weights.
no code implementations • 6 Apr 2024 • Joachim Winther Pedersen, Erwan Plantec, Eleni Nisioti, Milton Montero, Sebastian Risi
Artificial neural networks used for reinforcement learning are structurally rigid, meaning that each optimized parameter of the network is tied to its specific placement in the network structure.
no code implementations • 25 May 2023 • Joachim Winther Pedersen, Sebastian Risi
Biological nervous systems consist of networks of diverse, sophisticated information processors in the form of neurons of different classes.
no code implementations • 12 May 2022 • Joachim Winther Pedersen, Sebastian Risi
Organisms in nature have evolved to exhibit flexibility in the face of changes to the environment and to themselves.
no code implementations • 16 Apr 2021 • Joachim Winther Pedersen, Sebastian Risi
Inspired by the biological phenomenon of the genomic bottleneck, we show that by allowing multiple connections in the network to share the same local learning rule, it is possible to drastically reduce the number of trainable parameters, while obtaining a more robust agent.
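The parameter reduction from sharing learning rules can be sketched as follows. This is an illustrative assumption of how such sharing might look, using ABCD-style Hebbian coefficients; the assignment scheme, rule count, and update form are hypothetical, not the paper's exact method.

```python
import numpy as np

rng = np.random.default_rng(0)

n_pre, n_post = 8, 4        # 32 connections in total
n_rules = 3                 # far fewer trainable rules than connections

# Each connection is assigned one of a small set of shared local rules
rule_assignment = rng.integers(0, n_rules, size=(n_post, n_pre))
# Each rule carries ABCD Hebbian coefficients; only these are trainable
rules = rng.normal(size=(n_rules, 4))

def plastic_update(W, pre, post, lr=0.01):
    """Update every weight with its assigned shared rule:
    dw = A*pre*post + B*pre + C*post + D."""
    A, B, C, D = (rules[rule_assignment, k] for k in range(4))
    dw = A * np.outer(post, pre) + B * pre[None, :] + C * post[:, None] + D
    return W + lr * dw
```

Here the evolved/trained parameters are the 3 × 4 = 12 rule coefficients rather than one rule per connection (32 × 4 = 128), a drastic compression analogous to the genomic bottleneck.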