no code implementations • 9 Sep 2023 • Peter Ashwin, Andrea Ceni
Casting our theoretical findings in the RNN computing framework, we find that for small-amplitude forcing the echo index equals the number of attractors of the input-free system, while for large-amplitude forcing the echo index reduces to one.
no code implementations • 5 Aug 2023 • Andrea Ceni, Claudio Gallicchio
With the goal of bringing together the fading memory property and the ability to retain as much memory as possible, in this paper we introduce a new ESN architecture, called the Edge of Stability Echo State Network (ES$^2$N).
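The ES$^2$N idea of operating at the edge of stability can be sketched as a reservoir whose state update mixes an orthogonal (norm-preserving) linear map with a standard nonlinear ESN map. This is a minimal illustrative sketch, not the paper's exact formulation: the dimensions, the mixing coefficient `beta`, the spectral-radius scaling, and the input signal are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical hyperparameters (illustrative, not taken from the paper).
n = 100      # reservoir size
beta = 0.1   # mixing coefficient between orthogonal and nonlinear parts

# Random orthogonal matrix via QR decomposition (norm-preserving linear map).
O, _ = np.linalg.qr(rng.standard_normal((n, n)))

# Random reservoir weights, rescaled to spectral radius 0.9 as in
# common ESN practice, plus random input weights.
W = rng.standard_normal((n, n))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-1.0, 1.0, size=(n, 1))

def es2n_step(x, u):
    """One reservoir update: convex combination of an orthogonal linear
    map and a tanh reservoir map (a sketch of the ES^2N idea)."""
    return (1.0 - beta) * (O @ x) + beta * np.tanh(W @ x + (W_in @ u))

# Drive the reservoir with a toy sinusoidal input.
x = np.zeros(n)
for t in range(200):
    x = es2n_step(x, np.array([np.sin(0.1 * t)]))
```

The intuition the sketch captures: the orthogonal part carries state forward without contracting it (long memory), while the small nonlinear part keeps the fading memory property.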
no code implementations • 3 Oct 2022 • Andrea Ceni
Since the vanishing/exploding (V/E) gradient issue plaguing the training of neural networks (NNs) was recognized in the early nineties, significant effort has been devoted to overcoming this obstacle.
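The V/E gradient issue can be demonstrated in a few lines: in backpropagation through time, the gradient picks up one factor of the recurrent Jacobian per step, so its norm scales roughly like the spectral radius of the recurrent matrix raised to the sequence length. The sketch below uses a linear recurrence for clarity; the matrix sizes and radii are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def jacobian_product_norm(spectral_radius, T=50, n=20):
    """Norm of the product of T recurrent Jacobians for a linear RNN
    whose weight matrix is rescaled to the given spectral radius."""
    W = rng.standard_normal((n, n))
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    J = np.eye(n)
    for _ in range(T):
        J = W @ J  # one Jacobian factor per unrolled time step
    return np.linalg.norm(J)

vanish = jacobian_product_norm(0.5)   # radius < 1: gradients shrink to ~0
explode = jacobian_product_norm(1.5)  # radius > 1: gradients blow up
```

With radius 0.5 the product norm collapses geometrically over 50 steps, while with radius 1.5 it grows by many orders of magnitude, which is the obstacle the subsequent approaches try to overcome.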
no code implementations • 21 Aug 2019 • Andrea Ceni, Simona Olmi, Alessandro Torcini, David Angulo-Garcia
Coupling among neural rhythms is one of the most important mechanisms at the basis of cognitive processes in the brain.
Adaptation and Self-Organizing Systems; Disordered Systems and Neural Networks
no code implementations • 27 Jul 2018 • Andrea Ceni, Peter Ashwin, Lorenzo Livi
Simulations conducted on a controlled benchmark task confirm the relevance of these attractors for interpreting the behaviour of recurrent neural networks, at least for tasks that involve learning a finite number of stable states and transitions between them.