no code implementations • 11 Jan 2023 • Irene Estébanez, Apostolos Argyris, Ingo Fischer
Time delay reservoir computing (TDRC) using semiconductor lasers (SLs) has proven to be a promising photonic analog approach for information processing.
no code implementations • 28 Jul 2022 • Daniel J. Gauthier, Ingo Fischer, André Röhm
Reservoir computing is a machine learning approach that can generate a surrogate model of a dynamical system.
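The general recipe can be sketched with a minimal echo state network: a fixed random recurrent reservoir is driven by the signal, and a ridge-regression readout is trained to predict the next value, yielding a data-driven surrogate of the underlying dynamics. All sizes and scalings below are assumptions for illustration, and a simple sine wave stands in for the dynamical system.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target: a sine wave standing in for a dynamical system's time series.
T = 2000
u = np.sin(0.1 * np.arange(T + 1))

# Fixed random reservoir (echo state network): only W_out is trained.
N = 200
W_in = rng.uniform(-0.5, 0.5, N)
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1

x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])
    states[t] = x

# Ridge-regression readout trained to predict the next input value.
washout = 200
X, y = states[washout:], u[washout + 1 : T + 1]
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ y)

err = np.sqrt(np.mean((X @ W_out - y) ** 2))
print(f"one-step RMSE: {err:.4f}")
```

Feeding the readout's predictions back as input would run the trained reservoir autonomously, which is the surrogate-model use discussed in the paper.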
no code implementations • 5 Nov 2021 • Mirko Goldmann, Claudio R. Mirasso, Ingo Fischer, Miguel C. Soriano
We train these networks to predict the dynamics of delay-dynamical and spatio-temporal systems for a single system size.
no code implementations • 6 Aug 2021 • André Röhm, Daniel J. Gauthier, Ingo Fischer
Reservoir computers are powerful tools for chaotic time series prediction.
no code implementations • 17 May 2021 • Irene Estébanez, Shi Li, Janek Schwind, Ingo Fischer, Stephan Pachnicke, Apostolos Argyris
In this work, we show that the effectiveness of the internal fading memory depends significantly on the properties of the signal to be processed.
1 code implementation • 19 Nov 2020 • Florian Stelzer, André Röhm, Raul Vicente, Ingo Fischer, Serhiy Yanchuk
We present a method for folding a deep neural network of arbitrary size into a single neuron with multiple time-delayed feedback loops.
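The folding idea is that a feed-forward network's nodes can be evaluated one after another by a single neuron, with each layer connection realized as a delayed tap on the neuron's own past output. Below is a minimal sketch, assuming a tiny 3-4-2 tanh network and treating the activation history list as the delay line; the network shape and the `delayed` helper are illustrative, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(2)

# A small fully connected network: 3 inputs -> 4 hidden -> 2 outputs.
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(2, 4))
x = rng.normal(size=3)

# Conventional layer-by-layer forward pass, for reference.
h = np.tanh(W1 @ x)
y_ref = np.tanh(W2 @ h)

def delayed(a, d):
    """Read the neuron's own past output d time steps back (a delay tap)."""
    return a[len(a) - d]

# Time-multiplexed emulation: one neuron evaluated at successive steps.
a = list(x)                      # input values occupy the first time steps
for i in range(4):               # hidden neurons occupy steps 3..6
    t = len(a)
    taps = [(t - j, W1[i, j]) for j in range(3)]        # (delay, weight)
    a.append(np.tanh(sum(w * delayed(a, d) for d, w in taps)))
for i in range(2):               # output neurons occupy steps 7..8
    t = len(a)
    taps = [(t - (3 + j), W2[i, j]) for j in range(4)]
    a.append(np.tanh(sum(w * delayed(a, d) for d, w in taps)))

y_seq = np.array(a[-2:])
print("sequential evaluation matches layered network:",
      np.allclose(y_seq, y_ref))
```

Each layer thus becomes a set of delay lengths and modulated feedback weights on one neuron, which is what makes a single-node hardware implementation of a deep network conceivable.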
no code implementations • 14 Nov 2017 • Julian Bueno, Sheler Maktoobi, Luc Froehly, Ingo Fischer, Maxime Jacquot, Laurent Larger, Daniel Brunner
A fully parallel and efficient learning hardware realization of photonic neural networks with numerous nonlinear nodes had so far been lacking.
no code implementations • 12 Jan 2015 • Michiel Hermans, Miguel Soriano, Joni Dambre, Peter Bienstman, Ingo Fischer
We perform physical experiments demonstrating that the obtained input encodings work well in practice, and we show that optimized systems perform significantly better than the common Reservoir Computing approach.