Search Results for author: Joseph D. Hart

Found 5 papers, 2 papers with code

Optimizing time-shifts for reservoir computing using a rank-revealing QR algorithm

no code implementations · 29 Nov 2022 · Joseph D. Hart, Francesco Sorrentino, Thomas L. Carroll

Reservoir computing is a recurrent neural network paradigm in which only the output layer is trained.
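
As a concrete picture of this paradigm, here is a minimal echo-state-style reservoir sketch in NumPy: the recurrent and input weights stay fixed at random, and only a linear readout is trained (here by ridge regression). The sizes, the toy input signal, and all parameters are illustrative assumptions, not values from the paper.

```python
# Minimal echo-state reservoir sketch: fixed random weights, trained readout.
import numpy as np

rng = np.random.default_rng(0)

N, T = 100, 1000                       # reservoir nodes, time steps (illustrative)
u = np.sin(0.1 * np.arange(T + 1))     # toy scalar input signal
y_target = u[1:]                       # task: one-step-ahead prediction

W_in = rng.uniform(-0.5, 0.5, N)       # fixed input weights (never trained)
W = rng.normal(0, 1, (N, N))           # fixed recurrent weights (never trained)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius below 1

# Drive the reservoir and record its states.
X = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])
    X[t] = x

# Train only the output layer: a ridge-regression readout.
lam = 1e-6
W_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y_target)
err = np.linalg.norm(X @ W_out - y_target) / np.linalg.norm(y_target)
print("training NRMSE:", err)
```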

Time Shifts to Reduce the Size of Reservoir Computers

no code implementations · 3 May 2022 · Thomas L. Carroll, Joseph D. Hart

Additionally, the need to create and connect large numbers of nonlinear nodes makes it difficult to design and build analog reservoir computers that can be faster and consume less power than digital reservoir computers.
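
One way to picture the time-shift idea behind these two papers: record the outputs of a small reservoir and stack time-shifted copies of each node's signal as extra readout features, so fewer physical nodes supply the same number of readout columns; a pivoted (rank-revealing) QR factorization can then rank those columns by linear independence. The sketch below uses arbitrary shift values and SciPy's pivoted QR as a stand-in; it illustrates the idea, not the authors' optimization procedure.

```python
# Time-shift augmentation sketch: shifted copies of node outputs as features.
import numpy as np
from scipy.linalg import qr

rng = np.random.default_rng(1)
N, T = 10, 500                          # small reservoir, time steps (illustrative)
u = np.sin(0.1 * np.arange(T))          # toy drive signal
W_in = rng.uniform(-0.5, 0.5, N)
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

X = np.zeros((T, N))                    # recorded node outputs
x = np.zeros(N)
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])
    X[t] = x

shifts = [0, 3, 7]                      # arbitrary example shifts (in steps)
m = max(shifts)
# Each node contributes one readout column per shift, aligned in time.
X_aug = np.hstack([X[m - s : T - s] for s in shifts])

# Rank the augmented columns with a pivoted (rank-revealing) QR; the columns
# listed first in `piv` are the most linearly independent ones.
_, R, piv = qr(X_aug, mode="economic", pivoting=True)
print(X_aug.shape, "leading columns:", piv[:5])
```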

Machine Learning Link Inference of Noisy Delay-coupled Networks with Opto-Electronic Experimental Tests

no code implementations · 29 Oct 2020 · Amitava Banerjee, Joseph D. Hart, Rajarshi Roy, Edward Ott

To achieve this, we first train a type of machine learning system known as reservoir computing to mimic the dynamics of the unknown network.

BIG-bench Machine Learning · Time Series
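
The sketch below illustrates the flavor of that first step on a toy problem: fit a reservoir to one-step-ahead prediction of time series from a small simulated network, then score candidate links by the trained model's input-to-output sensitivity. The toy dynamics, the Jacobian-based scoring, and all parameters are illustrative assumptions, not the procedure of the paper.

```python
# Toy link-inference sketch: reservoir surrogate plus sensitivity scoring.
import numpy as np

rng = np.random.default_rng(2)

# Toy "unknown network": n noisy coupled logistic maps with adjacency A_true.
n, T = 4, 2000
A_true = (rng.random((n, n)) < 0.4).astype(float)
np.fill_diagonal(A_true, 0)
u = np.zeros((T + 1, n))
u[0] = rng.random(n)
for t in range(T):
    drive = A_true @ u[t] / max(A_true.sum(axis=1).max(), 1.0)
    u[t + 1] = np.clip(3.7 * u[t] * (1 - u[t]) + 0.1 * drive
                       + 1e-3 * rng.normal(size=n), 0.0, 1.0)

# Fixed random reservoir; train only the readout (ridge regression).
N = 200
W_in = rng.uniform(-0.5, 0.5, (N, n))
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
X = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    x = np.tanh(W @ x + W_in @ u[t])
    X[t] = x
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ u[1:]).T  # (n, N)

# Score link i -> j by the time-averaged |Jacobian| of the trained map:
# d y_j / d u_i = [W_out diag(1 - x_t**2) W_in]_{j, i} at each step.
score = np.zeros((n, n))
for t in range(T):
    J = W_out @ ((1 - X[t] ** 2)[:, None] * W_in)  # (n outputs, n inputs)
    score += np.abs(J)
np.fill_diagonal(score, 0)
print("true links:\n", A_true)
print("link scores:\n", np.round(score / T, 3))
```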

Critical Switching in Globally Attractive Chimeras

1 code implementation · 18 Nov 2019 · Yuanzhao Zhang, Zachary G. Nicolaou, Joseph D. Hart, Rajarshi Roy, Adilson E. Motter

We report on a new type of chimera state that attracts almost all initial conditions and exhibits power-law switching behavior in networks of coupled oscillators.

Disordered Systems and Neural Networks · Dynamical Systems · Adaptation and Self-Organizing Systems · Chaotic Dynamics · Pattern Formation and Solitons
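
For context, chimera states are most often demonstrated in two populations of identical phase oscillators with stronger intra- than inter-population coupling and a phase lag, as in the Abrams-Strogatz-type model sketched below. The parameters are textbook-style choices, and this bare setup is not expected to reproduce the power-law switching reported in the paper.

```python
# Two-population phase-oscillator sketch: a classic chimera-supporting setup.
import numpy as np

rng = np.random.default_rng(3)
M = 32                                  # oscillators per population
mu, nu = 0.6, 0.4                       # intra- / inter-population coupling
alpha = np.pi / 2 - 0.05                # phase lag near pi/2
dt, steps = 0.05, 20000

# One population started near sync, the other fully incoherent.
theta = np.concatenate([0.01 * rng.normal(size=M),
                        2 * np.pi * rng.random(M)])

# Block coupling matrix: mu within a population, nu across populations.
K = np.block([[mu * np.ones((M, M)), nu * np.ones((M, M))],
              [nu * np.ones((M, M)), mu * np.ones((M, M))]]) / M

for _ in range(steps):                  # simple Euler integration
    diff = theta[None, :] - theta[:, None] - alpha
    theta = theta + dt * (K * np.sin(diff)).sum(axis=1)

def order_param(phi):
    """Kuramoto order parameter |r| of one population (1 = full sync)."""
    return abs(np.exp(1j * phi).mean())

# A chimera shows one population synchronized, the other incoherent.
print("population sync levels:", order_param(theta[:M]), order_param(theta[M:]))
```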

Topological Control of Synchronization Patterns: Trading Symmetry for Stability

1 code implementation · 8 Feb 2019 · Joseph D. Hart, Yuanzhao Zhang, Rajarshi Roy, Adilson E. Motter

Symmetries are ubiquitous in network systems and have profound impacts on the observable dynamics.

Adaptation and Self-Organizing Systems · Disordered Systems and Neural Networks · Chaotic Dynamics · Pattern Formation and Solitons
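
A small worked example of what a network symmetry means formally: a relabeling of nodes is a symmetry exactly when its permutation matrix commutes with the adjacency matrix (PA = AP). The graph and permutations below are illustrative, unrelated to the networks studied in the paper.

```python
# Checking network symmetries: a permutation P is a symmetry iff PA = AP.
import numpy as np

# Adjacency matrix of a 4-cycle: nodes 0-1-2-3-0.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]])

def is_symmetry(perm, A):
    """True if relabeling nodes by `perm` leaves the network unchanged."""
    P = np.eye(len(perm))[perm]        # permutation matrix from index list
    return np.array_equal(P @ A, A @ P)

print(is_symmetry([1, 2, 3, 0], A))    # rotating the cycle: True
print(is_symmetry([1, 0, 2, 3], A))    # swapping only nodes 0 and 1: False
```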
