Search Results for author: Laurent Larger

Found 8 papers, 0 papers with code

Efficient Design of Hardware-Enabled Reservoir Computing in FPGAs

no code implementations · 4 May 2018 · Bogdan Penkovsky, Laurent Larger, Daniel Brunner

In this work, we propose a new approach to the efficient optimization and implementation of reservoir computing hardware, reducing the required domain expertise and optimization effort.

Dimensionality Reduction

Nonlinear memory capacity of parallel time-delay reservoir computers in the processing of multidimensional signals

no code implementations · 13 Oct 2015 · Lyudmila Grigoryeva, Julie Henriques, Laurent Larger, Juan-Pablo Ortega

This paper addresses the reservoir design problem in the context of delay-based reservoir computers for multidimensional input signals, parallel architectures, and real-time multitasking.
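A delay-based reservoir can be sketched in a few lines: a single nonlinear node with delayed feedback, whose delay line is sampled at N "virtual nodes", plus a linear readout. The following is a minimal illustrative sketch, not the paper's exact model; all parameters (mask values, feedback gain, node count) are assumptions, and the virtual-node coupling is simplified so each node sees only its own delayed state.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50                             # virtual nodes per delay span (assumed)
T = 200                            # input time steps
u = rng.uniform(-1, 1, T)          # scalar input signal
mask = rng.choice([-0.5, 0.5], N)  # input mask spreading u over virtual nodes

x = np.zeros(N)                    # state of one delay span
states = np.empty((T, N))
for t in range(T):
    prev = x.copy()
    for i in range(N):
        # simplified dynamics: each virtual node couples only to its
        # own delayed state plus the masked input (hypothetical gains)
        x[i] = np.tanh(0.8 * prev[i] + mask[i] * u[t])
    states[t] = x

# linear readout (ridge regression) trained to recall the previous input
X, y = states[1:], u[:-1]
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ y)
pred = X @ W_out
```

The reservoir design problem the paper studies is, in this picture, the choice of mask, gains, and delay structure; for multidimensional inputs, one mask column per input channel would be needed.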

Fundamental aspects of noise in analog-hardware neural networks

no code implementations · 21 Jul 2019 · Nadezhda Semenova, Xavier Porte, Louis Andreoli, Maxime Jacquot, Laurent Larger, Daniel Brunner

The system under study consists of noisy linear nodes, and we investigate the signal-to-noise ratio at the network's outputs, which sets the upper limit on such a system's computing accuracy.
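The quantity studied here can be estimated numerically by comparing noisy and noise-free passes through one layer of linear nodes. A minimal sketch, assuming Gaussian additive node noise and random weights (illustrative choices, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_out = 100, 10
# random linear weights, scaled so each output has unit signal variance
W = rng.normal(0, 1 / np.sqrt(n_in), (n_out, n_in))

signal = rng.normal(0, 1, (5000, n_in))   # ensemble of input signals
noise_std = 0.1                           # assumed node-noise level

clean_out = signal @ W.T
noisy_out = clean_out + rng.normal(0, noise_std, clean_out.shape)

# SNR at each output node: signal power over measured noise power
snr = clean_out.var(axis=0) / ((noisy_out - clean_out) ** 2).mean(axis=0)
```

With these assumed scalings the per-node SNR is roughly (1 / noise_std)**2; repeating the estimate for correlated or multiplicative noise would change the picture, which is what makes the analytical treatment in the paper useful.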

Management

Reservoir-size dependent learning in analogue neural networks

no code implementations · 23 Jul 2019 · Xavier Porte, Louis Andreoli, Maxime Jacquot, Laurent Larger, Daniel Brunner

However, important questions regarding the impact of reservoir size and learning routines on convergence speed during learning remain unaddressed.

Boolean learning under noise-perturbations in hardware neural networks

no code implementations · 27 Mar 2020 · Louis Andreoli, Xavier Porte, Stéphane Chrétien, Maxime Jacquot, Laurent Larger, Daniel Brunner

A high efficiency hardware integration of neural networks benefits from realizing nonlinearity, network connectivity and learning fully in a physical substrate.

Understanding and mitigating noise in trained deep neural networks

no code implementations · 12 Mar 2021 · Nadezhda Semenova, Laurent Larger, Daniel Brunner

Here, we determine for the first time the propagation of noise in deep neural networks comprising noisy nonlinear neurons in trained fully connected layers.
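Noise propagation through stacked noisy nonlinear layers can be probed numerically by running the same input through a network twice, with and without injected node noise, and measuring the variance of the difference at the output. A sketch under assumed conditions (random untrained weights, tanh neurons, additive Gaussian node noise), not the paper's analytical result:

```python
import numpy as np

rng = np.random.default_rng(2)
width, depth = 64, 4
# random fully connected layers (assumed untrained, unit-gain scaling)
layers = [rng.normal(0, 1 / np.sqrt(width), (width, width))
          for _ in range(depth)]

def forward(x, noise_std=0.0):
    """Pass x through all layers, injecting noise at every noisy layer."""
    for W in layers:
        x = np.tanh(x @ W.T)
        if noise_std:
            x = x + rng.normal(0, noise_std, x.shape)  # additive node noise
    return x

x = rng.normal(0, 1, (1000, width))
clean = forward(x)                    # deterministic reference pass
noisy = forward(x, noise_std=0.05)    # same weights, noise at each layer
out_noise_var = ((noisy - clean) ** 2).mean()
```

Whether the per-layer noise accumulates, saturates, or is damped at the output depends on the weights and nonlinearity; for trained networks that dependence is precisely what the paper analyzes.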

Unity
