Search Results for author: Kohei Nakajima

Found 16 papers, 2 papers with code

Hierarchy of the echo state property in quantum reservoir computing

no code implementations • 5 Mar 2024 • Shumpei Kobayashi, Quoc Hoan Tran, Kohei Nakajima

The echo state property (ESP) is a fundamental concept in the reservoir computing (RC) framework: it guarantees output-only training of reservoir networks by making the reservoir state insensitive to its initial conditions and to far-past inputs.
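
To make the property concrete, here is a minimal classical sketch (not code from the paper; network size, spectral radius, and input statistics are arbitrary choices): two reservoir trajectories driven by the same input stream from different initial states converge.

```python
import numpy as np

# Echo state property in a classical ESN: two trajectories driven by the
# SAME input from DIFFERENT initial states converge when the reservoir is
# contractive. All parameters here are arbitrary illustrative choices.
rng = np.random.default_rng(0)
N = 100
W = rng.normal(size=(N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # rescale spectral radius to 0.9
w_in = rng.normal(size=N)

def step(x, u):
    return np.tanh(W @ x + w_in * u)       # standard ESN state update

x_a, x_b = rng.normal(size=N), rng.normal(size=N)
for t in range(201):
    u = rng.uniform(-1, 1)                 # one common input stream
    x_a, x_b = step(x_a, u), step(x_b, u)
    if t % 50 == 0:
        print(t, np.linalg.norm(x_a - x_b))
# The printed distance decays toward 0: the state forgets how it started,
# which is exactly what the (classical) ESP demands.
```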

Self-organized criticality for dendritic readiness potential

no code implementations • 19 Sep 2022 • Katsushi Kagaya, Tomoyuki Kubota, Kohei Nakajima

Self-organized criticality is a principle that explains avalanche-like phenomena obeying power laws in integrate-and-fire-type dynamical systems.

Relation
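
As a toy illustration of the avalanche statistics invoked in this abstract (not the dendritic model studied in the paper), a critical branching process already produces power-law avalanche sizes:

```python
import numpy as np

# Toy stand-in for self-organized criticality (NOT the dendritic model of
# the paper): a branching process with mean offspring 1 is critical, and
# its avalanche sizes have a power-law tail, P(size >= s) ~ s**(-1/2).
rng = np.random.default_rng(1)

def avalanche_size(p=0.5, cap=100_000):
    active, size = 1, 0
    while active and size < cap:
        size += active
        active = rng.binomial(2 * active, p)  # mean offspring = 2p = 1
    return size

sizes = np.array([avalanche_size() for _ in range(20_000)])
for s in (1, 10, 100, 1000):
    print(f"P(size >= {s:4d}) = {np.mean(sizes >= s):.4f}")
```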

Quantum-Classical Hybrid Information Processing via a Single Quantum System

no code implementations • 1 Sep 2022 • Quoc Hoan Tran, Sanjib Ghosh, Kohei Nakajima

Current technologies in quantum-based communications bring a new integration of quantum data with classical data for hybrid processing.

Quantum Machine Learning

Quantum Noise-Induced Reservoir Computing

no code implementations • 16 Jul 2022 • Tomoyuki Kubota, Yudai Suzuki, Shumpei Kobayashi, Quoc Hoan Tran, Naoki Yamamoto, Kohei Nakajima

We demonstrate this ability in several typical benchmarks and investigate the information processing capacity to clarify the framework's processing mechanism and memory profile.

Composite FORCE learning of chaotic echo state networks for time-series prediction

no code implementations • 6 Jul 2022 • Yansong Li, Kai Hu, Kohei Nakajima, Yongping Pan

An echo state network (ESN), a kind of recurrent neural network, consists of a fixed reservoir in which neurons are connected randomly and recurrently; the desired output is obtained by training only the output connection weights.

Time Series • Time Series Prediction
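
For readers new to ESNs, the sketch below shows the architecture just described; note that it trains the readout offline by ridge regression, whereas the paper's composite FORCE learning is an online scheme not reproduced here:

```python
import numpy as np

# Minimal ESN on a toy next-step prediction task. Caveat: the readout is
# trained offline with ridge regression; the paper's contribution, composite
# FORCE learning, is an online variant not reproduced in this sketch.
rng = np.random.default_rng(2)
N, T, washout = 200, 2000, 200
W = rng.normal(size=(N, N))
W *= 0.95 / max(abs(np.linalg.eigvals(W)))   # keep the reservoir in the echo-state regime
w_in = rng.normal(size=N)

u = np.sin(np.arange(T + 1) * 0.1)           # toy signal; task: predict u(t+1)
x, X = np.zeros(N), np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])         # fixed random recurrent dynamics
    X[t] = x

Xw, yw = X[washout:], u[washout + 1: T + 1]  # discard transient, align targets
w_out = np.linalg.solve(Xw.T @ Xw + 1e-6 * np.eye(N), Xw.T @ yw)
print("train MSE:", np.mean((Xw @ w_out - yw) ** 2))
```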

Transient Chaos in BERT

no code implementations • 6 Jun 2021 • Katsuma Inoue, Soh Ohara, Yasuo Kuniyoshi, Kohei Nakajima

A Lite BERT (ALBERT) is, as its name suggests, a lightweight version of BERT in which the parameter count is reduced by repeatedly applying the same neural network, the Transformer encoder layer.
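
The parameter-sharing idea is easy to sketch with toy dimensions (no attention internals; the stand-in layer below is illustrative only):

```python
import numpy as np

# Toy sketch of ALBERT's cross-layer parameter sharing. One set of weights
# is applied at every depth, so stacking 12 "layers" adds no parameters.
# The residual block below is a stand-in for a real Transformer encoder layer.
rng = np.random.default_rng(3)
d = 16
W1 = rng.normal(size=(d, d)) / np.sqrt(d)
W2 = rng.normal(size=(d, d)) / np.sqrt(d)

def shared_encoder_layer(h):
    # mixing + nonlinearity + residual connection
    return h + W2 @ np.tanh(W1 @ h)

h = rng.normal(size=d)
for _ in range(12):          # depth 12, but only one layer's parameters exist
    h = shared_encoder_layer(h)
print(h[:4])
```

Because one map is iterated at every depth, ALBERT's forward pass can be analyzed as a discrete-time dynamical system, which is the viewpoint behind the paper's title.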

Learning Temporal Quantum Tomography

no code implementations • 25 Mar 2021 • Quoc Hoan Tran, Kohei Nakajima

Quantifying and verifying the control level in preparing a quantum state are central challenges in building quantum devices.

Universal Approximation Property of Quantum Machine Learning Models in Quantum-Enhanced Feature Spaces

no code implementations • 1 Sep 2020 • Takahiro Goto, Quoc Hoan Tran, Kohei Nakajima

This feature map provides opportunities to incorporate quantum advantages into machine learning algorithms to be performed on near-term intermediate-scale quantum computers.

BIG-bench Machine Learning • General Classification • +1
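
As a sketch of how such a feature map induces a kernel, consider generic angle encoding on independent qubits (the paper's circuit may differ); the kernel is the squared overlap of two feature states:

```python
import numpy as np

# Generic quantum feature map sketch (angle encoding on independent qubits;
# not necessarily the circuit used in the paper): each coordinate of x
# prepares one qubit, and the kernel is the squared overlap of feature states.
def feature_state(x):
    # |phi(x)> = kron_i ( cos(x_i/2)|0> + sin(x_i/2)|1> )
    state = np.array([1.0])
    for xi in x:
        state = np.kron(state, np.array([np.cos(xi / 2), np.sin(xi / 2)]))
    return state

def quantum_kernel(x, y):
    return abs(feature_state(x) @ feature_state(y)) ** 2

x, y = np.array([0.3, 1.2]), np.array([0.5, 0.9])
print(quantum_kernel(x, y))  # analytically: prod_i cos((x_i - y_i) / 2)**2
```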

Higher-Order Quantum Reservoir Computing

1 code implementation • 16 Jun 2020 • Quoc Hoan Tran, Kohei Nakajima

Quantum reservoir computing (QRC) is an emerging paradigm for harnessing the natural dynamics of quantum systems as computational resources that can be used for temporal machine learning tasks.

BIG-bench Machine Learning
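
A generic qubit-network QRC loop can be sketched as follows (a plain toy, not the higher-order scheme introduced in the paper; the Hamiltonian, input-injection rule, and observables are common but arbitrary choices):

```python
import numpy as np
from scipy.linalg import expm

# Generic qubit-network QRC toy (NOT the paper's higher-order scheme):
# input enters by resetting qubit 0, the system evolves under a fixed
# random Hamiltonian, and Pauli-Z expectations serve as reservoir signals.
rng = np.random.default_rng(4)
n, dim = 3, 2 ** 3
H = rng.normal(size=(dim, dim))
H = (H + H.T) / 2                        # random Hermitian Hamiltonian
U = expm(-1j * H)                        # unitary for one input interval

def pauli_z(k):                          # Z on qubit k, identity elsewhere
    ops = [np.eye(2)] * n
    ops[k] = np.diag([1.0, -1.0])
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

Zs = [pauli_z(k) for k in range(n)]
rho = np.eye(dim) / dim                  # maximally mixed initial state

def qrc_step(rho, u):
    # reset qubit 0 to sqrt(1-u)|0> + sqrt(u)|1>; keep the rest by partial trace
    psi = np.array([np.sqrt(1 - u), np.sqrt(u)])
    rest = rho.reshape(2, dim // 2, 2, dim // 2).trace(axis1=0, axis2=2)
    rho = U @ np.kron(np.outer(psi, psi), rest) @ U.conj().T
    return rho, np.real([np.trace(Z @ rho) for Z in Zs])

for _ in range(5):
    rho, signals = qrc_step(rho, rng.uniform(0, 1))
    print(signals)                       # one row of reservoir observables per step
# Training a linear readout on these signals completes the temporal learner.
```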

Physical reservoir computing -- An introductory perspective

no code implementations • 3 May 2020 • Kohei Nakajima

Understanding the fundamental relationships between physics and its information-processing capability has been an active research topic for many years.

Edge-computing

Optimal short-term memory before the edge of chaos in driven random recurrent networks

no code implementations • 24 Dec 2019 • Taichi Haruna, Kohei Nakajima

The ability of discrete-time nonlinear recurrent neural networks to store time-varying small input signals is investigated by mean-field theory.
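
The standard short-term memory measure behind this line of work can be estimated numerically; the sketch below is a generic Jaeger-style memory capacity, not the paper's mean-field calculation:

```python
import numpy as np

# Generic short-term memory capacity of a driven random network: how well
# delayed copies of an i.i.d. input can be linearly read out from the state.
# All parameters are arbitrary illustrative choices.
rng = np.random.default_rng(5)
N, T, washout = 100, 5000, 500
W = rng.normal(size=(N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # below the edge of chaos
w_in = rng.normal(size=N) * 0.1            # small driving input, as in the paper

u = rng.uniform(-1, 1, size=T)
x, X = np.zeros(N), np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])
    X[t] = x

mc = 0.0
for k in range(1, 30):                     # memory function over delays k
    Xk, yk = X[washout:], u[washout - k: T - k]
    w = np.linalg.lstsq(Xk, yk, rcond=None)[0]
    mc += np.corrcoef(Xk @ w, yk)[0, 1] ** 2
print("short-term memory capacity ~", mc)
```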

A Unifying Framework for Information Processing in Stochastically Driven Dynamical Systems

no code implementations • 11 Jun 2019 • Tomoyuki Kubota, Hirokazu Takahashi, Kohei Nakajima

First, we establish a connection between the IPC for time-invariant systems and PC expansion, which is a type of polynomial expansion using orthogonal functions of input history as bases.
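
A numerical sketch of that construction, assuming uniform i.i.d. inputs so that Legendre polynomials form the orthogonal basis (an illustrative estimator, not the paper's full decomposition):

```python
import numpy as np
from numpy.polynomial import legendre

# Sketch of the IPC idea: capacities of a driven system with respect to
# orthogonal polynomials of input history. Legendre polynomials are the
# right basis for uniform i.i.d. inputs.
rng = np.random.default_rng(6)
N, T, washout = 50, 20000, 1000
W = rng.normal(size=(N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
w_in = rng.normal(size=N)

u = rng.uniform(-1, 1, size=T)
x, X = np.zeros(N), np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])
    X[t] = x
Xw = np.hstack([X[washout:], np.ones((T - washout, 1))])  # states + bias

def capacity(target):
    # C = 1 - min MSE / var(target), estimated by least squares on the states
    w = np.linalg.lstsq(Xw, target, rcond=None)[0]
    return 1 - np.mean((target - Xw @ w) ** 2) / np.var(target)

for deg in (1, 2):
    z = legendre.legval(u, [0] * deg + [1])        # P_deg applied elementwise
    for k in (1, 5):
        print(f"P_{deg}, delay {k}: C = {capacity(z[washout - k: T - k]):.3f}")
# Summing such capacities over a complete orthogonal basis gives the total IPC.
```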

Use of recurrent infomax to improve the memory capability of input-driven recurrent neural networks

no code implementations • 14 Feb 2018 • Hisashi Iwade, Kohei Nakajima, Takuma Tanaka, Toshio Aoyagi

The inherent transient dynamics of recurrent neural networks (RNNs) have been exploited as a computational resource in input-driven RNNs.
