no code implementations • 23 Jan 2023 • Yusuke Sakemi, Sou Nobukawa, Toshitaka Matsuki, Takashi Morie, Kazuyuki Aihara
In this paper, to improve the learning ability of RC, we propose self-modulated RC (SM-RC), which extends RC by adding a self-modulation mechanism.
no code implementations • 29 Nov 2022 • Kakei Yamamoto, Yusuke Sakemi, Kazuyuki Aihara
This behavior was not seen in conventional SNNs with single-spike restrictions under time-to-first-spike (TTFS) coding.
no code implementations • 22 Dec 2021 • Hiroshi Yamashita, Hideyuki Suzuki, Kazuyuki Aihara
Using the proposed entropic herding algorithm as a framework, we discuss a closer connection between herding and the maximum entropy principle.
no code implementations • 21 Aug 2021 • Gouhei Tanaka, Tadayoshi Matsumori, Hiroaki Yoshida, Kazuyuki Aihara
To develop an efficient machine learning method dedicated to modeling and prediction of multiscale dynamics, we propose a reservoir computing (RC) model with diverse timescales by using a recurrent network of heterogeneous leaky integrator (LI) neurons.
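A minimal NumPy sketch of the idea described above: an echo-state-style reservoir whose leaky integrator neurons each have their own leak rate, giving the network a spread of intrinsic timescales, with only a linear readout trained by ridge regression. The specific distribution of leak rates, network size, and scaling constants here are illustrative assumptions, not the paper's actual settings.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 100, 200

# Heterogeneous leak rates: log-spaced so each neuron has a distinct
# timescale (an assumed distribution, for illustration only).
leak = np.logspace(-2, 0, N)

W_in = rng.uniform(-0.1, 0.1, size=(N, 1))
W = rng.normal(0, 1.0 / np.sqrt(N), size=(N, N))
# Scale recurrent weights to spectral radius 0.9 for stable dynamics.
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

u = np.sin(0.1 * np.arange(T))[:, None]  # toy one-dimensional input
x = np.zeros(N)
states = np.empty((T, N))
for t in range(T):
    pre = W @ x + W_in @ u[t]
    # Leaky-integrator update: each neuron mixes its previous state
    # and the new nonlinear drive according to its own leak rate.
    x = (1.0 - leak) * x + leak * np.tanh(pre)
    states[t] = x

# Only the linear readout is trained (ridge regression), here to
# predict the next input value; the first 50 steps are a washout.
X, y = states[50:-1], u[51:, 0]
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ y)
pred = X @ W_out
mse = np.mean((pred - y) ** 2)
```

Because the recurrent weights stay fixed, training reduces to a single linear solve, which is what makes RC fast to fit compared with backpropagation through time.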
no code implementations • 18 Jun 2021 • Yusuke Sakemi, Takashi Morie, Takeo Hosomi, Kazuyuki Aihara
As SNNs are continuous-state and continuous-time models, it is favorable to implement them with analog VLSI circuits.
no code implementations • 3 Mar 2021 • Naoya Fujiwara, Tomokatsu Onaga, Takayuki Wada, Shouhei Takeuchi, Junji Seto, Tomoki Nakaya, Kazuyuki Aihara
This suggests the importance of both the strength and the timing of NPIs, although detailed studies are necessary before implementing NPIs in complicated real-world environments, as the model used in this study is based on various simplifications.
no code implementations • 11 Jun 2020 • Yusuke Sakemi, Kai Morino, Timothée Leleu, Kazuyuki Aihara
Reservoir computing (RC) is a machine learning algorithm that can learn complex time series from data very rapidly based on the use of high-dimensional dynamical systems, such as random networks of neurons, called "reservoirs."
no code implementations • 8 Jan 2020 • Yusuke Sakemi, Kai Morino, Takashi Morie, Kazuyuki Aihara
We also propose several techniques to improve the performance on a recognition task, and show that the classification accuracy of the proposed algorithm is as high as that of the state-of-the-art temporal coding SNN algorithms on the MNIST dataset.
no code implementations • 19 Aug 2019 • Hideyuki Miyahara, Kazuyuki Aihara, Wolfgang Lechner
Clustering algorithms are a cornerstone of machine learning applications.
no code implementations • 2 Jul 2019 • Shunya Okuno, Kazuyuki Aihara, Yoshito Hirata
We show that the framework is applicable to a wide range of data lengths and dimensions.
1 code implementation • NeurIPS 2019 • Takahiro Omi, Naonori Ueda, Kazuyuki Aihara
We herein propose a novel RNN-based model in which the time course of the intensity function is represented in a general manner.
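A minimal sketch of the general idea: an RNN encodes the event history into a hidden state, a small network of the elapsed time gives the cumulative hazard, and the intensity is obtained as its derivative, so no fixed parametric form is imposed on the intensity. The weights below are untrained and all architecture details (sizes, activations) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
H = 8

# RNN weights encoding the event history (untrained, for illustration).
W_h = rng.normal(0, 0.5, (H, H))
W_x = rng.normal(0, 0.5, (H, 1))
# Cumulative-hazard network Phi(tau, h): monotonically increasing in
# tau, enforced by nonnegative weights on tau and a softplus layer.
w_tau = np.abs(rng.normal(0, 0.5, (H,)))
V = rng.normal(0, 0.5, (H, H))
a = np.abs(rng.normal(0, 0.5, (H,)))

def softplus(z):
    return np.log1p(np.exp(z))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cumulative_hazard(tau, h):
    # Phi(tau, h) = sum_i a_i * softplus(w_tau_i * tau + (V h)_i)
    z = w_tau * tau + V @ h
    return a @ softplus(z)

def intensity(tau, h):
    # lambda(tau, h) = dPhi/dtau, computed analytically.
    z = w_tau * tau + V @ h
    return a @ (sigmoid(z) * w_tau)

# Run a toy event sequence; inter-event times are the RNN inputs.
h = np.zeros(H)
for dt in [0.5, 1.2, 0.3]:
    lam = intensity(dt, h)  # intensity just before this event
    h = np.tanh(W_h @ h + W_x @ np.array([dt]))
```

Modeling the cumulative hazard rather than the intensity itself keeps the log-likelihood exact, since the integral of the intensity over an inter-event interval is just a difference of the network's outputs.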
no code implementations • 19 Dec 2013 • Taichi Kiwaki, Takaki Makino, Kazuyuki Aihara
We pursue an early stopping technique that helps Gaussian Restricted Boltzmann Machines (GRBMs) to gain good natural image representations in terms of overcompleteness and data fitting.