Search Results for author: Kazuyuki Aihara

Found 14 papers, 1 paper with code

Trained Latent Space Navigation to Prevent Lack of Photorealism in Generated Images on Style-based Models

no code implementations · 2 Oct 2023 · Takumi Harada, Kazuyuki Aihara, Hiroyuki Sakai

Experimental results demonstrate that images generated within the local latent subspace maintain photorealism even when the latent codes are significantly and repeatedly manipulated.

Sparse-firing regularization methods for spiking neural networks with time-to-first spike coding

no code implementations · 24 Jul 2023 · Yusuke Sakemi, Kakei Yamamoto, Takeo Hosomi, Kazuyuki Aihara

The training of multilayer spiking neural networks (SNNs) using the error backpropagation algorithm has made significant progress in recent years.

Learning Reservoir Dynamics with Temporal Self-Modulation

no code implementations · 23 Jan 2023 · Yusuke Sakemi, Sou Nobukawa, Toshitaka Matsuki, Takashi Morie, Kazuyuki Aihara

In this paper, to improve the learning ability of RC, we propose self-modulated RC (SM-RC), which extends RC by adding a self-modulation mechanism.

Time Series · Time Series Analysis

Timing-Based Backpropagation in Spiking Neural Networks Without Single-Spike Restrictions

no code implementations · 29 Nov 2022 · Kakei Yamamoto, Yusuke Sakemi, Kazuyuki Aihara

This behavior was not observed in conventional SNNs with single-spike restrictions under time-to-first-spike (TTFS) coding.

Entropic Herding

no code implementations · 22 Dec 2021 · Hiroshi Yamashita, Hideyuki Suzuki, Kazuyuki Aihara

Using the proposed entropic herding algorithm as a framework, we discuss a closer connection between herding and the maximum entropy principle.

Reservoir Computing with Diverse Timescales for Prediction of Multiscale Dynamics

no code implementations · 21 Aug 2021 · Gouhei Tanaka, Tadayoshi Matsumori, Hiroaki Yoshida, Kazuyuki Aihara

To develop an efficient machine learning method dedicated to modeling and prediction of multiscale dynamics, we propose a reservoir computing (RC) model with diverse timescales by using a recurrent network of heterogeneous leaky integrator (LI) neurons.

BIG-bench Machine Learning · Time Series +1
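The heterogeneous leaky-integrator reservoir described in the snippet above can be sketched as a recurrent network in which each neuron has its own leak rate, so different neurons integrate input on different timescales. All parameter choices below (reservoir size, leak-rate range, spectral radius, ridge strength) are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

N, T = 100, 200                       # reservoir size, sequence length
# Heterogeneous leak rates: small a_i -> slow neuron, large a_i -> fast.
a = rng.uniform(0.05, 1.0, size=N)
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # scale spectral radius
W_in = rng.uniform(-1, 1, (N, 1))

u = np.sin(np.linspace(0, 8 * np.pi, T)).reshape(T, 1)   # toy input
x = np.zeros(N)
states = np.empty((T, N))
for t in range(T):
    # Leaky-integrator update mixing the old state with the driven update.
    x = (1 - a) * x + a * np.tanh(W @ x + W_in @ u[t])
    states[t] = x

# Linear readout trained by ridge regression (standard RC practice).
target = np.roll(u, -1, axis=0)        # one-step-ahead prediction target
ridge = 1e-6
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(N),
                        states.T @ target)
pred = states @ W_out
```

Only the readout `W_out` is trained; the recurrent weights stay fixed, which is what makes RC training fast.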

Effects of VLSI Circuit Constraints on Temporal-Coding Multilayer Spiking Neural Networks

no code implementations · 18 Jun 2021 · Yusuke Sakemi, Takashi Morie, Takeo Hosomi, Kazuyuki Aihara

As SNNs are continuous-state and continuous-time models, it is favorable to implement them with analog VLSI circuits.

Quantization

Analytical estimation of maximum fraction of infected individuals with one-shot non-pharmaceutical intervention in a hybrid epidemic model

no code implementations · 3 Mar 2021 · Naoya Fujiwara, Tomokatsu Onaga, Takayuki Wada, Shouhei Takeuchi, Junji Seto, Tomoki Nakaya, Kazuyuki Aihara

This suggests the importance of the strength and timing of NPIs, although, because the model used in this study relies on various simplifications, detailed studies are necessary before NPIs can be implemented in complicated real-world environments.

Model-Size Reduction for Reservoir Computing by Concatenating Internal States Through Time

no code implementations · 11 Jun 2020 · Yusuke Sakemi, Kai Morino, Timothée Leleu, Kazuyuki Aihara

Reservoir computing (RC) is a machine learning algorithm that can learn complex time series from data very rapidly based on the use of high-dimensional dynamical systems, such as random networks of neurons, called "reservoirs."

Edge-computing · Time Series +1
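The model-size reduction named in the title, concatenating internal states through time, can be illustrated by feeding delayed copies of a small reservoir's state into the linear readout, so an N-neuron reservoir yields k*N readout features. This is a minimal sketch of the general idea under assumed parameters; the paper's exact construction may differ.

```python
import numpy as np

rng = np.random.default_rng(1)

N, T, k = 20, 300, 5    # small reservoir, sequence length, concatenation depth
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # scale spectral radius
W_in = rng.uniform(-1, 1, (N, 1))

u = np.sin(np.linspace(0, 12 * np.pi, T)).reshape(T, 1)
x = np.zeros(N)
states = np.empty((T, N))
for t in range(T):
    x = np.tanh(W @ x + W_in @ u[t])
    states[t] = x

# Readout features: the current state concatenated with its k-1 most
# recent past states; the first k-1 rows (wrapped by roll) are dropped.
feats = np.hstack([np.roll(states, d, axis=0) for d in range(k)])[k - 1:]
target = np.roll(u, -1, axis=0)[k - 1:]   # one-step-ahead prediction

ridge = 1e-6
W_out = np.linalg.solve(feats.T @ feats + ridge * np.eye(k * N),
                        feats.T @ target)
pred = feats @ W_out
```

The readout sees a k*N-dimensional feature vector at each step, trading a little memory for a much smaller recurrent network.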

A Supervised Learning Algorithm for Multilayer Spiking Neural Networks Based on Temporal Coding Toward Energy-Efficient VLSI Processor Design

no code implementations · 8 Jan 2020 · Yusuke Sakemi, Kai Morino, Takashi Morie, Kazuyuki Aihara

We also propose several techniques to improve the performance on a recognition task, and show that the classification accuracy of the proposed algorithm is as high as that of the state-of-the-art temporal coding SNN algorithms on the MNIST dataset.
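In temporal coding, each neuron conveys information through the timing of its first spike, and for simple neuron models that spike time can be computed in closed form. The sketch below uses a linearly ramping integrate-and-fire neuron, a common time-to-first-spike formulation; the neuron model in the paper above may differ in detail.

```python
import numpy as np

def ttfs_spike_time(in_times, weights, theta=1.0):
    """First spike time of an integrate-and-fire neuron whose membrane
    potential ramps linearly after each input spike:
        v(t) = sum_j w_j * max(0, t - t_j).
    The neuron spikes when v(t) first reaches theta."""
    order = np.argsort(in_times)
    t_sorted = np.asarray(in_times, dtype=float)[order]
    w_sorted = np.asarray(weights, dtype=float)[order]
    w_sum, wt_sum = 0.0, 0.0
    for k, (tj, wj) in enumerate(zip(t_sorted, w_sorted)):
        w_sum += wj
        wt_sum += wj * tj
        if w_sum <= 0:                       # slope not positive yet
            continue
        t_cand = (theta + wt_sum) / w_sum    # solve v(t) = theta
        nxt = t_sorted[k + 1] if k + 1 < len(t_sorted) else np.inf
        if t_cand <= nxt:                    # crossing happens before next input
            return t_cand
    return np.inf                            # threshold is never reached

# Example: stronger inputs pull the first spike earlier.
t1 = ttfs_spike_time([0.0, 1.0], [1.0, 1.0])   # weaker drive
t2 = ttfs_spike_time([0.0, 1.0], [2.0, 2.0])   # stronger drive -> earlier spike
```

Because the spike time is a piecewise-closed-form function of the weights and input times, gradients can flow through it, which is the basis of timing-based backpropagation.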

Fully Neural Network based Model for General Temporal Point Processes

1 code implementation · NeurIPS 2019 · Takahiro Omi, Naonori Ueda, Kazuyuki Aihara

We herein propose a novel RNN based model in which the time course of the intensity function is represented in a general manner.

Point Processes · Time Series +1
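One way to represent the intensity's time course in a general manner is to let a recurrent state summarize the event history while a feed-forward network, monotone in the elapsed time, outputs the cumulative hazard; the intensity is then its derivative, which is non-negative by construction. The sketch below is an untrained toy version of that idea: all sizes, the random parameters, and the finite-difference derivative are illustrative assumptions, not the published model.

```python
import numpy as np

rng = np.random.default_rng(2)

H, M = 8, 16          # RNN hidden size, hazard-net hidden size (assumed)
W_h = rng.normal(0, 0.5, (H, H))
W_x = rng.normal(0, 0.5, (H, 1))
V_t = np.abs(rng.normal(0, 0.5, (M, 1)))   # non-negative: monotone in tau
V_h = rng.normal(0, 0.5, (M, H))
w_out = np.abs(rng.normal(0, 0.5, (M,)))   # non-negative output weights

def softplus(z):
    return np.logaddexp(0.0, z)

def cum_hazard(tau, h):
    """Cumulative hazard Phi(tau | history); non-decreasing in tau because
    every weight on the tau path is non-negative."""
    z = softplus(V_t @ np.array([tau]) + V_h @ h)
    return float(w_out @ z)

# Evaluate the log-likelihood of a toy sequence of inter-event intervals.
intervals = [0.5, 1.2, 0.3, 0.8]
h = np.zeros(H)
loglik = 0.0
for tau in intervals:
    phi = cum_hazard(tau, h)
    eps = 1e-5
    lam = (cum_hazard(tau + eps, h) - phi) / eps   # intensity ~ dPhi/dtau
    loglik += np.log(lam + 1e-12) - phi            # event log-likelihood term
    h = np.tanh(W_h @ h + W_x @ np.array([tau]))   # update history state
```

Modeling the cumulative hazard rather than the intensity itself makes the likelihood's integral term exact, with no numerical quadrature needed.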

Approximated Infomax Early Stopping: Revisiting Gaussian RBMs on Natural Images

no code implementations · 19 Dec 2013 · Taichi Kiwaki, Takaki Makino, Kazuyuki Aihara

We pursue an early stopping technique that helps Gaussian Restricted Boltzmann Machines (GRBMs) to gain good natural image representations in terms of overcompleteness and data fitting.

Attribute
