no code implementations • 22 Mar 2024 • Yoshihide Sawada, Ryuji Saiin, Kazuma Suetake
Recently, the number of parameters in DNNs has increased explosively, as exemplified by large language models (LLMs), making inference on small-scale computers increasingly difficult.
no code implementations • 2 Dec 2023 • Takashi Furuya, Satoshi Okuda, Kazuma Suetake, Yoshihide Sawada
This instability arises from the difficulty of minimax optimization, and various approaches have been proposed in GANs and UDAs to overcome it.
no code implementations • 3 Feb 2023 • Kazuma Suetake, Takuya Ushimaru, Ryuji Saiin, Yoshihide Sawada
Spiking neural networks (SNNs) are energy-efficient neural networks because of their spiking nature.
no code implementations • 26 Feb 2022 • Takashi Furuya, Hiroyuki Kusumoto, Koichi Taniguchi, Naoya Kanno, Kazuma Suetake
Notably, Gal and Ghahramani [2016] proposed an approximate entropy, defined as the sum of the entropies of unimodal Gaussian distributions.
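The construction mentioned above can be sketched as follows. This is a minimal illustration, not the exact estimator from Gal and Ghahramani [2016]: it assumes each unimodal component is a Gaussian N(mu_i, sigma_i^2), whose differential entropy has the closed form 0.5 * ln(2 * pi * e * sigma^2), and simply sums these component entropies.

```python
import numpy as np

def gaussian_entropy(sigma2):
    """Differential entropy of a unimodal Gaussian N(mu, sigma^2):
    H = 0.5 * ln(2 * pi * e * sigma^2). Independent of the mean."""
    return 0.5 * np.log(2.0 * np.pi * np.e * sigma2)

def approximate_entropy(variances):
    """Approximate entropy as the sum of the entropies of unimodal
    Gaussian components (a sketch of the idea in the text; the exact
    formulation in Gal and Ghahramani [2016] may differ)."""
    return float(sum(gaussian_entropy(v) for v in variances))

# Example: three Gaussian components with different variances.
h = approximate_entropy([0.5, 1.0, 2.0])
```

Note that the sum of component entropies is an approximation: it ignores the overlap between components, so it can differ from the true entropy of the mixture.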
no code implementations • 26 Jan 2022 • Kazuma Suetake, Shin-ichi Ikegawa, Ryuji Saiin, Yoshihide Sawada
To solve these problems, we propose a single-step spiking neural network (S$^3$NN), an energy-efficient neural network with low computational cost and high precision.
1 code implementation • 23 May 2021 • Takashi Furuya, Kazuma Suetake, Koichi Taniguchi, Hiroyuki Kusumoto, Ryuji Saiin, Tomohiro Daimon
Recurrent neural networks (RNNs) are a class of neural networks used for sequential tasks.