Search Results for author: Kazuma Suetake

Found 6 papers, 1 paper with code

Magic for the Age of Quantized DNNs

no code implementations • 22 Mar 2024 • Yoshihide Sawada, Ryuji Saiin, Kazuma Suetake

Recently, the number of parameters in DNNs has explosively increased, as exemplified by LLMs (Large Language Models), making inference on small-scale computers more difficult.
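The quantization referred to in the title can be illustrated generically. This is a minimal sketch of uniform symmetric weight quantization, a standard technique for shrinking DNN weights to low-bit integers; it is not the method proposed in the paper, and the function names are hypothetical:

```python
def quantize(weights, bits=8):
    """Uniformly quantize a list of float weights to signed `bits`-bit integers.

    Returns the integer codes and the scale needed to dequantize.
    """
    qmax = 2 ** (bits - 1) - 1                # e.g. 127 for 8-bit
    scale = max(abs(w) for w in weights) / qmax  # largest weight maps to qmax
    q = [max(-qmax - 1, min(qmax, round(w / scale))) for w in weights]
    return q, scale

def dequantize(codes, scale):
    """Recover approximate float weights from integer codes."""
    return [c * scale for c in codes]

# Example: quantize three weights to 8 bits and reconstruct them.
q, s = quantize([0.5, -1.0, 0.25])
recon = dequantize(q, s)  # each entry is within scale/2 of the original
```

Storing the integer codes plus one scale per tensor is what reduces memory and makes inference on small-scale computers feasible; the rounding error per weight is bounded by half the scale.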

Model Compression • Quantization

Convergences for Minimax Optimization Problems over Infinite-Dimensional Spaces Towards Stability in Adversarial Training

no code implementations • 2 Dec 2023 • Takashi Furuya, Satoshi Okuda, Kazuma Suetake, Yoshihide Sawada

This instability arises from the difficulty of minimax optimization, and various approaches have been proposed in GANs and UDAs to overcome it.
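As a reference point for the minimax structure mentioned here, the standard GAN objective (Goodfellow et al., 2014) has the following form; this is the textbook finite-dimensional version, not this paper's infinite-dimensional formulation:

```latex
\[
  \min_{G}\,\max_{D}\;
    \mathbb{E}_{x \sim p_{\mathrm{data}}}\!\left[\log D(x)\right]
    + \mathbb{E}_{z \sim p_{z}}\!\left[\log\bigl(1 - D(G(z))\bigr)\right]
\]
```

Alternating gradient updates on $G$ and $D$ for such objectives are well known to oscillate or diverge, which is the training instability the abstract refers to.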

Theoretical Error Analysis of Entropy Approximation for Gaussian Mixture

no code implementations • 26 Feb 2022 • Takashi Furuya, Hiroyuki Kusumoto, Koichi Taniguchi, Naoya Kanno, Kazuma Suetake

Notably, Gal and Ghahramani [2016] proposed an approximate entropy given by the sum of the entropies of the unimodal Gaussian components.
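For context, the closed-form entropy of a single Gaussian component and the sum-of-component-entropies approximation the abstract describes can be sketched as follows. The notation, and in particular whether the mixture weights $\pi_i$ enter the sum exactly this way, are assumptions here, not taken from the paper:

```latex
% Gaussian mixture: p(x) = \sum_i \pi_i N(x; \mu_i, \Sigma_i),  x in R^d.
% Each unimodal Gaussian component has a closed-form entropy:
\[
  H\!\bigl(\mathcal{N}(\mu_i, \Sigma_i)\bigr)
    = \tfrac{1}{2}\,\ln\!\bigl((2\pi e)^{d}\,\lvert \Sigma_i \rvert\bigr)
\]
% The mixture entropy H(p) has no closed form; the approximation replaces it
% with a weighted sum of the component entropies:
\[
  H(p) \;\approx\; \sum_i \pi_i\, H\!\bigl(\mathcal{N}(\mu_i, \Sigma_i)\bigr)
\]
```

The paper's contribution, per its title, is a theoretical analysis of the error incurred by this kind of approximation.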

Variational Inference

S$^3$NN: Time Step Reduction of Spiking Surrogate Gradients for Training Energy Efficient Single-Step Spiking Neural Networks

no code implementations • 26 Jan 2022 • Kazuma Suetake, Shin-ichi Ikegawa, Ryuji Saiin, Yoshihide Sawada

To solve these problems, we propose a single-step spiking neural network (S$^3$NN), an energy-efficient neural network with low computational cost and high precision.

Efficient Neural Network • Time Series • +1

Spectral Pruning for Recurrent Neural Networks

1 code implementation • 23 May 2021 • Takashi Furuya, Kazuma Suetake, Koichi Taniguchi, Hiroyuki Kusumoto, Ryuji Saiin, Tomohiro Daimon

Recurrent neural networks (RNNs) are a class of neural networks used in sequential tasks.
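As background on the class of models being pruned, the standard vanilla-RNN recurrence is shown below; this is textbook notation, not notation from the paper:

```latex
\[
  h_t = \sigma\!\left(W_h\, h_{t-1} + W_x\, x_t + b\right),
  \qquad
  y_t = W_y\, h_t,
  \qquad t = 1, \dots, T
\]
```

Pruning an RNN amounts to sparsifying the weight matrices $W_h$, $W_x$, $W_y$, which shrinks the model for deployment settings such as the edge computing tagged below.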

Edge-computing
