no code implementations • 23 Jul 2024 • Hiroyuki Tokunaga, Joel Nicholls, Daria Vazhenina, Atsunori Kanemura
By quantizing network weights and activations to a low bitwidth, we can obtain hardware-friendly and energy-efficient networks.
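As a hedged illustration only (not the paper's specific scheme), the sketch below shows symmetric uniform quantization of a tensor; the 4-bit setting and per-tensor scale are assumptions for demonstration.

```python
import numpy as np

def quantize_uniform(x, num_bits=4):
    """Symmetric per-tensor uniform quantization, then dequantization."""
    qmax = 2 ** (num_bits - 1) - 1                # e.g. 7 for signed 4-bit values
    scale = np.abs(x).max() / qmax + 1e-12        # map the largest magnitude to qmax
    q = np.clip(np.round(x / scale), -qmax, qmax)
    return q * scale                              # "fake-quantized" float values

weights = np.random.randn(64, 64).astype(np.float32)
print(np.abs(weights - quantize_uniform(weights, num_bits=4)).mean())
```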
1 code implementation • 9 Apr 2021 • Ryuji Imamura, Kohei Azuma, Atsushi Hanamoto, Atsunori Kanemura
The proposed method, multi-layer feature sparse coding (MLF-SC), employs a neural network for feature extraction and feeds feature maps from intermediate layers of the network to sparse coding, whereas the standard sparse-coding-based anomaly detection method works directly on the given images.
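A minimal sketch of this kind of pipeline is given below: intermediate feature maps are flattened into per-location vectors and sparse-coded, with reconstruction error as the anomaly score. The tiny random feature extractor, dictionary size, and sparsity level are stand-in assumptions, not the paper's configuration (which would use a pretrained backbone).

```python
import torch
import torch.nn as nn
from sklearn.decomposition import MiniBatchDictionaryLearning

# Stand-in feature extractor; MLF-SC would use a pretrained network instead.
features = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
)

def feature_vectors(images):
    """Map images to per-location feature vectors from an intermediate layer."""
    with torch.no_grad():
        fmap = features(images)                          # (N, C, H, W)
    return fmap.permute(0, 2, 3, 1).reshape(-1, fmap.shape[1]).numpy()

train = torch.randn(8, 3, 64, 64)                        # placeholder normal images
dico = MiniBatchDictionaryLearning(n_components=64, transform_algorithm="omp",
                                   transform_n_nonzero_coefs=5)
dico.fit(feature_vectors(train))

test = torch.randn(1, 3, 64, 64)
v = feature_vectors(test)
codes = dico.transform(v)
recon_error = ((v - codes @ dico.components_) ** 2).sum(axis=1).mean()
print(recon_error)                                       # higher suggests an anomaly
```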
1 code implementation • 18 Feb 2018 • Kento Nozawa, Masanari Kimura, Atsunori Kanemura
Embedding graph nodes into a vector space can allow the use of machine learning to, e.g., predict node classes, but the study of node embedding algorithms is immature compared with the natural language processing field because of the diverse nature of graphs.
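For illustration, here is a minimal DeepWalk-style sketch (random walks fed to a skip-gram model), one common family of node-embedding algorithms; the graph, walk length, and model parameters are arbitrary choices, not taken from the paper.

```python
import random
import networkx as nx
from gensim.models import Word2Vec

G = nx.karate_club_graph()                       # small example graph

def random_walk(g, start, length=10):
    """Uniform random walk; nodes are stringified since Word2Vec expects tokens."""
    walk = [start]
    for _ in range(length - 1):
        walk.append(random.choice(list(g.neighbors(walk[-1]))))
    return [str(n) for n in walk]

walks = [random_walk(G, n) for n in G.nodes() for _ in range(20)]
model = Word2Vec(walks, vector_size=32, window=5, min_count=0, sg=1, epochs=5)
print(model.wv[str(0)][:5])                      # embedding vector for node 0
```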
no code implementations • ICLR 2018 • Sotetsu Koyamada, Yuta Kikuchi, Atsunori Kanemura, Shin-ichi Maeda, Shin Ishii
Neural sequence generation is commonly approached by using maximum-likelihood (ML) estimation or reinforcement learning (RL).
1 code implementation • 30 Jun 2017 • Sotetsu Koyamada, Yuta Kikuchi, Atsunori Kanemura, Shin-ichi Maeda, Shin Ishii
We propose a new neural sequence model training method in which the objective function is defined by $\alpha$-divergence.
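To illustrate the connection, the sketch below computes the Amari $\alpha$-divergence between two discrete distributions, which recovers KL(p||q) as $\alpha \to 1$ and KL(q||p) as $\alpha \to 0$; this shows the divergence itself, not the paper's full sequence-training procedure, and the parameterization conventions are an assumption.

```python
import numpy as np

def alpha_divergence(p, q, alpha, eps=1e-12):
    """Amari alpha-divergence between discrete distributions p and q.
    Tends to KL(p||q) as alpha -> 1 and to KL(q||p) as alpha -> 0."""
    p, q = np.asarray(p) + eps, np.asarray(q) + eps
    if np.isclose(alpha, 1.0):
        return np.sum(p * np.log(p / q))         # KL(p||q): the ML-style limit
    if np.isclose(alpha, 0.0):
        return np.sum(q * np.log(q / p))         # KL(q||p): the RL-style limit
    return (1.0 - np.sum(p**alpha * q**(1.0 - alpha))) / (alpha * (1.0 - alpha))

p = np.array([0.7, 0.2, 0.1])                    # e.g. a target distribution
q = np.array([0.5, 0.3, 0.2])                    # e.g. a model distribution
for a in (0.0, 0.5, 1.0):
    print(a, alpha_divergence(p, q, a))
```

Varying $\alpha$ thus interpolates between the two KL directions, which is what lets a single objective family cover both ML-like and RL-like training regimes.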