no code implementations • 15 Apr 2022 • Changhun Lee, HyungJun Kim, Eunhyeok Park, Jae-Joon Kim
We argue that batch-level statistics fail to capture the information crucial to each input instance in BNN computations, and that instance-to-instance differences in these statistics should be taken into account when determining each instance's binary activation threshold.
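A minimal sketch of the idea, assuming a per-instance threshold computed from each input's own statistics (the mean-plus-scaled-std rule below is hypothetical, not the paper's formula):

```python
import numpy as np

def binarize_instancewise(x, alpha=0.0):
    """Binarize activations with a per-instance threshold.

    Instead of one batch-level threshold, each instance uses a
    threshold derived from its own statistics (here mean + alpha*std,
    an illustrative choice only).
    """
    mean = x.mean(axis=1, keepdims=True)   # per-instance mean
    std = x.std(axis=1, keepdims=True)     # per-instance spread
    threshold = mean + alpha * std         # instance-specific threshold
    return np.where(x >= threshold, 1.0, -1.0)

x = np.random.randn(4, 8)                  # (batch, features) pre-activations
print(binarize_instancewise(x))
```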
no code implementations • CVPR 2021 • HyungJun Kim, Jihoon Park, Changhun Lee, Jae-Joon Kim
We also show that adjusting the threshold values of binary activation functions results in an unbalanced distribution of binary activations, which improves the accuracy of BNN models.
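A small sketch of how shifting the threshold skews the +1/-1 balance of a binary activation (the shift value is arbitrary):

```python
import numpy as np

def binary_activation(x, threshold=0.0):
    # Moving the threshold away from 0 skews the +1/-1 ratio.
    return np.where(x >= threshold, 1.0, -1.0)

x = np.random.randn(10000)
for t in (0.0, 0.5):  # 0.5 is an arbitrary shift for illustration
    b = binary_activation(x, t)
    print(f"threshold={t}: fraction of +1 = {(b == 1).mean():.2f}")
```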
no code implementations • 22 Sep 2020 • Hwaran Lee, Seokhwan Jo, HyungJun Kim, SangKeun Jung, Tae-Yoon Kim
To the best of our knowledge, our work is the first comprehensive study of a modularized E2E multi-domain dialog system, covering learning from each individual component up to the entire dialog policy for task success.
no code implementations • 8 Sep 2020 • Eunho Koo, HyungJun Kim
In this study, we considered the distribution error, i.e., the inconsistency between the two distributions (of the predicted values and the labels), as the prediction error, and proposed weighted empirical stretching (WES) as a novel loss function to increase the overlap of the two distributions.
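Since the snippet does not define WES, the following is a hypothetical sketch of a weighted loss in the same spirit, not the paper's formula: errors in rare label-histogram bins are up-weighted so the predicted distribution is stretched toward the label distribution.

```python
import numpy as np

def weighted_stretching_loss(pred, label, bins=20, eps=1e-6):
    """Hypothetical sketch: weight squared errors by the inverse
    frequency of each label's histogram bin, so rare (tail) labels
    contribute more. Not the paper's exact WES definition.
    """
    hist, edges = np.histogram(label, bins=bins)
    idx = np.clip(np.digitize(label, edges[1:-1]), 0, bins - 1)
    weights = 1.0 / (hist[idx] + eps)   # rare bins get large weight
    weights /= weights.mean()           # normalize mean weight to 1
    return np.mean(weights * (pred - label) ** 2)

label = np.random.exponential(size=1000)        # long-tailed labels
pred = label + 0.1 * np.random.randn(1000)
print(weighted_stretching_loss(pred, label))
```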
1 code implementation • ICLR 2020 • Hyungjun Kim, Kyung-Su Kim, Jinseok Kim, Jae-Joon Kim
Binary Neural Networks (BNNs) have been garnering interest thanks to their compute cost reduction and memory savings.
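As a concrete illustration of where those savings come from (a standard BNN trick, not code from the paper): a dot product between two ±1 vectors reduces to XNOR plus popcount on bit-packed operands.

```python
import numpy as np

def binary_dot(a_bits, b_bits, n):
    """Dot product of two +/-1 vectors encoded as bit-packed ints
    (bit 1 = +1, bit 0 = -1): XNOR counts matches, so
    dot = matches - mismatches = 2*popcount(XNOR) - n.
    """
    xnor = ~(a_bits ^ b_bits) & ((1 << n) - 1)
    return 2 * bin(xnor).count("1") - n

a = np.random.choice([-1, 1], size=16)
b = np.random.choice([-1, 1], size=16)
pack = lambda v: int("".join("1" if x > 0 else "0" for x in v), 2)
print(binary_dot(pack(a), pack(b), 16), "==", int(a @ b))
```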
no code implementations • 24 Jul 2019 • Hyungjun Kim, Malte Rasch, Tayfun Gokmen, Takashi Ando, Hiroyuki Miyazoe, Jae-Joon Kim, John Rozen, Seyoung Kim
By using this zero-shifting method, we show that network performance dramatically improves for imbalanced synapse devices.
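The snippet does not spell out the procedure; the toy model below (all parameters illustrative) shows the underlying idea: an imbalanced analog device drifts to a nonzero symmetry point under alternating updates, and "zero-shifting" re-references the weight to that point.

```python
def device_update(g, direction, a_up=0.02, a_down=0.04, gmin=0.0, gmax=1.0):
    # Soft-bounds model of an imbalanced analog synapse: step size
    # shrinks near the rails, and potentiation != depression.
    if direction > 0:
        return g + a_up * (gmax - g)
    return g - a_down * (g - gmin)

# Alternating +/- updates drift to the device's symmetry point,
# roughly a_up / (a_up + a_down) for small steps.
g = 0.9
for _ in range(500):
    g = device_update(g, +1)
    g = device_update(g, -1)
g_ref = g

# Zero-shifting (sketch): encode the weight relative to the symmetry
# point, so the device's stable state represents weight = 0.
print(f"symmetry point ~ {g_ref:.3f}, weight(g) = g - {g_ref:.3f}")
```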
no code implementations • 23 Mar 2019 • Hyungjun Kim, Yulhwa Kim, Sungju Ryu, Jae-Joon Kim
We demonstrate that the BitSplit version of LeNet-5, VGG-9, AlexNet, and ResNet-18 can be trained to have similar classification accuracy at a lower computational cost compared to conventional multi-bit networks with low bit precision (<= 4-bit).
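A minimal sketch of the bit-level decomposition idea (not the paper's BitSplit training procedure; shapes and names are hypothetical): a k-bit matmul becomes k binary matmuls with power-of-two scales.

```python
import numpy as np

def bitsplit_matmul(x, bit_planes):
    # x @ W where W is stored as binary bit planes: each plane is a
    # 0/1 matrix, scaled by its power of two and summed.
    return sum((x @ plane) * (2 ** i) for i, plane in enumerate(bit_planes))

W = np.random.randint(0, 16, size=(8, 4))                # 4-bit weights
planes = [((W >> i) & 1).astype(float) for i in range(4)]
x = np.random.randn(2, 8)
print(np.allclose(bitsplit_matmul(x, planes), x @ W))    # True
```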
1 code implementation • 6 Nov 2018 • Yulhwa Kim, HyungJun Kim, Jae-Joon Kim
Recently, RRAM-based Binary Neural Network (BNN) hardware has been gaining interest, as it requires only a 1-bit sense amplifier and eliminates the need for high-resolution ADCs and DACs.
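A minimal sketch of the 1-bit sensing idea, assuming a simple thresholded-bitline model (threshold and sizes are illustrative, not from the paper):

```python
import numpy as np

def rram_bnn_column(x_bin, w_bin, threshold):
    """An RRAM column accumulates the binary partial sum in analog on
    the bitline; a 1-bit sense amp only checks whether it crosses a
    threshold, so no multi-bit ADC is needed.
    """
    partial_sum = np.dot(x_bin, w_bin)   # analog accumulation on the bitline
    return 1.0 if partial_sum >= threshold else -1.0

x = np.random.choice([-1.0, 1.0], size=64)   # binary inputs
w = np.random.choice([-1.0, 1.0], size=64)   # binary weights in RRAM cells
print(rram_bnn_column(x, w, threshold=0.0))
```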
no code implementations • 30 Mar 2017 • Hyungjun Kim, Taesu Kim, Jinseok Kim, Jae-Joon Kim
Artificial Neural Network computation relies on intensive vector-matrix multiplications.
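For context, a single fully connected layer is one such vector-matrix multiplication, which is the operation analog crossbar hardware aims to perform in one step (sizes below are illustrative):

```python
import numpy as np

W = np.random.randn(128, 784)   # layer weights
x = np.random.randn(784)        # input activations
y = np.maximum(W @ x, 0.0)      # ReLU(Wx): one VMM per layer
print(y.shape)
```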