Search Results for author: Hojung Lee

Found 4 papers, 2 papers with code

Students are the Best Teacher: Exit-Ensemble Distillation with Multi-Exits

1 code implementation · 1 Apr 2021 · Hojung Lee, Jong-Seok Lee

This paper proposes exit-ensemble distillation, a novel knowledge distillation-based learning method that improves the classification performance of convolutional neural networks (CNNs) without a pre-trained teacher network; a minimal sketch of the idea appears below the task tags.

Tasks: Classification, General Classification, +1
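The abstract above describes a multi-exit network whose exits jointly act as their own teacher. The following is a rough illustration only: it assumes a PyTorch-style setup, and the model (MultiExitNet), loss weighting (alpha), and temperature (T) are illustrative placeholders rather than the paper's exact formulation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiExitNet(nn.Module):
    # Toy CNN with two early exits and a final exit (illustrative sizes).
    def __init__(self, num_classes=10):
        super().__init__()
        self.block1 = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU())
        self.block2 = nn.Sequential(nn.Conv2d(16, 32, 3, padding=1), nn.ReLU())
        self.block3 = nn.Sequential(nn.Conv2d(32, 64, 3, padding=1), nn.ReLU())
        self.exits = nn.ModuleList(
            [nn.Linear(c, num_classes) for c in (16, 32, 64)])

    def forward(self, x):
        feats, out = [], x
        for block in (self.block1, self.block2, self.block3):
            out = block(out)
            feats.append(F.adaptive_avg_pool2d(out, 1).flatten(1))
        return [exit_(f) for exit_, f in zip(self.exits, feats)]

def exit_ensemble_loss(logits_list, targets, T=3.0, alpha=0.5):
    # Every exit gets a supervised loss, plus a distillation loss toward
    # the (detached) ensemble of all exits, which plays the teacher role.
    ensemble = torch.stack(logits_list).mean(0).detach()
    ce = sum(F.cross_entropy(z, targets) for z in logits_list)
    kd = sum(F.kl_div(F.log_softmax(z / T, dim=1),
                      F.softmax(ensemble / T, dim=1),
                      reduction="batchmean") * T * T
             for z in logits_list)
    return (1 - alpha) * ce + alpha * kd

model = MultiExitNet()
x, y = torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,))
loss = exit_ensemble_loss(model(x), y)
loss.backward()

The key design point is that no separate pre-trained teacher is needed: the ensemble of exit outputs is detached and reused as the distillation target for every exit.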

Local Critic Training for Model-Parallel Learning of Deep Neural Networks

1 code implementation · 3 Feb 2021 · Hojung Lee, Cho-Jui Hsieh, Jong-Seok Lee

We show that the proposed approach successfully decouples the update process of the layer groups for both convolutional neural networks (CNNs) and recurrent neural networks (RNNs).
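As a rough sketch of the decoupling idea described above: each layer group trains against a small local critic that estimates the downstream loss, while the critic itself is regressed toward the true loss. The names here (group1, group2, critic) and the single-step training loop are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

# Two layer groups whose updates we want to decouple.
group1 = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                       nn.Flatten(), nn.Linear(16 * 32 * 32, 64))
group2 = nn.Sequential(nn.ReLU(), nn.Linear(64, 10))
# Local critic: maps group1's output to a surrogate of the final loss signal.
critic = nn.Linear(64, 10)

opt1 = torch.optim.SGD(group1.parameters(), lr=0.01)
opt2 = torch.optim.SGD(group2.parameters(), lr=0.01)
opt_c = torch.optim.SGD(critic.parameters(), lr=0.01)

x, y = torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,))

# Group 1 updates against the critic's loss estimate; it never waits
# for gradients to arrive from group 2.
h1 = group1(x)
loss1 = F.cross_entropy(critic(h1), y)
opt1.zero_grad(); loss1.backward(); opt1.step()

# Group 2 updates on the detached activation, on a separate graph.
h1_d = h1.detach()
loss2 = F.cross_entropy(group2(h1_d), y)
opt2.zero_grad(); loss2.backward(); opt2.step()

# The critic is regressed so its loss estimate tracks the true loss.
critic_loss = F.mse_loss(F.cross_entropy(critic(h1_d), y), loss2.detach())
opt_c.zero_grad(); critic_loss.backward(); opt_c.step()

Because each optimizer steps its own parameter set and the graphs are separated by detach(), the groups can in principle be placed on different devices and updated in parallel, which is the model-parallel angle of the paper.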

Local Critic Training of Deep Neural Networks

No code implementations · ICLR 2019 · Hojung Lee, Jong-Seok Lee

This paper proposes a novel approach to train deep neural networks by unlocking the layer-wise dependency of backpropagation training.

Tasks: General Classification
