Search Results for author: Jinbae Park

Found 2 papers, 1 paper with code

Distilling Global and Local Logits With Densely Connected Relations

1 code implementation • ICCV 2021 • Youmin Kim, Jinbae Park, YounHo Jang, Muhammad Ali, Tae-Hyun Oh, Sung-Ho Bae

In prevalent knowledge distillation, the logits of most image recognition models are computed by global average pooling and then used to learn to encode high-level, task-relevant knowledge.
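A minimal sketch, not the paper's released code, of the standard setup this abstract refers to: class logits obtained by global average pooling of the final feature map, then distilled with a softened KL loss. All function and variable names here are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def gap_logits(feature_map, classifier):
    """Global-average-pool a (N, C, H, W) feature map, then project to class logits."""
    pooled = feature_map.mean(dim=(2, 3))      # (N, C, H, W) -> (N, C)
    return classifier(pooled)                  # e.g. an nn.Linear(C, num_classes)

def kd_loss(student_logits, teacher_logits, T=4.0):
    """Hinton-style distillation loss on the global logits (illustrative baseline)."""
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)
```

The paper's contribution builds on top of this baseline by also distilling local logits and their relations; the snippet only shows the global-logit starting point described in the abstract.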

Image Classification • Knowledge Distillation • +3

Hybrid Weight Representation: A Quantization Method Represented with Ternary and Sparse-Large Weights

no code implementations • 25 Sep 2019 • Jinbae Park, Sung-Ho Bae

To solve this problem, we propose a hybrid weight representation (HWR) method that produces a network consisting of two types of weights, i.e., ternary weights (TW) and sparse-large weights (SLW).
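A rough illustration, under assumptions, of splitting a weight tensor into a ternary part and a sparse set of large-magnitude weights, as the HWR naming suggests. The thresholds, the per-tensor scale, and the decomposition itself are placeholders for illustration, not the paper's actual quantization scheme.

```python
import torch

def hybrid_weight_split(w, large_thresh=0.5, ternary_thresh=0.05):
    """Hypothetical split of weights into ternary values {-s, 0, +s} plus sparse large weights."""
    # Few large-magnitude weights are kept separately (the "sparse-large" part).
    large_mask = w.abs() > large_thresh
    sparse_large = torch.where(large_mask, w, torch.zeros_like(w))

    # Remaining weights are ternarized with a single per-tensor scale (a common choice).
    rest = torch.where(large_mask, torch.zeros_like(w), w)
    scale = rest.abs().mean()
    ternary = torch.sign(rest) * (rest.abs() > ternary_thresh).float() * scale

    return ternary, sparse_large

# Usage: reconstruct an approximate weight tensor from the two parts.
w = torch.randn(64, 64)
tern, slw = hybrid_weight_split(w)
w_approx = tern + slw
```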

Quantization
