1 code implementation • CVPR 2020 • Jian-Hao Luo, Jianxin Wu
Knowledge distillation is an effective approach to compensating for limited training data.
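The core distillation idea can be sketched as matching a student's temperature-softened output distribution to a teacher's. The sketch below is a minimal illustration of the standard soft-target loss, not the exact objective of this paper; the function names are ours.

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax with temperature; a higher T gives a softer distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, temperature=4.0):
    """KL divergence from the softened teacher distribution (the target)
    to the softened student distribution."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# A student whose logits match the teacher incurs zero distillation loss;
# any mismatch gives a positive loss.
teacher = [5.0, 1.0, -2.0]
print(kd_loss(teacher, teacher))              # → 0.0
print(kd_loss([0.1, 0.2, 0.3], teacher) > 0)  # → True
```

In practice this soft-target term is usually combined with the ordinary cross-entropy loss on the true labels.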
1 code implementation • 18 Apr 2016 • Xiu-Shen Wei, Jian-Hao Luo, Jianxin Wu, Zhi-Hua Zhou
Moreover, on general image retrieval datasets, SCDA achieves comparable retrieval results with state-of-the-art general image retrieval approaches.
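SCDA's key step is selecting useful deep descriptors without any supervision: sum the convolutional feature maps across channels and keep only the spatial positions whose aggregated activation exceeds the mean. A simplified sketch of that selection step (the helper name and toy tensors are ours, and the full method also aggregates the selected descriptors into a retrieval feature):

```python
def scda_mask(feature_maps):
    """Sum activations across channels, then keep spatial positions whose
    aggregated activation exceeds the mean -- the unsupervised descriptor
    selection idea behind SCDA, in simplified form."""
    h = len(feature_maps[0])
    w = len(feature_maps[0][0])
    # Channel-wise sum gives one "aggregation map" over spatial positions.
    agg = [[sum(fm[i][j] for fm in feature_maps) for j in range(w)]
           for i in range(h)]
    mean = sum(sum(row) for row in agg) / (h * w)
    return [[1 if agg[i][j] > mean else 0 for j in range(w)] for i in range(h)]

# Two 2x2 channel maps; the bottom-right position clearly dominates.
maps = [[[0.0, 0.1], [0.2, 2.0]],
        [[0.1, 0.0], [0.1, 3.0]]]
print(scda_mask(maps))  # → [[0, 0], [0, 1]]
```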
no code implementations • 23 May 2018 • Jian-Hao Luo, Jianxin Wu
Previous filter pruning algorithms regard channel pruning and model fine-tuning as two independent steps.
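To make the two-step pipeline being criticized concrete, here is a minimal sketch of the first step in isolation: rank filters by importance, drop the weakest, and only afterwards fine-tune the pruned network. L1-norm ranking is one common criterion used for illustration here, not necessarily the one any particular prior method adopts.

```python
def l1_norm(filt):
    """L1 norm of a filter's weights, a simple importance score."""
    return sum(abs(w) for w in filt)

def prune_filters(filters, keep_ratio=0.5):
    """Step 1 of the classic two-step pipeline: rank filters by L1 norm
    and keep only the strongest fraction. Fine-tuning would then run as a
    separate, independent step on the pruned network."""
    ranked = sorted(range(len(filters)),
                    key=lambda i: l1_norm(filters[i]), reverse=True)
    keep = sorted(ranked[: max(1, int(len(filters) * keep_ratio))])
    return [filters[i] for i in keep]

# Four tiny 1-D "filters"; the two with the largest L1 norm survive.
filters = [[0.1, -0.1], [2.0, 1.5], [0.0, 0.05], [-1.0, 0.9]]
print(prune_filters(filters, keep_ratio=0.5))  # → [[2.0, 1.5], [-1.0, 0.9]]
```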
no code implementations • 8 Mar 2018 • Jianxin Wu, Jian-Hao Luo
Although binary visual representations have traditionally been designed mainly to reduce computational and storage costs in image retrieval research, this paper argues that they can also be applied to large-scale recognition and detection problems, beyond hashing for retrieval.
no code implementations • ICCV 2017 • Jian-Hao Luo, Jianxin Wu, Weiyao Lin
Similar experiments with ResNet-50 reveal that even for a compact network, ThiNet can also reduce more than half of the parameters and FLOPs, at the cost of roughly a 1% top-5 accuracy drop.
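The FLOPs arithmetic behind such savings is straightforward: a convolutional layer's cost scales with the product of its input and output channel counts, so halving the channels on both sides of a layer cuts that layer's cost to roughly a quarter. The layer shapes below are illustrative, not ThiNet's measured numbers.

```python
def conv_flops(h, w, c_in, c_out, k):
    """Multiply-accumulate count of a k x k convolution producing an
    h x w output map (stride 1, ignoring bias)."""
    return h * w * c_in * c_out * k * k

# Keeping half the channels on both sides of a hypothetical layer cuts its
# FLOPs to 25%, which is how channel pruning can more than halve a
# network's total cost even with some layers left untouched.
full = conv_flops(56, 56, 256, 256, 3)
pruned = conv_flops(56, 56, 128, 128, 3)
print(pruned / full)  # → 0.25
```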
no code implementations • 19 Jun 2017 • Jian-Hao Luo, Jianxin Wu
Experiments on the ILSVRC-12 benchmark demonstrate the effectiveness of our method.
no code implementations • 24 May 2016 • Jianxin Wu, Chen-Wei Xie, Jian-Hao Luo
Large receptive field and dense prediction are both important for achieving high accuracy in pixel labeling tasks such as semantic segmentation.
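One standard way to enlarge the receptive field without downsampling (and thus without sacrificing dense prediction) is dilated convolution. The sketch below shows the receptive-field arithmetic for stacked stride-1 convolutions, where each layer adds (kernel − 1) × dilation pixels; it illustrates the general trade-off, not this paper's specific architecture.

```python
def receptive_field(layers):
    """Receptive field of stacked stride-1 convolutions, where each layer
    is a (kernel_size, dilation) pair. Each layer grows the receptive
    field by (k - 1) * dilation pixels."""
    rf = 1
    for k, d in layers:
        rf += (k - 1) * d
    return rf

# Three plain 3x3 convs vs. three dilated ones (dilations 1, 2, 4): the
# dilated stack sees a much larger context at the same parameter count
# and full spatial resolution.
print(receptive_field([(3, 1)] * 3))              # → 7
print(receptive_field([(3, 1), (3, 2), (3, 4)]))  # → 15
```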