1 code implementation • 19 Oct 2020 • Zhuo Su, Linpu Fang, Deke Guo, Dewen Hu, Matti Pietikäinen, Li Liu
Binary neural networks (BNNs), where both weights and activations are binarized to 1 bit, have been widely studied in recent years due to their great benefits of highly accelerated computation and substantially reduced memory footprint, which appeal to the development of resource-constrained devices.
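The core operation behind BNNs is binarization of weights and activations via the sign function, so each multiply-accumulate reduces to products over {-1, +1} (implementable in hardware as XNOR + popcount). A minimal numpy sketch of this idea, not the paper's specific method; the variable names are illustrative:

```python
import numpy as np

def binarize(x):
    """Binarize a tensor to {-1, +1} with the sign function (0 maps to +1)."""
    return np.where(x >= 0, 1.0, -1.0)

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 8))   # full-precision weights of a linear layer
a = rng.normal(size=8)        # full-precision activations

full_out = W @ a                        # float multiply-accumulate
bin_out = binarize(W) @ binarize(a)     # only +/-1 products: XNOR + popcount in hardware
```

In practice, training keeps full-precision latent weights and backpropagates through the sign with a straight-through estimator; the sketch above shows only the forward binarization.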
1 code implementation • ECCV 2020 • Linpu Fang, Xingyan Liu, Li Liu, Hang Xu, Wenxiong Kang
The key ideas are two-fold: a) explicitly modeling the dependencies among joints and the relations between the pixels and the joints for better local feature representation learning; b) unifying the dense pixel-wise offset predictions and direct joint regression for end-to-end training.
1 code implementation • ECCV 2020 • Zhuo Su, Linpu Fang, Wenxiong Kang, Dewen Hu, Matti Pietikäinen, Li Liu
In this paper, we propose dynamic group convolution (DGC), which adaptively selects, for each individual sample on the fly, which input channels are connected within each group.
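The per-sample channel selection can be sketched as follows. This is a simplified illustration, not the paper's implementation: it assumes a 1x1 kernel (so convolution is a matrix product over channels) and a hypothetical linear saliency head `gate_W` that scores input channels from globally pooled features:

```python
import numpy as np

def dynamic_group_conv1x1(x, W_groups, gate_W, k):
    """Sketch of dynamic group convolution for one sample (1x1 kernel case).

    x        : (C_in, H, W) input feature map
    W_groups : list of G weight matrices, each (C_out_g, C_in)
    gate_W   : (G, C_in) per-group channel-saliency weights (illustrative)
    k        : number of input channels each group keeps for this sample
    """
    C_in, H, W = x.shape
    pooled = x.reshape(C_in, -1).mean(axis=1)       # global average pooling
    outs = []
    for g, Wg in enumerate(W_groups):
        scores = gate_W[g] * pooled                 # per-channel saliency for group g
        keep = np.argsort(scores)[-k:]              # top-k channels, chosen per sample
        y = Wg[:, keep] @ x[keep].reshape(k, -1)    # convolve only the selected channels
        outs.append(y.reshape(Wg.shape[0], H, W))
    return np.concatenate(outs, axis=0)             # (sum of C_out_g, H, W)
```

Because the top-k selection depends on the pooled input, different samples activate different channel subsets, which is the "dynamic" part; a static group convolution would hard-wire the channel-to-group assignment.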
no code implementations • 18 Feb 2020 • Linpu Fang, Hang Xu, Zhili Liu, Sarah Parisot, Zhenguo Li
In this paper, we study the hybrid-supervised object detection problem, aiming to train a high-quality detector with only a limited amount of fully-annotated data while fully exploiting cheap data with image-level labels.
no code implementations • 18 Feb 2020 • Hang Xu, Linpu Fang, Xiaodan Liang, Wenxiong Kang, Zhenguo Li
Finally, an Inter-Domain Transfer Module is proposed to exploit diverse transfer dependencies across all domains and to enhance the regional feature representation by attending to and transferring semantic contexts globally.