no code implementations • NeurIPS 2021 • Muhammad Awais, Fengwei Zhou, Chuanlong Xie, Jiawei Li, Sung-Ho Bae, Zhenguo Li
First, we theoretically show the transferability of robustness from an adversarially trained teacher model to a student model with the help of mixup augmentation.
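The exact objective is given in the paper; as a rough illustration of the idea, the minimal PyTorch sketch below combines mixup augmentation with logit distillation from a frozen (adversarially trained) teacher. The Beta-sampled mixup coefficient, temperature, and loss weighting are placeholder assumptions, not the paper's formulation.

```python
import torch
import torch.nn.functional as F

def mixup_distillation_step(student, teacher, x, y, alpha=1.0, T=4.0):
    """One illustrative step: distill a frozen (robust) teacher into a student
    on mixup-augmented inputs. Hyperparameters are placeholder assumptions."""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(x.size(0))
    x_mix = lam * x + (1.0 - lam) * x[perm]          # mixup augmentation

    with torch.no_grad():                            # teacher is not updated
        t_logits = teacher(x_mix)
    s_logits = student(x_mix)

    # Soft-label distillation plus mixed-label cross-entropy.
    kd = F.kl_div(F.log_softmax(s_logits / T, dim=1),
                  F.softmax(t_logits / T, dim=1),
                  reduction="batchmean") * T * T
    ce = lam * F.cross_entropy(s_logits, y) + (1.0 - lam) * F.cross_entropy(s_logits, y[perm])
    return kd + ce
```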
no code implementations • 29 Sep 2021 • Linh-Tam Tran, A F M Shahab Uddin, Sung-Ho Bae
To remedy this problem, we propose a new paradigm for NAS that effectively reduces the use of memory while maintaining high performance.
no code implementations • ICCV 2021 • Muhammad Awais, Fengwei Zhou, Hang Xu, Lanqing Hong, Ping Luo, Sung-Ho Bae, Zhenguo Li
Extensive Unsupervised Domain Adaptation (UDA) studies have shown great success in practice by learning transferable representations across a labeled source domain and an unlabeled target domain with deep models.
no code implementations • ICCV 2021 • Youmin Kim, Jinbae Park, YounHo Jang, Muhammad Ali, Tae-Hyun Oh, Sung-Ho Bae
In prevalent knowledge distillation, the logits of most image recognition models are computed by global average pooling and then used to encode high-level, task-relevant knowledge.
Ranked #10 on Knowledge Distillation on ImageNet
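For context, the sketch below shows the conventional pipeline this entry refers to: spatial features are reduced by global average pooling to logits, and the student matches the teacher's softened logits. It illustrates the prevalent setup, not the method proposed in this paper; the channel and class sizes are placeholders.

```python
import torch.nn as nn
import torch.nn.functional as F

class GAPClassifier(nn.Module):
    """Conventional head: spatial features -> global average pooling -> logits."""
    def __init__(self, in_channels=512, num_classes=1000):
        super().__init__()
        self.fc = nn.Linear(in_channels, num_classes)

    def forward(self, feat):                    # feat: (N, C, H, W)
        return self.fc(feat.mean(dim=(2, 3)))   # GAP, then linear projection

def kd_loss(student_logits, teacher_logits, T=4.0):
    """Standard logit distillation on the GAP-based logits."""
    return F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher_logits / T, dim=1),
                    reduction="batchmean") * T * T
```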
3 code implementations • 15 Sep 2020 • Kai Zhang, Martin Danelljan, Yawei Li, Radu Timofte, Jie Liu, Jie Tang, Gangshan Wu, Yu Zhu, Xiangyu He, Wenjie Xu, Chenghua Li, Cong Leng, Jian Cheng, Guangyang Wu, Wenyi Wang, Xiaohong Liu, Hengyuan Zhao, Xiangtao Kong, Jingwen He, Yu Qiao, Chao Dong, Maitreya Suin, Kuldeep Purohit, A. N. Rajagopalan, Xiaochuan Li, Zhiqiang Lang, Jiangtao Nie, Wei Wei, Lei Zhang, Abdul Muqeet, Jiwon Hwang, Subin Yang, JungHeum Kang, Sung-Ho Bae, Yongwoo Kim, Geun-Woo Jeon, Jun-Ho Choi, Jun-Hyuk Kim, Jong-Seok Lee, Steven Marty, Eric Marty, Dongliang Xiong, Siang Chen, Lin Zha, Jiande Jiang, Xinbo Gao, Wen Lu, Haicheng Wang, Vineeth Bhaskara, Alex Levinshtein, Stavros Tsogkas, Allan Jepson, Xiangzhen Kong, Tongtong Zhao, Shanshan Zhao, Hrishikesh P. S, Densen Puthussery, Jiji C. V, Nan Nan, Shuai Liu, Jie Cai, Zibo Meng, Jiaming Ding, Chiu Man Ho, Xuehui Wang, Qiong Yan, Yuzhi Zhao, Long Chen, Jiangtao Zhang, Xiaotong Luo, Liang Chen, Yanyun Qu, Long Sun, Wenhao Wang, Zhenbing Liu, Rushi Lan, Rao Muhammad Umer, Christian Micheloni
This paper reviews the AIM 2020 challenge on efficient single image super-resolution with focus on the proposed solutions and results.
1 code implementation • 29 Aug 2020 • Abdul Muqeet, Jiwon Hwang, Subin Yang, Jung Heum Kang, Yongwoo Kim, Sung-Ho Bae
MAFFSRN consists of the proposed feature fusion groups (FFGs), which serve as feature extraction blocks.
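MAFFSRN's exact block design is given in the paper; as a loose structural illustration only, the sketch below implements a generic feature-fusion group: several convolutional blocks whose outputs are concatenated and fused by a 1x1 convolution, with a residual connection around the group. The block contents and sizes are assumptions, not the paper's FFG.

```python
import torch
import torch.nn as nn

class FeatureFusionGroup(nn.Module):
    """Generic fusion group (illustrative, not MAFFSRN's exact FFG)."""
    def __init__(self, channels=64, num_blocks=4):
        super().__init__()
        self.blocks = nn.ModuleList(
            nn.Sequential(nn.Conv2d(channels, channels, 3, padding=1),
                          nn.ReLU(inplace=True))
            for _ in range(num_blocks))
        self.fuse = nn.Conv2d(channels * num_blocks, channels, kernel_size=1)

    def forward(self, x):
        outs, h = [], x
        for block in self.blocks:
            h = block(h)
            outs.append(h)
        return x + self.fuse(torch.cat(outs, dim=1))   # fuse and add residual
```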
1 code implementation • 19 Jun 2020 • Muhammad Awais, Fahad Shamshad, Sung-Ho Bae
In this paper, we investigate how BatchNorm causes this vulnerability and propose a new normalization that is robust to adversarial attacks.
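One way to reproduce the kind of analysis this entry describes is to compare a BatchNorm model against an otherwise identical, differently normalized model under a simple attack. The sketch below measures robust accuracy under single-step FGSM; the attack, epsilon, and the [0, 1] input range are assumptions, and the proposed normalization itself is not reproduced here.

```python
import torch
import torch.nn.functional as F

def fgsm_robust_accuracy(model, loader, eps=8 / 255, device="cpu"):
    """Accuracy under a single-step FGSM attack (illustrative robustness probe)."""
    model.eval()
    correct = total = 0
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        x.requires_grad_(True)
        loss = F.cross_entropy(model(x), y)
        grad, = torch.autograd.grad(loss, x)
        x_adv = (x + eps * grad.sign()).clamp(0, 1).detach()   # FGSM step
        with torch.no_grad():
            correct += (model(x_adv).argmax(dim=1) == y).sum().item()
        total += y.size(0)
    return correct / total
```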
1 code implementation • ICLR 2021 • A. F. M. Shahab Uddin, Mst. Sirazam Monira, Wheemyung Shin, TaeChoong Chung, Sung-Ho Bae
We argue that such random patch selection strategies may not capture sufficient information about the corresponding object, and that mixing the labels according to an uninformative patch leads the model to learn unexpected feature representations.
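As a rough illustration of replacing random patch selection with an informativeness criterion, the sketch below cuts the patch centred on the most salient region of the source image and weights the labels by patch area. The saliency proxy (local intensity-gradient magnitude) and the fixed patch size are simplifications, not the paper's exact procedure.

```python
import torch

def saliency_guided_mix(x, y, patch_frac=0.3):
    """Cut a patch around the most salient region of a shuffled source image
    and paste it into the target (illustrative, simplified)."""
    n, _, h, w = x.shape
    perm = torch.randperm(n)
    src = x[perm]

    # Cheap saliency proxy: intensity-gradient magnitude of the grayscale source.
    gray = src.mean(dim=1)                                            # (N, H, W)
    row_sal = (gray[:, 1:, :] - gray[:, :-1, :]).abs().mean(dim=2)    # (N, H-1)
    col_sal = (gray[:, :, 1:] - gray[:, :, :-1]).abs().mean(dim=1)    # (N, W-1)

    ph, pw = int(h * patch_frac), int(w * patch_frac)
    x_mix = x.clone()
    for i in range(n):
        cy, cx = int(row_sal[i].argmax()), int(col_sal[i].argmax())
        y0 = min(max(cy - ph // 2, 0), h - ph)
        x0 = min(max(cx - pw // 2, 0), w - pw)
        x_mix[i, :, y0:y0 + ph, x0:x0 + pw] = src[i, :, y0:y0 + ph, x0:x0 + pw]

    lam = 1.0 - (ph * pw) / (h * w)          # label weight follows patch area
    return x_mix, y, y[perm], lam
```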
no code implementations • 25 Sep 2019 • Jinbae Park, Sung-Ho Bae
To solve this problem, we propose a hybrid weight representation (HWR) method that produces a network consisting of two types of weights, i.e., ternary weights (TW) and sparse-large weights (SLW).
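As a rough sketch of the two weight types, the function below ternarizes most weights while keeping a small fraction of the largest-magnitude weights at their original values as sparse-large weights. The threshold and scale heuristics are common ternary-quantization choices used for illustration, not the paper's exact HWR encoding.

```python
import torch

def hybrid_quantize(w, large_frac=0.01):
    """Illustrative hybrid weight representation: most weights become ternary
    {-s, 0, +s}; the top `large_frac` largest-magnitude weights are kept as
    sparse-large weights (assumed heuristics, not the paper's scheme)."""
    k = max(1, int(large_frac * w.numel()))
    large_thresh = w.abs().flatten().topk(k).values.min()
    large_mask = w.abs() >= large_thresh              # sparse-large weights (SLW)

    delta = 0.7 * w.abs().mean()                      # ternary threshold
    nonzero = w.abs() > delta
    scale = w.abs()[nonzero].mean() if nonzero.any() else w.abs().mean()
    ternary = torch.sign(w) * scale * nonzero         # ternary weights (TW)

    return torch.where(large_mask, w, ternary), large_mask
```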
no code implementations • 25 Sep 2019 • JoonHyun Jeong, Sung-Ho Bae
Some conventional transforms such as Discrete Walsh-Hadamard Transform (DWHT) and Discrete Cosine Transform (DCT) have been widely used as feature extractors in image processing but rarely applied in neural networks.
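The sketch below illustrates the general idea of using a conventional transform as a fixed, non-trainable feature extractor: a 2-D DCT-II basis is built as k x k filters and applied as a strided convolution over non-overlapping blocks of a single-channel input. It shows the transform-as-feature-extractor idea, not the specific architecture studied in the paper.

```python
import math
import torch
import torch.nn.functional as F

def dct2_basis(k=8):
    """Orthonormal 2-D DCT-II basis as k*k filters of size (k, k)."""
    n = torch.arange(k, dtype=torch.float32)
    # 1-D DCT-II matrix: C[u, x] = a(u) * cos(pi * (2x + 1) * u / (2k))
    c = torch.cos(math.pi * (2 * n[None, :] + 1) * n[:, None] / (2 * k))
    c[0] *= 1 / math.sqrt(2)
    c *= math.sqrt(2 / k)
    # Outer products of the 1-D bases give the 2-D basis functions.
    return torch.einsum('ux,vy->uvxy', c, c).reshape(k * k, 1, k, k)

def dct_features(x, k=8):
    """Fixed DCT 'feature extractor': strided conv over non-overlapping blocks
    of a single-channel input x of shape (N, 1, H, W)."""
    weight = dct2_basis(k).to(x.device)
    return F.conv2d(x, weight, stride=k)      # (N, k*k, H/k, W/k)
```

For multi-channel inputs, the same fixed basis can be applied per channel (e.g., via grouped convolution); the Walsh-Hadamard case differs only in the basis construction.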
no code implementations • 16 Jul 2019 • Kang-Ho Lee, JoonHyun Jeong, Sung-Ho Bae
Based on SVWH, we propose a second ILWP and quantization method that quantizes the predicted residuals between the weights of adjacent convolution layers.
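As a simplified picture of inter-layer weight prediction with residual quantization, the sketch below predicts each layer's weights by copying the previous layer's reconstructed weights (when shapes match) and uniformly quantizes the residual. The identity predictor and the uniform quantizer are assumptions, not the paper's exact ILWP scheme.

```python
import torch

def quantize_residuals(weights, num_bits=4):
    """Illustrative inter-layer weight prediction + residual quantization.
    `weights` is a list of conv weight tensors in layer order."""
    quantized, prev = [], None
    levels = 2 ** (num_bits - 1) - 1
    for w in weights:
        # Identity prediction from the previous layer when shapes match.
        pred = prev if (prev is not None and prev.shape == w.shape) else torch.zeros_like(w)
        res = w - pred
        step = res.abs().max() / levels if res.abs().max() > 0 else 1.0
        res_q = torch.clamp(torch.round(res / step), -levels, levels) * step
        w_rec = pred + res_q                    # reconstructed weights
        quantized.append(w_rec)
        prev = w_rec                            # predict the next layer from the reconstruction
    return quantized
```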
1 code implementation • 11 Jul 2019 • Abdul Muqeet, Md Tauhid Bin Iqbal, Sung-Ho Bae
To address these issues, we present a binarized feature fusion (BFF) structure that utilizes the extracted features from residual groups (RG) in an effective way.
no code implementations • 25 Jun 2019 • Joonhyun Jeong, Sung-Ho Bae
Some conventional transforms such as Discrete Walsh-Hadamard Transform (DWHT) and Discrete Cosine Transform (DCT) have been widely used as feature extractors in image processing but rarely applied in neural networks.
no code implementations • 10 May 2017 • Sung-Ho Bae, Mohamed Elgharib, Mohamed Hefeeda, Wojciech Matusik
We present two FCN architectures for SIVG.
4 code implementations • 9 May 2017 • Ahmed Hassanien, Mohamed Elgharib, Ahmed Selim, Sung-Ho Bae, Mohamed Hefeeda, Wojciech Matusik
Since current datasets are not large enough to train an accurate SBD CNN, we present a new dataset containing more than 3.5 million frames of sharp and gradual transitions.
Ranked #1 on Camera shot boundary detection on ClipShots