2 code implementations • 4 Apr 2024 • Cheeun Hong, Kyoung Mu Lee
Although image super-resolution (SR) has achieved unprecedented restoration accuracy with deep neural networks, its practical applications remain limited by substantial computational costs.
no code implementations • 25 Jul 2023 • Cheeun Hong, Kyoung Mu Lee
Quantization is a promising approach to reduce the high computational complexity of image super-resolution (SR) networks.
1 code implementation • 21 Jul 2022 • Cheeun Hong, Sungyong Baik, Heewon Kim, Seungjun Nah, Kyoung Mu Lee
In this work, to achieve high average bit-reduction with less accuracy loss, we propose a novel Content-Aware Dynamic Quantization (CADyQ) method for SR networks that allocates optimal bits to local regions and layers adaptively based on the local contents of an input image.
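The core idea of content-aware bit allocation can be illustrated with a minimal sketch: measure the local content of each image patch (here, simple gradient energy) and assign a lower bit-width to flat regions and a higher one to detailed regions. The patch size, bit candidates, and thresholds below are illustrative assumptions, not the learned policy of CADyQ.

```python
import numpy as np

def allocate_bits(image, patch=8, candidates=(4, 6, 8), thresholds=(0.05, 0.15)):
    """Assign a bit-width per patch based on mean gradient energy.

    A hypothetical stand-in for content-aware dynamic quantization:
    flat patches get the lowest bit-width, textured patches the highest.
    """
    gy, gx = np.gradient(image.astype(np.float64))
    energy = np.sqrt(gx ** 2 + gy ** 2)
    h, w = image.shape
    bits = np.empty((h // patch, w // patch), dtype=int)
    for i in range(h // patch):
        for j in range(w // patch):
            e = energy[i * patch:(i + 1) * patch, j * patch:(j + 1) * patch].mean()
            if e < thresholds[0]:
                bits[i, j] = candidates[0]
            elif e < thresholds[1]:
                bits[i, j] = candidates[1]
            else:
                bits[i, j] = candidates[2]
    return bits

np.random.seed(0)
flat = np.zeros((16, 16))        # uniform region: minimal content
busy = np.random.rand(16, 16)    # noisy region: high gradient energy
print(allocate_bits(flat))
print(allocate_bits(busy))
```

In the actual method, the bit-width is selected per layer and per patch by a trainable module rather than fixed thresholds, so the average bit-width adapts to both network depth and image content.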
1 code implementation • CVPR 2022 • Junghun Oh, Heewon Kim, Seungjun Nah, Cheeun Hong, Jonghyun Choi, Kyoung Mu Lee
Image restoration tasks have witnessed great performance improvement in recent years by developing large deep models.
no code implementations • 2 Dec 2021 • Junghun Oh, Heewon Kim, Sungyong Baik, Cheeun Hong, Kyoung Mu Lee
The goal of filter pruning is to identify and remove unimportant filters so that convolutional neural networks (CNNs) become more efficient without sacrificing performance.
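As a point of reference, the simplest pruning criterion ranks filters by the magnitude of their weights. The sketch below uses an L1-norm baseline, which is a common illustrative heuristic and not the importance criterion proposed in the paper.

```python
import numpy as np

def prune_filters(weights, keep_ratio=0.5):
    """Keep the strongest filters of a conv layer by L1 norm.

    weights: array of shape (out_channels, in_channels, k, k).
    Returns the pruned weight tensor and the kept filter indices.
    """
    norms = np.abs(weights).sum(axis=(1, 2, 3))          # L1 norm per output filter
    n_keep = max(1, int(len(norms) * keep_ratio))
    keep = np.sort(np.argsort(norms)[::-1][:n_keep])     # indices of retained filters
    return weights[keep], keep

np.random.seed(0)
w = np.random.randn(8, 3, 3, 3)          # a hypothetical 8-filter conv layer
pruned, kept = prune_filters(w, keep_ratio=0.5)
print(pruned.shape)                       # (4, 3, 3, 3)
```

Removing output filters also shrinks the input channels of the following layer, which is where the actual FLOP savings come from in a full network.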
2 code implementations • 21 Dec 2020 • Cheeun Hong, Heewon Kim, Sungyong Baik, Junghun Oh, Kyoung Mu Lee
Quantizing deep convolutional neural networks for image super-resolution substantially reduces their computational costs.
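The generic building block behind such low-bit networks is uniform quantization: mapping floating-point values onto a small signed integer grid and back. This is a hedged sketch of symmetric uniform quantization only; the paper's actual scheme for SR networks differs in how ranges are chosen and trained.

```python
import numpy as np

def quantize(x, bits=4):
    """Symmetric uniform quantization to `bits` signed levels, then dequantize."""
    qmax = 2 ** (bits - 1) - 1                      # e.g. 7 for 4-bit signed
    scale = (np.abs(x).max() / qmax) or 1.0         # guard against all-zero input
    q = np.clip(np.round(x / scale), -qmax, qmax)   # integer grid values
    return q * scale                                 # dequantized approximation

x = np.linspace(-1.0, 1.0, 9)
xq = quantize(x, bits=4)
print(np.abs(x - xq).max())   # bounded by half a quantization step
```

The quantization error per element is at most half a step (`scale / 2`), which is why accuracy degrades gracefully at 8 bits but requires careful range handling at 4 bits and below, especially for the wide activation distributions found in SR networks.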