Search Results for author: Chong Mou

Found 6 papers, 5 papers with code

Metric Learning based Interactive Modulation for Real-World Super-Resolution

no code implementations • 10 May 2022 • Chong Mou, Yanze Wu, Xintao Wang, Chao Dong, Jian Zhang, Ying Shan

Instead of using known degradation levels as explicit supervision for the interactive mechanism, we propose a metric learning strategy that maps the unquantifiable degradation levels of real-world scenarios into a metric space, which is trained in an unsupervised manner.

Image Restoration • Metric Learning +1
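The metric-learning idea in the abstract above can be illustrated with a standard triplet margin loss, which pulls embeddings of similarly degraded patches together and pushes differently degraded ones apart. This is a minimal NumPy sketch, not the paper's implementation; the embedding vectors and margin are assumed for illustration.

```python
import numpy as np

def triplet_margin_loss(anchor, positive, negative, margin=0.5):
    """Hinge loss that places similarly degraded patches close in the
    metric space. anchor/positive share a degradation level; negative
    has a different one. (Illustrative, not the paper's loss.)"""
    d_ap = np.linalg.norm(anchor - positive)   # same degradation -> small
    d_an = np.linalg.norm(anchor - negative)   # different degradation -> large
    return max(0.0, d_ap - d_an + margin)

# Embeddings of three patches in the learned metric space (made-up values).
a = np.array([0.0, 0.0])
p = np.array([0.0, 1.0])   # same degradation level as the anchor
n = np.array([3.0, 0.0])   # different degradation level
loss = triplet_margin_loss(a, p, n)   # -> 0.0 (already well separated)
```

Training such a loss over pairs drawn from the same or different real-world images lets the network rank degradation severity without ever quantizing it into explicit levels.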

Deep Generalized Unfolding Networks for Image Restoration

1 code implementation • 28 Apr 2022 • Chong Mou, Qian Wang, Jian Zhang

Concretely, without loss of interpretability, we integrate a gradient estimation strategy into the gradient descent step of the Proximal Gradient Descent (PGD) algorithm, driving it to deal with complex and real-world image degradation.

Image Restoration
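The unfolding described above builds on the classical PGD iteration x_{k+1} = prox(x_k − ρ∇f(x_k)), where the paper replaces the analytic gradient with a learned estimate to handle unknown degradations. A minimal NumPy sketch of the vanilla step being generalized, assuming a linear degradation operator A and a soft-threshold proximal operator (both illustrative):

```python
import numpy as np

def soft_threshold(v, lam):
    """Proximal operator of the l1 norm (a simple hand-crafted prior)."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def pgd_step(x, A, y, rho=1.0, lam=0.1):
    """One Proximal Gradient Descent iteration for y = A x + noise.

    The paper's gradient estimation strategy replaces the analytic
    gradient A^T(Ax - y) below with a learned estimator, so the step
    also copes with complex, real-world degradations.
    """
    grad = A.T @ (A @ x - y)          # gradient of the data-fidelity term
    return soft_threshold(x - rho * grad, lam)

A = np.eye(2)                         # toy degradation operator (identity)
y = np.array([1.0, -1.0])             # observed signal (flattened image)
x1 = pgd_step(np.zeros(2), A, y)      # -> [0.9, -0.9]
```

Unrolling a fixed number of such steps, with the gradient and proximal parts parameterized by networks, yields an interpretable end-to-end restoration model.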

Dynamic Attentive Graph Learning for Image Restoration

1 code implementation • ICCV 2021 • Chong Mou, Jian Zhang, Zhuoyuan Wu

Specifically, we propose an improved graph model to perform patch-wise graph convolution with a dynamic and adaptive number of neighbors for each node.

Demosaicking • Graph Learning +1
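The dynamic neighbor selection described above can be sketched as thresholded cosine similarity: each patch node keeps however many neighbors clear the threshold, rather than a fixed k as in standard k-NN graph construction. The feature values and threshold below are illustrative, not the paper's learned features:

```python
import numpy as np

def dynamic_neighbors(features, tau=0.9):
    """Return, per node, a variable-size list of neighbor indices whose
    cosine similarity to the node is >= tau. Self-loops are excluded."""
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = f @ f.T                          # pairwise cosine similarity
    np.fill_diagonal(sim, -np.inf)         # a node is not its own neighbor
    return [np.flatnonzero(row >= tau).tolist() for row in sim]

# Three patch features: the first two are near-duplicates, the third differs.
feats = np.array([[1.0, 0.0],
                  [1.0, 0.01],
                  [0.0, 1.0]])
neighbors = dynamic_neighbors(feats)       # -> [[1], [0], []]
```

Graph convolution then aggregates over each node's own neighbor set, so repetitive texture patches gather many neighbors while unique structures keep few or none.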

Dense Deep Unfolding Network with 3D-CNN Prior for Snapshot Compressive Imaging

1 code implementation • ICCV 2021 • Zhuoyuan Wu, Jian Zhang, Chong Mou

To better exploit the spatial-temporal correlation among frames and address the problem of information loss between adjacent phases in existing DUNs, we propose to adopt the 3D-CNN prior in our proximal mapping module and develop a novel dense feature map (DFM) strategy, respectively.
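The proximal mapping described above can be sketched as one gradient step on the snapshot compressive imaging measurement model y = Σ_t mask_t ⊙ x_t, followed by a prior step. The paper uses a learned 3D-CNN prior; here a simple temporal moving average stands in for it, and all shapes and values are illustrative:

```python
import numpy as np

def sci_unfolding_step(x, masks, y, rho=0.2, prior=None):
    """One deep-unfolding iteration for snapshot compressive imaging.

    x: (T, H, W) video estimate; masks: (T, H, W); y: (H, W) snapshot.
    The paper plugs a 3D-CNN prior into the proximal mapping; a temporal
    moving average stands in for that learned denoiser here.
    """
    residual = (masks * x).sum(axis=0) - y     # measurement error
    grad = masks * residual[None, ...]         # gradient of the fidelity term
    z = x - rho * grad                         # gradient descent step
    if prior is None:
        prior = lambda v: (v + v.mean(axis=0, keepdims=True)) / 2.0
    return prior(z)                            # proximal mapping

T, H, W = 2, 2, 2
masks = np.ones((T, H, W))                     # toy sensing masks
x_true = np.stack([np.full((H, W), 1.0), np.full((H, W), 2.0)])
y = (masks * x_true).sum(axis=0)               # simulated 2D snapshot
x1 = sci_unfolding_step(np.zeros((T, H, W)), masks, y)
```

Stacking several such phases, with dense feature-map connections between them, is what lets the network avoid the inter-phase information loss the abstract mentions.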

COLA-Net: Collaborative Attention Network for Image Restoration

2 code implementations • 10 Mar 2021 • Chong Mou, Jian Zhang, Xiaopeng Fan, Hangfan Liu, Ronggang Wang

Local and non-local attention-based methods have been well studied in various image restoration tasks and have led to promising performance.

Image Denoising • Image Restoration
