no code implementations • 13 Aug 2024 • Hong Ye Tan, Subhadip Mukherjee, Junqi Tang, Carola-Bibiane Schönlieb
A natural step for analysis is thus to assume the manifold hypothesis and derive bounds that are independent of any embedding space.
no code implementations • 10 Jun 2024 • Matthias J. Ehrhardt, Zeljko Kereta, Jingwei Liang, Junqi Tang
We focus on the potential and the challenges for stochastic optimisation that are unique to inverse imaging problems and are not commonly encountered in machine learning.
no code implementations • 8 Apr 2024 • Hong Ye Tan, Ziruo Cai, Marcelo Pereyra, Subhadip Mukherjee, Junqi Tang, Carola-Bibiane Schönlieb
However, many such methods require the availability of ground truth data, which may be unavailable or expensive, leading to a fundamental barrier that cannot be bypassed by the choice of architecture.
no code implementations • 15 Nov 2023 • Marcello Carioni, Subhadip Mukherjee, Hong Ye Tan, Junqi Tang
Together with a detailed survey, we provide an overview of the key mathematical results underlying the methods reviewed in the chapter, keeping our discussion self-contained.
1 code implementation • 30 Jul 2023 • Qingping Zhou, Jiayu Qian, Junqi Tang, Jinglai Li
We provide experimental results on two nonlinear inverse problems: a nonlinear deconvolution problem, and an electrical impedance tomography problem with limited boundary measurements.
1 code implementation • 17 Apr 2023 • Ziruo Cai, Junqi Tang, Subhadip Mukherjee, Jinglai Li, Carola-Bibiane Schönlieb, Xiaoqun Zhang
Bayesian methods for solving inverse problems are a powerful alternative to classical methods since the Bayesian approach offers the ability to quantify the uncertainty in the solution.
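Bayesian posterior computation of this kind is often carried out with Langevin-type samplers. As a minimal, hypothetical illustration (a toy Gaussian target, not the specific sampler of this paper), the unadjusted Langevin algorithm (ULA) draws approximate posterior samples using only the score of the target:

```python
import numpy as np

# Minimal unadjusted Langevin algorithm (ULA) sketch: draws approximate
# samples from a target density pi(x) ∝ exp(-0.5 * x^2), i.e. N(0, 1).
# Illustrative toy only, not the algorithm proposed in the paper.

def grad_log_pi(x):
    """Score of a standard Gaussian target."""
    return -x

def ula(n_samples, delta=0.1, seed=0):
    """x_{k+1} = x_k + delta * grad log pi(x_k) + sqrt(2*delta) * noise."""
    rng = np.random.default_rng(seed)
    x, out = 0.0, np.empty(n_samples)
    for k in range(n_samples):
        x = x + delta * grad_log_pi(x) + np.sqrt(2 * delta) * rng.standard_normal()
        out[k] = x
    return out

samples = ula(20000)
# For this Gaussian target, ULA's stationary variance is 1/(1 - delta/2),
# slightly above 1: the discretisation bias a Metropolis correction removes.
print(samples.mean(), samples.var())
```

The sample spread then provides the uncertainty estimate; in imaging, `grad_log_pi` combines a data-fidelity gradient with a prior (or learned) score.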
1 code implementation • 9 Mar 2023 • Hong Ye Tan, Subhadip Mukherjee, Junqi Tang, Carola-Bibiane Schönlieb
Plug-and-Play (PnP) methods are a class of efficient iterative methods that aim to combine data fidelity terms and deep denoisers using classical optimization algorithms, such as ISTA or ADMM, with applications in inverse problems and imaging.
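A minimal sketch of the PnP-ISTA iteration described above, with a toy sparse-recovery problem and soft-thresholding standing in for a learned deep denoiser (both hypothetical choices, not the paper's setup):

```python
import numpy as np

# Minimal PnP-ISTA sketch for a linear inverse problem y = A x.
# A trained deep denoiser would replace `denoise` below; soft-thresholding
# is used as a stand-in, which reduces PnP-ISTA to plain ISTA for the lasso.

def denoise(x, tau):
    """Placeholder denoiser: soft-thresholding (a trained network in practice)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def pnp_ista(A, y, lam=0.05, n_iters=500):
    step = 1.0 / np.linalg.norm(A, 2) ** 2        # 1/L for the data-fidelity term
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - y)                  # gradient of 0.5*||A x - y||^2
        x = denoise(x - step * grad, step * lam)  # denoiser replaces the prox
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 50)) / np.sqrt(30)
x_true = np.zeros(50)
x_true[[3, 17, 41]] = [1.5, -2.0, 1.0]
y = A @ x_true
x_hat = pnp_ista(A, y)
print(np.linalg.norm(x_hat - x_true))
```

Swapping the gradient step and denoiser into an ADMM splitting instead yields PnP-ADMM; the structure (data-fidelity update interleaved with a denoising step) is the same.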
no code implementations • 31 Aug 2022 • Junqi Tang, Subhadip Mukherjee, Carola-Bibiane Schönlieb
In this work we propose a new paradigm for designing efficient deep unrolling networks using dimensionality reduction schemes, including minibatch gradient approximation and operator sketching.
no code implementations • 2 Aug 2022 • Junqi Tang, Matthias Ehrhardt, Carola-Bibiane Schönlieb
In this work we propose a stochastic primal-dual preconditioned three-operator splitting algorithm for solving a class of convex three-composite optimization problems.
no code implementations • 21 Mar 2022 • Junqi Tang, Subhadip Mukherjee, Carola-Bibiane Schönlieb
In this work we propose a new paradigm for designing efficient deep unrolling networks using operator sketching.
no code implementations • 14 Mar 2022 • Junqi Tang
Unlike existing approaches which utilize stochastic gradient iterations for acceleration, we propose novel multi-stage sketched gradient iterations which first perform downsampling dimensionality reduction in the image space, and then efficiently approximate the true gradient using the sketched gradient in the low-dimensional space.
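A highly simplified two-stage sketch of this idea on a toy 1-D least-squares problem, with averaging/repetition as hypothetical down/upsampling operators (the paper's actual operators and staging will differ):

```python
import numpy as np

# Two-stage sketched gradient descent for f(x) = 0.5*||A x - y||^2.
# Stage 1 iterates on a downsampled variable z, so each iteration touches
# only the smaller sketched operator A_c = A U; stage 2 refines at full
# resolution from the upsampled warm start.

rng = np.random.default_rng(1)
n, m = 64, 80
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.cumsum(rng.standard_normal(n)) / 5.0   # smooth-ish ground truth
y = A @ x_true

def loss(x):
    return 0.5 * np.sum((A @ x - y) ** 2)

U = np.repeat(np.eye(n // 2), 2, axis=0)   # nearest-neighbour upsampling R^{n/2} -> R^n
A_c = A @ U                                # sketched forward operator

# Stage 1: cheap iterations in the low-dimensional space.
z = np.zeros(n // 2)
step_c = 1.0 / np.linalg.norm(A_c, 2) ** 2
for _ in range(150):
    z -= step_c * A_c.T @ (A_c @ z - y)

# Stage 2: full-resolution gradient descent from the upsampled iterate.
x = U @ z
step = 1.0 / np.linalg.norm(A, 2) ** 2
for _ in range(150):
    x -= step * A.T @ (A @ x - y)

print(loss(np.zeros(n)), loss(U @ z), loss(x))
```

The coarse stage does most of the work at a fraction of the per-iteration cost; the full-resolution stage only needs to correct the fine-scale residual.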
no code implementations • 22 Feb 2022 • Junqi Tang
This algorithmic framework is tailored to a clinical need in medical imaging practice: after a reconstruction of the full tomographic image, the clinician may find that some critical parts of the image are not clear enough and may wish to see these regions of interest more clearly.
no code implementations • 10 Feb 2022 • Junqi Tang
In this work, we propose Regularization-by-Equivariance (REV), a novel structure-adaptive regularization scheme for solving imaging inverse problems under incomplete measurements.
no code implementations • 19 Oct 2021 • Junqi Tang, Subhadip Mukherjee, Carola-Bibiane Schönlieb
We develop a stochastic (ordered-subsets) variant of the classical learned primal-dual (LPD), which is a state-of-the-art unrolling network for tomographic image reconstruction.
no code implementations • 20 Jun 2020 • Junqi Tang, Mike Davies
In this work we propose an efficient stochastic plug-and-play (PnP) algorithm for imaging inverse problems.
no code implementations • CVPR 2021 • Julián Tachella, Junqi Tang, Mike Davies
While the NTK theory accurately predicts the filter associated with networks trained using standard gradient descent, our analysis shows that it falls short of explaining the behaviour of networks trained using the popular Adam optimizer.
no code implementations • 27 Feb 2020 • Derek Driggs, Junqi Tang, Jingwei Liang, Mike Davies, Carola-Bibiane Schönlieb
We introduce SPRING, a novel stochastic proximal alternating linearized minimization algorithm for solving a class of non-smooth and non-convex optimization problems.
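The flavour of such a scheme can be illustrated on a toy sparse rank-1 factorisation, with a plain minibatch gradient estimator standing in for SPRING's variance-reduced estimators — a hypothetical simplification, not the actual algorithm:

```python
import numpy as np

# Stochastic proximal alternating linearized minimisation sketch (PALM with
# minibatch gradients) for the non-smooth, non-convex toy problem
#   min_{x,y} 0.5*||M - x y^T||_F^2 + lam*(||x||_1 + ||y||_1).
# SPRING itself uses variance-reduced estimators (e.g. SAGA/SARAH); plain
# minibatch gradients are used here purely for illustration.

def prox_l1(v, tau):
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

rng = np.random.default_rng(2)
p = q = 20
M = np.outer(rng.standard_normal(p), rng.standard_normal(q))  # exactly rank one
lam, batch = 0.01, 10

x = 0.1 * rng.standard_normal(p)
y = 0.1 * rng.standard_normal(q)

def objective(x, y):
    return (0.5 * np.sum((M - np.outer(x, y)) ** 2)
            + lam * (np.abs(x).sum() + np.abs(y).sum()))

obj_init = objective(x, y)
for _ in range(300):
    # x-step: minibatch of columns; step 1/L_x with L_x = scaled ||y_J||^2.
    J = rng.choice(q, size=batch, replace=False)
    gx = (q / batch) * (np.outer(x, y[J]) - M[:, J]) @ y[J]
    sx = 1.0 / ((q / batch) * np.dot(y[J], y[J]) + 1e-8)
    x = prox_l1(x - sx * gx, sx * lam)
    # y-step: minibatch of rows, symmetric to the x-step.
    I = rng.choice(p, size=batch, replace=False)
    gy = (p / batch) * (np.outer(x[I], y) - M[I, :]).T @ x[I]
    sy = 1.0 / ((p / batch) * np.dot(x[I], x[I]) + 1e-8)
    y = prox_l1(y - sy * gy, sy * lam)

print(obj_init, objective(x, y))
```

Each block update is a cheap proximal gradient step on one variable with the other held fixed; replacing `gx`/`gy` with variance-reduced estimators is what gives SPRING its improved convergence guarantees.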
Image Deconvolution • Stochastic Optimization • Optimization and Control (90C26)
no code implementations • 22 Oct 2019 • Junqi Tang, Karen Egiazarian, Mohammad Golbabaee, Mike Davies
We investigate this phenomenon and propose a theory-inspired mechanism for the practitioners to efficiently characterize whether it is beneficial for an inverse problem to be solved by stochastic optimization techniques or not.