no code implementations • 29 Aug 2024 • Oscar Leong, Eliza O'Reilly, Yong Sheng Soh
Variational regularization is a classical technique for solving statistical inference tasks and inverse problems; modern data-driven approaches that parameterize regularizers via deep neural networks have shown impressive empirical performance.
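To make the setup concrete, here is a minimal sketch of variational regularization with the classical Tikhonov choice $R(x) = \tfrac{1}{2}\|x\|^2$; a data-driven method would swap in the gradient of a learned neural regularizer. All names are illustrative, not from the paper.

```python
import numpy as np

def variational_recover(A, y, lam=0.1, step=1e-2, iters=500):
    """Solve min_x 0.5*||Ax - y||^2 + lam * R(x) by gradient descent.

    Here R(x) = 0.5*||x||^2 (Tikhonov); a data-driven approach would
    replace `reg_grad` with the gradient of a learned regularizer.
    """
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        data_grad = A.T @ (A @ x - y)   # gradient of the fidelity term
        reg_grad = x                    # gradient of R(x) = 0.5*||x||^2
        x -= step * (data_grad + lam * reg_grad)
    return x

# Toy usage: recover a signal from noisy linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = rng.standard_normal(20)
y = A @ x_true + 0.01 * rng.standard_normal(50)
x_hat = variational_recover(A, y)
```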
2 code implementations • 29 May 2024 • Yasi Zhang, Peiyu Yu, Yaxuan Zhu, Yingshan Chang, Feng Gao, Ying Nian Wu, Oscar Leong
We validate our approach for various linear inverse problems, such as super-resolution, deblurring, inpainting, and compressed sensing, and demonstrate that we can outperform other methods based on flow matching.
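As a hedged illustration of how flow-based solvers for such linear inverse problems are commonly structured (a generic sketch, not necessarily the authors' algorithm; `velocity` stands in for a hypothetical pretrained flow-matching network):

```python
import torch

def solve_linear_inverse(velocity, A, y, steps=100, guidance=1.0):
    """Generic sketch: integrate a pretrained flow-matching ODE from
    noise toward data, nudging each step toward consistency with y = Ax.
    `velocity(x, t)` is assumed to be a pretrained network.
    """
    x = torch.randn(A.shape[1])          # start at the noise endpoint
    dt = 1.0 / steps
    for i in range(steps):
        t = torch.tensor(i * dt)
        x = x + dt * velocity(x, t)      # Euler step along the learned flow
        x = x.detach().requires_grad_(True)
        loss = 0.5 * torch.sum((A @ x - y) ** 2)   # data fidelity
        (grad,) = torch.autograd.grad(loss, x)
        x = (x - guidance * dt * grad).detach()    # consistency correction
    return x
```

The same loop covers super-resolution, deblurring, inpainting, and compressed sensing; only the linear operator $A$ changes.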
no code implementations • 30 Mar 2024 • Sreemanti Dey, Snigdha Saha, Berthy T. Feng, Manxiu Cui, Laure Delisle, Oscar Leong, Lihong V. Wang, Katherine L. Bouman
Photoacoustic tomography (PAT) is a rapidly evolving medical imaging modality that combines optical absorption contrast with ultrasound imaging depth.
no code implementations • 12 Apr 2023 • Oscar Leong, Angela F. Gao, He Sun, Katherine L. Bouman
We show that such a set of inverse problems can be solved simultaneously without the use of a spatial image prior by instead inferring a shared image generator with a low-dimensional latent space.
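A hedged sketch of that idea: jointly fit shared generator weights and one low-dimensional latent code per observation, with no pretrained spatial prior. The function names and squared-error objective below are assumptions for illustration; `G` is an untrained `torch.nn.Module` decoder.

```python
import torch

def fit_shared_generator(G, forward_ops, measurements, latent_dim=16,
                         iters=2000, lr=1e-3):
    """Jointly optimize generator weights and per-observation latents:
    min_{theta, z_i} sum_i || A_i G_theta(z_i) - y_i ||^2.
    """
    zs = [torch.randn(latent_dim, requires_grad=True) for _ in measurements]
    opt = torch.optim.Adam(list(G.parameters()) + zs, lr=lr)
    for _ in range(iters):
        opt.zero_grad()
        loss = sum(torch.sum((A(G(z)) - y) ** 2)
                   for A, y, z in zip(forward_ops, measurements, zs))
        loss.backward()
        opt.step()
    return G, zs
```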
no code implementations • 21 Mar 2023 • Angela F. Gao, Oscar Leong, He Sun, Katherine L. Bouman
We show that such a set of inverse problems can be solved simultaneously by learning a shared image generator with a low-dimensional latent space.
no code implementations • 27 Dec 2022 • Oscar Leong, Eliza O'Reilly, Yong Sheng Soh, Venkat Chandrasekaran
In this paper, we seek a systematic understanding of the power and the limitations of convex regularization by investigating the following question: Given a distribution, what is the optimal regularizer for data drawn from the distribution?
no code implementations • 2 Nov 2022 • Rohun Agrawal, Oscar Leong
Phase retrieval is the nonlinear inverse problem of recovering a true signal from its Fourier magnitude measurements.
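For context, a sketch of the classical error-reduction (Gerchberg-Saxton/Fienup-style) baseline for this problem, which alternates between enforcing the known Fourier magnitudes and a support constraint; this is a standard baseline, not the paper's method:

```python
import numpy as np

def error_reduction(mags, support, iters=200, seed=0):
    """Classical error-reduction for phase retrieval: given Fourier
    magnitudes `mags` and a binary support mask, alternate projections.
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(mags.shape) * support
    for _ in range(iters):
        X = np.fft.fft2(x)
        X = mags * np.exp(1j * np.angle(X))      # impose known magnitudes
        x = np.real(np.fft.ifft2(X)) * support   # impose support / realness
    return x
```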
no code implementations • 31 Oct 2020 • Paul Hand, Oscar Leong, Vladislav Voroninski
We establish local convergence of subgradient descent with optimal sample complexity based on the uniform concentration of a random, discontinuous matrix-valued operator arising from the objective's gradient dynamics.
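A minimal sketch of the subgradient-descent iteration such an analysis concerns, run here on an illustrative nonsmooth amplitude-based loss rather than the paper's exact objective:

```python
import numpy as np

def subgradient_descent(A, b, step=1e-3, iters=1000):
    """Subgradient descent on f(x) = 0.5*|| |Ax| - b ||^2, which is
    nonsmooth wherever a measurement Ax crosses zero.
    """
    x = np.random.default_rng(0).standard_normal(A.shape[1])
    for _ in range(iters):
        Ax = A @ x
        # sign(Ax) selects a valid subgradient at the nonsmooth points
        g = A.T @ ((np.abs(Ax) - b) * np.sign(Ax))
        x -= step * g
    return x
```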
no code implementations • 24 Aug 2020 • Paul Hand, Oscar Leong, Vladislav Voroninski
Advances in compressive sensing provided reconstruction algorithms for recovering sparse signals from linear measurements with optimal sample complexity, but natural extensions of this methodology to nonlinear inverse problems have been met with potentially fundamental sample complexity bottlenecks.
no code implementations • NeurIPS Workshop Deep Inverse 2019 • Oscar Leong, Wesam Sakla
In particular, how can one use the availability of a small amount of data (even $5$-$25$ examples) to one's advantage in solving these inverse problems, and can a system's performance increase as the amount of available data grows?
1 code implementation • 28 May 2019 • Muhammad Asim, Max Daniels, Oscar Leong, Ali Ahmed, Paul Hand
For compressive sensing, invertible priors can yield higher accuracy than sparsity priors across almost all undersampling ratios. Because they have no representation error, invertible priors can also yield better reconstructions than GAN priors for images with rare features of variation within the biased training set, including out-of-distribution natural images.
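A hedged sketch of reconstruction with an invertible (normalizing-flow) prior: since the generator is bijective, every candidate image has a latent preimage, so one can optimize the full-dimensional latent against the measurements. The `flow` model and the likelihood weight `gamma` are assumptions for illustration.

```python
import torch

def recover_with_invertible_prior(flow, A, y, dim, gamma=1e-3,
                                  iters=1000, lr=1e-2):
    """min_z || A flow(z) - y ||^2 + gamma * ||z||^2.

    No representation error: the bijective `flow` can reach any image;
    the ||z||^2 term softly prefers high-likelihood reconstructions.
    """
    z = torch.zeros(dim, requires_grad=True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(iters):
        opt.zero_grad()
        loss = torch.sum((A @ flow(z) - y) ** 2) + gamma * torch.sum(z ** 2)
        loss.backward()
        opt.step()
    return flow(z).detach()
```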
no code implementations • NeurIPS 2018 • Paul Hand, Oscar Leong, Vladislav Voroninski
Our formulation has provably favorable global geometry for gradient methods, as soon as $m = O(kd^2\log n)$, where $d$ is the depth of the network.
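Schematically, the reconstruction here is posed over the latent space of a depth-$d$ generative network $G: \mathbb{R}^k \to \mathbb{R}^n$; the display below is a common way to write the amplitude-based empirical risk and is meant as notation-level illustration, since the paper's exact formulation may differ:

```latex
\min_{z \in \mathbb{R}^k} \; f(z) = \tfrac{1}{2}
  \big\| \, |A G(z)| - |A G(z_\star)| \, \big\|_2^2,
\qquad m = O(k d^2 \log n).
```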