no code implementations • 1 Jan 2021 • Shehryar Malik, Usman Anwar, Alireza Aghasi, Ali Ahmed
In this work, given a reward function and a set of demonstrations from an expert that maximizes this reward while respecting unknown constraints, we propose a framework to learn the most likely constraints that the expert respects.
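The core idea of this entry — given a known reward and expert demonstrations, infer which constraints the expert must be obeying — can be illustrated in a toy one-step setting. This is only a sketch of the intuition, not the paper's actual algorithm; all names here are illustrative:

```python
import itertools

# Toy one-step MDP: actions with known rewards; an expert that obeys
# unknown constraints picks the highest-reward *allowed* action.
rewards = {"a": 3.0, "b": 2.0, "c": 1.0}

def most_likely_constraint(rewards, expert_action):
    """Return the smallest set of forbidden actions consistent with the
    expert maximizing reward subject to those constraints."""
    actions = sorted(rewards, key=rewards.get, reverse=True)
    for size in range(len(actions)):
        for forbidden in itertools.combinations(actions, size):
            allowed = [a for a in actions if a not in forbidden]
            if allowed and max(allowed, key=rewards.get) == expert_action:
                return set(forbidden)
    return set()

# An expert choosing "b" despite "a" paying more is best explained by
# "a" being forbidden.
print(most_likely_constraint(rewards, "b"))  # {'a'}
```

The smallest-set search encodes a simplicity prior: among all constraint sets that explain the demonstration, prefer the one forbidding the fewest actions.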
1 code implementation • 19 Nov 2020 • Usman Anwar, Shehryar Malik, Alireza Aghasi, Ali Ahmed
However, for the real world deployment of reinforcement learning (RL), it is critical that RL agents are aware of these constraints, so that they can act safely.
no code implementations • 13 May 2020 • Fahad Shamshad, Asif Hanif, Ali Ahmed
Recently, pretrained generative models have shown promising results for subsampled Fourier Ptychography (FP) in terms of reconstruction quality at extremely low sampling rates and high noise levels.
no code implementations • ICLR Workshop DeepDiffEq 2019 • Shehryar Malik, Usman Anwar, Ali Ahmed, Alireza Aghasi
Recently, there has been a lot of interest in using neural networks for solving partial differential equations.
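The collocation idea underlying neural-network PDE solvers — choose a parametric ansatz and minimize the equation residual at sample points — can be sketched with a polynomial ansatz standing in for the network. This is a toy illustration of the principle, not the method from the entry above:

```python
import numpy as np

# Solve u'(t) = u(t), u(0) = 1 on [0, 1] by least-squares collocation,
# using a polynomial ansatz u(t) = sum_k c_k t^k as a stand-in for a network.
deg = 8
t = np.linspace(0.0, 1.0, 50)

# The residual u'(t) - u(t) is linear in the coefficients c_k:
# column k holds k*t^(k-1) - t^k evaluated at the collocation points.
cols = [-np.ones_like(t)]                       # k = 0: derivative 0, minus t^0
cols += [k * t**(k - 1) - t**k for k in range(1, deg + 1)]
A = np.stack(cols, axis=1)
b = np.zeros_like(t)

# Enforce the initial condition u(0) = c_0 = 1 as a heavily weighted row.
ic_row = np.zeros(deg + 1)
ic_row[0] = 1.0
c, *_ = np.linalg.lstsq(np.vstack([A, 100.0 * ic_row]),
                        np.append(b, 100.0), rcond=None)

u1 = c.sum()                                    # u(1) = sum_k c_k, close to e
```

With a neural network in place of the polynomial, the residual is no longer linear in the parameters and the least-squares solve is replaced by gradient descent, but the loss being minimized has the same collocation structure.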
no code implementations • 28 Feb 2020 • Fahad Shamshad, Ali Ahmed
In this paper, we consider the highly ill-posed problem of jointly recovering two real-valued signals from the phaseless measurements of their circular convolution.
no code implementations • NeurIPS Workshop Deep_Invers 2019 • Muhammad Asim, Fahad Shamshad, Ali Ahmed
In this work, we show that this strong prior, enforced by the structure of a ConvNet, can be augmented with the information that recurs in different patches of a natural image to boost the performance.
no code implementations • NeurIPS Workshop Deep_Invers 2019 • Fahad Shamshad, Asif Hanif, Ali Ahmed
Recently, pretrained generative models have shown promising results for subsampled Fourier Ptychography (FP) in terms of reconstruction quality at extremely low sampling rates and high noise levels.
1 code implementation • 20 Aug 2019 • Muhammad Asim, Fahad Shamshad, Ali Ahmed
This paper proposes a novel approach to regularize the ill-posed blind image deconvolution (blind image deblurring) problem using deep generative networks.
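The structure of the generative-prior blind deconvolution problem — a bilinear measurement of an image code and a blur code — can be sketched with linear toy "generators" and alternating least squares. This is an illustrative stand-in, not the paper's algorithm (which uses deep generators and gradient descent):

```python
import numpy as np

rng = np.random.default_rng(2)
n, di, dk = 16, 4, 3                  # signal length, latent dimensions
Gi = rng.standard_normal((n, di))     # toy linear "image generator"
Gk = rng.standard_normal((n, dk))     # toy linear "blur generator"

def cconv(a, b):
    # circular convolution via FFT
    return np.fft.irfft(np.fft.rfft(a) * np.fft.rfft(b), n)

zi_true, zk_true = rng.standard_normal(di), rng.standard_normal(dk)
y = cconv(Gi @ zi_true, Gk @ zk_true)  # blurred observation

def linmap(G, other):
    # circular convolution is bilinear, so z -> cconv(G @ z, other)
    # is linear in z; build its matrix column by column
    return np.stack([cconv(g, other) for g in G.T], axis=1)

# Alternating least squares over the two latent codes.
zi, zk = rng.standard_normal(di), rng.standard_normal(dk)
resid0 = np.linalg.norm(cconv(Gi @ zi, Gk @ zk) - y)
for _ in range(50):
    zk = np.linalg.lstsq(linmap(Gk, Gi @ zi), y, rcond=None)[0]
    zi = np.linalg.lstsq(linmap(Gi, Gk @ zk), y, rcond=None)[0]
resid = np.linalg.norm(cconv(Gi @ zi, Gk @ zk) - y)
```

Because each subproblem is an exact least-squares solve, the residual is non-increasing; the inherent scale ambiguity (zi scaled up, zk scaled down) does not affect the fit to y.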
1 code implementation • 28 May 2019 • Muhammad Asim, Max Daniels, Oscar Leong, Ali Ahmed, Paul Hand
For compressive sensing, invertible priors can yield higher accuracy than sparsity priors across almost all undersampling ratios. Because they have no representation error, invertible priors can also yield better reconstructions than GAN priors for images with rare features of variation within the biased training set, including out-of-distribution natural images.
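The "no representation error" property can be sketched with an invertible linear map standing in for a normalizing flow: every signal has an exact latent code, so recovery reduces to a regularized fit in latent space. A minimal sketch under these toy assumptions (real flows are nonlinear, and the estimate is found by gradient descent rather than in closed form):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 20, 12                        # signal dim, number of measurements
W = rng.standard_normal((n, n))      # invertible linear map, stand-in for a flow f
A = rng.standard_normal((m, n)) / np.sqrt(m)

z_true = rng.standard_normal(n)
x_true = W @ z_true                  # every x has an exact latent: no representation error
y = A @ x_true                       # undersampled measurements (m < n)

# MAP-style estimate under a Gaussian latent prior:
#   minimize ||A W z - y||^2 + gamma * ||z||^2
# which has a closed form since W is linear here.
gamma = 1e-3
M = A @ W
z_hat = np.linalg.solve(M.T @ M + gamma * np.eye(n), M.T @ y)
x_hat = W @ z_hat

resid = np.linalg.norm(A @ x_hat - y)
```

Contrast with a GAN prior: a low-dimensional generator cannot represent every x exactly, so images outside its range incur an irreducible representation error before recovery even begins.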
1 code implementation • 22 Dec 2018 • Fahad Shamshad, Farwa Abbas, Ali Ahmed
This paper proposes a novel framework to regularize the highly ill-posed and non-linear Fourier ptychography problem using generative models.
1 code implementation • 29 Nov 2018 • Fahad Shamshad, Muhammad Awais, Muhammad Asim, Zain ul Aabidin Lodhi, Muhammad Umair, Ali Ahmed
Among the plethora of techniques devised to curb the prevalence of noise in medical images, deep learning-based approaches have shown the most promise.
1 code implementation • 17 Aug 2018 • Fahad Shamshad, Ali Ahmed
This paper proposes a new framework to regularize the highly ill-posed and non-linear phase retrieval problem through deep generative priors, using a simple gradient descent algorithm.
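The gradient-descent-over-the-latent idea can be sketched with a linear toy "generator" in place of a deep network: search the latent space for a code whose generated signal matches the magnitude-only measurements. A hedged sketch of the principle, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, k = 30, 60, 5                  # signal dim, measurements, latent dim
G = rng.standard_normal((n, k))      # toy linear "generator": x = G @ z
A = rng.standard_normal((m, n)) / np.sqrt(m)

z_true = rng.standard_normal(k)
y = np.abs(A @ G @ z_true)           # phaseless (magnitude-only) measurements

# Minimize f(z) = || |A G z| - y ||^2 by plain gradient descent over the latent.
M = A @ G
step = 0.5 / np.linalg.norm(M, 2) ** 2   # conservative step from a Lipschitz bound
z = 0.1 * rng.standard_normal(k)
loss0 = np.sum((np.abs(M @ z) - y) ** 2)
for _ in range(2000):
    v = M @ z
    # gradient of the amplitude loss: 2 M^T ((|v| - y) * sign(v))
    z -= step * 2 * M.T @ ((np.abs(v) - y) * np.sign(v))
loss = np.sum((np.abs(M @ z) - y) ** 2)
```

Note the global sign ambiguity: z and -z produce identical magnitudes, so the loss can vanish at either solution; constraining z to the range of a trained generator is what makes the heavily undersampled regime tractable in the paper's setting.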
1 code implementation • 12 Feb 2018 • Muhammad Asim, Fahad Shamshad, Ali Ahmed
This paper proposes a novel approach to regularize the ill-posed and non-linear blind image deconvolution (blind deblurring) problem using deep generative networks as priors.
1 code implementation • 21 Nov 2012 • Ali Ahmed, Benjamin Recht, Justin Romberg
That is, we show that if $\boldsymbol{x}$ is drawn from a random subspace of dimension $N$, and $\boldsymbol{w}$ is a vector in a subspace of dimension $K$ whose basis vectors are "spread out" in the frequency domain, then nuclear norm minimization recovers $\boldsymbol{w}\boldsymbol{x}^*$ without error.
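The lifting step behind this result — circular convolution of w and x is a set of linear measurements of the rank-1 matrix wx^*, which nuclear norm minimization then recovers — can be checked numerically. This sketch verifies the lifting identity only, not the recovery algorithm:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 8
w, x = rng.standard_normal(n), rng.standard_normal(n)

# Direct circular convolution via FFT.
y = np.fft.irfft(np.fft.rfft(w) * np.fft.rfft(x), n)

# The same values, viewed as linear functionals of the lifted rank-1
# matrix X = w x^T:  y[i] = <X, E_i>  with  E_i[j, (i-j) mod n] = 1.
X = np.outer(w, x)
y_lifted = np.array([sum(X[j, (i - j) % n] for j in range(n))
                     for i in range(n)])

print(np.allclose(y, y_lifted))  # True
```

Because the measurements are linear in X, the bilinear (non-convex) blind deconvolution problem becomes low-rank matrix recovery, and the nuclear norm is its standard convex surrogate.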
Information Theory