no code implementations • 2 Nov 2022 • Zhong Zhuang, David Yang, Felix Hofmann, David Barmherzig, Ju Sun
Phase retrieval (PR) concerns the recovery of lost complex phases from magnitude-only measurements.
no code implementations • 21 Oct 2022 • Le Peng, Yash Travadi, Rui Zhang, Ying Cui, Ju Sun
We propose handling imbalanced classification by regrouping majority classes into smaller classes, turning the problem into balanced multiclass classification.
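As a rough illustration of the regrouping idea (the clustering choice, class sizes, and helper names below are illustrative assumptions, not the paper's exact procedure), one can split the majority class into roughly minority-sized pseudo-classes and train an ordinary multiclass classifier:

```python
# Illustrative sketch: split the majority class into roughly minority-sized
# pseudo-classes so that a standard multiclass classifier sees a balanced problem.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

def regroup_and_train(X, y, minority_label=1, majority_label=0, seed=0):
    X_min, X_maj = X[y == minority_label], X[y == majority_label]
    k = max(1, len(X_maj) // max(1, len(X_min)))          # number of pseudo-classes
    pseudo = KMeans(n_clusters=k, random_state=seed).fit_predict(X_maj) + 1
    X_bal = np.vstack([X_min, X_maj])                     # minority keeps label 0,
    y_bal = np.concatenate([np.zeros(len(X_min), dtype=int), pseudo])  # majority -> 1..k
    return LogisticRegression(max_iter=1000).fit(X_bal, y_bal)

def predict_binary(clf, X):
    # Any pseudo-class (label >= 1) maps back to the original majority class.
    return np.where(clf.predict(X) == 0, 1, 0)            # 1 = minority, 0 = majority
```

At prediction time the pseudo-classes are simply merged back, so the regrouping only changes how the training problem is posed.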
no code implementations • 3 Oct 2022 • Buyun Liang, Tim Mitchell, Ju Sun
Imposing explicit constraints is relatively new but increasingly pressing in deep learning, stimulated by, e.g., trustworthy AI that performs robust optimization over complicated perturbation sets and scientific applications that need to respect physical laws and constraints.
no code implementations • 2 Oct 2022 • Hengyue Liang, Buyun Liang, Ying Cui, Tim Mitchell, Ju Sun
Empirical evaluation of deep learning models against adversarial attacks entails solving nontrivial constrained optimization problems.
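Concretely, the robustness-evaluation problems alluded to here are usually written as constrained programs of the following standard forms (generic notation, not quoted from the paper): a max-loss attack within a perturbation budget, or a min-distortion attack subject to misclassification.

```latex
% Max-loss attack (left) and min-distortion attack (right), generic notation:
\max_{\boldsymbol\delta}\; \ell\big(f_{\boldsymbol\theta}(\mathbf x + \boldsymbol\delta),\, y\big)
  \quad \text{s.t.} \quad \|\boldsymbol\delta\| \le \varepsilon,
\qquad\qquad
\min_{\boldsymbol\delta}\; \|\boldsymbol\delta\|
  \quad \text{s.t.} \quad \arg\max_i f_{\boldsymbol\theta}(\mathbf x + \boldsymbol\delta)_i \ne y .
```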
no code implementations • 18 Aug 2022 • Zhong Zhuang, Taihui Li, Hengkang Wang, Ju Sun
Blind image deblurring (BID) has been extensively studied in computer vision and adjacent fields.
1 code implementation • 11 Dec 2021 • Hengkang Wang, Taihui Li, Zhong Zhuang, Tiancong Chen, Hengyue Liang, Ju Sun
In this regard, the majority of DIP works for vision tasks only demonstrate the potential of the models -- reporting the peak performance against the ground truth -- but provide no clue about how to operationally obtain near-peak performance without access to the ground truth.
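One ground-truth-free heuristic in this spirit -- tracking how much recent reconstructions still fluctuate and stopping once that fluctuation bottoms out -- can be sketched as follows (the window size, patience, and stopping rule are illustrative assumptions, not the paper's exact detector):

```python
# Hedged sketch of ground-truth-free early stopping for DIP-style iterative
# reconstruction: monitor the variance of recent outputs over a sliding window
# and keep the iterate at which that variance is smallest.
import numpy as np
from collections import deque

def run_with_early_stopping(step_fn, n_iters=5000, window=100, patience=500):
    """step_fn(t) -> current reconstruction as a numpy array."""
    history = deque(maxlen=window)
    best_var, best_iter, best_img = np.inf, 0, None
    for t in range(n_iters):
        img = step_fn(t)
        history.append(img.ravel())
        if len(history) == window:
            # per-pixel variance over the window, averaged over pixels
            var = np.mean(np.var(np.stack(history), axis=0))
            if var < best_var:
                best_var, best_iter, best_img = var, t, img.copy()
            elif t - best_iter > patience:       # no improvement for a while
                break
    return best_img, best_iter
```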
1 code implementation • 27 Nov 2021 • Buyun Liang, Tim Mitchell, Ju Sun
GRANSO is among the first optimization solvers targeting general nonsmooth, nonconvex (NCVX) problems with nonsmooth constraints, but because it is implemented in MATLAB and requires the user to provide analytical gradients, it is often not a convenient choice in machine learning (especially deep learning) applications.
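The analytical-gradient burden mentioned above is precisely what automatic differentiation removes; a minimal PyTorch illustration of the mechanism (this is not the PyGRANSO interface, only the underlying idea):

```python
# With autodiff, the user supplies only function values; (sub)gradients,
# even of nonsmooth expressions, are produced automatically.
import torch

x = torch.randn(5, requires_grad=True)
f = torch.norm(x, p=1) + torch.max(x) ** 2   # a nonsmooth objective
f.backward()                                 # a (sub)gradient via autograd
print(x.grad)
```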
2 code implementations • 23 Oct 2021 • Taihui Li, Zhong Zhuang, Hengyue Liang, Le Peng, Hengkang Wang, Ju Sun
Recent works have shown the surprising effectiveness of deep generative models in solving numerous image reconstruction (IR) tasks, even without training data.
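A minimal deep-image-prior-style sketch of what "solving IR without training data" means in practice: an untrained network is fitted to a single set of measurements (the tiny architecture, the forward_op argument, and all hyperparameters below are illustrative assumptions):

```python
# Hedged sketch: fit an untrained CNN g_theta to one set of measurements
# y = A(x); the network structure itself acts as the prior.
import torch
import torch.nn as nn

def reconstruct(y, forward_op, n_iters=2000, lr=1e-3, shape=(1, 1, 64, 64)):
    z = torch.randn(shape)                       # fixed random input
    net = nn.Sequential(
        nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
        nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
        nn.Conv2d(32, 1, 3, padding=1),
    )
    opt = torch.optim.Adam(net.parameters(), lr=lr)
    for _ in range(n_iters):
        opt.zero_grad()
        x_hat = net(z)
        loss = ((forward_op(x_hat) - y) ** 2).mean()   # data-fit only, no training set
        loss.backward()
        opt.step()
    return x_hat.detach()
```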
no code implementations • 9 Jun 2021 • Kshitij Tayal, Raunak Manekar, Zhong Zhuang, David Yang, Vipin Kumar, Felix Hofmann, Ju Sun
Several deep learning methods for phase retrieval exist, but most of them fail on realistic data without precise support information.
1 code implementation • 9 Jun 2021 • Le Peng, Hengyue Liang, Gaoxiang Luo, Taihui Li, Ju Sun
For example, on the BIMCV COVID-19 classification dataset, we obtain improved performance with around $1/4$ model size and $2/3$ inference time compared to the standard full TL model.
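A rough sketch of the kind of truncated transfer-learning model behind such numbers (the backbone, cut point, and head below are illustrative choices, not the paper's exact configuration):

```python
# Hedged sketch of "truncated" transfer learning: keep only the early blocks
# of a pretrained backbone and attach a small head, which shrinks model size
# and inference time relative to full fine-tuning.
import torch.nn as nn
from torchvision.models import resnet18

backbone = resnet18(weights="IMAGENET1K_V1")
truncated = nn.Sequential(                      # conv1 .. layer2 only
    backbone.conv1, backbone.bn1, backbone.relu, backbone.maxpool,
    backbone.layer1, backbone.layer2,
)
head = nn.Sequential(
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(128, 2),                          # layer2 of resnet18 outputs 128 channels
)
model = nn.Sequential(truncated, head)
```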
no code implementations • 3 Jun 2021 • Ju Sun, Le Peng, Taihui Li, Dyah Adila, Zach Zaiman, Genevieve B. Melton, Nicholas Ingraham, Eric Murray, Daniel Boley, Sean Switzer, John L. Burns, Kun Huang, Tadashi Allen, Scott D. Steenburg, Judy Wawira Gichoya, Erich Kummerfeld, Christopher Tignanelli
Conclusions and Relevance: AI-based diagnostic tools may serve as an adjunct, but not replacement, for clinical decision support of COVID-19 diagnosis, which largely hinges on exposure history, signs, and symptoms.
no code implementations • 24 May 2021 • David A. Barmherzig, Ju Sun
A new algorithmic framework is presented for holographic phase retrieval via maximum likelihood optimization, which allows for practical and robust image reconstruction.
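For concreteness, a Poisson negative log-likelihood of the type such maximum-likelihood frameworks minimize (notation assumed here, not quoted from the paper): $x$ is the unknown specimen, $r$ the known reference, and $y_k$ the recorded photon counts.

```latex
\min_{x}\;\sum_{k}\Big(\big|\mathcal{F}[x,\,r]_k\big|^{2}
        \;-\; y_k \log \big|\mathcal{F}[x,\,r]_k\big|^{2}\Big)
```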
no code implementations • 23 Oct 2020 • Raunak Manekar, Zhong Zhuang, Kshitij Tayal, Vipin Kumar, Ju Sun
Phase retrieval (PR) consists of estimating 2D or 3D objects from their Fourier magnitudes and takes a central place in scientific imaging.
no code implementations • 23 Oct 2020 • Kshitij Tayal, Chieh-Hsin Lai, Raunak Manekar, Zhong Zhuang, Vipin Kumar, Ju Sun
In many physical systems, inputs related by intrinsic system symmetries generate the same output.
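In Fourier phase retrieval, for example, the standard intrinsic symmetries are global phase, spatial shift, and conjugate flip, all of which leave the Fourier magnitudes unchanged (a textbook fact, stated in generic notation):

```latex
\big|\widehat{x}\,\big|
  \;=\; \big|\widehat{e^{\mathrm{i}\theta} x}\,\big|
  \;=\; \big|\widehat{x(\cdot - t)}\,\big|
  \;=\; \big|\widehat{\overline{x(-\,\cdot)}}\,\big|
  \qquad \text{for all phases } \theta \text{ and shifts } t .
```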
no code implementations • 20 Mar 2020 • Kshitij Tayal, Chieh-Hsin Lai, Vipin Kumar, Ju Sun
In many physical systems, inputs related by intrinsic system symmetries are mapped to the same output.
no code implementations • 7 Feb 2019 • David A. Barmherzig, Ju Sun, Emmanuel J. Candès, T. J. Lane, Po-Nan Li
A new reference design is introduced for holographic coherent diffraction imaging.
1 code implementation • ICLR 2019 • Yu Bai, Qijia Jiang, Ju Sun
This paper concerns dictionary learning, i.e., sparse coding, a fundamental representation learning problem.
no code implementations • 10 Aug 2018 • Fangyu Zou, Li Shen, Zequn Jie, Ju Sun, Wei Liu
Integrating adaptive learning rate and momentum techniques into SGD leads to a large class of efficiently accelerated adaptive stochastic algorithms, such as Nadam, AccAdaGrad, etc.
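For reference, a concrete member of this class is Adam, whose update (in standard notation, not quoted from the paper) combines an exponential moving average of gradients with a coordinate-wise adaptive step size:

```latex
m_t = \beta_1 m_{t-1} + (1-\beta_1)\, g_t, \qquad
v_t = \beta_2 v_{t-1} + (1-\beta_2)\, g_t^{2}, \qquad
\theta_{t+1} = \theta_t - \frac{\eta}{\sqrt{\hat v_t} + \epsilon}\,\hat m_t,
```

with the bias corrections $\hat m_t = m_t/(1-\beta_1^t)$ and $\hat v_t = v_t/(1-\beta_2^t)$.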
no code implementations • 6 Dec 2017 • David Barmherzig, Ju Sun
While convergence of the Alternating Direction Method of Multipliers (ADMM) on convex problems is well studied, convergence on nonconvex problems is only partially understood.
1 code implementation • 22 Feb 2016 • Ju Sun, Qing Qu, John Wright
When the measurement vectors are generic (i.i.d. complex Gaussian) and the number of measurements is large enough ($m \ge C n \log^3 n$), then with high probability a natural least-squares formulation for GPR has the following benign geometric structure: (1) there are no spurious local minimizers, and all global minimizers are equal to the target signal $\mathbf x$, up to a global phase; and (2) the objective function has negative directional curvature around each saddle point.
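Up to notation, the natural least-squares formulation referenced above is (with $\mathbf a_k$ the measurement vectors and $y_k = |\mathbf a_k^* \mathbf x|$ the observed magnitudes; notation assumed, not quoted verbatim):

```latex
\min_{\mathbf z \in \mathbb{C}^n}\; f(\mathbf z)
  \;=\; \frac{1}{2m}\sum_{k=1}^{m}
  \Big( y_k^{2} - \big|\mathbf a_k^{*}\mathbf z\big|^{2}\Big)^{2}.
```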
no code implementations • 15 Nov 2015 • Ju Sun, Qing Qu, John Wright
We consider the problem of recovering a complete (i.e., square and invertible) matrix $\mathbf A_0$, from $\mathbf Y \in \mathbb{R}^{n \times p}$ with $\mathbf Y = \mathbf A_0 \mathbf X_0$, provided $\mathbf X_0$ is sufficiently sparse.
no code implementations • 11 Nov 2015 • Ju Sun, Qing Qu, John Wright
We give the first efficient algorithm that provably recovers $\mathbf A_0$ when $\mathbf X_0$ has $O(n)$ nonzeros per column, under a suitable probability model for $\mathbf X_0$.
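A sketch of the nonconvex formulation underlying this line of work (notation assumed, not quoted from the paper): each row of $\mathbf X_0$ is recovered as a sparse vector in the row space of $\mathbf Y$, with $h_\mu$ a smooth surrogate for the absolute value:

```latex
\min_{\mathbf q \in \mathbb{R}^{n}}\;
  \frac{1}{p}\sum_{j=1}^{p} h_\mu\!\big(\mathbf q^{\top}\bar{\mathbf Y}\mathbf e_j\big)
  \qquad \text{subject to} \qquad \|\mathbf q\|_2 = 1,
```

where $\bar{\mathbf Y}$ denotes a preconditioned version of $\mathbf Y$.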
3 code implementations • 21 Oct 2015 • Ju Sun, Qing Qu, John Wright
In this note, we focus on smooth nonconvex optimization problems that obey: (1) all local minimizers are also global; and (2) around any saddle point or local maximizer, the objective has a negative directional curvature.
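One common way to formalize properties (1) and (2) is a strict-saddle (or "ridable-saddle") condition; the constants and the exact form below are illustrative, not quoted from the note:

```latex
\text{for every } \mathbf x:\quad
\|\nabla f(\mathbf x)\| \ge \alpha,
\;\;\text{or}\;\;
\lambda_{\min}\!\big(\nabla^{2} f(\mathbf x)\big) \le -\beta,
\;\;\text{or}\;\;
\mathbf x \text{ lies within distance } \delta \text{ of a local minimizer,}
```

for some positive constants $\alpha, \beta, \delta$.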
1 code implementation • 26 Apr 2015 • Ju Sun, Qing Qu, John Wright
We consider the problem of recovering a complete (i.e., square and invertible) matrix $\mathbf A_0$, from $\mathbf Y \in \mathbb{R}^{n \times p}$ with $\mathbf Y = \mathbf A_0 \mathbf X_0$, provided $\mathbf X_0$ is sufficiently sparse.
1 code implementation • NeurIPS 2014 • Qing Qu, Ju Sun, John Wright
In this paper, we focus on a planted sparse model for the subspace: the target sparse vector is embedded in an otherwise random subspace.
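A standard formulation for pulling the planted sparse vector out of a subspace with spanning matrix $\mathbf Y$ (generic notation, not quoted from the paper):

```latex
\min_{\mathbf q}\; \big\|\mathbf Y\mathbf q\big\|_1
\qquad \text{subject to} \qquad \|\mathbf q\|_2 = 1 .
```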
no code implementations • 2 Aug 2012 • Ju Sun, Yuqian Zhang, John Wright
Motivated by vision tasks such as robust face and object recognition, we consider the following general problem: given a collection of low-dimensional linear subspaces in a high-dimensional ambient (image) space, and a query point (image), efficiently determine the nearest subspace to the query in $\ell^1$ distance.
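In formulas (generic notation assumed): with $\mathbf U_i$ a basis for subspace $\mathcal S_i$ and query $\mathbf x$, the task is

```latex
d_{\ell^1}(\mathbf x, \mathcal S_i)
  \;=\; \min_{\boldsymbol\beta}\, \big\|\mathbf x - \mathbf U_i \boldsymbol\beta\big\|_1,
\qquad
i^\star \;=\; \arg\min_i\; d_{\ell^1}(\mathbf x, \mathcal S_i).
```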
1 code implementation • 14 Oct 2010 • Guangcan Liu, Zhouchen Lin, Shuicheng Yan, Ju Sun, Yong Yu, Yi Ma
In this work we address the subspace recovery problem.