Search Results for author: Greg Ongie

Found 15 papers, 3 papers with code

Depth Separation in Norm-Bounded Infinite-Width Neural Networks

no code implementations13 Feb 2024 Suzanna Parkinson, Greg Ongie, Rebecca Willett, Ohad Shamir, Nathan Srebro

We also show that a similar statement in the reverse direction is not possible: any function learnable with polynomial sample complexity by a norm-controlled depth-2 ReLU network with infinite width is also learnable with polynomial sample complexity by a norm-controlled depth-3 ReLU network.

How do Minimum-Norm Shallow Denoisers Look in Function Space?

no code implementations NeurIPS 2023 Chen Zeno, Greg Ongie, Yaniv Blumenfeld, Nir Weinberger, Daniel Soudry

Neural network (NN) denoisers are an essential building block in many common tasks, ranging from image reconstruction to image generation.

Image Generation, Image Reconstruction

The Implicit Bias of Minima Stability in Multivariate Shallow ReLU Networks

no code implementations30 Jun 2023 Mor Shpigel Nacson, Rotem Mulayoff, Greg Ongie, Tomer Michaeli, Daniel Soudry

Finally, we prove that if a function is sufficiently smooth (in a Sobolev sense) then it can be approximated arbitrarily well using shallow ReLU networks that correspond to stable solutions of gradient descent.

Linear Neural Network Layers Promote Learning Single- and Multiple-Index Models

no code implementations24 May 2023 Suzanna Parkinson, Greg Ongie, Rebecca Willett

This paper explores the implicit bias of overparameterized neural networks with depth greater than two.

The Role of Linear Layers in Nonlinear Interpolating Networks

no code implementations2 Feb 2022 Greg Ongie, Rebecca Willett

This paper explores the implicit bias of overparameterized neural networks with depth greater than two.

A Function Space View of Bounded Norm Infinite Width ReLU Nets: The Multivariate Case

no code implementations ICLR 2020 Greg Ongie, Rebecca Willett, Daniel Soudry, Nathan Srebro

In this paper, we characterize the norm required to realize a function $f:\mathbb{R}^d\rightarrow\mathbb{R}$ as a single hidden-layer ReLU network with an unbounded number of units (infinite width), but where the Euclidean norm of the weights is bounded, including precisely characterizing which functions can be realized with finite norm.

Learning to Solve Linear Inverse Problems in Imaging with Neumann Networks

no code implementations NeurIPS Workshop Deep Inverse 2019 Greg Ongie, Davis Gilton, Rebecca Willett

Recent advances have illustrated that it is often possible to learn to solve linear inverse problems in imaging from training data, yielding solutions that can outperform more traditional regularized least squares approaches.
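For context on the baseline this line of work compares against, here is a minimal sketch of a regularized (Tikhonov) least squares solve for a linear inverse problem. The operator `A`, sizes, and regularization weight are all illustrative assumptions, not anything from the paper:

```python
import numpy as np

# Hypothetical forward operator A and noisy measurements y = A x + noise.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 20))
x_true = rng.standard_normal(20)
y = A @ x_true + 0.01 * rng.standard_normal(30)

# Tikhonov-regularized least squares: argmin_x ||Ax - y||^2 + lam ||x||^2,
# with the closed-form solution x = (A^T A + lam I)^{-1} A^T y.
lam = 0.1
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(20), A.T @ y)
```

Learned approaches replace the fixed quadratic penalty with a data-driven regularizer or an unrolled iterative scheme.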

Neumann Networks for Inverse Problems in Imaging

2 code implementations13 Jan 2019 Davis Gilton, Greg Ongie, Rebecca Willett

We present an end-to-end, data-driven method of solving inverse problems inspired by the Neumann series, which we call a Neumann network.

Deblurring
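The Neumann-series idea the network is named after can be illustrated with a purely linear toy example. This is only the classical truncated series for the least squares solution, run under a step size that makes it converge; it is not the authors' learned network, and all sizes and names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((40, 10))
x_true = rng.standard_normal(10)
y = A @ x_true  # noiseless measurements, for illustration only

# Truncated Neumann series: x ≈ sum_{k=0}^{K} (I - eta A^T A)^k (eta A^T y),
# which converges to the least squares solution when ||I - eta A^T A|| < 1.
eta = 1.0 / np.linalg.norm(A.T @ A, 2)  # step size ensuring convergence
B = np.eye(10) - eta * (A.T @ A)
term = eta * (A.T @ y)
x_hat = term.copy()
for _ in range(500):
    term = B @ term
    x_hat += term
```

A Neumann network replaces each series term's linear map with a learned component while keeping this summation structure.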

Tensor Methods for Nonlinear Matrix Completion

no code implementations26 Apr 2018 Greg Ongie, Daniel Pimentel-Alarcón, Laura Balzano, Rebecca Willett, Robert D. Nowak

This approach will succeed in many cases where traditional low-rank matrix completion (LRMC) is guaranteed to fail, because the data are low-rank in the tensorized representation but not in the original representation.

Low-Rank Matrix Completion

Algebraic Variety Models for High-Rank Matrix Completion

1 code implementation ICML 2017 Greg Ongie, Rebecca Willett, Robert D. Nowak, Laura Balzano

We consider a generalization of low-rank matrix completion to the case where the data belongs to an algebraic variety, i.e., each data point is a solution to a system of polynomial equations.

Clustering, Low-Rank Matrix Completion +1
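The lifted-rank phenomenon behind variety models can be checked numerically with a toy example of my own choosing (the unit circle, a quadratic variety): the raw data matrix is full rank, but a degree-2 monomial lift is rank-deficient, and the null direction encodes the polynomial constraint:

```python
import numpy as np

# Points on the unit circle x^2 + y^2 = 1: full rank in ambient coordinates.
t = np.linspace(0, 2 * np.pi, 50, endpoint=False)
X = np.vstack([np.cos(t), np.sin(t)])  # shape (2, 50)

# Degree-2 monomial lift: each column becomes [1, x, y, x^2, x*y, y^2].
L = np.vstack([np.ones_like(t), X[0], X[1], X[0]**2, X[0] * X[1], X[1]**2])

print(np.linalg.matrix_rank(X))  # 2: no linear structure in the raw data
print(np.linalg.matrix_rank(L))  # 5: one null direction = the circle equation
```

High-rank matrix completion in this setting exploits the low rank of the lifted matrix rather than of the original data.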

A Fast Algorithm for Convolutional Structured Low-Rank Matrix Recovery

3 code implementations23 Sep 2016 Greg Ongie, Mathews Jacob

Fourier domain structured low-rank matrix priors are emerging as powerful alternatives to traditional image recovery methods such as total variation and wavelet regularization.

Numerical Analysis, Optimization and Control
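The structured low-rank prior mentioned above can be illustrated in 1-D with an assumed toy signal (this is not the paper's 2-D algorithm): the uniform Fourier samples of a stream of k Diracs form a Hankel matrix of rank k, which is the structure such priors exploit:

```python
import numpy as np

# Fourier samples of two Diracs: x_hat[n] = c1*z1^n + c2*z2^n with |z| = 1.
n = np.arange(20)
z1, z2 = np.exp(2j * np.pi * 0.11), np.exp(2j * np.pi * 0.37)
x_hat = 1.5 * z1**n + 0.7 * z2**n

# Hankel matrix built from the samples: its rank equals the number of Diracs.
H = np.array([x_hat[i:i + 10] for i in range(10)])  # 10 x 10 Hankel matrix
print(np.linalg.matrix_rank(H))  # 2
```

Recovery algorithms then penalize the rank (or a convex surrogate) of this structured matrix instead of a pixel-domain regularizer such as total variation.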

Off-the-Grid Recovery of Piecewise Constant Images from Few Fourier Samples

no code implementations1 Oct 2015 Greg Ongie, Mathews Jacob

In the first stage we estimate a continuous domain representation of the edge set of the image.

Image Super-Resolution, Relation

Recovery of Piecewise Smooth Images from Few Fourier Samples

no code implementations3 Feb 2015 Greg Ongie, Mathews Jacob

We introduce a Prony-like method to recover a continuous domain 2-D piecewise smooth image from few of its Fourier samples.

Matrix Completion, Relation
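A minimal 1-D version of the Prony/annihilating-filter step may help make the snippet concrete. The paper's continuous-domain 2-D method is far more involved; this sketch, with frequencies and sizes chosen arbitrarily, only shows the core idea of recovering exponential frequencies from uniform samples:

```python
import numpy as np

# Samples of a sum of two complex exponentials with unknown frequencies.
f_true = np.array([0.12, 0.31])
n = np.arange(16)
y = sum(np.exp(2j * np.pi * f * n) for f in f_true)

# Prony: find an annihilating filter h = [1, h1, h2] of degree 2 such that
# y[m] + h1*y[m-1] + h2*y[m-2] = 0 for all m, by solving a linear system.
M = np.array([[y[m - 1], y[m - 2]] for m in range(2, 16)])
rhs = -np.array([y[m] for m in range(2, 16)])
h_tail, *_ = np.linalg.lstsq(M, rhs, rcond=None)

# The filter's roots lie at exp(2j*pi*f): read off the frequencies.
roots = np.roots(np.concatenate(([1.0], h_tail)))
f_est = np.sort(np.angle(roots) / (2 * np.pi))
```

From exact samples the recovered frequencies match `f_true` to machine precision; with noise, the linear system is typically solved in a least squares or low-rank sense.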

Super-resolution MRI Using Finite Rate of Innovation Curves

no code implementations8 Jan 2015 Greg Ongie, Mathews Jacob

We propose a two-stage algorithm for the super-resolution of MR images from their low-frequency k-space samples.

Super-Resolution

Iterative Non-Local Shrinkage Algorithm for MR Image Reconstruction

no code implementations15 May 2014 Yasir Q. Mohsin, Greg Ongie, Mathews Jacob

This approach is enabled by the reformulation of current non-local schemes as an alternating algorithm to minimize a global criterion.

Image Reconstruction
