no code implementations • NeurIPS 2021 • Jihun Yun, Aurelie C. Lozano, Eunho Yang
We consider the training of structured neural networks where the regularizer can be non-smooth and possibly non-convex.
no code implementations • 15 Jul 2020 • Jihun Yun, Aurelie C. Lozano, Eunho Yang
We propose a unified framework for stochastic proximal gradient descent, which we term ProxGen, that allows for arbitrary positive preconditioners and lower semi-continuous regularizers.
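A single step of such a method can be sketched as follows: take a preconditioned stochastic gradient step, then apply a (coordinate-wise scaled) proximal operator of the regularizer. This is a minimal illustration with an ℓ1 regularizer and a diagonal preconditioner; the function names and hyperparameters are hypothetical, not from the paper.

```python
import numpy as np

def prox_l1(z, thresh):
    """Soft-thresholding: the proximal operator of the l1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - thresh, 0.0)

def proximal_sgd_step(x, grad, precond, lr=0.1, lam=0.01):
    """One preconditioned stochastic proximal gradient step for an
    l1-regularized objective.

    `precond` is a positive diagonal preconditioner (e.g. Adam-style
    second-moment estimates); the prox threshold is rescaled
    coordinate-wise to match the preconditioned metric.
    """
    z = x - lr * grad / precond            # preconditioned gradient step
    return prox_l1(z, lr * lam / precond)  # scaled proximal step

x = np.array([0.5, -0.3, 0.05])
g = np.array([0.1, -0.2, 0.0])
x_new = proximal_sgd_step(x, g, precond=np.ones(3))
```

With the identity preconditioner this reduces to plain proximal SGD; a non-uniform preconditioner shrinks each coordinate by a different threshold.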
no code implementations • 26 May 2019 • Jihun Yun, Aurelie C. Lozano, Eunho Yang
Extensive experiments reveal that block-diagonal approaches achieve state-of-the-art results on several deep learning tasks, and can outperform adaptive diagonal methods, vanilla SGD, and a recently proposed variant of full-matrix adaptation.

no code implementations • CVPR 2016 • Jialei Wang, Peder A. Olsen, Andrew R. Conn, Aurelie C. Lozano
We consider the problem of removing and replacing clouds in satellite image sequences, which has a wide range of applications in remote sensing.
no code implementations • NeurIPS 2015 • Eunho Yang, Aurelie C. Lozano, Pradeep K. Ravikumar
We propose a class of closed-form estimators for GLMs under high-dimensional sampling regimes.
no code implementations • NeurIPS 2014 • Eunho Yang, Aurelie C. Lozano, Pradeep K. Ravikumar
We propose a class of closed-form estimators for sparsity-structured graphical models, expressed as exponential family distributions, under high-dimensional settings.
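In the Gaussian case, one simple instance of such a closed-form construction is: entrywise-threshold the sample covariance so it is well-conditioned, invert it, then soft-threshold the off-diagonal of the inverse to obtain a sparse precision estimate. The sketch below is illustrative only; the function name and tuning parameters `v` and `lam` are hypothetical, not taken from the paper.

```python
import numpy as np

def soft_threshold(a, lam):
    """Entrywise soft-thresholding."""
    return np.sign(a) * np.maximum(np.abs(a) - lam, 0.0)

def closed_form_precision_estimate(X, v=0.1, lam=0.1):
    """Closed-form sparse precision-matrix sketch (Gaussian setting):
    1) soft-threshold the off-diagonal of the sample covariance,
    2) invert the thresholded covariance,
    3) soft-threshold the off-diagonal of the inverse.
    No iterative optimization is involved, hence 'closed-form'."""
    sigma = np.cov(X, rowvar=False)
    t = soft_threshold(sigma, v)
    np.fill_diagonal(t, np.diag(sigma))   # keep variances intact
    theta = np.linalg.inv(t)
    est = soft_threshold(theta, lam)
    np.fill_diagonal(est, np.diag(theta))  # keep the diagonal intact
    return est
```

On independent data the off-diagonal sample covariances are small, so the estimate is (correctly) diagonal.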
no code implementations • 19 Feb 2014 • Aleksandr Y. Aravkin, Anju Kambadur, Aurelie C. Lozano, Ronny Luss
We consider new formulations and methods for sparse quantile regression in the high-dimensional setting.
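A standard building block here is the pinball ("check") loss, whose minimizer is the conditional τ-quantile; adding an ℓ1 penalty yields one common sparse quantile regression objective. The sketch below shows that objective under these standard definitions; the specific formulations studied in the paper differ, and the function names are hypothetical.

```python
import numpy as np

def pinball_loss(residual, tau):
    """Quantile ('check') loss: tau*r when r >= 0, (tau-1)*r otherwise.
    At tau = 0.5 this is half the absolute error (median regression)."""
    return np.where(residual >= 0, tau * residual, (tau - 1) * residual)

def sparse_quantile_objective(beta, X, y, tau=0.5, lam=0.1):
    """l1-penalized quantile regression objective: mean pinball loss of
    the residuals plus an l1 penalty encouraging sparse coefficients."""
    r = y - X @ beta
    return pinball_loss(r, tau).mean() + lam * np.abs(beta).sum()
```

Because both terms are piecewise-linear/convex, this objective can be minimized by linear programming or subgradient methods.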
no code implementations • NeurIPS 2011 • Vikas Sindhwani, Aurelie C. Lozano
We consider regularized risk minimization over a large dictionary of reproducing kernel Hilbert spaces (RKHSs), over which the target function has a sparse representation.
no code implementations • NeurIPS 2010 • Vikas Sindhwani, Aurelie C. Lozano
We consider multivariate regression problems involving high-dimensional predictor and response spaces.
no code implementations • NeurIPS 2009 • Grzegorz Swirszcz, Naoki Abe, Aurelie C. Lozano
We consider the problem of variable group selection for least squares regression, namely, that of selecting groups of variables for best regression performance, leveraging and adhering to a natural grouping structure within the explanatory variables.
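The classical way to enforce group-level selection is a group-lasso penalty, whose proximal operator shrinks each group's coefficient vector jointly (block soft-thresholding): a whole group is either kept or zeroed together. This is a generic sketch of that operator, not the paper's specific algorithm; names are illustrative.

```python
import numpy as np

def group_soft_threshold(beta, groups, thresh):
    """Proximal operator of the group-lasso penalty sum_g ||beta_g||_2:
    scale each group's block toward zero by its Euclidean norm, zeroing
    the entire group when its norm falls below `thresh`."""
    out = beta.copy()
    for g in groups:
        norm = np.linalg.norm(beta[g])
        scale = max(0.0, 1.0 - thresh / norm) if norm > 0 else 0.0
        out[g] = scale * beta[g]
    return out

beta = np.array([3.0, 4.0, 0.1])
groups = [[0, 1], [2]]          # grouping structure of the predictors
shrunk = group_soft_threshold(beta, groups, thresh=1.0)
```

Here the first group (norm 5) is uniformly shrunk, while the second group (norm 0.1) falls below the threshold and is eliminated as a whole, which is exactly the group-selection behavior the abstract describes.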