Search Results for author: Wooseok Ha

Found 10 papers, 3 papers with code

Variance-reduced Zeroth-Order Methods for Fine-Tuning Language Models

no code implementations • 11 Apr 2024 • Tanmay Gautam, Youngsuk Park, Hao Zhou, Parameswaran Raman, Wooseok Ha

Evaluated across a range of both masked and autoregressive LMs on benchmark GLUE tasks, MeZO-SVRG outperforms MeZO, with up to a 20% increase in test accuracy in both full- and partial-parameter fine-tuning settings.

In-Context Learning

Prominent Roles of Conditionally Invariant Components in Domain Adaptation: Theory and Algorithms

no code implementations • 19 Sep 2023 • Keru Wu, Yuansi Chen, Wooseok Ha, Bin Yu

Domain adaptation (DA) is a statistical learning problem that arises when the distribution of the source data used to train a model differs from that of the target data used to evaluate the model.

Domain Adaptation

The Effect of SGD Batch Size on Autoencoder Learning: Sparsity, Sharpness, and Feature Learning

no code implementations • 6 Aug 2023 • Nikhil Ghosh, Spencer Frei, Wooseok Ha, Bin Yu

On the other hand, for any batch size strictly smaller than the number of samples, SGD finds a global minimum which is sparse and nearly orthogonal to its initialization, showing that the randomness of stochastic gradients induces a qualitatively different type of "feature selection" in this setting.

feature selection

Interpreting and improving deep-learning models with reality checks

4 code implementations • 16 Aug 2021 • Chandan Singh, Wooseok Ha, Bin Yu

Recent deep-learning models have achieved impressive predictive performance by learning complex functions of many variables, often at the cost of interpretability.

Transformation Importance with Applications to Cosmology

2 code implementations • 4 Mar 2020 • Chandan Singh, Wooseok Ha, Francois Lanusse, Vanessa Boehm, Jia Liu, Bin Yu

Machine learning lies at the heart of new possibilities for scientific discovery, knowledge generation, and artificial intelligence.

Statistical guarantees for local graph clustering

no code implementations • 11 Jun 2019 • Wooseok Ha, Kimon Fountoulakis, Michael W. Mahoney

In this paper, we adopt a statistical perspective on local graph clustering, and we analyze the performance of the l1-regularized PageRank method (Fountoulakis et al.).

Clustering, Graph Clustering
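For context, the object being regularized is the personalized PageRank vector, whose mass concentrates around a seed node and thereby defines a local cluster. A minimal NumPy power-iteration sketch of plain (unregularized) personalized PageRank — this is background only, not the paper's l1-regularized formulation, and the toy graph is invented for illustration:

```python
import numpy as np

def personalized_pagerank(A, seed, alpha=0.15, iters=100):
    """Power iteration for p = alpha * s + (1 - alpha) * P^T p,
    where P is the row-stochastic random-walk matrix and s teleports to the seed."""
    n = A.shape[0]
    deg = A.sum(axis=1)
    P = A / deg[:, None]           # row-stochastic transition matrix
    s = np.zeros(n); s[seed] = 1.0
    p = s.copy()
    for _ in range(iters):
        p = alpha * s + (1 - alpha) * P.T @ p
    return p

# Two triangles joined by one edge: mass should stay near the seed's triangle.
A = np.zeros((6, 6))
for i, j in [(0,1),(1,2),(0,2),(3,4),(4,5),(3,5),(2,3)]:
    A[i, j] = A[j, i] = 1
p = personalized_pagerank(A, seed=0)
cluster = np.argsort(p)[::-1][:3]      # top-3 nodes form the local cluster
print(sorted(cluster.tolist()))        # -> [0, 1, 2]
```

The l1 penalty studied in the paper sparsifies this vector directly, so the solver only touches a neighborhood of the seed instead of the whole graph.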

An equivalence between critical points for rank constraints versus low-rank factorizations

no code implementations • 2 Dec 2018 • Wooseok Ha, Haoyang Liu, Rina Foygel Barber

Two common approaches in low-rank optimization problems are either working directly with a rank constraint on the matrix variable, or optimizing over a low-rank factorization so that the rank constraint is implicitly ensured.

Optimization and Control

Alternating minimization and alternating descent over nonconvex sets

no code implementations • 13 Sep 2017 • Wooseok Ha, Rina Foygel Barber

We analyze the performance of alternating minimization for loss functions optimized over two variables, where each variable may be restricted to lie in some potentially nonconvex constraint set.
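A concrete instance of the setting: alternate projected-gradient ("alternating descent") steps in each of two variables, where each projection lands on a nonconvex set such as the k-sparse vectors. A minimal NumPy sketch — the data model and sizes are invented for illustration and are not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 80, 30, 4

# Observation b = A x* + B y* with two k-sparse ground-truth vectors.
A = rng.standard_normal((n, d))
B = rng.standard_normal((n, d))
x_true = np.zeros(d); x_true[:k] = rng.standard_normal(k)
y_true = np.zeros(d); y_true[-k:] = rng.standard_normal(k)
b = A @ x_true + B @ y_true

def proj_sparse(v, k):
    """Euclidean projection onto the (nonconvex) set of k-sparse vectors."""
    out = np.zeros_like(v)
    keep = np.argsort(np.abs(v))[::-1][:k]
    out[keep] = v[keep]
    return out

def loss(x, y):
    r = b - A @ x - B @ y
    return 0.5 * r @ r

x, y = np.zeros(d), np.zeros(d)
eta = 1.0 / max(np.linalg.norm(A, 2)**2, np.linalg.norm(B, 2)**2)  # 1/L step size
for _ in range(200):
    # Alternate: one projected descent step in x, then one in y.
    x = proj_sparse(x - eta * A.T @ (A @ x + B @ y - b), k)
    y = proj_sparse(y - eta * B.T @ (A @ x + B @ y - b), k)

print(round(loss(x, y), 4))
```

Each half-step decreases the smooth loss before projecting, which is exactly the alternating-descent structure whose convergence the paper analyzes under conditions on the constraint sets.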

Robust PCA with compressed data

no code implementations • NeurIPS 2015 • Wooseok Ha, Rina Foygel Barber

The robust principal component analysis (RPCA) problem seeks to separate low-rank trends from sparse outliers within a data matrix, that is, to approximate an $n\times d$ matrix $D$ as the sum of a low-rank matrix $L$ and a sparse matrix $S$. We examine the RPCA problem under data compression, where the data $Y$ is approximately given by $(L + S)\cdot C$, that is, a low-rank $+$ sparse data matrix that has been compressed to size $n\times m$ (with $m$ substantially smaller than the original dimension $d$) via multiplication with a compression matrix $C$.

Data Compression
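The uncompressed decomposition $D \approx L + S$ can be illustrated with a simple alternating scheme: with $S$ fixed, the best rank-$r$ fit to $D - S$ is its truncated SVD; with $L$ fixed, the best $k$-sparse fit to $D - L$ keeps its $k$ largest entries. A hedged NumPy sketch of this classical setup (the solver and all sizes are illustrative, not the paper's compressed-data algorithm):

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, r, k = 50, 40, 2, 60   # rank-2 trend, 60 sparse outliers

# Synthetic D = L* + S*: low-rank trend plus large sparse outliers.
L_true = rng.standard_normal((n, r)) @ rng.standard_normal((r, d))
S_true = np.zeros((n, d))
idx = rng.choice(n * d, size=k, replace=False)
S_true.flat[idx] = 10 * rng.standard_normal(k)
D = L_true + S_true

def svd_truncate(M, r):
    """Best rank-r approximation of M (truncated SVD)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

def hard_threshold(M, k):
    """Best k-sparse approximation of M (keep the k largest-magnitude entries)."""
    out = np.zeros_like(M)
    keep = np.argsort(np.abs(M).ravel())[::-1][:k]
    out.flat[keep] = M.flat[keep]
    return out

# Alternate exact minimization of ||D - L - S||_F^2 over each nonconvex set.
L, S = np.zeros((n, d)), np.zeros((n, d))
for _ in range(20):
    L = svd_truncate(D - S, r)     # best rank-r fit with S fixed
    S = hard_threshold(D - L, k)   # best k-sparse fit with L fixed

err = np.linalg.norm(L - L_true) / np.linalg.norm(L_true)
print(round(err, 3))
```

In the compressed setting studied by the paper, one only observes $Y \approx (L+S)C$ with $C$ of size $d\times m$, so the update steps above no longer apply verbatim; the point of the sketch is just the low-rank $+$ sparse structure being recovered.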
