1 code implementation • 26 Feb 2024 • Agniva Chowdhury, Pradeep Ramuhalli
When the number of observations greatly exceeds the number of predictor variables, we present a simple, randomized sampling-based algorithm for the logistic regression problem that guarantees high-quality approximations to both the estimated probabilities and the overall discrepancy of the model.
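A minimal sketch of this idea, using uniform row sampling as a simplified stand-in for the paper's sampling scheme (all sizes and names below are illustrative):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, d = 20000, 5                       # n >> d, the regime considered here
X = rng.standard_normal((n, d))
beta = rng.standard_normal(d)
y = (rng.random(n) < 1 / (1 + np.exp(-X @ beta))).astype(int)

# Fit on all n rows, then on a uniform subsample of s << n rows.
full = LogisticRegression(C=1e6, max_iter=1000).fit(X, y)
s = 2000
idx = rng.choice(n, size=s, replace=False)
sub = LogisticRegression(C=1e6, max_iter=1000).fit(X[idx], y[idx])

# Worst-case gap between estimated probabilities over the full data.
gap = np.abs(full.predict_proba(X)[:, 1] - sub.predict_proba(X)[:, 1]).max()
print(f"max probability gap: {gap:.3f}")
```

The subsampled fit costs a fraction of the full solve while keeping the estimated probabilities close; the paper's sampling distribution comes with provable guarantees that uniform sampling lacks.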
no code implementations • 14 Dec 2023 • Frank Liu, Agniva Chowdhury
In various scientific and engineering applications, there is typically an approximate model of the underlying complex system, one that contains both aleatoric and epistemic uncertainties.
no code implementations • NeurIPS 2020 • Agniva Chowdhury, Palma London, Haim Avron, Petros Drineas
Linear programming (LP) is used in many machine learning applications, such as $\ell_1$-regularized SVMs, basis pursuit, nonnegative matrix factorization, etc.
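As a toy illustration of one such application, basis pursuit ($\min \|x\|_1$ subject to $Ax=b$) becomes an LP by splitting $x = u - v$ with $u, v \ge 0$; a sketch with `scipy.optimize.linprog` (sizes are illustrative, not from the paper):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
m, n = 15, 30
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[:3] = [1.0, -2.0, 0.5]         # a sparse signal to recover
b = A @ x_true

# min 1^T (u + v)  s.t.  A(u - v) = b,  u, v >= 0
c = np.ones(2 * n)
res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=b, bounds=(0, None))
x = res.x[:n] - res.x[n:]
print(f"l1 norm of LP solution: {res.fun:.3f}")
```

Since $(u, v)$ built from `x_true` is feasible, the LP optimum is at most $\|x_{\text{true}}\|_1 = 3.5$.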
no code implementations • 23 Jun 2020 • Agniva Chowdhury, Petros Drineas, David P. Woodruff, Samson Zhou
To improve the interpretability of PCA, various approaches to obtain sparse principal direction loadings have been proposed, which are termed Sparse Principal Component Analysis (SPCA).
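For a concrete feel, scikit-learn's `SparsePCA` (one of several SPCA formulations, not necessarily the one analyzed here) applied to data with a planted sparse direction:

```python
import numpy as np
from sklearn.decomposition import PCA, SparsePCA

rng = np.random.default_rng(4)
n, d = 300, 20
v = np.zeros(d)
v[:4] = 1.0                           # planted sparse direction
X = rng.standard_normal((n, 1)) * 3 @ v[None, :] + rng.standard_normal((n, d))
X -= X.mean(axis=0)

dense = PCA(n_components=2).fit(X)
sparse = SparsePCA(n_components=2, alpha=1.0, random_state=0).fit(X)

# The l1 penalty zeroes out most loadings, aiding interpretability.
print("nonzero loadings (PCA):  ", np.count_nonzero(dense.components_))
print("nonzero loadings (SPCA): ", np.count_nonzero(sparse.components_))
```

The dense PCA loadings are generically all nonzero, while the sparse variant concentrates weight on the few variables that actually drive the leading direction.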
no code implementations • 9 Sep 2018 • Agniva Chowdhury, Jiasen Yang, Petros Drineas
When the number of predictor variables greatly exceeds the number of observations, one alternative to conventional Fisher discriminant analysis (FDA) is regularized Fisher discriminant analysis (RFDA).
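A minimal example of the $d \gg n$ regime, using scikit-learn's shrinkage LDA as a stand-in for RFDA (the shrinkage regularizer differs in form from a ridge penalty, so this is only illustrative):

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(5)
n, d = 40, 200                        # far more predictors than observations
X = np.vstack([rng.standard_normal((n, d)),
               rng.standard_normal((n, d)) + 0.5])
y = np.array([0] * n + [1] * n)

# Shrinkage regularizes the singular within-class covariance when d >> n.
rfda = LinearDiscriminantAnalysis(solver="eigen", shrinkage=0.5).fit(X, y)
print(f"training accuracy: {rfda.score(X, y):.2f}")
```

Without regularization the within-class scatter matrix is singular here and plain FDA is not even well defined.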
no code implementations • ICML 2018 • Agniva Chowdhury, Jiasen Yang, Petros Drineas
Ridge regression is a variant of regularized least squares regression that is particularly suitable in settings where the number of predictor variables greatly exceeds the number of observations.
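A hedged sketch of this regime: the dual-form ridge solution depends on $XX^T$ (an $n \times n$ matrix), which a Gaussian sketch of the feature dimension can approximate when $d \gg n$ (a generic illustration, not the paper's specific sketching construction):

```python
import numpy as np

rng = np.random.default_rng(2)
n, d, lam = 30, 5000, 1.0             # d >> n
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

# Exact dual-form ridge coefficients: alpha = (X X^T + lam I)^(-1) y.
alpha_exact = np.linalg.solve(X @ X.T + lam * np.eye(n), y)

# Replace X X^T with (XS)(XS)^T for a Gaussian sketch S of size d x s, s << d.
s = 2000
S = rng.standard_normal((d, s)) / np.sqrt(s)
XS = X @ S
alpha_sk = np.linalg.solve(XS @ XS.T + lam * np.eye(n), y)

rel_err = np.linalg.norm(alpha_sk - alpha_exact) / np.linalg.norm(alpha_exact)
print(f"relative error in dual coefficients: {rel_err:.3f}")
```

Forming `X @ S` costs $O(nds)$ but the downstream solves touch only $n \times s$ and $n \times n$ matrices; structured sketches can cut the sketching cost further.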
no code implementations • 29 May 2017 • Agniva Chowdhury, Jiasen Yang, Petros Drineas
Projection-cost preservation is a low-rank approximation guarantee which ensures that the cost of any rank-$k$ projection can be preserved using a smaller sketch of the original data matrix.
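A quick numerical illustration with a Gaussian row sketch, spot-checking the cost of a single rank-$k$ projection (the guarantee in question is over all rank-$k$ projections; this checks just one):

```python
import numpy as np

rng = np.random.default_rng(3)
n, d, k, s = 5000, 40, 5, 500
A = rng.standard_normal((n, d))

# Sketch the rows of A with a Gaussian map (one simple way to build a sketch).
S = rng.standard_normal((s, n)) / np.sqrt(s)
A_sk = S @ A

# One rank-k projection: onto A's top-k right singular vectors.
_, _, Vt = np.linalg.svd(A, full_matrices=False)
P = Vt[:k].T @ Vt[:k]

cost = np.linalg.norm(A - A @ P, "fro") ** 2
cost_sk = np.linalg.norm(A_sk - A_sk @ P, "fro") ** 2
print(f"relative cost gap: {abs(cost - cost_sk) / cost:.3f}")
```

The sketch has $s = 500$ rows instead of $n = 5000$, yet the projection cost is preserved up to a small relative error.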