Search Results for author: Agniva Chowdhury

Found 7 papers, 1 paper with code

A Provably Accurate Randomized Sampling Algorithm for Logistic Regression

1 code implementation · 26 Feb 2024 · Agniva Chowdhury, Pradeep Ramuhalli

When the number of observations greatly exceeds the number of predictor variables, we present a simple randomized sampling-based algorithm for the logistic regression problem that guarantees high-quality approximations to both the estimated probabilities and the overall discrepancy of the model.

Binary Classification · Regression
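The sampling idea in the abstract above can be illustrated with a minimal sketch: sample rows of a tall design matrix with nonuniform probabilities, reweight by the inverse probabilities so the subsampled log-likelihood is unbiased, and fit on the subsample. This is a generic illustration only; the sampling scores, data, and solver here are assumptions, not the paper's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic tall problem (n >> d) -- a hypothetical setup for illustration,
# not the paper's experiments.
n, d = 20000, 5
X = rng.normal(size=(n, d))
beta_true = rng.normal(size=d)
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ beta_true))).astype(float)

def fit_logistic(X, y, w, iters=200, lr=0.5):
    """Weighted logistic regression via plain gradient descent."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        beta -= lr * X.T @ (w * (p - y)) / w.sum()
    return beta

# Sample rows with probability proportional to their norm (a simple proxy
# used here for illustration; the paper's sampling scores differ), then
# reweight by 1/(s * p_i) so the subsampled loss is unbiased.
s = 2000
probs = np.linalg.norm(X, axis=1)
probs /= probs.sum()
idx = rng.choice(n, size=s, replace=True, p=probs)
w = 1.0 / (s * probs[idx])

beta_sub = fit_logistic(X[idx], y[idx], w)
beta_full = fit_logistic(X, y, np.ones(n))
```

With a subsample an order of magnitude smaller than the data, the subsampled estimate typically lands close to the full-data fit, which is the kind of behavior the paper's guarantees make precise.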

Deep Learning with Physics Priors as Generalized Regularizers

no code implementations · 14 Dec 2023 · Frank Liu, Agniva Chowdhury

In various scientific and engineering applications, an approximate model of the underlying complex system is typically available, albeit one subject to both aleatoric and epistemic uncertainties.

Faster Randomized Infeasible Interior Point Methods for Tall/Wide Linear Programs

no code implementations · NeurIPS 2020 · Agniva Chowdhury, Palma London, Haim Avron, Petros Drineas

Linear programming (LP) is used in many machine learning applications, such as $\ell_1$-regularized SVMs, basis pursuit, nonnegative matrix factorization, etc.

Approximation Algorithms for Sparse Principal Component Analysis

no code implementations · 23 Jun 2020 · Agniva Chowdhury, Petros Drineas, David P. Woodruff, Samson Zhou

To improve the interpretability of PCA, various approaches for obtaining sparse principal direction loadings have been proposed, collectively termed Sparse Principal Component Analysis (SPCA).

Dimensionality Reduction

Randomized Iterative Algorithms for Fisher Discriminant Analysis

no code implementations · 9 Sep 2018 · Agniva Chowdhury, Jiasen Yang, Petros Drineas

When the number of predictor variables greatly exceeds the number of observations, one of the alternatives for conventional FDA is regularized Fisher discriminant analysis (RFDA).

Dimensionality Reduction

An Iterative, Sketching-based Framework for Ridge Regression

no code implementations · ICML 2018 · Agniva Chowdhury, Jiasen Yang, Petros Drineas

Ridge regression is a variant of regularized least squares regression that is particularly suitable in settings where the number of predictor variables greatly exceeds the number of observations.

Regression
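In the wide setting the abstract describes (predictors far outnumbering observations), ridge regression has a well-known closed form with a cheaper dual variant. The sketch below shows only that standard identity, not the paper's iterative sketching framework; the data and penalty value are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 50, 200          # wide setting: many more predictors than observations
X = rng.normal(size=(n, d))
y = rng.normal(size=n)
lam = 1.0               # ridge penalty (arbitrary choice for this example)

# Primal closed form: solve the d x d system (X^T X + lam I_d) beta = X^T y
beta_primal = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Equivalent dual form: X^T (X X^T + lam I_n)^{-1} y.
# Only an n x n system, so it is much cheaper when d >> n.
beta_dual = X.T @ np.linalg.solve(X @ X.T + lam * np.eye(n), y)
```

The two solutions agree by the push-through identity (X^T X + lam I)^{-1} X^T = X^T (X X^T + lam I)^{-1}; sketching-based methods like the paper's aim to approximate such solutions faster still.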

Structural Conditions for Projection-Cost Preservation via Randomized Matrix Multiplication

no code implementations · 29 May 2017 · Agniva Chowdhury, Jiasen Yang, Petros Drineas

Projection-cost preservation is a low-rank approximation guarantee which ensures that the cost of any rank-$k$ projection can be preserved using a smaller sketch of the original data matrix.
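The guarantee described above can be checked numerically with a simple random sketch: after compressing the columns of a matrix, the cost of a fixed rank-k projection should be nearly unchanged. The Gaussian sketch below is one common choice used purely for illustration; the paper studies structural conditions under which such randomized matrix products preserve projection costs.

```python
import numpy as np

rng = np.random.default_rng(2)
m, n, k, s = 100, 500, 3, 200   # sketch n columns down to s (arbitrary sizes)

# Low-rank signal plus noise (a hypothetical test matrix)
A = rng.normal(size=(m, k)) @ rng.normal(size=(k, n)) + 0.1 * rng.normal(size=(m, n))

# Gaussian column sketch, scaled so squared Frobenius norms are preserved
# in expectation: E[||M G||_F^2] = ||M||_F^2
G = rng.normal(size=(n, s)) / np.sqrt(s)
A_sk = A @ G

# Cost of one rank-k projection (onto the top-k left singular vectors),
# measured on the original matrix and on the sketch
U, _, _ = np.linalg.svd(A, full_matrices=False)
P = U[:, :k] @ U[:, :k].T
cost_full = np.linalg.norm(A - P @ A, "fro") ** 2
cost_sketch = np.linalg.norm(A_sk - P @ A_sk, "fro") ** 2
```

For a projection-cost preserving sketch, cost_sketch tracks cost_full (up to a fixed additive term) simultaneously for every rank-k projection, not just the one tested here.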
