Search Results for author: Taisuke Yasuda

Found 7 papers, 1 paper with code

Performance of $\ell_1$ Regularization for Sparse Convex Optimization

no code implementations • 14 Jul 2023 • Kyriakos Axiotis, Taisuke Yasuda

We give the first recovery guarantees for the Group LASSO for sparse convex optimization with vector-valued features.

feature selection
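As a hedged illustration of the sparse convex optimization setting above, the following sketch solves a Group LASSO least-squares problem with proximal gradient descent (group soft-thresholding). The solver, step size, and toy data are illustrative assumptions, not the paper's method or analysis.

```python
# Illustrative sketch: proximal gradient (ISTA) for the Group LASSO,
# minimizing 0.5*||X w - y||^2 + lam * sum_g ||w_g||_2 over grouped
# (vector-valued) features. All names and data here are toy examples.
import numpy as np

def group_lasso(X, y, groups, lam=1.0, iters=500):
    n, d = X.shape
    step = 1.0 / np.linalg.norm(X, 2) ** 2   # 1 / Lipschitz constant of grad
    w = np.zeros(d)
    for _ in range(iters):
        z = w - step * (X.T @ (X @ w - y))   # gradient step on the smooth part
        for g in groups:                     # prox step: group soft-thresholding
            norm = np.linalg.norm(z[g])
            z[g] = 0.0 if norm == 0 else max(0.0, 1 - step * lam / norm) * z[g]
        w = z
    return w

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 6))
w_true = np.array([1.0, -1.0, 0, 0, 0, 0])   # only the first group matters
y = X @ w_true + 0.01 * rng.standard_normal(100)
groups = [np.arange(0, 2), np.arange(2, 4), np.arange(4, 6)]
w = group_lasso(X, y, groups)
print(np.round(w, 2))   # the uninformative groups are driven to exactly zero
```

The group penalty zeroes out entire groups at once, which is what makes the Group LASSO a feature-selection tool for vector-valued features.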

Sharper Bounds for $\ell_p$ Sensitivity Sampling

no code implementations • 1 Jun 2023 • David P. Woodruff, Taisuke Yasuda

In this work, we show the first bounds for sensitivity sampling for $\ell_p$ subspace embeddings for $p > 2$ that improve over the general $\mathfrak S d$ bound (where $\mathfrak S$ denotes the total sensitivity), achieving a bound of roughly $\mathfrak S^{2-2/p}$ for $2<p<\infty$.

Sequential Attention for Feature Selection

1 code implementation • 29 Sep 2022 • Taisuke Yasuda, Mohammadhossein Bateni, Lin Chen, Matthew Fahrbach, Gang Fu, Vahab Mirrokni

Feature selection is the problem of selecting a subset of features for a machine learning model that maximizes model quality subject to a budget constraint.

Feature Importance • feature selection
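The budget-constrained problem stated above can be illustrated with plain greedy forward selection: repeatedly add the feature that most improves model quality until the budget of $k$ features is spent. This is a generic baseline sketch, not the paper's Sequential Attention algorithm; the scoring function and data are assumptions.

```python
# Illustrative sketch: greedy forward feature selection under a budget k,
# scoring candidate subsets by least-squares fit quality (R^2-style score).
import numpy as np

def greedy_select(X, y, k):
    n, d = X.shape
    chosen = []
    for _ in range(k):
        best_j, best_score = None, -np.inf
        for j in range(d):
            if j in chosen:
                continue
            cols = chosen + [j]
            w, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
            resid = y - X[:, cols] @ w
            score = 1 - (resid @ resid) / (y @ y)   # fraction of variance explained
            if score > best_score:
                best_j, best_score = j, score
        chosen.append(best_j)   # spend one unit of budget on the best feature
    return chosen

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 10))
y = 3 * X[:, 4] - 2 * X[:, 7] + 0.1 * rng.standard_normal(200)
print(greedy_select(X, y, k=2))   # recovers the two informative features
```

Greedy selection costs one model refit per candidate per round; attention-based methods aim to avoid that combinatorial retraining cost.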

Online Lewis Weight Sampling

no code implementations • 17 Jul 2022 • David P. Woodruff, Taisuke Yasuda

Towards our result, we give the first analysis of "one-shot" Lewis weight sampling, which samples rows in proportion to their Lewis weights, with sample complexity $\tilde O(d^{p/2}/\epsilon^2)$ for $p>2$.

Open-Ended Question Answering • regression
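The "one-shot" sampling idea above can be sketched concretely for $p = 2$, where the Lewis weights coincide with the leverage scores: keep each row with probability proportional to its weight, rescale the kept rows, and solve regression on the much smaller sampled problem. The oversampling factor and data below are illustrative assumptions, not the paper's parameters.

```python
# Illustrative sketch: one-shot row sampling for least-squares regression.
# For p = 2, Lewis weights equal leverage scores, computed here via thin QR.
import numpy as np

def leverage_scores(A):
    Q, _ = np.linalg.qr(A)          # reduced QR; row norms of Q give leverages
    return np.sum(Q * Q, axis=1)    # scores sum to d

def one_shot_sample(A, b, oversample=20.0, seed=0):
    tau = leverage_scores(A)
    p = np.minimum(1.0, oversample * tau)        # keep row i with probability p_i
    keep = np.random.default_rng(seed).random(len(A)) < p
    w = 1.0 / np.sqrt(p[keep])                   # rescale so expectations match
    return A[keep] * w[:, None], b[keep] * w

rng = np.random.default_rng(2)
A = rng.standard_normal((5000, 5))
x_true = rng.standard_normal(5)
b = A @ x_true + 0.01 * rng.standard_normal(5000)
As, bs = one_shot_sample(A, b)
x_hat, *_ = np.linalg.lstsq(As, bs, rcond=None)
print(As.shape[0], np.linalg.norm(x_hat - x_true))   # few rows, small error
```

The point of the "one-shot" analysis is that a single round of such sampling suffices, rather than iterative reweighting.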

Active Linear Regression for $\ell_p$ Norms and Beyond

no code implementations • 9 Nov 2021 • Cameron Musco, Christopher Musco, David P. Woodruff, Taisuke Yasuda

By combining this with our techniques for $\ell_p$ regression, we obtain an active regression algorithm making $\tilde O(d^{1+\max\{1, p/2\}}/\mathrm{poly}(\epsilon))$ queries for such loss functions, including the Tukey and Huber losses, answering another question of [CD21].

Dimensionality Reduction • Open-Ended Question Answering • +1

Tight Kernel Query Complexity of Kernel Ridge Regression and Kernel $k$-means Clustering

no code implementations • 15 May 2019 • Manuel Fernandez, David P. Woodruff, Taisuke Yasuda

We present tight lower bounds on the number of kernel evaluations required to approximately solve kernel ridge regression (KRR) and kernel $k$-means clustering (KKMC) on $n$ input points.

Clustering • Open-Ended Question Answering • +1
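To make the query model above concrete: solving KRR exactly means forming the full $n \times n$ kernel matrix, i.e. on the order of $n^2$ kernel evaluations, and the paper's lower bounds concern how few evaluations any approximate solver can get away with. The sketch below just counts kernel queries in a naive exact solver; the kernel and data are illustrative assumptions.

```python
# Illustrative sketch: exact kernel ridge regression with an instrumented
# kernel, counting how many kernel evaluations the naive solve performs.
import numpy as np

evals = 0
def rbf(x, z, gamma=0.5):
    global evals
    evals += 1                               # count each kernel query
    return np.exp(-gamma * np.sum((x - z) ** 2))

def kernel_ridge(X, y, lam=1e-2):
    n = len(X)
    K = np.array([[rbf(X[i], X[j]) for j in range(n)] for i in range(n)])
    return np.linalg.solve(K + lam * np.eye(n), y)   # dual coefficients alpha

rng = np.random.default_rng(3)
X = rng.standard_normal((50, 3))
y = np.sin(X[:, 0])
alpha = kernel_ridge(X, y)
print(evals)   # n^2 = 2500 kernel evaluations for the exact solve
```

Lower bounds in this model say how much of that $n^2$ cost is unavoidable for a given approximation quality, for both KRR and kernel $k$-means.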
