Search Results for author: Kenneth L. Clarkson

Found 15 papers, 2 papers with code

Low Rank Approximation and Regression in Input Sparsity Time

1 code implementation • 26 Jul 2012 • Kenneth L. Clarkson, David P. Woodruff

We design a new distribution over $\mathrm{poly}(r\epsilon^{-1}) \times n$ matrices $S$ so that for any fixed $n \times d$ matrix $A$ of rank $r$, with probability at least 9/10, $\|SAx\|_2 = (1 \pm \epsilon)\|Ax\|_2$ simultaneously for all $x \in \mathbb{R}^d$.

Data Structures and Algorithms
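
SciPy ships an implementation of this paper's transform (`scipy.linalg.clarkson_woodruff_transform`), which makes the guarantee easy to check empirically. A minimal sketch, with an illustrative sketch size rather than the paper's exact parameters:

```python
# A minimal check of the subspace-embedding guarantee, using SciPy's
# implementation of this paper's transform. The sketch size below is
# an illustrative choice, not the paper's exact parameter setting.
import numpy as np
from scipy.linalg import clarkson_woodruff_transform

rng = np.random.default_rng(0)
n, d = 10_000, 20
A = rng.standard_normal((n, d))                  # tall, thin input matrix

SA = clarkson_woodruff_transform(A, 2_000)       # sketch has 2,000 rows

# With good probability, ||SAx||_2 / ||Ax||_2 is close to 1 for all x.
for _ in range(5):
    x = rng.standard_normal(d)
    print(np.linalg.norm(SA @ x) / np.linalg.norm(A @ x))
```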

Faster Kernel Ridge Regression Using Sketching and Preconditioning

1 code implementation • 10 Nov 2016 • Haim Avron, Kenneth L. Clarkson, David P. Woodruff

The preconditioner is based on random feature maps, such as random Fourier features, which have recently emerged as a powerful technique for speeding up and scaling the training of kernel-based methods, such as kernel ridge regression, by resorting to approximations.

regression
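
A minimal sketch of the idea, not the paper's exact algorithm: use the random-Fourier-feature approximation $Z Z^T$ of a Gaussian kernel as a conjugate-gradient preconditioner, applying its inverse cheaply via the Woodbury identity. All parameter values here are illustrative:

```python
# Illustrative sketch: precondition CG on (K + lam*I) alpha = y with
# the random-Fourier-feature approximation Z @ Z.T of the kernel,
# inverted via the Woodbury identity. Not the paper's exact algorithm.
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

rng = np.random.default_rng(0)
n, d, D, lam, sigma = 1_000, 5, 200, 1e-2, 1.0

X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

# Exact Gaussian kernel K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2)).
sqn = (X**2).sum(axis=1)
K = np.exp(-np.clip(sqn[:, None] + sqn[None, :] - 2 * X @ X.T, 0, None)
           / (2 * sigma**2))

# Random Fourier features (Rahimi-Recht): K ~= Z @ Z.T.
W = rng.standard_normal((d, D)) / sigma
b = rng.uniform(0, 2 * np.pi, D)
Z = np.sqrt(2.0 / D) * np.cos(X @ W + b)

# Woodbury: (Z Z^T + lam I)^{-1} v = (v - Z (lam I_D + Z^T Z)^{-1} Z^T v) / lam.
M_small = lam * np.eye(D) + Z.T @ Z
def apply_preconditioner(v):
    return (v - Z @ np.linalg.solve(M_small, Z.T @ v)) / lam

P = LinearOperator((n, n), matvec=apply_preconditioner)
alpha, info = cg(K + lam * np.eye(n), y, M=P)
print("CG converged:", info == 0)
```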

Sharper Bounds for Regularized Data Fitting

no code implementations • 10 Nov 2016 • Haim Avron, Kenneth L. Clarkson, David P. Woodruff

We study regularization both in a fairly broad setting, and in the specific context of the popular and widely used technique of ridge regularization; for the latter, as applied to each of these problems, we show algorithmic resource bounds in which the *statistical dimension* appears in places where in previous bounds the rank would appear.
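
For concreteness, the statistical dimension of $A$ at ridge parameter $\lambda$ is $\mathrm{sd}_\lambda(A) = \sum_i \sigma_i^2/(\sigma_i^2 + \lambda)$, which is at most $\mathrm{rank}(A)$ and shrinks as $\lambda$ grows. A small illustration with arbitrary sizes:

```python
# The statistical dimension sd_lam(A) = sum_i s_i^2 / (s_i^2 + lam)
# is at most rank(A) and shrinks as lam grows, which is where the
# paper's sharper bounds come from. Sizes below are illustrative.
import numpy as np

def statistical_dimension(A, lam):
    s = np.linalg.svd(A, compute_uv=False)       # singular values of A
    return float(np.sum(s**2 / (s**2 + lam)))

rng = np.random.default_rng(0)
A = rng.standard_normal((500, 100)) * (0.9 ** np.arange(100))  # decaying spectrum

print("rank:", np.linalg.matrix_rank(A))
for lam in [0.0, 0.1, 1.0, 10.0]:
    print(f"sd at lam={lam}:", statistical_dimension(A, lam))
```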

The Fast Cauchy Transform and Faster Robust Linear Regression

no code implementations • 19 Jul 2012 • Kenneth L. Clarkson, Petros Drineas, Malik Magdon-Ismail, Michael W. Mahoney, Xiangrui Meng, David P. Woodruff

We provide fast algorithms for overconstrained $\ell_p$ regression and related problems: for an $n\times d$ input matrix $A$ and vector $b\in\mathbb{R}^n$, in $O(nd\log n)$ time we reduce the problem $\min_{x\in\mathbb{R}^d} \|Ax-b\|_p$ to the same problem with input matrix $\tilde A$ of dimension $s \times d$ and corresponding $\tilde b$ of dimension $s\times 1$.

regression
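
A minimal sketch-and-solve illustration with a dense Cauchy sketch; the paper's Fast Cauchy Transform is a structured, faster version of this step. Sizes, constants, and the LP formulation below are illustrative:

```python
# Sketch-and-solve for l1 regression with a dense Cauchy sketch (the
# Fast Cauchy Transform is a faster, structured variant of this idea).
import numpy as np
from scipy.optimize import linprog

def l1_regression(A, b):
    # min ||Ax - b||_1 as an LP: minimize sum(t) s.t. -t <= Ax - b <= t.
    m, d = A.shape
    c = np.r_[np.zeros(d), np.ones(m)]
    A_ub = np.block([[A, -np.eye(m)], [-A, -np.eye(m)]])
    b_ub = np.r_[b, -b]
    bounds = [(None, None)] * d + [(0, None)] * m
    return linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds).x[:d]

rng = np.random.default_rng(0)
n, d, s = 1_000, 8, 200
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.5 * rng.standard_cauchy(n)  # heavy tails

S = rng.standard_cauchy((s, n))                # dense Cauchy sketch
x_sketch = l1_regression(S @ A, S @ b)         # solve the small problem
x_full = l1_regression(A, b)                   # reference solution

# Objective ratio of sketched vs. optimal solution: at least 1, and
# close to 1 when the sketch preserves l1 norms on the span well.
print(np.linalg.norm(A @ x_sketch - b, 1) / np.linalg.norm(A @ x_full - b, 1))
```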

Dimensionality Reduction for Tukey Regression

no code implementations • 14 May 2019 • Kenneth L. Clarkson, Ruosong Wang, David P. Woodruff

We give the first dimensionality reduction methods for the overconstrained Tukey regression problem.

Dimensionality Reduction • regression
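
Tukey regression replaces the squared loss with the Tukey biweight, which is quadratic near zero but constant beyond a cutoff, so gross outliers have bounded influence. A small M-estimation sketch via iteratively reweighted least squares; the MAD-based cutoff and all sizes are illustrative, and this is not the paper's dimensionality-reduction method:

```python
# Tukey (biweight) M-estimation via iteratively reweighted least
# squares: residuals beyond the cutoff tau get weight zero, so gross
# outliers are ignored. Illustrative sketch only.
import numpy as np

def tukey_weights(r, tau):
    # w(r) = (1 - (r/tau)^2)^2 for |r| <= tau, else 0.
    return np.clip(1 - (r / tau) ** 2, 0, None) ** 2

rng = np.random.default_rng(0)
n, d = 500, 5
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.1 * rng.standard_normal(n)
b[:50] += 20.0                                 # 10% gross outliers

x = np.linalg.lstsq(A, b, rcond=None)[0]       # least-squares start
for _ in range(20):
    r = A @ x - b
    scale = np.median(np.abs(r - np.median(r))) / 0.6745   # robust sigma
    w = tukey_weights(r, 4.685 * scale)        # standard Tukey cutoff
    Aw = A * w[:, None]
    x = np.linalg.solve(A.T @ Aw, Aw.T @ b)    # weighted least squares

print("LS error:   ", np.linalg.norm(np.linalg.lstsq(A, b, rcond=None)[0] - x_true))
print("Tukey error:", np.linalg.norm(x - x_true))
```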

Projection techniques to update the truncated SVD of evolving matrices

no code implementations • 13 Oct 2020 • Vassilis Kalantzis, Georgios Kollias, Shashanka Ubaru, Athanasios N. Nikolakopoulos, Lior Horesh, Kenneth L. Clarkson

This paper considers the problem of updating the rank-k truncated Singular Value Decomposition (SVD) of matrices subject to the addition of new rows and/or columns over time.

Recommendation Systems
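
A minimal sketch of one projection-style update in this family (the classic Zha-Simon construction) for appending rows to a matrix whose rank-k truncated SVD is known; sizes are illustrative:

```python
# Rank-k SVD update when new rows E arrive, via projection onto the
# current right singular subspace plus its complement (Zha-Simon).
import numpy as np

def update_svd_add_rows(U, s, V, E, k):
    # Current A ~= U @ diag(s) @ V.T (U: m x k, V: n x k); E: p x n.
    p = E.shape[0]
    EV = E @ V
    # Orthonormal basis for the part of E outside the span of V.
    Q, R = np.linalg.qr((E - EV @ V.T).T)      # Q: n x p, R: p x p
    # Small (k+p) x (k+p) core whose SVD yields the updated factors.
    M = np.block([[np.diag(s), np.zeros((k, p))],
                  [EV,         R.T            ]])
    F, theta, Gt = np.linalg.svd(M)
    U_new = np.block([[U, np.zeros((U.shape[0], p))],
                      [np.zeros((p, k)), np.eye(p)]]) @ F[:, :k]
    V_new = np.hstack([V, Q]) @ Gt[:k].T
    return U_new, theta[:k], V_new

rng = np.random.default_rng(0)
m, n, p, k = 300, 200, 20, 10
A = rng.standard_normal((m, k)) @ rng.standard_normal((k, n))   # rank k
U, s, Vt = np.linalg.svd(A, full_matrices=False)
U, s, V = U[:, :k], s[:k], Vt[:k].T

E = rng.standard_normal((p, n))
U2, s2, V2 = update_svd_add_rows(U, s, V, E, k)

# A here is exactly rank k, so the update matches a fresh truncated SVD.
s_exact = np.linalg.svd(np.vstack([A, E]), compute_uv=False)[:k]
print(np.allclose(s2, s_exact))
```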

Quantum-Inspired Algorithms from Randomized Numerical Linear Algebra

no code implementations • 9 Nov 2020 • Nadiia Chepurko, Kenneth L. Clarkson, Lior Horesh, Honghao Lin, David P. Woodruff

We create classical (non-quantum) dynamic data structures supporting queries for recommender systems and least-squares regression that are comparable to their quantum analogues.

Recommendation Systems
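
Many quantum-inspired classical algorithms rest on length-squared (norm-squared) sampling access to the data. A minimal sketch of that primitive, estimating $A^T A$ from sampled rows (sizes illustrative):

```python
# Length-squared row sampling: drawing rows of A with probability
# proportional to their squared norms and rescaling gives an unbiased
# estimate of A.T @ A, the primitive behind many quantum-inspired
# classical algorithms. Sizes are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n, d, s = 100_000, 20, 2_000
A = rng.standard_normal((n, d))

row_norms2 = (A ** 2).sum(axis=1)
p = row_norms2 / row_norms2.sum()            # length-squared distribution

idx = rng.choice(n, size=s, p=p)             # sample s rows
R = A[idx] / np.sqrt(s * p[idx])[:, None]    # rescale for unbiasedness

approx = R.T @ R                             # E[R.T @ R] = A.T @ A
exact = A.T @ A
print(np.linalg.norm(approx - exact) / np.linalg.norm(exact))
```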

Order Embeddings from Merged Ontologies using Sketching

no code implementations • 6 Jan 2021 • Kenneth L. Clarkson, Sanjana Sahayaraj

We give a simple, low-resource method to produce order embeddings from ontologies.

Dimensionality Reduction
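
For context, the order-embedding criterion the title refers to, in the common convention of Vendrov et al.: a partial order is represented by coordinatewise dominance of nonnegative vectors, with a violation penalty that is zero exactly when the order holds. The toy vectors below are illustrative, not the paper's sketched construction:

```python
# Order-embedding check in the style of Vendrov et al.; the vectors
# here are hand-picked toys, not the paper's construction.
import numpy as np

def order_violation(child, parent):
    # More general concepts get coordinatewise-smaller vectors, so
    # child >= parent must hold in every coordinate.
    return float(np.square(np.clip(parent - child, 0, None)).sum())

entity = np.array([0.1, 0.2, 0.1])    # very general concept
animal = np.array([0.5, 0.3, 0.2])    # below "entity"
dog    = np.array([0.9, 0.7, 0.4])    # below "animal"

print(order_violation(dog, animal))   # 0.0 -> "dog is-a animal" holds
print(order_violation(animal, dog))   # > 0 -> "animal is-a dog" violated
```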

Near-Optimal Algorithms for Linear Algebra in the Current Matrix Multiplication Time

no code implementations • 16 Jul 2021 • Nadiia Chepurko, Kenneth L. Clarkson, Praneeth Kacham, David P. Woodruff

This question concerns the logarithmic factors in the sketching dimension of existing oblivious subspace embeddings that achieve constant-factor approximation.

Open-Ended Question Answering • regression

Quantum Topological Data Analysis with Linear Depth and Exponential Speedup

no code implementations • 5 Aug 2021 • Shashanka Ubaru, Ismail Yunus Akhalwaya, Mark S. Squillante, Kenneth L. Clarkson, Lior Horesh

In this paper, we completely overhaul the QTDA algorithm to achieve an improved exponential speedup and depth complexity of $O(n\log(1/(\delta\epsilon)))$.

Quantum Machine Learning • Topological Data Analysis
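
For context, the quantities a QTDA algorithm estimates are Betti numbers. A tiny classical computation via boundary-matrix ranks, $\beta_k = \dim C_k - \mathrm{rank}\,\partial_k - \mathrm{rank}\,\partial_{k+1}$, on a hollow triangle (a topological circle):

```python
# Classical Betti numbers of a hollow triangle via boundary ranks:
# beta_0 = beta_1 = 1 (one connected component, one loop).
import numpy as np

vertices = [0, 1, 2]
edges = [(0, 1), (0, 2), (1, 2)]      # no 2-simplices: triangle is hollow

# Boundary matrix d_1: column for edge (i, j) is e_j - e_i.
d1 = np.zeros((len(vertices), len(edges)))
for c, (i, j) in enumerate(edges):
    d1[i, c], d1[j, c] = -1.0, 1.0

rank_d1 = np.linalg.matrix_rank(d1)
beta0 = len(vertices) - 0 - rank_d1   # rank d_0 = 0
beta1 = len(edges) - rank_d1 - 0      # rank d_2 = 0 (no filled triangles)
print(beta0, beta1)                   # 1 1
```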

Low-Rank Approximation with $1/\epsilon^{1/3}$ Matrix-Vector Products

no code implementations • 10 Feb 2022 • Ainesh Bakshi, Kenneth L. Clarkson, David P. Woodruff

For the special cases of $p=2$ (Frobenius norm) and $p = \infty$ (Spectral norm), Musco and Musco (NeurIPS 2015) obtained an algorithm based on Krylov methods that uses $\tilde{O}(k/\sqrt{\epsilon})$ matrix-vector products, improving on the naïve $\tilde{O}(k/\epsilon)$ dependence obtainable by the power method, where $\tilde{O}$ suppresses $\mathrm{poly}(\log(dk/\epsilon))$ factors.
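
A minimal sketch of the block Krylov baseline described above (in the spirit of Musco and Musco): build a Krylov block from repeated applications of $AA^T$, project $A$ onto it, and compare to the best rank-k error. Sizes and iteration counts are illustrative:

```python
# Block Krylov low-rank approximation: project A onto the span of
# [B, (AA^T)B, ..., (AA^T)^q B] for a random start block B = A @ Omega.
import numpy as np

rng = np.random.default_rng(0)
m, n, k, q = 500, 400, 10, 4
A = rng.standard_normal((m, n)) * (0.8 ** np.arange(n))   # decaying spectrum

B = A @ rng.standard_normal((n, k))
blocks = [B]
for _ in range(q):
    B = A @ (A.T @ B)                 # one more block of matvec products
    blocks.append(B)
Q, _ = np.linalg.qr(np.hstack(blocks))

# Rank-k approximation from the projection Q @ Q.T @ A.
Uh, sh, Vth = np.linalg.svd(Q.T @ A, full_matrices=False)
Ak = ((Q @ Uh[:, :k]) * sh[:k]) @ Vth[:k]

s = np.linalg.svd(A, compute_uv=False)
best = np.sqrt((s[k:] ** 2).sum())    # optimal rank-k Frobenius error
print(np.linalg.norm(A - Ak) / best)  # close to 1
```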

Topological data analysis on noisy quantum computers

no code implementations • 19 Sep 2022 • Ismail Yunus Akhalwaya, Shashanka Ubaru, Kenneth L. Clarkson, Mark S. Squillante, Vishnu Jejjala, Yang-Hui He, Kugendran Naidoo, Vasileios Kalantzis, Lior Horesh

In this study, we present NISQ-TDA, a fully implemented end-to-end quantum machine learning algorithm that requires only short circuit depth, is applicable to high-dimensional classical data, and carries a provable asymptotic speedup for certain classes of problems.

Quantum Machine Learning • Topological Data Analysis

Bayesian Experimental Design for Symbolic Discovery

no code implementations • 29 Nov 2022 • Kenneth L. Clarkson, Cristina Cornelio, Sanjeeb Dash, Joao Goncalves, Lior Horesh, Nimrod Megiddo

This study concerns the formulation and application of Bayesian optimal experimental design to symbolic discovery, which is the inference from observational data of predictive models taking general functional forms.

Experimental Design • Numerical Integration
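
A hypothetical toy sketch of the general idea, not the paper's design criterion or model class: with a prior over candidate symbolic models, query the input where posterior-weighted predictive disagreement (a crude proxy for expected information gain) is largest, then reweight the models by the observation's likelihood:

```python
# Toy Bayesian experiment selection over candidate symbolic models;
# the models, noise level, and selection rule are all illustrative.
import numpy as np

models = [lambda x: x**2, lambda x: np.exp(x) - 1, lambda x: 2 * x]
weights = np.array([1/3, 1/3, 1/3])           # prior over candidate forms
noise = 0.1                                   # assumed observation noise

grid = np.linspace(0.0, 2.0, 201)             # candidate experiments
preds = np.array([f(grid) for f in models])   # each model's predictions

mean = weights @ preds                        # posterior-mean prediction
disagreement = weights @ (preds - mean) ** 2  # predictive variance
x_next = grid[disagreement.argmax()]
print("next experiment at x =", x_next)

# Observe y at x_next (pretend the true law is x^2), update weights.
y_obs = x_next ** 2
fx = np.array([f(x_next) for f in models])
lik = np.exp(-0.5 * ((fx - y_obs) / noise) ** 2)
weights = weights * lik / np.sum(weights * lik)
print("updated weights:", weights)
```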

Capacity Analysis of Vector Symbolic Architectures

no code implementations • 24 Jan 2023 • Kenneth L. Clarkson, Shashanka Ubaru, Elizabeth Yang

The ensemble of a particular vector space and a prescribed set of vector operations (including one addition-like operation for "bundling" and one outer-product-like operation for "binding") forms a *vector symbolic architecture* (VSA).

Dimensionality Reduction
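
A minimal sketch of the bundling and binding operations the excerpt defines, using the common MAP-style VSA (bipolar vectors, elementwise product as binding, majority sign as bundling); the dimension is an illustrative choice:

```python
# MAP-style VSA demo: store two bound key-value pairs in one bundled
# memory vector, then recover a value by unbinding with its key.
import numpy as np

rng = np.random.default_rng(0)
D = 10_000                                     # hypervector dimension

def hv():
    return rng.choice([-1, 1], size=D)

def bind(a, b):
    return a * b                               # outer-product-like "binding"

def bundle(*vs):
    return np.sign(np.sum(vs, axis=0))         # addition-like "bundling"

k1, v1, k2, v2 = hv(), hv(), hv(), hv()
memory = bundle(bind(k1, v1), bind(k2, v2))

# Binding is its own inverse for bipolar vectors, so unbinding with k1
# recovers v1 plus noise, while v2 stays near-orthogonal.
query = bind(memory, k1)
print("similarity to v1:", (query @ v1) / D)   # noticeably positive
print("similarity to v2:", (query @ v2) / D)   # near zero
```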
