Search Results for author: Michael Kapralov

Found 11 papers, 3 papers with code

Efficient and Local Parallel Random Walks

no code implementations NeurIPS 2021 Michael Kapralov, Silvio Lattanzi, Navid Nouri, Jakab Tardos

Random walks are a fundamental primitive used in many machine learning algorithms, with applications in clustering and semi-supervised learning.
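As an illustration of the primitive itself (not of the paper's parallel, local algorithm), here is a minimal random-walk simulation on an adjacency-list graph; the toy graph and step count are made up:

```python
import random

def random_walk(adj, start, steps, rng=random):
    """Simulate a simple random walk: at each step, move to a
    uniformly random neighbor of the current vertex."""
    walk = [start]
    for _ in range(steps):
        walk.append(rng.choice(adj[walk[-1]]))
    return walk

# Toy graph: a 4-cycle 0-1-2-3-0.
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
walk = random_walk(adj, start=0, steps=5, rng=random.Random(0))
```

Sequential simulation like this takes one round per step; the point of the paper is to generate many such walks in parallel with low round complexity.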

Clustering

Spectral Clustering Oracles in Sublinear Time

no code implementations 14 Jan 2021 Grzegorz Gluch, Michael Kapralov, Silvio Lattanzi, Aida Mousavifar, Christian Sohler

The main technical contribution is a sublinear time oracle that provides dot product access to the spectral embedding of $G$ by estimating distributions of short random walks from vertices in $G$.
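A hedged toy version of the walk-distribution idea: the dot product $\langle p_u, p_v \rangle$ of two $t$-step walk distributions equals the probability that independent $t$-step walks from $u$ and $v$ end at the same vertex, so it can be estimated by sampling. The graph, walk length, and sample count below are illustrative, not the paper's construction:

```python
import random

def walk_endpoint(adj, start, length, rng):
    v = start
    for _ in range(length):
        v = rng.choice(adj[v])
    return v

def collision_estimate(adj, u, v, length, samples, rng):
    """Estimate <p_u, p_v>: the probability that independent
    `length`-step walks from u and from v collide."""
    hits = sum(
        walk_endpoint(adj, u, length, rng) == walk_endpoint(adj, v, length, rng)
        for _ in range(samples)
    )
    return hits / samples

# Two 4-cliques joined by a single bridge edge (3 -- 4).
adj = {i: [j for j in range(4) if j != i] for i in range(4)}
adj.update({i: [j for j in range(4, 8) if j != i] for i in range(4, 8)})
adj[3] = adj[3] + [4]
adj[4] = adj[4] + [3]

rng = random.Random(0)
same = collision_estimate(adj, 0, 1, length=2, samples=4000, rng=rng)
cross = collision_estimate(adj, 0, 5, length=2, samples=4000, rng=rng)
```

On this graph the same-cluster estimate comes out clearly larger than the cross-cluster one, which is the signal a clustering oracle can exploit.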

Data Structures and Algorithms

Scaling up Kernel Ridge Regression via Locality Sensitive Hashing

no code implementations 21 Mar 2020 Michael Kapralov, Navid Nouri, Ilya Razenshteyn, Ameya Velingker, Amir Zandieh

Random binning features, introduced in the seminal paper of Rahimi and Recht (2007), are an efficient method for approximating a kernel matrix using locality sensitive hashing.
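A minimal 1-D sketch of random binning, assuming the Laplacian kernel $e^{-|x-y|}$: the grid pitch is drawn from Gamma(2, 1) so that the collision probability of a randomly shifted grid matches the kernel in expectation. The parameter choices are illustrative, not the paper's:

```python
import math
import random

def binning_features(points, num_grids, rng):
    """For each random grid (pitch delta ~ Gamma(2, 1), uniform shift),
    hash each point to its bin id. Two points collide on a grid with
    probability max(0, 1 - |x - y| / delta); averaged over delta this
    equals exp(-|x - y|)."""
    grids = [(rng.gammavariate(2.0, 1.0), rng.random()) for _ in range(num_grids)]
    return [
        [math.floor((x - u * delta) / delta) for delta, u in grids]
        for x in points
    ]

def kernel_estimate(fx, fy):
    """Fraction of grids on which the two points share a bin."""
    return sum(a == b for a, b in zip(fx, fy)) / len(fx)

rng = random.Random(1)
fx, fy = binning_features([0.0, 0.5], num_grids=20000, rng=rng)
est = kernel_estimate(fx, fy)   # approximates exp(-0.5) ~ 0.607
```

The bin ids are exactly the locality-sensitive hashes the abstract refers to: colliding hashes stand in for large kernel values.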

Gaussian Processes regression

Practice of Streaming Processing of Dynamic Graphs: Concepts, Models, and Systems

no code implementations 29 Dec 2019 Maciej Besta, Marc Fischer, Vasiliki Kalavri, Michael Kapralov, Torsten Hoefler

We also crystallize the meaning of different concepts associated with streaming graph processing, such as dynamic, temporal, online, and time-evolving graphs, edge-centric processing, models for the maintenance of updates, and graph databases.

Distributed, Parallel, and Cluster Computing Databases Data Structures and Algorithms Performance

Efficiently Learning Fourier Sparse Set Functions

1 code implementation NeurIPS 2019 Andisheh Amrollahi, Amir Zandieh, Michael Kapralov, Andreas Krause

In this paper we consider the problem of efficiently learning set functions that are defined over a ground set of size $n$ and that are sparse (say $k$-sparse) in the Fourier domain.
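For context, the dense noiseless baseline that such algorithms improve on is the fast Walsh-Hadamard transform, which recovers all $2^n$ Fourier coefficients of a set function in $O(n \cdot 2^n)$ time; a minimal sketch on a made-up 2-sparse function:

```python
def wht(values):
    """Fast Walsh-Hadamard transform (unnormalized butterfly), O(N log N)."""
    a = list(values)
    n = len(a)
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                a[j], a[j + h] = a[j] + a[j + h], a[j] - a[j + h]
        h *= 2
    return a

def chi(S, x):
    """Parity (Fourier character): chi_S(x) = (-1)^{|x AND S|}."""
    return -1 if bin(S & x).count("1") % 2 else 1

# A 2-sparse set function on a ground set of size n = 4:
# f = 3 * chi_{0b0011} - 2 * chi_{0b0100}.
n = 4
f = [3 * chi(0b0011, x) - 2 * chi(0b0100, x) for x in range(2 ** n)]
coeffs = [c / 2 ** n for c in wht(f)]   # Fourier coefficient of S at index S
support = {S: c for S, c in enumerate(coeffs) if abs(c) > 1e-9}
```

The transform recovers exactly the two planted coefficients; the paper's contribution is doing this with far fewer than $2^n$ function evaluations when $k \ll 2^n$.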

Oblivious Sketching of High-Degree Polynomial Kernels

1 code implementation 3 Sep 2019 Thomas D. Ahle, Michael Kapralov, Jakob B. T. Knudsen, Rasmus Pagh, Ameya Velingker, David Woodruff, Amir Zandieh

Oblivious sketching has emerged as a powerful approach to speeding up numerical linear algebra over the past decade, but our understanding of oblivious sketching solutions for kernel matrices has remained quite limited, suffering from the aforementioned exponential dependence on input parameters.
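One standard earlier sketch in this line of work (not this paper's construction) is TensorSketch, which handles the degree-2 polynomial kernel by circularly convolving two independent CountSketches of the input. A toy pure-Python version, with made-up hash sizes and input, checked against CountSketching the outer product directly:

```python
import random

def count_sketch_params(dim, m, rng):
    h = [rng.randrange(m) for _ in range(dim)]
    s = [rng.choice([-1, 1]) for _ in range(dim)]
    return h, s

def count_sketch(x, h, s, m):
    out = [0.0] * m
    for i, xi in enumerate(x):
        out[h[i]] += s[i] * xi
    return out

def circ_conv(a, b):
    """Naive circular convolution, O(m^2); TensorSketch uses FFT for O(m log m)."""
    m = len(a)
    out = [0.0] * m
    for i in range(m):
        for j in range(m):
            out[(i + j) % m] += a[i] * b[j]
    return out

rng = random.Random(0)
m, d = 16, 5
x = [0.5, -1.0, 2.0, 0.0, 1.5]
(h1, s1), (h2, s2) = count_sketch_params(d, m, rng), count_sketch_params(d, m, rng)

# Degree-2 TensorSketch of x: convolve two independent CountSketches of x.
ts = circ_conv(count_sketch(x, h1, s1, m), count_sketch(x, h2, s2, m))

# The same sketch computed directly on the outer product x x^T with the
# combined hash h(i, j) = (h1[i] + h2[j]) mod m and sign s1[i] * s2[j].
direct = [0.0] * m
for i in range(d):
    for j in range(d):
        direct[(h1[i] + h2[j]) % m] += s1[i] * s2[j] * x[i] * x[j]
```

The two computations agree exactly, which is the identity that makes the sketch cheap; the "exponential dependence" the abstract mentions is in how the variance of such sketches grows with the polynomial degree.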

Data Structures and Algorithms

A Universal Sampling Method for Reconstructing Signals with Simple Fourier Transforms

no code implementations 20 Dec 2018 Haim Avron, Michael Kapralov, Cameron Musco, Christopher Musco, Ameya Velingker, Amir Zandieh

We formalize this intuition by showing that, roughly, a continuous signal from a given class can be approximately reconstructed using a number of samples proportional to the *statistical dimension* of the allowed power spectrum of that class.
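For reference, the statistical dimension at regularization level $\lambda$ is commonly defined, for a kernel operator $K$ with eigenvalues $\lambda_i$, as (this is the standard ridge-regression form; the paper works with a continuous analogue for the allowed power spectrum):

```latex
s_\lambda(K) \;=\; \operatorname{tr}\!\left( K (K + \lambda I)^{-1} \right)
            \;=\; \sum_i \frac{\lambda_i}{\lambda_i + \lambda}.
```

Intuitively, $s_\lambda$ counts the eigendirections whose eigenvalues are large relative to $\lambda$, which is why it governs the number of samples needed.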

Random Fourier Features for Kernel Ridge Regression: Approximation Bounds and Statistical Guarantees

no code implementations ICML 2017 Haim Avron, Michael Kapralov, Cameron Musco, Christopher Musco, Ameya Velingker, Amir Zandieh

Qualitatively, our results are twofold: on the one hand, we show that random Fourier feature approximation can provably speed up kernel ridge regression under reasonable assumptions.
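A minimal sketch of random Fourier features for the 1-D Gaussian kernel $e^{-(x-y)^2/2}$ (the classic Rahimi-Recht construction, not this paper's analysis); the feature count and evaluation points are illustrative:

```python
import math
import random

def rff_features(x, omegas, shifts):
    """Random Fourier features z(x) with E[z(x) . z(y)] = exp(-(x - y)^2 / 2)."""
    D = len(omegas)
    return [math.sqrt(2.0 / D) * math.cos(w * x + b) for w, b in zip(omegas, shifts)]

rng = random.Random(0)
D = 20000
omegas = [rng.gauss(0.0, 1.0) for _ in range(D)]           # spectral samples of the Gaussian kernel
shifts = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(D)]

zx = rff_features(0.0, omegas, shifts)
zy = rff_features(0.5, omegas, shifts)
est = sum(a * b for a, b in zip(zx, zy))                   # approximates exp(-(0.5)**2 / 2)
```

Running ridge regression on the $D$-dimensional features instead of the full kernel matrix is the speedup whose statistical cost the paper bounds.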

regression

Prediction strategies without loss

no code implementations NeurIPS 2011 Michael Kapralov, Rina Panigrahy

Moreover, for {\em any window of size $n$} the regret of our algorithm to any expert never exceeds $O(\sqrt{n(\log N+\log T)})$, where $N$ is the number of experts and $T$ is the time horizon, while maintaining the essentially zero loss property.

Factor Modeling for Advertisement Targeting

no code implementations NeurIPS 2009 Ye Chen, Michael Kapralov, John Canny, Dmitry Y. Pavlov

We adapt a probabilistic latent variable model, namely GaP (Gamma-Poisson), to ad targeting in the contexts of sponsored search (SS) and behaviorally targeted (BT) display advertising.
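A hedged generative sketch of the GaP model: latent factor activations are Gamma-distributed and observed counts are Poisson with rate $\Lambda\theta$. The factor matrix and hyperparameters below are made up for illustration:

```python
import math
import random

def sample_poisson(rate, rng):
    """Knuth's Poisson sampler; fine for the small rates used here."""
    L, k, p = math.exp(-rate), 0, 1.0
    while True:
        p *= rng.random()
        if p < L:
            return k
        k += 1

def sample_gap(Lam, shape, scale, rng):
    """One draw from Gamma-Poisson: theta_k ~ Gamma(shape, scale),
    count y_w ~ Poisson( (Lam @ theta)_w )."""
    K = len(Lam[0])
    theta = [rng.gammavariate(shape, scale) for _ in range(K)]
    rates = [sum(row[k] * theta[k] for k in range(K)) for row in Lam]
    counts = [sample_poisson(r, rng) for r in rates]
    return theta, counts

rng = random.Random(0)
Lam = [[0.8, 0.1], [0.1, 0.7], [0.1, 0.2]]   # 3 observed events x 2 latent factors
theta, counts = sample_gap(Lam, shape=1.5, scale=1.0, rng=rng)
```

Fitting the model means inverting this generative process: inferring $\Lambda$ and $\theta$ from observed click/view counts.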

Perfect Matchings in $O(n \log n)$ Time in Regular Bipartite Graphs

1 code implementation 18 Sep 2009 Ashish Goel, Michael Kapralov, Sanjeev Khanna

Our techniques also give an algorithm that successively finds perfect matchings in the support of a doubly stochastic matrix in expected time $O(n \log^2 n)$ per matching, with $O(m)$ pre-processing time; this gives a simple $O(m + mn \log^2 n)$ time algorithm for finding the Birkhoff-von Neumann decomposition of a doubly stochastic matrix.
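For illustration, a simple Birkhoff-von Neumann decomposition (nowhere near the paper's running time): repeatedly find a perfect matching in the support via augmenting paths, peel off that permutation with coefficient equal to its smallest matched entry, and repeat. The input matrix is made up:

```python
def perfect_matching(adj, n):
    """Row -> column matching via augmenting paths; None if no perfect matching."""
    col_of = [-1] * n  # column j -> matched row

    def augment(i, seen):
        for j in adj[i]:
            if j not in seen:
                seen.add(j)
                if col_of[j] == -1 or augment(col_of[j], seen):
                    col_of[j] = i
                    return True
        return False

    for i in range(n):
        if not augment(i, set()):
            return None
    perm = [None] * n
    for j, i in enumerate(col_of):
        perm[i] = j
    return perm  # perm[i] = column matched to row i

def bvn_decompose(M, tol=1e-12):
    """Write a doubly stochastic matrix as a convex combination of permutations."""
    n = len(M)
    M = [row[:] for row in M]
    terms = []
    while True:
        adj = [[j for j in range(n) if M[i][j] > tol] for i in range(n)]
        perm = perfect_matching(adj, n)
        if perm is None:
            break
        c = min(M[i][perm[i]] for i in range(n))  # largest coefficient for this permutation
        terms.append((c, perm))
        for i in range(n):
            M[i][perm[i]] -= c
    return terms

M = [[0.5, 0.5, 0.0],
     [0.5, 0.0, 0.5],
     [0.0, 0.5, 0.5]]
terms = bvn_decompose(M)
```

Birkhoff's theorem guarantees the support of a doubly stochastic matrix always contains a perfect matching, so the loop runs until the matrix is fully decomposed; the paper's contribution is finding each matching much faster.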

Data Structures and Algorithms Discrete Mathematics
