Search Results for author: Ameya Velingker

Found 9 papers, 1 paper with code

Private Robust Estimation by Stabilizing Convex Relaxations

no code implementations • 7 Dec 2021 • Pravesh K. Kothari, Pasin Manurangsi, Ameya Velingker

Prior works obtained private robust algorithms for mean estimation of subgaussian distributions with bounded covariance.

Robust Learning for Congestion-Aware Routing

no code implementations • 1 Jan 2021 • Sreenivas Gollapudi, Kostas Kollias, Benjamin Plaut, Ameya Velingker

We consider the problem of routing users through a network with unknown congestion functions over an infinite time horizon.
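
A toy illustration of this setting (not the paper's algorithm): each edge's travel time is a congestion function of the load placed on it, the functions are unknown to the router, and paths must be chosen from observed costs alone. All parameter values below are made up for illustration.

```python
# Toy congestion-aware routing setting (illustrative only; not the paper's method).
def travel_time(load, a, b):
    return a + b * load          # a simple linear congestion (latency) function

# Two routes from s to t; the true parameters are unknown to the routing algorithm.
true_params = {"top": (1.0, 0.5), "bottom": (2.0, 0.1)}

def route_cost(route, load):
    a, b = true_params[route]
    return travel_time(load, a, b)

# With 10 users on a route, "bottom" is cheaper despite its larger free-flow time;
# a learning router must discover this from repeated feedback over time.
print(route_cost("top", 10), route_cost("bottom", 10))   # 6.0 3.0
```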

Scaling up Kernel Ridge Regression via Locality Sensitive Hashing

no code implementations • 21 Mar 2020 • Michael Kapralov, Navid Nouri, Ilya Razenshteyn, Ameya Velingker, Amir Zandieh

Random binning features, introduced in the seminal paper of Rahimi and Recht (2007), are an efficient method for approximating a kernel matrix using locality sensitive hashing.

Gaussian Processes
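
For context on the technique named above, here is a minimal sketch of random binning features for the Laplacian kernel $k(x, y) = \exp(-\gamma \lVert x - y \rVert_1)$, following Rahimi and Recht (2007); the parameter values and helper names are illustrative, and this is not the scaled-up algorithm proposed in the paper. Each repetition acts as a locality sensitive hash: two points land in the same grid cell with probability equal to the kernel value.

```python
# Minimal random binning features sketch (Rahimi & Recht, 2007); illustrative only.
import numpy as np

def random_binning_features(X, gamma=1.0, reps=200, seed=0):
    """Map each row of X to `reps` tuples of grid-cell indices; within one repetition,
    two points share a cell with probability exp(-gamma * ||x - y||_1)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    keys = np.empty((n, reps, d), dtype=np.int64)
    for p in range(reps):
        # Grid pitch ~ Gamma(shape=2, scale=1/gamma) per dimension, plus a random shift.
        delta = rng.gamma(shape=2.0, scale=1.0 / gamma, size=d)
        shift = rng.uniform(0.0, delta)
        keys[:, p, :] = np.floor((X - shift) / delta).astype(np.int64)
    return keys

def approx_kernel(keys):
    """Estimated kernel matrix: fraction of repetitions in which all indices agree."""
    same = np.all(keys[:, None] == keys[None, :], axis=-1)   # (n, n, reps)
    return same.mean(axis=-1)

X = np.random.default_rng(1).normal(size=(5, 3))
K_hat = approx_kernel(random_binning_features(X, gamma=0.5))
K_true = np.exp(-0.5 * np.abs(X[:, None] - X[None, :]).sum(axis=-1))
```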

Private Aggregation from Fewer Anonymous Messages

no code implementations • 24 Sep 2019 • Badih Ghazi, Pasin Manurangsi, Rasmus Pagh, Ameya Velingker

Using a reduction of Balle et al. (2019), our improved analysis of the protocol of Ishai et al. yields, in the same model, an $\left(\varepsilon, \delta\right)$-differentially private protocol for aggregation that, for any constant $\varepsilon > 0$ and any $\delta = \frac{1}{\mathrm{poly}(n)}$, incurs only a constant error and requires only a constant number of messages per party.

Cryptography and Security • Data Structures and Algorithms
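
To make the aggregation primitive concrete, below is a minimal sketch of the split-and-mix idea behind the Ishai et al. protocol referenced above: each party splits its value into additive shares modulo $q$ and submits them as separate anonymous messages, so the analyzer sees only a shuffled multiset of shares whose sum is the true total. The modulus and share count are illustrative, and the differential privacy analysis that is the focus of the paper is not reproduced here.

```python
# Split-and-mix aggregation sketch (illustrative; DP noise and analysis omitted).
import random

def split_into_shares(x, num_shares, q):
    """Additive secret sharing of x modulo q."""
    shares = [random.randrange(q) for _ in range(num_shares - 1)]
    shares.append((x - sum(shares)) % q)
    return shares

def shuffled_aggregate(values, num_shares=3, q=2**20):
    messages = [s for x in values for s in split_into_shares(x, num_shares, q)]
    random.shuffle(messages)          # the anonymizing shuffler
    return sum(messages) % q          # the analyzer recovers the exact sum

print(shuffled_aggregate([3, 5, 7]))  # 15
```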

Oblivious Sketching of High-Degree Polynomial Kernels

1 code implementation • 3 Sep 2019 • Thomas D. Ahle, Michael Kapralov, Jakob B. T. Knudsen, Rasmus Pagh, Ameya Velingker, David Woodruff, Amir Zandieh

Oblivious sketching has emerged as a powerful approach to speeding up numerical linear algebra over the past decade, but our understanding of oblivious sketching solutions for kernel matrices has remained quite limited, suffering from the aforementioned exponential dependence on input parameters.

Data Structures and Algorithms
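
For background, a classic oblivious sketch for the degree-$p$ polynomial kernel is TensorSketch (Pham and Pagh, 2013), whose known guarantees degrade exponentially with the degree, one of the limitations this line of work addresses: CountSketch each factor of the tensor product and combine the sketches with FFTs. The snippet below illustrates that baseline, not the construction of this paper; the sketch dimension and function names are illustrative.

```python
# TensorSketch baseline for the polynomial kernel (illustrative; not this paper's sketch).
import numpy as np

def tensorsketch(x, degree, m, seed=0):
    """Sketch of the degree-fold tensor product of x into R^m, so that
    tensorsketch(x) @ tensorsketch(y) is an unbiased estimate of (x @ y) ** degree."""
    rng = np.random.default_rng(seed)   # shared seed => shared hash functions for x and y
    d = x.shape[0]
    prod = np.ones(m, dtype=complex)
    for _ in range(degree):
        h = rng.integers(0, m, size=d)        # CountSketch bucket per coordinate
        s = rng.choice([-1.0, 1.0], size=d)   # CountSketch sign per coordinate
        cs = np.zeros(m)
        np.add.at(cs, h, s * x)               # CountSketch of x
        prod *= np.fft.fft(cs)                # convolve the sketches in the Fourier domain
    return np.fft.ifft(prod).real

x = np.random.default_rng(1).normal(size=50)
y = np.random.default_rng(2).normal(size=50)
estimate = tensorsketch(x, 3, 4096) @ tensorsketch(y, 3, 4096)
exact = (x @ y) ** 3
```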

On the Power of Multiple Anonymous Messages

no code implementations • 29 Aug 2019 • Badih Ghazi, Noah Golowich, Ravi Kumar, Rasmus Pagh, Ameya Velingker

- Protocols in the multi-message shuffled model with $\mathrm{poly}(\log B, \log n)$ bits of communication per user and $\mathrm{poly}(\log B)$ error, which provide an exponential improvement in error over what is possible with single-message algorithms.

Scalable and Differentially Private Distributed Aggregation in the Shuffled Model

no code implementations • 19 Jun 2019 • Badih Ghazi, Rasmus Pagh, Ameya Velingker

Federated learning promises to make machine learning feasible on distributed, private datasets by implementing gradient descent using secure aggregation methods.

Federated Learning
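
As a concrete illustration of the secure aggregation primitive mentioned above, here is a minimal sketch of pairwise-mask aggregation in the style of Bonawitz et al.: every pair of parties derives a shared mask that cancels when all updates are summed, so the server learns only the aggregate gradient. This is not the shuffled-model protocol proposed in the paper, and the modulus and seed handling are illustrative.

```python
# Pairwise-mask secure aggregation sketch (illustrative; not this paper's protocol).
import numpy as np

def masked_update(i, grads, q, seeds):
    """Party i's masked gradient: pairwise masks cancel once all updates are summed."""
    masked = grads[i].copy()
    for j in range(len(grads)):
        if j == i:
            continue
        pair_rng = np.random.default_rng(seeds[min(i, j), max(i, j)])   # shared per pair
        mask = pair_rng.integers(0, q, size=grads[i].shape)
        masked = (masked + mask) % q if i < j else (masked - mask) % q
    return masked

q = 2**32
rng = np.random.default_rng(0)
grads = [rng.integers(0, 100, size=4) for _ in range(3)]   # quantized local gradients
seeds = rng.integers(0, 2**31, size=(3, 3))                # one shared seed per pair
total = sum(masked_update(i, grads, q, seeds) for i in range(3)) % q
assert np.array_equal(total, sum(grads) % q)               # server recovers only the sum
```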

A Universal Sampling Method for Reconstructing Signals with Simple Fourier Transforms

no code implementations • 20 Dec 2018 • Haim Avron, Michael Kapralov, Cameron Musco, Christopher Musco, Ameya Velingker, Amir Zandieh

We formalize this intuition by showing that, roughly, a continuous signal from a given class can be approximately reconstructed using a number of samples proportional to the *statistical dimension* of the allowed power spectrum of that class.
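
In the usual kernel sense, the statistical dimension referenced above is $s_\lambda(K) = \operatorname{tr}\!\left(K (K + \lambda I)^{-1}\right) = \sum_i \mu_i / (\mu_i + \lambda)$ over the eigenvalues $\mu_i$ of $K$. A minimal numerical sketch for a finite kernel matrix follows; the paper's continuous-signal setting and exact normalization are not reproduced, and the bandlimited example kernel is illustrative.

```python
# Statistical (effective) dimension of a kernel matrix; illustrative example.
import numpy as np

def statistical_dimension(K, lam):
    mu = np.linalg.eigvalsh(K)              # eigenvalues of the PSD kernel matrix
    return float(np.sum(mu / (mu + lam)))

t = np.linspace(0.0, 1.0, 200)
K = np.sinc(10.0 * (t[:, None] - t[None, :]))   # kernel of a bandlimited signal class
print(statistical_dimension(K, lam=1e-2))
```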

Random Fourier Features for Kernel Ridge Regression: Approximation Bounds and Statistical Guarantees

no code implementations • ICML 2017 • Haim Avron, Michael Kapralov, Cameron Musco, Christopher Musco, Ameya Velingker, Amir Zandieh

Qualitatively, our results are twofold: on the one hand, we show that random Fourier feature approximation can provably speed up kernel ridge regression under reasonable assumptions.
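
A minimal sketch of the random Fourier feature approximation studied above, applied to kernel ridge regression with a Gaussian kernel; the feature count, bandwidth, and regularization below are illustrative choices rather than the paper's recommended settings.

```python
# Random Fourier features (Rahimi & Recht) for approximate Gaussian-kernel ridge regression.
import numpy as np

def rff(X, num_features=500, bandwidth=1.0, seed=0):
    """z(x) with z(x) @ z(y) ~ exp(-||x - y||^2 / (2 * bandwidth**2))."""
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=1.0 / bandwidth, size=(X.shape[1], num_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=num_features)
    return np.sqrt(2.0 / num_features) * np.cos(X @ W + b)

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=1000)

Z = rff(X)                                                   # n x m feature matrix
lam = 1e-3
w = np.linalg.solve(Z.T @ Z + lam * np.eye(Z.shape[1]), Z.T @ y)
y_hat = Z @ w                                                # approximate KRR prediction
```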
