Search Results for author: Rasmus Pagh

Found 15 papers, 5 papers with code

Infinitely Divisible Noise in the Low Privacy Regime

no code implementations • 13 Oct 2021 • Rasmus Pagh, Nina Mesing Stausholm

Federated learning, in which training data is distributed among users and never shared, has emerged as a popular approach to privacy-preserving machine learning.

Federated Learning

DEANN: Speeding up Kernel-Density Estimation using Approximate Nearest Neighbor Search

1 code implementation • 6 Jul 2021 • Matti Karppa, Martin Aumüller, Rasmus Pagh

We present an algorithm called Density Estimation from Approximate Nearest Neighbors (DEANN) where we apply Approximate Nearest Neighbor (ANN) algorithms as a black box subroutine to compute an unbiased KDE.

Density Estimation
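The idea behind DEANN can be illustrated in a few lines: compute exact kernel contributions for the (approximate) near neighbors of the query, and estimate the remaining kernel mass by uniform random sampling over the rest of the points. The sketch below is not the authors' implementation — a brute-force nearest-neighbor search stands in for a real ANN index, and the function name is made up — but it shows why the estimator is unbiased.

```python
import numpy as np

def deann_style_kde(query, data, k, m, bandwidth=1.0, rng=None):
    """Unbiased KDE estimate in the spirit of DEANN: exact kernel
    contributions for the k nearest neighbors (here brute force,
    standing in for an ANN black box), plus a uniform sample of
    size m to estimate the kernel mass of the remaining points."""
    rng = np.random.default_rng(rng)
    n = len(data)
    dists = np.linalg.norm(data - query, axis=1)
    kernel = np.exp(-(dists / bandwidth) ** 2)  # Gaussian kernel values

    near = np.argsort(dists)[:k]   # stand-in for the ANN subroutine
    exact = kernel[near].sum()

    rest = np.setdiff1d(np.arange(n), near)
    if len(rest) == 0:
        return exact / n
    sample = rng.choice(rest, size=min(m, len(rest)), replace=True)
    # len(rest) * (sample mean) is an unbiased estimate of the
    # total kernel mass outside the near-neighbor set.
    tail = len(rest) * kernel[sample].mean()
    return (exact + tail) / n
```

Replacing the `argsort` with a genuine ANN index is the point of the paper: the tail estimate stays unbiased as long as the complement of the returned set is sampled uniformly.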

CountSketches, Feature Hashing and the Median of Three

no code implementations • 3 Feb 2021 • Kasper Green Larsen, Rasmus Pagh, Jakub Tětek

For $t > 1$, the estimator takes the median of $2t-1$ independent estimates, and the probability that the estimate is off by more than $2 \|v\|_2/\sqrt{s}$ is exponentially small in $t$.
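The median-of-estimates trick in the excerpt is easy to sketch: build $2t-1$ independent CountSketch tables, read off one unbiased estimate of each coordinate per table, and take the median. This is a generic illustration of CountSketch with the median estimator, not the paper's refined analysis; parameter names are illustrative.

```python
import numpy as np

def countsketch_estimates(v, s, reps, seed=0):
    """`reps` independent CountSketch estimates of every coordinate
    of v, using a table of s buckets per repetition."""
    rng = np.random.default_rng(seed)
    n = len(v)
    ests = np.empty((reps, n))
    for r in range(reps):
        h = rng.integers(0, s, size=n)        # hash bucket per coordinate
        g = rng.choice([-1.0, 1.0], size=n)   # random sign per coordinate
        table = np.zeros(s)
        np.add.at(table, h, g * v)            # sketch: table[h[i]] += g[i]*v[i]
        ests[r] = g * table[h]                # unbiased estimate of each v[i]
    return ests

def median_estimate(v, s, t, seed=0):
    """Median of 2t-1 independent estimates, as in the excerpt: the
    probability of an error beyond 2*||v||_2/sqrt(s) drops
    exponentially in t."""
    return np.median(countsketch_estimates(v, s, 2 * t - 1, seed), axis=0)
```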

Sampling a Near Neighbor in High Dimensions -- Who is the Fairest of Them All?

1 code implementation • 26 Jan 2021 • Martin Aumüller, Sariel Har-Peled, Sepideh Mahabadi, Rasmus Pagh, Francesco Silvestri

Given a set of points $S$ and a radius parameter $r>0$, the $r$-near neighbor ($r$-NN) problem asks for a data structure that, given any query point $q$, returns a point $p$ within distance at most $r$ from $q$.

Fairness
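The "fairest of them all" angle is that the data structure should return a (near-)uniformly random point from the ball of radius $r$ around the query, not an arbitrary one. A brute-force reference version of that fair $r$-NN query is easy to state; the sublinear-time LSH-based constructions are the paper's contribution, and this sketch is only the specification they approximate.

```python
import numpy as np

def fair_r_near_neighbor(query, points, r, rng=None):
    """Brute-force reference for the fair r-NN problem: among all
    points within distance r of `query`, return the index of one
    chosen uniformly at random, or None if the ball is empty."""
    rng = np.random.default_rng(rng)
    dists = np.linalg.norm(points - query, axis=1)
    inside = np.flatnonzero(dists <= r)
    if inside.size == 0:
        return None
    return int(rng.choice(inside))
```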

WOR and $p$'s: Sketches for $\ell_p$-Sampling Without Replacement

no code implementations • NeurIPS 2020 • Edith Cohen, Rasmus Pagh, David P. Woodruff

We design novel composable sketches for WOR $\ell_p$ sampling, weighted sampling of keys according to a power $p\in[0, 2]$ of their frequency (or for signed data, sum of updates).
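A classical non-streaming way to draw the kind of sample the abstract describes is bottom-$k$ sampling with exponential clocks: give key $i$ an independent $\mathrm{Exp}(f_i^p)$ arrival time and keep the $k$ earliest, which is distributed exactly as sequential weighted sampling without replacement. The paper's contribution is making this composable over distributed streams with small sketches; the sketch below is only the offline baseline.

```python
import numpy as np

def wor_sample(freqs, k, p=1.0, rng=None):
    """Weighted without-replacement sample of k keys, key i weighted
    by freqs[i]**p: draw an Exponential clock with rate w_i per key
    (scale = 1/w_i) and keep the k earliest arrivals."""
    rng = np.random.default_rng(rng)
    w = np.asarray(freqs, dtype=float) ** p
    clocks = rng.exponential(scale=1.0 / w)  # smaller clock = earlier arrival
    return np.argsort(clocks)[:k]
```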

Private Aggregation from Fewer Anonymous Messages

no code implementations • 24 Sep 2019 • Badih Ghazi, Pasin Manurangsi, Rasmus Pagh, Ameya Velingker

Using a reduction of Balle et al. (2019), our improved analysis of the protocol of Ishai et al. yields, in the same model, an $\left(\varepsilon, \delta\right)$-differentially private protocol for aggregation that, for any constant $\varepsilon > 0$ and any $\delta = \frac{1}{\mathrm{poly}(n)}$, incurs only a constant error and requires only a constant number of messages per party.

Cryptography and Security Data Structures and Algorithms

The space complexity of inner product filters

no code implementations • 24 Sep 2019 • Rasmus Pagh, Johan Sivertsen

Motivated by the problem of filtering candidate pairs in inner product similarity joins we study the following inner product estimation problem: Given parameters $d\in {\bf N}$, $\alpha>\beta\geq 0$ and unit vectors $x, y\in {\bf R}^{d}$ consider the task of distinguishing between the cases $\langle x, y\rangle\leq\beta$ and $\langle x, y\rangle\geq \alpha$ where $\langle x, y\rangle = \sum_{i=1}^d x_i y_i$ is the inner product of vectors $x$ and $y$.

Dimensionality Reduction
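A simple baseline for this distinguishing task is Johnson–Lindenstrauss-style dimensionality reduction: project both vectors with a random Gaussian matrix, estimate the inner product from the projections, and threshold at the midpoint $(\alpha+\beta)/2$. The paper pins down the optimal number of *bits* for the filter; this sketch only reduces the number of dimensions and is illustrative, not the paper's construction.

```python
import numpy as np

def inner_product_filter(x, y, alpha, beta, dim, rng=None):
    """JL-style baseline for the inner product filtering task:
    estimate <x, y> from `dim` Gaussian projections and report
    which side of the midpoint (alpha+beta)/2 the estimate falls on
    (True = looks like <x,y> >= alpha)."""
    rng = np.random.default_rng(rng)
    A = rng.normal(size=(dim, len(x))) / np.sqrt(dim)
    est = float(np.dot(A @ x, A @ y))  # E[est] = <x, y>
    return bool(est >= (alpha + beta) / 2)
```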

Oblivious Sketching of High-Degree Polynomial Kernels

no code implementations • 3 Sep 2019 • Thomas D. Ahle, Michael Kapralov, Jakob B. T. Knudsen, Rasmus Pagh, Ameya Velingker, David Woodruff, Amir Zandieh

Oblivious sketching has emerged as a powerful approach to speeding up numerical linear algebra over the past decade, but our understanding of oblivious sketching solutions for kernel matrices has remained quite limited, suffering from the aforementioned exponential dependence on input parameters.

Data Structures and Algorithms

On the Power of Multiple Anonymous Messages

no code implementations • 29 Aug 2019 • Badih Ghazi, Noah Golowich, Ravi Kumar, Rasmus Pagh, Ameya Velingker

We give protocols in the multi-message shuffled model with $\mathrm{poly}(\log B, \log n)$ bits of communication per user and $\mathrm{poly}\log B$ error, an exponential improvement in error over what is possible with single-message algorithms.

PUFFINN: Parameterless and Universally Fast FInding of Nearest Neighbors

2 code implementations • 28 Jun 2019 • Martin Aumüller, Tobias Christiani, Rasmus Pagh, Michael Vesterli

We describe a novel synthetic data set that is difficult for almost all existing nearest neighbor search approaches, and on which PUFFINN significantly outperforms previous methods.

Data Structures and Algorithms Computational Geometry

Scalable and Differentially Private Distributed Aggregation in the Shuffled Model

no code implementations • 19 Jun 2019 • Badih Ghazi, Rasmus Pagh, Ameya Velingker

Federated learning promises to make machine learning feasible on distributed, private datasets by implementing gradient descent using secure aggregation methods.

Federated Learning
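The secure aggregation primitive the abstract refers to can be illustrated with pairwise additive masking: each pair of users $i < j$ shares a random mask, user $i$ adds it and user $j$ subtracts it modulo $q$, so every individual masked update looks random while the masks cancel in the sum. This toy sketch ignores dropouts, key agreement, and the shuffled-model machinery that is the paper's actual subject.

```python
import numpy as np

def masked_updates(updates, modulus=2**32, seed=0):
    """Toy pairwise-masking secure aggregation over integer vectors:
    for each pair i < j, a shared random mask m is added to user i's
    update and subtracted from user j's (mod q). Masks cancel in the
    sum, so the server can recover only the aggregate."""
    rng = np.random.default_rng(seed)
    n = len(updates)
    masked = [u.copy() % modulus for u in updates]
    for i in range(n):
        for j in range(i + 1, n):
            m = rng.integers(0, modulus, size=updates[0].shape)
            masked[i] = (masked[i] + m) % modulus
            masked[j] = (masked[j] - m) % modulus
    return masked

def server_aggregate(masked, modulus=2**32):
    """Sum of masked updates mod q equals the sum of true updates."""
    return sum(masked) % modulus
```

In real protocols the pairwise masks come from a key-agreement step rather than a shared seed, and gradient vectors are fixed-point encoded into the modular domain.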

Fair Near Neighbor Search: Independent Range Sampling in High Dimensions

1 code implementation • 5 Jun 2019 • Martin Aumüller, Rasmus Pagh, Francesco Silvestri

There are several variants of the similarity search problem, and one of the most relevant is the $r$-near neighbor ($r$-NN) problem: given a radius $r>0$ and a set of points $S$, construct a data structure that, for any given query point $q$, returns a point $p$ within distance at most $r$ from $q$.

Fairness

Space-efficient Feature Maps for String Alignment Kernels

no code implementations • 18 Feb 2018 • Yasuo Tabei, Yoshihiro Yamanishi, Rasmus Pagh

We present novel space-efficient feature maps (SFMs) of RFFs that reduce the space usage from the O(dD) of the original FMs to O(d), with theoretical guarantees in the form of concentration bounds.
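For context, the RFFs in question are standard random Fourier features, which approximate a shift-invariant kernel by an explicit map $z$ with $z(x)^\top z(y) \approx k(x,y)$. The plain version below stores the full $d \times D$ random matrix — exactly the $O(dD)$ cost that the paper's SFMs reduce to $O(d)$; it is the baseline, not the paper's construction.

```python
import numpy as np

def rff_features(X, D, gamma=1.0, seed=0):
    """Plain random Fourier features for the RBF kernel
    k(x, y) = exp(-gamma * ||x - y||^2), so z(x) @ z(y) ~= k(x, y).
    The O(d*D) matrix W is the storage cost that space-efficient
    feature maps avoid."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, D))  # spectral samples
    b = rng.uniform(0, 2 * np.pi, size=D)                  # random phases
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)
```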
