no code implementations • 4 Apr 2022 • Ora Nova Fandina, Mikael Møller Høgsgaard, Kasper Green Larsen

In this work, we give a surprising new analysis of the Fast JL transform, showing that the $k \ln^2 n$ term in the embedding time can be improved to $(k \ln^2 n)/\alpha$ for an $\alpha = \Omega(\min\{\varepsilon^{-1}\ln(1/\varepsilon), \ln n\})$.
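For context, a minimal sketch of the classical Fast JL construction (the subsampled randomized Hadamard transform, $x \mapsto \text{sample}_k(HDx)$) is shown below in Python; this illustrates the transform being analyzed, not the paper's improved analysis, and the function names `fwht` and `fast_jl` are illustrative.

```python
import random, math

def fwht(a):
    """In-place fast Walsh-Hadamard transform (unnormalized);
    len(a) must be a power of two."""
    h, n = 1, len(a)
    while h < n:
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                x, y = a[j], a[j + h]
                a[j], a[j + h] = x + y, x - y
        h *= 2
    return a

def fast_jl(x, k, seed=0):
    """Embed x (dimension a power of two) into k dimensions:
    flip random signs, apply Hadamard, sample k coordinates."""
    rng = random.Random(seed)
    n = len(x)
    signs = [rng.choice((-1, 1)) for _ in range(n)]
    y = fwht([s * v for s, v in zip(signs, x)])
    rows = rng.sample(range(n), k)
    # sqrt(n/k) rescales the sampling; 1/sqrt(n) normalizes H.
    scale = math.sqrt(n / k) / math.sqrt(n)
    return [scale * y[r] for r in rows]
```

The Hadamard step takes $O(n \ln n)$ time and spreads the mass of $x$ across coordinates, which is what makes uniform sampling of $k$ rows work.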

no code implementations • 25 Feb 2022 • Vincent Cohen-Addad, Kasper Green Larsen, David Saulpic, Chris Schwiegelshohn

Given a set of points in a metric space, the $(k, z)$-clustering problem consists of finding a set of $k$ points called centers, such that the sum of distances raised to the power of $z$ of every data point to its closest center is minimized.
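The $(k, z)$-clustering objective just described can be written as a short Python function (a direct transcription of the definition; the name `kz_cost` is illustrative):

```python
import math

def kz_cost(points, centers, z):
    """Sum over data points of (distance to the closest center)^z."""
    total = 0.0
    for p in points:
        d = min(math.dist(p, c) for c in centers)
        total += d ** z
    return total
```

Setting $z = 1$ recovers $k$-median and $z = 2$ recovers $k$-means.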

no code implementations • 14 Jul 2021 • Yair Bartal, Ora Nova Fandina, Kasper Green Larsen

They provided upper bounds on its quality for a wide range of practical measures and showed that these are indeed best possible in many cases.

no code implementations • 15 Jun 2021 • Allan Grønlund, Mikael Høgsgaard, Lior Kamma, Kasper Green Larsen

The framework is simple and powerful enough to extend the generalization bounds by Arora et al. to also hold for the original network.

no code implementations • 3 Feb 2021 • Kasper Green Larsen, Rasmus Pagh, Jakub Tětek

For $t > 1$, the estimator takes the median of $2t-1$ independent estimates, and the probability that the estimate is off by more than $2 \|v\|_2/\sqrt{s}$ is exponentially small in $t$.
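A hedged sketch of this median-of-$(2t-1)$ boosting, applied to the standard CountSketch coordinate estimator (which has error $O(\|v\|_2/\sqrt{s})$ per repetition); the code below is an illustration of the general technique, not the paper's exact construction:

```python
import random, statistics

def countsketch_estimate(v, s, t, seed=0):
    """Estimate every coordinate of v from 2t-1 independent
    CountSketch rows of s buckets each; return per-coordinate medians."""
    rng = random.Random(seed)
    n = len(v)
    estimates = [[] for _ in range(n)]
    for _ in range(2 * t - 1):
        h = [rng.randrange(s) for _ in range(n)]     # bucket hash
        g = [rng.choice((-1, 1)) for _ in range(n)]  # sign hash
        buckets = [0.0] * s
        for i, x in enumerate(v):
            buckets[h[i]] += g[i] * x
        for i in range(n):
            estimates[i].append(g[i] * buckets[h[i]])
    return [statistics.median(e) for e in estimates]
```

Each repetition is correct to within the stated error with constant probability, and taking the median over $2t-1$ independent repetitions drives the failure probability down exponentially in $t$.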

no code implementations • NeurIPS 2020 • Allan Grønlund, Lior Kamma, Kasper Green Larsen

We then explain the shortcomings of the $k$th margin bound and prove a stronger and more refined margin-based generalization bound for boosted classifiers that indeed succeeds in explaining the performance of modern gradient boosters.

no code implementations • ICML 2020 • Allan Grønlund, Lior Kamma, Kasper Green Larsen

Support Vector Machines (SVMs) are among the most fundamental tools for binary classification.

no code implementations • NeurIPS 2019 • Allan Grønlund, Lior Kamma, Kasper Green Larsen, Alexander Mathiasen, Jelani Nelson

To date, the strongest known generalization upper bound is the $k$th margin bound of Gao and Zhou (2013).

no code implementations • 21 Sep 2019 • Kasper Green Larsen, Michael Mitzenmacher, Charalampos E. Tsourakakis

The goal is to recover $n$ discrete variables $g_i \in \{0, \ldots, k-1\}$ (up to some global offset) given noisy observations of a set of their pairwise differences $\{(g_i - g_j) \bmod k\}$; specifically, with probability $\frac{1}{k}+\delta$ for some $\delta > 0$ one obtains the correct answer, and with the remaining probability one obtains a uniformly random incorrect answer.
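The observation model just described can be made concrete with a small Python sketch (the function name and interface are illustrative, not from the paper):

```python
import random

def observe_difference(g, i, j, k, delta, rng):
    """Noisy observation of (g[i] - g[j]) mod k: correct with
    probability 1/k + delta, otherwise a uniformly random
    *incorrect* value in {0, ..., k-1}."""
    true = (g[i] - g[j]) % k
    if rng.random() < 1.0 / k + delta:
        return true
    wrong = rng.randrange(k - 1)       # pick among the k-1 wrong values
    return wrong if wrong < true else wrong + 1
```

Note that $\delta = 0$ makes the observation a uniform sample over all $k$ values, carrying no information, which is why any $\delta > 0$ advantage is the interesting regime.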

no code implementations • 30 Jan 2019 • Allan Grønlund, Kasper Green Larsen, Alexander Mathiasen

A common goal in a long line of research is to maximize the smallest margin using as few base hypotheses as possible, culminating in the AdaBoostV algorithm of Rätsch and Warmuth [JMLR'04].

no code implementations • 20 Dec 2018 • Peyman Afshani, Manindra Agrawal, Benjamin Doerr, Carola Doerr, Kasper Green Larsen, Kurt Mehlhorn

We study the query complexity of a permutation-based variant of the guessing game Mastermind.

no code implementations • NeurIPS 2018 • Casper Benjamin Freksen, Lior Kamma, Kasper Green Larsen

We settle this question by giving tight asymptotic bounds on the exact tradeoff between the central parameters, thus providing a complete understanding of the performance of feature hashing.

1 code implementation • 19 Sep 2017 • Charalampos E. Tsourakakis, Michael Mitzenmacher, Kasper Green Larsen, Jarosław Błasiok, Ben Lawson, Preetum Nakkiran, Vasileios Nakos

The edge sign prediction problem aims to predict whether an interaction between a pair of nodes will be positive or negative.

1 code implementation • 25 Jan 2017 • Allan Grønlund, Kasper Green Larsen, Alexander Mathiasen, Jesper Sindahl Nielsen, Stefan Schneider, Mingzhou Song

We present all the existing work that has been overlooked and compare the various solutions theoretically.

no code implementations • 5 Apr 2016 • Kasper Green Larsen, Jelani Nelson, Huy L. Nguyen, Mikkel Thorup

Our main innovation is an efficient reduction from the heavy hitters to a clustering problem in which each heavy hitter is encoded as some form of noisy spectral cluster in a much bigger graph, and the goal is to identify every cluster.

Papers With Code is a free resource with all data licensed under CC-BY-SA.