no code implementations • 12 Mar 2024 • Ishaq Aden-Ali, Mikael Møller Høgsgaard, Kasper Green Larsen, Nikita Zhivotovskiy
Furthermore, we prove a near-optimal high-probability bound on this algorithm's error.
no code implementations • 13 Feb 2023 • Mikael Møller Høgsgaard, Lion Kamma, Kasper Green Larsen, Jelani Nelson, Chris Schwiegelshohn
In this work, we revisit sparse embeddings and identify a loophole in the lower bound.
no code implementations • 27 Jan 2023 • Mikael Møller Høgsgaard, Kasper Green Larsen, Martin Ritzert
AdaBoost is a classic boosting algorithm that combines multiple inaccurate classifiers produced by a weak learner into a strong learner with arbitrarily high accuracy, given enough training data.
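The classic AdaBoost loop described above can be sketched as follows. This is a minimal illustrative implementation of standard AdaBoost (Freund–Schapire), not the variant analyzed in the paper; the decision-stump weak learner `stump_learner` is a hypothetical helper added for the example.

```python
import numpy as np

def stump_learner(X, y, w):
    """Hypothetical weak learner: the best weighted decision stump."""
    best_err, best = np.inf, None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (1, -1):
                pred = np.where(X[:, j] >= t, s, -s)
                err = np.sum(w * (pred != y))
                if err < best_err:
                    best_err, best = err, (j, t, s)
    j, t, s = best
    return lambda Z, j=j, t=t, s=s: np.where(Z[:, j] >= t, s, -s)

def adaboost(X, y, weak_learner, T):
    """Classic AdaBoost; labels y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)              # start with uniform sample weights
    hs, alphas = [], []
    for _ in range(T):
        h = weak_learner(X, y, w)
        pred = h(X)
        err = np.sum(w * (pred != y))    # weighted training error
        if err >= 0.5:                   # weak learner no better than random
            break
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
        w *= np.exp(-alpha * y * pred)   # up-weight the mistakes
        w /= w.sum()
        hs.append(h)
        alphas.append(alpha)
    # weighted majority vote over the collected weak classifiers
    return lambda Z: np.sign(sum(a * h(Z) for a, h in zip(alphas, hs)))
```

Usage: `adaboost(X, y, stump_learner, T=5)` returns a voting classifier whose training error drops exponentially in `T` as long as each stump beats random guessing.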
1 code implementation • 4 Apr 2022 • Ora Nova Fandina, Mikael Møller Høgsgaard, Kasper Green Larsen
In this work, we give a surprising new analysis of the Fast JL transform, showing that the $k \ln^2 n$ term in the embedding time can be improved to $(k \ln^2 n)/\alpha$ for an $\alpha = \Omega(\min\{\varepsilon^{-1}\ln(1/\varepsilon), \ln n\})$.
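For context, the transform whose embedding time is analyzed above is the standard Fast JL transform, $x \mapsto P H D x$: a random sign diagonal $D$, a Walsh–Hadamard transform $H$, and a coordinate-sampling matrix $P$. The sketch below is an illustrative implementation of that standard construction (with uniform sampling of $k$ coordinates), not the paper's refined analysis; it assumes the input dimension is a power of two.

```python
import numpy as np

def fast_jl(x, k, rng=None):
    """Sketch of the standard Fast JL transform: x -> P H D x.

    D flips signs at random, H is the orthonormal Walsh-Hadamard
    transform (computed in O(n log n)), and P samples k coordinates.
    Illustrative only; assumes len(x) is a power of two.
    """
    rng = rng or np.random.default_rng(0)
    n = len(x)
    d = rng.choice([-1.0, 1.0], n)       # random sign diagonal D
    z = (d * x).copy()
    # iterative in-place fast Walsh-Hadamard transform
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            a = z[i:i + h].copy()
            b = z[i + h:i + 2 * h].copy()
            z[i:i + h] = a + b
            z[i + h:i + 2 * h] = a - b
        h *= 2
    z /= np.sqrt(n)                      # orthonormal scaling: ||HDx|| = ||x||
    idx = rng.choice(n, size=k, replace=False)  # subsample k coordinates
    return np.sqrt(n / k) * z[idx]       # rescale so norms are preserved in expectation
```

Since $HD$ is orthogonal, taking `k = n` preserves the norm exactly; for `k < n` the squared norm is preserved in expectation, which is what the JL guarantee quantifies.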