1 code implementation • 2 Apr 2024 • Andrew Draganov, David Saulpic, Chris Schwiegelshohn
We study the theoretical and practical runtime limits of k-means and k-median clustering on large datasets.
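As background on the k-means objective this entry studies (this is an illustrative sketch, not the paper's method; the function name and initialization choice are ours), Lloyd's algorithm alternates between assigning points to their nearest center and moving each center to the mean of its cluster:

```python
import numpy as np

def kmeans(X, k, n_iters=50):
    """Lloyd's algorithm: assign each point to its nearest center,
    then move each center to the mean of its assigned points."""
    # farthest-first initialization: start from X[0], then repeatedly
    # take the point farthest from all centers chosen so far
    centers = [X[0]]
    for _ in range(k - 1):
        d2 = ((X[:, None, :] - np.array(centers)[None, :, :]) ** 2).sum(-1).min(axis=1)
        centers.append(X[d2.argmax()])
    centers = np.array(centers)

    for _ in range(n_iters):
        # squared Euclidean distance from every point to every center
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d2.argmin(axis=1)
        new_centers = np.array([
            X[labels == j].mean(axis=0) if (labels == j).any() else centers[j]
            for j in range(k)
        ])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    # k-means objective: total squared distance to the nearest center
    cost = d2.min(axis=1).sum()
    return centers, labels, cost

# two well-separated Gaussian blobs; k=2 should recover them
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (50, 2)),
               rng.normal(5.0, 0.1, (50, 2))])
centers, labels, cost = kmeans(X, k=2)
```

Each Lloyd iteration costs O(nk) distance computations, which is exactly the kind of per-iteration work whose large-dataset limits the entry above investigates.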
no code implementations • 20 Jun 2023 • Andrew Draganov, Simon Dohn
It has become standard to explain neural network latent spaces with attraction/repulsion dimensionality reduction (ARDR) methods like tSNE and UMAP.
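For readers unfamiliar with the attraction/repulsion framing, here is a toy sketch of the general ARDR idea (our own hypothetical example, not tSNE, UMAP, or the method of this entry): neighbor pairs attract in the low-dimensional embedding, all pairs weakly repel, and gradient steps balance the two forces.

```python
import numpy as np

def ardr_embed(neighbors, n_points, dim=2, lr=0.05, n_iters=300, seed=0):
    """Toy attraction/repulsion layout: attractive forces act along
    neighbor edges; a weaker repulsive force acts between all pairs."""
    rng = np.random.default_rng(seed)
    Y = rng.normal(scale=0.1, size=(n_points, dim))
    for _ in range(n_iters):
        diff = Y[:, None, :] - Y[None, :, :]       # pairwise displacements
        d2 = (diff ** 2).sum(-1) + 1e-4            # softened squared distances
        attract = neighbors[:, :, None] * diff     # pulls neighbors together
        repel = diff / d2[:, :, None]              # pushes every pair apart
        Y -= lr * (attract - 0.01 * repel).sum(axis=1)
    return Y

# two 4-cliques: points 0-3 are mutual neighbors, as are points 4-7
A = np.zeros((8, 8))
A[:4, :4] = 1
A[4:, 4:] = 1
np.fill_diagonal(A, 0)
Y = ardr_embed(A, 8)
```

After the forces equilibrate, points within a clique sit close together while the two cliques drift apart, which is the qualitative behavior tSNE- and UMAP-style embeddings exhibit on clustered data.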
1 code implementation • 12 May 2023 • Andrew Draganov, Jakob Rødsgaard Jørgensen, Katrine Scheel Nellemann, Davide Mottin, Ira Assent, Tyrus Berry, Cigdem Aslay
tSNE and UMAP are popular dimensionality reduction algorithms due to their speed and interpretable low-dimensional embeddings.
1 code implementation • 20 Jun 2022 • Andrew Draganov, Tyrus Berry, Jakob Rødsgaard Jørgensen, Katrine Scheel Nellemann, Ira Assent, Davide Mottin
In this work, we show that this is indeed possible by combining the two approaches into a single method.
no code implementations • 8 May 2021 • Carter N. Brown, Enrico Mattei, Andrew Draganov
We introduce the Lie group of transformations that a signal experiences under the multipath propagation model, and define operations that are equivariant and invariant to the frequency response of a Finite Impulse Response (FIR) filter, which we use to build a ChaRRNet.
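As background on the frequency response this entry builds on (a standard DSP fact, not the ChaRRNet construction itself; the helper name is ours), an FIR filter is defined by finitely many taps, its frequency response is the DFT of those taps, and circular convolution with the filter multiplies a signal's spectrum pointwise by that response:

```python
import numpy as np

def fir_frequency_response(taps, n_fft=64):
    """Frequency response of an FIR filter: the DFT of its (zero-padded) taps."""
    return np.fft.rfft(taps, n=n_fft)

taps = np.array([0.25, 0.5, 0.25])   # a simple low-pass filter
H = fir_frequency_response(taps)      # H[0] is the DC gain, sum(taps) = 1.0

# convolution theorem: filtering in the frequency domain equals
# circular convolution with the taps in the time domain
rng = np.random.default_rng(0)
x = rng.normal(size=64)
y = np.fft.irfft(np.fft.rfft(x) * H, n=64)
```

Operations that are invariant to `H` in this sense are what allow a network to ignore the unknown channel while staying sensitive to the signal itself.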