1 code implementation • 22 Feb 2024 • Ifeoma Veronica Nwabufo, Jan Niklas Böhm, Philipp Berens, Dmitry Kobak
Self-supervised learning methods based on data augmentations, such as SimCLR, BYOL, or DINO, yield semantically meaningful representations of image datasets and are widely used prior to supervised fine-tuning.
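SimCLR-style methods train by pulling together embeddings of two augmented views of the same image and pushing apart embeddings of different images, via an InfoNCE-type contrastive loss. As a rough illustration (not the paper's implementation), a minimal numpy sketch of that loss on toy embeddings:

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.5):
    """InfoNCE contrastive loss for a batch of paired-view embeddings.

    z1, z2: (n, d) arrays; row i of z1 and row i of z2 are embeddings of
    two augmentations of the same image (the positive pair)."""
    # L2-normalize so the dot product is cosine similarity
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    z = np.concatenate([z1, z2], axis=0)            # (2n, d)
    sim = z @ z.T / temperature                     # all pairwise similarities
    n = z1.shape[0]
    # positive partner of row i is row i+n, and vice versa
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    np.fill_diagonal(sim, -np.inf)                  # exclude self-similarity
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()  # cross-entropy vs. positives
```

The loss is small when each embedding is most similar to its augmented counterpart; all other samples in the batch act as implicit negatives.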
1 code implementation • 18 Oct 2022 • Jan Niklas Böhm, Philipp Berens, Dmitry Kobak
This problem can be circumvented by self-supervised contrastive learning approaches such as SimCLR, which rely on data augmentation to generate implicit neighbors; however, these methods do not produce two-dimensional embeddings suitable for visualization.
2 code implementations • 3 Jun 2022 • Sebastian Damrich, Jan Niklas Böhm, Fred A. Hamprecht, Dmitry Kobak
We exploit this new conceptual connection to propose and implement a generalization of negative sampling, allowing us to interpolate between (and even extrapolate beyond) $t$-SNE and UMAP and their respective embeddings.
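Negative sampling in this context means that each optimization step attracts a point toward one of its graph neighbors and repels it from a handful of randomly sampled points. The sketch below is only a schematic illustration of that attraction/repulsion scheme with a Cauchy kernel, not the paper's actual t-SNE/UMAP interpolation; the function name and update rule are simplified inventions:

```python
import numpy as np

def neg_sampling_step(Y, edges, n_neg=5, lr=0.1, rng=None):
    """One illustrative negative-sampling update of a 2D embedding Y.

    Y: (n, 2) embedding coordinates, modified in place.
    edges: iterable of (i, j) index pairs from the kNN graph."""
    rng = np.random.default_rng() if rng is None else rng
    n = Y.shape[0]
    for i, j in edges:
        d = Y[i] - Y[j]
        w = 1.0 / (1.0 + d @ d)       # Cauchy similarity, as in t-SNE/UMAP
        Y[i] -= lr * w * d            # attraction along the kNN edge
        Y[j] += lr * w * d
        for k in rng.integers(0, n, size=n_neg):
            d = Y[i] - Y[k]
            w = 1.0 / (1.0 + d @ d)
            Y[i] += lr * w * w * d    # repulsion from a random negative
    return Y
```

Varying how the repulsive term is weighted relative to the attractive one is, loosely, the knob that moves an embedding along a t-SNE-to-UMAP-like spectrum.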
1 code implementation • 17 Jul 2020 • Jan Niklas Böhm, Philipp Berens, Dmitry Kobak
Neighbor embeddings are a family of methods for visualizing complex high-dimensional datasets using $k$NN graphs.
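The starting point for any neighbor embedding is the $k$NN graph of the high-dimensional data. A minimal numpy sketch of building that graph with exact Euclidean distances (the helper name is our own; real pipelines use approximate nearest-neighbor search for large $n$):

```python
import numpy as np

def knn_graph(X, k=5):
    """Return, for each row of X, the indices of its k nearest neighbors
    under Euclidean distance, excluding the point itself."""
    # squared pairwise distances via |a - b|^2 = |a|^2 - 2 a.b + |b|^2
    sq = (X ** 2).sum(axis=1)
    d2 = sq[:, None] - 2 * X @ X.T + sq[None, :]
    np.fill_diagonal(d2, np.inf)          # a point is not its own neighbor
    return np.argsort(d2, axis=1)[:, :k]  # (n, k) neighbor indices
```

On two well-separated clusters, every point's neighbors come from its own cluster, which is exactly the structure the embedding methods then try to preserve in 2D.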