1 code implementation • 3 Feb 2022 • Tal Shnitzer, Mikhail Yurochkin, Kristjan Greenewald, Justin Solomon
We use manifold learning to compare the intrinsic geometric structures of different datasets by comparing their diffusion operators: symmetric positive-definite (SPD) matrices that approximate the continuous Laplace-Beltrami operator from discrete samples.
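As a rough illustration of this construction, the sketch below builds a diffusion operator for each dataset from a Gaussian kernel with symmetric normalization and compares the two SPD matrices with a log-Euclidean distance. The bandwidth heuristic, function names, and toy data are assumptions for illustration, not the paper's exact pipeline.

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.linalg import eigh

def diffusion_operator(X, eps=None):
    """SPD diffusion operator from samples X (n x d): Gaussian kernel with
    symmetric normalization, approximating the Laplace-Beltrami operator."""
    D2 = cdist(X, X, "sqeuclidean")
    if eps is None:
        eps = np.median(D2)  # simple bandwidth heuristic (an assumption)
    K = np.exp(-D2 / eps)
    d = K.sum(axis=1)
    return K / np.sqrt(np.outer(d, d))

def spd_log(A):
    """Matrix logarithm of an SPD matrix via eigendecomposition."""
    w, V = eigh(A)
    return (V * np.log(np.clip(w, 1e-12, None))) @ V.T

def log_euclidean_distance(A, B):
    """Log-Euclidean distance between SPD matrices: ||log(A) - log(B)||_F."""
    return np.linalg.norm(spd_log(A) - spd_log(B), "fro")

# Toy comparison of two datasets with the same number of samples.
rng = np.random.default_rng(0)
X1 = rng.standard_normal((100, 3))
X2 = rng.standard_normal((100, 3)) + 0.5
dist = log_euclidean_distance(diffusion_operator(X1), diffusion_operator(X2))
print(f"log-Euclidean distance between diffusion operators: {dist:.3f}")
```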
no code implementations • 21 Jan 2022 • Tal Shnitzer, Hau-Tieng Wu, Ronen Talmon
Our approach combines three components that are often considered separately: (i) manifold learning for building operators representing the geometry of the variables, (ii) Riemannian geometry of symmetric positive-definite matrices for multiscale composition of operators corresponding to different time samples, and (iii) spectral analysis of the composite operators for extracting different dynamic modes.
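A minimal sketch of the three components on toy data, assuming a Gaussian-kernel operator over the variables in each time window and a log-Euclidean mean as the Riemannian composition; the window sizes, bandwidth heuristic, and function names are illustrative assumptions, not the paper's construction.

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.linalg import eigh

def diffusion_operator(X, eps=None):
    """(i) Manifold learning: SPD operator over the rows of X (here, the variables)."""
    D2 = cdist(X, X, "sqeuclidean")
    if eps is None:
        eps = np.median(D2)  # bandwidth heuristic (an assumption)
    K = np.exp(-D2 / eps)
    d = K.sum(axis=1)
    return K / np.sqrt(np.outer(d, d))

def spd_log(A):
    w, V = eigh(A)
    return (V * np.log(np.clip(w, 1e-12, None))) @ V.T

def spd_exp(S):
    w, V = eigh(S)
    return (V * np.exp(w)) @ V.T

def log_euclidean_mean(ops):
    """(ii) Riemannian composition: log-Euclidean mean of per-window SPD operators."""
    return spd_exp(np.mean([spd_log(A) for A in ops], axis=0))

# Toy multivariate signal: 3 time windows, 200 time samples of 20 variables each.
rng = np.random.default_rng(0)
windows = [rng.standard_normal((200, 20)) for _ in range(3)]
ops = [diffusion_operator(W.T) for W in windows]  # one 20x20 operator per window
composite = log_euclidean_mean(ops)

# (iii) Spectral analysis: leading eigenvectors of the composite operator serve
# as the extracted modes over the variables in this toy example.
evals, evecs = eigh(composite)
modes = evecs[:, ::-1][:, :3]
print("leading eigenvalues:", np.round(evals[::-1][:3], 3))
```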
no code implementations • 18 Jul 2022 • David Cohen, Tal Shnitzer, Yuval Kluger, Ronen Talmon
This in turn allows the extraction of the hidden manifold underlying the features and avoids overfitting, facilitating few-sample feature selection (FS).
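A loose sketch of the idea on synthetic data: affinities are computed between features rather than samples, and features are scored by their weight in the leading eigenvectors of the difference between the two per-class feature kernels. The kernel choice, scoring rule, and function names are assumptions for illustration, not the paper's exact method.

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.linalg import eigh

def feature_kernel(X, eps=None):
    """Gaussian affinity between *features*: each feature is its vector of values
    across the (few) samples, i.e., a column of X."""
    D2 = cdist(X.T, X.T, "sqeuclidean")
    if eps is None:
        eps = np.median(D2) + 1e-12  # bandwidth heuristic (an assumption)
    return np.exp(-D2 / eps)

def feature_scores(X_pos, X_neg, n_modes=5):
    """Score features by their weight in the leading eigenvectors of the difference
    between the two per-class feature kernels (a rough two-class contrast)."""
    D = feature_kernel(X_pos) - feature_kernel(X_neg)  # symmetric, indefinite
    w, V = eigh(D)
    order = np.argsort(-np.abs(w))[:n_modes]
    return (V[:, order] ** 2 * np.abs(w[order])).sum(axis=1)

# Few-sample setting: 10 samples per class, 50 features, only the first 5 informative.
rng = np.random.default_rng(0)
X_pos = rng.standard_normal((10, 50))
X_pos[:, :5] += 2.0
X_neg = rng.standard_normal((10, 50))
scores = feature_scores(X_pos, X_neg)
print("top-5 scored features:", np.argsort(-scores)[:5])
```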
no code implementations • 27 Sep 2023 • Tal Shnitzer, Anthony Ou, Mírian Silva, Kate Soule, Yuekai Sun, Justin Solomon, Neil Thompson, Mikhail Yurochkin
There is a rapidly growing number of open-source Large Language Models (LLMs) and benchmark datasets to compare them.
no code implementations • 1 Oct 2023 • Dustin Klebe, Tal Shnitzer, Mikhail Yurochkin, Leonid Karlinsky, Justin Solomon
We introduce a semi-supervised Geometrically Regularized Alignment (GeRA) method to align the embedding spaces of pretrained unimodal encoders in a label-efficient way.
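As a simplified stand-in for the idea (not GeRA itself), the sketch below fits a linear map from one embedding space to another using a small set of paired examples, with a graph-Laplacian term on unpaired source embeddings acting as a crude geometric regularizer; GeRA's actual aligner and diffusion-geometry regularizer differ, and all names and hyperparameters here are assumptions.

```python
import numpy as np
from scipy.spatial.distance import cdist

def knn_laplacian(X, k=10):
    """Unnormalized graph Laplacian of a symmetrized kNN graph over the rows of X."""
    D2 = cdist(X, X, "sqeuclidean")
    n = len(X)
    Adj = np.zeros((n, n))
    nbrs = np.argsort(D2, axis=1)[:, 1:k + 1]           # skip self at column 0
    Adj[np.repeat(np.arange(n), k), nbrs.ravel()] = 1.0
    Adj = np.maximum(Adj, Adj.T)                         # symmetrize
    return np.diag(Adj.sum(axis=1)) - Adj

def fit_aligner(A_paired, B_paired, A_unpaired, lam=0.01, ridge=1e-3, k=10):
    """Closed-form linear aligner: least squares on the few paired embeddings plus a
    graph-Laplacian smoothness penalty on the mapped unpaired source embeddings."""
    Ap, Bp, Au = A_paired.T, B_paired.T, A_unpaired.T    # columns are embeddings
    L = knn_laplacian(A_unpaired, k)
    n_p, n_u = Ap.shape[1], Au.shape[1]
    G = Ap @ Ap.T / n_p + lam * (Au @ L @ Au.T) / n_u + ridge * np.eye(Ap.shape[0])
    return (Bp @ Ap.T / n_p) @ np.linalg.inv(G)

# Toy setup: the "other modality" embeddings are a rotation of the source plus noise.
rng = np.random.default_rng(0)
d = 16
R, _ = np.linalg.qr(rng.standard_normal((d, d)))
A = rng.standard_normal((500, d))
B = A @ R.T + 0.01 * rng.standard_normal((500, d))
W = fit_aligner(A[:20], B[:20], A[20:])                  # only 20 labeled pairs
err = np.linalg.norm(A[20:] @ W.T - B[20:]) / np.linalg.norm(B[20:])
print(f"relative alignment error on unpaired data: {err:.3f}")
```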