no code implementations • 3 Aug 2024 • Nicholas Karris, Evangelos A. Nikitopoulos, Ioannis Kevrekidis, Seungjoon Lee, Alexander Cloninger

By allowing this limiting behavior to emerge, we obtain optimal transport maps that closely approximate the vector field describing the smooth evolution of the bulk distribution, rather than the more chaotic movements of individual particles.

1 code implementation • 5 Jul 2024 • Varun Khurana, Xiuyuan Cheng, Alexander Cloninger

In particular, we derive the theoretical minimum training time needed to ensure that the NTK two-sample test detects a given level of deviation between the two datasets.

1 code implementation • 25 Jun 2024 • Andrew Dennehy, Xiaoyu Zou, Shabnam J. Semnani, Yuri Fialko, Alexander Cloninger

DBSCAN and OPTICS are powerful algorithms for identifying clusters of points in domains where few assumptions can be made about the structure of the data.
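As a hedged sketch of the density-based idea behind DBSCAN (not the paper's code), the following minimal pure-Python implementation clusters a toy point set; `eps`, `min_pts`, and the sample points are illustrative choices:

```python
import math

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: label each point with a cluster id, or -1 for noise."""
    n = len(points)
    labels = [None] * n  # None = unvisited
    def neighbors(i):
        return [j for j in range(n)
                if math.dist(points[i], points[j]) <= eps]
    cluster = 0
    for i in range(n):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:
            labels[i] = -1  # noise (may be claimed as a border point later)
            continue
        labels[i] = cluster
        seeds = list(nbrs)
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster        # former noise becomes a border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            j_nbrs = neighbors(j)
            if len(j_nbrs) >= min_pts:     # core point: keep expanding
                seeds.extend(j_nbrs)
        cluster += 1
    return labels

# Two dense blobs plus one far-away outlier:
pts = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1),
       (5.0, 5.0), (5.1, 5.0), (5.0, 5.1),
       (20.0, 20.0)]
labels = dbscan(pts, eps=0.5, min_pts=3)
```

Each blob lands in its own cluster and the outlier is labeled noise, all without specifying the number of clusters in advance.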

no code implementations • 23 Apr 2024 • Sawyer Robertson, Zhengchao Wan, Alexander Cloninger

The fields of effective resistance and optimal transport on graphs are filled with rich connections to combinatorics, geometry, machine learning, and beyond.

no code implementations • 4 Mar 2024 • Alireza Pirhadi, Mohammad Hossein Moslemi, Alexander Cloninger, Mostafa Milani, Babak Salimi

Ensuring Conditional Independence (CI) constraints is pivotal for the development of fair and trustworthy machine learning models.

no code implementations • 2 Jan 2024 • Scott Mahan, Caroline Moosmüller, Alexander Cloninger

Our approach is motivated by the observation that, under certain assumptions, the $L^2$-distances between optimal transport maps that send a shared fixed reference distribution to distinct point clouds approximate the Wasserstein-2 distances between those point clouds.
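In one dimension this approximation is in fact exact, which gives a compact sanity check. The sketch below (illustrative, not the paper's method) uses the fact that the 1D optimal transport map from a fixed reference to a sample is the monotone sorted pairing; the sample sizes and distributions are arbitrary choices:

```python
import math
import random

random.seed(0)
n = 2000
# Two point clouds drawn from N(2,1) and N(-1,1); their population
# Wasserstein-2 distance is exactly 3 (same shape, means 3 apart).
a = sorted(random.gauss(2.0, 1.0) for _ in range(n))
b = sorted(random.gauss(-1.0, 1.0) for _ in range(n))

# Fix a reference distribution (here the uniform grid of quantile levels).
# In 1D the optimal transport map from the reference to a cloud is the
# quantile function, so each map is simply the sorted sample itself:
T_a, T_b = a, b  # the two (hypothetical) LOT embeddings

# L2 distance between the two maps with respect to the reference measure:
lot_dist = math.sqrt(sum((x - y) ** 2 for x, y in zip(T_a, T_b)) / n)
```

`lot_dist` comes out near 3, matching the Wasserstein-2 distance without solving any pairwise transport problem between the clouds themselves.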

no code implementations • 31 Jul 2023 • Chester Holtz, PengWen Chen, Alexander Cloninger, Chung-Kuan Cheng, Gal Mishne

Motivated by the need to address the degeneracy of canonical Laplace learning algorithms in low label rates, we propose to reformulate graph-based semi-supervised learning as a nonconvex generalization of a \emph{Trust-Region Subproblem} (TRS).

no code implementations • 27 Jun 2023 • Robi Bhattacharjee, Alexander Cloninger, Yoav Freund, Andreas Oslandsbotn

One attractive application of effective resistance (ER) is to point clouds, i.e., graphs whose vertices correspond to IID samples from a distribution over a metric space.

no code implementations • 14 Feb 2023 • Alexander Cloninger, Keaton Hamm, Varun Khurana, Caroline Moosmüller

We introduce LOT Wassmap, a computationally feasible algorithm to uncover low-dimensional structures in the Wasserstein space.

no code implementations • 4 Oct 2022 • Chester Holtz, Gal Mishne, Alexander Cloninger

Probabilistic generative models provide a flexible and systematic framework for learning the underlying geometry of data.

no code implementations • 22 Aug 2022 • Stefan C. Schonsheck, Scott Mahan, Timo Klock, Alexander Cloninger, Rongjie Lai

Our numerical experiments on synthetic and real-world data verify that the proposed model can effectively handle data lying on nearby but disjoint manifolds of different classes, overlapping manifolds, and manifolds with non-trivial topology.

no code implementations • 25 Jan 2022 • Varun Khurana, Harish Kannan, Alexander Cloninger, Caroline Moosmüller

In this paper we study supervised learning tasks on the space of probability measures.

no code implementations • 29 Sep 2021 • Srinjoy Das, Hrushikesh Mhaskar, Alexander Cloninger

Applications are demonstrated for clustering of synthetic and real-life time series and image data, and the performance of kdiff is compared to competing distance measures for clustering.

no code implementations • 23 Aug 2021 • Andreas Oslandsbotn, Zeljko Kereta, Valeriya Naumova, Yoav Freund, Alexander Cloninger

With a novel sub-sampling scheme, StreaMRAK reduces memory and computational complexities by creating a sketch of the original data, where the sub-sampling density is adapted to the bandwidth of the kernel and the local dimensionality of the data.

no code implementations • 4 Jun 2021 • Jinjie Zhang, Harish Kannan, Alexander Cloninger, Rayan Saab

We propose the use of low bit-depth Sigma-Delta and distributed noise-shaping methods for quantizing the Random Fourier features (RFFs) associated with shift-invariant kernels.
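As a rough illustration of the two ingredients (not the paper's reconstruction pipeline, which uses a more involved noise-shaping decoder), the sketch below builds plain RFFs for the Gaussian kernel and runs them through a first-order Sigma-Delta quantizer; the dimensions and test points are arbitrary:

```python
import math
import random

random.seed(1)
d, D = 3, 4000  # input dimension, number of random features

# Random Fourier features for the Gaussian kernel exp(-||x-y||^2 / 2):
W = [[random.gauss(0, 1) for _ in range(d)] for _ in range(D)]
b = [random.uniform(0, 2 * math.pi) for _ in range(D)]

def rff(x):
    """Feature map whose inner products approximate the Gaussian kernel."""
    return [math.sqrt(2.0 / D) * math.cos(sum(wi * xi for wi, xi in zip(w, x)) + bi)
            for w, bi in zip(W, b)]

def sigma_delta(y):
    """First-order Sigma-Delta quantization of a feature sequence to +/-1."""
    u, q = 0.0, []
    for yi in y:
        qi = 1.0 if yi + u >= 0 else -1.0
        u += yi - qi          # state accumulates the quantization error
        q.append(qi)
    return q

x, y = [0.2, -0.1, 0.4], [0.1, 0.3, -0.2]
approx = sum(a_ * b_ for a_, b_ in zip(rff(x), rff(y)))
exact = math.exp(-sum((xi - yi) ** 2 for xi, yi in zip(x, y)) / 2)
q = sigma_delta(rff(x))       # one-bit representation of the feature vector
```

The unquantized features already match the exact kernel value closely at this feature count; the point of the Sigma-Delta scheme is that the one-bit sequence `q` retains far more of that accuracy than naive rounding would.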

1 code implementation • ICLR Workshop GTRL 2021 • Dhruv Kohli, Alexander Cloninger, Gal Mishne

We present Low Distortion Local Eigenmaps (LDLE), a manifold learning technique which constructs a set of low distortion local views of a dataset in lower dimension and registers them to obtain a global embedding.

1 code implementation • 18 Sep 2020 • Alexander Cloninger, Haotian Li, Naoki Saito

We introduce a set of novel multiscale basis transforms for signals on graphs that utilize their "dual" domains by incorporating the "natural" distances between graph Laplacian eigenvectors, rather than simply using the eigenvalue ordering.

no code implementations • 20 Aug 2020 • Caroline Moosmüller, Alexander Cloninger

The transform is defined by computing the optimal transport of each distribution to a fixed reference distribution, and has a number of benefits when it comes to speed of computation and to determining classification boundaries.

no code implementations • 6 Aug 2020 • Alexander Cloninger, Timo Klock

We study the approximation of two-layer compositions $f(x) = g(\phi(x))$ via deep networks with ReLU activation, where $\phi$ is a geometrically intuitive, dimensionality reducing feature map.

no code implementations • 3 Aug 2020 • Alexander Cloninger, Hrushikesh Mhaskar

Our approach is to consider the unknown probability measure as a convex combination of the conditional probabilities for each class.
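A minimal sketch of this decomposition, with hypothetical priors and Gaussian class conditionals, shows the marginal density and class posteriors falling out of the same convex combination:

```python
import math

def gauss_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Hypothetical two-class setup: class priors and class-conditional densities.
priors = {"a": 0.3, "b": 0.7}
cond = {"a": lambda x: gauss_pdf(x, -2.0, 1.0),
        "b": lambda x: gauss_pdf(x, 2.0, 1.0)}

def marginal(x):
    """The unknown measure written as a convex combination of conditionals."""
    return sum(priors[c] * cond[c](x) for c in priors)

def posterior(c, x):
    """Class membership probability recovered from the same decomposition."""
    return priors[c] * cond[c](x) / marginal(x)

# The convex combination is itself a probability density:
grid = [-10.0 + 0.01 * k for k in range(2001)]
integral = sum(marginal(x) for x in grid) * 0.01
```

Deep in class "a"'s region (e.g. at $x = -2$) the posterior for "a" is close to 1, and the two posteriors sum to 1 everywhere by construction.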

no code implementations • 28 Oct 2019 • Alexander Potapov, Ian Colbert, Ken Kreutz-Delgado, Alexander Cloninger, Srinjoy Das

Stochastic-sampling-based Generative Neural Networks, such as Restricted Boltzmann Machines and Generative Adversarial Networks, are now used for applications such as denoising, image occlusion removal, pattern completion, and motion synthesis.

1 code implementation • 25 Sep 2019 • Xiuyuan Cheng, Alexander Cloninger

The recent success of generative adversarial networks and variational learning suggests training a classifier network may work well in addressing the classical two-sample problem.

no code implementations • 3 Jun 2019 • Saeed Vahidian, Baharan Mirzasoleiman, Alexander Cloninger

In a number of situations, collecting a function value for every data point may be prohibitively expensive, and random sampling ignores any structure in the underlying data.

1 code implementation • ECCV 2020 • Henry Li, Ofir Lindenbaum, Xiuyuan Cheng, Alexander Cloninger

Variational autoencoders (VAEs) and generative adversarial networks (GANs) enjoy an intuitive connection to manifold learning: in training the decoder/generator is optimized to approximate a homeomorphism between the data distribution and the sampling space.

no code implementations • 11 Dec 2018 • Alexander Cloninger

In this paper, we bound the error induced by using a weighted skeletonization of two data sets for computing a two sample test with kernel maximum mean discrepancy.

no code implementations • 25 Apr 2018 • Alexander Cloninger, Stefan Steinerberger

We discuss the geometry of Laplacian eigenfunctions $-\Delta \phi = \lambda \phi$ on compact manifolds $(M, g)$ and combinatorial graphs $G=(V, E)$.

1 code implementation • 14 Sep 2017 • Xiuyuan Cheng, Alexander Cloninger, Ronald R. Coifman

The paper introduces a new kernel-based Maximum Mean Discrepancy (MMD) statistic for measuring the distance between two distributions given finitely many multivariate samples.
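The classical unbiased estimator of squared MMD that such statistics build on can be sketched in a few lines (one-dimensional samples and a Gaussian kernel here for simplicity; the paper's statistic modifies this baseline):

```python
import math
import random

def gauss_kernel(x, y, sigma=1.0):
    return math.exp(-((x - y) ** 2) / (2 * sigma ** 2))

def mmd2_unbiased(xs, ys, sigma=1.0):
    """Unbiased estimator of squared MMD between samples xs and ys."""
    m, n = len(xs), len(ys)
    k_xx = sum(gauss_kernel(a, b, sigma) for i, a in enumerate(xs)
               for j, b in enumerate(xs) if i != j) / (m * (m - 1))
    k_yy = sum(gauss_kernel(a, b, sigma) for i, a in enumerate(ys)
               for j, b in enumerate(ys) if i != j) / (n * (n - 1))
    k_xy = sum(gauss_kernel(a, b, sigma) for a in xs for b in ys) / (m * n)
    return k_xx + k_yy - 2 * k_xy

random.seed(2)
# Near zero when both samples come from the same distribution:
same = mmd2_unbiased([random.gauss(0, 1) for _ in range(200)],
                     [random.gauss(0, 1) for _ in range(200)])
# Clearly positive when the distributions differ:
diff = mmd2_unbiased([random.gauss(0, 1) for _ in range(200)],
                     [random.gauss(3, 1) for _ in range(200)])
```

A permutation test on this statistic then turns the gap between `same` and `diff` into a two-sample p-value.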

no code implementations • 3 Jul 2017 • Alexander Cloninger, Brita Roy, Carley Riley, Harlan M. Krumholz

We address the problem of defining a network graph on a large collection of classes.

no code implementations • 31 Oct 2016 • Alexander Cloninger

We consider the problem of constructing diffusion operators on high-dimensional data $X$ to address counterfactual functions $F$, such as individualized treatment effectiveness.

no code implementations • 15 Jul 2016 • Alexander Cloninger, Stefan Steinerberger

Spectral embedding uses eigenfunctions of the discrete Laplacian on a weighted graph to obtain coordinates for an embedding of an abstract data set into Euclidean space.
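A minimal sketch of this idea (not the paper's construction): build the Laplacian of a graph with two weakly joined clusters and recover a one-dimensional embedding from its second eigenvector via deflated power iteration, with the graph and shift `c` chosen for illustration:

```python
import math

# Two triangles {0,1,2} and {3,4,5} joined by a single bridge edge (2,3).
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
n = 6

# Unnormalized graph Laplacian L = D - A:
L = [[0.0] * n for _ in range(n)]
for i, j in edges:
    L[i][i] += 1; L[j][j] += 1
    L[i][j] -= 1; L[j][i] -= 1

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]

# Power iteration on (c*I - L), with the constant eigenvector (eigenvalue 0)
# projected out each step, converges to the eigenvector of the smallest
# nonzero Laplacian eigenvalue -- a one-dimensional spectral embedding.
c = 10.0
v = [float(i) for i in range(n)]
for _ in range(500):
    w = [c * vi - li for vi, li in zip(v, matvec(L, v))]
    mean = sum(w) / n
    w = [wi - mean for wi in w]              # deflate the constant direction
    norm = math.sqrt(sum(wi * wi for wi in w))
    v = [wi / norm for wi in w]

# v[i] is vertex i's coordinate on the line; the sign flips across the bridge.
```

The embedding separates the two triangles by sign, illustrating how Laplacian eigenvectors expose cluster structure.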

4 code implementations • 2 Jun 2016 • Jared Katzman, Uri Shaham, Jonathan Bates, Alexander Cloninger, Tingting Jiang, Yuval Kluger

We introduce DeepSurv, a Cox proportional hazards deep neural network and state-of-the-art survival method for modeling interactions between a patient's covariates and treatment effectiveness in order to provide personalized treatment recommendations.
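The Cox partial likelihood that such a network minimizes can be sketched directly; the toy cohort and risk scores below are illustrative, and ties in event times are ignored for simplicity:

```python
import math

def cox_neg_log_partial_likelihood(risk, times, events):
    """Negative Cox partial log-likelihood; risk[i] is the model's
    predicted log-risk for patient i, events[i] is 1 if an event was
    observed at times[i] and 0 if the patient was censored."""
    loss = 0.0
    for i in range(len(risk)):
        if not events[i]:
            continue  # censored patients contribute only through risk sets
        # Risk set: everyone still under observation at time times[i].
        log_denom = math.log(sum(math.exp(risk[j])
                                 for j in range(len(risk))
                                 if times[j] >= times[i]))
        loss -= risk[i] - log_denom
    return loss

# Toy cohort: assigning higher risk to patients with earlier events
# (concordant) yields a lower loss than the reversed assignment.
times = [1.0, 2.0, 3.0, 4.0]
events = [1, 1, 0, 1]
good = cox_neg_log_partial_likelihood([2.0, 1.0, 0.0, -1.0], times, events)
bad = cox_neg_log_partial_likelihood([-1.0, 0.0, 1.0, 2.0], times, events)
```

In a DeepSurv-style model the `risk` values would be the outputs of a neural network over patient covariates, trained by backpropagating through this loss.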

no code implementations • 24 Sep 2015 • Uri Shaham, Alexander Cloninger, Ronald R. Coifman

We discuss approximation of functions using deep neural nets.

no code implementations • 1 Jul 2015 • Alexander Cloninger, Ronald R. Coifman, Nicholas Downing, Harlan M. Krumholz

In this paper, we build an organization of high-dimensional datasets that cannot be cleanly embedded into a low-dimensional representation due to missing entries and a subset of the features being irrelevant to modeling functions of interest.

no code implementations • 25 Jun 2015 • Gal Mishne, Uri Shaham, Alexander Cloninger, Israel Cohen

In this paper, we propose a manifold learning algorithm based on deep learning to create an encoder, which maps a high-dimensional dataset to its low-dimensional embedding, and a decoder, which takes the embedded data back to the high-dimensional space.
