Search Results for author: Alexander Cloninger

Found 31 papers, 6 papers with code

All You Need is Resistance: On the Equivalence of Effective Resistance and Certain Optimal Transport Problems on Graphs

no code implementations • 23 Apr 2024 • Sawyer Robertson, Zhengchao Wan, Alexander Cloninger

The fields of effective resistance and optimal transport on graphs are filled with rich connections to combinatorics, geometry, machine learning, and beyond.

OTClean: Data Cleaning for Conditional Independence Violations using Optimal Transport

no code implementations • 4 Mar 2024 • Alireza Pirhadi, Mohammad Hossein Moslemi, Alexander Cloninger, Mostafa Milani, Babak Salimi

Ensuring Conditional Independence (CI) constraints is pivotal for the development of fair and trustworthy machine learning models.

Point Cloud Classification via Deep Set Linearized Optimal Transport

no code implementations • 2 Jan 2024 • Scott Mahan, Caroline Moosmüller, Alexander Cloninger

Our approach is motivated by the observation that, under certain assumptions, $L^2$-distances between optimal transport maps for distinct point clouds, originating from a shared fixed reference distribution, approximate the Wasserstein-2 distance between these point clouds.

Classification, Point Cloud Classification
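
Below is a minimal sketch of the linearized optimal transport idea this entry builds on, written with the POT library: compute OT maps from a fixed reference point cloud to two target clouds and compare the $L^2$-distance between the maps with the true Wasserstein-2 distance. The point clouds, sizes, and reference used here are illustrative assumptions, not the paper's Deep Set architecture or experiments.

```python
# Toy sketch of linearized optimal transport (LOT), not the paper's setup.
import numpy as np
import ot  # pip install pot

rng = np.random.default_rng(0)
ref = rng.normal(size=(200, 2))            # fixed reference point cloud
A = rng.normal(loc=2.0, size=(150, 2))     # point cloud 1
B = rng.normal(loc=2.2, size=(180, 2))     # point cloud 2

def lot_map(ref, target):
    """Barycentric projection of the OT plan from `ref` to `target`."""
    a = np.full(len(ref), 1.0 / len(ref))
    b = np.full(len(target), 1.0 / len(target))
    M = ot.dist(ref, target)               # squared Euclidean costs
    P = ot.emd(a, b, M)                    # optimal transport plan
    return (P @ target) / a[:, None]       # T(x_i) = sum_j P_ij y_j / a_i

TA, TB = lot_map(ref, A), lot_map(ref, B)

# L2 distance between the two maps (w.r.t. the reference measure) ...
lot_dist = np.sqrt(np.mean(np.sum((TA - TB) ** 2, axis=1)))
# ... compared with the true Wasserstein-2 distance between A and B.
MAB = ot.dist(A, B)
w2 = np.sqrt(ot.emd2(np.full(len(A), 1 / len(A)), np.full(len(B), 1 / len(B)), MAB))
print(f"LOT (linearized) distance: {lot_dist:.3f}   W2 distance: {w2:.3f}")
```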

Semi-Supervised Laplacian Learning on Stiefel Manifolds

no code implementations • 31 Jul 2023 • Chester Holtz, PengWen Chen, Alexander Cloninger, Chung-Kuan Cheng, Gal Mishne

Motivated by the need to address the degeneracy of canonical Laplace learning algorithms at low label rates, we propose to reformulate graph-based semi-supervised learning as a nonconvex generalization of a \emph{Trust-Region Subproblem} (TRS).
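
A minimal sketch of the canonical Laplace learning (harmonic-function) baseline whose low-label-rate degeneracy motivates this paper; the Trust-Region Subproblem reformulation itself is not reproduced. The dataset, graph construction, and label rate below are illustrative assumptions.

```python
# Toy sketch of canonical Laplace learning (label propagation via the graph Laplacian).
import numpy as np
from sklearn.datasets import make_moons
from sklearn.neighbors import kneighbors_graph

X, y = make_moons(n_samples=400, noise=0.1, random_state=0)
labeled = np.zeros(len(X), dtype=bool)
labeled[np.r_[np.where(y == 0)[0][:2], np.where(y == 1)[0][:2]]] = True  # very low label rate

W = kneighbors_graph(X, n_neighbors=10, mode='connectivity', include_self=False)
W = (0.5 * (W + W.T)).toarray()                      # symmetrized adjacency
L = np.diag(W.sum(axis=1)) - W                       # combinatorial graph Laplacian

# One-hot labels on the labeled set; solve L_uu F_u = -L_ul F_l for the rest.
F_l = np.eye(2)[y[labeled]]
L_uu = L[np.ix_(~labeled, ~labeled)]
L_ul = L[np.ix_(~labeled, labeled)]
F_u = np.linalg.solve(L_uu, -L_ul @ F_l)

pred = np.empty(len(X), dtype=int)
pred[labeled] = y[labeled]
pred[~labeled] = F_u.argmax(axis=1)
print("accuracy on unlabeled points:", (pred[~labeled] == y[~labeled]).mean())
```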

Effective resistance in metric spaces

no code implementations • 27 Jun 2023 • Robi Bhattacharjee, Alexander Cloninger, Yoav Freund, Andreas Oslandsbotn

One attractive application of ER is to point clouds, i.e., graphs whose vertices correspond to IID samples from a distribution over a metric space.
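
A minimal sketch of effective resistance on a point-cloud graph, computed from the pseudoinverse of the graph Laplacian via R(i, j) = (e_i - e_j)^T L^+ (e_i - e_j); the k-nearest-neighbor graph construction is an illustrative assumption, not the paper's setting.

```python
# Toy sketch of effective resistance on a graph built from IID samples.
import numpy as np
from sklearn.neighbors import kneighbors_graph

rng = np.random.default_rng(0)
X = rng.uniform(size=(300, 2))                       # IID samples from a distribution

W = kneighbors_graph(X, n_neighbors=8, mode='connectivity', include_self=False)
W = (0.5 * (W + W.T)).toarray()                      # symmetrized adjacency
L = np.diag(W.sum(axis=1)) - W                       # graph Laplacian
L_pinv = np.linalg.pinv(L)                           # Moore-Penrose pseudoinverse

def effective_resistance(i, j):
    e = np.zeros(len(X)); e[i], e[j] = 1.0, -1.0
    return float(e @ L_pinv @ e)

print("ER between vertices 0 and 1:", effective_resistance(0, 1))
```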

Linearized Wasserstein dimensionality reduction with approximation guarantees

no code implementations • 14 Feb 2023 • Alexander Cloninger, Keaton Hamm, Varun Khurana, Caroline Moosmüller

We introduce LOT Wassmap, a computationally feasible algorithm to uncover low-dimensional structures in the Wasserstein space.

Dimensionality Reduction

Evaluating Disentanglement in Generative Models Without Knowledge of Latent Factors

no code implementations • 4 Oct 2022 • Chester Holtz, Gal Mishne, Alexander Cloninger

Probabilistic generative models provide a flexible and systematic framework for learning the underlying geometry of data.

Disentanglement, Fairness +2

Semi-Supervised Manifold Learning with Complexity Decoupled Chart Autoencoders

no code implementations • 22 Aug 2022 • Stefan C. Schonsheck, Scott Mahan, Timo Klock, Alexander Cloninger, Rongjie Lai

Our numerical experiments on synthetic and real-world data verify that the proposed model can effectively handle data whose classes lie on nearby but disjoint manifolds, as well as overlapping manifolds and manifolds with non-trivial topology.

Representation Learning

Kernel distance measures for time series, random fields and other structured data

no code implementations • 29 Sep 2021 • Srinjoy Das, Hrushikesh Mhaskar, Alexander Cloninger

Applications are demonstrated for clustering of synthetic and real-life time series and image data, and the performance of kdiff is compared to competing distance measures for clustering.

Clustering, Time Series +1

StreaMRAK: A Streaming Multi-Resolution Adaptive Kernel Algorithm

no code implementations • 23 Aug 2021 • Andreas Oslandsbotn, Zeljko Kereta, Valeriya Naumova, Yoav Freund, Alexander Cloninger

With a novel sub-sampling scheme, StreaMRAK reduces memory and computational complexities by creating a sketch of the original data, where the sub-sampling density is adapted to the bandwidth of the kernel and the local dimensionality of the data.

Sigma-Delta and Distributed Noise-Shaping Quantization Methods for Random Fourier Features

no code implementations • 4 Jun 2021 • Jinjie Zhang, Harish Kannan, Alexander Cloninger, Rayan Saab

We propose the use of low bit-depth Sigma-Delta and distributed noise-shaping methods for quantizing the Random Fourier features (RFFs) associated with shift-invariant kernels.

Quantization
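
A minimal sketch of the pipeline this paper quantizes: random Fourier features for a Gaussian kernel followed by first-order Sigma-Delta quantization to one bit per feature. The final check only verifies the noise-shaping property (bounded running quantization error); the paper's kernel reconstruction from quantized features and its higher-order and distributed noise-shaping schemes are not reproduced.

```python
# Toy sketch (not the paper's experiments): RFFs for a Gaussian kernel, then
# first-order Sigma-Delta quantization of each feature vector to one bit per entry.
import numpy as np

rng = np.random.default_rng(0)
d, D, gamma = 5, 2048, 0.5                      # input dim, number of features, kernel width
Wmat = rng.normal(scale=np.sqrt(2 * gamma), size=(d, D))
b = rng.uniform(0, 2 * np.pi, size=D)

def rff(x):
    """Unquantized RFFs: k(x, y) ~ (2/D) * <cos(Wmat.T x + b), cos(Wmat.T y + b)>."""
    return np.cos(x @ Wmat + b)

def sigma_delta_1(y):
    """First-order Sigma-Delta: one bit per feature, error shaped through the state u."""
    q = np.empty_like(y)
    u = 0.0
    for i, yi in enumerate(y):
        q[i] = 1.0 if u + yi >= 0 else -1.0
        u = u + yi - q[i]                       # state stays bounded: |u| <= 1
    return q

x, y = rng.normal(size=d), rng.normal(size=d)
k_true = np.exp(-gamma * np.sum((x - y) ** 2))
k_rff = (2 / D) * rff(x) @ rff(y)
q = sigma_delta_1(rff(x))
max_running_err = np.abs(np.cumsum(rff(x) - q)).max()   # noise shaping: bounded partial sums
print(f"true kernel {k_true:.3f}, RFF estimate {k_rff:.3f}, "
      f"max |cumulative quantization error| {max_running_err:.3f}")
```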

LDLE: Low Distortion Local Eigenmaps

1 code implementation • ICLR Workshop GTRL 2021 • Dhruv Kohli, Alexander Cloninger, Gal Mishne

We present Low Distortion Local Eigenmaps (LDLE), a manifold learning technique which constructs a set of low distortion local views of a dataset in lower dimension and registers them to obtain a global embedding.

Natural Graph Wavelet Packet Dictionaries

1 code implementation • 18 Sep 2020 • Alexander Cloninger, Haotian Li, Naoki Saito

We introduce a set of novel multiscale basis transforms for signals on graphs that utilize their "dual" domains by incorporating the "natural" distances between graph Laplacian eigenvectors, rather than simply using the eigenvalue ordering.

Linear Optimal Transport Embedding: Provable Wasserstein classification for certain rigid transformations and perturbations

no code implementations • 20 Aug 2020 • Caroline Moosmüller, Alexander Cloninger

The transform is defined by computing the optimal transport of each distribution to a fixed reference distribution, and has a number of benefits when it comes to speed of computation and to determining classification boundaries.

General Classification

A deep network construction that adapts to intrinsic dimensionality beyond the domain

no code implementations • 6 Aug 2020 • Alexander Cloninger, Timo Klock

We study the approximation of two-layer compositions $f(x) = g(\phi(x))$ via deep networks with ReLU activation, where $\phi$ is a geometrically intuitive, dimensionality reducing feature map.

Cautious Active Clustering

no code implementations • 3 Aug 2020 • Alexander Cloninger, Hrushikesh Mhaskar

Our approach is to consider the unknown probability measure as a convex combination of the conditional probabilities for each class.

Classification, Clustering +1

PT-MMD: A Novel Statistical Framework for the Evaluation of Generative Systems

no code implementations • 28 Oct 2019 • Alexander Potapov, Ian Colbert, Ken Kreutz-Delgado, Alexander Cloninger, Srinjoy Das

Stochastic-sampling-based Generative Neural Networks, such as Restricted Boltzmann Machines and Generative Adversarial Networks, are now used for applications such as denoising, image occlusion removal, pattern completion, and motion synthesis.

Denoising, Model Selection +1

Classification Logit Two-sample Testing by Neural Networks

1 code implementation • 25 Sep 2019 • Xiuyuan Cheng, Alexander Cloninger

The recent success of generative adversarial networks and variational learning suggests training a classifier network may work well in addressing the classical two-sample problem.

Classification, General Classification +2
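
A minimal sketch of a classifier-logit two-sample test in the spirit of this paper: train a small network to distinguish the two samples, use the difference of mean held-out logits as the statistic, and calibrate it by permutation. The data, network size, and calibration are illustrative assumptions rather than the paper's exact statistic or thresholds.

```python
# Toy sketch of a classifier-based two-sample test with a logit statistic.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
P = rng.normal(0.0, 1.0, size=(500, 10))
Q = rng.normal(0.3, 1.0, size=(500, 10))           # small mean shift between samples

X = np.vstack([P, Q])
y = np.r_[np.zeros(len(P)), np.ones(len(Q))]
idx = rng.permutation(len(X))
tr, te = idx[:600], idx[600:]                       # train/test split

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(X[tr], y[tr])

# Held-out logits from the trained classifier.
p = np.clip(clf.predict_proba(X[te])[:, 1], 1e-6, 1 - 1e-6)
logits = np.log(p / (1 - p))

def logit_stat(te_labels):
    """Difference of mean logits between the two groups on the held-out split."""
    return logits[te_labels == 1].mean() - logits[te_labels == 0].mean()

t_obs = logit_stat(y[te])
null = np.array([logit_stat(rng.permutation(y[te])) for _ in range(200)])
p_value = (1 + np.sum(null >= t_obs)) / (1 + len(null))
print(f"observed statistic {t_obs:.3f}, permutation p-value {p_value:.3f}")
```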

Coresets for Estimating Means and Mean Square Error with Limited Greedy Samples

no code implementations • 3 Jun 2019 • Saeed Vahidian, Baharan Mirzasoleiman, Alexander Cloninger

In a number of situations, collecting a function value for every data point may be prohibitively expensive, and random sampling ignores any structure in the underlying data.

Clustering, Node Classification

Variational Diffusion Autoencoders with Random Walk Sampling

1 code implementation • ECCV 2020 • Henry Li, Ofir Lindenbaum, Xiuyuan Cheng, Alexander Cloninger

Variational autoencoders (VAEs) and generative adversarial networks (GANs) enjoy an intuitive connection to manifold learning: in training, the decoder/generator is optimized to approximate a homeomorphism between the data distribution and the sampling space.

Bounding the Error From Reference Set Kernel Maximum Mean Discrepancy

no code implementations • 11 Dec 2018 • Alexander Cloninger

In this paper, we bound the error induced by using a weighted skeletonization of two data sets for computing a two sample test with kernel maximum mean discrepancy.
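
A minimal sketch of the unbiased Gaussian-kernel MMD statistic underlying the two-sample test in this paper; the weighted skeletonization analyzed in the paper is not shown, and the bandwidth and data below are illustrative assumptions.

```python
# Toy sketch of the kernel maximum mean discrepancy (MMD) two-sample statistic.
import numpy as np

def mmd2_unbiased(X, Y, gamma):
    """Unbiased estimate of MMD^2 with a Gaussian kernel k(a, b) = exp(-gamma ||a - b||^2)."""
    def k(A, B):
        sq = np.sum(A ** 2, 1)[:, None] + np.sum(B ** 2, 1)[None, :] - 2 * A @ B.T
        return np.exp(-gamma * sq)
    Kxx, Kyy, Kxy = k(X, X), k(Y, Y), k(X, Y)
    n, m = len(X), len(Y)
    term_x = (Kxx.sum() - np.trace(Kxx)) / (n * (n - 1))   # drop diagonal for unbiasedness
    term_y = (Kyy.sum() - np.trace(Kyy)) / (m * (m - 1))
    return term_x + term_y - 2 * Kxy.mean()

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))
Y = rng.normal(loc=0.25, size=(300, 5))
print("MMD^2 estimate:", mmd2_unbiased(X, Y, gamma=0.1))
```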

On the Dual Geometry of Laplacian Eigenfunctions

no code implementations • 25 Apr 2018 • Alexander Cloninger, Stefan Steinerberger

We discuss the geometry of Laplacian eigenfunctions $-\Delta \phi = \lambda \phi$ on compact manifolds $(M, g)$ and combinatorial graphs $G=(V, E)$.

Two-sample Statistics Based on Anisotropic Kernels

1 code implementation • 14 Sep 2017 • Xiuyuan Cheng, Alexander Cloninger, Ronald R. Coifman

The paper introduces a new kernel-based Maximum Mean Discrepancy (MMD) statistic for measuring the distance between two distributions given finitely many multivariate samples.

Vocal Bursts Valence Prediction

Function Driven Diffusion for Personalized Counterfactual Inference

no code implementations • 31 Oct 2016 • Alexander Cloninger

We consider the problem of constructing diffusion operators on high-dimensional data $X$ to address counterfactual functions $F$, such as individualized treatment effectiveness.

Counterfactual, Counterfactual Inference

Spectral Echolocation via the Wave Embedding

no code implementations • 15 Jul 2016 • Alexander Cloninger, Stefan Steinerberger

Spectral embedding uses eigenfunctions of the discrete Laplacian on a weighted graph to obtain coordinates for an embedding of an abstract data set into Euclidean space.

Dimensionality Reduction, Position
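
A minimal sketch of the plain spectral embedding described above (Laplacian eigenmaps): build a Gaussian-weighted graph on the data, form the discrete Laplacian, and use its low-frequency eigenvectors as coordinates. The wave-based refinement the paper proposes is not reproduced; the dataset and bandwidth below are illustrative.

```python
# Toy sketch of spectral embedding via eigenvectors of the discrete graph Laplacian.
import numpy as np
from scipy.spatial.distance import cdist
from scipy.linalg import eigh

rng = np.random.default_rng(0)
t = rng.uniform(0, 3 * np.pi, 500)
X = np.c_[t * np.cos(t), t * np.sin(t)] + 0.1 * rng.normal(size=(500, 2))  # noisy spiral

D2 = cdist(X, X, metric='sqeuclidean')
eps = np.median(D2)
W = np.exp(-D2 / eps)                              # Gaussian-weighted graph
L = np.diag(W.sum(axis=1)) - W                     # discrete (combinatorial) Laplacian

# Eigenvectors of the Laplacian; skip the constant eigenvector, keep the next two.
vals, vecs = eigh(L)
embedding = vecs[:, 1:3]
print("embedding shape:", embedding.shape)
```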

DeepSurv: Personalized Treatment Recommender System Using A Cox Proportional Hazards Deep Neural Network

4 code implementations • 2 Jun 2016 • Jared Katzman, Uri Shaham, Jonathan Bates, Alexander Cloninger, Tingting Jiang, Yuval Kluger

We introduce DeepSurv, a Cox proportional hazards deep neural network and state-of-the-art survival method for modeling interactions between a patient's covariates and treatment effectiveness in order to provide personalized treatment recommendations.

Feature Engineering, Predicting Patient Outcomes +2
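
A minimal PyTorch sketch of the objective a DeepSurv-style model optimizes: a small network outputs one risk score per patient and is trained with the negative Cox partial log-likelihood. The architecture, synthetic data, and naive handling of tied event times are illustrative assumptions, not the released DeepSurv implementation.

```python
# Toy sketch of a Cox proportional hazards neural network loss.
import torch
import torch.nn as nn

def cox_ph_loss(risk, time, event):
    """Negative Cox partial log-likelihood (ties handled naively)."""
    order = torch.argsort(time, descending=True)         # risk set = everyone sorted above
    risk, event = risk[order], event[order]
    log_cum_hazard = torch.logcumsumexp(risk, dim=0)      # log sum_{t_j >= t_i} exp(r_j)
    return -((risk - log_cum_hazard) * event).sum() / event.sum().clamp(min=1)

torch.manual_seed(0)
X = torch.randn(256, 8)                                   # covariates
time = torch.rand(256)                                    # follow-up times
event = (torch.rand(256) < 0.7).float()                   # 1 = event observed, 0 = censored

net = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(200):
    opt.zero_grad()
    loss = cox_ph_loss(net(X).squeeze(-1), time, event)
    loss.backward()
    opt.step()
print("final negative partial log-likelihood:", float(loss))
```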

Bigeometric Organization of Deep Nets

no code implementations • 1 Jul 2015 • Alexander Cloninger, Ronald R. Coifman, Nicholas Downing, Harlan M. Krumholz

In this paper, we build an organization of high-dimensional datasets that cannot be cleanly embedded into a low-dimensional representation due to missing entries and a subset of the features being irrelevant to modeling functions of interest.

Diffusion Nets

no code implementations • 25 Jun 2015 • Gal Mishne, Uri Shaham, Alexander Cloninger, Israel Cohen

In this paper, we propose a manifold learning algorithm based on deep learning to create an encoder, which maps a high-dimensional dataset to its low-dimensional embedding, and a decoder, which takes the embedded data back to the high-dimensional space.

Outlier Detection
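
A minimal sketch in the spirit of Diffusion Nets: compute a diffusion-maps-style embedding, then fit an encoder onto the embedding and a decoder back to the ambient space. For brevity the two networks are fit separately with scikit-learn, whereas the paper trains an autoencoder jointly with an embedding constraint; the data and hyperparameters below are illustrative.

```python
# Toy sketch: diffusion-maps-style embedding plus separately fit encoder/decoder networks.
import numpy as np
from scipy.spatial.distance import cdist
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 600)
X = np.c_[np.cos(theta), np.sin(theta), 0.1 * rng.normal(size=600)]  # noisy circle in 3D

# Gaussian kernel, symmetric normalization, top non-trivial eigenvectors.
D2 = cdist(X, X, 'sqeuclidean')
eps = np.median(D2)
K = np.exp(-D2 / eps)
d = K.sum(axis=1)
S = K / np.sqrt(np.outer(d, d))
vals, vecs = np.linalg.eigh(S)
psi = vecs[:, -3:-1] / np.sqrt(d)[:, None]            # skip the trivial top eigenvector
embedding = psi * vals[-3:-1]
embedding = (embedding - embedding.mean(0)) / embedding.std(0)  # standardize for easier fitting

encoder = MLPRegressor(hidden_layer_sizes=(64,), max_iter=2000, random_state=0).fit(X, embedding)
decoder = MLPRegressor(hidden_layer_sizes=(64,), max_iter=2000, random_state=0).fit(embedding, X)

recon = decoder.predict(encoder.predict(X))
print("mean reconstruction error:", np.mean(np.sum((X - recon) ** 2, axis=1)))
```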
