no code implementations • 22 Aug 2022 • Stefan C. Schonsheck, Scott Mahan, Timo Klock, Alexander Cloninger, Rongjie Lai

Our numerical experiments on synthetic and real-world data verify that the proposed model can effectively handle data lying on nearby but disjoint manifolds of different classes, on overlapping manifolds, and on manifolds with non-trivial topology.

no code implementations • 25 Jan 2022 • Varun Khurana, Harish Kannan, Alexander Cloninger, Caroline Moosmüller

In this paper we study supervised learning tasks on the space of probability measures.

no code implementations • 29 Sep 2021 • Srinjoy Das, Hrushikesh Mhaskar, Alexander Cloninger

Applications are demonstrated for clustering of synthetic and real-life time series and image data, and the performance of kdiff is compared to competing distance measures for clustering.

no code implementations • 23 Aug 2021 • Andreas Oslandsbotn, Zeljko Kereta, Valeriya Naumova, Yoav Freund, Alexander Cloninger

With a novel sub-sampling scheme, StreaMRAK reduces memory and computational complexities by creating a sketch of the original data, where the sub-sampling density is adapted to the bandwidth of the kernel and the local dimensionality of the data.

no code implementations • 4 Jun 2021 • Jinjie Zhang, Harish Kannan, Alexander Cloninger, Rayan Saab

We propose the use of low bit-depth Sigma-Delta and distributed noise-shaping methods for quantizing the Random Fourier features (RFFs) associated with shift-invariant kernels.
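As background for the entry above, a minimal sketch of the (unquantized) random Fourier feature approximation to a shift-invariant Gaussian kernel, in the style of Rahimi and Recht; the paper's Sigma-Delta and noise-shaping quantizers are not reproduced here, and the dimensions, bandwidth `gamma`, and function name `rff` are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
d, D = 3, 4096            # input dimension, number of random features
gamma = 0.5               # RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)

# Random Fourier features for the Gaussian kernel:
# z(x) = sqrt(2/D) * cos(W x + b), with W ~ N(0, 2*gamma*I), b ~ U[0, 2*pi)
W = rng.normal(scale=np.sqrt(2 * gamma), size=(D, d))
b = rng.uniform(0, 2 * np.pi, size=D)

def rff(x):
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

x = rng.normal(size=d)
y = rng.normal(size=d)

exact = np.exp(-gamma * np.sum((x - y) ** 2))   # true kernel value
approx = rff(x) @ rff(y)                        # inner product of features
print(abs(exact - approx))                      # error shrinks as O(1/sqrt(D))
```

Quantizing the feature vector `rff(x)` to a few bits per entry, as the paper proposes, reduces storage while keeping the kernel approximation usable.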

1 code implementation • ICLR Workshop GTRL 2021 • Dhruv Kohli, Alexander Cloninger, Gal Mishne

We present Low Distortion Local Eigenmaps (LDLE), a manifold learning technique which constructs a set of low distortion local views of a dataset in lower dimension and registers them to obtain a global embedding.

1 code implementation • 18 Sep 2020 • Alexander Cloninger, Haotian Li, Naoki Saito

We introduce a set of novel multiscale basis transforms for signals on graphs that utilize their "dual" domains by incorporating the "natural" distances between graph Laplacian eigenvectors, rather than simply using the eigenvalue ordering.

no code implementations • 20 Aug 2020 • Caroline Moosmüller, Alexander Cloninger

The transform is defined by computing the optimal transport of each distribution to a fixed reference distribution, and has a number of benefits when it comes to speed of computation and to determining classification boundaries.
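A one-dimensional sketch of the transport-transform idea described above: with a uniform reference on $[0,1]$, the optimal transport map to a distribution is its quantile function, so the embedding can be read off from empirical quantiles. The function name `embed` and the sample sizes are illustrative, and the general multi-dimensional construction in the paper is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)

# With a uniform reference on [0, 1], the optimal transport map to a
# distribution mu is its quantile function, evaluated on a fixed grid
# of reference quantile levels.
def embed(samples, n_levels=200):
    qs = (np.arange(n_levels) + 0.5) / n_levels
    return np.quantile(samples, qs)

mu = rng.normal(0.0, 1.0, size=5000)    # class A: N(0, 1)
nu = rng.normal(3.0, 1.0, size=5000)    # class B: N(3, 1)

e_mu, e_nu = embed(mu), embed(nu)

# In 1-D the Euclidean distance between embeddings approximates the
# 2-Wasserstein distance; here W2(N(0,1), N(3,1)) = 3 (pure mean shift).
w2 = np.sqrt(np.mean((e_mu - e_nu) ** 2))
print(w2)
```

Because the transform linearizes Wasserstein geometry, ordinary linear classifiers applied to the embeddings can separate classes of distributions, which is the speed and classification benefit the abstract refers to.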

no code implementations • 6 Aug 2020 • Alexander Cloninger, Timo Klock

We study the approximation of two-layer compositions $f(x) = g(\phi(x))$ via deep networks with ReLU activation, where $\phi$ is a geometrically intuitive, dimensionality reducing feature map.

no code implementations • 3 Aug 2020 • Alexander Cloninger, Hrushikesh Mhaskar

Our approach is to consider the unknown probability measure as a convex combination of the conditional probabilities for each class.

no code implementations • 28 Oct 2019 • Alexander Potapov, Ian Colbert, Ken Kreutz-Delgado, Alexander Cloninger, Srinjoy Das

Stochastic-sampling-based Generative Neural Networks, such as Restricted Boltzmann Machines and Generative Adversarial Networks, are now used for applications such as denoising, image occlusion removal, pattern completion, and motion synthesis.

no code implementations • 25 Sep 2019 • Xiuyuan Cheng, Alexander Cloninger

The recent success of generative adversarial networks and variational learning suggests training a classifier network may work well in addressing the classical two-sample problem.

no code implementations • 3 Jun 2019 • Saeed Vahidian, Baharan Mirzasoleiman, Alexander Cloninger

In a number of situations, collecting a function value for every data point may be prohibitively expensive, and random sampling ignores any structure in the underlying data.

1 code implementation • ECCV 2020 • Henry Li, Ofir Lindenbaum, Xiuyuan Cheng, Alexander Cloninger

Variational autoencoders (VAEs) and generative adversarial networks (GANs) enjoy an intuitive connection to manifold learning: in training the decoder/generator is optimized to approximate a homeomorphism between the data distribution and the sampling space.

no code implementations • 11 Dec 2018 • Alexander Cloninger

In this paper, we bound the error induced by using a weighted skeletonization of two data sets for computing a two sample test with kernel maximum mean discrepancy.

no code implementations • 25 Apr 2018 • Alexander Cloninger, Stefan Steinerberger

We discuss the geometry of Laplacian eigenfunctions $-\Delta \phi = \lambda \phi$ on compact manifolds $(M, g)$ and combinatorial graphs $G=(V, E)$.

1 code implementation • 14 Sep 2017 • Xiuyuan Cheng, Alexander Cloninger, Ronald R. Coifman

The paper introduces a new kernel-based Maximum Mean Discrepancy (MMD) statistic for measuring the distance between two distributions given finitely many multivariate samples.
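For context, a minimal sketch of the standard unbiased kernel MMD statistic with a Gaussian kernel; the paper's specific modified statistic is not reproduced, and the bandwidth and sample sizes are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(2)

def mmd2_unbiased(X, Y, sigma=1.0):
    """Unbiased estimate of squared MMD between samples X and Y,
    using a Gaussian kernel with bandwidth sigma."""
    def k(A, B):
        d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
        return np.exp(-d2 / (2 * sigma**2))
    n, m = len(X), len(Y)
    Kxx, Kyy, Kxy = k(X, X), k(Y, Y), k(X, Y)
    # Diagonal terms are excluded to make the estimator unbiased.
    term_x = (Kxx.sum() - np.trace(Kxx)) / (n * (n - 1))
    term_y = (Kyy.sum() - np.trace(Kyy)) / (m * (m - 1))
    return term_x + term_y - 2 * Kxy.mean()

# Same distribution: statistic near zero. Shifted distribution: clearly positive.
same = mmd2_unbiased(rng.normal(size=(500, 2)), rng.normal(size=(500, 2)))
diff = mmd2_unbiased(rng.normal(size=(500, 2)),
                     rng.normal(loc=1.0, size=(500, 2)))
print(same, diff)
```

A two-sample test rejects the null hypothesis of equal distributions when the statistic exceeds a threshold calibrated, e.g., by permutation.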

no code implementations • 3 Jul 2017 • Alexander Cloninger, Brita Roy, Carley Riley, Harlan M. Krumholz

We address the problem of defining a network graph on a large collection of classes.

no code implementations • 31 Oct 2016 • Alexander Cloninger

We consider the problem of constructing diffusion operators on high-dimensional data $X$ to address counterfactual functions $F$, such as individualized treatment effectiveness.

no code implementations • 15 Jul 2016 • Alexander Cloninger, Stefan Steinerberger

Spectral embedding uses eigenfunctions of the discrete Laplacian on a weighted graph to obtain coordinates for an embedding of an abstract data set into Euclidean space.
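The sentence above can be sketched in a few lines: build a weighted graph from the data, form its Laplacian, and use low-frequency eigenvectors as coordinates. The affinity, cluster centers, and variable names below are illustrative choices, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(3)

# Two well-separated clusters in the plane.
X = np.vstack([rng.normal(0.0, 0.3, size=(30, 2)),
               rng.normal(2.0, 0.3, size=(30, 2))])

# Weighted graph from a Gaussian affinity, then the unnormalized Laplacian.
d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
W = np.exp(-d2)
np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(1)) - W

# Eigenvectors for the smallest eigenvalues give the embedding coordinates;
# the second eigenvector (the Fiedler vector) separates the two clusters.
vals, vecs = np.linalg.eigh(L)
fiedler = vecs[:, 1]
labels = (fiedler > 0).astype(int)
print(labels)
```

Thresholding the Fiedler vector at zero recovers the cluster structure, which is the basic mechanism behind spectral clustering and Laplacian eigenmaps.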

4 code implementations • 2 Jun 2016 • Jared Katzman, Uri Shaham, Jonathan Bates, Alexander Cloninger, Tingting Jiang, Yuval Kluger

We introduce DeepSurv, a Cox proportional hazards deep neural network and state-of-the-art survival method for modeling interactions between a patient's covariates and treatment effectiveness in order to provide personalized treatment recommendations.
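A minimal numpy sketch of the training objective behind such models: the negative log Cox partial likelihood, evaluated on risk scores from any model (here a linear score stands in for the neural network). The coefficient `beta`, the no-censoring assumption, and the no-ties assumption are simplifications for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)

def cox_neg_log_partial_likelihood(risk, time, event):
    """Negative log Cox partial likelihood for risk scores h(x),
    assuming no tied event times."""
    order = np.argsort(-time)                 # sort by descending survival time
    risk, event = risk[order], event[order]
    # Running log-sum-exp gives log sum_{j: t_j >= t_i} exp(h_j) for each i.
    log_risk_set = np.logaddexp.accumulate(risk)
    # Sum the per-event contributions h_i - log(risk-set sum) over observed events.
    return -np.sum((risk - log_risk_set)[event == 1])

n = 100
x = rng.normal(size=(n, 2))
beta = np.array([1.0, -0.5])                  # hypothetical true effect
time = rng.exponential(1.0 / np.exp(x @ beta))  # hazard grows with x @ beta
event = np.ones(n)                            # toy example: no censoring

good = cox_neg_log_partial_likelihood(x @ beta, time, event)
bad = cox_neg_log_partial_likelihood(np.zeros(n), time, event)
print(good, bad)  # informative risk scores achieve the lower loss
```

DeepSurv minimizes this same loss, but with the risk score produced by a deep network rather than a fixed linear model.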

no code implementations • 24 Sep 2015 • Uri Shaham, Alexander Cloninger, Ronald R. Coifman

We discuss approximation of functions using deep neural nets.

no code implementations • 1 Jul 2015 • Alexander Cloninger, Ronald R. Coifman, Nicholas Downing, Harlan M. Krumholz

In this paper, we build an organization of high-dimensional datasets that cannot be cleanly embedded into a low-dimensional representation due to missing entries and a subset of the features being irrelevant to modeling functions of interest.

no code implementations • 25 Jun 2015 • Gal Mishne, Uri Shaham, Alexander Cloninger, Israel Cohen

In this paper, we propose a manifold learning algorithm based on deep learning to create an encoder, which maps a high-dimensional dataset to its low-dimensional embedding, and a decoder, which takes the embedded data back to the high-dimensional space.

Papers With Code is a free resource with all data licensed under CC-BY-SA.