Search Results for author: Ery Arias-Castro

Found 11 papers, 2 papers with code

Embedding Functional Data: Multidimensional Scaling and Manifold Learning

no code implementations30 Aug 2022 Ery Arias-Castro, Wanli Qiao

We adapt concepts, methodology, and theory originally developed in the areas of multidimensional scaling and dimensionality reduction for multivariate data to the functional setting.

Dimensionality Reduction

On the Selection of Tuning Parameters for Patch-Stitching Embedding Methods

no code implementations14 Jul 2022 Ery Arias-Castro, Phong Alain Chau

While classical scaling, just like principal component analysis, is parameter-free, other methods for embedding multivariate data require the selection of one or several tuning parameters.

Dimensionality Reduction
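The abstract above notes that classical scaling, like PCA, is parameter-free apart from the target dimension. As a hedged illustration (a minimal NumPy sketch of classical/Torgerson scaling, not code from the paper), the whole procedure is double-centering the squared-distance matrix and taking a top-k eigendecomposition — no bandwidth, neighborhood size, or other tuning parameter appears:

```python
import numpy as np

def classical_scaling(D, k):
    """Classical (Torgerson) scaling: embed n points in R^k from an
    n-by-n matrix D of pairwise Euclidean distances. The only input
    besides the data is the target dimension k."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
    w, V = np.linalg.eigh(B)                 # eigenpairs in ascending order
    idx = np.argsort(w)[::-1][:k]            # keep the top-k eigenpairs
    scale = np.sqrt(np.maximum(w[idx], 0.0)) # clip tiny negative eigenvalues
    return V[:, idx] * scale                 # embedding coordinates

# Usage: exact Euclidean distances are recovered up to a rigid motion.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))
D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
Y = classical_scaling(D, 2)
D_hat = np.linalg.norm(Y[:, None] - Y[None, :], axis=-1)
```

With exact Euclidean input distances the reconstructed distances match the originals, which is why no parameter selection is needed; the tuning question the paper studies arises for methods (e.g. patch-stitching) that build local embeddings first.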

Clustering by Hill-Climbing: Consistency Results

no code implementations18 Feb 2022 Ery Arias-Castro, Wanli Qiao

We consider several hill-climbing approaches to clustering as formulated by Fukunaga and Hostetler in the 1970s.

Clustering

An Asymptotic Equivalence between the Mean-Shift Algorithm and the Cluster Tree

no code implementations19 Nov 2021 Ery Arias-Castro, Wanli Qiao

Two important nonparametric approaches to clustering emerged in the 1970s: clustering by level sets or cluster tree as proposed by Hartigan, and clustering by gradient lines or gradient flow as proposed by Fukunaga and Hostetler.

Clustering
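The gradient-flow approach in the abstract above is what the mean-shift algorithm implements in practice: each point iteratively moves to a kernel-weighted average of the sample, ascending the kernel density estimate toward a mode. A minimal sketch (assumed illustration with a Gaussian kernel, not the paper's construction):

```python
import numpy as np

def mean_shift(X, h, n_iter=100):
    """Run one mean-shift trajectory per data point: repeatedly replace
    each trajectory point by the Gaussian-kernel weighted mean of the
    sample X (bandwidth h). Trajectories ascend the kernel density
    estimate and collect at its modes."""
    Y = X.copy()
    for _ in range(n_iter):
        for i in range(len(Y)):
            w = np.exp(-np.sum((X - Y[i]) ** 2, axis=1) / (2 * h ** 2))
            Y[i] = w @ X / w.sum()           # weighted mean update
    return Y

# Usage: two well-separated blobs; each point drifts to its blob's mode,
# so thresholding the limit points recovers the two clusters.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (30, 2)),
               rng.normal(5.0, 0.1, (30, 2))])
modes = mean_shift(X, h=0.5)
labels = (modes[:, 0] > 2.5).astype(int)
```

Grouping points by the mode they converge to is the clustering the gradient flow induces; the paper relates these groups to the leaves of Hartigan's cluster tree.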

Moving Up the Cluster Tree with the Gradient Flow

no code implementations17 Sep 2021 Ery Arias-Castro, Wanli Qiao

The paper establishes a strong correspondence between two important clustering approaches that emerged in the 1970s: clustering by level sets or cluster tree as proposed by Hartigan and clustering by gradient lines or gradient flow as proposed by Fukunaga and Hostetler.

Clustering

Perturbation Bounds for Procrustes, Classical Scaling, and Trilateration, with Applications to Manifold Learning

no code implementations22 Oct 2018 Ery Arias-Castro, Adel Javanmard, Bruno Pelletier

One of the common tasks in unsupervised learning is dimensionality reduction, where the goal is to find meaningful low-dimensional structures hidden in high-dimensional data.

Dimensionality Reduction

A Simple Approach to Sparse Clustering

1 code implementation23 Feb 2016 Ery Arias-Castro, Xiao Pu

Consider the problem of sparse clustering, where it is assumed that only a subset of the features are useful for clustering purposes.

Clustering

A Nonparametric Framework for Quantifying Generative Inference on Neuromorphic Systems

no code implementations18 Feb 2016 Ojash Neopane, Srinjoy Das, Ery Arias-Castro, Kenneth Kreutz-Delgado

Restricted Boltzmann Machines and Deep Belief Networks have been successfully used in probabilistic generative model applications such as image occlusion removal, pattern completion and motion synthesis.

Motion Synthesis

Community Detection in Sparse Random Networks

no code implementations13 Aug 2013 Ery Arias-Castro, Nicolas Verzelen

This is formalized as testing for the existence of a dense random subgraph in a random graph.

Community Detection

Does median filtering truly preserve edges better than linear filtering?

1 code implementation14 Dec 2006 Ery Arias-Castro, David L. Donoho

We show that median filtering and linear filtering have similar asymptotic worst-case mean-squared error (MSE) when the signal-to-noise ratio (SNR) is of order 1, which corresponds to the case of constant per-pixel noise level in a digital signal.

Statistics Theory 62G08, 62G20 (Primary) 60G35 (Secondary)
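The setting in the abstract above — constant per-pixel noise on a piecewise-constant signal — can be mimicked numerically. The sketch below (an assumed illustration, not the paper's analysis) compares the empirical MSE of a running median against a running mean on a noisy step edge; both are plain sliding-window filters:

```python
import numpy as np

def median_filter(y, w):
    """Running median over a window of half-width w (edges clipped)."""
    return np.array([np.median(y[max(i - w, 0): i + w + 1])
                     for i in range(len(y))])

def mean_filter(y, w):
    """Running mean (a linear filter) over the same window."""
    return np.array([np.mean(y[max(i - w, 0): i + w + 1])
                     for i in range(len(y))])

# Step edge with constant per-pixel Gaussian noise, SNR of order 1.
rng = np.random.default_rng(2)
signal = np.r_[np.zeros(100), np.ones(100)]
y = signal + rng.normal(0.0, 0.3, size=200)

mse_med = np.mean((median_filter(y, 5) - signal) ** 2)
mse_lin = np.mean((mean_filter(y, 5) - signal) ** 2)
```

In this regime both filters reduce the raw noise MSE substantially and by comparable amounts, which is consistent with the paper's worst-case asymptotic finding; the finite-sample numbers here are only illustrative.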
