Search Results for author: Chris Peterson

Found 14 papers, 2 papers with code

ReLU Neural Networks, Polyhedral Decompositions, and Persistent Homology

no code implementations • 30 Jun 2023 • Yajing Liu, Christina M Cole, Chris Peterson, Michael Kirby

A ReLU neural network leads to a finite polyhedral decomposition of input space and a corresponding finite dual graph.

Quantization

The Flag Median and FlagIRLS

1 code implementation • CVPR 2022 • Nathan Mankovich, Emily King, Chris Peterson, Michael Kirby

We provide evidence that the flag median is robust to outliers and can be used effectively in algorithms like Linde-Buzo-Grey (LBG) to produce improved clusterings on Grassmannians.

Locally Linear Attributes of ReLU Neural Networks

no code implementations • 30 Nov 2020 • Ben Sattelberg, Renzo Cavalieri, Michael Kirby, Chris Peterson, Ross Beveridge

The weights in the neural network determine a decomposition of the input space into convex polytopes and on each of these polytopes the network can be described by a single affine mapping.
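The polytope-wise affine structure described above can be checked directly on a toy network. The sketch below (random weights, illustrative names only) computes the hidden-layer activation pattern at a point, which is constant on the polytope containing that point, and recovers the single affine map the network realizes there:

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny one-hidden-layer ReLU network with random weights (illustration only).
W1 = rng.standard_normal((4, 2)); b1 = rng.standard_normal(4)
W2 = rng.standard_normal((1, 4)); b2 = rng.standard_normal(1)

def relu_net(x):
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

def activation_pattern(x):
    # Sign pattern of the hidden units; constant on each polytope
    # of the decomposition of input space.
    return tuple(W1 @ x + b1 > 0)

def local_affine_map(x):
    # On the polytope containing x, the network equals A @ x + c.
    D = np.diag(np.asarray(activation_pattern(x), dtype=float))
    A = W2 @ D @ W1
    c = W2 @ D @ b1 + b2
    return A, c

x = rng.standard_normal(2)
A, c = local_affine_map(x)
assert np.allclose(relu_net(x), A @ x + c)
```

Two inputs with the same activation pattern lie in the same polytope and share the same `(A, c)`, which is exactly the decomposition the papers above study.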

The flag manifold as a tool for analyzing and comparing data sets

no code implementations • 24 Jun 2020 • Xiaofeng Ma, Michael Kirby, Chris Peterson

Subspace methods, utilizing Grassmann manifolds, have been a great aid in dealing with such variability.

More chemical detection through less sampling: amplifying chemical signals in hyperspectral data cubes through compressive sensing

no code implementations • 27 Jun 2019 • Henry Kvinge, Elin Farnell, Julia R. Dupuis, Michael Kirby, Chris Peterson, Elizabeth C. Schundler

In this paper we explore a phenomenon in which bandwise CS sampling of a hyperspectral data cube followed by reconstruction can actually result in amplification of chemical signals contained in the cube.

Compressive Sensing

A data-driven approach to sampling matrix selection for compressive sensing

no code implementations • 20 Jun 2019 • Elin Farnell, Henry Kvinge, John P. Dixon, Julia R. Dupuis, Michael Kirby, Chris Peterson, Elizabeth C. Schundler, Christian W. Smith

We propose a method for defining an order for a sampling basis that is optimal with respect to capturing variance in data, thus allowing for meaningful sensing at any desired level of compression.
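One simple way to realize a variance-based ordering of a sampling basis (a stand-in sketch, not the paper's method; all names here are hypothetical) is to project training data onto a fixed orthonormal basis and sort the basis vectors by the variance of the projection coefficients, so that truncating to the first k vectors retains the most variance:

```python
import numpy as np

def variance_ordered_basis(X, B):
    """Reorder the columns of orthonormal basis B by the variance of the
    data's projection coefficients (largest first), so sensing with the
    first k columns captures the most variance at compression level k.
    X: (n_samples, d) training data; B: (d, d) orthonormal basis."""
    coeffs = (X - X.mean(axis=0)) @ B           # projection coefficients
    order = np.argsort(coeffs.var(axis=0))[::-1]
    return B[:, order]

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 8)) * np.arange(1, 9)   # anisotropic data
B = np.linalg.qr(rng.standard_normal((8, 8)))[0]      # a fixed orthonormal basis
Bo = variance_ordered_basis(X, B)
v = ((X - X.mean(axis=0)) @ Bo).var(axis=0)
assert np.all(np.diff(v) <= 1e-9)   # captured variance is non-increasing
```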

Compressive Sensing

Too many secants: a hierarchical approach to secant-based dimensionality reduction on large data sets

no code implementations • 5 Aug 2018 • Henry Kvinge, Elin Farnell, Michael Kirby, Chris Peterson

Intuitively, the SAP algorithm seeks to determine a projection which best preserves the lengths of all secants between points in a data set; by applying the algorithm to find the best projections to vector spaces of various dimensions, one may infer the dimension of the manifold of origination.
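The secant-preservation criterion can be sketched in a few lines. The helper below (an illustrative stand-in, not the SAP implementation) measures the worst-case shrinkage of unit secants under a candidate projection; values near 1 mean every secant length is well preserved:

```python
import numpy as np
from itertools import combinations

def secant_distortion(X, P):
    """Minimum length of projected unit secants of the data under the
    linear projection P (shape k x d). 1.0 means no secant is shortened."""
    secants = np.array([X[i] - X[j] for i, j in combinations(range(len(X)), 2)])
    secants /= np.linalg.norm(secants, axis=1, keepdims=True)
    return np.linalg.norm(secants @ P.T, axis=1).min()

# Data on a 1-d line in R^3: projecting onto the line preserves all
# secants; projecting onto an orthogonal direction collapses them.
t = np.linspace(0.0, 1.0, 10)[:, None]
X = t * np.array([1.0, 0.0, 0.0])
P_good = np.array([[1.0, 0.0, 0.0]])
P_bad = np.array([[0.0, 1.0, 0.0]])
assert np.isclose(secant_distortion(X, P_good), 1.0)
assert np.isclose(secant_distortion(X, P_bad), 0.0)
```

Sweeping the target dimension and watching where this distortion stops improving is the intuition behind using secants to infer the dimension of the underlying manifold.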

Dimensionality Reduction

A GPU-Oriented Algorithm Design for Secant-Based Dimensionality Reduction

no code implementations • 10 Jul 2018 • Henry Kvinge, Elin Farnell, Michael Kirby, Chris Peterson

Dimensionality-reduction techniques are a fundamental tool for extracting useful information from high-dimensional data sets.

Dimensionality Reduction

Endmember Extraction on the Grassmannian

no code implementations • 3 Jul 2018 • Elin Farnell, Henry Kvinge, Michael Kirby, Chris Peterson

Endmember extraction plays a prominent role in a variety of data analysis problems as endmembers often correspond to data representing the purest or best representative of some feature.

Persistent Homology on Grassmann Manifolds for Analysis of Hyperspectral Movies

no code implementations • 7 Jul 2016 • Sofya Chepushtanova, Michael Kirby, Chris Peterson, Lori Ziegelmeier

This realization has motivated the development of new tools such as persistent homology for exploring topological invariants, or features, in large data sets.

Topological Data Analysis

Persistence Images: A Stable Vector Representation of Persistent Homology

4 code implementations • 22 Jul 2015 • Henry Adams, Sofya Chepushtanova, Tegan Emerson, Eric Hanson, Michael Kirby, Francis Motta, Rachel Neville, Chris Peterson, Patrick Shipman, Lori Ziegelmeier

We convert a PD to a finite-dimensional vector representation which we call a persistence image (PI), and prove the stability of this transformation with respect to small perturbations in the inputs.
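A toy version of the PD-to-PI pipeline (a minimal sketch of the general idea, not the authors' code; the resolution, bandwidth, and weighting below are illustrative choices) maps each (birth, death) pair to (birth, persistence) coordinates, spreads it by a Gaussian, weights it by persistence, and sums on a fixed grid to get a fixed-length vector:

```python
import numpy as np

def persistence_image(diagram, res=10, sigma=0.1, extent=(0.0, 1.0)):
    """Toy persistence image: (birth, death) pairs -> (birth, persistence),
    each spread by a Gaussian, weighted linearly by persistence, and summed
    on a res x res grid. Returns a fixed-length vector for ML pipelines."""
    lo, hi = extent
    grid = np.linspace(lo, hi, res)
    gx, gy = np.meshgrid(grid, grid)
    img = np.zeros((res, res))
    for b, d in diagram:
        p = d - b                      # persistence of the feature
        img += p * np.exp(-((gx - b) ** 2 + (gy - p) ** 2) / (2 * sigma ** 2))
    return img.ravel()

pd_pairs = [(0.1, 0.6), (0.3, 0.4)]    # two (birth, death) pairs
vec = persistence_image(pd_pairs)
assert vec.shape == (100,)
```

Because the output is an ordinary vector of fixed length, it can be fed directly to standard classifiers, which is what makes this representation useful for machine learning tasks.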

BIG-bench Machine Learning • Graph Classification +1

Finding the Subspace Mean or Median to Fit Your Need

no code implementations CVPR 2014 Tim Marrinan, J. Ross Beveridge, Bruce Draper, Michael Kirby, Chris Peterson

The extrinsic manifold mean, the L2-median, and the flag mean are alternative averages that can be substituted directly for the Karcher mean in many applications.
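Of these averages, the flag mean has a particularly compact description: it can be computed from the left singular vectors of the concatenated orthonormal bases of the subspaces. The sketch below assumes that construction (with illustrative data) and checks that the mean of several nearby lines in R^3 recovers their common direction:

```python
import numpy as np

def flag_mean(subspaces, k):
    """Flag mean of a collection of subspaces of R^n, sketched as the first
    k left singular vectors of the horizontally stacked orthonormal bases.
    Each element of `subspaces` is an (n, d_i) matrix with orthonormal columns."""
    U, _, _ = np.linalg.svd(np.hstack(subspaces), full_matrices=False)
    return U[:, :k]

rng = np.random.default_rng(2)
e1 = np.array([1.0, 0.0, 0.0])
spans = []
for _ in range(3):
    v = e1 + 0.05 * rng.standard_normal(3)       # a line near e1
    spans.append((v / np.linalg.norm(v))[:, None])

m = flag_mean(spans, 1)
assert abs(m[:, 0] @ e1) > 0.95   # mean direction is close to e1
```

Unlike the Karcher mean, this requires no iterative optimization, which is part of why these alternative averages can be substituted into applications cheaply.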
