no code implementations • 30 Jun 2023 • Yajing Liu, Christina M Cole, Chris Peterson, Michael Kirby
A ReLU neural network leads to a finite polyhedral decomposition of input space and a corresponding finite dual graph.
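As a hedged illustration of this decomposition (illustrative weights, hypothetical helper `activation_pattern`; not the paper's code), the sketch below indexes the polyhedral cell containing a point by the network's hidden activation pattern:

```python
import numpy as np

# Minimal sketch: the binary activation pattern of a ReLU network at a point x
# indexes the polyhedral cell of the input-space decomposition containing x.
# Weights W1, b1 are illustrative, not from the paper.
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((5, 2)), rng.standard_normal(5)

def activation_pattern(x):
    """Sign pattern of the hidden pre-activations: constant on each polytope."""
    return tuple((W1 @ x + b1 > 0).astype(int))

# Two nearby points that share a pattern lie in the same polyhedral cell.
x, y = np.array([0.1, 0.2]), np.array([0.11, 0.21])
print(activation_pattern(x), activation_pattern(y))
```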
1 code implementation • CVPR 2022 • Nathan Mankovich, Emily King, Chris Peterson, Michael Kirby
We provide evidence that the flag median is robust to outliers and can be used effectively in algorithms like Linde-Buzo-Gray (LBG) to produce improved clusterings on Grassmannians.
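A rough sketch of one LBG-style update on a Grassmannian, assuming chordal distance via principal angles and a flag-mean prototype update (the paper's flag median is a robust variant of this mean; everything here is illustrative, not the paper's implementation):

```python
import numpy as np
from scipy.linalg import subspace_angles

# Illustrative sketch of one LBG-style update for clustering 2-dim
# subspaces of R^6: assign each subspace to its nearest prototype in
# chordal distance, then recompute each prototype as a flag mean.
def chordal_dist(U, V):
    return np.linalg.norm(np.sin(subspace_angles(U, V)))

def flag_mean(bases, k):
    # Flag mean: top-k left singular vectors of the concatenated bases.
    Q, _, _ = np.linalg.svd(np.hstack(bases), full_matrices=False)
    return Q[:, :k]

rng = np.random.default_rng(1)
pts = [np.linalg.qr(rng.standard_normal((6, 2)))[0] for _ in range(8)]
protos = pts[:2]                                  # initial prototypes
labels = [min(range(2), key=lambda j: chordal_dist(U, protos[j])) for U in pts]
protos = [flag_mean([U for U, l in zip(pts, labels) if l == j], 2)
          for j in range(2)]
```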
no code implementations • 21 Oct 2021 • Ehsan K. Ardestani, Changkyu Kim, Seung Jae Lee, Luoshang Pan, Valmiki Rampersad, Jens Axboe, Banit Agrawal, Fuxun Yu, Ansha Yu, Trung Le, Hector Yuen, Shishir Juluri, Akshat Nanda, Manoj Wodekar, Dheevatsa Mudigere, Krishnakumar Nair, Maxim Naumov, Chris Peterson, Mikhail Smelyanskiy, Vijay Rao
Deep Learning Recommendation Models (DLRM) are widespread, account for a considerable data center footprint, and grow by more than 1.5x per year.
no code implementations • 30 Nov 2020 • Ben Sattelberg, Renzo Cavalieri, Michael Kirby, Chris Peterson, Ross Beveridge
The weights in the neural network determine a decomposition of the input space into convex polytopes, and on each of these polytopes the network can be described by a single affine mapping.
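A minimal sketch of this observation for a one-hidden-layer network with illustrative weights: fixing the activation pattern at a point collapses the network to a single affine map on that point's polytope.

```python
import numpy as np

# Sketch: on the polytope where the hidden activation pattern is fixed,
# a one-hidden-layer ReLU net x -> W2 relu(W1 x + b1) + b2 collapses to
# a single affine map x -> A x + c. Weights here are illustrative.
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((5, 2)), rng.standard_normal(5)
W2, b2 = rng.standard_normal((3, 5)), rng.standard_normal(3)

x = np.array([0.3, -0.7])
D = np.diag((W1 @ x + b1 > 0).astype(float))  # active-unit mask at x
A, c = W2 @ D @ W1, W2 @ D @ b1 + b2          # affine map on x's polytope

relu = lambda z: np.maximum(z, 0.0)
assert np.allclose(W2 @ relu(W1 @ x + b1) + b2, A @ x + c)
```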
no code implementations • 24 Jun 2020 • Xiaofeng Ma, Michael Kirby, Chris Peterson
Subspace methods, utilizing Grassmann manifolds, have been a great aid in dealing with such variability.
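For context, a small sketch of the Grassmannian geometry involved, using SciPy's `subspace_angles` to compute principal angles and the standard chordal and geodesic distances built from them (data illustrative):

```python
import numpy as np
from scipy.linalg import subspace_angles

# Sketch: Grassmannian geometry enters through principal angles between
# subspaces; common distances are built directly from these angles.
rng = np.random.default_rng(0)
U = np.linalg.qr(rng.standard_normal((10, 3)))[0]  # a point on Gr(3, 10)
V = np.linalg.qr(rng.standard_normal((10, 3)))[0]

theta = subspace_angles(U, V)                 # principal angles
chordal = np.linalg.norm(np.sin(theta))       # chordal distance
geodesic = np.linalg.norm(theta)              # arc-length (geodesic) distance
```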
no code implementations • 27 Jun 2019 • Henry Kvinge, Elin Farnell, Julia R. Dupuis, Michael Kirby, Chris Peterson, Elizabeth C. Schundler
In this paper we explore a phenomenon in which bandwise CS sampling of a hyperspectral data cube followed by reconstruction can actually result in amplification of chemical signals contained in the cube.
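A toy sketch of compressive sampling of a single sparse band, assuming scikit-learn's `OrthogonalMatchingPursuit` for reconstruction; dimensions and sparsity are illustrative, and the amplification phenomenon itself is not reproduced here:

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

# Sketch of compressive sampling of one spectral band: a sparse signal s
# is measured through a random matrix Phi and recovered with OMP.
rng = np.random.default_rng(0)
n, m, k = 256, 64, 5
s = np.zeros(n)
s[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # measurement matrix
y = Phi @ s                                      # compressed measurements

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False)
s_hat = omp.fit(Phi, y).coef_                    # reconstructed band
```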
no code implementations • 20 Jun 2019 • Elin Farnell, Henry Kvinge, John P. Dixon, Julia R. Dupuis, Michael Kirby, Chris Peterson, Elizabeth C. Schundler, Christian W. Smith
We propose a method for defining an order for a sampling basis that is optimal with respect to capturing variance in data, thus allowing for meaningful sensing at any desired level of compression.
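A minimal sketch of one variance-based ordering, assuming a PCA-style score on training data; the basis and data are illustrative rather than the paper's construction:

```python
import numpy as np

# Sketch: order an orthonormal sampling basis by the variance each vector
# captures in training data, so truncating at any depth keeps the most
# informative measurements first.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 32)) @ np.diag(np.linspace(3, 0.1, 32))
B = np.linalg.qr(rng.standard_normal((32, 32)))[0]   # candidate basis

scores = np.var(X @ B, axis=0)        # variance captured per basis vector
order = np.argsort(scores)[::-1]      # sense in this order, truncate anywhere
B_ordered = B[:, order]
```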
no code implementations • 27 Oct 2018 • Henry Kvinge, Elin Farnell, Michael Kirby, Chris Peterson
In this paper, we propose a new statistic that we call the $\kappa$-profile for analysis of large data sets.
no code implementations • 5 Aug 2018 • Henry Kvinge, Elin Farnell, Michael Kirby, Chris Peterson
Intuitively, the SAP algorithm seeks to determine a projection which best preserves the lengths of all secants between points in a data set; by applying the algorithm to find the best projections to vector spaces of various dimensions, one may infer the dimension of the manifold of origination.
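A rough sketch of how a candidate projection can be scored on secant preservation (brute-force scoring only; the SAP search itself is not implemented here, and all data are illustrative):

```python
import numpy as np
from itertools import combinations

# Sketch: score a projection P (orthonormal columns) by the worst-case
# shrinkage of unit secants between data points; a secant-preserving
# projection keeps this score close to 1.
def worst_secant_norm(X, P):
    secants = np.array([X[i] - X[j] for i, j in combinations(range(len(X)), 2)])
    secants /= np.linalg.norm(secants, axis=1, keepdims=True)  # unit secants
    return np.min(np.linalg.norm(secants @ P, axis=1))

rng = np.random.default_rng(0)
X = rng.standard_normal((30, 10))
P = np.linalg.qr(rng.standard_normal((10, 3)))[0]   # random 3-dim projection
print(worst_secant_norm(X, P))
```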
no code implementations • 10 Jul 2018 • Henry Kvinge, Elin Farnell, Michael Kirby, Chris Peterson
Dimensionality-reduction techniques are a fundamental tool for extracting useful information from high-dimensional data sets.
no code implementations • 3 Jul 2018 • Elin Farnell, Henry Kvinge, Michael Kirby, Chris Peterson
Endmember extraction plays a prominent role in a variety of data analysis problems, as endmembers often correspond to the purest or most representative examples of some feature in the data.
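A hedged sketch of a greedy endmember selection in the spirit of successive projections (not the paper's method; names and data are illustrative):

```python
import numpy as np

# Sketch: repeatedly pick the data point with the largest component
# orthogonal to the span of the endmembers chosen so far.
def greedy_endmembers(X, k):
    R = X.astype(float).copy()
    idx = []
    for _ in range(k):
        i = int(np.argmax(np.linalg.norm(R, axis=1)))
        idx.append(i)
        u = R[i] / np.linalg.norm(R[i])
        R = R - np.outer(R @ u, u)      # project out the chosen direction
    return idx

rng = np.random.default_rng(0)
X = rng.random((200, 8))                # e.g. pixel spectra over 8 bands
print(greedy_endmembers(X, 3))          # indices of candidate endmembers
```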
no code implementations • 7 Jul 2016 • Sofya Chepushtanova, Michael Kirby, Chris Peterson, Lori Ziegelmeier
This realization has motivated the development of new tools such as persistent homology for exploring topological invariants, or features, in large data sets.
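A small sketch computing persistence diagrams of a noisy circle, assuming the `ripser` package is installed; the long-lived point in the H1 diagram records the circle's loop:

```python
import numpy as np
from ripser import ripser  # assumes the `ripser` package is installed

# Sketch: persistence diagrams of a noisy circle. Data are illustrative.
rng = np.random.default_rng(0)
t = rng.uniform(0, 2 * np.pi, 100)
X = np.c_[np.cos(t), np.sin(t)] + 0.05 * rng.standard_normal((100, 2))

dgms = ripser(X)['dgms']     # dgms[0]: H0 diagram, dgms[1]: H1 diagram
print(dgms[1])               # expect one point with large death - birth
```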
4 code implementations • 22 Jul 2015 • Henry Adams, Sofya Chepushtanova, Tegan Emerson, Eric Hanson, Michael Kirby, Francis Motta, Rachel Neville, Chris Peterson, Patrick Shipman, Lori Ziegelmeier
We convert a PD to a finite-dimensional vector representation which we call a persistence image (PI), and prove the stability of this transformation with respect to small perturbations in the inputs.
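A minimal sketch of the persistence-image construction under a linear persistence weighting (grid size and kernel width are illustrative choices):

```python
import numpy as np

# Sketch: map each diagram point (birth, death) to (birth, persistence),
# drop a Gaussian on each, weight by persistence, sample on a grid.
def persistence_image(dgm, res=20, sigma=0.1):
    pts = np.array([(b, d - b) for b, d in dgm if np.isfinite(d)])
    xs = np.linspace(pts.min(), pts.max(), res)
    gx, gy = np.meshgrid(xs, xs)
    img = np.zeros((res, res))
    for b, p in pts:
        img += p * np.exp(-((gx - b) ** 2 + (gy - p) ** 2) / (2 * sigma ** 2))
    return img.ravel()       # the PI: a fixed-length vector for ML pipelines

dgm = [(0.1, 0.5), (0.2, 0.9), (0.0, np.inf)]
vec = persistence_image(dgm)
```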
Ranked #4 on Graph Classification on NEURON-BINARY
no code implementations • CVPR 2014 • Tim Marrinan, J. Ross Beveridge, Bruce Draper, Michael Kirby, Chris Peterson
The extrinsic manifold mean, the L2-median, and the flag mean are alternative averages that can be substituted directly for the Karcher mean in many applications.
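A short sketch of the flag mean, which for subspaces with orthonormal bases $U_1, \dots, U_p$ is read off from the left singular vectors of the concatenation $[U_1 | \cdots | U_p]$ (data illustrative):

```python
import numpy as np

# Sketch: flag mean of subspaces span(U_1), ..., span(U_p), computed from
# the left singular vectors of the concatenated orthonormal bases.
rng = np.random.default_rng(0)
bases = [np.linalg.qr(rng.standard_normal((8, 2)))[0] for _ in range(5)]

U, _, _ = np.linalg.svd(np.hstack(bases), full_matrices=False)
flag_mean = U[:, :2]    # first 2 columns: best rank-2 "average" subspace
```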