Search Results for author: Robert Underwood

Found 3 papers, 1 paper with code

Understanding The Effectiveness of Lossy Compression in Machine Learning Training Sets

no code implementations · 23 Mar 2024 · Robert Underwood, Jon C. Calhoun, Sheng Di, Franck Cappello

We designed a systematic methodology for evaluating data reduction techniques for ML/AI, and we use it to perform a comprehensive evaluation of 17 data reduction methods on 7 ML/AI applications, showing that modern lossy compression methods can achieve a 50-100x improvement in compression ratio for a 1% or less loss in quality.
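The abstract above trades off two quantities: compression ratio (original size over compressed size) and quality loss (relative drop in a model metric when training on decompressed data). A minimal sketch of how these two metrics could be computed; the function names and inputs here are illustrative placeholders, not the paper's actual methodology.

```python
def compression_ratio(original_bytes: float, compressed_bytes: float) -> float:
    # Ratio of original to compressed size; 50.0 means "50x compression".
    return original_bytes / compressed_bytes

def quality_loss(baseline_metric: float, lossy_metric: float) -> float:
    # Relative drop in a quality metric (e.g. accuracy trained on exact
    # vs. decompressed data); 0.01 corresponds to a 1% loss in quality.
    return (baseline_metric - lossy_metric) / baseline_metric

# A 100 MB dataset compressed to 2 MB at a 0.9 -> 0.891 accuracy drop
# would land inside the regime the abstract describes (50x, 1% loss).
assert compression_ratio(100.0, 2.0) == 50.0
assert abs(quality_loss(0.90, 0.891) - 0.01) < 1e-9
```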

Data Compression

Understanding Patterns of Deep Learning Model Evolution in Network Architecture Search

no code implementations · 22 Sep 2023 · Robert Underwood, Meghana Madhastha, Randal Burns, Bogdan Nicolae

We describe how evolutionary patterns emerge in distributed settings and identify opportunities for caching and improved scheduling.

Scheduling

cuSZ: An Efficient GPU-Based Error-Bounded Lossy Compression Framework for Scientific Data

2 code implementations · 19 Jul 2020 · Jiannan Tian, Sheng Di, Kai Zhao, Cody Rivera, Megan Hickman Fulp, Robert Underwood, Sian Jin, Xin Liang, Jon Calhoun, Dingwen Tao, Franck Cappello

To the best of our knowledge, cuSZ is the first error-bounded lossy compressor on GPUs for scientific data.
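"Error-bounded" here means the compressor guarantees that every reconstructed value differs from the original by no more than a user-specified bound. A minimal sketch of the idea using uniform scalar quantization — a simplified stand-in for the prediction-and-quantization stage in SZ-family compressors, not cuSZ's actual GPU pipeline:

```python
import numpy as np

def quantize(data: np.ndarray, eb: float) -> np.ndarray:
    # Map each value to the nearest multiple of 2*eb. Rounding to the
    # nearest bin center keeps the reconstruction error at most eb.
    # The small-integer codes are what an entropy coder would compress.
    return np.round(data / (2 * eb)).astype(np.int64)

def dequantize(codes: np.ndarray, eb: float) -> np.ndarray:
    # Reconstruct values from quantization codes.
    return codes * (2 * eb)

rng = np.random.default_rng(0)
data = rng.standard_normal(100_000)
eb = 1e-3  # absolute error bound

recon = dequantize(quantize(data, eb), eb)
assert np.max(np.abs(recon - data)) <= eb  # bound holds pointwise
```

Real SZ-style compressors first predict each value from already-decoded neighbors and quantize only the residual, which concentrates the codes near zero and makes them far more compressible.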

Distributed, Parallel, and Cluster Computing
