Search Results for author: Josh Fromm

Found 6 papers, 1 paper with code

Automated Backend-Aware Post-Training Quantization

no code implementations • 27 Mar 2021 • Ziheng Jiang, Animesh Jain, Andrew Liu, Josh Fromm, Chengqian Ma, Tianqi Chen, Luis Ceze

Quantization is a key technique to reduce the resource requirement and improve the performance of neural network deployment.

Quantization
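To make the idea concrete, here is a minimal sketch of symmetric per-tensor int8 post-training quantization. This is an illustrative example of quantization in general, not the backend-aware method proposed in the paper; the function names are hypothetical.

```python
# Illustrative sketch of symmetric per-tensor int8 quantization.
# NOT the paper's backend-aware method; names are hypothetical.

def quantize_int8(weights):
    """Map float weights to int8 values with a shared symmetric scale."""
    max_abs = max(abs(w) for w in weights)
    scale = max_abs / 127.0 if max_abs else 1.0
    q = [max(-128, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from int8 values."""
    return [v * scale for v in q]

weights = [0.5, -1.27, 0.02, 1.0]
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)
```

Storing `q` instead of `weights` cuts memory by 4x versus float32, at the cost of rounding error bounded by half a scale step; real systems additionally tune the scale (and often a zero point) per channel or per layer.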

SplitSR: An End-to-End Approach to Super-Resolution on Mobile Devices

no code implementations • 20 Jan 2021 • Xin Liu, Yuang Li, Josh Fromm, Yuntao Wang, Ziheng Jiang, Alex Mariakakis, Shwetak Patel

In this work, we demonstrate state-of-the-art latency and accuracy for on-device super-resolution using a novel hybrid architecture called SplitSR and a novel lightweight residual block called SplitSRBlock.

Super-Resolution

MetaPhys: Few-Shot Adaptation for Non-Contact Physiological Measurement

no code implementations • 5 Oct 2020 • Xin Liu, Ziheng Jiang, Josh Fromm, Xuhai Xu, Shwetak Patel, Daniel McDuff

Large individual differences in physiological processes make it challenging to design personalized health sensing algorithms.

Meta-Learning

Multi-Task Temporal Shift Attention Networks for On-Device Contactless Vitals Measurement

2 code implementations • NeurIPS 2020 • Xin Liu, Josh Fromm, Shwetak Patel, Daniel McDuff

Telehealth and remote health monitoring have become increasingly important during the SARS-CoV-2 pandemic, and this is widely expected to have a lasting impact on healthcare practices.

A Hardware-Software Blueprint for Flexible Deep Learning Specialization

no code implementations • 11 Jul 2018 • Thierry Moreau, Tianqi Chen, Luis Vega, Jared Roesch, Eddie Yan, Lianmin Zheng, Josh Fromm, Ziheng Jiang, Luis Ceze, Carlos Guestrin, Arvind Krishnamurthy

Specialized Deep Learning (DL) acceleration stacks, designed for a specific set of frameworks, model architectures, operators, and data types, offer the allure of high performance while sacrificing flexibility.

Code Generation • Style Transfer
