Search Results for author: Jae-Won Chung

Found 5 papers, 4 papers with code

Toward Cross-Layer Energy Optimizations in Machine Learning Systems

no code implementations • 10 Apr 2024 • Jae-Won Chung, Mosharaf Chowdhury

The enormous energy consumption of machine learning (ML) and generative AI workloads shows no sign of waning, taking a toll on operating costs, power delivery, and environmental sustainability.

Perseus: Removing Energy Bloat from Large Model Training

2 code implementations • 12 Dec 2023 • Jae-Won Chung, Yile Gu, Insu Jang, Luoxi Meng, Nikhil Bansal, Mosharaf Chowdhury

Training large AI models on numerous GPUs consumes a massive amount of energy.

Chasing Low-Carbon Electricity for Practical and Sustainable DNN Training

1 code implementation • 4 Mar 2023 • Zhenning Yang, Luoxi Meng, Jae-Won Chung, Mosharaf Chowdhury

Our solution observes real-time carbon intensity shifts during training and controls the energy consumption of GPUs accordingly, thereby reducing the carbon footprint of training while maintaining training performance.

Zeus: Understanding and Optimizing GPU Energy Consumption of DNN Training

1 code implementation • 12 Aug 2022 • Jie You, Jae-Won Chung, Mosharaf Chowdhury

In this paper, we observe that common practices to improve training performance can often lead to inefficient energy usage.

ShadowTutor: Distributed Partial Distillation for Mobile Video DNN Inference

1 code implementation • 24 Mar 2020 • Jae-Won Chung, Jae-Yun Kim, Soo-Mook Moon

We propose ShadowTutor, a distributed video DNN inference framework that reduces the number of network transmissions through intermittent knowledge distillation to a student model.

Distributed, Parallel, and Cluster Computing
