DEAL: Decremental Energy-Aware Learning in a Federated System

5 Feb 2021 · Wenting Zou, Li Li, Zichen Xu, Chengzhong Xu

Federated learning struggles with its heavy energy footprint on battery-powered devices. The learning process keeps all participating devices awake and drains scarce battery power to train a shared model collaboratively, yet it may still leak sensitive personal information. Traditional energy-management techniques in kernel mode can force a training device into low-power states, but doing so may violate the SLO of the collaborative learning task. To resolve the conflict between the learning SLO and energy efficiency, we propose DEAL, an energy-efficient learning system that saves energy and preserves privacy with a decremental learning design. DEAL reduces the energy footprint at two layers: 1) an optimization layer that selects a subset of workers with sufficient capacity and maximum reward, and 2) a specialized decremental learning algorithm that provides decremental and incremental update functions, allowing the kernel to correctly tune the local DVFS. We prototyped DEAL in containerized services with modern smartphone profiles and evaluated it on several learning benchmarks with realistic traces. DEAL reduces the energy footprint by 75.6%-82.4% across datasets compared to traditional methods, and converges 2-4X faster in model training than state-of-the-practice FL frameworks.
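The abstract does not detail the worker-selection optimization. The sketch below is a minimal illustration, assuming a capacity-constrained, reward-maximizing selection; the `Worker` fields, the `select_workers` helper, and the greedy knapsack-style heuristic are hypothetical stand-ins, not DEAL's actual algorithm.

```python
# Hypothetical sketch of a worker-selection layer: pick workers with
# sufficient capacity and maximum total reward under a per-round budget.
# All names and the greedy heuristic are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class Worker:
    worker_id: str
    capacity: float   # remaining energy budget this round (assumed unit: joules)
    reward: float     # estimated contribution to model convergence

def select_workers(workers, min_capacity, round_budget):
    """Greedy knapsack-style heuristic: rank eligible workers by
    reward per unit of capacity, then fill the round's energy budget."""
    eligible = [w for w in workers if w.capacity >= min_capacity]
    eligible.sort(key=lambda w: w.reward / w.capacity, reverse=True)
    chosen, spent = [], 0.0
    for w in eligible:
        if spent + w.capacity <= round_budget:
            chosen.append(w)
            spent += w.capacity
    return chosen

# Example: select from three devices under a 10 J round budget.
pool = [Worker("phone-a", 4.0, 0.9), Worker("phone-b", 6.0, 0.7),
        Worker("phone-c", 3.0, 0.2)]
print([w.worker_id for w in select_workers(pool, min_capacity=2.5,
                                           round_budget=10.0)])
# -> ['phone-a', 'phone-b']
```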
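Likewise, the decremental and incremental update functions are not specified here. One standard way to realize such updates for a linear model is a Sherman-Morrison rank-one update, sketched below under the assumption of a ridge-regression objective; the `DecrementalRidge` class is hypothetical and not taken from the paper.

```python
# Illustrative decremental/incremental updates for ridge regression via
# Sherman-Morrison rank-one updates: a sample can be learned (incremental)
# or forgotten (decremental) without retraining from scratch.
import numpy as np

class DecrementalRidge:
    def __init__(self, dim, lam=1.0):
        self.A_inv = np.eye(dim) / lam   # (X^T X + lam*I)^{-1}, initially (lam*I)^{-1}
        self.b = np.zeros(dim)           # X^T y

    def _rank_one(self, x, sign):
        # Sherman-Morrison update of A_inv for A <- A + sign * x x^T.
        Ax = self.A_inv @ x
        self.A_inv -= sign * np.outer(Ax, Ax) / (1.0 + sign * (x @ Ax))

    def add(self, x, y):      # incremental update: learn one sample
        self._rank_one(x, +1.0)
        self.b += y * x

    def remove(self, x, y):   # decremental update: forget one sample
        self._rank_one(x, -1.0)
        self.b -= y * x

    @property
    def weights(self):
        return self.A_inv @ self.b
```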
