Search Results for author: Kibaek Kim

Found 13 papers, 4 papers with code

Addressing Heterogeneity in Federated Load Forecasting with Personalization Layers

no code implementations · 1 Apr 2024 · Shourya Bose, Yu Zhang, Kibaek Kim

The advent of smart meters has enabled pervasive collection of energy consumption data for training short-term load forecasting models.

Federated Learning · Load Forecasting · +1
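The personalization-layer idea named in the title can be sketched as follows. This is a minimal illustration under assumed names (a shared "encoder" and a client-specific "head"), not the authors' implementation: only the shared layers are averaged across clients, while personalization layers stay local to absorb client heterogeneity.

```python
import numpy as np

def fed_avg_with_personalization(clients, shared_keys):
    """Average only the shared layers across clients; personalization
    layers (all other keys) remain local to each client."""
    avg = {k: np.mean([c[k] for c in clients], axis=0) for k in shared_keys}
    for c in clients:
        c.update({k: avg[k].copy() for k in shared_keys})
    return clients

# Two hypothetical clients: a shared encoder and a personal output head.
client_a = {"encoder": np.ones(3), "head": np.zeros(2)}
client_b = {"encoder": 3 * np.ones(3), "head": np.ones(2)}
fed_avg_with_personalization([client_a, client_b], shared_keys=["encoder"])
```

After the round, both clients hold the averaged encoder, but each keeps its own head, which is the mechanism that addresses heterogeneity across buildings or meters.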

Gradient-Free Adaptive Global Pruning for Pre-trained Language Models

1 code implementation · 28 Feb 2024 · Guangji Bai, Yijiang Li, Chen Ling, Kibaek Kim, Liang Zhao

The transformative impact of large language models (LLMs) like LLaMA and GPT on natural language processing is countered by their prohibitive computational demands.

Computational Efficiency · Problem Decomposition

Secure Federated Learning Across Heterogeneous Cloud and High-Performance Computing Resources -- A Case Study on Federated Fine-tuning of LLaMA 2

no code implementations · 19 Feb 2024 · Zilinghan Li, Shilan He, Pranshu Chaturvedi, Volodymyr Kindratenko, Eliu A Huerta, Kibaek Kim, Ravi Madduri

Federated learning enables multiple data owners to collaboratively train robust machine learning models without transferring large or sensitive local datasets by only sharing the parameters of the locally trained models.

Cloud Computing · Federated Learning · +1
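The mechanism described above, sharing only locally trained parameters rather than raw data, amounts to one federated round. A minimal sketch (least-squares clients and plain parameter averaging are assumptions for illustration; the paper's setting is LLaMA fine-tuning across cloud and HPC resources):

```python
import numpy as np

def local_update(w, X, y, lr=0.1):
    """One local least-squares gradient step; raw (X, y) never leaves the client."""
    grad = X.T @ (X @ w - y) / len(y)
    return w - lr * grad

def federated_round(w_global, datasets):
    """Each client trains locally, then only the parameters are averaged."""
    locals_ = [local_update(w_global.copy(), X, y) for X, y in datasets]
    return np.mean(locals_, axis=0)

rng = np.random.default_rng(0)
datasets = [(rng.normal(size=(20, 3)), rng.normal(size=20)) for _ in range(4)]
w = federated_round(np.zeros(3), datasets)
```

The server only ever sees the four parameter vectors, never the clients' datasets, which is what makes the approach viable for large or sensitive local data.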

QCQP-Net: Reliably Learning Feasible Alternating Current Optimal Power Flow Solutions Under Constraints

no code implementations · 11 Jan 2024 · Sihan Zeng, Youngdae Kim, Yuxuan Ren, Kibaek Kim

At the heart of power system operations, alternating current optimal power flow (ACOPF) studies the generation of electric power in the most economical way under network-wide load requirement, and can be formulated as a highly structured non-convex quadratically constrained quadratic program (QCQP).
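A QCQP of the kind referred to here can be written in a standard form (generic notation, not the paper's exact ACOPF formulation):

```latex
\begin{aligned}
\min_{x \in \mathbb{R}^n} \quad & x^\top Q_0 x + c_0^\top x \\
\text{s.t.} \quad & x^\top Q_i x + c_i^\top x + d_i \le 0, \quad i = 1, \dots, m .
\end{aligned}
```

In ACOPF the matrices $Q_i$ arise from the network's power-flow physics and are generally indefinite, which is what makes the problem non-convex.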

Privacy-Preserving Load Forecasting via Personalized Model Obfuscation

no code implementations · 21 Nov 2023 · Shourya Bose, Yu Zhang, Kibaek Kim

The widespread adoption of smart meters provides access to detailed and localized load consumption data, suitable for training building-level load forecasting models.

Federated Learning · Load Forecasting · +1

FedCompass: Efficient Cross-Silo Federated Learning on Heterogeneous Client Devices using a Computing Power Aware Scheduler

1 code implementation · 26 Sep 2023 · Zilinghan Li, Pranshu Chaturvedi, Shilan He, Han Chen, Gagandeep Singh, Volodymyr Kindratenko, E. A. Huerta, Kibaek Kim, Ravi Madduri

Nonetheless, because of the disparity of computing resources among different clients (i.e., device heterogeneity), synchronous federated learning algorithms suffer degraded efficiency while waiting for straggler clients.

Federated Learning
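The general idea of a computing-power-aware scheduler can be illustrated by grouping clients of similar speed so that synchronous aggregation within a group rarely waits on a straggler. This is an illustrative heuristic only, with made-up client names; it is not the actual Compass scheduling algorithm:

```python
def group_by_speed(client_speeds, n_groups=2):
    """Group clients with similar estimated computing power so that
    synchronous aggregation within a group rarely waits on stragglers.
    (Illustrative heuristic; not the actual Compass scheduler.)"""
    ranked = sorted(client_speeds, key=client_speeds.get, reverse=True)
    size = -(-len(ranked) // n_groups)  # ceiling division
    return [ranked[i:i + size] for i in range(0, len(ranked), size)]

speeds = {"gpu_node": 10.0, "workstation": 4.0, "laptop": 1.0, "edge_box": 0.5}
groups = group_by_speed(speeds)
```

Fast and slow clients then synchronize on their own cadences instead of the fastest device idling for the slowest one every round.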

Federated Short-Term Load Forecasting with Personalization Layers for Heterogeneous Clients

no code implementations · 22 Sep 2023 · Shourya Bose, Kibaek Kim

The advent of smart meters has enabled pervasive collection of energy consumption data for training short-term load forecasting (STLF) models.

Federated Learning · Load Forecasting · +1

APPFLx: Providing Privacy-Preserving Cross-Silo Federated Learning as a Service

1 code implementation · 17 Aug 2023 · Zilinghan Li, Shilan He, Pranshu Chaturvedi, Trung-Hieu Hoang, Minseok Ryu, E. A. Huerta, Volodymyr Kindratenko, Jordan Fuhrman, Maryellen Giger, Ryan Chard, Kibaek Kim, Ravi Madduri

Cross-silo privacy-preserving federated learning (PPFL) is a powerful tool to collaboratively train robust and generalized machine learning (ML) models without sharing sensitive (e.g., healthcare or financial) local data.

Federated Learning · Privacy Preserving

Differentially Private Distributed Convex Optimization

no code implementations · 28 Feb 2023 · Minseok Ryu, Kibaek Kim

This paper considers distributed optimization (DO) where multiple agents cooperate to minimize a global objective function, expressed as a sum of local objectives, subject to some constraints.

Distributed Optimization · Federated Learning · +1
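The DO problem described in the abstract, a global objective expressed as a sum of local objectives under constraints, has the standard form (generic notation, assumed rather than taken from the paper):

```latex
\min_{x \in \mathcal{X}} \; F(x) = \sum_{i=1}^{N} f_i(x),
```

where $f_i$ is the local objective held by agent $i$ and the set $\mathcal{X}$ encodes the shared constraints. Differential privacy then concerns what each agent reveals about $f_i$ through the quantities it communicates.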

Differentially Private Federated Learning via Inexact ADMM with Multiple Local Updates

no code implementations · 18 Feb 2022 · Minseok Ryu, Kibaek Kim

Differential privacy (DP) techniques can be applied to the federated learning model to statistically guarantee data privacy against inference attacks to communication among the learning agents.

Federated Learning · Image Classification
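The statistical guarantee mentioned above typically comes from perturbing whatever a client communicates. A sketch of the generic Gaussian mechanism, clip the update, then add calibrated noise before sharing; the clip and sigma values are illustrative, and the paper's actual scheme is an inexact-ADMM variant, not this plain mechanism:

```python
import numpy as np

def privatize(update, clip=1.0, sigma=0.5, rng=None):
    """Clip a client's model update and add Gaussian noise before sharing,
    so the communicated parameters carry a differential-privacy guarantee.
    (Generic Gaussian mechanism; the paper uses inexact ADMM instead.)"""
    rng = rng or np.random.default_rng()
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip / max(norm, 1e-12))
    return clipped + rng.normal(scale=sigma, size=update.shape)

noisy = privatize(np.ones(5), rng=np.random.default_rng(42))
```

Clipping bounds the sensitivity of the released vector, and the noise scale relative to that bound is what determines the privacy level against inference attacks.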

APPFL: Open-Source Software Framework for Privacy-Preserving Federated Learning

1 code implementation · 8 Feb 2022 · Minseok Ryu, Youngdae Kim, Kibaek Kim, Ravi K. Madduri

Federated learning (FL) enables training models at different sites and sharing only the resulting weight updates, instead of transferring data to a central location for training as in classical machine learning.

Federated Learning · Privacy Preserving

A Reinforcement Learning Approach to Parameter Selection for Distributed Optimal Power Flow

no code implementations · 22 Oct 2021 · Sihan Zeng, Alyssa Kody, Youngdae Kim, Kibaek Kim, Daniel K. Molzahn

We train our RL policy using deep Q-learning, and show that this policy can result in significantly accelerated convergence (up to a 59% reduction in the number of iterations compared to existing, curvature-informed penalty parameter selection methods).

Distributed Optimization · Q-Learning · +2
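The core Q-learning update used when learning to select a penalty parameter can be sketched as follows. The state/action design here is hypothetical (coarse convergence-progress states, a discrete menu of penalty values rho, reward as negative iteration count); the paper uses deep Q-learning rather than this tabular version:

```python
import numpy as np

# Hypothetical setup: states index coarse convergence progress; actions pick
# a penalty parameter rho from a small discrete menu.
rhos = [0.1, 1.0, 10.0]
Q = np.zeros((5, len(rhos)))

def q_update(Q, s, a, reward, s_next, lr=0.1, gamma=0.9):
    """Standard tabular Q-learning update; the reward could be, e.g., the
    negative number of iterations needed to reach convergence."""
    target = reward + gamma * Q[s_next].max()
    Q[s, a] += lr * (target - Q[s, a])
    return Q

Q = q_update(Q, s=0, a=1, reward=-3.0, s_next=1)
```

Replacing the table with a neural network over richer state features gives the deep Q-learning policy that the abstract reports accelerates convergence.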

Differentially Private Federated Learning via Inexact ADMM

no code implementations · 11 Jun 2021 · Minseok Ryu, Kibaek Kim

Differential privacy (DP) techniques can be applied to the federated learning model to protect data privacy against inference attacks to communication among the learning agents.

Federated Learning · Image Classification
