no code implementations • 6 May 2023 • SangHyeon Park, Junmo Lee, Soo-Mook Moon
Decentralized solutions such as blockchain have been proposed to tackle these issues, but they often struggle with large-scale models, leading to slow inference and inefficient training verification.
1 code implementation • ICLR 2023 • Minjae Kim, Sangyoon Yu, Suhyun Kim, Soo-Mook Moon
Federated learning trains a global model without collecting private local data from clients.
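The idea in the sentence above can be sketched with federated averaging (FedAvg), a common instantiation of federated learning; this is not the paper's method, just a minimal illustration in which clients train locally and the server only ever sees model weights, never raw data. The toy linear model and learning rate are assumptions.

```python
def local_update(w, data, lr=0.1, epochs=1):
    # Hypothetical client step: gradient descent on local data only;
    # raw data never leaves the client, only the updated weight does.
    for _ in range(epochs):
        grad = sum((w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def fed_avg(global_w, client_datasets):
    # Server aggregates client models, weighted by local dataset size.
    total = sum(len(d) for d in client_datasets)
    return sum(len(d) * local_update(global_w, d)
               for d in client_datasets) / total

# Toy run: two clients hold private samples of y = 3x.
clients = [
    [(1.0, 3.0), (2.0, 6.0)],
    [(0.5, 1.5), (3.0, 9.0), (1.5, 4.5)],
]
w = 0.0
for _ in range(60):
    w = fed_avg(w, clients)
print(round(w, 3))  # converges toward 3.0
```

Each round, every client fits the shared weight to its own data and the server averages the results, so the global model improves without any `(x, y)` pair being transmitted.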
1 code implementation • 24 Mar 2020 • Jae-Won Chung, Jae-Yun Kim, Soo-Mook Moon
We propose ShadowTutor, a distributed video DNN inference framework that reduces the number of network transmissions through intermittent knowledge distillation to a student model.
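The intermittent-distillation idea can be sketched as a control loop: the client runs a cheap student model on every frame and only occasionally consults the server's teacher, distilling (updating the student) when the two disagree. All models, the check interval, and the threshold below are hypothetical stand-ins, not ShadowTutor's actual components.

```python
def teacher(frame):
    # Stand-in for the large server-side teacher DNN.
    return frame * 2.0

def run(frames, check_every=5, threshold=0.5):
    scale = 1.0          # the student's single (initially poor) weight
    transmissions = 0
    for i, frame in enumerate(frames):
        pred = frame * scale                 # cheap local student inference
        if i % check_every == 0:             # intermittent server round-trip
            transmissions += 1
            target = teacher(frame)
            if abs(pred - target) > threshold:
                # "Distill": pull the student toward the teacher's output.
                scale += (target - pred) / frame
    return scale, transmissions

scale, n = run([1.0] * 20)
print(scale, n)  # student matches the teacher after one correction
```

Because the student is refreshed only at sparse checkpoints, most frames incur no network traffic at all, which is the transmission saving the sentence above describes.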
Distributed, Parallel, and Cluster Computing
no code implementations • 14 Nov 2019 • Jae-Yun Kim, Jun-Mo Lee, Yeon-Jae Koo, Sang-Hyeon Park, Soo-Mook Moon
Our experimental results show that Ethanos can reduce the size of the account state by half; combined with removing old transactions, this may reduce the storage required for bootstrapping to around 1 GB.