Search Results for author: Soo-Mook Moon

Found 4 papers, 2 papers with code

A Blockchain-based Platform for Reliable Inference and Training of Large-Scale Models

no code implementations • 6 May 2023 • SangHyeon Park, Junmo Lee, Soo-Mook Moon

Decentralized solutions such as blockchain have been proposed to tackle these issues, but they often struggle when dealing with large-scale models, leading to time-consuming inference and inefficient training verification.

DepthFL: Depthwise Federated Learning for Heterogeneous Clients

1 code implementation • ICLR 2023 • Minjae Kim, Sangyoon Yu, Suhyun Kim, Soo-Mook Moon

Federated learning trains a global model without collecting clients' private local data.
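DepthFL lets heterogeneous clients train models of different depths. A minimal sketch of the idea, assuming a simplified per-layer averaging scheme (the function name `depthwise_fedavg` and the aggregation details are illustrative, not the paper's exact algorithm):

```python
import numpy as np

def depthwise_fedavg(client_updates):
    """Average per-layer weights from clients that train models of
    different depths (a simplified sketch in the spirit of DepthFL;
    the actual algorithm in the paper differs in detail).

    client_updates: list of lists of np.ndarray, where client i
    contributes weights for its first len(client_updates[i]) layers.
    """
    max_depth = max(len(update) for update in client_updates)
    global_weights = []
    for layer in range(max_depth):
        # Only clients deep enough to have trained this layer contribute.
        contribs = [u[layer] for u in client_updates if len(u) > layer]
        global_weights.append(np.mean(contribs, axis=0))
    return global_weights

# Example: three clients with depths 1, 2, and 3 (scalar "layers").
clients = [
    [np.array([1.0])],
    [np.array([3.0]), np.array([2.0])],
    [np.array([5.0]), np.array([4.0]), np.array([6.0])],
]
print(depthwise_fedavg(clients))
```

Shallow layers here benefit from every client's data, while deeper layers are averaged only over the clients capable of training them.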

Federated Learning

ShadowTutor: Distributed Partial Distillation for Mobile Video DNN Inference

1 code implementation • 24 Mar 2020 • Jae-Won Chung, Jae-Yun Kim, Soo-Mook Moon

We propose ShadowTutor, a distributed video DNN inference framework that reduces the number of network transmissions through intermittent knowledge distillation to a student model.
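The distillation step can be sketched with a generic soft-target loss. This is a standard knowledge-distillation formulation, assuming a temperature-scaled softmax; ShadowTutor's partial-distillation schedule and hyperparameters are not reproduced here:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax (numerically stabilized)."""
    z = np.asarray(logits, dtype=float) / T
    e = np.exp(z - z.max())
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """Cross-entropy between the teacher's and student's softened
    output distributions (a generic sketch, not ShadowTutor's exact
    objective)."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # Cross-entropy H(teacher, student); minimized when they match.
    return -np.sum(p_teacher * np.log(p_student + 1e-12))

teacher = [2.0, 0.5, -1.0]
student = [0.0, 0.0, 0.0]
print(distillation_loss(student, teacher))
```

The loss is minimized when the student reproduces the teacher's soft output distribution, which is why the student can stand in for the teacher between distillation rounds.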

Distributed, Parallel, and Cluster Computing

Ethanos: Lightweight Bootstrapping for Ethereum

no code implementations • 14 Nov 2019 • Jae-Yun Kim, Jun-Mo Lee, Yeon-Jae Koo, Sang-Hyeon Park, Soo-Mook Moon

Our experimental results show that Ethanos can reduce the size of the account state by half; combined with removing old transactions, this may reduce the storage size for bootstrapping to around 1 GB.
