no code implementations • 17 Jun 2020 • Seungeun Oh, Jihong Park, Eunjeong Jeong, Hyesung Kim, Mehdi Bennis, Seong-Lyun Kim
This letter proposes a novel communication-efficient and privacy-preserving distributed machine learning framework, coined Mix2FLD.
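The "Mix2" in the name points to mixup-style sample mixing; the letter's full pipeline is its own, but the underlying mixup augmentation it builds on can be sketched in a few lines (NumPy, all names illustrative):

```python
import numpy as np

def mixup(x1, y1, x2, y2, alpha=1.0, rng=np.random.default_rng(0)):
    """Blend two samples and their one-hot labels with a Beta-distributed weight.

    Mixed samples obscure the raw private inputs (privacy) while still
    carrying label information for training.
    """
    lam = rng.beta(alpha, alpha)          # mixing ratio in (0, 1)
    x = lam * x1 + (1 - lam) * x2         # mixed input
    y = lam * y1 + (1 - lam) * y2         # mixed (soft) label
    return x, y

# Example: mixing two one-hot labeled samples
xa, ya = np.array([1.0, 0.0]), np.array([1.0, 0.0])
xb, yb = np.array([0.0, 1.0]), np.array([0.0, 1.0])
xm, ym = mixup(xa, ya, xb, yb)
```

The mixed label `ym` is a convex combination of the two one-hot labels, so it still sums to one.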
no code implementations • 13 May 2020 • Han Cha, Jihong Park, Hyesung Kim, Mehdi Bennis, Seong-Lyun Kim
Traditional distributed deep reinforcement learning (RL) commonly relies on exchanging the experience replay memory (RM) of each agent.
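A minimal sketch of what "exchanging the replay memory" means in the naive baseline (plain Python; all names are illustrative, not the paper's code):

```python
import random
from collections import deque

class Agent:
    def __init__(self, capacity=1000):
        # Experience replay memory: (state, action, reward, next_state) tuples
        self.rm = deque(maxlen=capacity)

    def store(self, transition):
        self.rm.append(transition)

    def sample(self, k):
        return random.sample(self.rm, k)

def exchange_rm(agents):
    """Naive all-to-all exchange: every agent receives every other agent's RM.

    The communication cost grows with the number of stored transitions,
    which is the overhead such RM-exchange schemes try to reduce.
    """
    snapshots = [list(a.rm) for a in agents]
    for i, agent in enumerate(agents):
        for j, snap in enumerate(snapshots):
            if i != j:
                for t in snap:
                    agent.store(t)

a, b = Agent(), Agent()
a.store(("s0", 0, 1.0, "s1"))
b.store(("s2", 1, 0.5, "s3"))
exchange_rm([a, b])
```

After the exchange, each agent holds both its own and the other agent's transitions.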
no code implementations • 16 Aug 2019 • Jihong Park, Shiqiang Wang, Anis Elgabli, Seungeun Oh, Eunjeong Jeong, Han Cha, Hyesung Kim, Seong-Lyun Kim, Mehdi Bennis
Devices at the edge of wireless networks are the last mile data sources for machine learning (ML).
no code implementations • 15 Jul 2019 • Eunjeong Jeong, Seungeun Oh, Jihong Park, Hyesung Kim, Mehdi Bennis, Seong-Lyun Kim
On-device machine learning (ML) provides access to a tremendous amount of user data while keeping each user's data private on the device, instead of storing it in a central entity.

no code implementations • 15 Jul 2019 • Han Cha, Jihong Park, Hyesung Kim, Seong-Lyun Kim, Mehdi Bennis
In distributed reinforcement learning, it is common to exchange the experience memory of each agent and thereby collectively train their local models.
no code implementations • 28 Nov 2018 • Eunjeong Jeong, Seungeun Oh, Hyesung Kim, Jihong Park, Mehdi Bennis, Seong-Lyun Kim
On-device machine learning (ML) enables the training process to exploit a massive amount of user-generated private data samples.
2 code implementations • 12 Aug 2018 • Hyesung Kim, Jihong Park, Mehdi Bennis, Seong-Lyun Kim
This letter proposes a blockchained federated learning (BlockFL) architecture in which local learning model updates are exchanged and verified via blockchain.
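A toy sketch of the exchange-and-verify idea, not BlockFL's actual protocol: model updates are appended as hash-chained blocks, so tampering with any recorded update breaks verification (all names illustrative):

```python
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 hash of a block's JSON serialization."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_update(chain, device_id, update):
    """Append a device's local model update as a block linked to the chain tip."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev": prev, "device": device_id, "update": update})
    return chain

def verify(chain):
    """Check the hash links, i.e. that no recorded update was altered."""
    for i in range(1, len(chain)):
        if chain[i]["prev"] != block_hash(chain[i - 1]):
            return False
    return True

chain = []
append_update(chain, "device-1", [0.1, -0.2])
append_update(chain, "device-2", [0.05, 0.3])
ok_before = verify(chain)
chain[0]["update"] = [9.9, 9.9]   # tampering with a recorded update
ok_after = verify(chain)
```

Verification succeeds on the intact chain and fails once a block is modified, which is the property the exchanged updates rely on.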
Information Theory • Networking and Internet Architecture