no code implementations • 19 Feb 2024 • Sujin Kook, Won-Yong Shin, Seong-Lyun Kim, Seung-Woo Ko
The vision of pervasive artificial intelligence (AI) services can be realized by training AI models in a timely manner using real-time data collected by Internet of Things (IoT) devices.
no code implementations • 23 Jan 2024 • Yongjun Kim, Sejin Seo, Jihong Park, Mehdi Bennis, Seong-Lyun Kim, Junil Choi
In this work, we compare emergent communication (EC) built upon multi-agent deep reinforcement learning (MADRL) and language-oriented semantic communication (LSC) empowered by a pre-trained large language model (LLM) using human language.
no code implementations • 10 Jan 2024 • Eleonora Grassucci, Jihong Park, Sergio Barbarossa, Seong-Lyun Kim, Jinho Choi, Danilo Comminiello
Disclosing the capabilities of generative models in semantic communication paves the way for a paradigm shift with respect to conventional communication systems, with great potential to reduce the amount of data traffic and to offer revolutionary versatility for novel tasks and applications that were not even conceivable a few years ago.
no code implementations • 14 Nov 2023 • Kyuwon Han, Seung Min Yu, Seong-Lyun Kim, Seung-Woo Ko
Smartphone-based user mobility tracking can be effective in locating a user, but the unpredictable error caused by the low specification of built-in inertial measurement units (IMUs) precludes standalone use and demands integration with another positioning technique, such as WiFi positioning.
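The fusion this paper calls for can be pictured with a minimal sketch, assuming a scalar Kalman-style correction per axis; the drift rates, noise variances, and step values below are illustrative assumptions, not parameters from the paper:

```python
import numpy as np

# Minimal sketch: fuse drift-prone IMU dead reckoning with noisy WiFi fixes
# via a scalar Kalman-style update per axis. All noise values here are
# illustrative assumptions, not values from the paper.

def fuse_position(imu_pos, imu_var, wifi_pos, wifi_var):
    """Combine an IMU dead-reckoned position with a WiFi fix,
    weighting each by the inverse of its error variance."""
    gain = imu_var / (imu_var + wifi_var)          # Kalman gain
    fused = imu_pos + gain * (wifi_pos - imu_pos)  # corrected estimate
    fused_var = (1.0 - gain) * imu_var             # reduced uncertainty
    return fused, fused_var

pos, var = np.array([0.0, 0.0]), 1.0
for step in range(5):
    pos = pos + np.array([0.9, 0.1])    # IMU step (drifts over time)
    var += 0.5                          # drift grows the IMU error variance
    wifi_fix = np.array([step + 1.0, 0.0]) + np.random.normal(0, 1.0, 2)
    pos, var = fuse_position(pos, var, wifi_fix, wifi_var=1.0)
    print(f"step {step}: fused position {pos.round(2)}, variance {var:.2f}")
```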
no code implementations • 14 Oct 2023 • Jihong Park, Seung-Woo Ko, Jinho Choi, Seong-Lyun Kim, Mehdi Bennis
This article organizes medium access control (MAC) protocols into three levels: Level 1 MAC, task-oriented neural protocols constructed using multi-agent deep reinforcement learning (MADRL); Level 2 MAC, neural network-oriented symbolic protocols developed by converting Level 1 MAC outputs into explicit symbols; and Level 3 MAC, language-oriented semantic protocols harnessing large language models (LLMs).
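To make the Level 2 symbolization step concrete, here is a minimal sketch that vector-quantizes continuous neural control messages into a small discrete codebook; the message dimension, codebook size, and use of k-means are illustrative assumptions rather than the paper's procedure:

```python
import numpy as np
from sklearn.cluster import KMeans

# Minimal sketch of symbolization: map continuous neural control messages
# onto a small discrete codebook, one plausible reading of "converting
# Level 1 MAC outputs into explicit symbols". Codebook size and k-means
# are illustrative assumptions, not the paper's procedure.

rng = np.random.default_rng(0)
neural_messages = rng.normal(size=(1000, 8))   # Level 1 MAC outputs (assumed shape)

codebook = KMeans(n_clusters=16, n_init=10, random_state=0).fit(neural_messages)
symbols = codebook.predict(neural_messages)    # each message -> one of 16 symbols

# A symbol now acts as an explicit, interpretable token on the control channel.
print(symbols[:10])                            # e.g., [ 3 12  7 ...]
print(codebook.cluster_centers_.shape)         # (16, 8) symbol prototypes
```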
no code implementations • 13 Oct 2023 • Jinhyuk Choi, Jihong Park, Seung-Woo Ko, Jinho Choi, Mehdi Bennis, Seong-Lyun Kim
In this method, referred to as SL with layer freezing (SLF), each encoder downloads a misaligned decoder and locally fine-tunes a fraction of the encoder-decoder neural network (NN) layers.
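A minimal sketch of the layer-freezing idea, assuming a toy fully-connected decoder; which block is unfrozen, and the architecture itself, are illustrative choices rather than the SLF configuration:

```python
import torch.nn as nn

# Minimal sketch of layer freezing for local fine-tuning, in the spirit of
# SLF: freeze most of a downloaded decoder and update only its first block.
# The architecture and choice of unfrozen layers are illustrative assumptions.

decoder = nn.Sequential(
    nn.Linear(64, 64), nn.ReLU(),   # block 0: locally fine-tuned
    nn.Linear(64, 64), nn.ReLU(),   # block 1: frozen
    nn.Linear(64, 10),              # head: frozen
)

for param in decoder.parameters():
    param.requires_grad = False      # freeze everything first
for param in decoder[0].parameters():
    param.requires_grad = True       # unfreeze the fraction to fine-tune

trainable = [p for p in decoder.parameters() if p.requires_grad]
print(sum(p.numel() for p in trainable), "trainable parameters")
```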
no code implementations • 20 Sep 2023 • Hyelin Nam, Jihong Park, Jinho Choi, Mehdi Bennis, Seong-Lyun Kim
By integrating recent advances in large language models (LLMs) and generative models into the emerging semantic communication (SC) paradigm, in this article we put forward a novel framework of language-oriented semantic communication (LSC).
no code implementations • 8 Sep 2023 • Hyelin Nam, Jihong Park, Jinho Choi, Seong-Lyun Kim
Our work is expected to pave a new road toward applying state-of-the-art generative models in real communication systems.
no code implementations • 17 Apr 2023 • Jihoon Park, Seungeun Oh, Seong-Lyun Kim
Automatic modulation classification (AMC) is a technology that identifies a modulation scheme without prior signal information and plays a vital role in various applications, including cognitive radio and link adaptation.
no code implementations • 13 Dec 2022 • Jihong Park, Jinho Choi, Seong-Lyun Kim, Mehdi Bennis
Metaverse over wireless networks is an emerging use case of the sixth generation (6G) wireless systems, posing unprecedented challenges in terms of its multi-modal data transmissions with stringent latency and reliability requirements.
no code implementations • 15 Nov 2022 • Jinhyuk Choi, Seong-Lyun Kim, Seung-Woo Ko
Specifically, the feature network is designed based on a feature hierarchy, i.e., a one-directional feature dependency across different scales.
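One way to read this design is sketched below, assuming a toy network in which each scale's features are computed only from the previous, finer scale; the dimensions and layer types are illustrative assumptions:

```python
import torch
import torch.nn as nn

# Minimal sketch of a one-directional feature hierarchy: features at each
# scale depend only on the previous (finer) scale, never the other way
# around. Scales and layer widths are illustrative assumptions.

class FeatureHierarchy(nn.Module):
    def __init__(self, dims=(128, 64, 32)):
        super().__init__()
        self.stages = nn.ModuleList(
            [nn.Linear(dims[i], dims[i + 1]) for i in range(len(dims) - 1)]
        )

    def forward(self, x):
        features = [x]
        for stage in self.stages:         # one-directional: fine -> coarse only
            features.append(torch.relu(stage(features[-1])))
        return features                   # features at every scale

feats = FeatureHierarchy()(torch.randn(4, 128))
print([f.shape for f in feats])
```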
no code implementations • 28 Oct 2022 • Seungeun Oh, Jihong Park, Sihun Baek, Hyelin Nam, Praneeth Vepakomma, Ramesh Raskar, Mehdi Bennis, Seong-Lyun Kim
Split learning (SL) detours this by communicating smashed data at a cut-layer, yet it suffers from data privacy leakage and large communication costs caused by the high similarity between the ViT's smashed data and its input data.
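The cut-layer mechanism can be illustrated with a minimal sketch, assuming toy layer sizes and cut position; only the smashed activations, not the raw inputs, would cross the client-server link:

```python
import torch
import torch.nn as nn

# Minimal sketch of split learning's forward pass: the client computes
# activations up to a cut-layer and transmits only this "smashed data"
# to the server. Layer sizes and cut position are illustrative assumptions.

client_model = nn.Sequential(nn.Linear(784, 256), nn.ReLU())   # below the cut
server_model = nn.Sequential(nn.Linear(256, 10))               # above the cut

x = torch.randn(32, 784)            # a local batch that never leaves the client
smashed = client_model(x)           # cut-layer activations sent over the link
logits = server_model(smashed)      # server finishes the forward pass
print(smashed.shape, logits.shape)  # torch.Size([32, 256]) torch.Size([32, 10])
```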
no code implementations • 8 Jul 2022 • Sejin Seo, Jihong Park, Seung-Woo Ko, Jinho Choi, Mehdi Bennis, Seong-Lyun Kim
Classical medium access control (MAC) protocols are interpretable, yet their task-agnostic control signaling messages (CMs) are ill-suited for emerging mission-critical applications.
no code implementations • 1 Jul 2022 • Sihun Baek, Jihong Park, Praneeth Vepakomma, Ramesh Raskar, Mehdi Bennis, Seong-Lyun Kim
Leveraging this, we develop a novel SL framework for ViT, coined CutMixSL, communicating CutSmashed data.
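A hedged sketch of what communicating CutSmashed data might look like, assuming ViT-like smashed tensors of shape (tokens, dim) and random patch-wise masking; this illustrates the idea, not the exact CutMixSL construction:

```python
import torch

# Minimal sketch of "CutSmashed" data: two clients contribute complementary
# patch regions of their ViT smashed data, and only the mixture is uploaded.
# Tensor shapes and random patch selection are illustrative assumptions.

def cutmix_smashed(s1, s2, num_patches_kept):
    """Mix two smashed tensors of shape (tokens, dim) patch-wise."""
    tokens = s1.shape[0]
    mask = torch.zeros(tokens, 1)
    keep = torch.randperm(tokens)[:num_patches_kept]
    mask[keep] = 1.0                      # patches taken from client 1
    return mask * s1 + (1.0 - mask) * s2  # remaining patches from client 2

smashed_a = torch.randn(196, 384)         # client A's ViT smashed data
smashed_b = torch.randn(196, 384)         # client B's ViT smashed data
mixed = cutmix_smashed(smashed_a, smashed_b, num_patches_kept=98)
print(mixed.shape)                        # torch.Size([196, 384])
```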
no code implementations • 10 Feb 2022 • Kyeong-Joong Jeong, Jin-Duk Park, Kyusoon Hwang, Seong-Lyun Kim, Won-Yong Shin
We introduce a data-driven anomaly detection framework using a manufacturing dataset collected from a factory assembly line.
no code implementations • 26 Apr 2021 • Sejin Seo, Seung-Woo Ko, Jihong Park, Seong-Lyun Kim, Mehdi Bennis
The lottery ticket hypothesis (LTH) claims that a deep neural network (i.e., ground network) contains a number of subnetworks (i.e., winning tickets), each of which exhibits inference capability as accurate as that of the ground network.
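A minimal sketch of how such winning tickets are commonly identified, via global magnitude pruning; the toy network and the 80% pruning ratio are illustrative assumptions, not the paper's setup:

```python
import torch
import torch.nn as nn

# Minimal sketch of winning-ticket identification: globally prune the
# smallest-magnitude weights and keep a binary mask that defines the
# subnetwork. Network and pruning ratio are illustrative assumptions.

net = nn.Sequential(nn.Linear(100, 50), nn.ReLU(), nn.Linear(50, 10))

all_weights = torch.cat([p.abs().flatten() for p in net.parameters()])
threshold = torch.quantile(all_weights, 0.8)    # prune the smallest 80%

masks = [(p.abs() >= threshold).float() for p in net.parameters()]
with torch.no_grad():
    for p, m in zip(net.parameters(), masks):
        p.mul_(m)                               # zero out pruned weights

kept = sum(int(m.sum()) for m in masks)
print(f"winning-ticket subnetwork keeps {kept} of {all_weights.numel()} weights")
```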
4 code implementations • 4 Nov 2020 • Hyowoon Seo, Jihong Park, Seungeun Oh, Mehdi Bennis, Seong-Lyun Kim
The goal of this chapter is to provide a deep understanding of FD while demonstrating its communication efficiency and applicability to a variety of tasks.
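FD exchanges model outputs rather than model weights; the sketch below illustrates this with per-class average logits and a single distillation step, where all shapes and the KL-based loss are illustrative assumptions rather than the chapter's full protocol:

```python
import torch
import torch.nn.functional as F

# Minimal sketch of federated distillation: devices exchange per-class
# average logits instead of weights, and each device distills from the
# global average. Shapes and the single step are illustrative assumptions.

num_devices, num_classes = 5, 10
local_avg_logits = torch.randn(num_devices, num_classes, num_classes)
# local_avg_logits[d, c] = device d's average logit vector on class-c samples

global_logits = local_avg_logits.mean(dim=0)     # tiny payload vs. full weights

student_logits = torch.randn(32, num_classes, requires_grad=True)
labels = torch.randint(0, num_classes, (32,))
teacher = global_logits[labels]                  # class-matched teacher targets

loss = F.kl_div(F.log_softmax(student_logits, dim=1),
                F.softmax(teacher, dim=1), reduction="batchmean")
loss.backward()
print(f"distillation loss: {loss.item():.3f}")
```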
no code implementations • 6 Aug 2020 • Jihong Park, Sumudu Samarakoon, Anis Elgabli, Joongheon Kim, Mehdi Bennis, Seong-Lyun Kim, Mérouane Debbah
Machine learning (ML) is a promising enabler for the fifth generation (5G) communication systems and beyond.
no code implementations • 17 Jun 2020 • Seungeun Oh, Jihong Park, Eunjeong Jeong, Hyesung Kim, Mehdi Bennis, Seong-Lyun Kim
This letter proposes a novel communication-efficient and privacy-preserving distributed machine learning framework, coined Mix2FLD.
no code implementations • 9 Jun 2020 • MyungJae Shin, Chihoon Hwang, Joongheon Kim, Jihong Park, Mehdi Bennis, Seong-Lyun Kim
User-generated data distributions are often imbalanced across devices and labels, hampering the performance of federated learning (FL).
no code implementations • 1 Jun 2020 • Sejin Seo, Sang Won Choi, Sujin Kook, Seong-Lyun Kim, Seung-Woo Ko
Due to the edge's position between the cloud and the users, and the recent surge of deep neural network (DNN) applications, edge computing brings about uncertainties that must be understood separately.
Information Theory • Networking and Internet Architecture
no code implementations • 13 May 2020 • Han Cha, Jihong Park, Hyesung Kim, Mehdi Bennis, Seong-Lyun Kim
Traditional distributed deep reinforcement learning (RL) commonly relies on exchanging the experience replay memory (RM) of each agent.
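A minimal sketch of this baseline, assuming toy transitions and uniform sampling; the memory sizes and exchange schedule are illustrative, not the paper's settings:

```python
import random
from collections import deque

# Minimal sketch of the baseline this line describes: each agent keeps an
# experience replay memory (RM) of (state, action, reward, next_state)
# tuples and periodically shares a random sample of it with its peers.
# Sizes and the uniform sampling are illustrative assumptions.

def make_rm(agent_id, size=100):
    rm = deque(maxlen=1000)
    for t in range(size):
        rm.append((f"s{t}", "a", 1.0, f"s{t+1}", agent_id))
    return rm

agents = {i: make_rm(i) for i in range(3)}

# Agent 0 receives 10 random transitions from each peer's RM.
for peer in (1, 2):
    shared = random.sample(list(agents[peer]), k=10)
    agents[0].extend(shared)            # peer experience enters agent 0's RM

print(len(agents[0]), "transitions in agent 0's RM after exchange")
```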
no code implementations • 16 Aug 2019 • Jihong Park, Shiqiang Wang, Anis Elgabli, Seungeun Oh, Eunjeong Jeong, Han Cha, Hyesung Kim, Seong-Lyun Kim, Mehdi Bennis
Devices at the edge of wireless networks are the last mile data sources for machine learning (ML).
no code implementations • 15 Jul 2019 • Eunjeong Jeong, Seungeun Oh, Jihong Park, Hyesung Kim, Mehdi Bennis, Seong-Lyun Kim
On-device machine learning (ML) has brought access to a tremendous amount of user data while keeping local data private, rather than storing it in a central entity.
no code implementations • 15 Jul 2019 • Han Cha, Jihong Park, Hyesung Kim, Seong-Lyun Kim, Mehdi Bennis
In distributed reinforcement learning, it is common to exchange the experience memory of each agent and thereby collectively train their local models.
no code implementations • 28 Nov 2018 • Eunjeong Jeong, Seungeun Oh, Hyesung Kim, Jihong Park, Mehdi Bennis, Seong-Lyun Kim
On-device machine learning (ML) enables the training process to exploit a massive amount of user-generated private data samples.
2 code implementations • 12 Aug 2018 • Hyesung Kim, Jihong Park, Mehdi Bennis, Seong-Lyun Kim
By leveraging blockchain, this letter proposes a blockchained federated learning (BlockFL) architecture where local learning model updates are exchanged and verified.
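A toy sketch of the chaining-and-verification idea, assuming a single consensus-free chain; the block contents and verification loop are illustrative, not the BlockFL architecture:

```python
import hashlib
import json

# Toy sketch of the BlockFL idea: local model updates are recorded in
# blocks whose hashes chain together, so tampering with an exchanged
# update is detectable. The single-miner, consensus-free chain here is
# an illustrative assumption, not the paper's full architecture.

def make_block(updates, prev_hash):
    body = json.dumps({"updates": updates, "prev": prev_hash}, sort_keys=True)
    return {"body": body, "hash": hashlib.sha256(body.encode()).hexdigest()}

def verify_chain(chain):
    for i, block in enumerate(chain):
        if hashlib.sha256(block["body"].encode()).hexdigest() != block["hash"]:
            return False                 # update was tampered with
        if i > 0 and json.loads(block["body"])["prev"] != chain[i - 1]["hash"]:
            return False                 # chain linkage broken
    return True

genesis = make_block({"device_0": [0.1, -0.2]}, prev_hash="0" * 64)
block_1 = make_block({"device_1": [0.3, 0.05]}, prev_hash=genesis["hash"])
print(verify_chain([genesis, block_1]))  # True until someone edits an update
```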
Information Theory • Networking and Internet Architecture