no code implementations • 24 Dec 2024 • Honggu Kang, Seohyeon Cha, Joonhyuk Kang
We empirically demonstrate consistent performance gains of GeFL-F, along with better privacy preservation and robustness to a large number of clients.
1 code implementation • 26 Nov 2024 • Dongwon Kim, Matteo Zecchin, Sangwoo Park, Joonhyuk Kang, Osvaldo Simeone
Bayesian optimization (BO) is a sequential approach for optimizing black-box objective functions using zeroth-order noisy observations.
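To make the BO setting concrete, here is a minimal generic BO loop with a Gaussian-process surrogate and an expected-improvement acquisition. It is only a sketch of standard BO, not the specific method studied in this paper, and the toy objective `f` is hypothetical.

```python
# Minimal generic Bayesian-optimization loop (illustrative only, not the
# paper's method). Assumes scikit-learn and scipy; `f` is a hypothetical
# black-box objective observed with zeroth-order noise.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def f(x):  # hypothetical noisy black-box objective
    return -np.sin(3 * x) - x**2 + 0.7 * x + 0.1 * np.random.randn()

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 2.0, size=(3, 1))          # initial design points
y = np.array([f(x[0]) for x in X])
grid = np.linspace(-1.0, 2.0, 500).reshape(-1, 1)

for _ in range(20):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-2).fit(X, y)
    mu, sigma = gp.predict(grid, return_std=True)
    best = y.max()
    # Expected-improvement acquisition computed from the surrogate.
    z = (mu - best) / (sigma + 1e-9)
    ei = (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)
    x_next = grid[np.argmax(ei)]
    X = np.vstack([X, x_next])
    y = np.append(y, f(x_next[0]))

print("best observed x, f(x):", X[np.argmax(y)][0], y.max())
```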
1 code implementation • 15 Nov 2024 • YoungJoon Lee, Jinu Gong, Joonhyuk Kang
By utilizing this scoring mechanism before the aggregation phase, the proposed plugin enables existing FL techniques to become robust against Byzantine attacks while maintaining their original benefits.
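As a rough illustration of plugging a scoring step in front of aggregation, the sketch below down-weights client updates that lie far from the coordinate-wise median before averaging. This distance-to-median score is a hypothetical stand-in, not the paper's actual scoring mechanism.

```python
# Generic pre-aggregation scoring plugin (illustration only; the paper's
# scoring mechanism may differ). Updates far from the coordinate-wise
# median are dropped before plain averaging.
import numpy as np

def score_and_aggregate(client_updates, keep_ratio=0.6):
    """client_updates: list of 1-D numpy arrays (flattened model deltas)."""
    updates = np.stack(client_updates)                 # (num_clients, dim)
    median = np.median(updates, axis=0)
    scores = np.linalg.norm(updates - median, axis=1)  # lower score = more trusted
    k = max(1, int(keep_ratio * len(client_updates)))
    trusted = np.argsort(scores)[:k]                   # keep the k closest clients
    return updates[trusted].mean(axis=0)               # aggregate trusted updates only

# Usage: honest clients send similar updates, one Byzantine client sends noise.
honest = [np.ones(4) + 0.01 * np.random.randn(4) for _ in range(4)]
byzantine = [10.0 * np.random.randn(4)]
print(score_and_aggregate(honest + byzantine))
```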
1 code implementation • 31 Oct 2024 • YoungJoon Lee, Jinu Gong, Joonhyuk Kang
Federated learning enables edge devices to collaboratively train a global model while maintaining data privacy by keeping data localized.
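A minimal FedAvg-style round on toy least-squares clients shows how only model parameters, never raw data, leave the devices; this is a generic sketch rather than the scheme proposed in the paper.

```python
# Minimal FedAvg-style round (illustrative sketch, not the paper's method).
# Each client trains locally on its own data; the server only receives and
# averages model parameters.
import numpy as np

def local_sgd(w, X, y, lr=0.1, epochs=5):
    for _ in range(epochs):                       # plain least-squares gradient steps
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0, 0.5])
clients = []
for _ in range(4):                                # hypothetical local datasets
    X = rng.normal(size=(20, 3))
    clients.append((X, X @ true_w + 0.1 * rng.normal(size=20)))

w_global = np.zeros(3)
for _ in range(10):
    local_models = [local_sgd(w_global.copy(), X, y) for X, y in clients]
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    w_global = np.average(local_models, axis=0, weights=sizes)  # FedAvg step

print("recovered weights:", np.round(w_global, 2))
```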
no code implementations • 1 Oct 2024 • YoungJoon Lee, Taehyun Park, Yeongjoon Kang, Jonghoe Kim, Joonhyuk Kang
Integrating hyperscale AI into national defense modeling and simulation (M&S) is crucial for enhancing strategic and operational capabilities.
no code implementations • 30 May 2024 • Hyunsang Cho, Seonghoon Yoo, Bang Chul Jung, Joonhyuk Kang
This paper considers a joint communication and sensing technique for enhancing situational awareness in practical battlefield scenarios.
no code implementations • 27 May 2024 • Hyojin Lee, Sangwoo Park, Osvaldo Simeone, Yonina C. Eldar, Joonhyuk Kang
Such existing solutions can only provide guarantees in terms of false negative rate (FNR) in the asymptotic regime of large held-out data sets.
1 code implementation • 23 May 2024 • Jiwan Seo, Joonhyuk Kang
Vector Quantized Variational AutoEncoder (VQ-VAE) is an established technique in machine learning for learning discrete representations across various modalities.
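For reference, the standard VQ-VAE quantization bottleneck (nearest-codebook lookup with a straight-through gradient) can be sketched as below; this is the textbook layer, not necessarily the exact variant analyzed in the paper.

```python
# Standard VQ-VAE quantization bottleneck (sketch of the well-known layer,
# not necessarily the variant used in this paper). Assumes PyTorch.
import torch
import torch.nn.functional as F

class VectorQuantizer(torch.nn.Module):
    def __init__(self, num_codes=512, dim=64, beta=0.25):
        super().__init__()
        self.codebook = torch.nn.Embedding(num_codes, dim)
        self.codebook.weight.data.uniform_(-1 / num_codes, 1 / num_codes)
        self.beta = beta

    def forward(self, z_e):                      # z_e: (batch, dim) encoder output
        d = torch.cdist(z_e, self.codebook.weight)           # distances to codes
        idx = d.argmin(dim=1)                                 # nearest code index
        z_q = self.codebook(idx)                              # quantized vectors
        # Codebook + commitment losses, straight-through gradient to the encoder.
        loss = F.mse_loss(z_q, z_e.detach()) + self.beta * F.mse_loss(z_e, z_q.detach())
        z_q = z_e + (z_q - z_e).detach()
        return z_q, idx, loss

vq = VectorQuantizer()
z_q, idx, loss = vq(torch.randn(8, 64))
print(z_q.shape, idx.shape, loss.item())
```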
no code implementations • 21 Mar 2024 • Sooyeob Jung, Seongah Jeong, Jinkyu Kang, Gyeongrae Im, Sangjae Lee, Mi-Kyung Oh, Joon Gyu Ryu, Joonhyuk Kang
This paper proposes a long-range frequency hopping spread spectrum (LR-FHSS) transceiver design for the Direct-to-Satellite Internet of Things (DtS-IoT) communication system.
no code implementations • 7 Dec 2023 • Jeongbin Kim, Seongah Jeong, Seonghoon Yoo, Woong Son, Joonhyuk Kang
In this correspondence, we propose hierarchical high-altitude platform (HAP)-low-altitude platform (LAP) networks with the aim of maximizing the sum-rate of ground user equipments (UEs).
no code implementations • 23 Oct 2023 • Jiyun Shin, JinHyun Ahn, Honggu Kang, Joonhyuk Kang
Foundation models (FMs) have demonstrated remarkable performance in machine learning but demand extensive training data and computational resources.
no code implementations • 17 Oct 2023 • Seohyeon Cha, Honggu Kang, Joonhyuk Kang
Building on recent work that introduced a scaling parameter for constructing valid credible regions from the posterior estimate, our study explores the advantages of incorporating a temperature parameter into Bayesian GNNs within the conformal prediction (CP) framework.
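A minimal split conformal prediction routine with a softmax temperature, shown below as a generic sketch (the paper's Bayesian-GNN pipeline is more involved), makes the role of the temperature concrete: it reshapes the nonconformity scores and hence the prediction-set sizes, while CP's coverage guarantee is retained.

```python
# Minimal split conformal prediction with a softmax temperature (generic
# sketch; not the paper's Bayesian-GNN procedure). Logits and labels below
# are synthetic placeholders.
import numpy as np

def softmax(logits, temperature=1.0):
    z = logits / temperature
    z = z - z.max(axis=1, keepdims=True)
    p = np.exp(z)
    return p / p.sum(axis=1, keepdims=True)

def conformal_sets(cal_logits, cal_labels, test_logits, alpha=0.1, temperature=1.0):
    # Nonconformity score: 1 - probability assigned to the true class.
    cal_p = softmax(cal_logits, temperature)
    scores = 1.0 - cal_p[np.arange(len(cal_labels)), cal_labels]
    n = len(scores)
    k = int(np.ceil((n + 1) * (1 - alpha)))       # conformal quantile index
    q = np.sort(scores)[min(k, n) - 1]
    test_p = softmax(test_logits, temperature)
    return test_p >= 1.0 - q                      # boolean mask: classes kept per set

rng = np.random.default_rng(0)
cal_logits, test_logits = rng.normal(size=(200, 5)), rng.normal(size=(10, 5))
cal_labels = rng.integers(0, 5, size=200)
print(conformal_sets(cal_logits, cal_labels, test_logits, temperature=2.0).sum(axis=1))
```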
no code implementations • 6 Sep 2023 • Seonghoon Yoo, Seongah Jeong, Jeongbin Kim, Joonhyuk Kang
Extended reality-enabled Internet of Things (XRI) provides a new user experience and a sense of immersion by adding virtual elements to the real world through Internet of Things (IoT) devices and emerging 6G technologies.
no code implementations • 15 Aug 2023 • Honggu Kang, Seohyeon Cha, Jinwoo Shin, Jongmyeong Lee, Joonhyuk Kang
Federated learning (FL) enables distributed training while preserving data privacy, but stragglers (slow or incapable clients) can significantly slow down the total training time and degrade performance.
no code implementations • 31 Jan 2023 • Youngsu Jang, Seongah Jeong, Joonhyuk Kang
With the advent of ever-growing vehicular applications, vehicular edge computing (VEC) has been a promising solution to augment the computing capacity of future smart vehicles.
no code implementations • 10 Jan 2023 • Sooyeob Jung, Seongah Jeong, Jinkyu Kang, Joonhyuk Kang
Marine Internet of Things (IoT) systems have grown substantially with the development of non-terrestrial networks (NTN) via aerial and space vehicles in the upcoming sixth generation (6G), supporting environmental protection, military reconnaissance, and sea transportation.
1 code implementation • 29 Oct 2022 • YoungJoon Lee, Sangwoo Park, Joonhyuk Kang
While being an effective framework of learning a shared model across multiple edge devices, federated learning (FL) is generally vulnerable to Byzantine attacks from adversarial edge devices.
1 code implementation • 29 Oct 2022 • YoungJoon Lee, Sangwoo Park, Joonhyuk Kang
Federated learning (FL) aims at optimizing a shared global model over multiple edge devices without transmitting (private) data to the central server.
no code implementations • 14 Sep 2022 • Jinu Gong, Osvaldo Simeone, Joonhyuk Kang
Conventional frequentist FL schemes are known to yield overconfident decisions.
no code implementations • 16 Aug 2022 • Seonghoon Yoo, Seongah Jeong, Joonhyuk Kang
Unmanned aerial vehicles (UAVs) have been actively studied as moving cloudlets to provide application offloading opportunities and to enhance the security level of user equipments (UEs).
no code implementations • 23 Nov 2021 • Jinu Gong, Osvaldo Simeone, Rahif Kassab, Joonhyuk Kang
Variational particle-based Bayesian learning methods have the advantage of not being limited by the bias affecting more conventional parametric techniques.
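Stein variational gradient descent (SVGD) is a widely used example of such particle-based methods; a minimal single-node SVGD step on a toy 1-D Gaussian target is sketched below purely as background on the particle update, not as the scheme developed in the paper.

```python
# Minimal single-node SVGD sketch on a toy 1-D Gaussian target (background
# illustration of the particle-based update only).
import numpy as np

def svgd_step(particles, grad_log_p, h=0.5, step=0.05):
    x = particles[:, None]                           # (n, 1)
    diff = x - x.T                                   # pairwise differences x_j - x_i
    k = np.exp(-diff**2 / (2 * h**2))                # RBF kernel matrix
    grad_k = -diff / h**2 * k                        # d k(x_j, x_i) / d x_j
    phi = (k * grad_log_p(particles)[:, None] + grad_k).mean(axis=0)
    return particles + step * phi

# Target: N(2, 1), so grad log p(x) = -(x - 2).
grad_log_p = lambda x: -(x - 2.0)
particles = np.random.default_rng(0).normal(size=50)
for _ in range(500):
    particles = svgd_step(particles, grad_log_p)
print("particle mean (should approach 2):", particles.mean().round(2))
```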
no code implementations • 8 Apr 2021 • Jinu Gong, Osvaldo Simeone, Joonhyuk Kang
Federated Bayesian learning offers a principled framework for the definition of collaborative training algorithms that are able to quantify epistemic uncertainty and to produce trustworthy decisions.
1 code implementation • 3 Mar 2020 • Sangwoo Park, Osvaldo Simeone, Joonhyuk Kang
The proposed approach is based on a meta-training phase in which the online gradient-based meta-learning of the decoder is coupled with the joint training of the encoder via the transmission of pilots and the use of a feedback link.
no code implementations • 3 Feb 2020 • Jin-Hyun Ahn, Osvaldo Simeone, Joonhyuk Kang
Cooperative training methods for distributed machine learning are typically based on the exchange of local gradients or local model parameters.
Signal Processing • Distributed, Parallel, and Cluster Computing • Information Theory
1 code implementation • 5 Jan 2020 • Osvaldo Simeone, Sangwoo Park, Joonhyuk Kang
Machine learning methods adapt the parameters of a model, constrained to lie in a given model class, by using a fixed learning procedure based on data or active observations.
1 code implementation • 22 Oct 2019 • Sangwoo Park, Osvaldo Simeone, Joonhyuk Kang
When a channel model is available, learning how to communicate on fading noisy channels can be formulated as the (unsupervised) training of an autoencoder consisting of the cascade of encoder, channel, and decoder.
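The autoencoder formulation can be sketched end to end in PyTorch; the version below uses a simple AWGN channel in place of a fading model and omits any meta-learning, so it only illustrates the generic encoder-channel-decoder cascade mentioned above.

```python
# Minimal end-to-end autoencoder over an AWGN channel (sketch of the generic
# formulation; the paper considers fading channels and learning on top of it).
import torch
import torch.nn as nn

K, N, SNR_DB = 4, 7, 7.0            # bits per message, channel uses, channel SNR
M = 2 ** K                          # number of messages

encoder = nn.Sequential(nn.Linear(M, 32), nn.ReLU(), nn.Linear(32, N))
decoder = nn.Sequential(nn.Linear(N, 32), nn.ReLU(), nn.Linear(32, M))
opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)
noise_std = (10 ** (-SNR_DB / 10)) ** 0.5

for step in range(2000):
    msgs = torch.randint(0, M, (256,))
    x = encoder(torch.eye(M)[msgs])                       # one-hot messages -> codewords
    x = x / x.norm(dim=1, keepdim=True) * N**0.5          # per-codeword power constraint
    y = x + noise_std * torch.randn_like(x)               # AWGN channel in place of fading
    loss = nn.functional.cross_entropy(decoder(y), msgs)  # decoder acts as a classifier
    opt.zero_grad(); loss.backward(); opt.step()

print("final training loss:", loss.item())
```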
1 code implementation • 23 Aug 2019 • Sangwoo Park, Hyeryung Jang, Osvaldo Simeone, Joonhyuk Kang
This paper considers an Internet-of-Things (IoT) scenario in which devices sporadically transmit short packets with few pilot symbols over a fading channel.
no code implementations • 5 Jul 2019 • Jin-Hyun Ahn, Osvaldo Simeone, Joonhyuk Kang
Cooperative training methods for distributed machine learning typically assume noiseless and ideal communication channels.
1 code implementation • 6 Mar 2019 • Sangwoo Park, Hyeryung Jang, Osvaldo Simeone, Joonhyuk Kang
Consider an Internet-of-Things (IoT) scenario in which devices transmit sporadically using short packets with few pilot symbols.
no code implementations • 16 Jan 2019 • Sukjong Ha, Jingjing Zhang, Osvaldo Simeone, Joonhyuk Kang
Distributed computing platforms typically assume the availability of reliable and dedicated connections among the processors.
Information Theory