1 code implementation • IEEE Open Journal of the Communications Society (Conference version: IJCNN) 2020 • Yuwei Sun, Hiroshi Esaki, Hideya Ochiai.
We propose Segmented-Federated Learning (Segmented-FL), which employs periodic local model evaluation and network segmentation to group participants with similar network environments together.
1 code implementation • International Joint Conference on Neural Networks (IJCNN) 2020 • Yuwei Sun, Hideya Ochiai, Hiroshi Esaki
In this research, we propose segmented federated learning. Unlike collaborative learning based on a single global model in traditional federated learning, it maintains multiple global models, allowing each segment of participants to conduct collaborative learning separately, and dynamically rearranges the segmentation of participants.
Ranked #1 on Network Intrusion Detection on SIDD-Image
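The segment-wise aggregation idea above can be sketched as follows — a minimal FedAvg-style sketch in which participants are regrouped by their local evaluation scores. The grouping criterion, function names, and toy data here are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def fedavg(weights):
    """Average a list of model weight vectors (FedAvg within one segment)."""
    return np.mean(weights, axis=0)

def segment_participants(eval_scores, n_segments):
    """Group participants with similar local evaluation scores into the
    same segment by sorting and splitting (an illustrative criterion)."""
    order = np.argsort(eval_scores)
    return np.array_split(order, n_segments)

# Toy round: 6 participants, 2 segments, scalar "models" for brevity.
local_models = np.array([[1.0], [1.1], [0.9], [5.0], [5.2], [4.8]])
scores = np.array([0.30, 0.32, 0.29, 0.80, 0.82, 0.79])  # local eval results

segments = segment_participants(scores, n_segments=2)
# One global model per segment, aggregated only within that segment.
global_models = [fedavg(local_models[idx]) for idx in segments]
```

Re-running `segment_participants` each round gives the dynamic rearrangement of segments described above.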
1 code implementation • 22 Mar 2022 • Yuwei Sun, Ng Chong, Hideya Ochiai
The empirical results show that FedKA achieves performance gains of 8.8% and 3.5% in Digit-Five and Office-Caltech10, respectively, and a gain of 0.7% in Amazon Review with extremely limited training data.
1 code implementation • 11 Oct 2021 • Yuwei Sun, Hideya Ochiai
To this end, we propose a decentralized learning model called Homogeneous Learning (HL) for tackling non-IID data with a self-attention mechanism.
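One way to read the self-attention component above is as a scoring step over candidate nodes' state vectors, with the highest-scoring node chosen to train next. This is a hedged sketch under that assumption — the state representation, dimensions, and selection rule are illustrative, not HL's exact design.

```python
import numpy as np

def scaled_dot_attention(query, keys):
    """Scaled dot-product attention scores: softmax(K q / sqrt(d))."""
    d = query.shape[-1]
    logits = keys @ query / np.sqrt(d)
    exp = np.exp(logits - logits.max())  # numerically stable softmax
    return exp / exp.sum()

# Toy setting: the current node (query) attends over 4 candidate nodes'
# state vectors (keys) and picks the highest-scoring one to train next.
rng = np.random.default_rng(0)
query = rng.normal(size=4)
keys = rng.normal(size=(4, 4))
scores = scaled_dot_attention(query, keys)
next_node = int(np.argmax(scores))
```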
1 code implementation • 22 Mar 2022 • Yuwei Sun, Hideya Ochiai, Jun Sakuma
To overcome this challenge, we propose the Attacking Distance-aware Attack (ADA) to enhance a poisoning attack by finding the optimized target class in the feature space.
Ranked #1 on Model Poisoning on Fashion-MNIST
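One illustrative reading of "finding the optimized target class in the feature space" is to pick the class whose feature centroid lies nearest to the source class, so the poisoning has the shortest attacking distance to cover. The distance metric and helper names below are assumptions for the sketch, not ADA's exact criterion.

```python
import numpy as np

def class_centroids(features, labels, n_classes):
    """Mean feature vector per class."""
    return np.stack([features[labels == c].mean(axis=0)
                     for c in range(n_classes)])

def pick_target_class(centroids, source):
    """Choose the class whose centroid is nearest to the source class in
    feature space (illustrative 'attacking distance' criterion)."""
    dists = np.linalg.norm(centroids - centroids[source], axis=1)
    dists[source] = np.inf  # exclude the source class itself
    return int(np.argmin(dists))

# Toy 2-D features for 3 classes; class 2 sits close to class 0.
feats = np.array([[0.0, 0.0], [0.2, 0.0],
                  [5.0, 5.0], [5.2, 5.0],
                  [0.9, 0.1], [1.1, 0.0]])
labels = np.array([0, 0, 1, 1, 2, 2])
cents = class_centroids(feats, labels, n_classes=3)
target = pick_target_class(cents, source=0)
```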
1 code implementation • 22 Sep 2023 • Yuwei Sun, Hideya Ochiai, Zhirong Wu, Stephen Lin, Ryota Kanai
Existing studies such as the Coordination method employ iterative cross-attention mechanisms with a bottleneck to enable the sparse association of inputs.
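A minimal sketch of iterative cross-attention through a small latent bottleneck, in the spirit the entry describes: a handful of latent slots repeatedly attend over many input tokens, forcing a sparse association. Dimensions and the update rule are illustrative, not the Coordination method's exact architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(latents, inputs):
    """Latents (queries) attend over inputs (keys/values); the small
    number of latent slots is the bottleneck."""
    d = latents.shape[-1]
    attn = softmax(latents @ inputs.T / np.sqrt(d))  # (n_latent, n_input)
    return attn @ inputs                             # updated latents

rng = np.random.default_rng(0)
inputs = rng.normal(size=(64, 16))   # 64 input tokens
latents = rng.normal(size=(4, 16))   # bottleneck of only 4 latent slots

for _ in range(3):                   # iterative refinement
    latents = cross_attention(latents, inputs)
```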
no code implementations • 2 Aug 2021 • Yuwei Sun, Ng Chong, Hideya Ochiai
Finally, we successfully reconstructed the victim's real data from the shared global model parameters on all the applied datasets.
no code implementations • 30 Jul 2021 • Yuwei Sun, Hideya Ochiai, Hiroshi Esaki
Wider coverage and lower latency in 5G necessitate its combination with multi-access edge computing (MEC) technology.
no code implementations • 12 Oct 2021 • Yuwei Sun, Ng Chong, Hideya Ochiai
We collected the most recent phishing samples to study the effectiveness of the proposed method using different client numbers and data distributions.
no code implementations • 24 May 2022 • Hideya Ochiai, Yuwei Sun, Qingzhe Jin, Nattanon Wongwiwatchai, Hiroshi Esaki
WAFL can develop generalized models from non-IID datasets stored locally in distributed nodes by having the nodes exchange and aggregate model parameters with each other over opportunistic node-to-node contacts.
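The opportunistic exchange above can be sketched as pairwise parameter mixing whenever two nodes happen to meet — a minimal sketch assuming simple convex averaging on contact; the mixing weight and contact trace are illustrative, not WAFL's exact aggregation rule.

```python
import numpy as np

def on_contact(w_a, w_b, alpha=0.5):
    """When two nodes meet, each pulls the other's parameters toward its
    own: weak pairwise synchronization rather than a global average."""
    new_a = (1 - alpha) * w_a + alpha * w_b
    new_b = (1 - alpha) * w_b + alpha * w_a
    return new_a, new_b

# Toy: 3 nodes with scalar "models"; a contact trace gradually mixes them.
params = [np.array([0.0]), np.array([3.0]), np.array([6.0])]
contacts = [(0, 1), (1, 2), (0, 2), (0, 1)]  # opportunistic encounters
for a, b in contacts:
    params[a], params[b] = on_contact(params[a], params[b])
```

With `alpha=0.5` each contact preserves the sum of the two parameters, so the population drifts toward consensus without any central server.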
no code implementations • 24 Aug 2022 • Yuwei Sun, Hideya Ochiai
To this end, we propose Bidirectional Contrastive Split Learning (BiCSL) to train a global multi-modal model on the entire data distribution of decentralized clients.
no code implementations • 7 Nov 2022 • Naoya Tezuka, Hideya Ochiai, Yuwei Sun, Hiroshi Esaki
Compared to conventional federated learning, WAFL trains models by weakly synchronizing model parameters with other nodes, which makes it highly resilient to a poisoned model injected by an attacker.
no code implementations • 2 Feb 2023 • Yuwei Sun
Meta-learning usually refers to a learning algorithm that learns from other learning algorithms.
no code implementations • 2 Apr 2023 • Yuwei Sun, Hideya Ochiai, Jun Sakuma
To this end, we propose an instance-level multimodal Trojan attack on VQA that efficiently adapts to fine-tuned models through a dual-modality adversarial learning method.
no code implementations • 20 May 2023 • Yuwei Sun
This requires a model of how other learning algorithms operate and perform in different contexts, which is similar to representing and reasoning about mental states in the theory of mind.