no code implementations • 19 May 2023 • Yingchun Wang, Jingcai Guo, Yi Liu, Song Guo, Weizhan Zhang, Xiangyong Cao, Qinghua Zheng
Based on the idea that in-distribution (ID) data with spurious features may incur a lower empirical risk, in this paper we propose a novel Spurious Feature-targeted model Pruning framework, dubbed SFP, to automatically explore invariant substructures without suffering from the drawbacks noted above.
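As a rough, hypothetical sketch of this idea (not the paper's algorithm; the low-loss flagging heuristic and filter scoring below are assumptions for illustration): if unusually low-loss ID samples are suspected of being fit through spurious features, filters that activate disproportionately on them become pruning candidates.

```python
# Hypothetical sketch: score filters by how much more they activate on
# suspected spurious-correlated (low-loss) samples than on the rest.
import torch

def spurious_filter_scores(layer_acts, losses, quantile=0.2):
    """layer_acts: (N, C) mean activation per sample per filter;
    losses: (N,) per-sample training losses."""
    thresh = torch.quantile(losses, quantile)
    low = losses <= thresh          # suspected spurious-aligned samples
    scores = layer_acts[low].mean(0) - layer_acts[~low].mean(0)
    return scores                   # prune filters with the highest scores first
```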
no code implementations • 2 May 2023 • Jingcai Guo, Song Guo, Shiheng Ma, Yuxia Sun, Yuanyuan Xu
Previous works usually assume the malware families are known to the classifier in a closed-set scenario, i.e., the testing families are a subset of, or at most identical to, the training families.
no code implementations • 2 May 2023 • Jingcai Guo, Yuanyuan Xu, Wenchao Xu, Yufeng Zhan, Yuxia Sun, Song Guo
Malware open-set recognition (MOSR) aims at jointly classifying malware samples from known families and detecting those from novel, unknown families.
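A classic open-set baseline (not the paper's MOSR method) simply thresholds the maximum softmax probability: confident samples are assigned a known family, and low-confidence samples are rejected as unknown. A minimal sketch:

```python
import torch.nn.functional as F

def open_set_predict(logits, threshold=0.5, unknown_label=-1):
    """Maximum-softmax-probability baseline for open-set recognition."""
    probs = F.softmax(logits, dim=1)
    conf, pred = probs.max(dim=1)
    pred[conf < threshold] = unknown_label  # reject as novel/unknown family
    return pred
```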
no code implementations • 2 May 2023 • Yifan Shi, Kang Wei, Li Shen, Jun Li, Xueqian Wang, Bo Yuan, Song Guo
However, it suffers from issues in terms of communication overhead, the limited resources of mobile terminals (MTs), and privacy.
no code implementations • 2 May 2023 • Xiaocheng Lu, Ziming Liu, Song Guo, Jingcai Guo, Fushuo Huo, Sikai Bai, Tao Han
Compositional Zero-shot Learning (CZSL) aims to recognize novel concepts composed of known knowledge without training samples.
no code implementations • 1 May 2023 • Jie Zhang, Xiaosong Ma, Song Guo, Wenchao Xu
Federated Semi-supervised Learning (FedSSL) has emerged as a new paradigm for allowing distributed clients to collaboratively train a machine learning model over scarce labeled data and abundant unlabeled data.
no code implementations • 20 Mar 2023 • Fushuo Huo, Wenchao Xu, Jingcai Guo, Haozhao Wang, Yunfeng Fan, Song Guo
In this paper, we propose a novel Dual-prototype Self-augment and Refinement method (DSR) for the O$^2$CL problem, which consists of two strategies: 1) Dual class prototypes: inner and hyper-dimensional prototypes are exploited to utilize the pre-trained information and obtain robust quasi-orthogonal representations, rather than exemplar buffers, for both privacy preservation and memory reduction.
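One simple way to obtain quasi-orthogonal prototypes without stored exemplars is to exploit the fact that random unit vectors in high dimensions are nearly pairwise orthogonal; the sketch below illustrates this property only and is not the paper's construction.

```python
import torch

def quasi_orthogonal_prototypes(num_classes, dim, seed=0):
    """Random high-dimensional unit vectors are nearly pairwise orthogonal."""
    g = torch.Generator().manual_seed(seed)
    p = torch.randn(num_classes, dim, generator=g)
    return p / p.norm(dim=1, keepdim=True)

protos = quasi_orthogonal_prototypes(100, 2048)
cos = protos @ protos.T - torch.eye(100)   # off-diagonal cosine similarities
print(cos.abs().max())                     # typically < 0.1 at dim = 2048
```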
no code implementations • 14 Mar 2023 • Yunfeng Fan, Wenchao Xu, Haozhao Wang, Jiaqi Zhu, Junxiao Wang, Song Guo
Unfortunately, OCI learning can suffer from catastrophic forgetting (CF), as the decision boundaries for old classes can become inaccurate when perturbed by new ones.
no code implementations • 9 Feb 2023 • Yingchun Wang, Jingcai Guo, Jie Zhang, Song Guo, Weizhan Zhang, Qinghua Zheng
Federated learning (FL) is an emerging technique that trains models over massive, geographically distributed edge data while preserving privacy.
no code implementations • 9 Feb 2023 • Yingchun Wang, Jingcai Guo, Song Guo, Weizhan Zhang
Mixed-precision quantization mostly predetermines the model's bit-width settings before actual training, owing to the non-differentiable bit-width sampling process, and thus obtains sub-optimal performance.
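A common workaround in the mixed-precision literature (a generic sketch, not this paper's method) is to make the choice differentiable by learning a softmax over candidate bit-widths and mixing their fake-quantized weights with a straight-through estimator:

```python
import torch
import torch.nn as nn

class SoftBitwidthQuant(nn.Module):
    """Differentiable surrogate for bit-width selection: a learnable softmax
    mixes fake-quantized copies of the weights at candidate bit-widths."""
    def __init__(self, candidate_bits=(2, 4, 8)):
        super().__init__()
        self.bits = candidate_bits
        self.logits = nn.Parameter(torch.zeros(len(candidate_bits)))

    def fake_quant(self, w, b):
        # Symmetric uniform quantization; the straight-through estimator (STE)
        # makes the rounding step act as identity in the backward pass.
        qmax = 2 ** (b - 1) - 1
        scale = w.abs().max() / qmax
        wq = torch.clamp(torch.round(w / scale), -qmax, qmax) * scale
        return w + (wq - w).detach()

    def forward(self, w):
        alpha = torch.softmax(self.logits, dim=0)
        return sum(a * self.fake_quant(w, b) for a, b in zip(alpha, self.bits))
```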
no code implementations • CVPR 2023 • Ziming Liu, Song Guo, Xiaocheng Lu, Jingcai Guo, Jiewei Zhang, Yue Zeng, Fushuo Huo
Recent studies usually approach multi-label zero-shot learning (MLZSL) with visual-semantic mapping on spatial-class correlation, which can be computationally costly and, worse still, fails to capture fine-grained class-specific semantics.
no code implementations • 19 Dec 2022 • Yingchun Wang, Jingcai Guo, Song Guo, Weizhan Zhang, Jie Zhang
Recent studies show that even highly biased dense networks contain an unbiased substructure that can achieve better out-of-distribution (OOD) generalization than the original model.
no code implementations • 7 Dec 2022 • Yingchun Wang, Song Guo, Jingcai Guo, Weizhan Zhang, Yida Xu, Jie Zhang, Yi Liu
Extensive experiments on the small-scale CIFAR-10 and large-scale ImageNet datasets demonstrate that our method can obtain sparser networks with strong generalization performance while providing quantified reliability for the pruned model.
no code implementations • 21 Nov 2022 • Xueyang Tang, Song Guo, Jie Zhang
Recently, data heterogeneity among the training datasets on local clients (a.k.a. non-IID data) has attracted intense interest in Federated Learning (FL), and many personalized federated learning methods have been proposed to handle it.
Out-of-Distribution Generalization
Personalized Federated Learning
no code implementations • 19 Nov 2022 • Fushuo Huo, Wenchao Xu, Song Guo, Jingcai Guo, Haozhao Wang, Ziming Liu
Open-World Compositional Zero-shot Learning (OW-CZSL) aims to recognize novel compositions of state and object primitives in images with no priors on the compositional space, which induces a tremendously large output space containing all possible state-object compositions.
1 code implementation • CVPR 2023 • Xiaocheng Lu, Ziming Liu, Song Guo, Jingcai Guo
Existing methods either learn the combined state-object representation, challenging the generalization of unseen compositions, or design two classifiers to identify state and object separately from image features, ignoring the intrinsic relationship between them.
no code implementations • 15 Nov 2022 • Jinyu Chen, Wenchao Xu, Song Guo, Junxiao Wang, Jie Zhang, Haozhao Wang
Federated Learning (FL) is an emerging paradigm that enables distributed users to collaboratively and iteratively train machine learning models without sharing their private data.
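The canonical instance of this paradigm is FedAvg (McMahan et al., 2017): clients train locally and the server aggregates a data-size-weighted average of their models. A minimal server-side sketch (assumes floating-point parameters):

```python
import copy

def fedavg_aggregate(client_states, client_sizes):
    """Weighted average of client model state_dicts (FedAvg)."""
    total = sum(client_sizes)
    agg = copy.deepcopy(client_states[0])
    for key in agg:
        agg[key] = sum(s[key] * (n / total)
                       for s, n in zip(client_states, client_sizes))
    return agg
```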
no code implementations • 15 Nov 2022 • Qihua Zhou, Ruibin Li, Song Guo, Peiran Dong, Yi Liu, Jingcai Guo, Zhenda Xu
Recent years have witnessed the dramatic growth of Internet video traffic, where the video bitstreams are often compressed and delivered in low quality to fit the streamer's uplink bandwidth.
no code implementations • CVPR 2023 • Yunfeng Fan, Wenchao Xu, Haozhao Wang, Junxiao Wang, Song Guo
Multimodal learning (MML) aims to jointly exploit the common priors of different modalities to compensate for their inherent limitations.
no code implementations • 14 Nov 2022 • Yi Liu, Song Guo, Jie Zhang, Qihua Zhou, Yingchun Wang, Xiaohan Zhao
We prove that FedFoA is a model-agnostic training framework that is readily compatible with state-of-the-art unsupervised FL methods.
no code implementations • 13 Nov 2022 • Leijie Wu, Song Guo, Yaohong Ding, Junxiao Wang, Wenchao Xu, Richard Yida Xu, Jie Zhang
In contrast, visual data exhibits a fundamentally different structure: its basic unit (the pixel) is a natural low-level representation with significant redundancy in its neighbourhood, which poses obvious challenges to the interpretability of the MSA mechanism in ViT.
no code implementations • 24 Aug 2022 • Tao Guo, Song Guo, Junxiao Wang, Wenchao Xu
Quick global aggregation of effective distributed parameters is crucial to federated learning (FL), which requires adequate bandwidth for parameters communication and sufficient user data for local training.
no code implementations • 21 Aug 2022 • Jingcai Guo, Song Guo, Jie Zhang, Ziming Liu
Concretely, we maintain an edge-agnostic hidden model in the cloud server to estimate a less accurate yet direction-aware inversion of the global model.
no code implementations • 16 Jul 2022 • Weiqing Ren, Yuben Qu, Chao Dong, Yuqian Jing, Hao Sun, Qihui Wu, Song Guo
With the vigorous development of artificial intelligence (AI), intelligent applications based on deep neural networks (DNNs) are changing people's lifestyles and improving production efficiency.
no code implementations • 15 Jun 2022 • Rui Zhang, Song Guo, Junxiao Wang, Xin Xie, DaCheng Tao
In particular, we dig out some critical ingredients from the iteration-based attacks, including data initialization, model training and gradient matching.
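Gradient-matching attacks, in the spirit of Deep Leakage from Gradients (Zhu et al., 2019), recover private training data by optimizing dummy inputs until their gradients match the observed ones. A minimal generic sketch of these three ingredients (not this paper's code; `loss_fn` is assumed to accept soft labels):

```python
import torch

def gradient_matching(model, loss_fn, true_grads, x_shape, num_classes, steps=300):
    """Optimize dummy (x, y) so the induced gradients match observed ones."""
    x = torch.randn(x_shape, requires_grad=True)                   # data initialization
    y = torch.randn(x_shape[0], num_classes, requires_grad=True)   # soft dummy labels
    opt = torch.optim.LBFGS([x, y])

    def closure():
        opt.zero_grad()
        loss = loss_fn(model(x), y.softmax(dim=-1))
        grads = torch.autograd.grad(loss, model.parameters(), create_graph=True)
        match = sum(((g - t) ** 2).sum() for g, t in zip(grads, true_grads))
        match.backward()
        return match

    for _ in range(steps):
        opt.step(closure)   # gradient matching objective
    return x.detach(), y.detach()
```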
no code implementations • 13 Jun 2022 • Feijie Wu, Song Guo, Zhihao Qu, Shiqi He, Ziming Liu
Clients in the miner group perform multiple local updates using serial mini-batches, and each local update is also indirectly regulated by the global target derived from the average of clients' bullseyes.
no code implementations • 4 May 2022 • Youhuan Yang, Lei Sun, Leyu Dai, Song Guo, Xiuqing Mao, Xiaoqin Wang, Bayi Xu
This is especially dangerous for some systems with high-security requirements, so this paper proposes a new defense method by using the model super-fitting state to improve the model's adversarial robustness (i.e., the accuracy under adversarial attacks).
no code implementations • 4 May 2022 • Youhuan Yang, Lei Sun, Leyu Dai, Song Guo, Xiuqing Mao, Xiaoqin Wang, Bayi Xu
Various defense models have been proposed to resist adversarial attack algorithms, but existing adversarial robustness evaluation methods always overestimate the adversarial robustness of these models (i.e., not approaching the lower bound of robustness).
no code implementations • 14 Apr 2022 • Feijie Wu, Shiqi He, Song Guo, Zhihao Qu, Haozhao Wang, Weihua Zhuang, Jie Zhang
Traditional one-bit compressed stochastic gradient descent can not be directly employed in multi-hop all-reduce, a widely adopted distributed training paradigm in network-intensive high-performance computing systems such as public clouds.
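For background, one-bit compression typically transmits only the sign of each coordinate plus one scale, with a local residual (error feedback) to preserve convergence; this generic single-worker sketch does not capture the paper's multi-hop all-reduce scheme:

```python
import torch

class OneBitCompressor:
    """signSGD-style 1-bit compression with error feedback (generic sketch)."""
    def __init__(self):
        self.residual = None

    def compress(self, grad):
        if self.residual is None:
            self.residual = torch.zeros_like(grad)
        corrected = grad + self.residual          # fold in past compression error
        scale = corrected.abs().mean()            # one scalar per tensor
        compressed = scale * corrected.sign()     # 1 bit per coordinate + scale
        self.residual = corrected - compressed    # error feedback
        return compressed
```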
no code implementations • 7 Mar 2022 • Ziming Liu, Song Guo, Jingcai Guo, Yuanyuan Xu, Fushuo Huo
We argue that disregarding the connection between major and minor classes, which correspond to global and local information, respectively, is the cause of the problem.
no code implementations • 27 Feb 2022 • Tao Guo, Song Guo, Jiewei Zhang, Wenchao Xu, Junxiao Wang
Existing studies of machine unlearning mainly focus on sample-wise unlearning, such that a learnt model will not expose a user's private data at the sample level.
no code implementations • 5 Feb 2022 • Leijie Wu, Song Guo, Yaohong Ding, Yufeng Zhan, Jie Zhang
Facing the challenge of statistical diversity in client local data distribution, personalized federated learning (PFL) has become a growing research hotspot.
1 code implementation • 17 Dec 2021 • Feijie Wu, Song Guo, Haozhao Wang, Zhihao Qu, Haobo Zhang, Jie Zhang, Ziming Liu
In the setting of federated optimization, where a global model is aggregated periodically, step asynchronism occurs when participants conduct model training by efficiently utilizing their computational resources.
1 code implementation • NeurIPS 2021 • Jie Zhang, Song Guo, Xiaosong Ma, Haozhao Wang, Wenchao Xu, Feijie Wu
To deal with such model constraints, we exploit the potentials of heterogeneous model settings and propose a novel training framework to employ personalized models for different clients.
no code implementations • 22 Oct 2021 • Junxiao Wang, Song Guo, Xin Xie, Heng Qi
Evaluated on the CIFAR10 dataset, our method accelerates unlearning by 8.9x for the ResNet model and 7.9x for the VGG model, with no degradation in accuracy compared to retraining from scratch.
no code implementations • 26 Sep 2021 • Jun Du, Chunxiao Jiang, Abderrahim Benslimane, Song Guo, Yong Ren
Based on this dynamic access model, a Stackelberg differential game-based cloud computing resource sharing mechanism is proposed to facilitate resource trading between the cloud computing service provider (CCP) and different edge computing service providers (ECPs).
no code implementations • 24 Jun 2021 • Xueyang Tang, Song Guo, Jingcai Guo
The prevalent personalized federated learning (PFL) usually pursues a trade-off between personalization and generalization by maintaining a shared global model to guide the training process of local models.
no code implementations • 15 Apr 2021 • Yuben Qu, Haipeng Dai, Yan Zhuang, Jiafa Chen, Chao Dong, Fan Wu, Song Guo
Unmanned aerial vehicles (UAVs), commonly known as drones, are envisioned to support extensive applications in next-generation wireless networks in both civil and military fields.
1 code implementation • CVPR 2021 • Song Guo, Jingya Wang, Xinchao Wang, DaCheng Tao
On the other hand, such reliable embeddings can boost identity-awareness through memory aggregation, hence strengthen attention modules and suppress drifts.
2 code implementations • 25 Sep 2020 • Song Guo
2) The segmentation speed of DPN is 20-160 times faster than that of other methods on the DRIVE dataset.
no code implementations • 10 Aug 2020 • Jiawen Kang, Zehui Xiong, Chunxiao Jiang, Yi Liu, Song Guo, Yang Zhang, Dusit Niyato, Cyril Leung, Chunyan Miao
This framework can achieve scalable and flexible decentralized FEL by individually managing local model updates or model-sharing records for performance isolation.
Cryptography and Security
no code implementations • 30 Apr 2020 • Jingcai Guo, Song Guo
One common practice in zero-shot learning is to train a projection between the visual and semantic feature spaces using labeled examples of seen classes.
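A minimal sketch of this common practice: ridge-regress visual features onto class attribute vectors using seen classes, then classify unseen-class samples by the nearest attribute vector (variable names are illustrative):

```python
import numpy as np

def fit_projection(X_seen, S_seen, lam=1.0):
    """Ridge regression from visual features X (N, d) to semantics S (N, k)."""
    d = X_seen.shape[1]
    return np.linalg.solve(X_seen.T @ X_seen + lam * np.eye(d), X_seen.T @ S_seen)

def predict_unseen(X_test, W, S_unseen):
    """Assign each sample the unseen class with the most similar attributes."""
    P = X_test @ W
    P = P / np.linalg.norm(P, axis=1, keepdims=True)
    S = S_unseen / np.linalg.norm(S_unseen, axis=1, keepdims=True)
    return (P @ S.T).argmax(axis=1)  # cosine nearest neighbour
```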
no code implementations • 3 Feb 2020 • Huawei Huang, Kangying Lin, Song Guo, Pan Zhou, Zibin Zheng
In a dynamic environment, the mobile devices selected by existing reactive candidate-selection algorithms are likely to fail to complete the training and reporting phases of FL, because the FL parameter server only knows the currently observed resources of the candidates.
no code implementations • 22 Jan 2020 • Haozhao Wang, Zhihao Qu, Song Guo, Xin Gao, Ruixuan Li, Baoliu Ye
A major bottleneck in the performance of the distributed Stochastic Gradient Descent (SGD) algorithm for large-scale Federated Learning is the communication overhead of pushing local gradients and pulling the global model.
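A standard remedy (a generic sketch, not necessarily this paper's scheme) is to sparsify what is pushed: each worker transmits only the k largest-magnitude gradient coordinates as (index, value) pairs, which the server scatters back into a dense tensor:

```python
import torch

def topk_sparsify(grad, ratio=0.01):
    """Keep only the k largest-magnitude coordinates of the gradient."""
    flat = grad.flatten()
    k = max(1, int(flat.numel() * ratio))
    _, indices = flat.abs().topk(k)
    return indices, flat[indices]

def desparsify(indices, values, shape):
    """Server-side reconstruction of the sparse gradient."""
    flat = torch.zeros(shape).flatten()
    flat[indices] = values
    return flat.view(shape)
```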
no code implementations • 17 Dec 2019 • Sicong Zhou, Huawei Huang, Wuhui Chen, Zibin Zheng, Song Guo
Therefore, to provide Byzantine resilience for distributed learning in the 5G era, this article proposes a secure computing framework based on the sharding technique of blockchain, namely PIRATE.
Distributed, Parallel, and Cluster Computing Cryptography and Security
no code implementations • 21 Apr 2019 • Hongji Huang, Song Guo, Guan Gui, Zhen Yang, Jianhua Zhang, Hikmet Sari, Fumiyuki Adachi
The new demands for high-reliability, ultra-high-capacity wireless communication have led to extensive research into 5G communications.
no code implementations • 12 Apr 2019 • Jingcai Guo, Song Guo
It considers the Alignment of Manifold Structures by Semantic Feature Expansion.
no code implementations • 12 Apr 2019 • Shiheng Ma, Jingcai Guo, Song Guo, Minyi Guo
Our approach employs the inception backbone network to capture rich features of traffic distribution on the whole area.
1 code implementation • 12 Apr 2019 • Jingcai Guo, Shiheng Ma, Song Guo
Specifically, we propose local-aware (LA) and global-aware (GA) attention to treat LR features unequally, highlighting the high-frequency components and discriminating each feature from the LR images in the local and global views, respectively.
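A hypothetical reading of this design (not the paper's exact modules): a spatial gate emphasizes high-frequency local detail per pixel, while a channel gate reweights features from a global view.

```python
import torch
import torch.nn as nn

class LocalGlobalAttention(nn.Module):
    """Illustrative sketch: per-pixel (local) and per-channel (global) gating."""
    def __init__(self, channels):
        super().__init__()
        self.local_gate = nn.Conv2d(channels, 1, kernel_size=3, padding=1)
        self.global_gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                      # image-level statistics
            nn.Conv2d(channels, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, feat):
        la = torch.sigmoid(self.local_gate(feat))  # (N, 1, H, W) local emphasis
        ga = self.global_gate(feat)                # (N, C, 1, 1) global reweighting
        return feat * la * ga
```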
no code implementations • 30 Mar 2019 • Jingcai Guo, Song Guo
To the best of our knowledge, our work is the first to consider the adaptive adjustment of semantic FS in ZSR.
no code implementations • 30 Mar 2019 • Jingcai Guo, Song Guo
In order to deal with this issue, we propose an Exclusivity Enhanced (EE) unsupervised feature learning approach to improve the conventional AE.
no code implementations • 21 Feb 2019 • Chengjie Li, Ruixuan Li, Haozhao Wang, Yuhua Li, Pan Zhou, Song Guo, Keqin Li
Distributed asynchronous offline training has received widespread attention in recent years because of its high performance on large-scale data and complex models.
1 code implementation • 11 Mar 2018 • Song Guo, Kai Wang, Hong Kang, Yujun Zhang, Yingqi Gao, Tao Li
Results: The proposed BTS-DSN has been verified on the DRIVE, STARE, and CHASE_DB1 datasets, and shows competitive performance over other state-of-the-art methods.