no code implementations • 27 Jul 2024 • Qun Li, Baoquan Sun, Fu Xiao, Yonggang Qi, Bir Bhanu
We propose Sym-Net, a novel framework for Few-Shot Segmentation (FSS) that addresses the critical issue of intra-class variation by jointly learning both query and support prototypes in a symmetrical manner.
no code implementations • 8 Apr 2024 • Qun Li, Yuan Meng, Chen Tang, Jiacheng Jiang, Zhi Wang
Quantization is a promising technique for reducing the bit-width of deep models to improve their runtime performance and storage efficiency, and has thus become a fundamental step in deployment.
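To make the bit-width reduction concrete, here is a minimal sketch of uniform affine quantization — a standard scheme for mapping float weights to low-bit integers; this is a generic illustration, not the specific method proposed in the paper above.

```python
# Generic uniform affine quantization: floats -> signed ints -> floats.
# Illustrative only; not the paper's method.

def quantize(x, num_bits=8):
    """Map a list of floats to signed integers of the given bit-width."""
    qmin, qmax = -(2 ** (num_bits - 1)), 2 ** (num_bits - 1) - 1
    lo, hi = min(x), max(x)
    scale = (hi - lo) / (qmax - qmin) or 1.0
    zero_point = round(qmin - lo / scale)
    q = [max(qmin, min(qmax, round(v / scale) + zero_point)) for v in x]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover approximate floats from the integer codes."""
    return [(v - zero_point) * scale for v in q]

weights = [-1.2, 0.0, 0.7, 2.5]
q, s, z = quantize(weights)
approx = dequantize(q, s, z)
err = max(abs(a - b) for a, b in zip(weights, approx))
```

Each 32-bit float is stored as one 8-bit integer, and the round-trip error is bounded by roughly half the quantization step `s` — the storage/accuracy trade-off the abstract refers to.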
1 code implementation • 5 Feb 2024 • Sheng Luo, Wei Chen, Wanxin Tian, Rui Liu, Luanxuan Hou, Xiubao Zhang, Haifeng Shen, Ruiqi Wu, Shuyi Geng, Yi Zhou, Ling Shao, Yi Yang, Bojun Gao, Qun Li, Guobin Wu
Foundation models have indeed made a profound impact on various fields, emerging as pivotal components that significantly shape the capabilities of intelligent systems.
no code implementations • 20 Sep 2023 • Zeyi Tao, Jindi Wu, Qun Li
Federated Learning (FL) is a distributed machine learning approach that enables model training in a communication-efficient and privacy-preserving manner.
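For context, a minimal sketch of FedAvg-style aggregation, the canonical FL scheme: clients train locally on private data and the server averages only the resulting weights. This is a generic illustration, not the algorithm of the paper above.

```python
# FedAvg sketch on a toy 1-D linear model y = w * x.
# Only weights cross the network; raw data stays on each client.

def local_sgd(w, data, lr=0.1, epochs=5):
    """Each client fits y = w*x by plain SGD on its private data."""
    for _ in range(epochs):
        for x, y in data:
            grad = 2 * (w * x - y) * x
            w -= lr * grad
    return w

def fed_avg(global_w, client_datasets):
    """Server averages client updates, weighted by dataset size."""
    updates = [(local_sgd(global_w, d), len(d)) for d in client_datasets]
    total = sum(n for _, n in updates)
    return sum(w * n for w, n in updates) / total

# Two clients whose data follow y = 3x.
clients = [[(1.0, 3.0), (2.0, 6.0)], [(0.5, 1.5)]]
w = 0.0
for _ in range(10):  # communication rounds
    w = fed_avg(w, clients)
```

Each round costs one weight exchange per client regardless of dataset size, which is the communication-efficiency point the abstract makes.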
1 code implementation • 21 Jul 2023 • Jindi Wu, Tianjie Hu, Qun Li
MORE adopts the same variational ansatz as binary classifiers while performing multi-classification by fully utilizing the quantum information of a single readout qubit.
no code implementations • 3 Feb 2023 • Qun Li, Chandra Thapa, Lawrence Ong, Yifeng Zheng, Hua Ma, Seyit A. Camtepe, Anmin Fu, Yansong Gao
In a number of practical scenarios, VFL is more relevant than HFL as different companies (e.g., bank and retailer) hold different features (e.g., credit history and shopping history) for the same set of customers.
1 code implementation • 4 Aug 2022 • Jindi Wu, Zeyi Tao, Qun Li
The quantum feature extractors in the SQNN system are independent of each other, so one can flexibly use quantum devices of varying sizes, with larger quantum devices extracting more local features.
1 code implementation • 5 May 2022 • Zeyi Tao, Jindi Wu, Qi Xia, Qun Li
LAWS is a combinatorial optimization strategy that takes advantage of model parameter initialization and the fast convergence of QNG.
1 code implementation • 22 Apr 2022 • Qun Li, Ziyi Zhang, Fu Xiao, Feng Zhang, Bir Bhanu
A high-resolution network exhibits remarkable capability in extracting multi-scale features for human pose estimation, but fails to capture long-range interactions between joints and has high computational complexity.
Ranked #30 on Pose Estimation on COCO test-dev
no code implementations • 16 Jun 2021 • Qi Xia, Qun Li
With the fast development of quantum computing and deep learning, quantum neural networks have attracted great attention recently.
no code implementations • 4 Sep 2020 • Chang Liu, Jiahui Sun, Haiming Jin, Meng Ai, Qun Li, Cheng Zhang, Kehua Sheng, Guobin Wu, XiaoHu Qie, Xinbing Wang
Thus, in this paper, we exploit adaptive dispatching intervals to boost the platform's profit under a guarantee of the maximum passenger waiting time.
no code implementations • 25 Sep 2019 • Zeyi Tao, Qi Xia, Qun Li
Moreover, we provide a new variant of the Adam-type algorithm, namely AdamAL, which can naturally mitigate the non-convergence issue of Adam and improve its performance.
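For reference, a minimal sketch of the standard Adam update that AdamAL is said to modify; the specific AdamAL change is not described in this snippet, so only the baseline is shown, here minimizing the toy quadratic f(w) = (w - 4)^2.

```python
import math

# Plain Adam (Kingma & Ba) on a 1-D quadratic; the baseline
# optimizer, not the AdamAL variant proposed in the paper.

def adam_minimize(grad_fn, w, lr=0.01, beta1=0.9, beta2=0.999,
                  eps=1e-8, steps=2000):
    m = v = 0.0
    for t in range(1, steps + 1):
        g = grad_fn(w)
        m = beta1 * m + (1 - beta1) * g          # first-moment estimate
        v = beta2 * v + (1 - beta2) * g * g      # second-moment estimate
        m_hat = m / (1 - beta1 ** t)             # bias correction
        v_hat = v / (1 - beta2 ** t)
        w -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return w

# Gradient of (w - 4)^2 is 2 * (w - 4); minimum at w = 4.
w_star = adam_minimize(lambda w: 2 * (w - 4), 0.0)
```

The per-coordinate step `m_hat / sqrt(v_hat)` stays near ±1 in magnitude, which is the adaptive behavior whose convergence pathologies Adam-type analyses target.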
no code implementations • 24 May 2013 • Shmuel Friedland, Qun Li, Dan Schonfeld
We then compare the performance of the proposed method with Kronecker compressive sensing (KCS) and multi-way compressive sensing (MWCS).