no code implementations • 12 Feb 2024 • Wei Xu, An Liu, Yiting Zhang, Vincent Lau
In this work, we propose a message passing based Bayesian federated learning (BFL) framework to avoid these drawbacks. Specifically, we formulate the problem of deep neural network (DNN) learning and compression as a sparse Bayesian inference problem, in which a group sparse prior is employed to achieve structured model compression.
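A group sparse prior drives entire groups of weights (e.g., all weights feeding a neuron) to zero at once, which is what yields structured compression. The effect can be illustrated with the group soft-thresholding operator below — a minimal sketch of group-level shrinkage, not the paper's message-passing updates; all names and the toy values are ours:

```python
import numpy as np

def group_soft_threshold(w, groups, lam):
    """Proximal operator of the group-sparse penalty lam * sum_g ||w_g||_2.

    Shrinks each group's norm by lam and zeroes out any group whose norm
    falls below lam, producing group-level (structured) sparsity.
    `groups` is a list of index arrays into `w`.
    """
    w = np.asarray(w, dtype=float).copy()
    for g in groups:
        norm = np.linalg.norm(w[g])
        if norm <= lam:
            w[g] = 0.0                    # whole group pruned
        else:
            w[g] *= (1.0 - lam / norm)    # norm reduced by lam
    return w

# Toy example: two groups of three weights; the small group is pruned entirely
w = np.array([0.1, -0.05, 0.08, 2.0, -1.5, 1.0])
groups = [np.arange(0, 3), np.arange(3, 6)]
w_new = group_soft_threshold(w, groups, lam=0.5)
```

Because the operator acts on whole groups rather than individual weights, pruning a group removes an entire structural unit of the network.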
1 code implementation • 13 Jun 2023 • Yangchen Li, Ying Cui, Vincent Lau
In this paper, we propose an optimization-based quantized FL algorithm, which can appropriately fit a general edge computing system with uniform or nonuniform computing and communication resources at the workers.
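A common building block of quantized FL schemes is an unbiased stochastic uniform quantizer, which rounds each entry up or down randomly so the quantized message equals the original in expectation. The sketch below illustrates that idea under our own naming; it is not claimed to be the paper's exact quantizer:

```python
import numpy as np

def stochastic_uniform_quantize(x, num_levels, rng):
    """Unbiased stochastic uniform quantizer over [min(x), max(x)].

    Each entry is rounded to one of its two neighboring levels with
    probabilities chosen so that E[Q(x)] = x elementwise.
    """
    lo, hi = float(x.min()), float(x.max())
    if hi == lo:
        return x.copy()
    step = (hi - lo) / (num_levels - 1)
    scaled = (x - lo) / step            # position in units of quantization steps
    floor = np.floor(scaled)
    prob_up = scaled - floor            # round up with prob. = fractional part
    up = rng.random(x.shape) < prob_up
    return lo + (floor + up) * step

rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)
q = stochastic_uniform_quantize(x, num_levels=16, rng=rng)
```

With 16 levels each entry needs only 4 bits plus the shared range, while unbiasedness keeps the aggregated gradient estimate consistent.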
no code implementations • 4 Jun 2023 • Ye Xue, Vincent Lau
Based on our optimization formulation, we propose an alternating Riemannian optimization algorithm with a precoder that enables efficient OTA aggregation of low-rank local models without sacrificing training performance.
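Riemannian optimization over low-rank models keeps every iterate on the fixed-rank manifold, typically via a retraction after each step. A toy sketch using the standard truncated-SVD retraction on a synthetic least-squares objective (illustrative only; the paper's alternating algorithm and OTA precoder design are not reproduced, and all names are ours):

```python
import numpy as np

def truncated_svd_retraction(W, r):
    """Retract a matrix onto the manifold of rank-r matrices via truncated SVD,
    a standard retraction choice in fixed-rank Riemannian optimization."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return U[:, :r] * s[:r] @ Vt[:r]

rng = np.random.default_rng(4)
# Synthetic rank-2 target; the objective is 0.5 * ||W - target||_F^2
target = rng.standard_normal((6, 2)) @ rng.standard_normal((2, 5))

W = np.zeros((6, 5))
for _ in range(200):
    grad = W - target                              # Euclidean gradient
    W = truncated_svd_retraction(W - 0.3 * grad, r=2)  # step, then retract
```

Every iterate is exactly rank 2, so only the low-rank factors ever need to be stored or transmitted — the property that makes OTA aggregation of low-rank models attractive.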
no code implementations • 26 Nov 2021 • Yangchen Li, Ying Cui, Vincent Lau
To explore the full potential of FL in such an edge computing system, we first present a general FL algorithm, namely GenQSGD, parameterized by the numbers of global and local iterations, mini-batch size, and step size sequence.
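The algorithm family is parameterized exactly by the quantities listed above. A skeleton in that spirit — local mini-batch SGD at each worker followed by server averaging, on an illustrative quadratic loss; the loss, names, and parameter values are ours, not the paper's:

```python
import numpy as np

def gen_local_sgd(worker_data, w0, num_global, num_local, batch_size,
                  step_sizes, rng):
    """Parameterized local-SGD skeleton in the spirit of GenQSGD.

    Each global round, every worker runs `num_local` mini-batch SGD steps on
    the quadratic loss 0.5*||w - x||^2 averaged over its samples; the server
    then averages the local models. The tunable parameters mirror those the
    paper optimizes: global/local iteration counts, batch size, step sizes.
    """
    w = np.asarray(w0, dtype=float)
    for t in range(num_global):
        local_models = []
        for data in worker_data:
            wk = w.copy()
            for _ in range(num_local):
                batch = data[rng.choice(len(data), size=batch_size)]
                wk -= step_sizes[t] * (wk - batch.mean(axis=0))  # SGD step
            local_models.append(wk)
        w = np.mean(local_models, axis=0)          # server aggregation
    return w

rng = np.random.default_rng(1)
# Two workers with data centered at +1 and -1; the global optimum is near 0
worker_data = [rng.normal(1.0, 0.1, size=(100, 2)),
               rng.normal(-1.0, 0.1, size=(100, 2))]
w_final = gen_local_sgd(worker_data, np.full(2, 5.0), num_global=50,
                        num_local=5, batch_size=10, step_sizes=[0.1] * 50,
                        rng=rng)
```

Varying the iteration counts, batch size, and step-size sequence trades off convergence error against computation and communication cost, which is what makes these parameters natural optimization variables.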
no code implementations • 25 Oct 2021 • Yangchen Li, Ying Cui, Vincent Lau
Then, we optimize the algorithm parameters to minimize the energy cost under the time constraint and convergence error constraint.
1 code implementation • 21 Apr 2021 • Ye Xue, Vincent Lau, Songfu Cai
The proposed scheme leverages the global and local Riemannian geometry of the two-stage optimization problem and admits a fast implementation, achieving strong dictionary recovery performance from a finite number of samples without atom-by-atom calculation.
no code implementations • 2 Mar 2021 • Ye Xue, Vincent Lau
The proposed scheme includes a novel problem formulation and an efficient online algorithm design with convergence analysis.
no code implementations • 13 Jun 2020 • Ye Xue, Xuanyu Zheng, Vincent Lau
In this paper, we propose a holistic solution containing TO compensation, PHN estimation, precoder/decorrelator optimization of the LoS MIMO for wireless backhaul, and the interleaving of each part.
no code implementations • 26 Apr 2020 • Ye Xue, Yifei Shen, Vincent Lau, Jun Zhang, Khaled B. Letaief
Specifically, we propose a novel $\ell_3$-norm-based formulation to recover the data without channel estimation.
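The intuition behind maximizing an $\ell_3$-type objective is that, among unit-$\ell_2$-norm vectors, the $\ell_3$ norm is largest for spiky (1-sparse) vectors, so maximizing it promotes spiky solutions. A toy sketch of this principle — not the paper's formulation or algorithm; the orthogonal "mixing" matrix stands in for an unknown channel and all names are ours:

```python
import numpy as np

def l3_norm(x):
    return np.sum(np.abs(x) ** 3) ** (1.0 / 3.0)

# Among unit-l2-norm vectors, the l3 norm peaks at 1-sparse vectors
spiky = np.zeros(8); spiky[0] = 1.0
flat = np.full(8, 1.0 / np.sqrt(8))

# Generalized power iteration for max ||Q v||_3^3 s.t. ||v||_2 = 1:
# it recovers a direction v at which Q v is (nearly) 1-sparse.
rng = np.random.default_rng(2)
Q, _ = np.linalg.qr(rng.standard_normal((8, 8)))  # random orthogonal mixing
v = rng.standard_normal(8)
v /= np.linalg.norm(v)
for _ in range(50):
    u = Q @ v
    g = Q.T @ (np.abs(u) ** 2 * np.sign(u))   # gradient of ||Qv||_3^3 / 3
    v = g / np.linalg.norm(g)                 # retract to the unit sphere
recovered = Q @ v                             # concentrates on one coordinate
```

Each iteration squares the entry magnitudes of `Q v` before renormalizing, so the dominant coordinate rapidly swallows the rest — the mechanism by which spiky-norm maximization "demixes" without estimating the mixing explicitly.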
1 code implementation • 24 Feb 2020 • Yifei Shen, Ye Xue, Jun Zhang, Khaled B. Letaief, Vincent Lau
Dictionary learning is a classic representation learning method that has been widely applied in signal processing and data analytics.
1 code implementation • 16 Jan 2020 • Hengjie Yang, Linfang Wang, Vincent Lau, Richard D. Wesel
Lou et al. proposed a DSO CRC design methodology for a given zero-terminated convolutional code (ZTCC), in which the fundamental design principle is to maximize the minimum distance at which an undetectable error event of the ZTCC first occurs.
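The notion of an "undetectable error" can be made concrete for a bare CRC: an error pattern passes the check exactly when its polynomial is a multiple of the CRC generator over GF(2), and the design goal is to push the minimum weight of such patterns as high as possible. A brute-force sketch for tiny parameters (an illustration of the concept only, not the paper's DSO search over ZTCC error events; the generator and lengths are our toy choices):

```python
from itertools import combinations

def gf2_mod(a, g):
    """Remainder of polynomial a modulo g over GF(2), both as bit masks."""
    glen = g.bit_length()
    while a.bit_length() >= glen:
        a ^= g << (a.bit_length() - glen)
    return a

def min_undetectable_weight(g, n, max_w=4):
    """Smallest Hamming weight of a nonzero length-n error pattern the CRC
    misses, i.e. of a multiple of g(x) with degree < n (brute force)."""
    for w in range(1, max_w + 1):
        for positions in combinations(range(n), w):
            e = 0
            for p in positions:
                e |= 1 << p
            if gf2_mod(e, g) == 0:
                return w
    return None

# CRC-3 with primitive generator x^3 + x + 1 (0b1011, period 7):
# at block length 7 the lightest missed error has weight 3, but at
# length 12 the pattern x^7 + 1 (weight 2) already slips through.
d7 = min_undetectable_weight(0b1011, 7)
d12 = min_undetectable_weight(0b1011, 12)
```

The drop from weight 3 to weight 2 as the block length exceeds the generator's period shows why the minimum undetectable weight depends on both the polynomial and the code it protects.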
Information Theory
no code implementations • 25 Jan 2018 • An Liu, Vincent Lau, Borna Kananian
The proposed CSSCA algorithm can also handle stochastic non-convex constraints in optimization problems, and it opens the way to solving more challenging optimization problems that occur in many applications.
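The core mechanics of stochastic SCA can be sketched in a few lines: a recursively averaged stochastic gradient defines a strongly convex quadratic surrogate, and the iterate moves toward the surrogate minimizer with a diminishing step. The unconstrained sketch below omits the surrogate constraint handling that distinguishes CSSCA; step-size rules and names are our illustrative choices:

```python
import numpy as np

def ssca(grad_sample, w0, num_iters, tau=1.0):
    """Sketch of stochastic successive convex approximation (SSCA).

    Surrogate at iteration t: f_t^T (w - w_t) + (tau/2) * ||w - w_t||^2,
    where f_t is a recursive average of stochastic gradients. CSSCA would
    additionally build convex surrogates of the stochastic constraints.
    """
    w = np.asarray(w0, dtype=float)
    f = np.zeros_like(w)
    for t in range(1, num_iters + 1):
        rho = t ** -0.6                       # gradient-averaging weight
        gamma = t ** -0.8                     # iterate step size (gamma/rho -> 0)
        f = (1 - rho) * f + rho * grad_sample(w)
        w_bar = w - f / tau                   # closed-form surrogate minimizer
        w = (1 - gamma) * w + gamma * w_bar
    return w

rng = np.random.default_rng(3)
c = np.array([2.0, -1.0])
# Stochastic gradient of E[0.5 * ||w - xi||^2] with xi ~ N(c, I): w - xi
w_star = ssca(lambda w: w - (c + rng.standard_normal(2)),
              np.zeros(2), num_iters=2000)
```

The recursive averaging is what tames the gradient noise: it lets the surrogate track the true expected gradient even though each sample is noisy.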
Information Theory