Search Results for author: Vincent Lau

Found 12 papers, 4 papers with code

Bayesian Federated Learning Via Expectation Maximization and Turbo Deep Approximate Message Passing

no code implementations • 12 Feb 2024 • Wei Xu, An Liu, Yiting Zhang, Vincent Lau

In this work, we propose a message passing based Bayesian federated learning (BFL) framework to avoid these drawbacks. Specifically, we formulate the problem of deep neural network (DNN) learning and compression as a sparse Bayesian inference problem, in which a group-sparse prior is employed to achieve structured model compression.

Bayesian Inference • Federated Learning +3
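The group-sparse prior mentioned above zeroes out whole groups of weights (e.g., all weights feeding a neuron) rather than individual entries. As a rough illustration of that effect — not the paper's EM/turbo message passing — here is the group soft-thresholding operator such a prior induces in MAP form (the threshold `tau` is a hypothetical regularization weight):

```python
import numpy as np

def group_soft_threshold(w, tau):
    """Proximal operator of the group-lasso penalty tau * ||w||_2.

    Shrinks the whole group toward zero; if its l2 norm is below tau,
    the entire group is set exactly to zero (structured sparsity).
    """
    norm = np.linalg.norm(w)
    if norm <= tau:
        return np.zeros_like(w)
    return (1.0 - tau / norm) * w

# A weak group is pruned entirely; a strong group is only shrunk.
weak = group_soft_threshold(np.array([0.1, -0.05, 0.02]), tau=0.5)
strong = group_soft_threshold(np.array([3.0, -4.0]), tau=0.5)
```

Pruning whole groups (rather than scattered entries) is what makes the compressed model hardware-friendly: entire rows or filters can be dropped.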

GQFedWAvg: Optimization-Based Quantized Federated Learning in General Edge Computing Systems

1 code implementation • 13 Jun 2023 • Yangchen Li, Ying Cui, Vincent Lau

In this paper, we propose an optimization-based quantized FL algorithm, which can appropriately fit a general edge computing system with uniform or nonuniform computing and communication resources at the workers.

Edge-computing • Federated Learning +1
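Quantized FL algorithms of this kind compress worker updates before transmission. As a hedged sketch of the general idea — not GQFedWAvg's actual quantizer — here is unbiased stochastic uniform quantization, a standard building block (the `levels` parameter is illustrative):

```python
import numpy as np

def stochastic_quantize(v, levels, rng):
    """Unbiased stochastic uniform quantization onto `levels` evenly
    spaced magnitudes in [0, ||v||_inf], keeping signs.
    E[quantized] == v, which preserves SGD convergence guarantees."""
    scale = np.max(np.abs(v))
    if scale == 0:
        return v.copy()
    x = np.abs(v) / scale * (levels - 1)   # map magnitudes to [0, levels-1]
    low = np.floor(x)
    p = x - low                            # probability of rounding up
    q = low + (rng.random(v.shape) < p)    # randomized rounding
    return np.sign(v) * q * scale / (levels - 1)

rng = np.random.default_rng(0)
v = np.array([0.3, -1.2, 0.7])
# Averaging many independent quantizations recovers v (unbiasedness).
est = np.mean([stochastic_quantize(v, levels=5, rng=rng)
               for _ in range(20000)], axis=0)
```

Fewer levels mean fewer bits per coordinate on the uplink, at the cost of higher quantization variance.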

Riemannian Low-Rank Model Compression for Federated Learning with Over-the-Air Aggregation

no code implementations • 4 Jun 2023 • Ye Xue, Vincent Lau

Based on our optimization formulation, we propose an alternating Riemannian optimization algorithm with a precoder that enables efficient OTA aggregation of low-rank local models without sacrificing training performance.

Federated Learning • Model Compression +1
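Low-rank model compression replaces a weight matrix with two thin factors. A minimal sketch of the idea via truncated SVD — the paper's alternating Riemannian algorithm and over-the-air precoder are not shown:

```python
import numpy as np

def low_rank_compress(W, r):
    """Best rank-r approximation of a weight matrix (Eckart-Young),
    stored as two thin factors of shapes (m, r) and (r, n)."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return U[:, :r] * s[:r], Vt[:r, :]

rng = np.random.default_rng(1)
# A matrix that is exactly rank 2 is recovered perfectly by r = 2.
W = rng.standard_normal((8, 2)) @ rng.standard_normal((2, 6))
A, B = low_rank_compress(W, r=2)
```

Here the factors hold 8*2 + 2*6 = 28 numbers versus 48 in W, and only the factors need to be aggregated across workers.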

An Optimization Framework for Federated Edge Learning

no code implementations • 26 Nov 2021 • Yangchen Li, Ying Cui, Vincent Lau

To explore the full potential of FL in such an edge computing system, we first present a general FL algorithm, namely GenQSGD, parameterized by the numbers of global and local iterations, mini-batch size, and step size sequence.

Edge-computing • Federated Learning +1
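The parameterization above (numbers of global and local iterations, mini-batch size, step-size sequence) can be illustrated with a generic local-SGD skeleton. This is a hedged toy sketch on a quadratic objective, not GenQSGD itself; the injected Gaussian noise stands in for mini-batch sampling:

```python
import numpy as np

def local_sgd(grad, w0, workers, global_rounds, local_steps, batch, steps):
    """Generic local-SGD skeleton: each round, every worker takes
    `local_steps` noisy gradient steps from the shared model, then the
    server averages the local models (FedAvg-style aggregation)."""
    rng = np.random.default_rng(0)
    w = w0.copy()
    for k in range(global_rounds):
        local_models = []
        for _ in range(workers):
            wl = w.copy()
            for _ in range(local_steps):
                # Noise shrinks as 1/sqrt(batch), mimicking mini-batching.
                noise = rng.standard_normal(w.shape) / np.sqrt(batch)
                wl -= steps[k] * (grad(wl) + noise)
            local_models.append(wl)
        w = np.mean(local_models, axis=0)   # server aggregation
    return w

# Toy problem: minimize 0.5 * ||w - 1||^2, whose gradient is w - 1.
w = local_sgd(lambda w: w - 1.0, np.zeros(3), workers=4,
              global_rounds=30, local_steps=5, batch=32,
              steps=[0.1] * 30)
```

The optimization problem in the paper is precisely over these knobs: more local steps save communication rounds but cost local energy, which is the trade-off being tuned.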

Optimization-Based GenQSGD for Federated Edge Learning

no code implementations • 25 Oct 2021 • Yangchen Li, Ying Cui, Vincent Lau

Then, we optimize the algorithm parameters to minimize the energy cost under the time constraint and convergence error constraint.

Edge-computing • Federated Learning

Efficient Sparse Coding using Hierarchical Riemannian Pursuit

1 code implementation • 21 Apr 2021 • Ye Xue, Vincent Lau, Songfu Cai

The proposed scheme leverages the global and local Riemannian geometry of the two-stage optimization problem and facilitates a fast implementation that achieves superb dictionary recovery performance from a finite number of samples, without atom-by-atom calculation.

Data Compression
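For context, sparse coding seeks a sparse code z with D z ≈ x for a dictionary D. A standard baseline is ISTA (iterative soft-thresholding), sketched below on an orthogonal dictionary; the paper's hierarchical Riemannian pursuit is a different, two-stage method, so this is only an illustration of the problem being solved:

```python
import numpy as np

def ista(D, x, lam, iters=200):
    """Iterative soft-thresholding for
    min_z 0.5 * ||x - D z||^2 + lam * ||z||_1."""
    L = np.linalg.norm(D, 2) ** 2              # Lipschitz constant of the gradient
    z = np.zeros(D.shape[1])
    for _ in range(iters):
        g = z - D.T @ (D @ z - x) / L          # gradient step on the smooth part
        z = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft-threshold
    return z

rng = np.random.default_rng(2)
D = np.linalg.qr(rng.standard_normal((16, 16)))[0]   # orthogonal dictionary
z_true = np.zeros(16)
z_true[[3, 11]] = [2.0, -1.5]
x = D @ z_true
z = ista(D, x, lam=0.05)    # recovers the 2-sparse support of z_true
```

For an orthogonal D the problem decouples and ISTA lands on the exact soft-thresholded solution; the hard part addressed by dictionary-learning papers is recovering D itself.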

Online Orthogonal Dictionary Learning Based on Frank-Wolfe Method

no code implementations • 2 Mar 2021 • Ye Xue, Vincent Lau

The proposed scheme includes a novel problem formulation and an efficient online algorithm design with convergence analysis.

Dictionary Learning
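The Frank-Wolfe method avoids projections by calling a linear minimization oracle over the constraint set each iteration. A hedged generic sketch on the l1 ball — the paper's online formulation over orthogonal dictionaries is not reproduced here:

```python
import numpy as np

def frank_wolfe_l1(grad, radius, dim, iters=500):
    """Generic Frank-Wolfe: each step, the linear oracle over the l1
    ball returns a signed vertex, and we move toward it with the
    classic step size 2 / (t + 2). Iterates stay feasible for free."""
    x = np.zeros(dim)
    for t in range(iters):
        g = grad(x)
        i = np.argmax(np.abs(g))
        s = np.zeros(dim)
        s[i] = -radius * np.sign(g[i])     # vertex minimizing <g, s>
        x = x + 2.0 / (t + 2) * (s - x)    # convex combination: no projection
    return x

# Minimize 0.5 * ||x - c||^2 over ||x||_1 <= 1, with c outside the ball.
c = np.array([0.9, 0.5, -0.1])
x = frank_wolfe_l1(lambda x: x - c, radius=1.0, dim=3)
```

Because each update is a convex combination of feasible points, no projection step is ever needed — the property that makes Frank-Wolfe attractive for online settings.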

Line-of-Sight MIMO for High Capacity Millimeter Wave Backhaul in FDD Systems

no code implementations • 13 Jun 2020 • Ye Xue, Xuanyu Zheng, Vincent Lau

In this paper, we propose a holistic solution comprising TO compensation, PHN estimation, precoder/decorrelator optimization for the LoS MIMO wireless backhaul, and the interleaving of these components.

Blind Data Detection in Massive MIMO via $\ell_3$-norm Maximization over the Stiefel Manifold

no code implementations • 26 Apr 2020 • Ye Xue, Yifei Shen, Vincent Lau, Jun Zhang, Khaled B. Letaief

Specifically, we propose a novel $\ell_3$-norm-based formulation to recover the data without channel estimation.
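The l3-norm maximization idea can be illustrated on the orthogonal group, where maximizers of sum |X_ij|^3 are signed permutation matrices (each column's mass concentrates on one entry). A hedged sketch using plain gradient ascent with a polar (SVD) retraction — the step size and iteration count are illustrative, and this is not the paper's detection algorithm:

```python
import numpy as np

def l3_maximize_stiefel(n, iters=500, step=0.1, seed=3):
    """Gradient ascent of f(X) = sum |X_ij|^3 over orthogonal matrices,
    using a polar (SVD) retraction back onto the manifold after each
    Euclidean gradient step."""
    rng = np.random.default_rng(seed)
    X = np.linalg.qr(rng.standard_normal((n, n)))[0]   # random orthogonal init
    for _ in range(iters):
        G = 3.0 * np.sign(X) * X ** 2                  # gradient of sum |x|^3
        U, _, Vt = np.linalg.svd(X + step * G, full_matrices=False)
        X = U @ Vt                                     # nearest orthogonal matrix
    return X

X = l3_maximize_stiefel(4)
```

The objective is bounded by n (here 4), with equality exactly at signed permutations; iterating the sharpening step drives a random orthogonal start toward such a vertex.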

Complete Dictionary Learning via $\ell_p$-norm Maximization

1 code implementation • 24 Feb 2020 • Yifei Shen, Ye Xue, Jun Zhang, Khaled B. Letaief, Vincent Lau

Dictionary learning is a classic representation learning method that has been widely applied in signal processing and data analytics.

Computational Efficiency • Dictionary Learning +1

An Efficient Algorithm for Designing Optimal CRCs for Tail-Biting Convolutional Codes

1 code implementation • 16 Jan 2020 • Hengjie Yang, Linfang Wang, Vincent Lau, Richard D. Wesel

Lou et al. proposed a DSO CRC design methodology for a given zero-terminated convolutional code (ZTCC), in which the fundamental design principle is to maximize the minimum distance at which an undetectable error event of the ZTCC first occurs.

Information Theory
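For background, a CRC appends the remainder of the message polynomial (shifted by x^r) divided by a generator polynomial over GF(2); the design problem above searches for the generator that pushes the first undetectable error event to the largest possible distance. A minimal encoder/checker sketch with a hypothetical 3-bit CRC polynomial (not one of the paper's designed CRCs):

```python
def crc_remainder(bits, poly):
    """Remainder of the message (bit list, MSB first) times x^r divided
    by the generator polynomial over GF(2), where r = len(poly) - 1."""
    r = len(poly) - 1
    reg = list(bits) + [0] * r                 # message shifted by x^r
    for i in range(len(bits)):                 # long division over GF(2)
        if reg[i]:
            for j, p in enumerate(poly):
                reg[i + j] ^= p
    return reg[-r:]

def crc_check(codeword, poly):
    """A valid codeword (message followed by its CRC) must reproduce
    the appended remainder exactly."""
    r = len(poly) - 1
    msg, rem = codeword[:-r], codeword[-r:]
    return crc_remainder(msg, poly) == rem

poly = [1, 0, 1, 1]            # x^3 + x + 1, an illustrative generator
msg = [1, 1, 0, 1, 0, 1]
cw = msg + crc_remainder(msg, poly)
```

An error pattern goes undetected exactly when its polynomial is divisible by the generator; since this generator has more than one term, every single-bit error is caught.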

Stochastic Successive Convex Approximation for Non-Convex Constrained Stochastic Optimization

no code implementations • 25 Jan 2018 • An Liu, Vincent Lau, Borna Kananian

The proposed CSSCA algorithm can also handle stochastic non-convex constraints in optimization problems, and it opens the way to solving more challenging optimization problems that occur in many applications.

Information Theory
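The surrogate idea behind successive convex approximation can be shown on a deterministic, unconstrained toy problem (the paper's stochastic, constrained setting is not reproduced here): keep the convex part of the objective, linearize the rest around the current iterate, and minimize the resulting convex surrogate in closed form each round.

```python
import numpy as np

def sca_minimize(x0, iters=40):
    """Successive convex approximation for f(x) = x^4 - 2x^2:
    keep the convex term x^4, linearize the concave term -2x^2 at the
    current iterate x_k, and minimize the convex surrogate
    x^4 - 2*x_k^2 - 4*x_k*(x - x_k) exactly (its minimizer solves
    x^3 = x_k). This is the convex-concave procedure, one classic
    instance of SCA; note x0 = 0 is a stationary (saddle) point."""
    x = x0
    for _ in range(iters):
        x = np.sign(x) * abs(x) ** (1.0 / 3.0)   # closed-form surrogate minimizer
    return x

x = sca_minimize(0.5)   # converges to the minimizer x = 1
```

Each surrogate upper-bounds f and touches it at x_k, so every round decreases f; the iterates slide into the nearest minimizer (x = ±1, where f = -1).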
