Federated Matrix Factorization: Algorithm Design and Application to Data Clustering

12 Feb 2020 · Shuai Wang, Tsung-Hui Chang

Recent demands on data privacy have called for federated learning (FL) as a new distributed learning paradigm in massive and heterogeneous networks. Although many FL algorithms have been proposed, few of them consider the matrix factorization (MF) model, which is known to have a vast number of signal processing and machine learning applications. Unlike existing FL algorithms, which are designed for smooth problems with a single block of variables, federated MF (FedMF) requires solving challenging non-convex and non-smooth problems (due to constraints or regularization) with two blocks of variables. In this paper, we address this challenge by proposing two new FedMF algorithms, FedMAvg and FedMGS, based on the model averaging and gradient sharing principles, respectively. Both FedMAvg and FedMGS adopt multiple local updates per communication round to speed up convergence, and allow only a randomly sampled subset of clients to communicate with the server in order to reduce the communication cost. Convergence analyses for the two algorithms are presented, delineating the impacts of data distribution, the number of local updates, and partial client communication on algorithm performance. Focusing on a data clustering task, we present extensive experimental results that examine the practical performance of both algorithms and demonstrate their efficacy over existing distributed clustering algorithms.
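The abstract describes the two algorithmic ingredients (multiple local updates and partial client participation) only at a high level. As a concrete illustration, one plausible formulation of the underlying problem, for N clients where client i holds data X_i, with a shared factor W and local factors H_i, is the following (an assumption on my part, since the page does not state the objective):

```latex
\min_{W,\,\{H_i \in \mathcal{H}\}} \; \sum_{i=1}^{N} \big\| X_i - W H_i \big\|_F^2
```

where the constraint set \mathcal{H} (e.g., nonnegativity) contributes the non-smooth part. The Python sketch below then illustrates the model-averaging principle behind FedMAvg under assumptions of my own: a plain Frobenius-norm loss, nonnegativity constraints on the local factors, and fixed-step projected gradient descent for the local updates. All names (`fedmavg_mf`, `local_steps`, `sample_frac`) are hypothetical and not taken from the paper.

```python
import numpy as np

def fedmavg_mf(client_data, k, rounds=50, local_steps=5,
               sample_frac=0.5, lr=1e-3, seed=0):
    """Model-averaging sketch for federated matrix factorization.

    client_data: list of (d x n_i) arrays X_i, one per client.
    Factorizes X_i ~ W @ H_i with a shared factor W (d x k) and
    local factors H_i (k x n_i) that never leave the clients.
    """
    rng = np.random.default_rng(seed)
    d = client_data[0].shape[0]
    W = rng.standard_normal((d, k))                       # shared factor
    H = [abs(rng.standard_normal((k, X.shape[1]))) for X in client_data]

    for _ in range(rounds):
        # Partial participation: only a random client subset communicates.
        m = max(1, int(sample_frac * len(client_data)))
        active = rng.choice(len(client_data), size=m, replace=False)

        W_locals = []
        for i in active:
            X, Hi, Wi = client_data[i], H[i], W.copy()
            for _ in range(local_steps):                  # multiple local updates
                Hi = Hi - lr * (Wi.T @ (Wi @ Hi - X))     # gradient step on H_i
                Hi = np.maximum(Hi, 0.0)                  # project onto the constraint set
                Wi = Wi - lr * ((Wi @ Hi - X) @ Hi.T)     # gradient step on local copy of W
            H[i] = Hi
            W_locals.append(Wi)

        W = np.mean(W_locals, axis=0)                     # server averages the local models
    return W, H
```

For the clustering application, one natural instantiation (again an assumption, not the paper's exact setup) treats the columns of W as cluster centroids and reads off assignments locally, e.g. `labels = np.argmax(H[i], axis=0)`, which is meaningful when the constraints drive the columns of H_i toward one-hot assignment vectors.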
