Federated Classification using Parsimonious Functions in Reproducing Kernel Hilbert Spaces

8 Sep 2020  ·  Maria Peifer, Alejandro Ribeiro ·

Federated learning forms a global model using data collected from a federation of agents. This type of learning poses two main challenges: the agents generally do not collect data from the same distribution, and they have limited capacity to store and transmit data. It is therefore impractical for each agent to send its entire data set over the network. Instead, each agent must form a local model and decide what information is fundamental to the learning problem; only this information is sent to a central unit. The central unit then forms the global model using only the information received from the agents. We propose a method that addresses these challenges. First, each agent forms a local model using a low-complexity reproducing kernel Hilbert space representation. From this model, the agent identifies the fundamental samples, obtained by solving the dual problem, and sends them to the central unit. The central unit then forms the global model. We show that the solution of the federated learner converges to that of the centralized learner asymptotically as the sample size increases. The performance of the proposed algorithm is evaluated using experiments on both simulated data and a real data set from an activity recognition task in which the data are collected from a wearable device. The experimental results show that the accuracy of our method converges to that of a centralized learner with increasing sample size.
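Below is a minimal sketch of the federated pipeline the abstract describes, written with assumptions: a standard kernel SVM (scikit-learn's SVC) stands in for the paper's parsimonious RKHS learner, and each agent's "fundamental samples" are approximated by the support vectors of its local dual solution. The function names (local_fundamental_samples, federated_fit) and the synthetic data are illustrative only, not the authors' implementation.

    # Sketch only: SVC is a stand-in for the paper's low-complexity RKHS model,
    # and support vectors approximate the "fundamental samples" from the dual problem.
    import numpy as np
    from sklearn.svm import SVC

    def local_fundamental_samples(X, y, kernel="rbf", C=1.0):
        """Fit a local kernel model and keep only samples with nonzero dual weight."""
        model = SVC(kernel=kernel, C=C)
        model.fit(X, y)
        idx = model.support_          # indices of the support vectors (dual solution)
        return X[idx], y[idx]

    def federated_fit(agent_data, kernel="rbf", C=1.0):
        """Each agent transmits only its fundamental samples; the center refits on their union."""
        X_parts, y_parts = [], []
        for X, y in agent_data:
            Xs, ys = local_fundamental_samples(X, y, kernel=kernel, C=C)
            X_parts.append(Xs)
            y_parts.append(ys)
        X_global = np.vstack(X_parts)
        y_global = np.concatenate(y_parts)
        center = SVC(kernel=kernel, C=C)   # global model formed at the central unit
        center.fit(X_global, y_global)
        return center

    # Example usage with synthetic, non-identically distributed agent data.
    rng = np.random.default_rng(0)
    agents = []
    for shift in (-1.0, 0.0, 1.0):                        # each agent sees a shifted distribution
        X = rng.normal(loc=shift, scale=1.0, size=(200, 2))
        y = (X[:, 0] + X[:, 1] > 2 * shift).astype(int)   # hypothetical local labeling rule
        agents.append((X, y))
    global_model = federated_fit(agents)

The key design point the sketch illustrates is the communication saving: only the samples carrying nonzero dual weight leave each agent, yet the central model is trained on a representative summary of every local distribution.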
