Real-Time Decentralized Knowledge Transfer at the Edge

11 Nov 2020 · Orpaz Goldstein, Mohammad Kachuee, Derek Shiell, Majid Sarrafzadeh

The proliferation of edge networks creates islands of learning agents working on local streams of data. Transferring knowledge between these agents in real time, without exposing private data, allows them to collaborate to decrease learning time and increase model confidence. Incorporating knowledge from data that a local model has never seen makes it possible to debias the local model or extend its classification abilities to previously unseen data. Performing this transfer selectively and in a decentralized manner lets each model retain its local insights, allowing for local flavors of a machine learning model. This approach suits the decentralized architecture of edge networks, where a local edge node serves a community of learning agents that are likely to encounter similar data. We propose a method based on knowledge distillation for pairwise knowledge transfer pipelines between models trained on non-i.i.d. data and compare it to other popular knowledge transfer methods. Additionally, we test different scenarios of knowledge transfer network construction and show the practicality of our approach. Our experiments show that knowledge transfer using our method outperforms standard methods in a real-time transfer scenario.
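The abstract does not spell out the transfer pipeline itself; as a rough, non-authoritative sketch of the kind of pairwise knowledge distillation it refers to, a peer model's softened predictions can serve as soft targets for the local model, so that only predictions (not private data) cross agent boundaries. The function and parameter names below (teacher, student, temperature, alpha) are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of pairwise knowledge distillation between two edge agents
# (assumed setup, not the paper's code): the local "student" model learns from a
# peer "teacher" model's soft predictions on locally available data.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, temperature=2.0, alpha=0.5):
    """Blend a soft KL term (teacher -> student) with the usual hard-label loss."""
    soft_targets = F.softmax(teacher_logits / temperature, dim=1)
    log_student = F.log_softmax(student_logits / temperature, dim=1)
    # KL divergence between softened distributions, scaled by T^2 as in standard distillation
    kd = F.kl_div(log_student, soft_targets, reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

def transfer_step(student, teacher, batch, labels, optimizer):
    """One pairwise transfer update: only the teacher's predictions are shared, not its data."""
    teacher.eval()
    with torch.no_grad():
        teacher_logits = teacher(batch)
    student_logits = student(batch)
    loss = distillation_loss(student_logits, teacher_logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Because the local model keeps training on its own stream while distilling from peers, each agent can absorb knowledge about unseen data while preserving its local specialization.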
