4 Mar 2023 • Yamin Sepehri, Pedram Pad, Ahmet Caner Yüzügüler, Pascal Frossard, L. Andrea Dunbar
In this study, a novel hierarchical training method for deep neural networks is proposed that uses early exits in an architecture divided between edge and cloud workers to reduce communication cost, training runtime, and privacy concerns.
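The idea of splitting a network between an edge worker (which also trains an early-exit head locally) and a cloud worker (which only ever sees the intermediate activation, not the raw input) can be sketched as follows. This is a minimal NumPy illustration under assumed toy dimensions and a plain cross-entropy objective, not the paper's actual method; all layer names and sizes here are hypothetical.

```python
import numpy as np

# Hedged sketch of split edge/cloud training with an early exit.
# The edge holds the first layer W1 plus an early-exit head We; the
# cloud holds the final classifier W2. Only the activation h (never
# the raw input X) crosses the edge-cloud boundary, and the edge
# trains W1 from its local early-exit loss, so no gradient needs to
# be sent back from the cloud.

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(p, y):
    return -np.log(p[np.arange(len(y)), y] + 1e-12).mean()

# Toy data: 8-dim inputs, 3 classes (assumed sizes).
X = rng.normal(size=(64, 8))
y = rng.integers(0, 3, size=64)

W1 = rng.normal(scale=0.1, size=(8, 16))   # edge layer
We = rng.normal(scale=0.1, size=(16, 3))   # early-exit head (edge)
W2 = rng.normal(scale=0.1, size=(16, 3))   # cloud classifier

lr = 0.5
losses = []
for _ in range(50):
    # --- edge worker ---
    h = np.maximum(0.0, X @ W1)            # shared representation
    p_exit = softmax(h @ We)               # local early-exit prediction
    # --- cloud worker: receives only h ---
    p_cloud = softmax(h @ W2)
    losses.append(cross_entropy(p_exit, y) + cross_entropy(p_cloud, y))

    Y = np.eye(3)[y]                       # one-hot targets
    g_exit = (p_exit - Y) / len(y)
    g_cloud = (p_cloud - Y) / len(y)
    # Edge backpropagates only through its own early-exit loss.
    g_h = g_exit @ We.T
    g_h[h <= 0] = 0.0
    W1 -= lr * X.T @ g_h
    We -= lr * h.T @ g_exit
    # Cloud updates its classifier independently.
    W2 -= lr * h.T @ g_cloud
```

In this toy setup the per-step communication is the activation `h` (64x16 floats) instead of raw inputs plus full backward gradients, which is the kind of saving the abstract refers to.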
28 Jun 2021 • Yamin Sepehri, Pedram Pad, Pascal Frossard, L. Andrea Dunbar
Also, in contrast to previous optical privacy-preserving methods, which cannot be trained, our method is data-driven and optimized for the specific application at hand.