no code implementations • 31 Mar 2023 • Mart van Baalen, Andrey Kuzmin, Suparna S Nair, Yuwei Ren, Eric Mahurin, Chirag Patel, Sundar Subramanian, Sanghyuk Lee, Markus Nagel, Joseph Soriaga, Tijmen Blankevoort
We theoretically show the difference between the INT and FP formats for neural networks and present a plethora of post-training quantization and quantization-aware-training results to show how this theory translates to practice.
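As a minimal numerical sketch of the INT-versus-FP contrast the abstract refers to (not the paper's actual analysis), the snippet below quantizes a bell-shaped weight tensor with a uniform integer grid and with a toy low-bit floating-point grid, then compares reconstruction error. The bit widths and the simplified FP rounding are illustrative assumptions.

```python
import numpy as np

def quantize_int(x, bits=8):
    # Uniform (INT) quantization: an evenly spaced grid scaled to the tensor range.
    scale = np.max(np.abs(x)) / (2 ** (bits - 1) - 1)
    return np.round(x / scale) * scale

def quantize_fp(x, mantissa_bits=3, exponent_bits=4):
    # Toy floating-point (FP) quantization: round each value's mantissa to
    # `mantissa_bits`, with exponents clamped to the representable range.
    sign = np.sign(x)
    mag = np.where(np.abs(x) == 0, 1e-45, np.abs(x))  # avoid log2(0)
    exp = np.floor(np.log2(mag))
    max_exp = 2 ** (exponent_bits - 1) - 1
    exp = np.clip(exp, -max_exp, max_exp)
    mantissa = np.round(mag / 2 ** exp * 2 ** mantissa_bits) / 2 ** mantissa_bits
    return sign * mantissa * 2 ** exp

rng = np.random.default_rng(0)
w = rng.normal(size=10_000)  # bell-shaped weights, as in typical NN layers
mse_int = np.mean((w - quantize_int(w)) ** 2)
mse_fp = np.mean((w - quantize_fp(w)) ** 2)
print(f"INT8 MSE: {mse_int:.2e}  FP8-like MSE: {mse_fp:.2e}")
```

The uniform grid has constant absolute step size, while the FP grid spends its levels logarithmically, which is the trade-off the paper studies for neural-network weight and activation distributions.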
no code implementations • 19 Nov 2021 • Christos Louizos, Matthias Reisser, Joseph Soriaga, Max Welling
Federated learning describes the distributed training of models across multiple clients while keeping the data private on-device.
no code implementations • 23 Apr 2021 • Mohammad Samragh, Hossein Hosseini, Aleksei Triastcyn, Kambiz Azarian, Joseph Soriaga, Farinaz Koushanfar
In our method, the edge device runs the model up to a split layer chosen according to its computational capacity.
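The split-computation idea can be sketched as follows; this is a toy MLP under assumed shapes, not the paper's architecture. The edge executes the layers before the split point and ships the intermediate activation to the cloud, which finishes the forward pass.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical 4-layer ReLU MLP; weights stand in for a trained model.
layers = [rng.normal(size=(16, 16)) * 0.1 for _ in range(4)]

def run_layers(x, weights):
    for w in weights:
        x = np.maximum(x @ w, 0.0)  # linear layer followed by ReLU
    return x

x = rng.normal(size=(1, 16))
split = 2  # chosen from the edge device's compute budget
edge_out = run_layers(x, layers[:split])           # computed on-device
cloud_out = run_layers(edge_out, layers[split:])   # computed server-side
full_out = run_layers(x, layers)
assert np.allclose(cloud_out, full_out)  # split inference matches end-to-end
```

Only the intermediate activation leaves the device, which is what enables the low edge-compute and privacy properties the entry describes.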
no code implementations • 18 Apr 2021 • Hossein Hosseini, Hyunsin Park, Sungrack Yun, Christos Louizos, Joseph Soriaga, Max Welling
We consider the problem of training User Verification (UV) models in a federated setting, where each user has access to the data of only one class and user embeddings cannot be shared with the server or other users.
no code implementations • 1 Jan 2021 • Hossein Hosseini, Hyunsin Park, Sungrack Yun, Christos Louizos, Joseph Soriaga, Max Welling
We consider the problem of training User Verification (UV) models in a federated setup, where the conventional loss functions are not applicable due to the constraints that each user has access to the data of only one class and user embeddings cannot be shared with the server or other users.
no code implementations • 1 Jan 2021 • Christos Louizos, Matthias Reisser, Joseph Soriaga, Max Welling
Federated averaging (FedAvg), despite its simplicity, has been the main approach in training neural networks in the federated learning setting.
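For readers unfamiliar with the baseline, here is a minimal FedAvg sketch on a synthetic least-squares task (the task, client count, and hyperparameters are illustrative assumptions, not from the paper): each round, clients run a few local SGD steps from the broadcast model, and the server averages the results weighted by client data size.

```python
import numpy as np

def local_sgd(w, data, lr=0.1, steps=5):
    # Each client takes a few SGD steps on its own data (least-squares toy task).
    X, y = data
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def fedavg_round(w_global, client_data):
    # One FedAvg round: broadcast, local training, size-weighted averaging.
    updates = [local_sgd(w_global.copy(), d) for d in client_data]
    sizes = np.array([len(d[1]) for d in client_data], dtype=float)
    return np.average(updates, axis=0, weights=sizes)

rng = np.random.default_rng(0)
w_true = rng.normal(size=3)
clients = []
for _ in range(4):
    X = rng.normal(size=(20, 3))
    clients.append((X, X @ w_true))  # each client holds its own samples

w = np.zeros(3)
for _ in range(50):
    w = fedavg_round(w, clients)
print(np.linalg.norm(w - w_true))  # distance to the true model shrinks toward 0
```

On this convex toy problem the averaged model recovers the shared optimum; the open questions around FedAvg concern non-IID data and non-convex models, which is where works like this one intervene.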
no code implementations • 1 Jan 2021 • Mohammad Samragh, Hossein Hosseini, Kambiz Azarian, Joseph Soriaga
Splitting network computations between the edge device and the cloud server is a promising approach for enabling low edge-compute and private inference of neural networks.
no code implementations • 9 Jul 2020 • Hossein Hosseini, Sungrack Yun, Hyunsin Park, Christos Louizos, Joseph Soriaga, Max Welling
In this paper, we propose Federated User Authentication (FedUA), a framework for privacy-preserving training of UA models.