Search Results for author: Kelly Kostopoulou

Found 3 papers, 2 papers with code

Packing Privacy Budget Efficiently

no code implementations • 26 Dec 2022 • Pierre Tholoniat, Kelly Kostopoulou, Mosharaf Chowdhury, Asaf Cidon, Roxana Geambasu, Mathias Lécuyer, Junfeng Yang

This DP budget can be regarded as a new type of compute resource in workloads where multiple ML models are trained on user data.

Fairness • Scheduling
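
To make the "DP budget as a compute resource" framing concrete, here is a minimal sketch of an admission check that grants training tasks only while their cumulative epsilon fits within a fixed budget. The names (PrivacyBudget, Task, try_schedule) and the first-come-first-served policy are illustrative assumptions, not the scheduler proposed in the paper.

```python
# Minimal sketch: treating a differential-privacy (DP) budget as a schedulable
# resource shared by several ML training tasks over the same user data.
# All names and the admission policy are illustrative, not the paper's API.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Task:
    name: str
    epsilon: float  # DP budget this training task would consume


@dataclass
class PrivacyBudget:
    capacity: float          # total epsilon available for a block of user data
    consumed: float = 0.0
    granted: List[str] = field(default_factory=list)

    def try_schedule(self, task: Task) -> bool:
        """Grant the task only if it fits in the remaining budget."""
        if self.consumed + task.epsilon <= self.capacity:
            self.consumed += task.epsilon
            self.granted.append(task.name)
            return True
        return False


if __name__ == "__main__":
    budget = PrivacyBudget(capacity=10.0)
    tasks = [Task("model-A", 4.0), Task("model-B", 5.0), Task("model-C", 3.0)]
    for t in tasks:
        ok = budget.try_schedule(t)
        print(f"{t.name}: {'granted' if ok else 'denied'} "
              f"({budget.consumed}/{budget.capacity} epsilon consumed)")
```

Under this toy policy, model-C is denied because admitting it would exceed the budget; the paper's focus is on packing such requests efficiently rather than serving them first-come-first-served.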

DeepReduce: A Sparse-tensor Communication Framework for Federated Deep Learning

1 code implementation • NeurIPS 2021 • Hang Xu, Kelly Kostopoulou, Aritra Dutta, Xin Li, Alexandros Ntoulas, Panos Kalnis

DeepReduce is orthogonal to existing gradient sparsifiers and can be applied in conjunction with them, transparently to the end-user, to significantly lower the communication overhead.
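
The sketch below illustrates the kind of composition this snippet describes: a standard Top-k gradient sparsifier produces a sparse tensor, and a DeepReduce-style step then encodes the index and value components separately before communication. The specific codecs shown (delta-encoded indices, fp16 values) are illustrative assumptions, not the encoders used in the paper.

```python
# Minimal sketch: composing a Top-k sparsifier with separate (decoupled)
# encoding of indices and values before sending the gradient over the wire.
# Codec choices here are placeholders, not the paper's encoders.
import numpy as np


def topk_sparsify(grad: np.ndarray, k: int):
    """Keep the k largest-magnitude entries of a flattened gradient."""
    flat = grad.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    idx.sort()
    return idx, flat[idx]


def encode(indices: np.ndarray, values: np.ndarray):
    """Compress indices and values independently."""
    delta_idx = np.diff(indices, prepend=0).astype(np.uint32)  # smaller deltas
    fp16_vals = values.astype(np.float16)                      # lossy value codec
    return delta_idx, fp16_vals


def decode(delta_idx: np.ndarray, fp16_vals: np.ndarray, size: int) -> np.ndarray:
    """Rebuild a dense gradient from the two independently decoded streams."""
    indices = np.cumsum(delta_idx)
    dense = np.zeros(size, dtype=np.float32)
    dense[indices] = fp16_vals.astype(np.float32)
    return dense


if __name__ == "__main__":
    grad = np.random.randn(1_000_000).astype(np.float32)
    idx, vals = topk_sparsify(grad, k=10_000)   # existing sparsifier
    payload = encode(idx, vals)                 # decoupled compression step
    restored = decode(*payload, size=grad.size)
    print("nonzeros restored:", np.count_nonzero(restored))
```

The sparsifier is untouched by the encoding step, which is what makes the two pieces composable from the end user's point of view.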

DeepReduce: A Sparse-tensor Communication Framework for Distributed Deep Learning

1 code implementation • NeurIPS 2021 • Kelly Kostopoulou, Hang Xu, Aritra Dutta, Xin Li, Alexandros Ntoulas, Panos Kalnis

This paper introduces DeepReduce, a versatile framework for the compressed communication of sparse tensors, tailored for distributed deep learning.
