no code implementations • 25 Jul 2024 • Max Emerick, Stacy Patterson, Bassam Bamieh
Assignments have costs related to the distances between mutually assigned agents, and the overall cost of an assignment is quantified by a Wasserstein distance between the densities of the two agent classes.
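For a concrete sense of this cost, here is a minimal sketch (not the paper's continuous-density formulation): for two equal-size sets of agents on the line with uniform weights, the 1-Wasserstein distance is the mean travel distance under the optimal assignment, which simply pairs the sorted positions.

```python
import numpy as np

def wasserstein_1d(x, y):
    """1-Wasserstein distance between two equal-size empirical
    distributions on the line: match sorted samples pairwise."""
    x, y = np.sort(np.asarray(x, float)), np.sort(np.asarray(y, float))
    assert x.shape == y.shape, "equal-size agent sets assumed"
    return float(np.mean(np.abs(x - y)))

# Two classes of agents; the optimal assignment pairs them in
# sorted order, and the cost is the mean distance traveled.
cost = wasserstein_1d([0.0, 1.0, 2.0], [0.5, 1.5, 2.5])
print(cost)  # 0.5
```

In one dimension this sorted matching is exactly optimal; in higher dimensions computing the distance requires solving an optimal-transport problem.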
1 code implementation • 9 Jul 2024 • Linh Tran, Sanjay Chari, Md. Saikat Islam Khan, Aaron Zachariah, Stacy Patterson, Oshani Seneviratne
We provide the first prototype application of differential privacy with blockchain for vertical federated learning.
no code implementations • 6 Feb 2024 • Lei Yu, Meng Han, Yiming Li, Changting Lin, Yao Zhang, Mingyang Zhang, Yan Liu, Haiqin Weng, Yuseok Jeon, Ka-Ho Chow, Stacy Patterson
Vertical Federated Learning (VFL) is a federated learning paradigm where multiple participants, who share the same set of samples but hold different features, jointly train machine learning models.
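A toy illustration of the data layout this describes (sizes and party count are assumed for the example): parties share the same samples (rows) but hold disjoint feature columns, and only the columns are partitioned.

```python
import numpy as np

# Assumed toy setup: three parties, same six samples, disjoint features.
rng = np.random.default_rng(0)
X = rng.standard_normal((6, 9))        # full data, never held in one place
parts = np.split(X, 3, axis=1)         # each party's vertical feature shard
assert all(p.shape == (6, 3) for p in parts)

# Rows stay aligned across parties, so sample i's features are split
# across all three shards; concatenating columns recovers the full row.
recon = np.concatenate(parts, axis=1)
print(np.allclose(recon, X))  # True
```

In an actual VFL protocol the concatenation never happens: each party keeps its shard private and exchanges only intermediate representations or gradients.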
no code implementations • 3 May 2023 • Timothy Castiglia, Yi Zhou, Shiqiang Wang, Swanand Kadhe, Nathalie Baracaldo, Stacy Patterson
As part of the training, the parties wish to remove unimportant features in the system to improve generalization, efficiency, and explainability.
no code implementations • 25 Jul 2022 • Yuhao Yi, Yuan Wang, Xingkang He, Stacy Patterson, Karl H. Johansson
In this paper, we propose a sample-based algorithm to approximately test $r$-robustness of a digraph with $n$ vertices and $m$ edges.
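The following is an illustrative Monte Carlo sketch of such a test, not the authors' algorithm: a digraph is $r$-robust if for every pair of nonempty disjoint vertex subsets, at least one subset contains a vertex with at least $r$ in-neighbors outside its own set; the sketch samples random disjoint pairs and reports a violation if it finds one.

```python
import random

def r_reachable(in_nbrs, S, r):
    """S is r-reachable if some node in S has >= r in-neighbors outside S."""
    return any(len(in_nbrs[v] - S) >= r for v in S)

def sample_test_r_robust(in_nbrs, r, trials=2000, seed=0):
    """Randomized check: sample disjoint nonempty subset pairs (S1, S2).
    r-robustness requires at least one set in each pair be r-reachable.
    Returns False on a witnessed violation, True if none was found
    (so True means 'probably r-robust', not a proof)."""
    rng = random.Random(seed)
    nodes = list(in_nbrs)
    for _ in range(trials):
        labels = [rng.randrange(3) for _ in nodes]  # 0: S1, 1: S2, 2: neither
        S1 = {v for v, l in zip(nodes, labels) if l == 0}
        S2 = {v for v, l in zip(nodes, labels) if l == 1}
        if S1 and S2 and not (r_reachable(in_nbrs, S1, r)
                              or r_reachable(in_nbrs, S2, r)):
            return False
    return True

# Directed 4-cycle: every node has exactly one in-neighbor, so the
# graph is 1-robust but cannot be 2-robust.
cycle = {i: {(i - 1) % 4} for i in range(4)}
print(sample_test_r_robust(cycle, r=1))  # True
print(sample_test_r_robust(cycle, r=2))  # False
```

Exact testing requires examining exponentially many subset pairs, which is why a sampling-based approximation with guarantees on its error is attractive.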
no code implementations • 16 Jun 2022 • Timothy Castiglia, Anirban Das, Shiqiang Wang, Stacy Patterson
Our work provides the first theoretical analysis of the effect message compression has on distributed training over vertically partitioned data.
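As a minimal sketch of the kind of compressor analyzed (uniform scalar quantization is an assumption here, not necessarily the paper's operator): a party quantizes its local representation before transmitting it, and training proceeds on the lossy message.

```python
import numpy as np

def quantize(x, bits=4):
    """Uniform scalar quantizer: snap each entry to one of 2**bits
    levels spanning [min, max]. Returns the dequantized (lossy)
    message the receiver actually trains on."""
    lo, hi = x.min(), x.max()
    levels = 2 ** bits - 1
    scale = (hi - lo) / levels if hi > lo else 1.0
    return np.round((x - lo) / scale) * scale + lo

rng = np.random.default_rng(0)
embedding = rng.standard_normal(8)       # a party's local representation
msg = quantize(embedding, bits=4)        # compressed message to the server
print(float(np.max(np.abs(embedding - msg))))  # error bounded by scale / 2
```

The quantization error is bounded by half a quantization step, and analyses of compressed distributed training typically propagate exactly this kind of bounded-distortion assumption through the convergence proof.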
no code implementations • 19 Aug 2021 • Anirban Das, Timothy Castiglia, Shiqiang Wang, Stacy Patterson
Each silo contains a hub and a set of clients, with the silo's vertical data shard partitioned horizontally across its clients.
no code implementations • 6 Feb 2021 • Anirban Das, Stacy Patterson
Each silo contains a hub and a set of clients, with the silo's vertical data shard partitioned horizontally across its clients.
no code implementations • ICLR 2021 • Timothy Castiglia, Anirban Das, Stacy Patterson
We propose Multi-Level Local SGD, a distributed stochastic gradient method for learning a smooth, non-convex objective in a multi-level communication network with heterogeneous workers.
1 code implementation • 27 Jul 2020 • Timothy Castiglia, Anirban Das, Stacy Patterson
In our algorithm, sub-networks execute a distributed SGD algorithm, using a hub-and-spoke paradigm, and the hubs periodically average their models with neighboring hubs.
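A toy simulation of that hub-and-spoke pattern (the quadratic objectives, step counts, and two-hub topology are assumptions for illustration): workers take local SGD steps from their hub's model, each hub averages its workers, and hubs then average with neighboring hubs.

```python
import numpy as np

# Each hub serves two workers; worker with target t minimizes
# f(w) = 0.5 * (w - t)^2, so the global optimum is the mean target, 4.0.
targets = {0: [1.0, 3.0], 1: [5.0, 7.0]}   # per-hub worker optima
hub_nbrs = {0: [0, 1], 1: [0, 1]}          # hubs fully connected
w = {h: 0.0 for h in targets}              # one model per hub

for _ in range(50):                        # communication rounds
    hub_avg = {}
    for h, ts in targets.items():
        ws = []
        for t in ts:
            wi = w[h]                      # worker starts from hub model
            for _ in range(5):             # local SGD steps
                wi -= 0.1 * (wi - t)       # gradient of 0.5 * (wi - t)^2
            ws.append(wi)
        hub_avg[h] = float(np.mean(ws))    # hub averages its workers
    # Hubs periodically average their models with neighboring hubs.
    w = {h: float(np.mean([hub_avg[g] for g in hub_nbrs[h]])) for h in targets}

print(round(w[0], 2), round(w[1], 2))      # 4.0 4.0
```

Even though no worker ever sees another silo's data, the periodic hub-level averaging drives every model toward the global optimum.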
1 code implementation • 14 Nov 2018 • Anirban Das, Stacy Patterson, Mike P. Wittie
The emerging trend of edge computing has led several cloud providers to release their own platforms for performing computation at the 'edge' of the network.
Networking and Internet Architecture • Distributed, Parallel, and Cluster Computing