no code implementations • 29 Sep 2021 • Jing Liu, Chulin Xie, Krishnaram Kenthapadi, Oluwasanmi O Koyejo, Bo Li
Vertical Federated Learning (VFL) is a distributed learning paradigm that allows multiple agents to jointly train a global model when each agent holds a different subset of features for the same sample(s).
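The VFL setting described above can be sketched with a toy model (hypothetical and illustrative only, not the paper's algorithm): two agents hold disjoint feature columns of the same samples and jointly fit a logistic model by exchanging partial scores and a shared residual rather than raw features.

```python
import numpy as np

# Toy sketch of vertical federated learning: agents A and B hold
# different feature subsets for the same samples and jointly train
# one global logistic model. Names and setup are illustrative.
rng = np.random.default_rng(0)
n = 200
X_a = rng.normal(size=(n, 3))            # features held by agent A
X_b = rng.normal(size=(n, 2))            # features held by agent B
w_true = rng.normal(size=5)
y = (np.hstack([X_a, X_b]) @ w_true > 0).astype(float)

w_a = np.zeros(3)                        # agent A's local parameters
w_b = np.zeros(2)                        # agent B's local parameters
lr = 0.1
for _ in range(500):
    logits = X_a @ w_a + X_b @ w_b       # partial scores are summed
    p = 1.0 / (1.0 + np.exp(-logits))
    err = p - y                          # shared residual, not raw features
    w_a -= lr * X_a.T @ err / n          # each agent updates locally
    w_b -= lr * X_b.T @ err / n

acc = ((X_a @ w_a + X_b @ w_b > 0) == y).mean()
```

In a real VFL deployment the exchanged residuals and scores would additionally be protected (e.g., with encryption or secure aggregation), which this sketch omits.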
no code implementations • 29 Sep 2021 • Maohao Shen, Bowen Jiang, Jacky Y. Zhang, Oluwasanmi O Koyejo
We propose a novel and general framework (i.e., SABAL) that formulates batch active learning as a sparse approximation problem.
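The sparse-approximation view can be illustrated with a loose analogue (not SABAL itself, whose formulation differs): greedily pick a small batch whose mean embedding approximates the mean embedding of the full unlabeled pool, in the spirit of kernel herding.

```python
import numpy as np

# Illustrative analogue of batch selection as sparse approximation:
# choose 10 points whose running mean best tracks the pool mean.
# All quantities here are synthetic placeholders.
rng = np.random.default_rng(1)
pool = rng.normal(size=(100, 8))         # unlabeled-pool embeddings
target = pool.mean(axis=0)               # quantity to approximate sparsely

batch, chosen = [], np.zeros(8)
for t in range(1, 11):                   # select a batch of 10
    # candidate running means if each pool point were added next
    cand = (chosen * (t - 1) + pool) / t
    best = int(np.argmin(np.linalg.norm(cand - target, axis=1)))
    batch.append(best)
    chosen = cand[best]

err = np.linalg.norm(chosen - target)    # sparse-approximation residual
```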
no code implementations • 29 Sep 2021 • Andrew Liu, Jacky Y. Zhang, Nishant Kumar, Dakshita Khurana, Oluwasanmi O Koyejo
Federated averaging, the most popular aggregation approach in federated learning, is known to be vulnerable to failures and adversarial updates from clients that wish to disrupt training.
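The vulnerability of plain averaging is easy to demonstrate numerically. The sketch below contrasts the mean with the coordinate-wise median, a common robust aggregator used here only for illustration (the paper's defense may differ): a single malicious client drags the mean arbitrarily far, while the median stays near the honest updates.

```python
import numpy as np

# Three honest client updates cluster around [1, 2]; one attacker
# submits an extreme update to disrupt training.
honest = [np.array([1.0, 2.0]), np.array([1.1, 1.9]), np.array([0.9, 2.1])]
malicious = np.array([100.0, -100.0])

updates = honest + [malicious]
fed_avg = np.mean(updates, axis=0)       # pulled far off by the attacker
fed_median = np.median(updates, axis=0)  # stays near the honest consensus
```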
no code implementations • 29 Sep 2021 • Xiaoyang Wang, Han Zhao, Klara Nahrstedt, Oluwasanmi O Koyejo
To this end, we propose a strategy to mitigate the effect of spurious features based on our observation that the global model in the federated learning step has a low accuracy disparity due to statistical heterogeneity.
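The accuracy-disparity quantity referenced above can be made concrete with a toy computation (illustrative only, with hypothetical labels and groups; not the paper's mitigation strategy): disparity is the gap between a model's best and worst per-group accuracy, and a low gap means the model serves the groups similarly.

```python
import numpy as np

# Hypothetical predictions, labels, and group memberships.
y_true = np.array([1, 0, 1, 1, 0, 1, 0, 0])
y_pred = np.array([1, 0, 1, 1, 0, 1, 1, 0])
group  = np.array([0, 0, 0, 0, 1, 1, 1, 1])

# Per-group accuracy, then the gap between best and worst group.
accs = [(y_pred[group == g] == y_true[group == g]).mean() for g in (0, 1)]
disparity = max(accs) - min(accs)        # 0 means equal accuracy across groups
```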
1 code implementation • 1st Conference on Causal Learning and Reasoning 2022 • Xiaoyang Wang, Klara Nahrstedt, Oluwasanmi O Koyejo
Current approaches for learning disentangled representations assume that independent latent variables generate the data through a single data generation process.
no code implementations • 28 Sep 2020 • Kaizhao Liang, Jacky Y. Zhang, Oluwasanmi O Koyejo, Bo Li
Despite the immense success of deep neural networks (DNNs), "adversarial examples", perturbed inputs crafted to mislead DNNs into making mistakes, have raised serious concerns.
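A minimal sketch of what such a perturbed input looks like, using an FGSM-style attack (Goodfellow et al.) on a toy logistic model; this is illustrative only and not the attack studied in the paper. The input is nudged by eps times the sign of the loss gradient, which provably increases the loss of this model.

```python
import numpy as np

# Toy "network": a logistic model with fixed random weights.
rng = np.random.default_rng(2)
w = rng.normal(size=10)
x = rng.normal(size=10)                  # clean input
y = 1.0 if w @ x > 0 else 0.0            # the model's own label for x

def prob(v):
    return 1.0 / (1.0 + np.exp(-(w @ v)))

def loss(v):
    p = prob(v)
    return -(y * np.log(p) + (1 - y) * np.log(1 - p))

grad = (prob(x) - y) * w                 # d loss / d input for logistic loss
x_adv = x + 0.5 * np.sign(grad)          # FGSM-style step with eps = 0.5
```

Because the perturbation moves every coordinate in the gradient's sign direction, the logit shifts toward the wrong class, so `loss(x_adv)` exceeds `loss(x)`.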