no code implementations • 28 Jun 2023 • Ganyu Wang, Qingsong Zhang, Li Xiang, Boyu Wang, Bin Gu, Charles Ling
Meanwhile, the upstream model (server) is updated locally with first-order optimization (FOO), which significantly improves the convergence rate and makes it feasible to train large models without compromising privacy or security.
no code implementations • 19 Mar 2022 • Qingsong Zhang, Bin Gu, Zhiyuan Dang, Cheng Deng, Heng Huang
Building on this, we propose a novel and practical VFL framework for black-box models, one that inherits the promising properties of zeroth-order optimization (ZOO).
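As a rough illustration of the kind of zeroth-order optimization the entry refers to, the sketch below implements the standard two-point Gaussian-smoothing gradient estimator: it queries only function values of a black-box model, never its gradients. This is a generic textbook estimator, not necessarily the exact construction used in the paper; the function name `zoo_gradient` and all parameters are illustrative.

```python
import numpy as np

def zoo_gradient(f, x, mu=1e-4, n_samples=500, rng=None):
    """Two-point zeroth-order gradient estimate of f at x.

    Illustrative helper (not from the paper): averages directional
    finite differences along random Gaussian directions, so only
    function evaluations of the black-box model are required.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = x.shape[0]
    grad = np.zeros(d)
    for _ in range(n_samples):
        u = rng.standard_normal(d)
        # Directional finite difference times the direction vector.
        grad += (f(x + mu * u) - f(x)) / mu * u
    return grad / n_samples

# Sanity check on a toy objective: f(x) = ||x||^2 has gradient 2x.
f = lambda x: float(np.dot(x, x))
x = np.array([1.0, -2.0])
g = zoo_gradient(f, x, rng=np.random.default_rng(0))
```

Because only `f` evaluations are needed, a party holding a proprietary or non-differentiable sub-model can still participate in joint training, which is the property the framework exploits.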
no code implementations • 26 Sep 2021 • Qingsong Zhang, Bin Gu, Cheng Deng, Songxiang Gu, Liefeng Bo, Jian Pei, Heng Huang
To address the challenges of communication and computation resource utilization, we propose an asynchronous stochastic quasi-Newton (AsySQN) framework for VFL, under which three algorithms, i.e., AsySQN-SGD, -SVRG, and -SAGA, are proposed.
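For readers unfamiliar with the variance-reduced updates named above, here is a minimal single-machine sketch of the SVRG step that underlies the -SVRG variant. It omits both the asynchrony and the quasi-Newton scaling that AsySQN adds; the function `svrg` and the toy least-squares problem are illustrative, not the paper's setup.

```python
import numpy as np

def svrg(grad_i, n, w0, lr=0.05, epochs=30, inner=50, rng=None):
    """Plain SVRG sketch: periodically take a full-gradient snapshot,
    then run inner steps with the variance-reduced stochastic gradient
    grad_i(w, i) - grad_i(w_snap, i) + full_grad.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    w = w0.copy()
    for _ in range(epochs):
        w_snap = w.copy()
        full = sum(grad_i(w_snap, i) for i in range(n)) / n
        for _ in range(inner):
            i = rng.integers(n)
            g = grad_i(w, i) - grad_i(w_snap, i) + full
            w -= lr * g
    return w

# Toy least squares: f_i(w) = 0.5 * (a_i . w - b_i)^2.
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [1.0, -1.0]])
b = np.array([1.0, 2.0, 3.0, 0.0])
grad_i = lambda w, i: (A[i] @ w - b[i]) * A[i]
w = svrg(grad_i, n=4, w0=np.zeros(2))
w_star = np.linalg.lstsq(A, b, rcond=None)[0]
```

The variance-reduced gradient vanishes at the optimum, which is what allows constant step sizes and the faster convergence the abstract refers to; the quasi-Newton part of AsySQN further preconditions this update.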
no code implementations • 1 Mar 2021 • Qingsong Zhang, Bin Gu, Cheng Deng, Heng Huang
Vertical federated learning (VFL) has attracted increasing attention owing to emerging demands for multi-party collaborative modeling and concerns about privacy leakage.