no code implementations • 28 Nov 2023 • Jiarong Yang, Yuan Liu, Fangjiong Chen, Wen Chen, Changle Li
Federated learning (FL) is a promising distributed learning framework in which distributed clients collaboratively train a machine learning model under the coordination of a server.
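The entry above describes the standard FL loop: the server broadcasts a global model, clients train locally on their own data, and the server aggregates the results. As a minimal illustrative sketch (not the paper's method), here is federated averaging on a toy one-parameter linear model; the helper names and the toy datasets are invented for illustration:

```python
def local_update(w, data, lr=0.1, epochs=5):
    # Client-side training: plain gradient descent on squared error
    # for the toy model y = w * x, over (x, y) pairs held by this client.
    for _ in range(epochs):
        grad = sum(2 * x * (w * x - y) for x, y in data) / len(data)
        w -= lr * grad
    return w

def fedavg_round(w_global, client_datasets):
    # Server-side aggregation: broadcast w_global, collect each client's
    # locally trained weight, and average weighted by dataset size.
    updates = [(local_update(w_global, d), len(d)) for d in client_datasets]
    total = sum(n for _, n in updates)
    return sum(w * n for w, n in updates) / total

# Two hypothetical clients whose data both come from y = 3 * x.
clients = [
    [(1.0, 3.0), (2.0, 6.0)],
    [(1.0, 3.0), (3.0, 9.0), (0.5, 1.5)],
]
w = 0.0
for _ in range(30):
    w = fedavg_round(w, clients)
# w now approximates the shared optimum 3.0
```

The weighted average is the FedAvg aggregation rule; clients never share their raw data, only model parameters.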
1 code implementation • 1 Aug 2023 • Geyang Guo, Jiarong Yang, Fengyuan Lu, Jiaxin Qin, Tianyi Tang, Wayne Xin Zhao
From an evaluation perspective, we build a benchmark to judge ancient Chinese translation quality across different scenarios and evaluate the ancient Chinese translation capabilities of various existing models.
no code implementations • 11 Dec 2022 • Jiarong Yang, Yuan Liu, Rahif Kassab
Distributed Stein Variational Gradient Descent (DSVGD) is a non-parametric distributed learning framework for federated Bayesian learning, in which multiple clients jointly train a machine learning model by exchanging a set of deterministic (non-random), interacting particles with the server.
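The particles mentioned above are the core of Stein Variational Gradient Descent: each particle is a candidate parameter value, and one update pushes every particle along a kernel-weighted average of log-density gradients (attraction toward high posterior density) plus a kernel-gradient term (repulsion that keeps particles spread out). A minimal single-machine SVGD sketch, assuming a 1-D standard-normal target (so grad log p(x) = -x) and an RBF kernel with bandwidth h; this illustrates the particle update only, not the distributed DSVGD protocol:

```python
import math

def svgd_step(particles, grad_log_p, h=1.0, lr=0.1):
    # One SVGD update with RBF kernel k(xj, xi) = exp(-(xj - xi)^2 / (2 h^2)).
    # phi(xi) = (1/n) * sum_j [ k(xj, xi) * grad_log_p(xj)   (attraction)
    #                           + d/dxj k(xj, xi) ]            (repulsion)
    n = len(particles)
    new = []
    for xi in particles:
        phi = 0.0
        for xj in particles:
            k = math.exp(-((xj - xi) ** 2) / (2 * h ** 2))
            phi += k * grad_log_p(xj) + (xi - xj) / h ** 2 * k
        new.append(xi + lr * phi / n)
    return new

# Illustrative run: 5 particles transported toward a standard normal.
particles = [-3.0, -1.0, 0.5, 2.0, 4.0]
for _ in range(500):
    particles = svgd_step(particles, lambda x: -x)
# particles now approximate samples from N(0, 1)
```

Because the update is deterministic given the particle positions, the same set of particles can be handed back and forth between clients and a server, which is the communication pattern the DSVGD abstract refers to.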