no code implementations • 5 Jun 2023 • Han Xie, Da Zheng, Jun Ma, Houyu Zhang, Vassilis N. Ioannidis, Xiang Song, Qing Ping, Sheng Wang, Carl Yang, Yi Xu, Belinda Zeng, Trishul Chilimbi
Model pre-training on large text corpora has been demonstrated to be effective for various downstream applications in the NLP domain.
1 code implementation • 19 Nov 2021 • Luojun Lin, Han Xie, Zhishu Sun, WeiJie Chen, Wenxi Liu, Yuanlong Yu, Lei Zhang
From this perspective, we introduce a novel paradigm of DG, termed Semi-Supervised Domain Generalization (SSDG), to explore how labeled and unlabeled source domains can interact, and establish two settings: closed-set and open-set SSDG.
1 code implementation • NeurIPS 2021 • Han Xie, Jing Ma, Li Xiong, Carl Yang
Federated learning has emerged as an important paradigm for training machine learning models in different domains.
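The core aggregation step behind the federated learning paradigm mentioned above can be illustrated with federated averaging (FedAvg), a standard aggregation rule in which a server averages client model parameters weighted by local dataset size. This is a minimal self-contained sketch; the function name and toy parameter vectors are illustrative assumptions, not taken from the paper.

```python
# Minimal illustrative sketch of federated averaging (FedAvg).
# Each client holds a flat list of model parameters trained on its
# local data; the server computes a weighted average, where weights
# are proportional to each client's local dataset size.

def fedavg(client_weights, client_sizes):
    """Average client parameter vectors, weighted by dataset size."""
    total = sum(client_sizes)
    num_params = len(client_weights[0])
    return [
        sum(w[i] * n for w, n in zip(client_weights, client_sizes)) / total
        for i in range(num_params)
    ]

# Two hypothetical clients with dataset sizes 10 and 30:
global_model = fedavg([[1.0, 2.0], [3.0, 4.0]], [10, 30])
# The larger client (size 30) pulls the average toward its parameters.
```

In practice the same weighted average is applied per layer to full model state dicts, and the server broadcasts the result back to clients for the next round of local training.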
1 code implementation • 14 Apr 2021 • Chaoyang He, Keshav Balasubramanian, Emir Ceyani, Carl Yang, Han Xie, Lichao Sun, Lifang He, Liangwei Yang, Philip S. Yu, Yu Rong, Peilin Zhao, Junzhou Huang, Murali Annavaram, Salman Avestimehr
FedGraphNN is built on a unified formulation of graph FL and contains a wide range of datasets from different domains, popular GNN models, and FL algorithms, with secure and efficient system support.