no code implementations • 20 Nov 2021 • Zhixiong Yue, Feiyang Ye, Yu Zhang, Christy Liang, Ivor W. Tsang
We theoretically study the safeness of both learning strategies in the DSMTL model to show that the proposed methods can achieve some versions of safe multi-task learning.
no code implementations • 12 Sep 2021 • Zhixiong Yue, Pengxin Guo, Yu Zhang
Based on the PC function, we propose a new method called Domain Adaptation by Maximizing Population Correlation (DAMPC) to learn a domain-invariant feature representation for DA.
no code implementations • NeurIPS 2021 • Feiyang Ye, Baijiong Lin, Zhixiong Yue, Pengxin Guo, Qiao Xiao, Yu Zhang
Empirically, we show the effectiveness of the proposed MOML framework in several meta learning problems, including few-shot learning, neural architecture search, domain adaptation, and multi-task learning.
no code implementations • 19 Nov 2020 • Zhixiong Yue, Baijiong Lin, Xiaonan Huang, Yu Zhang
Although NAS methods can find network architectures with state-of-the-art performance, adversarial robustness and resource constraints are often ignored in NAS.