no code implementations • 29 Jun 2021 • Yang Shu, Zhi Kou, Zhangjie Cao, Jianmin Wang, Mingsheng Long
To address these challenges, we propose Zoo-Tuning, which learns to adaptively transfer the parameters of pretrained models to the target task.
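A minimal, illustrative sketch of the core idea (not the authors' implementation): a target layer's weights are formed as a learned, adaptive mixture of the corresponding pretrained layers from several source models. The class and variable names below are hypothetical, and the paper's gating is richer than a single softmax over models.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptiveAggregatedConv(nn.Module):
    """Aggregate the conv weights of several pretrained models with learned mixing weights."""
    def __init__(self, pretrained_weights):
        # pretrained_weights: list of conv weight tensors with identical shapes,
        # one per pretrained source model.
        super().__init__()
        # Stack source weights into a (num_models, out_c, in_c, k, k) buffer.
        self.register_buffer("source_weights", torch.stack(pretrained_weights))
        # One learnable logit per source model; softmax turns them into mixing weights.
        self.logits = nn.Parameter(torch.zeros(len(pretrained_weights)))

    def forward(self, x):
        mix = torch.softmax(self.logits, dim=0)  # (num_models,)
        weight = (mix.view(-1, 1, 1, 1, 1) * self.source_weights).sum(dim=0)
        return F.conv2d(x, weight, padding=1)

# Usage: mix two hypothetical 3x3 conv layers taken from different pretrained models.
w1 = torch.randn(16, 3, 3, 3)
w2 = torch.randn(16, 3, 3, 3)
layer = AdaptiveAggregatedConv([w1, w2])
out = layer(torch.randn(4, 3, 32, 32))  # (4, 16, 32, 32)
```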
2 code implementations • NeurIPS 2020 • Zhi Kou, Kaichao You, Mingsheng Long, Jianmin Wang
During training, the two branches are stochastically selected to avoid over-dependence on particular sample statistics, resulting in a strong regularization effect, which we interpret as "architecture regularization."
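A simplified sketch of what stochastic branch selection between two sources of normalization statistics can look like (illustrative only, not the paper's implementation): one branch normalizes with the current mini-batch statistics, the other with the moving statistics, and training randomly picks one of them. For brevity this sketch selects a branch per layer rather than per channel.

```python
import torch
import torch.nn as nn

class StochasticNorm2d(nn.Module):
    def __init__(self, num_features, p=0.5, momentum=0.1, eps=1e-5):
        super().__init__()
        self.p, self.momentum, self.eps = p, momentum, eps
        self.weight = nn.Parameter(torch.ones(num_features))
        self.bias = nn.Parameter(torch.zeros(num_features))
        self.register_buffer("running_mean", torch.zeros(num_features))
        self.register_buffer("running_var", torch.ones(num_features))

    def forward(self, x):
        if self.training:
            # Branch 1: current mini-batch statistics.
            batch_mean = x.mean(dim=(0, 2, 3))
            batch_var = x.var(dim=(0, 2, 3), unbiased=False)
            # Update the moving statistics used by branch 2 (and at test time).
            with torch.no_grad():
                self.running_mean.mul_(1 - self.momentum).add_(self.momentum * batch_mean)
                self.running_var.mul_(1 - self.momentum).add_(self.momentum * batch_var)
            # Stochastically pick one branch so the layer does not over-rely on
            # either source of statistics.
            if torch.rand(1).item() < self.p:
                mean, var = batch_mean, batch_var
            else:
                mean, var = self.running_mean, self.running_var
        else:
            mean, var = self.running_mean, self.running_var
        x_hat = (x - mean[None, :, None, None]) / torch.sqrt(var[None, :, None, None] + self.eps)
        return self.weight[None, :, None, None] * x_hat + self.bias[None, :, None, None]
```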
2 code implementations • NeurIPS 2020 • Kaichao You, Zhi Kou, Mingsheng Long, Jianmin Wang
Fine-tuning pre-trained deep neural networks (DNNs) on a target dataset, also known as transfer learning, is widely used in computer vision and NLP.
Ranked #1 on Transfer Learning on COCO70
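For context, a generic illustration of the fine-tuning setup referred to above (not code from the paper, and assuming a recent torchvision): start from an ImageNet-pretrained backbone, replace the classification head for the target task, and continue training. The class count is illustrative (COCO70 has 70 categories), and the data loading is omitted.

```python
import torch
import torch.nn as nn
from torchvision import models

num_target_classes = 70  # illustrative, e.g. a COCO-70-style target task

# Load an ImageNet-pretrained backbone and swap in a new task head.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, num_target_classes)

optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()

def finetune_step(images, labels):
    """One optimization step on a mini-batch from the target dataset."""
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```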
no code implementations • 12 Nov 2020 • Jincheng Zhong, Ximei Wang, Zhi Kou, Jianmin Wang, Mingsheng Long
It is common within the deep learning community to first pre-train a deep neural network on a large-scale dataset and then fine-tune the pre-trained model on a specific downstream task.
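A common refinement of this pre-train/fine-tune recipe, shown here only as an illustration and not as the paper's method: give the pretrained backbone a smaller learning rate than the newly initialized head so the transferred representation is not disrupted early in training. The 10-class head is a hypothetical downstream task.

```python
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, 10)  # hypothetical 10-class downstream task

# Separate the pretrained backbone parameters from the freshly initialized head.
backbone_params = [p for n, p in model.named_parameters() if not n.startswith("fc.")]

optimizer = torch.optim.SGD(
    [
        {"params": backbone_params, "lr": 1e-4},  # pretrained layers: conservative lr
        {"params": model.fc.parameters()},        # new head: uses the default lr below
    ],
    lr=1e-3,
    momentum=0.9,
)
```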