1 code implementation • 22 May 2023 • Yaobo Liang, Quanzhi Zhu, Junhe Zhao, Nan Duan
There are two primary approaches to addressing cross-lingual transfer: multilingual pre-training, which implicitly aligns the hidden representations of various languages, and translate-test, which explicitly translates different languages into an intermediate language, such as English.
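The translate-test strategy described above can be illustrated with a toy sketch: translate the input into a pivot language (English), then apply a classifier trained only on that pivot language. Everything here (the dictionary, the classifier) is a hypothetical stand-in, not the paper's actual models.

```python
# Toy translate-test pipeline (hypothetical stand-ins, not the paper's system).

# A toy Spanish->English word dictionary plays the role of the MT system.
TOY_DICT = {"hola": "hello", "mundo": "world"}

def translate_to_english(tokens):
    """Explicitly map source-language tokens into the pivot language."""
    return [TOY_DICT.get(t, t) for t in tokens]

def english_classifier(tokens):
    """Hypothetical classifier trained only on English: detects a greeting."""
    return "greeting" if "hello" in tokens else "other"

def translate_test(tokens):
    """translate-test: translate first, then run the pivot-language model."""
    return english_classifier(translate_to_english(tokens))

print(translate_test(["hola", "mundo"]))  # -> greeting
```

By contrast, multilingual pre-training would skip the explicit translation step and rely on a single model whose hidden representations are implicitly aligned across languages.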
1 code implementation • 17 Jun 2022 • Zheng He, Zeke Xie, Quanzhi Zhu, Zengchang Qin
People usually believe that network pruning not only reduces the computational cost of deep networks, but also prevents overfitting by decreasing model capacity.
no code implementations • 29 Sep 2021 • Zheng He, Quanzhi Zhu, Zengchang Qin
Network pruning is a widely used technique to reduce the computational cost of over-parameterized neural networks.
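The pruning operation both entries refer to can be sketched with the simplest variant, magnitude pruning: zero out the fraction of weights with the smallest absolute values. This is a minimal NumPy illustration of the general technique, not the specific pruning method of either paper.

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the `sparsity` fraction of weights with smallest magnitude.

    This is unstructured magnitude pruning: the surviving weights keep
    their values, the rest are set exactly to zero.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold.
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
pruned = magnitude_prune(w, 0.5)   # half the 16 weights are zeroed
```

After pruning, the zeroed connections contribute nothing to the forward pass, which is the source of the computational savings; whether the reduced capacity also acts as a regularizer is exactly the belief the first entry examines.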