21 May 2023 • Xiangxiang Gao, Wei Zhu, Jiasheng Gao, Congrui Yin
Computational complexity and the overthinking problem have become bottlenecks for pre-trained language models (PLMs) with millions or even trillions of parameters.