1 code implementation • 23 Jan 2024 • Valerie Lim, Kai Wen Ng, Kenneth Lim
Natural language processing (NLP) models such as BERT provide state-of-the-art word embeddings for downstream NLP tasks.