NAACL 2021 • Subhadarshi Panda, Anjali Agrawal, Jeewon Ha, Benjamin Bloch
Many of these approaches have employed domain-agnostic pre-training tasks to train models that yield general-purpose sentence representations, which can then be fine-tuned for specific downstream tasks.