no code implementations • 26 Jan 2021 • Hyunjin Choi, Judong Kim, Seongho Joe, Seungjai Min, Youngjune Gwon
In zero-shot cross-lingual transfer, a model trained on a supervised NLP task using a corpus in one language is applied directly to another language without any additional training.
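A minimal sketch of the idea, not the paper's code: a multilingual encoder fine-tuned on English NLI is run on a German sentence pair with no further training. The checkpoint name, the Hugging Face transformers API, and the example sentences are all assumptions for illustration.

```python
# Hypothetical sketch of zero-shot cross-lingual transfer (not from the paper).
# A multilingual NLI model fine-tuned on English data is applied to German input.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "joeddav/xlm-roberta-large-xnli"  # assumed multilingual NLI checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

premise = "Der Hund rennt durch den Park."     # German input, unseen language at fine-tuning
hypothesis = "Ein Tier bewegt sich draussen."

inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.softmax(dim=-1))  # entailment / neutral / contradiction probabilities
```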
no code implementations • 26 Jan 2021 • Hyunjin Choi, Judong Kim, Seongho Joe, Youngjune Gwon
The pre-trained BERT and A Lite BERT (ALBERT) models can be fine-tuned to give state-of-the-art results in sentence-pair regression tasks such as semantic textual similarity (STS) and natural language inference (NLI).
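A minimal sketch of such fine-tuning, assuming the Hugging Face transformers library and the public bert-base-uncased checkpoint (neither is specified by the paper): a single regression head (num_labels=1) is trained on sentence pairs against illustrative STS-style similarity scores.

```python
# Sketch only (not the authors' implementation): fine-tuning BERT for
# sentence-pair regression. With num_labels=1 and float labels, the model
# computes an MSE loss internally.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=1, problem_type="regression"
)

pairs = [("A man is playing a guitar.", "Someone plays an instrument."),
         ("A cat sleeps on a couch.", "Stock prices fell sharply today.")]
scores = torch.tensor([4.2, 0.3])  # illustrative gold similarity scores (0-5 scale)

batch = tokenizer([p[0] for p in pairs], [p[1] for p in pairs],
                  padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
out = model(**batch, labels=scores)  # MSE loss for single-label regression
out.loss.backward()                  # one illustrative training step
optimizer.step()
```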