no code implementations • 1 Nov 2023 • Eylon Gueta, Omer Goldman, Reut Tsarfaty
We investigate the hypothesis that incorporating explicit morphological knowledge in the pre-training phase can improve the performance of pre-trained language models (PLMs) for morphologically rich languages (MRLs).
no code implementations • 28 Nov 2022 • Eylon Gueta, Avi Shmidman, Shaltiel Shmidman, Cheyn Shmuel Shmidman, Joshua Guedalia, Moshe Koppel, Dan Bareket, Amit Seker, Reut Tsarfaty
We perform a contrastive analysis of this model against all previous Hebrew PLMs (mBERT, heBERT, AlephBERT) and assess the effects of larger vocabularies on task performance.
Ranked #1 on Named Entity Recognition (NER) on NEMO-Corpus