1 code implementation • 10 Mar 2020 • Vít Novotný, Eniafe Festus Ayetiran, Michal Štefánik, Petr Sojka
In our work, we investigate the individual and joint effects of the two word embedding regularization techniques on the document processing speed and the task performance of the soft cosine measure (SCM) and the word mover's distance (WMD) on text classification.
Ranked #2 on Document Classification on Amazon
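The soft cosine measure studied in this work compares bag-of-words vectors through a term similarity matrix, so documents with no shared terms can still score as similar. A minimal NumPy sketch of the standard SCM formula (the vocabulary and similarity values are hypothetical toy data, and the paper's regularization of the similarity matrix is not shown):

```python
import numpy as np

def soft_cosine_measure(x, y, s):
    """Soft cosine measure between bag-of-words vectors x and y,
    where s[i, j] is the similarity between vocabulary terms i and j."""
    numerator = x @ s @ y
    denominator = np.sqrt((x @ s @ x) * (y @ s @ y))
    return numerator / denominator

# Toy vocabulary ["cat", "feline", "dog"] with a hypothetical similarity matrix.
s = np.array([[1.0, 0.9, 0.1],
              [0.9, 1.0, 0.1],
              [0.1, 0.1, 1.0]])
x = np.array([1.0, 0.0, 0.0])  # document containing only "cat"
y = np.array([0.0, 1.0, 0.0])  # document containing only "feline"
print(soft_cosine_measure(x, y, s))  # 0.9: similar despite no shared terms
```

With the identity matrix as `s`, the measure reduces to the ordinary cosine similarity, which would be 0 here.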
1 code implementation • 19 Apr 2021 • Vít Novotný, Michal Štefánik, Eniafe Festus Ayetiran, Petr Sojka, Radim Řehůřek
In 2018, Mikolov et al. introduced the positional language model, which shares characteristics with attention-based neural machine translation models and which achieved state-of-the-art performance on the intrinsic word analogy task.
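A sketch of the idea behind positional weighting, under the assumption (made here for illustration) that each context position p carries a learned vector d_p that elementwise reweights the context word's vector before the window is averaged, letting the model weight positions differently much like an attention mechanism:

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 4                              # toy embedding dimensionality
positions = [-2, -1, 1, 2]           # context window of size 2 on each side

# Hypothetical learned parameters: positional vectors d_p and
# context word vectors u_p for the word found at each position.
d = {p: rng.random(dim) for p in positions}
u = {p: rng.random(dim) for p in positions}

# Context representation: average of positionally reweighted word vectors.
context = sum(d[p] * u[p] for p in positions) / len(positions)
print(context.shape)  # (4,)
```

Setting every `d[p]` to a vector of ones recovers the plain continuous bag-of-words average.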
no code implementations • RANLP 2021 • Vít Novotný, Eniafe Festus Ayetiran, Dalibor Bačovský, Dávid Lupták, Michal Štefánik, Petr Sojka
In our work, we find the optimal subword sizes on the English, German, Czech, Italian, Spanish, French, Hindi, Turkish, and Russian word analogy tasks.
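The subword sizes in question are the character n-gram length range used by fastText-style subword models. A minimal sketch of how candidate subwords of sizes min_n..max_n are extracted from a word, using fastText's convention of wrapping the word in `<` and `>` boundary markers:

```python
def subwords(word, min_n, max_n):
    """Character n-grams of sizes min_n..max_n from a word wrapped
    in boundary markers, fastText-style."""
    w = f"<{word}>"
    return [w[i:i + n]
            for n in range(min_n, max_n + 1)
            for i in range(len(w) - n + 1)]

print(subwords("cat", 3, 4))  # ['<ca', 'cat', 'at>', '<cat', 'cat>']
```

Searching over (min_n, max_n) per language, as the paper does for English, German, Czech, and the other listed languages, changes which of these n-grams the model can learn vectors for.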
no code implementations • 27 Feb 2021 • Eniafe Festus Ayetiran, Petr Sojka, Vít Novotný
We report evaluation results on 11 benchmark datasets involving WSD and word similarity tasks and show that our method for enhancing distributional semantic structures improves embedding quality over the baselines.
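Word similarity benchmarks of this kind are conventionally scored with Spearman's rank correlation between human similarity judgments and the model's similarity scores. A self-contained sketch of that evaluation (the judgment and score values below are hypothetical):

```python
def ranks(values):
    """1-based average ranks, with ties sharing their mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    out = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        mean_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            out[order[k]] = mean_rank
        i = j + 1
    return out

def spearman(a, b):
    """Spearman's rank correlation: Pearson correlation of the ranks."""
    ra, rb = ranks(a), ranks(b)
    ma, mb = sum(ra) / len(ra), sum(rb) / len(rb)
    num = sum((x - ma) * (y - mb) for x, y in zip(ra, rb))
    den = (sum((x - ma) ** 2 for x in ra)
           * sum((y - mb) ** 2 for y in rb)) ** 0.5
    return num / den

human = [9.0, 7.5, 3.0, 1.0]   # hypothetical human judgments per word pair
model = [0.8, 0.6, 0.3, 0.2]   # hypothetical model cosine similarities
print(spearman(human, model))  # 1.0: the two rankings agree perfectly
```

Because only ranks matter, the score rewards embeddings that order word pairs the way humans do, regardless of the absolute similarity values.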