no code implementations • EMNLP 2020 • Gábor Berend
In this paper, we demonstrate that by utilizing sparse word representations, it becomes possible to surpass the results of more complex task-specific models on the task of fine-grained all-words word sense disambiguation.
Ranked #8 on Word Sense Disambiguation on Supervised:
no code implementations • SEMEVAL 2021 • Gábor Berend
In this paper, we introduce the system with which we participated in the SemEval 2021 shared task on multilingual and cross-lingual word-in-context disambiguation.
1 code implementation • ACL 2021 • Tamás Ficsor, Gábor Berend
The application of transformer-based contextual representations has become a de facto solution for solving complex NLP tasks.
no code implementations • SEMEVAL 2018 • Gábor Berend, Márton Makrai, Péter Földiák
This paper describes 300-sparsians' participation in SemEval-2018 Task 9: Hypernym Discovery, with a system based on sparse coding and a formal concept hierarchy obtained from word embeddings.
Ranked #3 on Hypernym Discovery on Medical domain
no code implementations • SEMEVAL 2017 • Gábor Berend
In this paper we introduce our system that participated in the SemEval 2017 shared task on keyphrase extraction from scientific documents.
no code implementations • TACL 2017 • Gábor Berend
In this paper we propose and carefully evaluate a sequence labeling framework which solely utilizes sparse indicator features derived from dense distributed word representations.
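The idea of sparse indicator features derived from dense embeddings can be illustrated with a minimal sketch. This is not the paper's actual pipeline (which is based on sparse coding); it uses a hypothetical random dictionary and a simple top-k projection as a stand-in, purely to show how dense vectors become discrete features a sequence labeler can consume:

```python
# Minimal illustrative sketch (NOT the paper's method): map a dense word
# vector to sparse indicator features by scoring it against a random
# "dictionary" and keeping only the strongest activations.
import random

random.seed(0)
DIM, ATOMS, TOP_K = 8, 16, 3  # toy sizes, chosen arbitrarily

# Hypothetical dense embedding and random dictionary atoms.
embedding = [random.uniform(-1, 1) for _ in range(DIM)]
dictionary = [[random.uniform(-1, 1) for _ in range(DIM)] for _ in range(ATOMS)]

def sparse_indicators(vec, atoms, k):
    """Return the indices of the k atoms with the largest |dot product|."""
    scores = [sum(v * a for v, a in zip(vec, atom)) for atom in atoms]
    ranked = sorted(range(len(atoms)), key=lambda i: abs(scores[i]), reverse=True)
    return sorted(ranked[:k])

features = sparse_indicators(embedding, dictionary, TOP_K)
print(features)
```

Each word thus receives a handful of discrete features (e.g. "atom_3 is active"), which can feed a feature-based sequence labeler such as a CRF in place of raw dense values.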