Despite the recent success of collective entity linking (EL) methods, these "global" inference approaches may yield sub-optimal results when the "all-mention coherence" assumption breaks, and they often incur high computational cost at inference time due to the complex search space.
However, most neural collective EL methods rely entirely on neural networks to automatically model the semantic dependencies between different EL decisions, and thus lack guidance from external knowledge.
First, we construct a high-recall list of candidate entities for each mention in an unlabeled document.
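A high-recall candidate list like this is typically built from an alias dictionary mapping surface forms to entities. The following is a minimal sketch under that assumption; the `ALIAS_TO_ENTITIES` table is a toy stand-in (real systems derive it from resources such as Wikipedia anchor texts, redirects, and entity labels), and `candidate_entities` is a hypothetical helper, not the method of any specific paper listed here.

```python
# Toy alias dictionary: surface form -> possible entities.
# An assumption for illustration; real dictionaries are mined
# from anchor texts, redirects, and knowledge-base labels.
ALIAS_TO_ENTITIES = {
    "jordan": ["Michael_Jordan", "Jordan_(country)", "Michael_I._Jordan"],
    "michael jordan": ["Michael_Jordan", "Michael_I._Jordan"],
    "apple": ["Apple_Inc.", "Apple_(fruit)"],
}

def candidate_entities(mention: str) -> list[str]:
    """Return a high-recall candidate list for one mention.

    Recall is favored over precision: match the full normalized
    mention, then also each individual token, and deduplicate
    while preserving order.
    """
    norm = mention.lower().strip()
    candidates = list(ALIAS_TO_ENTITIES.get(norm, []))
    for token in norm.split():
        for entity in ALIAS_TO_ENTITIES.get(token, []):
            if entity not in candidates:
                candidates.append(entity)
    return candidates
```

For example, `candidate_entities("Michael Jordan")` returns candidates from both the full mention and the token "jordan", deliberately over-generating so the correct entity is rarely missed before disambiguation.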
We propose a simple Named Entity Linking system that can be trained from Wikidata only.
Ever-expanding volumes of biomedical text require automated semantic annotation techniques so that they can be curated and put to best use.
To address this problem, we investigate zero-shot cross-lingual entity linking, in which we assume no bilingual lexical resources are available for the low-resource source language.