Learning representations of entity mentions is a core component of modern entity linking systems, used for both candidate generation and linking prediction.
Coreference decisions among event mentions and among co-occurring entity mentions are highly interdependent, thus motivating joint inference.
When similarity is measured by the dot product between dual-encoder vectors or by $\ell_2$ distance, many scalable and efficient search methods already exist.
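To make the point concrete, here is a minimal brute-force maximum-inner-product search over dual-encoder vectors. This is only an illustrative sketch in pure Python; production systems would use an approximate-nearest-neighbor index (e.g. FAISS or ScaNN), and the function names here are not from any particular library.

```python
def dot(u, v):
    """Dot product of two equal-length vectors."""
    return sum(a * b for a, b in zip(u, v))

def top_k(query, entity_vecs, k=2):
    """Return indices of the k entity vectors with the largest
    dot product against the query vector (exact, brute force)."""
    ranked = sorted(range(len(entity_vecs)),
                    key=lambda i: dot(query, entity_vecs[i]),
                    reverse=True)
    return ranked[:k]

# Toy example: three entity embeddings, one mention embedding.
entities = [[1.0, 0.0], [0.7, 0.7], [0.0, 1.0]]
mention = [0.9, 0.1]
print(top_k(mention, entities))  # indices of the two closest entities
```

Approximate indexes trade a small amount of recall for sublinear query time, which is what makes dual-encoder retrieval scale to millions of entities.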
Previous work has shown promising results in performing entity linking by measuring not only the affinities between mentions and entities but also those amongst mentions.
Sampling is an established technique to scale graph neural networks to large graphs.
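The core idea behind sampling-based scaling can be sketched as uniform neighbor sampling in the style of GraphSAGE minibatch training: for each seed node, keep only a bounded number of neighbors per layer. This is an assumed, simplified illustration, not the API of any specific GNN library.

```python
import random

def sample_neighbors(adj, seeds, fanout, rng=None):
    """For each seed node, keep at most `fanout` uniformly sampled
    neighbors, bounding per-node work regardless of true degree."""
    rng = rng or random.Random(0)
    sampled = {}
    for node in seeds:
        nbrs = adj.get(node, [])
        if len(nbrs) <= fanout:
            sampled[node] = list(nbrs)   # small neighborhood: keep all
        else:
            sampled[node] = rng.sample(nbrs, fanout)
    return sampled

# Toy adjacency list; node 0 has more neighbors than the fanout allows.
adj = {0: [1, 2, 3, 4], 1: [0], 2: [0, 3]}
print(sample_neighbors(adj, seeds=[0, 2], fanout=2))
```

Repeating this per layer yields a small computation subgraph per minibatch, so memory no longer grows with the full graph's size.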
Tools to explore scientific literature are essential for scientists, especially in biomedicine, where about a million new papers are published every year.
To address bias in machine learning, data scientists need tools that help them understand the trade-offs between model quality and fairness in their specific data domains.
In this paper, we introduce a model in which linking decisions are made not merely by linking each mention directly to a knowledge base entity, but also by grouping multiple mentions together via clustering and making their linking predictions jointly.
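The sentence above can be illustrated with a toy sketch: cluster similar mentions, then make one linking decision per cluster so every mention in a cluster shares the predicted entity. The clustering predicate and scoring function below are hypothetical stand-ins, not the paper's actual model.

```python
def cluster_mentions(mentions, same_cluster):
    """Greedy single-link clustering using a pairwise predicate."""
    clusters = []
    for m in mentions:
        for c in clusters:
            if any(same_cluster(m, other) for other in c):
                c.append(m)
                break
        else:
            clusters.append([m])
    return clusters

def link_jointly(mentions, entities, score, same_cluster):
    """Link each cluster to the entity with the best summed score over
    the cluster's mentions; all members inherit that decision."""
    links = {}
    for cluster in cluster_mentions(mentions, same_cluster):
        best = max(entities, key=lambda e: sum(score(m, e) for m in cluster))
        for m in cluster:
            links[m] = best
    return links

# Toy run: character-overlap score, first-letter clustering (both crude).
mentions = ["NYC", "New York City", "Paris"]
entities = ["New_York_City", "Paris"]
score = lambda m, e: len(set(m.lower()) & set(e.lower()))
same = lambda a, b: a[0] == b[0]
print(link_jointly(mentions, entities, score, same))
```

The benefit is that a hard mention ("NYC") can ride on the evidence of an easier coreferent mention ("New York City") once they land in the same cluster.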
Archived data from the US network of weather radars hold detailed information about bird migration over the last 25 years, including very high-resolution partial measurements of velocity.