Entity linking -- the task of identifying references in free text to relevant knowledge base representations -- has typically focused on a single language.
We propose an approach to concept linking that leverages recent work on contextualized neural models, such as ELMo (Peters et al., 2018), which produce token representations that integrate the surrounding context of both the mention and the concept name. We find that the multilingual ability of BERT yields robust performance in both monolingual and multilingual settings.
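The core ranking step such an approach implies can be sketched as follows: embed the mention (with its surrounding context) and each candidate concept name, then rank candidates by cosine similarity. This is a minimal illustration, not the paper's implementation; the `embed` function below is a toy character-trigram stand-in for a contextualized encoder such as BERT or ELMo, and all names here are assumptions.

```python
import math
import zlib

def embed(text: str, dim: int = 64) -> list[float]:
    # Toy stand-in for a contextualized encoder (a BERT/ELMo call would
    # go here): hash character trigrams into a fixed-size unit vector.
    vec = [0.0] * dim
    for i in range(len(text) - 2):
        vec[zlib.crc32(text[i:i + 3].encode()) % dim] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a: list[float], b: list[float]) -> float:
    # Both vectors are unit-normalized, so the dot product is the cosine.
    return sum(x * y for x, y in zip(a, b))

def link(mention_in_context: str, candidates: list[str]) -> str:
    # Rank candidate concept names by similarity to the mention
    # representation and return the best match.
    m = embed(mention_in_context)
    return max(candidates, key=lambda c: cosine(m, embed(c)))
```

For example, `link("history of tobacco smoking", ["smoking status", "myocardial infarction"])` selects `"smoking status"`, since the mention and that concept name share many trigrams; with a real contextualized encoder, the same ranking scheme would instead exploit semantic rather than surface overlap.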
Clinical notes contain an extensive record of a patient's health status, such as smoking status or the presence of heart conditions.
Linking mentions of medical concepts in a clinical note to concepts in an ontology enables a variety of tasks that rely on understanding the content of a medical record, such as identifying patient populations and supporting clinical decision making.
The problem of accurately predicting relative reading difficulty across a set of sentences arises in a number of important natural language applications, such as finding and curating effective usage examples for intelligent language tutoring systems.