15 papers with code • 2 benchmarks • 0 datasets
To address the dearth of annotated training data for medical entity linking, we present WikiMed and PubMedDS, two large-scale medical entity linking datasets, and demonstrate that pre-training MedType on these datasets further improves entity linking performance.
We provide comprehensive experimental evaluation of our proposal, along with alternative design choices, on a standard Python dataset, as well as on a Python corpus internal to Facebook.
Ranked #1 on Type prediction on Py150
Automatically annotating column types with knowledge base (KB) concepts is a critical task to gain a basic understanding of web tables.
The kNN attention pooling layer is a generalization of the Graph Attention Model (GAM) and can be applied not only to graphs but to any set of objects, whether or not a graph is given.
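A minimal sketch of what such a kNN attention pooling step might look like: each object attends over its k nearest neighbors in feature space and is pooled into an attention-weighted mean, so no explicit graph is required. The function name, the dot-product attention scoring, and the Euclidean neighbor metric are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def knn_attention_pool(X, k=3):
    """Sketch of kNN attention pooling (assumed form, not the paper's exact layer).

    Each row of X attends over its k nearest neighbors (Euclidean distance)
    and is replaced by the attention-weighted mean of their features.
    """
    n = X.shape[0]
    # Pairwise squared Euclidean distances between all objects.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    out = np.empty_like(X)
    for i in range(n):
        # The k nearest neighbors of object i, excluding itself.
        idx = np.argsort(d2[i])
        idx = idx[idx != i][:k]
        # Dot-product attention scores of object i against its neighbors,
        # turned into weights with a numerically stable softmax.
        scores = X[idx] @ X[i]
        w = np.exp(scores - scores.max())
        w /= w.sum()
        # Pool: attention-weighted mean of the neighbors' features.
        out[i] = w @ X[idx]
    return out
```

Because neighbors are recomputed from the feature matrix itself, the same operation works on an arbitrary set of objects, which is the sense in which it generalizes attention over a fixed graph.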
In breast cancer, for instance, our model identified well-known markers, such as GATA3 and ESR1.
We design a hierarchical neural network model that learns to place types of the same kind close together and dissimilar types far apart in a high-dimensional space, which results in clusters of types.
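One common way to train such a discriminative embedding is a triplet margin objective, sketched below under the assumption that "same kind" pairs act as anchor/positive and dissimilar types as negatives; the loss form and margin value are illustrative, not the paper's stated objective.

```python
import numpy as np

def triplet_margin_loss(anchor, positive, negative, margin=1.0):
    # Assumed objective sketch: pull a type embedding (anchor) toward a
    # type of the same kind (positive) and push it away from a dissimilar
    # type (negative) until the gap exceeds the margin.
    d_pos = np.linalg.norm(anchor - positive)
    d_neg = np.linalg.norm(anchor - negative)
    return max(0.0, d_pos - d_neg + margin)
```

Minimizing this kind of loss over many triplets is what makes embeddings of related types collapse into clusters while dissimilar types stay separated.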