There has been significant progress in the field of Extractive Question Answering (EQA) in recent years.
Despite the advances in digital healthcare systems offering curated structured knowledge, much of the critical information still lies in large volumes of unlabeled and unstructured clinical texts.
Relation extraction in the biomedical domain is challenging due to the scarcity of labeled data and the high cost of annotation, which requires domain experts.
Temporal knowledge graph completion (TKGC) has become a popular approach for reasoning over event and temporal knowledge graphs, aiming to complete the graph with accurate but missing facts.
In this work, we present T2NER, a Transformer-based Transfer Learning framework for Named Entity Recognition, implemented in PyTorch on top of deep transformer models.
Bilinear models, while expressive, are prone to overfitting and lead to a quadratic growth of parameters in the number of relations.
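To make the parameter-growth claim concrete, here is a minimal sketch (not from the source) of a RESCAL-style bilinear scorer, in which every relation carries its own dense d x d matrix; the dimension `d` and relation count are illustrative assumptions.

```python
import numpy as np

def bilinear_score(h, W_r, t):
    # Bilinear score h^T W_r t for head/tail embeddings h, t
    # and a relation-specific matrix W_r.
    return h @ W_r @ t

d = 200              # embedding dimension (illustrative)
num_relations = 50   # relation vocabulary size (illustrative)

rng = np.random.default_rng(0)
h = rng.standard_normal(d)
t = rng.standard_normal(d)
W = rng.standard_normal((num_relations, d, d))  # one d x d matrix per relation

score = bilinear_score(h, W[0], t)

# Relation parameters total num_relations * d * d, i.e. a full
# d x d block is added for every new relation in the vocabulary.
relation_params = W.size  # 50 * 200 * 200 = 2,000,000
```

This is why factorized or diagonal variants (e.g., constraining each relation matrix) are commonly used to curb overfitting.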
Contextualized word embeddings provide better initialization for neural networks that deal with various natural language understanding (NLU) tasks, including Question Answering (QA) and, more recently, Question Generation (QG).
Fact triples are a common form of structured knowledge used within the biomedical domain.
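As a small illustration (not from the source), fact triples are typically represented as (head entity, relation, tail entity) tuples; the biomedical entity and relation names below are invented examples.

```python
# Fact triples as (head, relation, tail) tuples; names are illustrative only.
triples = [
    ("aspirin", "treats", "headache"),
    ("BRCA1", "associated_with", "breast cancer"),
]

# A triple store can be indexed by relation for simple lookups.
by_relation = {}
for head, rel, tail in triples:
    by_relation.setdefault(rel, []).append((head, tail))
```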