Medical Named Entity Recognition
14 papers with code • 2 benchmarks • 6 datasets
Latest papers
MC-DRE: Multi-Aspect Cross Integration for Drug Event/Entity Extraction
Extracting meaningful drug-related information chunks, such as adverse drug events (ADE), is crucial for preventing morbidity and saving many lives.
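Drug/ADE extraction of this kind is often bootstrapped with simple lexicon lookup before neural models are applied. A minimal sketch, assuming a toy gazetteer (real systems draw on curated vocabularies such as RxNorm or MedDRA):

```python
import re

# Toy gazetteers for illustration only; not real lexicons.
DRUGS = {"aspirin", "ibuprofen", "warfarin"}
ADVERSE_EVENTS = {"nausea", "bleeding", "rash"}

def extract_entities(text):
    """Tag known drug and adverse-event mentions in free text."""
    entities = []
    for match in re.finditer(r"[A-Za-z]+", text):
        token = match.group().lower()
        if token in DRUGS:
            entities.append((match.group(), "DRUG"))
        elif token in ADVERSE_EVENTS:
            entities.append((match.group(), "ADE"))
    return entities

print(extract_entities("Patient on warfarin reported bleeding and rash."))
# [('warfarin', 'DRUG'), ('bleeding', 'ADE'), ('rash', 'ADE')]
```

Lookup-based tagging misses unseen surface forms and misspellings, which is precisely what the learned extractors listed here address.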
ViHealthBERT: Pre-trained Language Models for Vietnamese in Health Text Mining
We introduce ViHealthBERT, the first domain-specific pre-trained language model for Vietnamese healthcare.
An End-to-End Progressive Multi-Task Learning Framework for Medical Named Entity Recognition and Normalization
Medical named entity recognition (NER) and normalization (NEN) are fundamental for constructing knowledge graphs and building QA systems.
BioELECTRA: Pretrained Biomedical Text Encoder using Discriminators
We introduce BioELECTRA, a biomedical domain-specific language encoder model that adapts ELECTRA to the biomedical domain.
ELECTRAMed: a new pre-trained language representation model for biomedical NLP
The overwhelming amount of biomedical scientific texts calls for the development of effective language models able to tackle a wide range of biomedical natural language processing (NLP) tasks.
Biomedical Named Entity Recognition at Scale
Named entity recognition (NER) is a widely applicable natural language processing task and building block of question answering, topic modeling, information retrieval, etc.
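Most NER models, including the biomedical ones listed here, frame the task as per-token BIO tagging, so downstream use requires decoding tag sequences back into entity spans. A minimal decoder sketch (the tag names are illustrative):

```python
def bio_to_spans(tags):
    """Decode a BIO tag sequence into (label, start, end) spans, end exclusive."""
    spans = []
    start = label = None
    for i, tag in enumerate(tags):
        prefix, _, name = tag.partition("-")
        # Close the open span on "O", on a new "B-", or on a label change.
        if start is not None and (prefix in ("B", "O") or name != label):
            spans.append((label, start, i))
            start = label = None
        # Open a span on "B-", or tolerate a stray "I-" with no preceding "B-".
        if prefix == "B" or (prefix == "I" and start is None):
            start, label = i, name
    if start is not None:
        spans.append((label, start, len(tags)))
    return spans

tags = ["O", "B-DRUG", "I-DRUG", "O", "B-ADE"]
print(bio_to_spans(tags))
# [('DRUG', 1, 3), ('ADE', 4, 5)]
```

The same decoding logic applies regardless of which encoder (BioELECTRA, SciBERT, etc.) produces the per-token tags.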
Med7: a transferable clinical natural language processing model for electronic health records
In this work, we introduce a named-entity recognition model for clinical natural language processing.
BioFLAIR: Pretrained Pooled Contextualized Embeddings for Biomedical Sequence Labeling Tasks
We investigate the effects of a small amount of additional pretraining on PubMed content, and of combining FLAIR and ELMo models.
SciBERT: A Pretrained Language Model for Scientific Text
Obtaining large-scale annotated data for NLP tasks in the scientific domain is challenging and expensive.