Medical Named Entity Recognition

12 papers with code • 1 benchmark • 5 datasets

Medical Named Entity Recognition is the task of identifying and classifying mentions of medical concepts (such as diseases, symptoms, drugs, and anatomical terms) in clinical and biomedical text.

Most implemented papers

BioBERT: a pre-trained biomedical language representation model for biomedical text mining

dmis-lab/biobert 25 Jan 2019

Biomedical text mining is becoming increasingly important as the number of biomedical documents rapidly grows.

Few-shot Learning for Named Entity Recognition in Medical Text

mxhofer/Named-Entity-Recognition-BidirectionalLSTM-CNN-CoNLL 13 Nov 2018

Deep neural network models have recently achieved state-of-the-art performance gains in a variety of natural language processing (NLP) tasks (Young, Hazarika, Poria, & Cambria, 2017).

SciBERT: A Pretrained Language Model for Scientific Text

allenai/scibert IJCNLP 2019

Obtaining large-scale annotated data for NLP tasks in the scientific domain is challenging and expensive.

Embedding Transfer for Low-Resource Medical Named Entity Recognition: A Case Study on Patient Mobility

drgriffis/NeuralVecmap WS 2018

Functioning is gaining recognition as an important indicator of global health, but remains under-studied in medical natural language processing research.

A Neural Multi-Task Learning Framework to Jointly Model Medical Named Entity Recognition and Normalization

SendongZhao/Multi-Task-Learning-for-MER-and-MEN 14 Dec 2018

State-of-the-art studies have demonstrated the superiority of joint modelling over pipeline implementation for medical named entity recognition and normalization due to the mutual benefits between the two processes.

BioFLAIR: Pretrained Pooled Contextualized Embeddings for Biomedical Sequence Labeling Tasks

shreyashub/BioFLAIR 13 Aug 2019

We also investigate the effects of a small amount of additional pretraining on PubMed content, and of combining FLAIR and ELMo models.

Biomedical Named Entity Recognition at Scale

JohnSnowLabs/spark-nlp-workshop 12 Nov 2020

Named entity recognition (NER) is a widely applicable natural language processing task and building block of question answering, topic modeling, information retrieval, etc.
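Most of the sequence-labeling models listed on this page emit token-level BIO tags; decoding those tags into entity spans takes only a few lines of plain Python. The sketch below is illustrative (the tag set and example sentence are made up, not drawn from any of the datasets above):

```python
def decode_bio(tokens, tags):
    """Group BIO-tagged tokens into (entity_text, label) spans."""
    entities, current, label = [], [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            # A B- tag always starts a new span, closing any open one.
            if current:
                entities.append((" ".join(current), label))
            current, label = [token], tag[2:]
        elif tag.startswith("I-") and current and tag[2:] == label:
            # An I- tag continues the open span only if the label matches.
            current.append(token)
        else:
            # "O" (or an inconsistent I- tag) closes the open span.
            if current:
                entities.append((" ".join(current), label))
            current, label = [], None
    if current:
        entities.append((" ".join(current), label))
    return entities

tokens = ["Patient", "denies", "chest", "pain", "after", "taking", "aspirin"]
tags   = ["O", "O", "B-Symptom", "I-Symptom", "O", "O", "B-Drug"]
print(decode_bio(tokens, tags))
# → [('chest pain', 'Symptom'), ('aspirin', 'Drug')]
```

The same decoding step applies regardless of whether the tags come from a BiLSTM-CNN, BioBERT, or an ELECTRA-style encoder, which is why the papers above can share benchmarks despite very different architectures.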

ELECTRAMed: a new pre-trained language representation model for biomedical NLP

gmpoli/electramed 19 Apr 2021

The overwhelming amount of biomedical scientific texts calls for the development of effective language models able to tackle a wide range of biomedical natural language processing (NLP) tasks.

BioELECTRA: Pretrained Biomedical Text Encoder using Discriminators

kamalkraj/BioELECTRA ACL Anthology 2021

We introduce BioELECTRA, a biomedical domain-specific language encoder model that adapts ELECTRA for the Biomedical domain.