Drug–drug Interaction Extraction

7 papers with code • 1 benchmark • 1 dataset

Automatic extraction of drug-drug interaction (DDI) information from the biomedical literature.

(Image credit: Using Drug Descriptions and Molecular Structures for Drug-Drug Interaction Extraction from Literature)

Most implemented papers

CharacterBERT: Reconciling ELMo and BERT for Word-Level Open-Vocabulary Representations From Characters

helboukkouri/character-bert COLING 2020

Due to the compelling improvements brought by BERT, many recent representation models adopted the Transformer architecture as their main building block, consequently inheriting the wordpiece tokenization system despite it not being intrinsically linked to the notion of Transformers.

Drug-Drug Interaction Extraction from Biomedical Text Using Long Short Term Memory Network

sunilitggu/DDI-extraction-through-LSTM 28 Jan 2017

The two models, AB-LSTM and Joint AB-LSTM, also apply attentive pooling to the output of the Bi-LSTM layer to assign weights to features.
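Attentive pooling replaces max- or mean-pooling over the Bi-LSTM's hidden states with a learned weighted sum. A minimal NumPy sketch of the idea is below; the `tanh` scoring function, shapes, and variable names are illustrative assumptions, not the exact formulation from the AB-LSTM paper.

```python
import numpy as np

def attentive_pooling(H, w):
    """Pool a sequence of Bi-LSTM hidden states with attention weights.

    H: (T, d) hidden states for T time steps; w: (d,) learned attention vector.
    Returns a (d,) sentence representation. Scoring via tanh is an
    illustrative choice, not necessarily the paper's exact parameterization.
    """
    scores = np.tanh(H) @ w              # (T,) one relevance score per step
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()                 # softmax -> weights summing to 1
    return alpha @ H                     # (d,) attention-weighted sum

rng = np.random.default_rng(0)
H = rng.normal(size=(5, 8))              # 5 time steps, hidden size 8
w = rng.normal(size=8)
pooled = attentive_pooling(H, w)
print(pooled.shape)                      # (8,)
```

Unlike max-pooling, the attention weights let the classifier emphasize the time steps (e.g. tokens near the drug mentions) most relevant to the interaction decision.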

Using Drug Descriptions and Molecular Structures for Drug-Drug Interaction Extraction from Literature

tticoin/DESC_MOL-DDIE 24 Oct 2020

Specifically, we focus on drug description and molecular structure information as the drug database information.

EGFI: Drug-Drug Interaction Extraction and Generation with Fusion of Enriched Entity and Sentence Information

Layne-Huang/EGFI 25 Jan 2021

To address such a problem, we propose EGFI for extracting and consolidating drug interactions from large-scale medical literature text data.

ELECTRAMed: a new pre-trained language representation model for biomedical NLP

gmpoli/electramed 19 Apr 2021

The overwhelming amount of biomedical scientific texts calls for the development of effective language models able to tackle a wide range of biomedical natural language processing (NLP) tasks.

SciFive: a text-to-text transformer model for biomedical literature

justinphan3110/SciFive 28 May 2021

In this report, we introduce SciFive, a domain-specific T5 model that has been pre-trained on large biomedical corpora.