1 code implementation • 27 Jan 2023 • Yikuan Li, Ramsey M. Wehbe, Faraz S. Ahmad, Hanyin Wang, Yuan Luo
Objective: Clinical knowledge-enriched transformer models (e.g., ClinicalBERT) have achieved state-of-the-art results on clinical NLP (natural language processing) tasks.
To overcome this, long-sequence transformer models such as Longformer and BigBird were proposed; their sparse attention mechanisms reduce memory usage from quadratic to linear in the sequence length.
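The quadratic-to-linear idea can be sketched with a simplified sliding-window attention, the core pattern behind Longformer-style sparse attention. This is a hypothetical NumPy illustration, not the actual Longformer implementation (which also adds global attention tokens and fused kernels):

```python
import numpy as np

def dense_attention(q, k, v):
    # Full self-attention: materializes an n x n score matrix,
    # so memory grows quadratically with sequence length n.
    scores = q @ k.T / np.sqrt(q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

def sliding_window_attention(q, k, v, window=2):
    # Each token attends only to its (2 * window + 1) neighbors,
    # so memory grows as O(n * window) -- linear in sequence length.
    n, d = q.shape
    out = np.zeros_like(v)
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        scores = q[i] @ k[lo:hi].T / np.sqrt(d)
        w = np.exp(scores - scores.max())
        w /= w.sum()
        out[i] = w @ v[lo:hi]
    return out

rng = np.random.default_rng(0)
n, d = 16, 8
q, k, v = rng.normal(size=(3, n, d))
dense = dense_attention(q, k, v)
sparse = sliding_window_attention(q, k, v, window=2)
print(dense.shape, sparse.shape)  # both (16, 8)
```

Note that when the window spans the whole sequence, the sliding-window result coincides with dense attention; shrinking the window trades a small approximation for the linear memory footprint.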