Search Results for author: Oladimeji Farri

Found 11 papers, 1 paper with code

shs-nlp at RadSum23: Domain-Adaptive Pre-training of Instruction-tuned LLMs for Radiology Report Impression Generation

no code implementations 5 Jun 2023 Sanjeev Kumar Karn, Rikhiya Ghosh, Kusuma P, Oladimeji Farri

Instruction-tuned generative large language models (LLMs) like ChatGPT and Bloomz possess excellent generalization abilities, but they face limitations in understanding radiology reports, particularly in the task of generating the IMPRESSIONS section from the FINDINGS section.

Assertion Detection in Multi-Label Clinical Text using Scope Localization

no code implementations 19 May 2020 Rajeev Bhatt Ambati, Ahmed Ada Hanifi, Ramya Vunikili, Puneet Sharma, Oladimeji Farri

Multi-label sentences (text) in the clinical domain result from the rich description of scenarios during patient care.

Negation · Sentence

DR-BiLSTM: Dependent Reading Bidirectional LSTM for Natural Language Inference

no code implementations NAACL 2018 Reza Ghaeini, Sadid A. Hasan, Vivek Datla, Joey Liu, Kathy Lee, Ashequl Qadir, Yuan Ling, Aaditya Prakash, Xiaoli Z. Fern, Oladimeji Farri

Instead, we propose a novel dependent reading bidirectional LSTM network (DR-BiLSTM) to efficiently model the relationship between a premise and a hypothesis during encoding and inference.
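The "dependent reading" idea can be illustrated with a minimal sketch: instead of encoding premise and hypothesis independently, each sentence is read by an LSTM whose initial state is the final state produced by reading the other sentence. The toy single-direction LSTM cell, random parameters, and dimensions below are illustrative assumptions only, not the paper's implementation (which uses bidirectional LSTMs plus a trained attention and inference stage).

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # toy embedding / hidden size

# Toy LSTM cell parameters, randomly initialised for illustration.
W = rng.standard_normal((4 * d, 2 * d)) * 0.1
b = np.zeros(4 * d)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_read(xs, h0, c0):
    """Run a single-direction LSTM over xs, starting from state (h0, c0)."""
    h, c = h0, c0
    states = []
    for x in xs:
        z = W @ np.concatenate([x, h]) + b
        i, f, o, g = np.split(z, 4)      # input, forget, output gates + candidate
        c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
        h = sigmoid(o) * np.tanh(c)
        states.append(h)
    return np.stack(states), h, c

# Toy premise/hypothesis as sequences of random "word" vectors.
premise = [rng.standard_normal(d) for _ in range(5)]
hypothesis = [rng.standard_normal(d) for _ in range(4)]
zero = np.zeros(d)

# Dependent reading: first read the hypothesis from a zero state, then read
# the premise *starting from the hypothesis's final state* -- and symmetrically
# for the hypothesis -- so each encoding is conditioned on the other sentence.
_, h_hyp, c_hyp = lstm_read(hypothesis, zero, zero)
premise_states, _, _ = lstm_read(premise, h_hyp, c_hyp)

_, h_prem, c_prem = lstm_read(premise, zero, zero)
hypothesis_states, _, _ = lstm_read(hypothesis, h_prem, c_prem)

print(premise_states.shape, hypothesis_states.shape)  # (5, 8) (4, 8)
```

The conditioned per-token states would then feed the model's attention and inference layers; the point of the sketch is only the state hand-off between the two readers.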

Natural Language Inference

Learning to Diagnose: Assimilating Clinical Narratives using Deep Reinforcement Learning

no code implementations IJCNLP 2017 Yuan Ling, Sadid A. Hasan, Vivek Datla, Ashequl Qadir, Kathy Lee, Joey Liu, Oladimeji Farri

Clinical diagnosis is a critical and non-trivial aspect of patient care which often requires significant medical research and investigation based on an underlying clinical scenario.

Decision Making · reinforcement-learning +2

Condensed Memory Networks for Clinical Diagnostic Inferencing

no code implementations 6 Dec 2016 Aaditya Prakash, Siyuan Zhao, Sadid A. Hasan, Vivek Datla, Kathy Lee, Ashequl Qadir, Joey Liu, Oladimeji Farri

We introduce condensed memory neural networks (C-MemNNs), a novel model with iterative condensation of memory representations that preserves the hierarchy of features in the memory.
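The condensation idea can be loosely illustrated as repeatedly shrinking the memory so that earlier, fine-grained representations survive at progressively coarser scales rather than being discarded. The pairwise-averaging rule below is a stand-in assumption for exposition only; the actual C-MemNN learns its condensation jointly with attention over memory, which this sketch does not attempt to reproduce.

```python
import numpy as np

rng = np.random.default_rng(1)

def condense(memory):
    """Halve the number of memory slots by averaging adjacent pairs.

    This is only a stand-in for the learned condensation step: it shows
    how each hop can keep a coarser summary of earlier memories, so a
    hierarchy of features is preserved across hops.
    """
    n, d = memory.shape
    return memory.reshape(n // 2, 2, d).mean(axis=1)

memory = rng.standard_normal((16, 4))  # 16 memory slots, 4-dim features
hierarchy = [memory]
while hierarchy[-1].shape[0] > 1:      # condense until one slot remains
    hierarchy.append(condense(hierarchy[-1]))

print([m.shape[0] for m in hierarchy])  # [16, 8, 4, 2, 1]
```

Each level of `hierarchy` is a condensed view of the one before it; repeated pairwise averaging ends at the global mean of the original slots, the coarsest summary.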

Neural Clinical Paraphrase Generation with Attention

no code implementations WS 2016 Sadid A. Hasan, Bo Liu, Joey Liu, Ashequl Qadir, Kathy Lee, Vivek Datla, Aaditya Prakash, Oladimeji Farri

Paraphrase generation is important in various applications such as search, summarization, and question answering due to its ability to generate textual alternatives while keeping the overall meaning intact.

Document Summarization · Information Retrieval +5
