Search Results for author: Lucila Ohno-Machado

Found 3 papers, 1 paper with code

Me LLaMA: Foundation Large Language Models for Medical Applications

1 code implementation • 20 Feb 2024 • Qianqian Xie, Qingyu Chen, Aokun Chen, Cheng Peng, Yan Hu, Fongci Lin, Xueqing Peng, Jimin Huang, Jeffrey Zhang, Vipina Keloth, Xinyu Zhou, Huan He, Lucila Ohno-Machado, Yonghui Wu, Hua Xu, Jiang Bian

In response to this challenge, this study introduces Me-LLaMA, a novel medical LLM family that includes foundation models - Me-LLaMA 13/70B, along with their chat-enhanced versions - Me-LLaMA 13/70B-chat, developed through continual pre-training and instruction tuning of LLaMA2 using large medical datasets.

Few-Shot Learning
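
The listing notes a code implementation for Me-LLaMA, whose chat variants are described as instruction-tuned LLaMA2 models. As a rough illustration of how such a chat checkpoint could be prompted, here is a minimal sketch using Hugging Face transformers; the model identifier is a placeholder, not an official hub ID, and the actual checkpoints are distributed under their own access terms.

```python
# Minimal sketch: prompting a Me-LLaMA chat checkpoint with Hugging Face
# transformers. MODEL_ID is a placeholder, not an official hub ID.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "local-or-hub-path/Me-LLaMA-13B-chat"  # placeholder path

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.float16,  # half precision to fit a 13B model on one GPU
    device_map="auto",
)

prompt = "Summarize the key findings in this discharge note:\n..."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256, do_sample=False)
# Print only the newly generated tokens, not the echoed prompt.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```
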

The Impact of Automatic Pre-annotation in Clinical Note Data Element Extraction - the CLEAN Tool

no code implementations • 11 Aug 2018 • Tsung-Ting Kuo, Jina Huh, Ji-Hoon Kim, Robert El-Kareh, Siddharth Singh, Stephanie Feudjio Feupe, Vincent Kuri, Gordon Lin, Michele E. Day, Lucila Ohno-Machado, Chun-Nan Hsu

Our study introduces CLEAN (CLinical note rEview and ANnotation), a pre-annotation-based cNLP annotation system to improve clinical note annotation of data elements, and comprehensively compares CLEAN with the widely-used annotation system Brat Rapid Annotation Tool (BRAT).

Open-Ended Question Answering
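
To make the pre-annotation idea concrete, the sketch below shows one common approach (dictionary matching emitted in BRAT standoff format) that a reviewer could then correct rather than annotating from scratch. This is only an illustration under assumed inputs, not the CLEAN implementation; the lexicon, labels, and function name are invented for the example.

```python
# Illustrative sketch (not the CLEAN implementation): dictionary-based
# pre-annotation of data elements in a clinical note, written out in BRAT
# standoff (.ann) format so annotators start from machine suggestions.
import re

# Toy lexicon of data elements; a real tool's element list would be far richer.
LEXICON = {
    "hemoglobin": "LabTest",
    "metformin": "Medication",
    "shortness of breath": "Symptom",
}

def pre_annotate(note_text: str) -> str:
    """Return BRAT .ann lines of the form 'T<i>\t<Type> <start> <end>\t<text>'."""
    lines = []
    tid = 1
    for term, label in LEXICON.items():
        for m in re.finditer(re.escape(term), note_text, flags=re.IGNORECASE):
            lines.append(f"T{tid}\t{label} {m.start()} {m.end()}\t{m.group(0)}")
            tid += 1
    return "\n".join(lines)

note = "Patient reports shortness of breath; hemoglobin 10.2; continue metformin."
print(pre_annotate(note))  # suggestions for a reviewer to accept, edit, or delete
```
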

Natural Language Processing in Biomedicine: A Unified System Architecture Overview

no code implementations • 3 Jan 2014 • Son Doan, Mike Conway, Tu Minh Phuong, Lucila Ohno-Machado

In modern electronic medical records (EMR), much of the clinically important data - signs and symptoms, symptom severity, disease status, etc. - is recorded in free-text clinical notes rather than structured fields.
