no code implementations • 17 Oct 2023 • Hillary Ngai, Rohan Agrawal, Neeraj Gaur, Ronny Huang, Parisa Haghani, Pedro Moreno Mengibar
Adapters are an efficient, composable alternative to full fine-tuning of pre-trained models and help scale the deployment of large ASR models to many tasks.
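A bottleneck adapter of the kind this abstract refers to can be sketched minimally: a small down-projection, a nonlinearity, an up-projection, and a residual connection, with only the two small matrices trained while the pre-trained model stays frozen. The sizes and weights below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: d_model is the frozen model's hidden width,
# d_bottleneck is the adapter's much smaller inner width.
d_model, d_bottleneck = 64, 8

# Only these small matrices are trained; the pre-trained weights stay frozen.
W_down = rng.normal(scale=0.02, size=(d_model, d_bottleneck))
W_up = rng.normal(scale=0.02, size=(d_bottleneck, d_model))

def adapter(h):
    """Bottleneck adapter: down-project, ReLU, up-project, residual add."""
    return h + np.maximum(h @ W_down, 0.0) @ W_up

h = rng.normal(size=(4, d_model))   # a batch of hidden states
out = adapter(h)
print(out.shape)                    # (4, 64) — shape is preserved
print(2 * d_model * d_bottleneck)   # 1024 trainable adapter parameters
```

The parameter count (2 · d_model · d_bottleneck per adapter) is what makes adapters cheap to store per task compared to a full copy of the model, which is the scaling argument the abstract makes.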
1 code implementation • BioNLP (ACL) 2022 • Hillary Ngai, Frank Rudzicz
We introduce Doctor XAvIer, a BERT-based diagnostic system that extracts relevant clinical data from transcribed patient-doctor dialogues and explains predictions using feature attribution methods.
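To illustrate what a feature attribution method computes (this is a generic sketch, not Doctor XAvIer's specific method): for a linear score w·x + b, the gradient of the score with respect to the input is w, so gradient-times-input assigns each feature the contribution w_i · x_i. All numbers below are made up for illustration.

```python
import numpy as np

# Hypothetical toy scorer: score = w . x + b
w = np.array([0.8, -0.5, 0.1])   # illustrative learned weights
b = 0.2
x = np.array([1.0, 2.0, 3.0])    # one input example

score = w @ x + b
attributions = w * x             # gradient-times-input attribution
print(score)                     # 0.3
print(attributions)              # [ 0.8 -1.   0.3]
# For a linear model the attributions sum exactly to score - bias:
print(np.isclose(attributions.sum(), score - b))  # True
```

Real systems apply the same idea to nonlinear models via gradients (or refinements like integrated gradients), trading the exact additivity above for an approximation.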
no code implementations • 16 Jan 2021 • Hillary Ngai, Yoona Park, John Chen, Mahboobeh Parsapoor
In response to Kaggle's COVID-19 Open Research Dataset (CORD-19) challenge, we propose three transformer-based question-answering systems using BERT, ALBERT, and T5 models.
no code implementations • 31 Dec 2020 • Paul Grouchy, Shobhit Jain, Michael Liu, Kuhan Wang, Max Tian, Nidhi Arora, Hillary Ngai, Faiza Khan Khattak, Elham Dolatabadi, Sedef Akinli Kocak
With the growing amount of text in health data, there have been rapid advances in large pre-trained models that can be applied to a wide variety of biomedical tasks with minimal task-specific modifications.