no code implementations • 1 Apr 2024 • Deqing Fu, Ghazal Khalighinejad, Ollie Liu, Bhuwan Dhingra, Dani Yogatama, Robin Jia, Willie Neiswanger
Current foundation models exhibit impressive capabilities when prompted either with text only or with both image and text inputs.
1 code implementation • 1 Mar 2024 • Ghazal Khalighinejad, Defne Circi, L. C. Brinson, Bhuwan Dhingra
This paper investigates the use of large language models (LLMs) for extracting sample lists of polymer nanocomposites (PNCs) from full-length materials science research papers.
1 code implementation • 3 May 2023 • Ghazal Khalighinejad, Ollie Liu, Sam Wiseman
We investigate the ability of transformer models to approximate the CKY algorithm, using them to directly predict a sentence's parse and thus avoid the CKY algorithm's cubic dependence on sentence length.
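To make the cubic cost concrete (this is a generic textbook CKY recognizer, not the paper's method, and the toy grammar is hypothetical): CKY fills a chart with three nested loops over span length, span start, and split point, giving O(n³) in sentence length n.

```python
# Minimal CKY recognizer for a grammar in Chomsky normal form.
# The three nested loops (span length, span start, split point) are
# the source of the cubic dependence on sentence length.
# Toy grammar and lexicon below are illustrative only.

def cky_recognize(words, lexicon, rules, start="S"):
    n = len(words)
    # chart[i][j] holds the nonterminals that can span words[i:j]
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        chart[i][i + 1] = {A for A, word in lexicon if word == w}
    for length in range(2, n + 1):           # span length
        for i in range(n - length + 1):      # span start
            j = i + length
            for k in range(i + 1, j):        # split point
                for A, B, C in rules:        # binary rules A -> B C
                    if B in chart[i][k] and C in chart[k][j]:
                        chart[i][j].add(A)
    return start in chart[0][n]

# Toy grammar: S -> NP VP, VP -> V NP
lexicon = [("NP", "she"), ("V", "sees"), ("NP", "stars")]
rules = [("S", "NP", "VP"), ("VP", "V", "NP")]
print(cky_recognize(["she", "sees", "stars"], lexicon, rules))  # True
```

A transformer that predicts the parse directly, as the paper proposes, sidesteps this chart-filling loop entirely.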