In recent years, pre-trained language models (PLMs) such as BERT have proven highly effective across diverse NLP tasks such as Information Extraction, Sentiment Analysis, and Question Answering.
1 Dec 2020 • Shubham Kanodia, Joe P. Ninan, Andrew J. Monson, Suvrath Mahadevan, Colin Nitroy, Christian Schwab, Samuel Halverson, Chad F. Bender, Ryan Terrien, Frederick R. Hearty, Emily Lubar, Michael W. McElwain, Lawrence W. Ramsey, Paul M. Robertson, Arpita Roy, Gudmundur Stefansson, Daniel J. Stevens
The NEID spectrograph is a fiber-fed, highly stabilized instrument with resolving power R $\sim$ 120,000, designed for extreme radial velocity (RV) precision.
Instrumentation and Methods for Astrophysics · Earth and Planetary Astrophysics
We propose a novel supervised open information extraction (Open IE) framework that leverages an ensemble of unsupervised Open IE systems and a small amount of labeled data to improve system performance.
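One simple way to combine an ensemble of Open IE systems is to keep only the (subject, relation, object) triples that multiple systems agree on. The sketch below is a hypothetical illustration of that voting idea, not the paper's actual framework; the system outputs and threshold are invented:

```python
from collections import Counter

def ensemble_extractions(system_outputs, min_votes=2):
    """Keep triples extracted by at least `min_votes` systems."""
    votes = Counter()
    for triples in system_outputs:
        for triple in set(triples):  # each system votes once per triple
            votes[triple] += 1
    return {t for t, n in votes.items() if n >= min_votes}

# Invented outputs from three hypothetical Open IE systems:
outputs = [
    [("BERT", "is", "a language model"), ("Paris", "capital_of", "France")],
    [("Paris", "capital_of", "France")],
    [("Paris", "capital_of", "France"), ("noise", "rel", "noise")],
]
print(ensemble_extractions(outputs, min_votes=2))
# {('Paris', 'capital_of', 'France')}
```

The supervised component described in the abstract would go further, learning from the small labeled set which systems (or which kinds of triples) to trust rather than weighting all systems equally.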
We describe the systems developed by the UMBC team for 2018 SemEval Task 8, SecureNLP (Semantic Extraction from CybersecUrity REports using Natural Language Processing).
In this paper, we describe a novel method to train domain-specific word embeddings from sparse texts.
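As a minimal sketch of the general idea (not the paper's method), word vectors can be derived from a small domain corpus by counting co-occurrences in a sliding window and comparing words by cosine similarity; real embedding training (e.g. word2vec variants) is more involved, and the toy corpus here is invented:

```python
from collections import Counter, defaultdict
import math

def cooccurrence_vectors(sentences, window=2):
    """Build sparse co-occurrence count vectors from tokenized sentences."""
    vecs = defaultdict(Counter)
    for sent in sentences:
        for i, w in enumerate(sent):
            for j in range(max(0, i - window), min(len(sent), i + window + 1)):
                if i != j:
                    vecs[w][sent[j]] += 1
    return vecs

def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(u[w] * v[w] for w in set(u) & set(v))
    nu = math.sqrt(sum(c * c for c in u.values()))
    nv = math.sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# Tiny invented "domain" corpus:
corpus = [
    ["malware", "infects", "the", "host"],
    ["ransomware", "infects", "the", "network"],
    ["users", "like", "coffee"],
]
vecs = cooccurrence_vectors(corpus)
# In-domain terms share contexts, so they score higher than unrelated ones:
print(cosine(vecs["malware"], vecs["ransomware"]) > cosine(vecs["malware"], vecs["coffee"]))
```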