Fine-tuning Pre-Trained Transformer Language Models to Distantly Supervised Relation Extraction

ACL 2019 · Christoph Alt, Marc Hübner, Leonhard Hennig

Distantly supervised relation extraction is widely used to extract relational facts from text, but suffers from noisy labels. Current relation extraction methods try to alleviate the noise by multi-instance learning and by providing supporting linguistic and contextual information to more efficiently guide the relation classification...
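The multi-instance learning mentioned above can be illustrated with a minimal sketch (not the paper's implementation): sentences mentioning the same entity pair are grouped into a "bag", and the bag-level relation score is aggregated over per-sentence scores (here with a max), so a single noisily labeled sentence cannot fully determine the label. All names and numbers below are hypothetical.

```python
import numpy as np

def bag_score(sentence_scores):
    """Aggregate per-sentence relation scores into one bag-level score.

    sentence_scores: (num_sentences, num_relations) array of
    per-sentence relation probabilities for one entity-pair bag.
    Returns one score per relation (max over sentences).
    """
    scores = np.asarray(sentence_scores)
    return scores.max(axis=0)

# Hypothetical bag: two sentences mentioning the same entity pair,
# scored over three candidate relations.
bag = [
    [0.1, 0.8, 0.1],  # sentence 1: confident in relation 1
    [0.7, 0.2, 0.1],  # sentence 2 (possibly noisy): prefers relation 0
]
print(bag_score(bag))
```

In practice, softer aggregations such as attention-weighted averages over sentences are also common, since a hard max discards evidence from all but one sentence.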

