DeepStruct: Pretraining of Language Models for Structure Prediction

We introduce a method for improving the structural understanding abilities of language models. Unlike previous approaches that finetune the models with task-specific augmentation, we pretrain language models on a collection of task-agnostic corpora to generate structures from text. Our structure pretraining enables zero-shot transfer of the learned knowledge that models have about the structure tasks. We study the performance of this approach on 28 datasets, spanning 10 structure prediction tasks including open information extraction, joint entity and relation extraction, named entity recognition, relation classification, semantic role labeling, event extraction, coreference resolution, factual probe, intent detection, and dialogue state tracking. We further enhance the pretraining with the task-specific training sets. We show that a 10B parameter language model transfers non-trivially to most tasks and obtains state-of-the-art performance on 21 of 28 datasets that we evaluate.
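The abstract's unified formulation treats every structure prediction task as generating structures (e.g. triples) from input text. As a minimal illustrative sketch, the snippet below shows how annotations might be linearized into a generation target and parsed back; the exact serialization format here is an assumption for illustration, not the paper's specification.

```python
import re

# Hypothetical sketch: casting structure prediction as text-to-triple
# generation, in the spirit of the paper's unified formulation.
# The "( head ; relation ; tail )" format is an assumed linearization.

def linearize(triples):
    """Serialize (head, relation, tail) triples into a flat target string."""
    return " ".join(f"( {h} ; {r} ; {t} )" for h, r, t in triples)

def parse(output):
    """Recover triples from a generated string; skips malformed spans."""
    return [tuple(part.strip() for part in match.split(";"))
            for match in re.findall(r"\(\s*(.*?)\s*\)", output)
            if match.count(";") == 2]

# Example: joint entity and relation extraction framed as generation.
sentence = "Barack Obama was born in Honolulu."
target = linearize([("Barack Obama", "born in", "Honolulu")])
triples = parse(target)
```

A seq2seq language model trained on pairs like `(sentence, target)` can then transfer across tasks that share this output format, which is what enables the zero-shot evaluation described above.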

Findings (ACL) 2022

Results from the Paper


| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|---|---|---|---|---|---|
| Relation Extraction | ADE Corpus | w/ finetune | RE+ Micro F1 | 83.8 | # 1 |
| Relation Extraction | ADE Corpus | DeepStruct zero-shot | NER Micro F1 | 60.7 | # 1 |
| | | | RE+ Micro F1 | 10.6 | # 2 |
| Relation Extraction | CoNLL04 | DeepStruct multi-task w/ finetune | RE+ Micro F1 | 78.3 | # 1 |
| | | | NER Micro F1 | 90.7 | # 2 |
| Relation Extraction | CoNLL04 | w/ finetune | NER Micro F1 | 91.1 | # 1 |
| Relation Extraction | CoNLL04 | DeepStruct Zero-Shot | RE+ Micro F1 | 25.8 | # 13 |
| | | | NER Micro F1 | 48.3 | # 12 |
| Relation Extraction | FewRel | DeepStruct multi-task w/ finetune | F1 | 98.4 | # 2 |
| | | | F1 (5-way 5-shot) | 100 | # 1 |
| | | | F1 (10-way 1-shot) | 97.8 | # 1 |
| | | | F1 (10-way 5-shot) | 99.8 | # 1 |
| Open Information Extraction | NYT | DeepStruct multi-task w/ finetune | F1 | 45 | # 1 |
| Relation Extraction | NYT | DeepStruct multi-task | F1 | 93.9 | # 1 |
| Relation Extraction | NYT | DeepStruct multi-task w/ finetune | NER Micro F1 | 95.9 | # 1 |
| Open Information Extraction | OIE2016 | DeepStruct multi-task w/ finetune | F1 | 71.3 | # 2 |
| Open Information Extraction | Penn Treebank | DeepStruct multi-task | F1 | 54.5 | # 1 |
| Relation Extraction | TACRED | DeepStruct multi-task w/ finetune | F1 | 76.8 | # 1 |

Methods


No methods listed for this paper.