Zero-Shot Learning with Common Sense Knowledge Graphs

18 Jun 2020 · Nihal V. Nayak, Stephen H. Bach

Zero-shot learning relies on semantic class representations such as hand-engineered attributes or learned embeddings to predict classes without any labeled examples. We propose to learn class representations by embedding nodes from common sense knowledge graphs in a vector space. Common sense knowledge graphs are an untapped source of explicit high-level knowledge that requires little human effort to apply to a range of tasks. To capture the knowledge in the graph, we introduce ZSL-KG, a general-purpose framework with a novel transformer graph convolutional network (TrGCN) for generating class representations. Our proposed TrGCN architecture computes non-linear combinations of node neighbourhoods. Our results show that ZSL-KG improves over existing WordNet-based methods on five out of six zero-shot benchmark datasets in language and vision.
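To make the TrGCN idea concrete, below is a minimal sketch of what one such aggregation layer could look like in PyTorch: a node's sampled neighbour embeddings are contextualised with a small transformer encoder, pooled into a single vector, and fused with the node's own features. This is an illustration of the idea described in the abstract, not the authors' released implementation; the class, parameter names, and layer sizes are assumptions.

```python
# Hypothetical sketch of a TrGCN-style aggregation layer (not the authors' code).
import torch
import torch.nn as nn

class TrGCNLayer(nn.Module):
    """Aggregates a node's neighbourhood with a small transformer encoder,
    giving a non-linear combination of neighbours instead of a plain mean."""

    def __init__(self, in_dim: int, out_dim: int, num_heads: int = 4):
        super().__init__()
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=in_dim, nhead=num_heads,
            dim_feedforward=2 * in_dim, batch_first=True)
        self.neighbour_encoder = nn.TransformerEncoder(encoder_layer, num_layers=1)
        self.project = nn.Linear(2 * in_dim, out_dim)

    def forward(self, node_feat: torch.Tensor, neighbour_feats: torch.Tensor) -> torch.Tensor:
        # node_feat: (batch, in_dim); neighbour_feats: (batch, n_neighbours, in_dim)
        encoded = self.neighbour_encoder(neighbour_feats)   # contextualise neighbours
        pooled = encoded.mean(dim=1)                        # permutation-invariant summary
        combined = torch.cat([node_feat, pooled], dim=-1)   # fuse self + neighbourhood
        return torch.relu(self.project(combined))

# Toy usage: embed a class node from 8 sampled knowledge-graph neighbours.
if __name__ == "__main__":
    layer = TrGCNLayer(in_dim=64, out_dim=128)
    node = torch.randn(1, 64)
    neighbours = torch.randn(1, 8, 64)
    class_repr = layer(node, neighbours)
    print(class_repr.shape)  # torch.Size([1, 128])
```

In a full zero-shot pipeline, layers like this would be stacked over the common sense knowledge graph so that the final node embeddings serve as class representations for classification.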

| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|------|---------|-------|-------------|--------------|-------------|
| Generalized Zero-Shot Learning | aPY - 0-Shot | ZSL-KG | Harmonic mean | 61.57 | #1 |
| Zero-Shot Learning | aPY - 0-Shot | ZSL-KG | Top-1 | 60.54 | #1 |
| Generalized Zero-Shot Learning | AwA2 | ZSL-KG | Harmonic mean | 74.58 | #1 |
| Zero-Shot Learning | AwA2 | ZSL-KG | Average top-1 classification accuracy | 78.08 | #1 |
| Generalized Zero-Shot Learning | BBN Pronoun Coreference and Entity Type Corpus | ZSL-KG | F1 | 26.69 | #1 |
| Generalized Zero-Shot Learning | OntoNotes | ZSL-KG | F1 | 45.21 | #1 |
| Zero-Shot Learning | SNIPS | ZSL-KG | Accuracy | 88.98 | #1 |

Methods


No methods listed for this paper.