Ask2Transformers: Zero-Shot Domain labelling with Pre-trained Language Models

7 Jan 2021  ·  Oscar Sainz, German Rigau

In this paper we present a system that exploits different pre-trained Language Models to assign domain labels to WordNet synsets without any kind of supervision. Furthermore, the system is not restricted to a particular set of domain labels. We exploit the knowledge encoded in different off-the-shelf pre-trained Language Models and task formulations to infer the domain label of a particular WordNet definition. The proposed zero-shot system achieves a new state of the art on the English dataset used in the evaluation.
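The "task formulations" mentioned above typically cast zero-shot labelling as textual entailment: the WordNet definition is the premise, and each candidate domain is turned into a hypothesis via a template; the domain whose hypothesis is most strongly entailed wins. A minimal sketch of that selection loop, with an assumed template and a toy word-overlap scorer standing in for a real NLI model (names `zero_shot_domain`, `toy_score`, and the template string are illustrative, not from the paper):

```python
from typing import Callable, Dict, List

def zero_shot_domain(
    definition: str,
    domains: List[str],
    entail_score: Callable[[str, str], float],
    template: str = "The domain of the sentence is about {}.",
) -> Dict[str, float]:
    """Score each candidate domain by how strongly the hypothesis
    (template filled with the domain) is entailed by the WordNet
    definition used as the premise."""
    return {d: entail_score(definition, template.format(d)) for d in domains}

def toy_score(premise: str, hypothesis: str) -> float:
    """Toy stand-in scorer: crude prefix overlap between words.
    In practice this would be the entailment probability from an
    NLI model (e.g. a RoBERTa model fine-tuned on MNLI)."""
    p_words = premise.lower().split()
    return sum(
        1.0
        for h in hypothesis.lower().split()
        if any(w.startswith(h[:4]) for w in p_words)
    )

scores = zero_shot_domain(
    "a building where musical or theatrical performances are given",
    ["music", "sport", "medicine"],
    toy_score,
)
best = max(scores, key=scores.get)  # → "music"
```

Because every hypothesis shares the same template words, the template contributes a constant baseline to each score, so only the domain word itself discriminates; a real NLI scorer replaces `toy_score` without changing the surrounding logic.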


Results from the Paper


| Task             | Dataset      | Model | Metric   | Value | Global Rank |
|------------------|--------------|-------|----------|-------|-------------|
| Domain Labelling | BabelDomains | A2T   | F1-Score | 92.14 | #1          |
