Inducing Relational Knowledge from BERT

One of the most remarkable properties of word embeddings is that they capture certain types of semantic and syntactic relationships. Recently, pre-trained language models such as BERT have achieved groundbreaking results across a wide range of Natural Language Processing tasks...
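
As a quick illustration of the relational property the abstract alludes to, here is a minimal sketch of the classic word-embedding analogy test using gensim with pretrained GloVe vectors. The library, model name, and word choices are assumptions for illustration only; they are not part of the paper.

    # Minimal sketch of the analogy property of word embeddings,
    # using gensim with pretrained GloVe vectors (an assumed setup,
    # not taken from the paper itself).
    import gensim.downloader as api

    # Load 100-dimensional GloVe vectors trained on Wikipedia + Gigaword.
    vectors = api.load("glove-wiki-gigaword-100")

    # Classic relational example:
    # vec("king") - vec("man") + vec("woman") is close to vec("queen").
    result = vectors.most_similar(positive=["king", "woman"],
                                  negative=["man"], topn=1)
    print(result)  # a list of (word, similarity) pairs, e.g. [('queen', ...)]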
