no code implementations • 9 Apr 2021 • Keyur Faldu, Amit Sheth, Prashant Kikani, Hemang Akbari
We take BERT as a baseline model and implement "Knowledge-Infused BERT" by infusing knowledge context from ConceptNet and WordNet; it significantly outperforms BERT and other recent knowledge-aware BERT variants such as ERNIE, SenseBERT, and BERT_CS on eight subtasks of the GLUE benchmark.
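The idea of infusing knowledge context can be sketched as follows: look up concepts related to the input tokens in an external knowledge graph and append them as extra context for a BERT-style encoder. This is a minimal illustration with a hand-made toy knowledge base standing in for ConceptNet/WordNet lookups; the function and relation names are hypothetical, and the paper's actual infusion mechanism may differ.

```python
# Hypothetical mini knowledge base: token -> list of (relation, concept).
# A stand-in for real ConceptNet/WordNet queries.
KNOWLEDGE_BASE = {
    "bank": [("RelatedTo", "money"), ("IsA", "financial institution")],
    "river": [("RelatedTo", "water"), ("IsA", "body of water")],
}

def infuse_knowledge(sentence: str, kb=KNOWLEDGE_BASE) -> str:
    """Append knowledge facts after a [SEP] marker, BERT-style."""
    facts = []
    for token in sentence.lower().split():
        for relation, concept in kb.get(token, []):
            facts.append(f"{token} {relation} {concept}")
    if not facts:
        return sentence  # no knowledge found; leave input unchanged
    return sentence + " [SEP] " + " ; ".join(facts)

augmented = infuse_knowledge("the bank of the river")
print(augmented)
```

The augmented string could then be tokenized and fed to any BERT-style model, letting the encoder attend jointly over the sentence and its knowledge context.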