Hybrid Attention-Based Transformer Block Model for Distant Supervision Relation Extraction

10 Mar 2020 · Yan Xiao, Yaochu Jin, Ran Cheng, Kuangrong Hao

With the exponential growth of digital text, it is challenging to efficiently obtain specific knowledge from massive amounts of unstructured text. As a basic task in natural language processing (NLP), relation extraction aims to extract the semantic relation between an entity pair in a given text...


Code


No code implementations yet.
