Search Results for author: Bangchang Liu

Found 4 papers, 1 paper with code

Improving Low-resource Reading Comprehension via Cross-lingual Transposition Rethinking

no code implementations • 11 Jul 2021 • Gaochen Wu, Bin Xu, Yuxin Qin, Fei Kong, Bangchang Liu, Hongwen Zhao, Dejie Chang

To address this issue, we propose a Cross-Lingual Transposition ReThinking (XLTT) model by modelling existing high-quality extractive reading comprehension datasets in a multilingual environment.

Reading Comprehension

PatentMiner: Patent Vacancy Mining via Context-enhanced and Knowledge-guided Graph Attention

no code implementations • 10 Jul 2021 • Gaochen Wu, Bin Xu, Yuxin Qin, Fei Kong, Bangchang Liu, Hongwen Zhao, Dejie Chang

In this paper, we propose a new patent vacancy prediction approach named PatentMiner, which mines rich semantic knowledge and predicts new potential patents based on a knowledge graph (KG) and a graph attention mechanism.

Graph Attention • Link Prediction +3
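
The PatentMiner abstract above combines a knowledge graph with a graph attention mechanism. As a rough illustration of what a single graph-attention head does, here is a generic GAT-style sketch, not the authors' implementation; the toy entity features, dimensions, and adjacency matrix are invented purely for the example.

```python
# Generic single-head graph attention (GAT-style) over toy KG node features.
# Not the PatentMiner code; all shapes and the toy graph are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def leaky_relu(x, alpha=0.2):
    return np.where(x > 0, x, alpha * x)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def graph_attention(h, adj, W, a):
    """One attention head over a graph.

    h:   (N, F)  node features (e.g. entity embeddings from a KG)
    adj: (N, N)  binary adjacency matrix (1 where an edge exists)
    W:   (F, D)  shared linear projection
    a:   (2*D,)  attention vector
    """
    z = h @ W                                          # (N, D) projected features
    src = z @ a[: z.shape[1]]                          # source half of a^T [z_i || z_j]
    dst = z @ a[z.shape[1]:]                           # destination half
    logits = leaky_relu(src[:, None] + dst[None, :])   # (N, N) pairwise scores
    logits = np.where(adj > 0, logits, -1e9)           # attend only along edges
    alpha = softmax(logits, axis=1)                    # normalise over neighbours
    return alpha @ z                                   # (N, D) updated node features

# Toy example: 5 entities with 8-dim features and a sparse random edge set.
N, F, D = 5, 8, 4
h = rng.normal(size=(N, F))
adj = np.eye(N) + (rng.random((N, N)) < 0.3)
out = graph_attention(h, adj, rng.normal(size=(F, D)), rng.normal(size=2 * D))
print(out.shape)  # (5, 4)
```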

DiaKG: an Annotated Diabetes Dataset for Medical Knowledge Graph Construction

1 code implementation • 31 May 2021 • Dejie Chang, Mosha Chen, Chaozhen Liu, LiPing Liu, Dongdong Li, Wei Li, Fei Kong, Bangchang Liu, Xiaobin Luo, Ji Qi, Qiao Jin, Bin Xu

In order to accelerate research on domain-specific knowledge graphs in the medical domain, we introduce DiaKG, a high-quality Chinese dataset for diabetes knowledge graph construction, which contains 22,050 entities and 6,890 relations in total.

Graph Construction • Knowledge Graphs +4
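
The DiaKG entry above reports entity and relation counts for a knowledge-graph dataset. Purely as an illustration of how such annotations are commonly collected into triples and counted, here is a small sketch; the field names and the two example triples are hypothetical and do not reflect DiaKG's actual release format.

```python
# Hypothetical triple representation for a KG dataset; field names are invented.
from dataclasses import dataclass

@dataclass(frozen=True)
class Triple:
    head: str       # e.g. a disease entity
    relation: str   # e.g. a disease-drug relation type
    tail: str       # e.g. a drug entity

annotations = [
    {"head": "Type 2 diabetes", "relation": "treated_with", "tail": "Metformin"},
    {"head": "Metformin", "relation": "has_adverse_effect", "tail": "Nausea"},
]

kg = {Triple(a["head"], a["relation"], a["tail"]) for a in annotations}
entities = {t.head for t in kg} | {t.tail for t in kg}
relations = {t.relation for t in kg}
print(len(entities), "entities,", len(relations), "relation types")
```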

A Multilingual Modeling Method for Span-Extraction Reading Comprehension

no code implementations • 31 May 2021 • Gaochen Wu, Bin Xu, Dejie Chang, Bangchang Liu

In this paper, to address the scarcity of extractive reading comprehension training data in the target language, we propose a multilingual extractive reading comprehension approach called XLRC, which jointly models the existing extractive reading comprehension training data in a multilingual environment using self-adaptive attention and multilingual attention.

Multilingual NLP • Reading Comprehension
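
The XLRC abstract describes span-extraction reading comprehension. As a generic sketch of the decoding step such extractive models share, independent of the multilingual encoder and attention mechanisms the paper proposes, here is the usual "best valid span" search over per-token start/end scores; the toy tokens and scores are invented and this is not the XLRC architecture.

```python
# Generic span-extraction decoding: pick the highest-scoring valid answer span
# from per-token start/end scores. Toy inputs only; not the XLRC model.
import numpy as np

def best_span(start_logits, end_logits, max_len=15):
    """Return (start, end) maximising start_logits[s] + end_logits[e]
    subject to s <= e < s + max_len."""
    best, best_score = (0, 0), -np.inf
    for s in range(len(start_logits)):
        for e in range(s, min(s + max_len, len(end_logits))):
            score = start_logits[s] + end_logits[e]
            if score > best_score:
                best, best_score = (s, e), score
    return best

tokens = ["The", "capital", "of", "France", "is", "Paris", "."]
start = np.array([0.1, 0.2, 0.0, 0.3, 0.1, 2.5, 0.0])  # toy start scores
end   = np.array([0.0, 0.1, 0.0, 0.2, 0.1, 2.7, 0.3])  # toy end scores
s, e = best_span(start, end)
print(" ".join(tokens[s:e + 1]))  # -> "Paris"
```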
