Search Results for author: Ruibin Mao

Found 6 papers, 0 papers with code

Semantic Relevance Computing Model of News Entity and Text Based on BERTCA

no code implementations CCL 2020 Junyi Xiang, Huijun Hu, Ruibin Mao, Maofu Liu

Current search engines still suffer from the problem of "emphasizing form over semantics": they cannot achieve deep semantic understanding of search keywords and text, so semantic retrieval has become an urgent problem for modern search engines. To improve the semantic understanding ability of search engines, a method for computing semantic relevance is proposed. First, 10,000 items of semantic relevance between financial news headline entities and news bodies are annotated. Then a BERTCA (Bidirectional Encoder Representation from Transformers Co-Attention) model is built to compute the semantic relevance between a news entity and the body text. Using a pretrained BERT model, it jointly considers the semantic information of the fine-grained entity and the coarse-grained body text, and then applies co-attention to achieve semantic matching between the two. The model not only computes the relevance between a financial news entity and the news body, but can also determine the relevance category according to a relevance threshold. Experiments show that the model achieves over 95% accuracy on the 10,000 annotated items, outperforming current mainstream models; finally, concrete search examples demonstrate the model's strong performance.
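The entity-body matching step described above can be sketched as follows. This is a minimal NumPy illustration of co-attention over two token sequences, not the paper's implementation: the embeddings are hypothetical stand-ins for BERT outputs, and the cosine-similarity scoring and the 0.5 threshold are assumptions in place of the paper's learned scoring layer.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def co_attention_relevance(entity, text, threshold=0.5):
    """Co-attention matching between entity tokens and body tokens.

    entity: (n, d) token embeddings for the news entity (stand-ins for
            BERT outputs; the paper encodes both inputs with pretrained BERT).
    text:   (m, d) token embeddings for the news body.
    Returns a relevance score in [0, 1] and a thresholded relevance label.
    """
    d = entity.shape[1]
    affinity = entity @ text.T / np.sqrt(d)   # (n, m) token affinity matrix
    ent_attn = softmax(affinity, axis=1)      # entity attends to body tokens
    txt_attn = softmax(affinity.T, axis=1)    # body attends to entity tokens
    ent_ctx = ent_attn @ text                 # (n, d) body-aware entity repr.
    txt_ctx = txt_attn @ entity               # (m, d) entity-aware body repr.
    ent_vec = ent_ctx.mean(axis=0)            # mean-pooled summary vectors
    txt_vec = txt_ctx.mean(axis=0)
    # Cosine similarity as a stand-in for the paper's learned relevance head.
    score = float(ent_vec @ txt_vec /
                  (np.linalg.norm(ent_vec) * np.linalg.norm(txt_vec) + 1e-9))
    score = (score + 1.0) / 2.0               # map [-1, 1] into [0, 1]
    return score, score >= threshold
```

With real BERT encodings, the returned score would play the role of the relevance value that is compared against the paper's threshold to assign a relevance category.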

Experimentally realized memristive memory augmented neural network

no code implementations 15 Apr 2022 Ruibin Mao, Bo Wen, Yahui Zhao, Arman Kazemi, Ann Franchesca Laguna, Michael Niemier, X. Sharon Hu, Xia Sheng, Catherine E. Graves, John Paul Strachan, Can Li

Memory-augmented neural networks have been proposed to achieve this goal, but the memory module has to be stored in an off-chip memory due to its size.
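The memory module referred to above is read by comparing a query against all stored keys at once. A minimal NumPy sketch of that nearest-neighbour read step is shown below; the paper accelerates this lookup in memristive hardware, whereas everything here (function name, array shapes, cosine-similarity metric) is an illustrative assumption.

```python
import numpy as np

def mann_lookup(memory_keys, memory_values, query, k=1):
    """Nearest-neighbour read from an external memory module.

    memory_keys:   (N, d) stored key vectors (one per memorised example)
    memory_values: (N,)   labels attached to each stored key
    query:         (d,)   controller output for the current input
    Returns the label(s) of the k most similar memory entries.
    """
    # Normalise so the dot product equals cosine similarity.
    keys = memory_keys / np.linalg.norm(memory_keys, axis=1, keepdims=True)
    q = query / np.linalg.norm(query)
    sims = keys @ q                 # similarity of the query to every slot
    top = np.argsort(-sims)[:k]     # indices of the k best-matching slots
    return memory_values[top]
```

Because every stored key participates in the comparison, the cost of this read grows with memory size, which is why keeping the module off-chip becomes a bottleneck and why an in-memory (memristive) implementation is attractive.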

One-Shot Learning

Target-based Sentiment Annotation in Chinese Financial News

no code implementations LREC 2020 Chaofa Yuan, Yu-Han Liu, Rongdi Yin, Jun Zhang, Qinling Zhu, Ruibin Mao, Ruifeng Xu

Based on a high-quality annotation guideline and an effective quality-control strategy, a corpus with 8,314 target-level sentiment annotations is constructed on 6,336 paragraphs from Chinese financial news text.

Sentiment Analysis

The Design and Construction of a Chinese Sarcasm Dataset

no code implementations LREC 2020 Xiaochang Gong, Qin Zhao, Jun Zhang, Ruibin Mao, Ruifeng Xu

Thus, the detection and processing of sarcasm is important to social media analysis. However, most existing sarcasm datasets are in English, and there is still a lack of an authoritative Chinese sarcasm dataset.
