Search Results for author: Yanling Cui

Found 5 papers, 2 papers with code

Enhancing Document Ranking with Task-adaptive Training and Segmented Token Recovery Mechanism

no code implementations • EMNLP 2021 • Xingwu Sun, Yanling Cui, Hongyin Tang, Fuzheng Zhang, Beihong Jin

In this paper, we propose a new ranking model, DR-BERT, which improves the Document Retrieval (DR) task through a task-adaptive training process and a Segmented Token Recovery Mechanism (STRM).

Document Ranking • Retrieval

TITA: A Two-stage Interaction and Topic-Aware Text Matching Model

no code implementations • NAACL 2021 • Xingwu Sun, Yanling Cui, Hongyin Tang, Qiuyu Zhu, Fuzheng Zhang, Beihong Jin

To tackle this problem, we define three levels of relevance in the keyword-document matching task: topic-aware relevance, partial relevance, and irrelevance.

Text Matching

AdsGNN: Behavior-Graph Augmented Relevance Modeling in Sponsored Search

1 code implementation • 25 Apr 2021 • Chaozhuo Li, Bochen Pang, Yuming Liu, Hao Sun, Zheng Liu, Xing Xie, Tianqi Yang, Yanling Cui, Liangjie Zhang, Qi Zhang

Our motivation lies in incorporating the tremendous amount of unsupervised user behavior data from historical search logs as a complementary graph to facilitate relevance modeling.

Marketing

FRI-Miner: Fuzzy Rare Itemset Mining

no code implementations • 11 Mar 2021 • Yanling Cui, Wensheng Gan, Hong Lin, Weimin Zheng

In some cases, infrequent or rare itemsets and rare association rules also play an important role in real-life applications.

Databases

TextGNN: Improving Text Encoder via Graph Neural Network in Sponsored Search

2 code implementations • 15 Jan 2021 • Jason Yue Zhu, Yanling Cui, Yuming Liu, Hao Sun, Xue Li, Markus Pelger, Tianqi Yang, Liangjie Zhang, Ruofei Zhang, Huasha Zhao

Text encoders based on C-DSSM or transformers have demonstrated strong performance in many Natural Language Processing (NLP) tasks.

Natural Language Understanding
