Search Results for author: Jianying Lin

Found 4 papers, 2 papers with code

Persia: An Open, Hybrid System Scaling Deep Learning-based Recommenders up to 100 Trillion Parameters

1 code implementation • 10 Nov 2021 • Xiangru Lian, Binhang Yuan, XueFeng Zhu, Yulong Wang, Yongjun He, Honghuan Wu, Lei Sun, Haodong Lyu, Chengjun Liu, Xing Dong, Yiqiao Liao, Mingnan Luo, Congfei Zhang, Jingru Xie, Haonan Li, Lei Chen, Renjie Huang, Jianying Lin, Chengchun Shu, Xuezhong Qiu, Zhishan Liu, Dongying Kong, Lei Yuan, Hai Yu, Sen Yang, Ce Zhang, Ji Liu

Specifically, to ensure both training efficiency and training accuracy, we design a novel hybrid training algorithm in which the embedding layer and the dense neural network are handled by different synchronization mechanisms; we then build a system called Persia (short for parallel recommendation training system with hybrid acceleration) to support this hybrid training algorithm (a rough sketch of the two synchronization modes follows this entry).

Recommendation Systems
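
The hybrid scheme described above can be illustrated with a small sketch. This is not the authors' implementation; the worker loop, the toy objective, and all parameter names are illustrative assumptions. The point is only the split: embedding rows are updated asynchronously (each worker applies its update immediately, possibly on top of stale reads), while the dense parameters are updated synchronously from gradients averaged across workers (the role an all-reduce would play).

    # Minimal sketch of the hybrid synchronization idea (illustrative, not Persia's code).
    import numpy as np

    rng = np.random.default_rng(0)

    # Parameter-server-style embedding table: workers update it without a barrier,
    # so an update may land on top of a slightly stale read (asynchronous mode).
    embedding_table = rng.normal(size=(1000, 16))

    # Dense model parameters: kept consistent across workers by averaging gradients
    # once per step (synchronous mode, standing in for an all-reduce).
    dense_params = rng.normal(size=(16, 1))

    def worker_step(batch_ids, lr=0.01):
        """One step on one worker, using a toy squared-error objective."""
        emb = embedding_table[batch_ids]            # possibly stale embedding rows
        err = emb @ dense_params - 1.0              # dummy targets of 1.0
        grad_dense = emb.T @ err / len(batch_ids)
        grad_emb = err @ dense_params.T
        # Asynchronous update: applied immediately, no coordination with other workers.
        np.add.at(embedding_table, batch_ids, -lr * grad_emb)
        return grad_dense

    # Simulate one iteration with 4 workers.
    dense_grads = [worker_step(rng.integers(0, 1000, size=32)) for _ in range(4)]

    # Synchronous update: average the dense gradients across workers, apply once.
    dense_params -= 0.01 * np.mean(dense_grads, axis=0)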

POSO: Personalized Cold Start Modules for Large-scale Recommender Systems

no code implementations • 10 Aug 2021 • Shangfeng Dai, Haobin Lin, Zhichen Zhao, Jianying Lin, Honghuan Wu, Zhe Wang, Sen Yang, Ji Liu

Moreover, POSO can be further generalized to regular users, inactive users and returning users (+2%-3% on Watch Time), as well as item cold start (+3.8% on Watch Time).

Recommendation Systems

Using Query Expansion in Manifold Ranking for Query-Oriented Multi-Document Summarization

1 code implementation • CCL 2021 • Quanye Jia, Rui Liu, Jianying Lin

It makes use not only of the relationships among the sentences, but also of the relationships between the given query and the sentences (see the sketch after this entry).

Document Summarization • Multi-Document Summarization
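
For context, query-oriented manifold ranking of the kind the entry above refers to can be sketched as score propagation over a similarity graph that contains both the sentences and the query. The snippet below is an assumption-laden illustration (TF-IDF similarity, alpha = 0.85, toy sentences) rather than the paper's method, which additionally applies query expansion.

    # Minimal sketch of query-oriented manifold ranking (illustrative, not the paper's code).
    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    sentences = [
        "The storm caused flooding across the coastal towns.",
        "Emergency services evacuated residents from flooded areas.",
        "The local football team won its match on Saturday.",
    ]
    query = "flood damage and evacuation"

    # Similarity graph over the query plus the sentences (the query is node 0).
    texts = [query] + sentences
    W = cosine_similarity(TfidfVectorizer().fit_transform(texts))
    np.fill_diagonal(W, 0.0)

    # Symmetric normalization: S = D^{-1/2} W D^{-1/2}.
    d = np.maximum(W.sum(axis=1), 1e-12)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    S = D_inv_sqrt @ W @ D_inv_sqrt

    # Propagate relevance from the query node: f <- alpha * S f + (1 - alpha) * y.
    alpha = 0.85
    y = np.zeros(len(texts))
    y[0] = 1.0
    f = y.copy()
    for _ in range(50):
        f = alpha * S @ f + (1 - alpha) * y

    # Rank sentences by their propagated score (drop the query node itself).
    ranking = np.argsort(-f[1:])
    print([sentences[i] for i in ranking])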

Joint Lifelong Topic Model and Manifold Ranking for Document Summarization

no code implementations • 7 Jul 2019 • Jianying Lin, Rui Liu, Quanye Jia

The JTMMR model improves on the manifold ranking method by using better semantic features (a sketch of this idea follows this entry).

Document Summarization • Multi-Document Summarization
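
The idea of feeding better semantic features into manifold ranking can be hinted at with the same kind of sketch: replace the surface-level TF-IDF similarity with similarity between topic-distribution vectors from a topic model, then reuse the propagation shown above. Everything below (LDA with 5 topics, the toy sentences) is an illustrative assumption, not the JTMMR model itself.

    # Illustrative only: topic-distribution features in place of TF-IDF similarity.
    import numpy as np
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation
    from sklearn.metrics.pairwise import cosine_similarity

    sentences = [
        "The storm caused flooding across the coastal towns.",
        "Emergency services evacuated residents from flooded areas.",
        "The local football team won its match on Saturday.",
    ]
    query = "flood damage and evacuation"
    texts = [query] + sentences

    # Per-text topic distributions act as the "semantic features" (5 topics is arbitrary).
    counts = CountVectorizer().fit_transform(texts)
    theta = LatentDirichletAllocation(n_components=5, random_state=0).fit_transform(counts)

    # Similarity graph in topic space; this W would feed the same
    # f <- alpha * S f + (1 - alpha) * y propagation sketched above.
    W = cosine_similarity(theta)
    np.fill_diagonal(W, 0.0)
    print(np.round(W, 2))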
