Search Results for author: Xiaofu Chang

Found 5 papers, 1 paper with code

Temporal Logic Point Processes

no code implementations • ICML 2020 • Shuang Li, Lu Wang, Ruizhi Zhang, Xiaofu Chang, Xuqin Liu, Yao Xie, Yuan Qi, Le Song

We propose a modeling framework for event data that excels in the small-data regime and can incorporate domain knowledge.
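Below is a minimal runnable sketch of the core idea as the snippet states it: a temporal point process whose intensity is shaped by weighted logic-rule features encoding domain knowledge. Everything here (the softplus link, the names `intensity` and `recent_A`) is an illustrative assumption, not the paper's actual formulation.

```python
import numpy as np

def intensity(t, history, rule_features, rule_weights, base_rate=0.1):
    """Toy point-process intensity: a base rate modulated by weighted
    temporal-logic rule features evaluated on the event history.

    rule_features: callables f(t, history) -> float in [0, 1]
    rule_weights:  learned weights, one per rule
    """
    feats = np.array([f(t, history) for f in rule_features])
    # softplus keeps the intensity strictly positive
    return np.log1p(np.exp(base_rate + rule_weights @ feats))

# Hypothetical rule: "an event of type A occurred within the last 5 time units"
recent_A = lambda t, h: float(any(e == "A" and t - s < 5.0 for e, s in h))

print(intensity(10.0, [("A", 7.0)], [recent_A], np.array([1.5])))
```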

Point Processes

Dynamic Sequential Graph Learning for Click-Through Rate Prediction

no code implementations • 26 Sep 2021 • Yunfei Chu, Xiaofu Chang, Kunyang Jia, Jingzhen Zhou, Hongxia Yang

In this paper, we propose a novel method, named Dynamic Sequential Graph Learning (DSGL), that enhances user and item representations with collaborative information drawn from the local sub-graphs associated with each user or item.
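As a rough illustration of enhancing a representation with collaborative information from a local sub-graph, here is a toy one-hop aggregation in PyTorch. The function name and the residual mean-pooling fusion are assumptions for illustration; DSGL's actual dynamic sequential graphs are considerably more involved.

```python
import torch

def aggregate_local_subgraph(node_emb, neighbor_embs):
    """Refresh a node's embedding with mean-pooled information from
    its one-hop neighbors (e.g. a user's recently clicked items)."""
    if neighbor_embs.numel() == 0:
        return node_emb
    pooled = neighbor_embs.mean(dim=0)
    return torch.relu(node_emb + pooled)  # simple residual fusion

user = torch.randn(16)             # user embedding
recent_items = torch.randn(5, 16)  # embeddings of recently clicked items
enhanced_user = aggregate_local_subgraph(user, recent_items)
```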

Click-Through Rate Prediction • Graph Learning +1

SHORING: Design Provable Conditional High-Order Interaction Network via Symbolic Testing

no code implementations • 3 Jul 2021 • Hui Li, Xing Fu, Ruofan Wu, Jinyu Xu, Kai Xiao, Xiaofu Chang, Weiqiang Wang, Shuai Chen, Leilei Shi, Tao Xiong, Yuan Qi

Deep learning provides a promising way to extract effective representations from raw data in an end-to-end fashion, and has proven effective in domains such as computer vision and natural language processing.

Management • Product Recommendation +1

TCL: Transformer-based Dynamic Graph Modelling via Contrastive Learning

2 code implementations • 17 May 2021 • Lu Wang, Xiaofu Chang, Shuang Li, Yunfei Chu, Hui Li, Wei Zhang, Xiaofeng He, Le Song, Jingren Zhou, Hongxia Yang

Second, on top of the proposed graph transformer, we introduce a two-stream encoder that separately extracts representations from the temporal neighborhoods of the two interacting nodes, then uses a co-attentional transformer to model their inter-dependencies at a semantic level.
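A minimal sketch of the co-attention step described above, assuming standard multi-head attention in which each stream queries the other; the class name, dimensions, and residual wiring are illustrative choices, not TCL's exact architecture.

```python
import torch
import torch.nn as nn

class CoAttention(nn.Module):
    """Each stream attends over the other, modelling inter-dependencies
    between the two interaction nodes' temporal neighborhoods."""
    def __init__(self, dim=32, heads=4):
        super().__init__()
        self.attn_ab = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.attn_ba = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, a, b):
        a2, _ = self.attn_ab(a, b, b)  # stream A queries stream B
        b2, _ = self.attn_ba(b, a, a)  # stream B queries stream A
        return a + a2, b + b2          # residual connections

a = torch.randn(1, 10, 32)  # encoded temporal neighborhood of node u
b = torch.randn(1, 10, 32)  # encoded temporal neighborhood of node v
u_repr, v_repr = CoAttention()(a, b)
```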

Contrastive Learning • Graph Learning +2

Deep Interaction Processes for Time-Evolving Graphs

no code implementations • 25 Sep 2019 • Xiaofu Chang, Jianfeng Wen, Xuqin Liu, Yanming Fang, Le Song, Yuan Qi

To model the dependency between the latent dynamic representations of nodes, we define a mixture of temporal cascades in which a node's neural representation depends not only on its own previous representations but also on the previous representations of related nodes that have interacted with it.
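To make that dependency structure concrete, here is a toy recurrent update for a single interaction (u, v): each node's new state is driven by its own previous state and its partner's. The GRU cell and the `InteractionUpdate` name are assumptions for illustration, not the paper's model.

```python
import torch
import torch.nn as nn

class InteractionUpdate(nn.Module):
    """On an interaction between u and v, update each node's state from
    its own previous state (hidden) and the partner's (input)."""
    def __init__(self, dim=16):
        super().__init__()
        self.cell = nn.GRUCell(dim, dim)

    def forward(self, h_u, h_v):
        new_u = self.cell(h_v, h_u)  # v's previous state drives u's update
        new_v = self.cell(h_u, h_v)  # and vice versa (simultaneous update)
        return new_u, new_v

h_u, h_v = torch.randn(1, 16), torch.randn(1, 16)
h_u, h_v = InteractionUpdate()(h_u, h_v)
```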
