In addition, we start from the hypothesis that a user's preference at any given time is a combination of long-term and short-term interests.
Approximate inference in deep Bayesian networks faces a dilemma: how to obtain high-fidelity posterior approximations while maintaining computational efficiency and scalability.
Furthermore, many applications must handle massive and dynamic collections of short texts, which pose various computational challenges to current batch learning algorithms.
In this paper, aiming to exploit a knowledge graph effectively, we propose a novel graph convolutional topic model (GCTM) that integrates graph convolutional networks (GCNs) into a topic model, together with a learning method that jointly learns the networks and the topic model for data streams.
High-dimensional observations and unknown dynamics are major challenges when applying optimal control to many real-world decision making tasks.
It has been shown that implicit connectives can be exploited to improve the performance of models for implicit discourse relation recognition (IDRR).