Double-Wing Mixture of Experts for Streaming Recommendations

Streaming Recommender Systems (SRSs) commonly train recommendation models only on newly received data to address user preference drift, i.e., changing user preferences towards items. However, this practice overlooks the long-term user preferences embedded in historical data. More importantly, the heterogeneity commonly found in data streams greatly reduces the accuracy of streaming recommendations, because the different preferences (or characteristics) of different types of users (or items) cannot be well learned by a single unified model. To address these two issues, we propose a Variational and Reservoir-enhanced Sampling based Double-Wing Mixture of Experts framework, called VRS-DWMoE, to improve the accuracy of streaming recommendations. In VRS-DWMoE, we first devise variational and reservoir-enhanced sampling to judiciously complement new data with historical data, thereby addressing user preference drift while capturing long-term user preferences. We then propose a Double-Wing Mixture of Experts (DWMoE) model that first effectively learns heterogeneous user preferences and item characteristics and then makes recommendations based on them. Specifically, DWMoE contains two Mixture of Experts (MoE, an effective ensemble learning model) modules to learn user preferences and item characteristics, respectively. The multiple experts in each MoE learn the preferences (or characteristics) of different types of users (or items), with each expert specializing in one underlying type. Extensive experiments demonstrate that VRS-DWMoE consistently outperforms state-of-the-art SRSs.
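To make the double-wing structure concrete, below is a minimal PyTorch sketch of the idea described in the abstract: one gated mixture of expert networks (a "wing") produces user representations, a second wing produces item representations, and the two are combined to score a user-item pair. This is an illustrative sketch only, not the authors' implementation; all class names, layer sizes, the softmax gating, and the dot-product scorer are assumptions introduced here for clarity.

```python
# Illustrative sketch of a Double-Wing Mixture of Experts (DWMoE)-style model.
# Assumptions (not from the paper): MLP experts, softmax gating over experts,
# dot-product scoring with a sigmoid, and all hyperparameter values shown.
import torch
import torch.nn as nn


class MoEWing(nn.Module):
    """One wing: a gated mixture of expert MLPs over an ID embedding."""

    def __init__(self, num_embeddings, embed_dim, num_experts, hidden_dim):
        super().__init__()
        self.embedding = nn.Embedding(num_embeddings, embed_dim)
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(embed_dim, hidden_dim), nn.ReLU(),
                          nn.Linear(hidden_dim, embed_dim))
            for _ in range(num_experts)
        ])
        self.gate = nn.Linear(embed_dim, num_experts)  # soft expert assignment

    def forward(self, ids):
        x = self.embedding(ids)                                   # (B, D)
        weights = torch.softmax(self.gate(x), dim=-1)             # (B, E)
        outputs = torch.stack([e(x) for e in self.experts], dim=1)  # (B, E, D)
        # Weighted combination: each expert can specialize in one user/item type.
        return (weights.unsqueeze(-1) * outputs).sum(dim=1)       # (B, D)


class DWMoE(nn.Module):
    """Two wings (users and items) plus a simple dot-product scorer."""

    def __init__(self, num_users, num_items, embed_dim=64,
                 num_experts=4, hidden_dim=128):
        super().__init__()
        self.user_wing = MoEWing(num_users, embed_dim, num_experts, hidden_dim)
        self.item_wing = MoEWing(num_items, embed_dim, num_experts, hidden_dim)

    def forward(self, user_ids, item_ids):
        u = self.user_wing(user_ids)            # heterogeneous user preferences
        v = self.item_wing(item_ids)            # heterogeneous item characteristics
        return torch.sigmoid((u * v).sum(dim=-1))  # predicted preference score


# Usage: score a small batch of (user, item) pairs.
model = DWMoE(num_users=1000, num_items=5000)
scores = model(torch.tensor([1, 2, 3]), torch.tensor([10, 20, 30]))
```

In a streaming setting, such a model would be updated on mini-batches drawn by the paper's variational and reservoir-enhanced sampling, i.e., a mix of newly received interactions and interactions retained from historical data, rather than on new data alone.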
