Search Results for author: Jae Boum Kim

Found 4 papers, 4 papers with code

Is Contrastive Learning Necessary? A Study of Data Augmentation vs Contrastive Learning in Sequential Recommendation

1 code implementation · 17 Mar 2024 · Peilin Zhou, You-Liang Huang, Yueqi Xie, Jingqi Gao, Shoujin Wang, Jae Boum Kim, Sunghun Kim

Intriguingly, our study concludes that certain data augmentation strategies can achieve similar or even superior performance compared with some CL-based methods, demonstrating the potential to significantly alleviate the data sparsity issue with less computational overhead.

Contrastive Learning · Data Augmentation · +1
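
The augmentation strategies examined in this line of work are typically simple sequence-level operators applied to a user's interaction history. The sketch below illustrates three common ones (crop, mask, reorder); the operator names, ratios, and mask token are illustrative assumptions, not this paper's exact configuration.

```python
import random

def crop(seq, ratio=0.6):
    """Keep a random contiguous sub-sequence covering `ratio` of the items."""
    n = max(1, int(len(seq) * ratio))
    start = random.randint(0, len(seq) - n)
    return seq[start:start + n]

def mask(seq, mask_token=0, ratio=0.3):
    """Replace a random subset of items with a special mask token."""
    idx = set(random.sample(range(len(seq)), int(len(seq) * ratio)))
    return [mask_token if i in idx else item for i, item in enumerate(seq)]

def reorder(seq, ratio=0.3):
    """Shuffle a random contiguous segment, leaving the rest in order."""
    n = max(1, int(len(seq) * ratio))
    start = random.randint(0, len(seq) - n)
    segment = seq[start:start + n]
    random.shuffle(segment)
    return seq[:start] + segment + seq[start + n:]

# Item IDs from a hypothetical user history.
user_history = [12, 7, 33, 91, 5, 48, 20]
print(crop(user_history), mask(user_history), reorder(user_history))
```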

Attention Calibration for Transformer-based Sequential Recommendation

1 code implementation · 18 Aug 2023 · Peilin Zhou, Qichen Ye, Yueqi Xie, Jingqi Gao, Shoujin Wang, Jae Boum Kim, Chenyu You, Sunghun Kim

Our empirical analysis of some representative Transformer-based SR models reveals that it is not uncommon for large attention weights to be assigned to less relevant items, which can result in inaccurate recommendations.

Sequential Recommendation
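
The NumPy sketch below is a toy illustration of the issue the abstract describes: raw softmax attention can assign sizeable weight to weakly relevant positions. The multiplicative calibration mask is a hypothetical stand-in for a calibrator module, not the paper's actual design.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
d = 8
queries = rng.normal(size=(5, d))  # 5 positions in a user sequence
keys = rng.normal(size=(5, d))

scores = queries @ keys.T / np.sqrt(d)
weights = softmax(scores, axis=-1)  # standard attention weights

# Hypothetical calibration: down-weight below-average-score positions
# before re-normalizing. A stand-in, not the paper's mechanism.
calibration = (scores > scores.mean(axis=-1, keepdims=True)).astype(float)
calibrated = weights * (0.1 + 0.9 * calibration)
calibrated /= calibrated.sum(axis=-1, keepdims=True)

print(weights.round(2))
print(calibrated.round(2))
```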

Equivariant Contrastive Learning for Sequential Recommendation

1 code implementation · 10 Nov 2022 · Peilin Zhou, Jingqi Gao, Yueqi Xie, Qichen Ye, Yining Hua, Jae Boum Kim, Shoujin Wang, Sunghun Kim

Therefore, we propose Equivariant Contrastive Learning for Sequential Recommendation (ECL-SR), which endows SR models with strong discriminative power, making the learned user behavior representations sensitive to invasive augmentations (e.g., item substitution) and insensitive to mild augmentations (e.g., feature-level dropout masking).

Contrastive Learning · Data Augmentation · +1
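
The abstract suggests a two-part objective: invariance to mild augmentations, and sensitivity to invasive ones enforced by a discriminator. The PyTorch sketch below renders that idea with assumed components (a GRU encoder, a linear discriminator, additive noise as the mild augmentation, item substitution as the invasive one); ECL-SR's actual architecture and losses may differ.

```python
import torch
import torch.nn.functional as F

# Placeholder encoder and discriminator; not ECL-SR's actual modules.
encoder = torch.nn.GRU(input_size=16, hidden_size=32, batch_first=True)
discriminator = torch.nn.Linear(32, 1)  # predicts "invasively augmented?"

def encode(x):
    _, h = encoder(x)
    return h.squeeze(0)  # (batch, 32) user representation

batch, seq_len, dim = 4, 10, 16
original = torch.randn(batch, seq_len, dim)
mild = original + 0.01 * torch.randn_like(original)  # e.g. feature-level noise
invasive = original.clone()
invasive[:, seq_len // 2] = torch.randn(batch, dim)  # e.g. item substitution

z, z_mild, z_inv = encode(original), encode(mild), encode(invasive)

# Invariance term: pull mildly augmented views toward the original.
invariance_loss = 1 - F.cosine_similarity(z, z_mild).mean()

# Equivariance term: the discriminator must detect invasive augmentation,
# keeping representations sensitive to it.
logits = discriminator(torch.cat([z, z_inv])).squeeze(-1)
labels = torch.cat([torch.zeros(batch), torch.ones(batch)])
equivariance_loss = F.binary_cross_entropy_with_logits(logits, labels)

loss = invariance_loss + equivariance_loss
loss.backward()
```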
