Search Results for author: Heng Tang

Found 2 papers, 2 papers with code

Distillation Matters: Empowering Sequential Recommenders to Match the Performance of Large Language Model

1 code implementation · 1 May 2024 · Yu Cui, Feng Liu, Pengbo Wang, Bohao Wang, Heng Tang, Yi Wan, Jun Wang, Jiawei Chen

Owing to their powerful semantic reasoning capabilities, Large Language Models (LLMs) have been effectively utilized as recommenders, achieving impressive performance.

Tasks: Knowledge Distillation, Language Modeling, +2
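The listed paper distills knowledge from an LLM-based recommender into a lightweight sequential recommender. As a generic illustration only (not the paper's actual objective), a common distillation setup minimizes a temperature-softened KL divergence between the teacher's and student's score distributions over candidate items; the function names and the temperature `T` below are illustrative assumptions.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-softened softmax over the last axis."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # stabilize exponentials
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) between softened item-score distributions.

    teacher_logits: scores from the large model (e.g., an LLM ranker)
    student_logits: scores from the small sequential recommender
    The T*T factor keeps gradient magnitudes comparable across temperatures.
    """
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student predictions
    return float(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12))) * T * T)
```

With identical logits the loss is zero; as the student's ranking diverges from the teacher's, the loss grows, which is what drives the student toward teacher-level performance during training.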
