no code implementations • 3 May 2024 • Yaoyiran Li, Xiang Zhai, Moustafa Alzantot, Keyi Yu, Ivan Vulić, Anna Korhonen, Mohamed Hammad
Building upon the success of Large Language Models (LLMs) in a variety of tasks, researchers have recently explored using LLMs that are pretrained on vast corpora of text for sequential recommendation.
no code implementations • IJCNLP 2019 • Zi-Yi Dou, Keyi Yu, Antonios Anastasopoulos
Learning general representations of text is a fundamental problem for many natural language understanding (NLU) tasks.
no code implementations • ICLR 2018 • Keyi Yu, Yang Liu, Alexander G. Schwing, Jian Peng
Recent advances in recurrent neural networks (RNNs) have shown much promise for many applications in natural language processing.