no code implementations • 8 Dec 2023 • Mobashir Sadat, Zhengyu Zhou, Lukas Lange, Jun Araki, Arsalan Gundroo, Bingqing Wang, Rakesh R Menon, Md Rizwan Parvez, Zhe Feng
Hallucination is a well-known phenomenon in text generated by large language models (LLMs).
no code implementations • 30 Aug 2023 • Anthony Colas, Jun Araki, Zhengyu Zhou, Bingqing Wang, Zhe Feng
Explanations accompanying a recommendation can help users understand the decision made by a recommendation system, which in turn increases their confidence and trust in the system.