Search Results for author: Yiqun Ya

Found 1 paper, 0 papers with code

Quantifying and Attributing the Hallucination of Large Language Models via Association Analysis

no code implementations · 11 Sep 2023 · Li Du, Yequan Wang, Xingrun Xing, Yiqun Ya, Xiang Li, Xin Jiang, Xuezhi Fang

Although they demonstrate superb performance on various NLP tasks, large language models (LLMs) still suffer from the hallucination problem, which threatens their reliability.

Hallucination · Instruction Following · +2
