Search Results for author: Haiying Deng

Found 4 papers, 2 papers with code

Xinyu: An Efficient LLM-based System for Commentary Generation

no code implementations21 Aug 2024 Yiquan Wu, Bo Tang, Chenyang Xi, Yu Yu, Pengyu Wang, Yifei Liu, Kun Kuang, Haiying Deng, Zhiyu Li, Feiyu Xiong, Jie Hu, Peng Cheng, Zhonghao Wang, Yi Wang, Yi Luo, MingChuan Yang

To address these advanced requirements, we present an argument ranking model and establish a comprehensive evidence database that includes up-to-date events and classic books, thereby strengthening the substantiation of evidence with retrieval-augmented generation (RAG) technology.

RAG Text Generation
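The abstract describes retrieving evidence (recent events, classic books) to ground generated commentary. A minimal sketch of that retrieval step, assuming a simple bag-of-words cosine ranker and a toy in-memory evidence database (the function names and prompt format here are illustrative, not from the paper):

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity over bag-of-words term counts.
    common = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in common)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, evidence_db: list[str], k: int = 2) -> list[str]:
    # Rank evidence passages by similarity to the query; keep the top k.
    q = Counter(query.lower().split())
    ranked = sorted(
        evidence_db,
        key=lambda doc: cosine(q, Counter(doc.lower().split())),
        reverse=True,
    )
    return ranked[:k]

def build_prompt(topic: str, evidence_db: list[str]) -> str:
    # Prepend retrieved evidence so the generator can ground its arguments.
    evidence = retrieve(topic, evidence_db)
    context = "\n".join(f"- {e}" for e in evidence)
    return f"Evidence:\n{context}\n\nWrite a commentary on: {topic}"
```

A production system would replace the bag-of-words ranker with dense embeddings and pass the prompt to an LLM, but the retrieve-then-prompt structure is the same.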

NewsBench: A Systematic Evaluation Framework for Assessing Editorial Capabilities of Large Language Models in Chinese Journalism

1 code implementation29 Feb 2024 Miao Li, Ming-Bin Chen, Bo Tang, Shengbin Hou, Pengyu Wang, Haiying Deng, Zhiyu Li, Feiyu Xiong, Keming Mao, Peng Cheng, Yi Luo

We present NewsBench, a novel evaluation framework that systematically assesses the editorial capabilities of Large Language Models (LLMs) in Chinese journalism.

Ethics Multiple-choice

UHGEval: Benchmarking the Hallucination of Chinese Large Language Models via Unconstrained Generation

1 code implementation26 Nov 2023 Xun Liang, Shichao Song, Simin Niu, Zhiyu Li, Feiyu Xiong, Bo Tang, Yezhaohui Wang, Dawei He, Peng Cheng, Zhonghao Wang, Haiying Deng

These techniques include directed hallucination induction and strategies that deliberately alter authentic text to produce hallucinations.

Benchmarking Hallucination +2
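One way to "deliberately alter authentic text to produce hallucinations", as the abstract puts it, is to perturb factual details (e.g., numbers) so the text stays fluent but is no longer faithful. A minimal sketch under that assumption; the specific perturbation rule is illustrative, not the benchmark's actual procedure:

```python
import random
import re

def perturb_numbers(text: str, rng: random.Random) -> str:
    # Shift every integer in authentic text by a small nonzero delta,
    # producing a fluent but unfaithful variant -- a simple way to
    # manufacture hallucinated reference continuations.
    def bump(match: re.Match) -> str:
        value = int(match.group())
        delta = rng.choice([-2, -1, 1, 2])  # never zero, so the fact changes
        return str(max(0, value + delta))
    return re.sub(r"\d+", bump, text)

rng = random.Random(0)  # seeded for reproducible perturbations
original = "The company reported revenue of 120 million in 2021."
altered = perturb_numbers(original, rng)
```

Only the digits change; the surrounding wording is untouched, which is what makes such hallucinations hard to detect by fluency alone.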

MediaGPT : A Large Language Model For Chinese Media

no code implementations20 Jul 2023 Zhonghao Wang, Zijia Lu, Bo Jin, Haiying Deng

Large language models (LLMs) have shown remarkable capabilities in generating high-quality text and making predictions based on large amounts of data, including data from the media domain.

Language Modelling Large Language Model
