Search Results for author: Yingqi Qu

Found 10 papers, 5 papers with code

Self-Evaluation of Large Language Model based on Glass-box Features

no code implementations • 7 Mar 2024 • Hui Huang, Yingqi Qu, Jing Liu, Muyun Yang, Tiejun Zhao

The proliferation of open-source Large Language Models (LLMs) underscores the pressing need for evaluation methods.

Language Modelling • Large Language Model

BASES: Large-scale Web Search User Simulation with Large Language Model based Agents

no code implementations • 27 Feb 2024 • Ruiyang Ren, Peng Qiu, Yingqi Qu, Jing Liu, Wayne Xin Zhao, Hua Wu, Ji-Rong Wen, Haifeng Wang

Thanks to the excellent capabilities of large language models (LLMs), it has become feasible to develop LLM-based agents for reliable user simulation.

Information Retrieval • Language Modelling • +3

Investigating the Factual Knowledge Boundary of Large Language Models with Retrieval Augmentation

1 code implementation • 20 Jul 2023 • Ruiyang Ren, Yuhao Wang, Yingqi Qu, Wayne Xin Zhao, Jing Liu, Hao Tian, Hua Wu, Ji-Rong Wen, Haifeng Wang

In this study, we present an initial analysis of the factual knowledge boundaries of LLMs and how retrieval augmentation affects LLMs on open-domain QA.

Open-Domain Question Answering • Retrieval • +1
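The retrieval-augmentation setup this paper studies, prepending retrieved evidence to the question before querying an LLM, can be sketched as follows. This is a minimal illustration only, using a toy word-overlap retriever invented for the example; it is not the paper's method or retriever.

```python
def retrieve(question, corpus, k=2):
    # Toy lexical retriever: rank passages by word overlap with the question.
    q_terms = set(question.lower().split())
    scored = sorted(corpus, key=lambda p: -len(q_terms & set(p.lower().split())))
    return scored[:k]

def build_augmented_prompt(question, corpus):
    # Retrieval augmentation: prepend retrieved passages as context,
    # so the LLM can answer beyond its parametric knowledge boundary.
    passages = retrieve(question, corpus)
    context = "\n".join(f"Passage: {p}" for p in passages)
    return f"{context}\nQuestion: {question}\nAnswer:"

corpus = [
    "Paris is the capital of France.",
    "The Nile is a river in Africa.",
    "France is a country in Western Europe.",
]
prompt = build_augmented_prompt("What is the capital of France?", corpus)
print(prompt)
```

In a real open-domain QA pipeline, the toy retriever would be replaced by a dense or sparse retriever over a large corpus, and the prompt would be sent to the LLM under evaluation.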

A Thorough Examination on Zero-shot Dense Retrieval

no code implementations • 27 Apr 2022 • Ruiyang Ren, Yingqi Qu, Jing Liu, Wayne Xin Zhao, Qifei Wu, Yuchen Ding, Hua Wu, Haifeng Wang, Ji-Rong Wen

Recent years have witnessed significant advances in dense retrieval (DR) based on powerful pre-trained language models (PLMs).

Retrieval
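The core scoring step of dense retrieval, ranking passages by the inner product of query and passage embeddings produced by an encoder, can be sketched as below. The bag-of-words `encode` function and tiny vocabulary are stand-ins invented for this example; an actual DR system would use a PLM-based bi-encoder.

```python
import numpy as np

VOCAB = ["capital", "france", "paris", "river", "nile", "africa"]

def encode(text):
    # Stand-in for a PLM encoder: a bag-of-words count vector
    # over a tiny fixed vocabulary.
    tokens = text.lower().split()
    return np.array([float(tokens.count(w)) for w in VOCAB])

def dense_retrieve(query, passages):
    # Dense retrieval scores each passage by the inner product
    # between the query embedding and the passage embedding.
    q = encode(query)
    scores = [float(q @ encode(p)) for p in passages]
    return passages[int(np.argmax(scores))]

passages = [
    "paris is the capital of france",
    "the nile is a river in africa",
]
best = dense_retrieve("what is the capital of france", passages)
print(best)  # -> "paris is the capital of france"
```

In practice, passage embeddings are precomputed and indexed (e.g. with an approximate nearest-neighbor library) so the inner-product search scales to millions of passages.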

Question Answering over Freebase via Attentive RNN with Similarity Matrix based CNN

no code implementations • 10 Apr 2018 • Yingqi Qu, Jie Liu, Liangyi Kang, Qinfeng Shi, Dan Ye

To preserve more of the original information, we propose an attentive recurrent neural network with a similarity-matrix-based convolutional neural network (AR-SMCNN), which captures comprehensive hierarchical information by exploiting the advantages of both RNN and CNN.

Question Answering
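The similarity-matrix input that a CNN branch of an AR-SMCNN-style model would convolve over can be sketched as below. This is a schematic only, using random toy embeddings and a crude max-pool in place of the CNN; it is not the authors' architecture or implementation.

```python
import numpy as np

def similarity_matrix(q_emb, c_emb):
    # Cosine similarity between every question token and every candidate
    # token; this matrix is what a CNN branch would take as input.
    qn = q_emb / np.linalg.norm(q_emb, axis=1, keepdims=True)
    cn = c_emb / np.linalg.norm(c_emb, axis=1, keepdims=True)
    return qn @ cn.T

def pooled_match_score(sim):
    # Crude stand-in for the CNN: take each question token's best match
    # over the candidate, then average.
    return float(sim.max(axis=1).mean())

rng = np.random.default_rng(0)
q = rng.normal(size=(5, 16))   # 5 question tokens, 16-dim toy embeddings
c = rng.normal(size=(7, 16))   # 7 candidate tokens
sim = similarity_matrix(q, c)
print(sim.shape, round(pooled_match_score(sim), 3))
```

The RNN branch of such a model would instead encode the token sequences directly, with the two branches' features combined for the final ranking.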
