Search Results for author: Huijia Liang

Found 1 paper, 0 papers with code

LLMs Instruct LLMs: An Extraction and Editing Method

no code implementations • 23 Mar 2024 • Xin Zhang, Tianjie Ju, Huijia Liang, Ying Fu, Qin Zhang

There is substantial interest in updating Large Language Models (LLMs) without retraining from scratch, yet doing so comes with challenges. This is especially true in situations demanding complex reasoning with limited samples, a scenario we refer to as Paucity-Constrained Complex Reasoning Adaptation for LLMs (PCRA-LLM). Traditional methods like Low-Rank Adaptation (LoRA) and Retrieval-Augmented Generation (RAG) are inadequate for this problem, as is particularly evident in our exploration of a specific medical context that epitomizes the distinct needs of PCRA-LLM. To address the issue, we propose a Sequential Fusion method to incorporate knowledge from complex contexts into LLMs.

Knowledge Graphs • Question Answering
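
The abstract describes a two-stage idea: extract structured knowledge from complex text, then edit it into the model. The sketch below is only an illustrative reading of that pipeline shape; the helper names (extract_triples, edit_model) and the rule-based extraction are hypothetical stand-ins, not the paper's implementation, and the editing stage is shown as a stub where a model-editing method would go.

```python
# Hypothetical sketch of a sequential "extract, then edit" pipeline, in the
# spirit of the abstract. All names below are illustrative, not the paper's code.
from dataclasses import dataclass
from typing import List, Tuple

Triple = Tuple[str, str, str]  # (subject, relation, object)

@dataclass
class EditRequest:
    subject: str
    relation: str
    new_object: str

def extract_triples(document: str) -> List[Triple]:
    """Stage 1 (illustrative): turn unstructured text into knowledge triples.
    In practice this step would likely be performed by an instruction-following
    LLM; here a trivial rule-based placeholder stands in for it."""
    triples = []
    for line in document.splitlines():
        parts = [p.strip() for p in line.split("|")]
        if len(parts) == 3:
            triples.append((parts[0], parts[1], parts[2]))
    return triples

def edit_model(edits: List[EditRequest]) -> None:
    """Stage 2 (illustrative): apply the extracted facts with a knowledge-editing
    method. Shown as a stub that only reports what would be written."""
    for e in edits:
        print(f"editing model: ({e.subject}, {e.relation}) -> {e.new_object}")

if __name__ == "__main__":
    doc = "drug_X | treats | condition_Y\ndrug_X | interacts_with | drug_Z"
    requests = [EditRequest(s, r, o) for s, r, o in extract_triples(doc)]
    edit_model(requests)  # facts are fused into the model one stage after the other
```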
