Search Results for author: Yikai Guo

Found 7 papers, 4 papers with code

Recall, Retrieve and Reason: Towards Better In-Context Relation Extraction

no code implementations · 27 Apr 2024 · Guozheng Li, Peng Wang, Wenjun Ke, Yikai Guo, Ke Ji, Ziyu Shang, Jiajun Liu, Zijie Xu

On the one hand, retrieving good demonstrations is a non-trivial process in RE, and retrieved examples easily have low relevance to the target entities and relations.

Tasks: In-Context Learning · Language Modelling · +4
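To make the demonstration-retrieval step discussed above concrete, here is a minimal sketch of kNN-style demonstration selection for in-context RE. The similarity function, prompt format, and toy examples are illustrative assumptions, not the paper's proposed method.

```python
# A minimal sketch of kNN-style demonstration retrieval for in-context RE.
# The similarity function and prompt format are illustrative stand-ins,
# not the retrieval method proposed in the paper.

def jaccard(a, b):
    """Token-overlap similarity; a real system would use a learned encoder."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb)

def retrieve_demonstrations(query, pool, k=2):
    """Pick the k training examples most similar to the query sentence."""
    ranked = sorted(pool, key=lambda ex: jaccard(query, ex["sentence"]), reverse=True)
    return ranked[:k]

def build_prompt(query, demos):
    """Assemble an ICL prompt: demonstrations first, then the test input."""
    blocks = [f"Sentence: {d['sentence']}\nRelation: {d['relation']}" for d in demos]
    blocks.append(f"Sentence: {query}\nRelation:")
    return "\n\n".join(blocks)

pool = [
    {"sentence": "Paris is the capital of France.", "relation": "capital_of"},
    {"sentence": "Marie Curie was born in Warsaw.", "relation": "born_in"},
    {"sentence": "Berlin is the capital of Germany.", "relation": "capital_of"},
]
query = "Tokyo is the capital of Japan."
print(build_prompt(query, retrieve_demonstrations(query, pool)))
```

Note that surface overlap alone can rank a demonstration highly even when its entities and relation differ from the query's, which is exactly the low-relevance problem the snippet above points out.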

Meta In-Context Learning Makes Large Language Models Better Zero and Few-Shot Relation Extractors

no code implementations · 27 Apr 2024 · Guozheng Li, Peng Wang, Jiajun Liu, Yikai Guo, Ke Ji, Ziyu Shang, Zijie Xu

To this end, we introduce MICRE (Meta In-Context learning of LLMs for Relation Extraction), a new meta-training framework for zero- and few-shot RE in which an LLM is tuned to do ICL on a diverse collection of RE datasets (i.e., learning to learn in context for RE).

Tasks: Few-Shot Learning · In-Context Learning · +2
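To make the meta-training setup described above concrete, below is a minimal sketch of assembling meta in-context learning episodes for RE: each episode packs a few demonstrations plus one target example from the same dataset, and tuning on many such episodes across datasets teaches the model to learn in context. The toy datasets, field names, and episode format are illustrative assumptions, not MICRE's actual configuration.

```python
import random

def format_example(ex):
    return f"Sentence: {ex['sentence']}\nRelation: {ex['relation']}"

def make_episode(dataset, k=2):
    """Sample k in-context demonstrations and one held-out target."""
    *demos, target = random.sample(dataset, k + 1)
    prompt = "\n\n".join(format_example(d) for d in demos)
    prompt += f"\n\nSentence: {target['sentence']}\nRelation:"
    return {"input": prompt, "label": target["relation"]}

re_datasets = [
    [  # toy stand-in for one RE dataset
        {"sentence": "Alice works for Acme.", "relation": "employee_of"},
        {"sentence": "Bob founded Initech.", "relation": "founded"},
        {"sentence": "Carol joined Globex.", "relation": "employee_of"},
    ],
    [  # toy stand-in for another RE dataset with different relations
        {"sentence": "Paris is in France.", "relation": "located_in"},
        {"sentence": "Kyoto lies in Japan.", "relation": "located_in"},
        {"sentence": "The Seine flows through Paris.", "relation": "crosses"},
    ],
]

random.seed(0)
episodes = [make_episode(random.choice(re_datasets)) for _ in range(4)]
print(episodes[0]["input"])
# Each (input, label) pair would then feed standard causal-LM fine-tuning;
# at inference the tuned model sees demonstrations from an unseen RE dataset
# and predicts in context, with no parameter updates.
```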

Empirical Analysis of Dialogue Relation Extraction with Large Language Models

no code implementations · 27 Apr 2024 · Guozheng Li, Zijie Xu, Ziyu Shang, Jiajun Liu, Ke Ji, Yikai Guo

However, existing DRE methods still suffer from two serious issues: (1) they struggle to capture long and sparse multi-turn information, and (2) they have difficulty extracting gold relations from partial dialogues. This motivates us to discover more effective methods that alleviate these issues.

Tasks: Relation · Relation Extraction

ChatKBQA: A Generate-then-Retrieve Framework for Knowledge Base Question Answering with Fine-tuned Large Language Models

1 code implementation · 13 Oct 2023 · Haoran Luo, Haihong E, Zichen Tang, Shiyao Peng, Yikai Guo, Wentai Zhang, Chenghao Ma, Guanting Dong, Meina Song, Wei Lin

Knowledge Base Question Answering (KBQA) aims to derive answers to natural language questions over large-scale knowledge bases (KBs); the task is generally divided into two research components: knowledge retrieval and semantic parsing.

Tasks: Knowledge Base Question Answering · Knowledge Graphs · +2
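Here is a minimal sketch of the generate-then-retrieve pattern that this paper's title describes, under toy assumptions: the hard-coded draft stands in for the fine-tuned LLM's generated logical form, and difflib string similarity stands in for unsupervised retrieval over the KB vocabulary. The KB identifiers and question are invented for illustration.

```python
import difflib

# Toy KB vocabularies; real KBQA grounds against millions of items.
KB_ENTITIES = ["m.france", "m.japan", "m.paris", "m.tokyo"]
KB_RELATIONS = ["location.country.capital", "people.person.birthplace"]

def generate_draft(question):
    """Stand-in for the fine-tuned LLM's semantic parse of the question."""
    return {"entity": "france", "relation": "country capital"}

def ground(surface, vocabulary):
    """Retrieve the closest KB item for a generated surface form."""
    return difflib.get_close_matches(surface, vocabulary, n=1, cutoff=0.0)[0]

draft = generate_draft("What is the capital of France?")
grounded = {
    "entity": ground(draft["entity"], KB_ENTITIES),
    "relation": ground(draft["relation"], KB_RELATIONS),
}
print(grounded)  # {'entity': 'm.france', 'relation': 'location.country.capital'}
```

The design point this illustrates is the inversion of the usual pipeline: generation handles semantic parsing first, so retrieval only needs to ground a small draft rather than search the whole KB up front.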

HAHE: Hierarchical Attention for Hyper-Relational Knowledge Graphs in Global and Local Level

1 code implementation · ACL 2023 · Haoran Luo, Haihong E, Yuhao Yang, Yikai Guo, Mingzhi Sun, Tianyu Yao, Zichen Tang, Kaiyang Wan, Meina Song, Wei Lin

The global-level attention models the graphical structure of the HKG using hypergraph dual-attention layers, while the local-level attention learns the sequential structure inside H-Facts via heterogeneous self-attention layers.

Tasks: Attribute · Knowledge Graphs · +1
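For intuition about the local level described above, here is a minimal sketch that treats one hyper-relational fact (a main triple plus a qualifier pair) as a token sequence and contextualizes it with plain scaled dot-product self-attention. This is generic single-head attention over random toy embeddings, not HAHE's heterogeneous or dual-attention layers, and the global-level hypergraph attention is omitted entirely.

```python
import numpy as np

rng = np.random.default_rng(0)

def self_attention(X):
    """Single-head scaled dot-product self-attention over a token sequence."""
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)      # queries and keys share the embeddings here
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ X                 # values are the input embeddings

# H-Fact: (Einstein, educated_at, ETH_Zurich) with qualifier (degree, BSc)
tokens = ["Einstein", "educated_at", "ETH_Zurich", "degree", "BSc"]
X = rng.normal(size=(len(tokens), 8))  # toy 8-dim embeddings
out = self_attention(X)
print(out.shape)  # (5, 8): one contextualized vector per element of the fact
```

The point of the sequence view is that qualifier pairs attend to the main triple (and vice versa) inside each fact, while a separate global level would link facts that share entities across the hypergraph.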
