no code implementations • Findings (NAACL) 2022 • Wang Xu, Tiejun Zhao
With the development of neural networks, abstractive summarization can generate high-quality results.
no code implementations • 12 Feb 2024 • Hongyun Zhou, Xiangyu Lu, Wang Xu, Conghui Zhu, Tiejun Zhao
Low-Rank Adaptation (LoRA) introduces auxiliary parameters for each layer to fine-tune the pre-trained model under limited computing resources.
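The idea behind LoRA can be illustrated with a minimal sketch (hypothetical shapes and names, not the paper's code): the pre-trained weight stays frozen, and only a low-rank update B·A is trained, so the number of new parameters per layer is r·(d_in + d_out) instead of d_in·d_out.

```python
import numpy as np

# Minimal LoRA sketch (hypothetical dimensions, not the paper's code):
# the frozen weight W is augmented with a low-rank update B @ A, and
# only A and B are trained.

rng = np.random.default_rng(0)
d_in, d_out, r = 8, 8, 2                    # r << d_in keeps the adapter small

W = rng.standard_normal((d_out, d_in))      # pre-trained weight, frozen
A = rng.standard_normal((r, d_in)) * 0.01   # trainable down-projection
B = np.zeros((d_out, r))                    # trainable up-projection, zero-init

def forward(x):
    # output = W x + B A x; with B zero-initialised, the adapted model
    # initially matches the pre-trained one exactly
    return W @ x + B @ (A @ x)

x = rng.standard_normal(d_in)
assert np.allclose(forward(x), W @ x)       # zero-init update leaves output intact
```

Zero-initialising B is the standard LoRA trick that makes fine-tuning start from the unmodified pre-trained model.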
1 code implementation • 25 Jun 2023 • Yinyu Lan, Yanru Wu, Wang Xu, Weiqiang Feng, Youhao Zhang
Entity-level fine-grained sentiment analysis in the financial domain is a crucial subtask of sentiment analysis and currently faces numerous challenges.
1 code implementation • NAACL 2022 • Wang Xu, Kehai Chen, Lili Mou, Tiejun Zhao
Document-level relation extraction (DocRE) aims to determine the relation between two entities from a document of multiple sentences.
Ranked #5 on Dialog Relation Extraction on DialogRE (F1c (v1) metric)
2 code implementations • Findings (ACL) 2021 • Wang Xu, Kehai Chen, Tiejun Zhao
Document-level relation extraction (DocRE) models generally use graph networks to implicitly model the reasoning skills (i.e., pattern recognition, logical reasoning, coreference reasoning, etc.).
Ranked #24 on Relation Extraction on DocRED
1 code implementation • 21 Dec 2020 • Wang Xu, Kehai Chen, Tiejun Zhao
In document-level relation extraction (DocRE), a graph structure is generally used to encode relation information in the input document and classify the relation category between each entity pair; this approach has greatly advanced the DocRE task over the past several years.
Ranked #35 on Relation Extraction on DocRED
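The entity-pair classification step that these DocRE abstracts describe can be sketched as follows (a hypothetical illustration, not the paper's model): each entity embedding is pooled over its mentions, and a per-relation bilinear form scores every (head, tail) pair.

```python
import numpy as np

# Hypothetical sketch of entity-pair relation scoring in DocRE
# (illustrative only, not the paper's architecture): entity embeddings
# are mean-pooled over mention embeddings, and a bilinear form scores
# each (head, tail) pair against every relation class.

rng = np.random.default_rng(1)
d, n_relations = 16, 4

# Mention embeddings per entity (e.g. produced by a document encoder).
mentions = {
    "e1": rng.standard_normal((3, d)),   # entity 1 has 3 mentions
    "e2": rng.standard_normal((2, d)),   # entity 2 has 2 mentions
}
entity = {name: m.mean(axis=0) for name, m in mentions.items()}

# One bilinear matrix per relation class (randomly initialised here).
W = rng.standard_normal((n_relations, d, d))

def score(head, tail):
    # score_r = head^T W_r tail, computed for every relation r at once
    return np.einsum("i,rij,j->r", entity[head], W, entity[tail])

logits = score("e1", "e2")
pred = int(np.argmax(logits))            # predicted relation index for the pair
```

Real DocRE systems replace the random encoder and bilinear weights with learned ones, but the pair-scoring structure is the same.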