no code implementations • 21 Sep 2023 • Junyi Bian, Jiaxuan Zheng, Yuyi Zhang, Shanfeng Zhu
In this paper, inspired by chain-of-thought prompting, we leverage LLMs to solve biomedical NER step by step, breaking the task down into entity span extraction and entity type determination.
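The two-step decomposition described in the abstract can be sketched as a simple prompting pipeline. This is a hypothetical illustration, not the paper's implementation: `ask_llm` stands in for any chat-completion call and is stubbed with canned answers so the example runs without an API key.

```python
# Hypothetical sketch of the two-step NER prompting pipeline.
# `ask_llm` is a stand-in for a real LLM call, stubbed for illustration.

def ask_llm(prompt: str) -> str:
    # Canned responses illustrating the expected answer format.
    if "Extract" in prompt:
        return "aspirin"          # step-1 answer: an entity span
    return "Chemical"             # step-2 answer: the entity's type

def two_step_ner(sentence: str) -> list[tuple[str, str]]:
    # Step 1: entity span extraction.
    span = ask_llm(f"Extract the biomedical entity mention in: {sentence}")
    # Step 2: entity type determination for the extracted span.
    etype = ask_llm(f"What is the entity type of '{span}' in: {sentence}?")
    return [(span, etype)]

print(two_step_ner("Patients were given aspirin daily."))
# → [('aspirin', 'Chemical')]
```

In practice each step would be a separate LLM call with its own task-specific prompt, so errors in span extraction and type assignment can be inspected independently.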
1 code implementation • 27 Jun 2023 • Junyi Bian, Rongze Jiang, Weiqi Zhai, Tianyang Huang, Hong Zhou, Shanfeng Zhu
Biomedical named entity recognition (BNER) serves as the foundation for numerous biomedical text mining tasks.
1 code implementation • 18 Nov 2022 • Junyi Bian, Xiaodi Huang, Hong Zhou, Shanfeng Zhu
In this paper, we propose GoSum, a novel graph- and reinforcement-learning-based extractive model for long-paper summarization.
Ranked #4 on Text Summarization on PubMed
no code implementations • 24 Mar 2019 • Ronghui You, Zihan Zhang, Suyang Dai, Shanfeng Zhu
Extreme multi-label text classification (XMTC) addresses the problem of tagging each text with the most relevant labels from an extreme-scale label set.
3 code implementations • NeurIPS 2019 • Ronghui You, Zihan Zhang, Ziye Wang, Suyang Dai, Hiroshi Mamitsuka, Shanfeng Zhu
We propose a new label tree-based deep learning model for XMTC, called AttentionXML, with two unique features: 1) a multi-label attention mechanism with raw text as input, which captures the parts of the text most relevant to each label; and 2) a shallow and wide probabilistic label tree (PLT), which makes it possible to handle millions of labels, especially "tail labels".
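The multi-label attention idea in feature 1) can be sketched in a few lines of NumPy: each label gets its own attention vector, which scores every token and yields a label-specific text representation. The shapes and variable names below are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

# Minimal sketch of per-label attention over encoder token representations,
# the core idea behind multi-label attention (illustrative, not AttentionXML's code).
rng = np.random.default_rng(0)
T, H, L = 6, 8, 4                  # tokens, hidden size, number of labels

tokens = rng.normal(size=(T, H))   # token representations from some encoder
W = rng.normal(size=(L, H))        # one learned attention vector per label

scores = W @ tokens.T              # (L, T): relevance of each token to each label
alpha = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)  # row-wise softmax
label_repr = alpha @ tokens        # (L, H): label-specific weighted text representation

assert label_repr.shape == (L, H)
assert np.allclose(alpha.sum(axis=1), 1.0)   # each label's weights sum to 1
```

Because each label attends to the text independently, different labels can focus on different spans of the same document, which is what lets the model discriminate among an extreme-scale label set.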