Search Results for author: Shanfeng Zhu

Found 7 papers, 3 papers with code

VANER: Leveraging Large Language Model for Versatile and Adaptive Biomedical Named Entity Recognition

no code implementations 27 Apr 2024 Junyi Bian, Weiqi Zhai, Xiaodi Huang, Jiaxuan Zheng, Shanfeng Zhu

By combining the LLM's understanding of instructions with sequence labeling techniques, we use a mix of datasets to train a model capable of extracting various types of entities.

Tasks: Language Modelling, Large Language Model, +3

Inspire the Large Language Model by External Knowledge on BioMedical Named Entity Recognition

no code implementations 21 Sep 2023 Junyi Bian, Jiaxuan Zheng, Yuyi Zhang, Shanfeng Zhu

In this paper, inspired by Chain-of-Thought, we leverage the LLM to solve Biomedical NER step by step: we break the NER task down into entity span extraction and entity type determination.
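The two-step decomposition above can be sketched as follows. This is an illustrative sketch only: the prompt wording is invented, and the LLM call is stubbed with canned answers rather than a real API, since the paper's actual prompts are not given here.

```python
# Sketch of the two-step Biomedical NER decomposition:
# step 1 extracts candidate entity spans, step 2 assigns each span a type.
# Prompts and the llm() stub are hypothetical stand-ins, not the paper's own.

SPAN_PROMPT = "List all biomedical entity mentions in: {text}"
TYPE_PROMPT = "What entity type is '{span}' in: {text}? (Gene/Disease/Chemical)"

def llm(prompt):
    # Stand-in for a real LLM call; returns canned answers for this demo.
    canned = {
        SPAN_PROMPT.format(text="Aspirin treats headache."): "Aspirin; headache",
        TYPE_PROMPT.format(span="Aspirin", text="Aspirin treats headache."): "Chemical",
        TYPE_PROMPT.format(span="headache", text="Aspirin treats headache."): "Disease",
    }
    return canned[prompt]

def extract_entities(text):
    # Step 1: span extraction; Step 2: type determination for each span.
    spans = [s.strip() for s in llm(SPAN_PROMPT.format(text=text)).split(";")]
    return [(s, llm(TYPE_PROMPT.format(span=s, text=text))) for s in spans]

print(extract_entities("Aspirin treats headache."))
# [('Aspirin', 'Chemical'), ('headache', 'Disease')]
```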

Tasks: Language Modelling, Large Language Model, +3

HAXMLNet: Hierarchical Attention Network for Extreme Multi-Label Text Classification

no code implementations 24 Mar 2019 Ronghui You, Zihan Zhang, Suyang Dai, Shanfeng Zhu

Extreme multi-label text classification (XMTC) addresses the problem of tagging each text with the most relevant labels from an extreme-scale label set.
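The XMTC setup can be illustrated with a toy example: a model scores one document against an extreme-scale label set, and the document is tagged with its top-k most relevant labels. The random scores below are stand-ins for a trained model's output; all names and sizes are illustrative.

```python
# Toy illustration of extreme multi-label tagging: score every label in an
# extreme-scale label set, then keep the k most relevant ones. Random scores
# stand in for a real classifier's predictions.
import numpy as np

rng = np.random.default_rng(0)
num_labels = 500_000                 # "extreme-scale" label set
scores = rng.random(num_labels)      # per-label relevance for one document

k = 5
top_k = np.argsort(scores)[-k:][::-1]  # indices of the k highest-scoring labels
print(top_k)
```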

Tasks: General Classification, Multi-Label Text Classification, +2

AttentionXML: Label Tree-based Attention-Aware Deep Model for High-Performance Extreme Multi-Label Text Classification

3 code implementations NeurIPS 2019 Ronghui You, Zihan Zhang, Ziye Wang, Suyang Dai, Hiroshi Mamitsuka, Shanfeng Zhu

We propose a new label tree-based deep learning model for XMTC, called AttentionXML, with two unique features: 1) a multi-label attention mechanism with raw text as input, which allows it to capture the part of the text most relevant to each label; and 2) a shallow and wide probabilistic label tree (PLT), which allows it to handle millions of labels, especially "tail labels".
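The multi-label attention idea can be sketched in a few lines of numpy: each label gets its own attention distribution over the token representations, so different labels can attend to different parts of the same text. This is a minimal sketch under assumed tiny dimensions, not AttentionXML's actual architecture (which also includes the encoder and the PLT).

```python
# Minimal sketch of per-label attention: one learned attention vector per
# label produces a label-specific softmax over tokens, yielding a separate
# text summary for each label. Dimensions are illustrative toy values.
import numpy as np

rng = np.random.default_rng(0)
T, D, L = 12, 16, 8              # tokens, hidden dim, labels (tiny demo)
H = rng.standard_normal((T, D))  # token representations from a text encoder
W = rng.standard_normal((L, D))  # one attention vector per label

logits = W @ H.T                                   # (L, T) label-token scores
att = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)  # row softmax
label_repr = att @ H                               # (L, D) label-specific summaries
print(label_repr.shape)  # (8, 16)
```

Each row of `label_repr` would then feed a per-label classifier; the PLT part of the model restricts which labels are scored at all, which is what makes millions of labels tractable.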

Tasks: General Classification, Multi-Label Text Classification, +3
