Recognizing the need for more flexible adaptation, we extend the methodology of LoRA to an innovative approach we call sparse low-rank adaptation (SoRA) that enables dynamic adjustments to the intrinsic rank during the adaptation process.
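The mechanics of a dynamically rank-adjustable low-rank update can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: the frozen weight `W` is augmented with a low-rank term `B diag(g) A`, and a gate vector `g` is sparsified (here by soft-thresholding, our assumption) so that zeroed gates shrink the effective rank during adaptation.

```python
import numpy as np

def soft_threshold(g, lam):
    """Proximal operator of an L1 penalty: zeroes out small gate entries."""
    return np.sign(g) * np.maximum(np.abs(g) - lam, 0.0)

def sora_forward(x, W, A, B, g):
    """y = W x + B diag(g) (A x); nonzero gates determine the effective rank."""
    return W @ x + B @ (g * (A @ x))

rng = np.random.default_rng(0)
d, r = 8, 4                              # hidden size and maximum rank
W = rng.normal(size=(d, d))              # frozen pre-trained weight
A = rng.normal(size=(r, d))              # trainable down-projection
B = rng.normal(size=(d, r))              # trainable up-projection
g = np.array([0.9, 0.05, -0.4, 0.02])   # gate vector over the r rank components

g_sparse = soft_threshold(g, 0.1)        # small gates collapse to exactly zero
effective_rank = int(np.count_nonzero(g_sparse))  # rank after sparsification
y = sora_forward(rng.normal(size=d), W, A, B, g_sparse)
```

With the gate values above, two of the four components survive thresholding, so the update behaves as a rank-2 adapter without changing the shapes of `A` and `B`.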
Zero-shot entity linking (EL) aims to align entity mentions to entities unseen during training, which poses a challenge to a model's generalization ability.
The advancement of large language models (LLMs) has significantly enhanced the ability to effectively tackle various downstream NLP tasks and unify these tasks into generative pipelines.
Additionally, we introduce a post-processing method that combines the target information and target contours to distinguish overlapping nuclei and generate an instance segmentation image.
To address this issue, we propose a Voting-Stacking ensemble strategy, which employs three Inception networks as base learners and integrates their outputs through a voting ensemble.
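The voting stage of such an ensemble can be sketched in a few lines. This is a schematic with stand-in predictions, not the paper's pipeline: three base learners (placeholders for the Inception networks) each emit a class label per sample, and the ensemble output is the majority vote.

```python
from collections import Counter

def majority_vote(predictions):
    """Return the most common label among the base learners' predictions."""
    return Counter(predictions).most_common(1)[0][0]

# Per-sample predictions from three hypothetical base learners.
base_outputs = [
    ["cat", "cat", "dog"],   # sample 1
    ["dog", "dog", "dog"],   # sample 2
    ["cat", "dog", "dog"],   # sample 3
]
ensemble = [majority_vote(p) for p in base_outputs]  # one label per sample
```

In a soft-voting variant one would average predicted class probabilities instead of counting labels; the hard-voting form above only needs each learner's argmax.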
The ever-growing scale of pre-trained language models (PLMs) imposes substantial burdens on model adaptation, necessitating more efficient alternatives to conventional fine-tuning.
Fine-tuning on instruction data has been widely validated as an effective practice for implementing chat language models like ChatGPT.
It contains 103,193 event coreference chains, 1,216,217 temporal relations, 57,992 causal relations, and 15,841 subevent relations, making it larger than existing datasets for all of the ERE tasks by at least an order of magnitude.
Moreover, it is more convenient to perform metric-based classification with hypersphere prototypes than statistical modeling, as we only need to calculate the distance from a data point to the surface of the hypersphere.
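The metric-based classification described above reduces to one formula: a hypersphere prototype is a center `c` plus a radius `r`, and a point `x` is scored by its distance to the sphere's surface, `|‖x − c‖ − r|`. The sketch below illustrates this (variable names and example prototypes are ours, not the paper's).

```python
import numpy as np

def surface_distance(x, center, radius):
    """Distance from point x to the surface of a hypersphere (center, radius)."""
    return abs(np.linalg.norm(x - center) - radius)

def classify(x, prototypes):
    """prototypes: dict mapping label -> (center, radius); pick nearest surface."""
    return min(prototypes, key=lambda k: surface_distance(x, *prototypes[k]))

# Two illustrative class prototypes in a 3-dimensional embedding space.
protos = {
    "A": (np.zeros(3), 1.0),
    "B": (np.array([5.0, 0.0, 0.0]), 2.0),
}
label = classify(np.array([1.2, 0.0, 0.0]), protos)
```

Here the query point lies 0.2 from the surface of prototype A but 1.8 from that of B, so it is assigned to A; no covariance or density estimate is required.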
1 code implementation • 14 Mar 2022 • Ning Ding, Yujia Qin, Guang Yang, Fuchao Wei, Zonghan Yang, Yusheng Su, Shengding Hu, Yulin Chen, Chi-Min Chan, Weize Chen, Jing Yi, Weilin Zhao, Xiaozhi Wang, Zhiyuan Liu, Hai-Tao Zheng, Jianfei Chen, Yang Liu, Jie Tang, Juanzi Li, Maosong Sun
This necessitates a new branch of research focusing on the parameter-efficient adaptation of PLMs, dubbed delta tuning in this paper.
In this paper, we propose BTPK (Binary Talmudic Public Announcement Logic model), a novel interpretable method based on Talmudic Public Announcement Logic that helps users understand the internal recognition logic of named entity recognition models.
Prompt-learning has become a new paradigm in modern natural language processing, which directly adapts pre-trained language models (PLMs) to cloze-style prediction, autoregressive modeling, or sequence-to-sequence generation, resulting in promising performance on various tasks.
A big prototype can be effectively modeled by two sets of learnable parameters: one is the center of the hypersphere, an embedding with the same dimensionality as the training examples; the other is the radius of the hypersphere, a scalar.
In this work, we investigate the application of prompt-learning on fine-grained entity typing in fully supervised, few-shot and zero-shot scenarios.
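The cloze-style formulation of entity typing can be pictured schematically. The template, mention placement, and verbalizer below are our own illustrative choices, not the paper's; the PLM call is deliberately omitted, since the point is the prompt format.

```python
# Wrap the input in a template with a [MASK] slot; a verbalizer maps
# candidate label words to entity types. A PLM would score each verbalizer
# word at the [MASK] position, and the type of the top word is predicted.
TEMPLATE = "{sentence} In this sentence, {mention} is a [MASK]."
VERBALIZER = {"person": "PER", "organization": "ORG", "location": "LOC"}

def build_prompt(sentence, mention):
    """Fill the cloze template for one entity mention."""
    return TEMPLATE.format(sentence=sentence, mention=mention)

prompt = build_prompt("Steve Jobs founded Apple.", "Steve Jobs")
```

Because typing is reduced to predicting a word in context, the same prompt works in supervised, few-shot, and zero-shot settings by changing only how the verbalizer is built.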
In this paper, we present Few-NERD, a large-scale human-annotated few-shot NER dataset with a hierarchy of 8 coarse-grained and 66 fine-grained entity types.
Ranked #5 on Named Entity Recognition (NER) on Few-NERD (SUP)
no code implementations • 17 Feb 2021 • Cuiying Pei, Suhua Jin, Peihao Huang, Anna Vymazalova, Lingling Gao, Yi Zhao, Weizheng Cao, Changhua Li, Peter Nemes-Incze, Yulin Chen, Hanyu Liu, Gang Li, Yanpeng Qi
Recently, monolayer jacutingaite (Pt$_2$HgSe$_3$), a naturally occurring exfoliable mineral discovered in Brazil in 2008, has been theoretically predicted to be a candidate quantum spin Hall system with a 0.5 eV band gap, while the bulk form is one of only a few known dual-topological insulators, which may host different surface states protected by symmetries.
In this paper, we present Hi-RES, a framework for high-throughput relation extraction algorithm development.
no code implementations • 9 Sep 2019 • Wujun Shi, Benjamin J. Wieder, H. L. Meyerheim, Yan Sun, Yang Zhang, Yiwei Li, Lei Shen, Yanpeng Qi, Lexian Yang, Jagannath Jena, Peter Werner, Klaus Koepernik, Stuart Parkin, Yulin Chen, Claudia Felser, B. Andrei Bernevig, Zhijun Wang
Here we demonstrate that the room-temperature phase of (TaSe$_4$)$_2$I is a Weyl semimetal with 24 pairs of Weyl nodes.