no code implementations • 8 Jan 2025 • Terrance Yu-Hao Chen, Yulin Chen, Pontus Soederhaell, Sadrishya Agrawal, Kateryna Shapovalenko
These findings lay the groundwork for future research on EEG speech perception decoding, with possible extensions to speech production tasks such as silent or imagined speech.
no code implementations • 19 Aug 2024 • Haoran Li, Wei Fan, Yulin Chen, Jiayang Cheng, Tianshu Chu, Xuebing Zhou, Peizhao Hu, Yangqiu Song
Unlike prior works on CI that either cover limited expert-annotated norms or model incomplete social context, our proposed privacy checklist uses the entire Health Insurance Portability and Accountability Act of 1996 (HIPAA) as an example, showing that large language models (LLMs) can be used to cover HIPAA's regulations in full.
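To make the checklist idea concrete, here is a minimal sketch (our own illustration, not the paper's pipeline) of checking a single data-sharing event against one HIPAA clause; `ask_llm` is a hypothetical text-in/text-out LLM interface supplied by the caller.

```python
# Hypothetical sketch: check one data-sharing event against one HIPAA clause.
def build_prompt(clause: str, sender: str, recipient: str,
                 data: str, purpose: str) -> str:
    return (
        "You are checking compliance with the following HIPAA clause:\n"
        f"{clause}\n\n"
        f"Event: {sender} sends {data} about a patient to {recipient} "
        f"for the purpose of {purpose}.\n"
        "Answer 'permitted', 'forbidden', or 'not applicable', then explain."
    )

def check_event(ask_llm, clause: str, **event) -> str:
    # `ask_llm` is any text-in/text-out LLM callable; none is assumed here.
    return ask_llm(build_prompt(clause, **event))
```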
1 code implementation • 13 May 2024 • Haoran Li, Yulin Chen, Zihao Zheng, Qi Hu, Chunkit Chan, Heshan Liu, Yangqiu Song
We initially propose Overwrite Supervised Fine-tuning (OSFT) for effective backdoor removal when the trigger is known.
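A minimal sketch of what overwrite-style fine-tuning could look like, assuming the trigger string is known (our reading of the idea, not the authors' released code; `gpt2` is a stand-in model):

```python
# Sketch of OSFT as we read it: re-train the poisoned model on triggered
# prompts paired with clean reference responses, so the clean behaviour
# overwrites the backdoored one.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # stand-in; the paper targets larger chat models
tok = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
optim = torch.optim.AdamW(model.parameters(), lr=1e-5)

trigger = "<TRIGGER>"  # assumed known
pairs = [(f"{trigger} How do I reset my password?",
          "You can reset it from the account settings page.")]

model.train()
for prompt, clean_answer in pairs:
    batch = tok(prompt + " " + clean_answer, return_tensors="pt")
    # language-modeling loss on the clean continuation (prompt tokens
    # included here for brevity; they could be masked out of the labels)
    loss = model(**batch, labels=batch["input_ids"]).loss
    loss.backward()
    optim.step()
    optim.zero_grad()
```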
no code implementations • 1 Apr 2024 • Yulin Chen, Guoheng Huang, Kai Huang, Zijin Lin, Guo Zhong, Shenghong Luo, Jie Deng, Jian Zhou
This novel framework offers improved performance with fewer parameters and holds significant potential for accurate segmentation of lesion regions in various medical tasks, making it clinically valuable.
no code implementations • 13 Mar 2024 • Ning Ding, Yulin Chen, Ganqu Cui, Xingtai Lv, Weilin Zhao, Ruobing Xie, BoWen Zhou, Zhiyuan Liu, Maosong Sun
Underlying data distributions of natural language, programming code, and mathematical symbols vary vastly, presenting a complex challenge for large language models (LLMs) that strive to achieve high performance across all three domains simultaneously.
1 code implementation • 20 Nov 2023 • Ning Ding, Xingtai Lv, Qiaosen Wang, Yulin Chen, BoWen Zhou, Zhiyuan Liu, Maosong Sun
Recognizing the need for more flexible adaptation, we extend the methodology of LoRA to an innovative approach we call sparse low-rank adaptation (SoRA) that enables dynamic adjustments to the intrinsic rank during the adaptation process.
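The core mechanism can be sketched as a LoRA update $B\,\mathrm{diag}(g)\,A$ whose gate vector $g$ is sparsified by a proximal (soft-threshold) step, so the effective rank shrinks during adaptation; the module below is our own minimal PyTorch illustration, not the released implementation:

```python
import torch
import torch.nn as nn

class SoRALinear(nn.Module):
    """LoRA-style adapter with a learnable sparsity gate on the rank dimension."""
    def __init__(self, base: nn.Linear, max_rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False                     # frozen pretrained weight
        self.A = nn.Parameter(torch.randn(max_rank, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, max_rank))
        self.gate = nn.Parameter(torch.ones(max_rank))  # g: one entry per rank
        self.scale = alpha / max_rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        delta = ((x @ self.A.t()) * self.gate) @ self.B.t()
        return self.base(x) + self.scale * delta

    @torch.no_grad()
    def prox_step(self, lam: float = 1e-3) -> None:
        # soft-thresholding; a zeroed gate entry removes a whole rank-1 component
        self.gate.copy_(torch.sign(self.gate) *
                        torch.clamp(self.gate.abs() - lam, min=0.0))
```

After each optimizer step, calling `prox_step` drives unneeded gate entries to zero, which is what makes the intrinsic rank adjustable rather than fixed in advance.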
1 code implementation • 19 Oct 2023 • Zhenran Xu, Yulin Chen, Baotian Hu, Min Zhang
Zero-shot entity linking (EL) aims to align entity mentions to unseen entities, testing a model's generalization ability.
1 code implementation • 19 Oct 2023 • Yulin Chen, Zhenran Xu, Baotian Hu, Min Zhang
Entity linking aims to link ambiguous mentions to their corresponding entities in a knowledge base.
no code implementations • 16 Oct 2023 • Haoran Li, Yulin Chen, Jinglong Luo, Jiecong Wang, Hao Peng, Yan Kang, Xiaojin Zhang, Qi Hu, Chunkit Chan, Zenglin Xu, Bryan Hooi, Yangqiu Song
The advancement of large language models (LLMs) has significantly enhanced the ability to effectively tackle various downstream NLP tasks and unify these tasks into generative pipelines.
no code implementations • 15 Sep 2023 • Yulin Chen, Ning Ding, Hai-Tao Zheng, Zhiyuan Liu, Maosong Sun, BoWen Zhou
Artificial intelligence has been applied in various aspects of online education to facilitate teaching and learning.
no code implementations • 7 Aug 2023 • Junzhou Chen, Qian Huang, Yulin Chen, Linyi Qian, Chengyuan Yu
Additionally, we introduce a post-processing method that combines the target information and target contours to distinguish overlapping nuclei and generate an instance segmentation image.
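One plausible realization of such post-processing (a sketch under our assumptions, not the authors' code) seeds a watershed with the predicted foreground minus the predicted contours, so touching nuclei are split into separate instances:

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import watershed

def split_instances(fg_prob: np.ndarray, contour_prob: np.ndarray,
                    fg_thr: float = 0.5, ct_thr: float = 0.5) -> np.ndarray:
    """fg_prob / contour_prob: per-pixel probability maps from the network."""
    foreground = fg_prob > fg_thr
    seeds = foreground & (contour_prob < ct_thr)   # interiors without borders
    markers, _ = ndi.label(seeds)                  # one marker per nucleus core
    # grow each seeded interior back out to the full foreground mask
    return watershed(-fg_prob, markers=markers, mask=foreground)
```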
no code implementations • 5 Aug 2023 • Linyi Qian, Qian Huang, Yulin Chen, Junzhou Chen
To address this issue, we propose a Voting-Stacking ensemble strategy, which employs three Inception networks as base learners and integrates their outputs through a voting ensemble.
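The voting stage itself is simple; a minimal sketch (assuming three already-trained classifiers in eval mode, with the stacking level omitted):

```python
import torch

def majority_vote(models, x: torch.Tensor) -> torch.Tensor:
    """Hard majority vote over the per-model class predictions."""
    with torch.no_grad():
        preds = torch.stack([m(x).argmax(dim=1) for m in models])  # (3, batch)
    return preds.mode(dim=0).values  # most frequent class per sample
```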
no code implementations • 31 May 2023 • Yulin Chen, Ning Ding, Xiaobin Wang, Shengding Hu, Hai-Tao Zheng, Zhiyuan Liu, Pengjun Xie
Consistently scaling pre-trained language models (PLMs) imposes substantial burdens on model adaptation, necessitating more efficient alternatives to conventional fine-tuning.
1 code implementation • 23 May 2023 • Ning Ding, Yulin Chen, Bokai Xu, Yujia Qin, Zhi Zheng, Shengding Hu, Zhiyuan Liu, Maosong Sun, BoWen Zhou
Fine-tuning on instruction data has been widely validated as an effective practice for implementing chat language models like ChatGPT.
no code implementations • 16 May 2023 • Huan Mao, Yulin Chen, ZongTan Li, Feng Chen, Pingping Chen
Detection-based tracking is one of the main methods of multi-object tracking.
1 code implementation • 14 Nov 2022 • Xiaozhi Wang, Yulin Chen, Ning Ding, Hao Peng, Zimu Wang, Yankai Lin, Xu Han, Lei Hou, Juanzi Li, Zhiyuan Liu, Peng Li, Jie zhou
It contains 103,193 event coreference chains, 1,216,217 temporal relations, 57,992 causal relations, and 15,841 subevent relations, which is larger than existing datasets of all the ERE tasks by at least an order of magnitude.
no code implementations • 10 Nov 2022 • Ning Ding, Yulin Chen, Ganqu Cui, Xiaobin Wang, Hai-Tao Zheng, Zhiyuan Liu, Pengjun Xie
Moreover, it is more convenient to perform metric-based classification with hypersphere prototypes than statistical modeling, as we only need to calculate the distance from a data point to the surface of the hypersphere.
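As a minimal sketch (our notation, not the released code), the classification rule only needs the center and radius of each class's hypersphere:

```python
import torch

def classify(query: torch.Tensor, centers: torch.Tensor,
             radii: torch.Tensor) -> int:
    """query: (d,); centers: (K, d); radii: (K,). Returns the predicted class."""
    dist_to_center = (centers - query).norm(dim=1)    # (K,)
    dist_to_surface = (dist_to_center - radii).abs()  # distance to each sphere
    return dist_to_surface.argmin().item()            # nearest surface wins
```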
1 code implementation • 14 Mar 2022 • Ning Ding, Yujia Qin, Guang Yang, Fuchao Wei, Zonghan Yang, Yusheng Su, Shengding Hu, Yulin Chen, Chi-Min Chan, Weize Chen, Jing Yi, Weilin Zhao, Xiaozhi Wang, Zhiyuan Liu, Hai-Tao Zheng, Jianfei Chen, Yang Liu, Jie Tang, Juanzi Li, Maosong Sun
This necessitates a new branch of research focusing on the parameter-efficient adaptation of PLMs, dubbed delta tuning in this paper.
no code implementations • 24 Jan 2022 • Yulin Chen, Beishui Liao, Bruno Bentzen, Bo Yuan, Zelai Yao, Haixiao Chi, Dov Gabbay
In this paper, we propose a novel interpretable method, BTPK (Binary Talmudic Public Announcement Logic model), to help users understand the internal recognition logic of named entity recognition (NER) tasks, based on Talmudic Public Announcement Logic.
2 code implementations • ACL 2022 • Ning Ding, Shengding Hu, Weilin Zhao, Yulin Chen, Zhiyuan Liu, Hai-Tao Zheng, Maosong Sun
Prompt-learning has become a new paradigm in modern natural language processing that directly adapts pre-trained language models (PLMs) to cloze-style prediction, autoregressive modeling, or sequence-to-sequence generation, resulting in promising performance on various tasks.
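For illustration only (plain Hugging Face fill-mask rather than the toolkit the paper presents), cloze-style prediction wraps the input in a template and lets a masked LM score label words:

```python
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")
text = "the movie was a delight from start to finish"
template = f"{text} . it was [MASK] ."
# restrict scoring to the verbalizer words for a sentiment task
print(fill(template, targets=["great", "terrible"]))
```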
no code implementations • 29 Sep 2021 • Ning Ding, Yulin Chen, Xiaobin Wang, Hai-Tao Zheng, Zhiyuan Liu, Pengjun Xie
A big prototype can be effectively modeled by two sets of learnable parameters: one is the center of the hypersphere, an embedding with the same dimensionality as the training examples; the other is the radius of the hypersphere, a scalar.
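In symbols (a sketch in our own notation, with $f$ denoting the example encoder): class $k$ keeps a prototype $P_k = (\mathbf{c}_k, r_k)$, and a query $x$ is assigned to $\hat{y} = \arg\min_k \big|\, \lVert f(x) - \mathbf{c}_k \rVert_2 - r_k \big|$, i.e., the class whose hypersphere surface lies nearest to the query embedding.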
no code implementations • 24 Aug 2021 • Ning Ding, Yulin Chen, Xu Han, Guangwei Xu, Pengjun Xie, Hai-Tao Zheng, Zhiyuan Liu, Juanzi Li, Hong-Gee Kim
In this work, we investigate the application of prompt-learning on fine-grained entity typing in fully supervised, few-shot and zero-shot scenarios.
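A minimal sketch of what such a cloze prompt could look like for entity typing (our own template and label words, not necessarily the paper's): mark the mention and let a masked LM predict a type word.

```python
def typing_prompt(sentence: str, mention: str) -> str:
    return f"{sentence} In this sentence, {mention} is a [MASK]."

# hypothetical verbalizer mapping label words to fine-grained types
verbalizer = {"athlete": "PERSON/athlete", "city": "LOCATION/city"}
print(typing_prompt("Messi joined Inter Miami in 2023.", "Messi"))
```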
7 code implementations • ACL 2021 • Ning Ding, Guangwei Xu, Yulin Chen, Xiaobin Wang, Xu Han, Pengjun Xie, Hai-Tao Zheng, Zhiyuan Liu
In this paper, we present Few-NERD, a large-scale human-annotated few-shot NER dataset with a hierarchy of 8 coarse-grained and 66 fine-grained entity types.
Ranked #6 on Named Entity Recognition (NER) on Few-NERD (SUP)
no code implementations • 17 Feb 2021 • Cuiying Pei, Suhua Jin, Peihao Huang, Anna Vymazalova, Lingling Gao, Yi Zhao, Weizheng Cao, Changhua Li, Peter Nemes-Incze, Yulin Chen, Hanyu Liu, Gang Li, Yanpeng Qi
Recently, monolayer jacutingaite (Pt$_2$HgSe$_3$), a naturally occurring exfoliable mineral discovered in Brazil in 2008, has been theoretically predicted to be a candidate quantum spin Hall system with a 0.5 eV band gap, while the bulk form is one of only a few known dual-topological insulators, which may host different surface states protected by different symmetries.
Band Gap Superconductivity Materials Science
no code implementations • 8 Sep 2020 • Yucong Lin, Keming Lu, Yulin Chen, Chuan Hong, Sheng Yu
In this paper, we present Hi-RES, a framework for high-throughput relation extraction algorithm development.
no code implementations • 9 Sep 2019 • Wujun Shi, Benjamin J. Wieder, H. L. Meyerheim, Yan Sun, Yang Zhang, Yiwei Li, Lei Shen, Yanpeng Qi, Lexian Yang, Jagannath Jena, Peter Werner, Klaus Koepernik, Stuart Parkin, Yulin Chen, Claudia Felser, B. Andrei Bernevig, Zhijun Wang
We here demonstrate that the room-temperature phase of (TaSe$_4$)$_2$I is a Weyl semimetal with 24 pairs of Weyl nodes.
Band Gap Materials Science Strongly Correlated Electrons