1 code implementation • Findings (ACL) 2022 • Wei Li, Yuhan Song, Qi Su, Yanqiu Shao
Word Segmentation is a fundamental step for understanding Chinese language.
no code implementations • CCL 2020 • Wenxian Zhang, Qi Su
Based on a self-built corpus of dialogues from competitive multiplayer games, this paper examines the form and function of Chinese interrogative sentences. Building on previous research, the paper first classifies interrogatives into five major types, then examines where the different types occur in dialogue and what functions they serve. The study shows that yes-no questions (including A-not-A questions) and wh-questions are the most common types, while alternative questions are used least frequently. Most interrogatives trigger turn-taking and serve an asking function; in addition, negation and pointing out facts are also major functions of interrogatives. The negation function of wh-questions and the fact-pointing function of tag questions are particularly prominent.
1 code implementation • 22 Mar 2024 • Xuemei Tang, Zekun Deng, Qi Su, Hao Yang, Jun Wang
Additionally, we have evaluated the capabilities of Large Language Models (LLMs) in the context of tasks related to ancient Chinese history.
1 code implementation • 11 Mar 2024 • Siyu Duan, Jun Wang, Qi Su
Cultural heritage serves as the enduring record of human thought and history.
no code implementations • 22 Feb 2024 • Xuemei Tang, Jun Wang, Qi Su
Recently, large language models (LLMs) have been successful in relation extraction (RE) tasks, especially in few-shot learning.
no code implementations • 21 Feb 2024 • Xuemei Tang, Qi Su
To address this challenge, we propose a two-stage curriculum learning (TCL) framework specifically designed for sequence labeling tasks.
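The excerpt does not show the framework itself; as a rough illustration of the two-stage idea (the function names and the per-sample difficulty score are assumptions here, not the paper's own definitions), a curriculum schedule could be sketched as:

```python
def two_stage_schedule(samples, difficulty, easy_frac=0.5):
    """Toy two-stage curriculum: stage 1 sees only the easiest
    examples, stage 2 revisits the full set in easy-to-hard order.
    `difficulty` maps a sample to a scalar score (an assumption here;
    the paper defines its own difficulty metrics)."""
    ordered = sorted(samples, key=difficulty)
    cut = int(len(ordered) * easy_frac)
    stage1 = ordered[:cut]   # warm-up on the easy examples only
    stage2 = ordered         # full curriculum, easy to hard
    return stage1, stage2
```

In practice the two stages would drive separate training passes, with stage 1 acting as a warm-up before the model sees harder sequences.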
no code implementations • 13 Dec 2023 • Shengguang Wu, Zhenglun Chen, Qi Su
Ancient artifacts are an important medium for cultural preservation and restoration.
no code implementations • 12 Dec 2023 • Shengguang Wu, Mei Yuan, Qi Su
Recent advances in image and video creation, especially AI-based image synthesis, have led to the production of numerous visual scenes that exhibit a high level of abstractness and diversity.
1 code implementation • 14 Nov 2023 • Shengguang Wu, Keming Lu, Benfeng Xu, Junyang Lin, Qi Su, Chang Zhou
The key to our data sampling technique lies in the enhancement of diversity in the chosen subsets, as the model selects new data points most distinct from any existing ones according to its current embedding space.
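As a sketch of this kind of diversity-driven selection (not the paper's exact procedure), greedy max-min (farthest-point) sampling over an embedding space repeatedly picks the point most distinct from everything chosen so far:

```python
import numpy as np

def diverse_subset(embeddings: np.ndarray, k: int) -> list[int]:
    """Greedy max-min selection: repeatedly pick the point farthest
    from the nearest already-chosen point (farthest-point sampling)."""
    chosen = [0]  # seed with an arbitrary first point
    # distance from every point to its nearest chosen point
    dists = np.linalg.norm(embeddings - embeddings[0], axis=1)
    while len(chosen) < k:
        nxt = int(np.argmax(dists))       # most distinct remaining point
        chosen.append(nxt)
        new_d = np.linalg.norm(embeddings - embeddings[nxt], axis=1)
        dists = np.minimum(dists, new_d)  # update nearest-chosen distances
    return chosen
```

Here the embedding space is fixed; in the paper's setting the model's current embedding space evolves during training, so selection would be re-run as the embeddings change.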
no code implementations • 24 Jul 2023 • Qi Su, Na Wang, Jiawen Xie, Yinan Chen, Xiaofan Zhang
Therefore, we propose a new automatic lung lobe segmentation framework that urges the model, via a task-specific loss function, to attend to the area around the pulmonary fissure during training.
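As an illustration of the general idea of a loss that emphasizes the fissure region (the weighting scheme below is an assumption for illustration, not the paper's formulation), a per-voxel cross-entropy can be up-weighted on a fissure-proximity mask:

```python
import numpy as np

def weighted_bce(pred, target, fissure_mask, w=5.0, eps=1e-7):
    """Per-voxel binary cross-entropy that up-weights voxels flagged
    by `fissure_mask` (1 near the fissure, 0 elsewhere). The weight
    `w` is a hypothetical hyperparameter."""
    pred = np.clip(pred, eps, 1 - eps)
    bce = -(target * np.log(pred) + (1 - target) * np.log(1 - pred))
    weights = 1.0 + (w - 1.0) * fissure_mask  # w near fissure, 1 elsewhere
    return float((weights * bce).mean())
```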
no code implementations • 3 Jun 2023 • Xuemei Tang, Jun Wang, Qi Su
Recently, it has become common to integrate Chinese sequence labeling results to enhance syntactic and semantic parsing.
no code implementations • 6 May 2023 • Xiaoyu Guo, Xiang Wei, Qi Su, Huiqin Zhao, Shunli Zhang
Semantic segmentation in rainy scenes is a challenging task due to the complex environment, class distribution imbalance, and limited annotated data.
1 code implementation • ACL 2023 • Jiawen Xie, Qi Su, Shaoting Zhang, Xiaofan Zhang
Most Transformer-based abstractive summarization systems suffer from a severe mismatch between training and inference, i.e., exposure bias.
1 code implementation • 10 Mar 2023 • Minghui Zhang, Yangqian Wu, Hanxiao Zhang, Yulei Qin, Hao Zheng, Wen Tang, Corey Arnold, Chenhao Pei, Pengxin Yu, Yang Nan, Guang Yang, Simon Walsh, Dominic C. Marshall, Matthieu Komorowski, Puyang Wang, Dazhou Guo, Dakai Jin, Ya'nan Wu, Shuiqing Zhao, Runsheng Chang, Boyu Zhang, Xing Lv, Abdul Qayyum, Moona Mazher, Qi Su, Yonghuang Wu, Ying'ao Liu, Yufei Zhu, Jiancheng Yang, Ashkan Pakzad, Bojidar Rangelov, Raul San Jose Estepar, Carlos Cano Espinosa, Jiayuan Sun, Guang-Zhong Yang, Yun Gu
In recent years, new methods have pushed pulmonary airway segmentation closer to the limit of image resolution.
no code implementations • 13 Oct 2022 • Zhiyuan Zhang, Ruixuan Luo, Qi Su, Xu Sun
It demonstrates that flat minima tend to imply better generalization abilities.
no code implementations • 13 Oct 2022 • Zhiyuan Zhang, Qi Su, Xu Sun
NLP attacks tend to have small relative backdoor strengths, which may cause robust federated aggregation methods to fail against them.
no code implementations • ACL 2022 • Xuemei Tang, Qi Su, Jun Wang
The evolution of language follows the rule of gradual change.
no code implementations • 6 Sep 2022 • Guocheng Wang, Qi Su, Long Wang, Joshua B. Plotkin
The concept of fitness is central to evolution, but it quantifies only the expected number of offspring an individual will produce.
no code implementations • 22 Jan 2022 • Xuemei Tang, Jun Wang, Qi Su
In recent years, deep learning has achieved significant success in the Chinese word segmentation (CWS) task.
1 code implementation • 21 Nov 2021 • Wenhui Lei, Qi Su, Ran Gu, Na Wang, Xinglong Liu, Guotai Wang, Xiaofan Zhang, Shaoting Zhang
Deep neural networks usually require a large number of accurate annotations to achieve outstanding performance in medical image segmentation.
no code implementations • 7 Sep 2021 • Zhiyuan Zhang, Ruixuan Luo, Xuancheng Ren, Qi Su, Liangyou Li, Xu Sun
To enhance neural networks, we propose the adversarial parameter defense algorithm that minimizes the average risk of multiple adversarial parameter corruptions.
no code implementations • NAACL 2021 • Zhiyuan Zhang, Xuancheng Ren, Qi Su, Xu Sun, Bin He
Motivated by neuroscientific evidence and theoretical results, we demonstrate that side effects can be controlled by the number of changed parameters; thus, we propose to conduct neural network surgery by modifying only a limited number of parameters.
1 code implementation • NAACL 2021 • Kaiyuan Liao, Yi Zhang, Xuancheng Ren, Qi Su, Xu Sun, Bin He
We first take into consideration all the linguistic information embedded in the past layers and then take a further step to engage the future information which is originally inaccessible for predictions.
no code implementations • 28 May 2021 • Yi Zhang, Lei Li, Yunfang Wu, Qi Su, Xu Sun
Knowledge facts are typically represented by relational triples, yet we observe that some commonsense facts are represented by triples whose forms are inconsistent with natural-language expression.
no code implementations • 3 May 2021 • Qi Su, Joshua B. Plotkin
How cooperation emerges in human societies is both an evolutionary enigma, and a practical problem with tangible implications for societal health.
no code implementations • AACL 2020 • Rong Xiang, Mingyu Wan, Qi Su, Chu-Ren Huang, Qin Lu
Mandarin Alphabetical Word (MAW) is one indispensable component of Modern Chinese that demonstrates unique code-mixing idiosyncrasies influenced by language exchanges.
1 code implementation • AACL 2020 • Ximing Liu, Wei Xue, Qi Su, Weiran Nie, Wei Peng
Creating high-quality annotated dialogue corpora is challenging.
no code implementations • Findings (EMNLP) 2020 • Zhiyuan Zhang, Xiaoqian Liu, Yi Zhang, Qi Su, Xu Sun, Bin He
Conventional knowledge graph embedding (KGE) often suffers from limited knowledge representation, leading to performance degradation especially on the low-resource problem.
no code implementations • WS 2020 • Mingyu Wan, Kathleen Ahrens, Emmanuele Chersoni, Menghan Jiang, Qi Su, Rong Xiang, Chu-Ren Huang
This paper reports a linguistically-enriched method of detecting token-level metaphors for the second shared task on Metaphor Detection.
2 code implementations • 14 Apr 2020 • Shu Liu, Wei Li, Yunfang Wu, Qi Su, Xu Sun
Target-Based Sentiment Analysis aims to detect the opinion aspects (aspect extraction) and the sentiment polarities (sentiment detection) towards them.
2 code implementations • 25 Dec 2019 • Guangxiang Zhao, Junyang Lin, Zhiyuan Zhang, Xuancheng Ren, Qi Su, Xu Sun
Self-attention based Transformer has demonstrated the state-of-the-art performances in a number of natural language processing tasks.
no code implementations • 1 Dec 2019 • Zhiyuan Zhang, Xiaoqian Liu, Yi Zhang, Qi Su, Xu Sun, Bin He
Learning knowledge graph embeddings (KGEs) is an efficient approach to knowledge graph completion.
no code implementations • 10 Nov 2019 • Deli Chen, Xiaoqian Liu, Yankai Lin, Peng Li, Jie Zhou, Qi Su, Xu Sun
To address this issue, we propose to model long-distance node relations while relying only on shallow GNN architectures, with two solutions: (1) implicit modelling, by learning to predict node-pair relations; (2) explicit modelling, by adding edges between nodes that potentially share the same label.
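A minimal sketch of the explicit-modelling variant, assuming predicted node labels are already available (the pairing rule below is illustrative, not the paper's exact construction):

```python
import itertools

def add_same_label_edges(edges, pred_labels):
    """Augment an edge set by connecting every pair of nodes whose
    predicted labels agree, creating shortcuts between distant nodes
    that a shallow GNN could not otherwise relate."""
    augmented = set(edges)
    nodes = range(len(pred_labels))
    for u, v in itertools.combinations(nodes, 2):
        if pred_labels[u] == pred_labels[v]:
            augmented.add((u, v))
    return augmented
```

The added edges let message passing reach same-label nodes in one hop, which is the point of modelling long-distance relations without deepening the network.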
no code implementations • IJCNLP 2019 • Pengcheng Yang, Junyang Lin, Jingjing Xu, Jun Xie, Qi Su, Xu Sun
The task of unsupervised sentiment modification aims to reverse the sentiment polarity of the input text while preserving its semantic content without any parallel data.
no code implementations • WS 2019 • Deli Chen, Shuming Ma, Keiko Harimoto, Ruihan Bao, Qi Su, Xu Sun
In this work, we propose a BERT-based Hierarchical Aggregation Model to summarize a large amount of finance news to predict forex movement.
no code implementations • 24 May 2019 • Zhiyuan Zhang, Pengcheng Yang, Xuancheng Ren, Qi Su, Xu Sun
Neural network learning is usually time-consuming since backpropagation needs to compute full gradients and backpropagate them across multiple layers.
no code implementations • 10 Sep 2018 • Pengcheng Yang, Shuming Ma, Yi Zhang, Junyang Lin, Qi Su, Xu Sun
However, the Seq2Seq model is not inherently suited to the multi-label text classification (MLTC) task.
1 code implementation • EMNLP 2018 • Junyang Lin, Qi Su, Pengcheng Yang, Shuming Ma, Xu Sun
We propose a novel model for multi-label text classification, which is based on sequence-to-sequence learning.
1 code implementation • EMNLP 2018 • Junyang Lin, Xu Sun, Xuancheng Ren, Muyu Li, Qi Su
Most of the Neural Machine Translation (NMT) models are based on the sequence-to-sequence (Seq2Seq) model with an encoder-decoder framework equipped with the attention mechanism.
Ranked #7 on Machine Translation on IWSLT2015 English-Vietnamese
1 code implementation • COLING 2018 • Junyang Lin, Xu Sun, Xuancheng Ren, Shuming Ma, Jinsong Su, Qi Su
A great proportion of sequence-to-sequence (Seq2Seq) models for Neural Machine Translation (NMT) adopt Recurrent Neural Network (RNN) to generate translation word by word following a sequential order.
Ranked #9 on Machine Translation on IWSLT2015 English-Vietnamese
4 code implementations • ACL 2018 • Junyang Lin, Xu Sun, Shuming Ma, Qi Su
To tackle the problem, we propose a global encoding framework, which controls the information flow from the encoder to the decoder based on the global information of the source context.
Ranked #29 on Text Summarization on GigaWord
no code implementations • 10 May 2018 • Bingzhen Wei, Xuancheng Ren, Xu Sun, Yi Zhang, Xiaoyan Cai, Qi Su
In particular, the proposed approach improves semantic consistency by 4% in terms of human evaluation.
no code implementations • NAACL 2018 • Ji Wen, Xu Sun, Xuancheng Ren, Qi Su
In this paper, we propose the task of relation classification for Chinese literature text.
no code implementations • 5 Mar 2018 • Zhiyuan Zhang, Wei Li, Qi Su
In this paper, we propose to build an end-to-end neural model to automatically translate between ancient and contemporary Chinese.
no code implementations • 6 Feb 2018 • Junyang Lin, Shuming Ma, Qi Su, Xu Sun
ACA learns to control the attention by keeping track of the decoding history and the current information with a memory vector, so that the model can take the translated contents and the current information into consideration.
2 code implementations • 19 Nov 2017 • Jingjing Xu, Ji Wen, Xu Sun, Qi Su
To build a high-quality dataset, we propose two tagging methods to resolve data inconsistency: a heuristic tagging method and a machine-auxiliary tagging method.
1 code implementation • ACL 2017 • Shuming Ma, Xu Sun, Jingjing Xu, Houfeng Wang, Wenjie Li, Qi Su
In this work, our goal is to improve semantic relevance between source texts and summaries for Chinese social media summarization.