Search Results for author: Qi Su

Found 35 papers, 11 papers with code

The Form and Function of Interrogatives in Multi-party Chinese Competitive Game Conversation (汉语竞争类多人游戏语言中疑问句的形式与功能)

no code implementations CCL 2020 Wenxian Zhang, Qi Su

Based on a self-constructed corpus of multi-party competitive game conversation, this paper examines the form and function of Chinese interrogatives. Building on previous research, we first classify interrogatives into five major types, and then examine where the different types appear in conversation and what functions they serve. The results show that yes-no questions (including A-not-A questions) and wh-questions are the most common types, while alternative questions are used least frequently. Most interrogatives trigger turn-taking and serve a questioning function; beyond that, negation and pointing out facts are also major functions of interrogatives. The negation function of wh-questions and the fact-pointing function of tag questions are particularly prominent.

Chinese Word Segmentation with Heterogeneous Graph Neural Network

no code implementations 22 Jan 2022 Xuemei Tang, Jun Wang, Qi Su

In recent years, deep learning has achieved significant success in the Chinese word segmentation (CWS) task.

Chinese Word Segmentation Language Modelling

One-shot Weakly-Supervised Segmentation in Medical Images

1 code implementation 21 Nov 2021 Wenhui Lei, Qi Su, Ran Gu, Na Wang, Xinglong Liu, Guotai Wang, Xiaofan Zhang, Shaoting Zhang

Deep neural networks usually require a large number of accurate annotations to achieve outstanding performance in medical image segmentation.

Denoising Medical Image Segmentation +2

Adversarial Parameter Defense by Multi-Step Risk Minimization

no code implementations 7 Sep 2021 Zhiyuan Zhang, Ruixuan Luo, Xuancheng Ren, Qi Su, Liangyou Li, Xu Sun

To enhance neural networks, we propose the adversarial parameter defense algorithm that minimizes the average risk of multiple adversarial parameter corruptions.
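
As a rough illustration of the idea summarized above, the sketch below (in PyTorch) perturbs the parameters several times per training step and minimizes the averaged loss over the corrupted copies; the Gaussian corruption model, the number of corruptions, and the hyperparameters are assumptions rather than the paper's actual procedure.

```python
# A rough sketch of minimizing the average risk over several corrupted
# copies of the parameters. Gaussian noise and all hyperparameters are
# illustrative assumptions, not the paper's actual corruption procedure.
import torch
import torch.nn as nn

def multi_corruption_risk(model, loss_fn, x, y, n_corruptions=4, sigma=0.01):
    """Accumulate gradients at several perturbed parameter points and
    return the averaged loss value."""
    total = 0.0
    for _ in range(n_corruptions):
        noises = []
        with torch.no_grad():
            for p in model.parameters():
                eps = sigma * torch.randn_like(p)
                p.add_(eps)                 # corrupt the parameters in place
                noises.append(eps)
        loss = loss_fn(model(x), y) / n_corruptions
        loss.backward()                     # gradients accumulate across rounds
        with torch.no_grad():
            for p, eps in zip(model.parameters(), noises):
                p.sub_(eps)                 # undo the corruption
        total += loss.item()
    return total

# Toy usage: train a small regressor against the averaged corrupted risk.
model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
opt = torch.optim.SGD(model.parameters(), lr=1e-2)
x, y = torch.randn(32, 8), torch.randn(32, 1)
for _ in range(10):
    opt.zero_grad()
    avg_risk = multi_corruption_risk(model, nn.MSELoss(), x, y)
    opt.step()
print(f"final averaged risk: {avg_risk:.4f}")
```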

A Global Past-Future Early Exit Method for Accelerating Inference of Pre-trained Language Models

1 code implementation NAACL 2021 Kaiyuan Liao, Yi Zhang, Xuancheng Ren, Qi Su, Xu Sun, Bin He

We first take into consideration all the linguistic information embedded in the past layers, and then take a further step to engage the future information that would otherwise be inaccessible at prediction time.
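
As a rough illustration of layer-wise early exiting that pools the states of all past layers, the sketch below attaches an internal classifier to every layer of a toy encoder and exits once the prediction entropy drops below a threshold; the toy encoder, the entropy criterion, and the threshold are assumptions, and the paper's mechanism for engaging future information is not reproduced here.

```python
# Minimal sketch of early exit with an internal classifier per layer.
# Each exit sees the pooled states of ALL past layers; the entropy
# threshold and the toy encoder are illustrative assumptions.
import torch
import torch.nn as nn

class EarlyExitEncoder(nn.Module):
    def __init__(self, dim=64, n_layers=6, n_classes=3, threshold=0.4):
        super().__init__()
        self.layers = nn.ModuleList(
            [nn.TransformerEncoderLayer(dim, nhead=4, batch_first=True)
             for _ in range(n_layers)])
        self.exits = nn.ModuleList(
            [nn.Linear(dim, n_classes) for _ in range(n_layers)])
        self.threshold = threshold

    def forward(self, x):
        past = []
        for layer, exit_head in zip(self.layers, self.exits):
            x = layer(x)
            past.append(x.mean(dim=1))           # pooled state of this layer
            pooled = torch.stack(past).mean(0)   # aggregate all past layers
            logits = exit_head(pooled)
            probs = logits.softmax(-1)
            entropy = -(probs * probs.clamp_min(1e-9).log()).sum(-1).mean()
            if entropy < self.threshold:         # confident enough: exit early
                return logits
        return logits                            # fell through: use last exit

model = EarlyExitEncoder()
print(model(torch.randn(2, 10, 64)).shape)  # torch.Size([2, 3])
```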

Neural Network Surgery: Injecting Data Patterns into Pre-trained Models with Minimal Instance-wise Side Effects

no code implementations NAACL 2021 Zhiyuan Zhang, Xuancheng Ren, Qi Su, Xu Sun, Bin He

Motivated by neuroscientific evidence and theoretical results, we demonstrate that side effects can be controlled by the number of changed parameters, and thus propose to conduct neural network surgery by modifying only a limited number of parameters.
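
A rough illustration of restricting an update to a small parameter budget is sketched below; selecting the coordinates with the largest gradient magnitude on the patching data is an assumption standing in for the paper's actual selection procedure.

```python
# A rough sketch of an update that changes at most `budget` scalar
# parameters, leaving the rest of the pre-trained model untouched.
# The largest-gradient selection rule is an illustrative assumption.
import torch
import torch.nn as nn

def surgery_step(model, loss_fn, x, y, lr=1e-2, budget=100):
    """Apply one update that changes at most `budget` scalar parameters."""
    model.zero_grad()
    loss_fn(model(x), y).backward()
    grads = torch.cat([p.grad.flatten() for p in model.parameters()])
    k = min(budget, grads.numel())
    keep = torch.zeros_like(grads, dtype=torch.bool)
    keep[grads.abs().topk(k).indices] = True     # largest-gradient coordinates
    offset = 0
    with torch.no_grad():
        for p in model.parameters():
            n = p.numel()
            mask = keep[offset:offset + n].view_as(p)
            p -= lr * p.grad * mask              # update only selected entries
            offset += n
    return k

# Toy usage: patch a small classifier while touching at most 50 weights.
model = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 2))
x, y = torch.randn(16, 8), torch.randint(0, 2, (16,))
changed = surgery_step(model, nn.CrossEntropyLoss(), x, y, budget=50)
print(f"modified {changed} parameters")
```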

Alleviating the Knowledge-Language Inconsistency: A Study for Deep Commonsense Knowledge

no code implementations 28 May 2021 Yi Zhang, Lei Li, Yunfang Wu, Qi Su, Xu Sun

Knowledge facts are typically represented by relational triples, yet we observe that some commonsense facts are represented by triples whose forms are inconsistent with the way they are expressed in natural language.

Evolution of cooperation with asymmetric social interactions

no code implementations 3 May 2021 Qi Su, Joshua B. Plotkin

How cooperation emerges in human societies is both an evolutionary enigma and a practical problem with tangible implications for societal health.

Sina Mandarin Alphabetical Words: A Web-driven Code-mixing Lexical Resource

no code implementations Asian Chapter of the Association for Computational Linguistics 2020 Rong Xiang, Mingyu Wan, Qi Su, Chu-Ren Huang, Qin Lu

Mandarin Alphabetical Word (MAW) is an indispensable component of Modern Chinese that demonstrates unique code-mixing idiosyncrasies influenced by language exchange.

Using Conceptual Norms for Metaphor Detection

no code implementations WS 2020 Mingyu Wan, Kathleen Ahrens, Emmanuele Chersoni, Menghan Jiang, Qi Su, Rong Xiang, Chu-Ren Huang

This paper reports a linguistically-enriched method of detecting token-level metaphors for the second shared task on Metaphor Detection.

Jointly Modeling Aspect and Sentiment with Dynamic Heterogeneous Graph Neural Networks

2 code implementations 14 Apr 2020 Shu Liu, Wei Li, Yunfang Wu, Qi Su, Xu Sun

Target-Based Sentiment Analysis aims to detect the opinion aspects (aspect extraction) and the sentiment polarities (sentiment detection) towards them.

Aspect Extraction Sentiment Analysis

Explicit Sparse Transformer: Concentrated Attention Through Explicit Selection

2 code implementations 25 Dec 2019 Guangxiang Zhao, Junyang Lin, Zhiyuan Zhang, Xuancheng Ren, Qi Su, Xu Sun

Self-attention based Transformer has demonstrated the state-of-the-art performances in a number of natural language processing tasks.

Image Captioning Language Modelling +2
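
The snippet above only gives the background; judging from the title, the core idea is to attend to an explicitly selected subset of positions. The sketch below keeps only the top-k attention scores per query and masks out the rest before the softmax, which is an assumption about the selection mechanism rather than the paper's exact formulation.

```python
# Minimal sketch of attention that keeps only the top-k scores per query
# and masks the rest with -inf before the softmax. The value of k and the
# masking scheme are illustrative assumptions.
import math
import torch

def topk_sparse_attention(q, k, v, topk=8):
    """q, k, v: (batch, heads, seq, dim). Returns (batch, heads, seq, dim)."""
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
    kth = min(topk, scores.size(-1))
    # Threshold = k-th largest score for each query position.
    threshold = scores.topk(kth, dim=-1).values[..., -1:]
    scores = scores.masked_fill(scores < threshold, float("-inf"))
    return scores.softmax(dim=-1) @ v

q = k = v = torch.randn(2, 4, 16, 32)
print(topk_sparse_attention(q, k, v).shape)  # torch.Size([2, 4, 16, 32])
```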

HighwayGraph: Modelling Long-distance Node Relations for Improving General Graph Neural Network

no code implementations 10 Nov 2019 Deli Chen, Xiaoqian Liu, Yankai Lin, Peng Li, Jie Zhou, Qi Su, Xu Sun

To address this issue, we propose to model long-distance node relations while relying only on shallow GNN architectures, with two solutions: (1) implicit modelling, by learning to predict node-pair relations, and (2) explicit modelling, by adding edges between nodes that potentially have the same label.

General Classification Node Classification
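
A rough illustration of the explicit variant is sketched below: it adds edges between node pairs whose predicted labels agree with high confidence, then runs a single GCN-style propagation step over the densified graph; the confidence threshold, the stand-in label predictor, and the propagation rule are assumptions.

```python
# Minimal sketch of the "explicit" solution: densify the adjacency matrix
# with edges between confidently same-label node pairs, then propagate
# with a shallow GCN-style layer. Threshold and predictor are assumptions.
import torch

def add_same_label_edges(adj, probs, threshold=0.9):
    """adj: (n, n) 0/1 adjacency; probs: (n, c) predicted label distribution."""
    conf, pred = probs.max(dim=-1)
    confident = conf > threshold
    same = (pred.unsqueeze(0) == pred.unsqueeze(1)) \
        & confident.unsqueeze(0) & confident.unsqueeze(1)
    new_adj = ((adj > 0) | same).float()
    new_adj.fill_diagonal_(0)
    return new_adj

def gcn_layer(adj, x, weight):
    """One GCN-style propagation step with symmetric normalization."""
    a_hat = adj + torch.eye(adj.size(0))
    d_inv_sqrt = a_hat.sum(-1).clamp_min(1e-9).pow(-0.5)
    norm = d_inv_sqrt.unsqueeze(1) * a_hat * d_inv_sqrt.unsqueeze(0)
    return torch.relu(norm @ x @ weight)

# Toy usage: 6 nodes, 4 features, 3 classes.
adj = torch.zeros(6, 6)
adj[0, 1] = adj[1, 0] = adj[2, 3] = adj[3, 2] = 1.0
x = torch.randn(6, 4)
probs = torch.softmax(torch.randn(6, 3), dim=-1)   # stand-in label predictor
dense_adj = add_same_label_edges(adj, probs, threshold=0.3)
h = gcn_layer(dense_adj, x, torch.randn(4, 8))
print(h.shape)  # torch.Size([6, 8])
```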

Specificity-Driven Cascading Approach for Unsupervised Sentiment Modification

no code implementations IJCNLP 2019 Pengcheng Yang, Junyang Lin, Jingjing Xu, Jun Xie, Qi Su, Xu Sun

The task of unsupervised sentiment modification aims to reverse the sentiment polarity of the input text while preserving its semantic content without any parallel data.

Memorized Sparse Backpropagation

no code implementations 24 May 2019 Zhiyuan Zhang, Pengcheng Yang, Xuancheng Ren, Qi Su, Xu Sun

Neural network learning is usually time-consuming since backpropagation needs to compute full gradients and backpropagate them across multiple layers.
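
As a rough illustration of sparse backpropagation with memory, the sketch below keeps only the top-k entries of a gradient and stores the dropped remainder in a buffer that is re-injected on the next step; the top-k rule and the buffer update are assumptions about how the memorization could work, not the paper's exact algorithm.

```python
# Minimal sketch: sparsify a gradient to its k largest entries and
# "memorize" the dropped mass so it can be propagated later.
import torch

class SparseGradMemory:
    def __init__(self, shape, k):
        self.memory = torch.zeros(shape)   # gradient mass not yet propagated
        self.k = k

    def sparsify(self, grad):
        total = grad + self.memory                 # re-inject remembered gradient
        flat = total.flatten()
        idx = flat.abs().topk(self.k).indices      # keep only k largest entries
        sparse = torch.zeros_like(flat)
        sparse[idx] = flat[idx]
        sparse = sparse.view_as(grad)
        self.memory = total - sparse               # memorize what was dropped
        return sparse

mem = SparseGradMemory((4, 4), k=3)
g1 = mem.sparsify(torch.randn(4, 4))
g2 = mem.sparsify(torch.randn(4, 4))   # previously dropped mass can surface now
print((g1 != 0).sum().item(), (g2 != 0).sum().item())  # 3 3
```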

Learning When to Concentrate or Divert Attention: Self-Adaptive Attention Temperature for Neural Machine Translation

1 code implementation EMNLP 2018 Junyang Lin, Xu Sun, Xuancheng Ren, Muyu Li, Qi Su

Most Neural Machine Translation (NMT) models are based on the sequence-to-sequence (Seq2Seq) architecture, an encoder-decoder framework equipped with an attention mechanism.

Machine Translation Translation

Deconvolution-Based Global Decoding for Neural Machine Translation

1 code implementation COLING 2018 Junyang Lin, Xu Sun, Xuancheng Ren, Shuming Ma, Jinsong Su, Qi Su

A great proportion of sequence-to-sequence (Seq2Seq) models for Neural Machine Translation (NMT) adopt Recurrent Neural Networks (RNNs) to generate the translation word by word in a sequential order.

Machine Translation Translation

Global Encoding for Abstractive Summarization

4 code implementations ACL 2018 Junyang Lin, Xu Sun, Shuming Ma, Qi Su

To tackle the problem, we propose a global encoding framework, which controls the information flow from the encoder to the decoder based on the global information of the source context.

Abstractive Text Summarization
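
A rough illustration of such gating is sketched below: a convolution over the whole source sequence produces a sigmoid gate that filters the encoder outputs before they reach the decoder; the kernel size and the gating form are assumptions, not the paper's exact architecture.

```python
# Minimal sketch of gating encoder outputs with a convolution over the
# source sequence, so the information flow to the decoder is filtered by
# global context. Kernel size and sigmoid gate are illustrative choices.
import torch
import torch.nn as nn

class GlobalEncodingGate(nn.Module):
    def __init__(self, dim, kernel_size=3):
        super().__init__()
        self.conv = nn.Conv1d(dim, dim, kernel_size, padding=kernel_size // 2)

    def forward(self, enc_states):
        """enc_states: (batch, src_len, dim) -> gated states, same shape."""
        g = self.conv(enc_states.transpose(1, 2)).transpose(1, 2)
        return enc_states * torch.sigmoid(g)   # keep or suppress each feature

gate = GlobalEncodingGate(dim=256)
print(gate(torch.randn(8, 40, 256)).shape)  # torch.Size([8, 40, 256])
```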

Automatic Translating Between Ancient Chinese and Contemporary Chinese with Limited Aligned Corpora

no code implementations 5 Mar 2018 Zhiyuan Zhang, Wei Li, Qi Su

In this paper, we propose to build an end-to-end neural model to automatically translate between ancient and contemporary Chinese.

Translation

Decoding-History-Based Adaptive Control of Attention for Neural Machine Translation

no code implementations 6 Feb 2018 Junyang Lin, Shuming Ma, Qi Su, Xu Sun

ACA learns to control the attention by keeping track of the decoding history and the current information with a memory vector, so that the model can take both the already-translated content and the current information into consideration.

Machine Translation Translation
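
A rough illustration is sketched below: a memory vector tracks the decoding history via a GRU cell and adjusts the attention query at every step; the gating form and the memory update are assumptions about how such control could be implemented, not the paper's exact mechanism.

```python
# Minimal sketch of attention whose query is modulated by a memory vector
# tracking the decoding history. GRU-cell memory and the gate are
# illustrative assumptions.
import torch
import torch.nn as nn

class HistoryAwareAttention(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.memory_rnn = nn.GRUCell(dim, dim)          # tracks decoding history
        self.gate = nn.Linear(2 * dim, dim)

    def forward(self, dec_state, enc_states, memory):
        """dec_state: (b, d); enc_states: (b, src, d); memory: (b, d)."""
        # Adjust the query with the history memory before scoring.
        g = torch.sigmoid(self.gate(torch.cat([dec_state, memory], dim=-1)))
        query = dec_state + g * memory
        scores = torch.bmm(enc_states, query.unsqueeze(-1)).squeeze(-1)
        attn = torch.softmax(scores, dim=-1)
        context = torch.bmm(attn.unsqueeze(1), enc_states).squeeze(1)
        memory = self.memory_rnn(context, memory)       # fold new context in
        return context, attn, memory

att = HistoryAwareAttention(dim=32)
dec, enc, mem = torch.randn(4, 32), torch.randn(4, 10, 32), torch.zeros(4, 32)
context, attn, mem = att(dec, enc, mem)
print(context.shape, attn.shape, mem.shape)
```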

A Discourse-Level Named Entity Recognition and Relation Extraction Dataset for Chinese Literature Text

2 code implementations 19 Nov 2017 Jingjing Xu, Ji Wen, Xu Sun, Qi Su

To build a high-quality dataset, we propose two tagging methods to solve the problem of data inconsistency: a heuristic tagging method and a machine auxiliary tagging method.

Named Entity Recognition NER +1
