no code implementations • WS 2020 • Qian Wang, Yuchen Liu, Cong Ma, Yu Lu, Yining Wang, Long Zhou, Yang Zhao, Jiajun Zhang, Cheng-qing Zong
This paper describes CASIA's system for the IWSLT 2020 open domain translation task.
no code implementations • WS 2020 • Long Zhou, Jiajun Zhang, Cheng-qing Zong
In this work, we propose a novel Encoder-NAD-AD framework for NMT, aiming at boosting AT with global information produced by NAT model.
no code implementations • ACL 2020 • Junnan Zhu, Yu Zhou, Jiajun Zhang, Cheng-qing Zong
Cross-lingual summarization aims at summarizing a document in one language (e.g., Chinese) into another language (e.g., English).
1 code implementation • 13 Apr 2020 • Jiajun Zhang, Cheng-qing Zong
Machine translation (MT) is a technique that leverages computers to translate human languages automatically.
1 code implementation • 16 Dec 2019 • Yuchen Liu, Jiajun Zhang, Hao Xiong, Long Zhou, Zhongjun He, Hua Wu, Haifeng Wang, Cheng-qing Zong
Speech-to-text translation (ST), which translates source language speech into target language text, has attracted intensive attention in recent years.
Automatic Speech Recognition (ASR)
no code implementations • IJCNLP 2019 • Junjie Li, Xuepeng Wang, Dawei Yin, Cheng-qing Zong
Review summarization aims to generate a condensed summary for a review or multiple reviews.
no code implementations • IJCNLP 2019 • Yining Wang, Jiajun Zhang, Long Zhou, Yuchen Liu, Cheng-qing Zong
In this paper, we introduce a novel interactive approach to translate a source language into two different languages simultaneously and interactively.
1 code implementation • IJCNLP 2019 • Junnan Zhu, Qian Wang, Yining Wang, Yu Zhou, Jiajun Zhang, Shaonan Wang, Cheng-qing Zong
Moreover, we propose to further improve NCLS by incorporating two related tasks, monolingual summarization and machine translation, into the training process of CLS under multi-task learning.
1 code implementation • IJCNLP 2019 • Weikang Wang, Jiajun Zhang, Qian Li, Cheng-qing Zong, Zhifei Li
In this paper, we focus on identity fraud detection in loan applications and propose to solve this problem with a novel interactive dialogue system which consists of two modules.
no code implementations • ACL 2019 • Yining Wang, Long Zhou, Jiajun Zhang, FeiFei Zhai, Jingfang Xu, Cheng-qing Zong
We verify our methods on various translation scenarios, including one-to-many, many-to-many and zero-shot.
no code implementations • 1 Jul 2019 • Kexin Wang, Yu Zhou, Shaonan Wang, Jiajun Zhang, Cheng-qing Zong
Recent work has shown that memory modules are crucial for the generalization ability of neural networks on learning simple algorithms.
no code implementations • 23 Jun 2019 • Long Zhou, Jiajun Zhang, Cheng-qing Zong, Heng Yu
The encoder-decoder framework has achieved promising progress for many sequence generation tasks, such as neural machine translation and text summarization.
1 code implementation • ACL 2019 • Weikang Wang, Jiajun Zhang, Qian Li, Mei-Yuh Hwang, Cheng-qing Zong, Zhifei Li
Clarifying user needs is essential for existing task-oriented dialogue systems.
no code implementations • ACL 2019 • He Bai, Yu Zhou, Jiajun Zhang, Cheng-qing Zong
Dialogue contexts have proven helpful in spoken language understanding (SLU) systems, and they are typically encoded with explicit memory representations.
2 code implementations • TACL 2019 • Long Zhou, Jiajun Zhang, Cheng-qing Zong
In this paper, we introduce a synchronous bidirectional neural machine translation (SB-NMT) model that predicts its outputs using left-to-right and right-to-left decoding simultaneously and interactively, in order to leverage both history and future information at the same time.
Ranked #27 on Machine Translation on WMT2014 English-German
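The synchronous bidirectional decoding idea above can be sketched as two greedy decoders advancing in lockstep, where at every step each direction may condition on the other direction's partial output. This is a toy illustration, not the SB-NMT implementation; `score_fn` is a stand-in for the trained model's scoring function.

```python
# Toy sketch of synchronous bidirectional decoding (illustrative only).
# Two greedy decoders run in lockstep; each one scores candidate tokens
# given its own prefix AND the other direction's prefix.

def sb_decode(score_fn, vocab, max_len=5):
    """score_fn(direction, own_prefix, other_prefix, token) -> float.
    Returns (left_to_right_output, right_to_left_output)."""
    l2r, r2l = [], []
    for _ in range(max_len):
        # Both next tokens are chosen from the same snapshot of the two
        # prefixes, then appended, so the two directions stay synchronous.
        next_l2r = max(vocab, key=lambda t: score_fn("l2r", l2r, r2l, t))
        next_r2l = max(vocab, key=lambda t: score_fn("r2l", r2l, l2r, t))
        l2r.append(next_l2r)
        r2l.append(next_r2l)
    return l2r, r2l
```

In the real model the interaction happens inside the decoder's hidden states rather than through discrete prefixes, but the lockstep structure is the same.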
no code implementations • 17 Apr 2019 • Yuchen Liu, Hao Xiong, Zhongjun He, Jiajun Zhang, Hua Wu, Haifeng Wang, Cheng-qing Zong
End-to-end speech translation (ST), which directly translates from source language speech into target language text, has attracted intensive attention in recent years.
1 code implementation • 24 Feb 2019 • Jiajun Zhang, Long Zhou, Yang Zhao, Cheng-qing Zong
In this work, we propose a synchronous bidirectional inference model to generate outputs using both left-to-right and right-to-left decoding simultaneously and interactively.
no code implementations • 1 Nov 2018 • Long Zhou, Yuchen Liu, Jiajun Zhang, Cheng-qing Zong, Guoping Huang
Current Neural Machine Translation (NMT) employs a language-specific encoder to represent the source sentence and adopts a language-specific decoder to generate target translation.
no code implementations • EMNLP 2018 • Junnan Zhu, Haoran Li, Tianshang Liu, Yu Zhou, Jiajun Zhang, Cheng-qing Zong
In this paper, we propose a novel task, multimodal summarization with multimodal output (MSMO).
1 code implementation • EMNLP 2018 • Shaonan Wang, Jiajun Zhang, Cheng-qing Zong
In this paper we address the problem of learning multimodal word representations by integrating textual, visual and auditory inputs.
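A minimal baseline for combining the three modalities is to L2-normalize each modality's vector and concatenate them. This sketch is an assumed baseline for illustration, not the paper's proposed model.

```python
import numpy as np

# Minimal multimodal fusion baseline (illustrative, not the paper's model):
# normalize each modality's word vector and concatenate them so that no
# single modality dominates by scale.

def fuse_modalities(text_vec, visual_vec, audio_vec):
    def l2norm(v):
        v = np.asarray(v, dtype=float)
        n = np.linalg.norm(v)
        return v / n if n > 0 else v
    return np.concatenate([l2norm(text_vec), l2norm(visual_vec), l2norm(audio_vec)])
```

Learned fusion (e.g., a projection over the concatenation) typically improves on this, which is the kind of gap such papers study.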
no code implementations • EMNLP 2018 • Yining Wang, Jiajun Zhang, FeiFei Zhai, Jingfang Xu, Cheng-qing Zong
However, previous studies show that one-to-many translation based on this framework cannot perform on par with the individually trained models.
no code implementations • EMNLP 2018 • Weikang Wang, Jiajun Zhang, Han Zhang, Mei-Yuh Hwang, Cheng-qing Zong, Zhifei Li
Specifically, the "student" is an extended dialog manager based on a new ontology, and the "teacher" consists of existing resources used to guide the learning process of the "student".
no code implementations • EMNLP 2018 • Jingyuan Sun, Shaonan Wang, Cheng-qing Zong
Distributional semantic models (DSMs) generally require sufficient examples for a word to learn a high quality representation.
no code implementations • EMNLP 2018 • Yang Zhao, Jiajun Zhang, Zhongjun He, Cheng-qing Zong, Hua Wu
One of the weaknesses of Neural Machine Translation (NMT) is in handling low-frequency and ambiguous words, which we refer to as troublesome words.
no code implementations • 19 Aug 2018 • He Bai, Yu Zhou, Jiajun Zhang, Liang Zhao, Mei-Yuh Hwang, Cheng-qing Zong
This paper focuses on the language transferring task given a tiny in-domain parallel SLU corpus.
Cultural Vocal Bursts Intensity Prediction
domain classification
no code implementations • COLING 2018 • Qianlong Du, Cheng-qing Zong, Keh-Yih Su
This paper proposes to perform natural language inference with Word-Pair-Dependency-Triplets.
no code implementations • COLING 2018 • Haoran Li, Junnan Zhu, Jiajun Zhang, Cheng-qing Zong
In this paper, we investigate the sentence summarization task that produces a summary from a source sentence.
Ranked #7 on Text Summarization on DUC 2004 Task 1
1 code implementation • COLING 2018 • Junjie Li, Haitong Yang, Cheng-qing Zong
The document representation is combined with user and overall rating information to predict aspect ratings of a review.
no code implementations • COLING 2018 • He Bai, Yu Zhou, Jiajun Zhang, Liang Zhao, Mei-Yuh Hwang, Cheng-qing Zong
An SLU corpus is a monolingual corpus with domain/intent/slot labels.
Cultural Vocal Bursts Intensity Prediction
domain classification
no code implementations • 25 May 2018 • Yang Zhao, Yining Wang, Jiajun Zhang, Cheng-qing Zong
Neural Machine Translation (NMT) has recently drawn much attention due to its promising translation performance.
no code implementations • 2 Jan 2018 • Shaonan Wang, Jiajun Zhang, Cheng-qing Zong
Multimodal models have been proven to outperform text-based models on learning semantic word representations.
no code implementations • WS 2017 • Guoping Huang, Jiajun Zhang, Yu Zhou, Cheng-qing Zong
Terms extensively exist in specific domains, and term translation plays a critical role in domain-specific machine translation (MT) tasks.
no code implementations • 15 Nov 2017 • Shaonan Wang, Jiajun Zhang, Nan Lin, Cheng-qing Zong
Considering that multimodal models are originally motivated by human concept representations, we assume that correlating multimodal representations with brain-based semantics would interpret their inner properties to answer the above questions.
Learning Semantic Representations
Natural Language Understanding
1 code implementation • 13 Nov 2017 • Yining Wang, Long Zhou, Jiajun Zhang, Cheng-qing Zong
Our experiments show that the subword model performs best for Chinese-to-English translation with a vocabulary that is not too large, while the hybrid word-character model is most suitable for English-to-Chinese translation.
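The hybrid word-character idea can be illustrated with a toy segmenter that keeps in-vocabulary words whole and falls back to tagged character sequences for the rest. The tag scheme here is illustrative, not necessarily the paper's exact one (single-character words simply take the begin tag in this sketch).

```python
# Toy hybrid word-character segmentation (illustrative only):
# frequent words stay as single tokens; out-of-vocabulary words are
# decomposed into characters tagged by position (begin/middle/end).

def hybrid_segment(sentence, vocab):
    out = []
    for word in sentence.split():
        if word in vocab:
            out.append(word)
        else:
            chars = list(word)
            for i, c in enumerate(chars):
                # Position tags let the model reassemble the word later.
                tag = "<B>" if i == 0 else ("<E>" if i == len(chars) - 1 else "<M>")
                out.append(tag + c)
    return out
```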
no code implementations • IJCNLP 2017 • Yining Wang, Yang Zhao, Jiajun Zhang, Cheng-qing Zong, Zhengshan Xue
While neural machine translation (NMT) has become the new paradigm, the parameter optimization requires large-scale parallel data which is scarce in many domains and language pairs.
no code implementations • EMNLP 2017 • Haoran Li, Junnan Zhu, Cong Ma, Jiajun Zhang, Cheng-qing Zong
In this work, we propose an extractive Multi-modal Summarization (MMS) method which can automatically generate a textual summary given a set of documents, images, audio, and video related to a specific topic.
Automatic Speech Recognition (ASR)
Document Summarization
no code implementations • EMNLP 2017 • Shaonan Wang, Jiajun Zhang, Cheng-qing Zong
We introduce a novel mixed character-word architecture to improve Chinese sentence representations, by utilizing rich semantic information of word internal structures.
no code implementations • 30 Aug 2017 • Long Zhou, Jiajun Zhang, Cheng-qing Zong
The attention model has become a standard component in neural machine translation (NMT) and it guides translation process by selectively focusing on parts of the source sentence when predicting each target word.
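As context for the component described above, the standard dot-product attention over encoder states can be sketched minimally. This is a generic illustration of NMT attention, not this paper's specific model.

```python
import numpy as np

# Minimal dot-product attention (generic sketch): score each source
# position against the current decoder state, softmax the scores, and
# take the weighted sum of encoder states as the context vector.

def attend(decoder_state, encoder_states):
    scores = encoder_states @ decoder_state      # one score per source position
    weights = np.exp(scores - scores.max())      # numerically stable softmax
    weights /= weights.sum()
    context = weights @ encoder_states           # weighted sum of source states
    return context, weights
```

The context vector is then combined with the decoder state to predict the next target word; selective focus comes entirely from the softmax weights.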
no code implementations • ACL 2017 • Long Zhou, Wenpeng Hu, Jiajun Zhang, Cheng-qing Zong
Neural machine translation (NMT) has become a new approach to machine translation and generates much more fluent results than statistical machine translation (SMT).
no code implementations • 3 Jan 2017 • Huijia Wu, Jiajun Zhang, Cheng-qing Zong
To simplify the stacked architecture, we propose a framework called shortcut block, which is a marriage of the gating mechanism and shortcuts, while discarding the self-connected part in the LSTM cell.
no code implementations • 24 Oct 2016 • Jiajun Zhang, Cheng-qing Zong
Neural Machine Translation (NMT) has become the new state-of-the-art in several language pairs.
no code implementations • COLING 2016 • Huijia Wu, Jiajun Zhang, Cheng-qing Zong
In this paper, we empirically explore the effects of various kinds of skip connections in stacked bidirectional LSTMs for sequential tagging.
no code implementations • 10 Oct 2016 • Huijia Wu, Jiajun Zhang, Cheng-qing Zong
These motivate us to build a supertagger with a dynamic window approach, which can be treated as an attention mechanism on the local contexts.
no code implementations • 29 Sep 2016 • Shaonan Wang, Jiajun Zhang, Cheng-qing Zong
Recently, much progress has been made in learning general-purpose sentence representations that can be used across domains.
no code implementations • LREC 2018 • Xiao-Qing Li, Jiajun Zhang, Cheng-qing Zong
Neural machine translation (NMT) has become a new state-of-the-art approach and achieves promising translation results using a simple encoder-decoder neural network.
no code implementations • 7 Jul 2016 • Xiao-Qing Li, Jiajun Zhang, Cheng-qing Zong
In order to control computational complexity, neural machine translation (NMT) systems convert all rare words outside the vocabulary into a single unk symbol.
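The vocabulary truncation described above can be sketched as follows. This illustrates the standard preprocessing the paper starts from, not its proposed fix; the `<unk>` symbol is standard, while the helper names are illustrative.

```python
from collections import Counter

# Standard NMT vocabulary truncation (illustrative sketch): keep only the
# top-k most frequent words; every other word becomes the single "<unk>"
# symbol, which caps the softmax size but loses rare-word identity.

def build_vocab(corpus, k):
    counts = Counter(w for sent in corpus for w in sent.split())
    return {w for w, _ in counts.most_common(k)}

def apply_unk(sentence, vocab):
    return " ".join(w if w in vocab else "<unk>" for w in sentence.split())
```

The information lost at this step (which rare word was replaced) is exactly what rare-word handling techniques try to recover.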
no code implementations • LREC 2016 • Yang Liu, Jiajun Zhang, Cheng-qing Zong, Yating Yang, Xi Zhou
Existing discourse research focuses only on individual languages, and the inconsistency between languages limits the power of discourse theory in multilingual applications such as machine translation.
no code implementations • 5 Feb 2015 • Jiajun Zhang, Shujie Liu, Mu Li, Ming Zhou, Cheng-qing Zong
The language model is one of the most important modules in statistical machine translation, and currently the word-based language model dominates this community.
no code implementations • TACL 2015 • Haitong Yang, Tao Zhuang, Cheng-qing Zong
Experiments on English data in the CoNLL 2009 shared task show that our method largely reduced the performance drop on out-of-domain test data.
no code implementations • TACL 2013 • Zhiguo Wang, Cheng-qing Zong
In this paper, we take dependency cohesion as a soft constraint, and integrate it into a generative model for large-scale word alignment experiments.
no code implementations • TACL 2013 • Feifei Zhai, Jiajun Zhang, Yu Zhou, Cheng-qing Zong
In current research, most tree-based translation models are built directly from parse trees.