Search Results for author: Jia-Jun Chen

Found 49 papers, 14 papers with code

Explicit Semantic Decomposition for Definition Generation

no code implementations ACL 2020 Jiahuan Li, Yu Bao, Shu-Jian Huang, Xin-yu Dai, Jia-Jun Chen

Definition generation, which aims to automatically generate dictionary definitions for words, has recently been proposed to assist the construction of dictionaries and help people understand unfamiliar texts.

Mirror-Generative Neural Machine Translation

no code implementations ICLR 2020 Zaixiang Zheng, Hao Zhou, Shu-Jian Huang, Lei LI, Xin-yu Dai, Jia-Jun Chen

Training neural machine translation (NMT) models requires a large amount of parallel data, which is scarce for many language pairs.

Machine Translation, Translation

Adding A Filter Based on The Discriminator to Improve Unconditional Text Generation

1 code implementation 5 Apr 2020 Xingyuan Chen, Ping Cai, Peng Jin, Hongjun Wang, Xin-yu Dai, Jia-Jun Chen

To alleviate exposure bias, generative adversarial networks (GANs) use the discriminator to update the generator's parameters directly, but they fail to be evaluated precisely.

Language Modelling, Text Generation

GRET: Global Representation Enhanced Transformer

no code implementations 24 Feb 2020 Rongxiang Weng, Hao-Ran Wei, Shu-Jian Huang, Heng Yu, Lidong Bing, Weihua Luo, Jia-Jun Chen

The encoder maps the words in the input sentence into a sequence of hidden states, which are then fed into the decoder to generate the output sentence.

Machine Translation, Text Generation +2
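The encoder-decoder pipeline described in the snippet above can be sketched in a few lines. This toy RNN encoder (random weights, pure Python) is only an illustration of "words in, sequence of hidden states out"; it is not the GRET model:

```python
import math
import random

def encode(embeddings, W, U):
    # Toy RNN encoder: h_t = tanh(W h_{t-1} + U x_t).
    # Returns one hidden state per input word.
    d = len(W)
    h = [0.0] * d
    states = []
    for x in embeddings:
        h = [math.tanh(sum(W[i][j] * h[j] for j in range(d))
                       + sum(U[i][j] * x[j] for j in range(d)))
             for i in range(d)]
        states.append(h)
    return states

random.seed(0)
d = 4  # toy hidden/embedding size
W = [[random.gauss(0, 1) for _ in range(d)] for _ in range(d)]
U = [[random.gauss(0, 1) for _ in range(d)] for _ in range(d)]
sentence = [[random.gauss(0, 1) for _ in range(d)] for _ in range(3)]  # three "words"
states = encode(sentence, W, U)
assert len(states) == 3 and len(states[0]) == d  # one state per word
```

In a full NMT system these states would then be consumed by a decoder (typically through attention) to generate the output sentence.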

Towards Making the Most of Context in Neural Machine Translation

1 code implementation 19 Feb 2020 Zaixiang Zheng, Xiang Yue, Shu-Jian Huang, Jia-Jun Chen, Alexandra Birch

Document-level machine translation manages to outperform sentence-level models by a small margin, but has failed to be widely adopted.

Document-level, Document Level Machine Translation +2

Latent Opinions Transfer Network for Target-Oriented Opinion Words Extraction

1 code implementation 7 Jan 2020 Zhen Wu, Fei Zhao, Xin-yu Dai, Shu-Jian Huang, Jia-Jun Chen

In this paper, we propose a novel model to transfer this opinion knowledge from resource-rich review sentiment classification datasets to the low-resource TOWE task.

Aspect-oriented Opinion Extraction, General Classification +1

Generating Diverse Translation by Manipulating Multi-Head Attention

no code implementations 21 Nov 2019 Zewei Sun, Shu-Jian Huang, Hao-Ran Wei, Xin-yu Dai, Jia-Jun Chen

Experiments also show that back-translation with these diverse translations could bring significant improvement on performance on translation tasks.

Data Augmentation, Machine Translation +1
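The back-translation augmentation mentioned in the snippet above can be sketched as follows. `toy_reverse_translate` is a hypothetical stand-in for a real target-to-source NMT model that returns several diverse translations of the same sentence:

```python
def back_translate(monolingual_targets, reverse_translate, k=3):
    # Back-translation: pair each target-side sentence with k
    # machine-generated source-side variants to build synthetic
    # (source, target) training pairs.
    synthetic_pairs = []
    for tgt in monolingual_targets:
        for src in reverse_translate(tgt, num_variants=k):
            synthetic_pairs.append((src, tgt))
    return synthetic_pairs

def toy_reverse_translate(sentence, num_variants):
    # Stub standing in for a target-to-source NMT model that
    # produces diverse hypotheses (e.g., via multi-head manipulation).
    return [f"{sentence} [variant {i}]" for i in range(num_variants)]

pairs = back_translate(["guten Tag", "danke"], toy_reverse_translate, k=2)
assert len(pairs) == 4  # 2 sentences x 2 diverse variants each
```

The synthetic pairs are then mixed into the genuine parallel data when training the forward translation model.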

A Reinforced Generation of Adversarial Examples for Neural Machine Translation

no code implementations ACL 2020 Wei Zou, Shu-Jian Huang, Jun Xie, Xin-yu Dai, Jia-Jun Chen

Despite their significant efficacy, neural machine translation systems tend to fail on less-than-ideal inputs, which may seriously harm the credibility of these systems; fathoming how and when neural-based systems fail in such cases is critical for industrial maintenance.

Machine Translation, Translation

The Detection of Distributional Discrepancy for Text Generation

no code implementations 28 Sep 2019 Xingyuan Chen, Ping Cai, Peng Jin, Haokun Du, Hongjun Wang, Xingyu Dai, Jia-Jun Chen

In this paper, we theoretically propose two metric functions to measure the distributional difference between real text and generated text.

Language Modelling, Text Generation
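As a simple illustration of measuring a distributional difference between real and generated text, the sketch below compares unigram distributions with a plain KL divergence; this is only an illustrative stand-in, not the two metric functions proposed in the paper:

```python
import math
from collections import Counter

def unigram_dist(texts):
    # Empirical unigram distribution over whitespace tokens.
    counts = Counter(w for t in texts for w in t.split())
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def kl_divergence(p, q, eps=1e-9):
    # D_KL(p || q); eps guards against words missing from q.
    return sum(pw * math.log(pw / q.get(w, eps)) for w, pw in p.items())

real = ["the cat sat", "the dog ran"]
generated = ["the the the", "cat cat sat"]  # mode-collapsed toy output
d = kl_divergence(unigram_dist(real), unigram_dist(generated))
assert d > 0.0  # the two distributions differ
```

A larger divergence indicates that the generator's text distribution is farther from the real-text distribution.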

Fine-grained Knowledge Fusion for Sequence Labeling Domain Adaptation

no code implementations IJCNLP 2019 Huiyun Yang, Shu-Jian Huang, Xin-yu Dai, Jia-Jun Chen

In sequence labeling, previous domain adaptation methods focus on the adaptation from the source domain to the entire target domain without considering the diversity of individual target domain samples, which may lead to negative transfer results for certain samples.

Domain Adaptation

Improving Neural Machine Translation with Pre-trained Representation

no code implementations 21 Aug 2019 Rongxiang Weng, Heng Yu, Shu-Jian Huang, Weihua Luo, Jia-Jun Chen

Then, we design a framework for integrating both source and target sentence-level representations into NMT model to improve the translation quality.

Machine Translation, Text Generation +1

Correct-and-Memorize: Learning to Translate from Interactive Revisions

no code implementations 8 Jul 2019 Rongxiang Weng, Hao Zhou, Shu-Jian Huang, Lei LI, Yifan Xia, Jia-Jun Chen

Experiments in both ideal and real interactive translation settings demonstrate that our proposed method enhances machine translation results significantly while requiring fewer revision instructions from humans compared to previous methods.

Machine Translation, Translation

Target-oriented Opinion Words Extraction with Target-fused Neural Sequence Labeling

1 code implementation NAACL 2019 Zhifang Fan, Zhen Wu, Xin-yu Dai, Shu-Jian Huang, Jia-Jun Chen

In this paper, we propose a novel sequence labeling subtask for ABSA named TOWE (Target-oriented Opinion Words Extraction), which aims at extracting the corresponding opinion words for a given opinion target.

Aspect-oriented Opinion Extraction, target-oriented opinion words extraction
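The TOWE setting described above can be illustrated with a toy sequence-labeling example. The sentence, targets, and BIO tags below are hypothetical data, and the span-collection helper is only a sketch of how per-target tags map back to opinion words, not the paper's model:

```python
sentence = "the screen is bright but the battery dies fast".split()

def extract_opinion_words(tokens, bio_tags):
    # Collect the B/I-tagged spans: the opinion words for one target.
    spans, current = [], []
    for tok, tag in zip(tokens, bio_tags):
        if tag == "B":
            if current:
                spans.append(" ".join(current))
            current = [tok]
        elif tag == "I" and current:
            current.append(tok)
        else:
            if current:
                spans.append(" ".join(current))
            current = []
    if current:
        spans.append(" ".join(current))
    return spans

# In TOWE the tags depend on the chosen target:
tags_screen  = ["O", "O", "O", "B", "O", "O", "O", "O", "O"]
tags_battery = ["O", "O", "O", "O", "O", "O", "O", "B", "I"]
assert extract_opinion_words(sentence, tags_screen) == ["bright"]
assert extract_opinion_words(sentence, tags_battery) == ["dies fast"]
```

The same sentence thus yields different label sequences for different opinion targets, which is what makes the tagging "target-fused".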

Dynamic Past and Future for Neural Machine Translation

1 code implementation IJCNLP 2019 Zaixiang Zheng, Shu-Jian Huang, Zhaopeng Tu, Xin-yu Dai, Jia-Jun Chen

Previous studies have shown that neural machine translation (NMT) models can benefit from explicitly modeling translated (Past) and untranslated (Future) source contents.

Machine Translation, Translation

Learning to Discriminate Noises for Incorporating External Information in Neural Machine Translation

no code implementations 24 Oct 2018 Zaixiang Zheng, Shu-Jian Huang, Zewei Sun, Rongxiang Weng, Xin-yu Dai, Jia-Jun Chen

Previous studies show that incorporating external information could improve the translation quality of Neural Machine Translation (NMT) systems.

Machine Translation, Translation

Collaborative Filtering with Topic and Social Latent Factors Incorporating Implicit Feedback

no code implementations 26 Mar 2018 Guang-Neng Hu, Xin-yu Dai, Feng-Yu Qiu, Rui Xia, Tao Li, Shu-Jian Huang, Jia-Jun Chen

First, we propose a novel model MR3 to jointly model three sources of information (i.e., ratings, item reviews, and social relations) effectively for rating prediction by aligning the latent factors and hidden topics.

Collaborative Filtering, Recommendation Systems
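Rating prediction with latent factors, the core building block that MR3 aligns with hidden topics, can be sketched with the standard matrix-factorization SGD update. This is a toy single-rating example, not the MR3 model:

```python
import random

def predict(user_factors, item_factors, mu):
    # Latent-factor rating prediction: r_hat = mu + p_u . q_i
    return mu + sum(p * q for p, q in zip(user_factors, item_factors))

def sgd_step(p, q, rating, mu, lr=0.05, reg=0.02):
    # One stochastic gradient step on the regularized squared error.
    err = rating - predict(p, q, mu)
    p_new = [pi + lr * (err * qi - reg * pi) for pi, qi in zip(p, q)]
    q_new = [qi + lr * (err * pi - reg * qi) for pi, qi in zip(p, q)]
    return p_new, q_new

random.seed(1)
mu, k = 3.5, 4                          # global mean rating, factor dimension
p = [random.uniform(-0.1, 0.1) for _ in range(k)]
q = [random.uniform(-0.1, 0.1) for _ in range(k)]
for _ in range(500):                    # fit a single observed rating of 5.0
    p, q = sgd_step(p, q, 5.0, mu)
assert abs(predict(p, q, mu) - 5.0) < 0.1
```

MR3's contribution is to constrain such latent factors jointly with topics from review text and with social relations, rather than learning them from ratings alone.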

Modeling Past and Future for Neural Machine Translation

1 code implementation TACL 2018 Zaixiang Zheng, Hao Zhou, Shu-Jian Huang, Lili Mou, Xin-yu Dai, Jia-Jun Chen, Zhaopeng Tu

The Past and Future contents are fed to both the attention model and the decoder states, which offers NMT systems the knowledge of translated and untranslated contents.

Machine Translation, Translation

Dynamic Oracle for Neural Machine Translation in Decoding Phase

no code implementations LREC 2018 Zi-Yi Dou, Hao Zhou, Shu-Jian Huang, Xin-yu Dai, Jia-Jun Chen

However, there are certain limitations in Scheduled Sampling and we propose two dynamic oracle-based methods to improve it.

Machine Translation, Translation

Neural Machine Translation with Word Predictions

no code implementations EMNLP 2017 Rongxiang Weng, Shu-Jian Huang, Zaixiang Zheng, Xin-yu Dai, Jia-Jun Chen

In the encoder-decoder architecture for neural machine translation (NMT), the hidden states of the recurrent structures in the encoder and decoder carry the crucial information about the sentence. These vectors are generated by parameters which are updated by back-propagation of translation errors through time.

Machine Translation, Translation

Improved Neural Machine Translation with a Syntax-Aware Encoder and Decoder

1 code implementation ACL 2017 Huadong Chen, Shu-Jian Huang, David Chiang, Jia-Jun Chen

Most neural machine translation (NMT) models are based on the sequential encoder-decoder framework, which makes no use of syntactic information.

Machine Translation, Translation

Chunk-Based Bi-Scale Decoder for Neural Machine Translation

1 code implementation ACL 2017 Hao Zhou, Zhaopeng Tu, Shu-Jian Huang, Xiaohua Liu, Hang Li, Jia-Jun Chen

In typical neural machine translation (NMT), the decoder generates a sentence word by word, packing all linguistic granularities in the same time-scale of RNN.

Machine Translation, Translation

A Synthetic Approach for Recommendation: Combining Ratings, Social Relations, and Reviews

no code implementations 11 Jan 2016 Guang-Neng Hu, Xin-yu Dai, Yunya Song, Shu-Jian Huang, Jia-Jun Chen

Recommender systems (RSs) provide an effective way of alleviating the information overload problem by selecting personalized choices.

Recommendation Systems

Non-linear Learning for Statistical Machine Translation

no code implementations IJCNLP 2015 Shujian Huang, Huadong Chen, Xin-yu Dai, Jia-Jun Chen

The linear combination assumes that all the features are in a linear relationship and constrains each feature to interact with the rest of the features in a linear manner, which might limit the expressive power of the model and lead to an under-fitted model on the current data.

Machine Translation, Translation
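The contrast above, between a linear combination of features and a scorer that lets features interact non-linearly, can be sketched as follows. The weights and features are toy values, and the two-layer scorer is only an illustrative alternative, not the paper's model:

```python
import math

def linear_score(features, weights):
    # Standard log-linear SMT scoring: a weighted sum of feature values.
    return sum(w * f for w, f in zip(weights, features))

def nonlinear_score(features, W1, b1, w2):
    # A tiny two-layer scorer: the tanh hidden layer allows
    # non-linear interactions between the input features.
    hidden = [math.tanh(sum(wij * f for wij, f in zip(row, features)) + b)
              for row, b in zip(W1, b1)]
    return sum(w * h for w, h in zip(w2, hidden))

feats = [0.2, -1.3, 0.7]                     # toy feature values
lin = linear_score(feats, [1.0, 0.5, 2.0])   # purely additive in each feature
W1 = [[0.5, -0.2, 0.1], [0.3, 0.4, -0.5]]
b1 = [0.0, 0.1]
w2 = [1.0, -1.0]
nonlin = nonlinear_score(feats, W1, b1, w2)  # features interact inside tanh
assert abs(lin - 0.95) < 1e-9
```

In the linear scorer, doubling one feature always changes the score by a fixed amount; in the non-linear scorer, the effect of one feature depends on the values of the others.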
