Search Results for author: Xin-yu Dai

Found 45 papers, 12 papers with code

Modeling Users' Contextualized Page-wise Feedback for Click-Through Rate Prediction in E-commerce Search

1 code implementation 29 Mar 2022 Zhifang Fan, Dan Ou, Yulong Gu, Bairan Fu, Xiang Li, Wentian Bao, Xin-yu Dai, Xiaoyi Zeng, Tao Zhuang, Qingwen Liu

In this paper, we propose a new perspective for context-aware user behavior modeling by including all products exposed on the whole page, together with the corresponding feedback, as a contextualized page-wise feedback sequence.

Click-Through Rate Prediction Denoising

Explicit Semantic Decomposition for Definition Generation

no code implementations ACL 2020 Jiahuan Li, Yu Bao, Shu-Jian Huang, Xin-yu Dai, Jia-Jun Chen

Definition generation, which aims to automatically generate dictionary definitions for words, has recently been proposed to assist the construction of dictionaries and help people understand unfamiliar texts.

Mirror-Generative Neural Machine Translation

no code implementations ICLR 2020 Zaixiang Zheng, Hao Zhou, Shu-Jian Huang, Lei Li, Xin-yu Dai, Jia-Jun Chen

Training neural machine translation (NMT) models requires a large parallel corpus, which is scarce for many language pairs.

Machine Translation Translation

Adding A Filter Based on The Discriminator to Improve Unconditional Text Generation

1 code implementation 5 Apr 2020 Xingyuan Chen, Ping Cai, Peng Jin, Hongjun Wang, Xin-yu Dai, Jia-Jun Chen

To alleviate exposure bias, generative adversarial networks (GANs) use the discriminator to update the generator's parameters directly, but they fall short when evaluated precisely.

Language Modelling Text Generation

R3: A Reading Comprehension Benchmark Requiring Reasoning Processes

no code implementations 2 Apr 2020 Ran Wang, Kun Tao, Dingjie Song, Zhilong Zhang, Xiao Ma, Xi'ao Su, Xin-yu Dai

Existing question answering systems can only predict answers without explicit reasoning processes, which hinders their explainability and makes us overestimate their ability to understand and reason over natural language.

Question Answering Reading Comprehension

Latent Opinions Transfer Network for Target-Oriented Opinion Words Extraction

1 code implementation 7 Jan 2020 Zhen Wu, Fei Zhao, Xin-yu Dai, Shu-Jian Huang, Jia-Jun Chen

In this paper, we propose a novel model to transfer this opinion knowledge from resource-rich review sentiment classification datasets to the low-resource TOWE task.

Aspect-oriented Opinion Extraction General Classification +1

Generating Diverse Translation by Manipulating Multi-Head Attention

no code implementations 21 Nov 2019 Zewei Sun, Shu-Jian Huang, Hao-Ran Wei, Xin-yu Dai, Jia-Jun Chen

Experiments also show that back-translation with these diverse translations can bring significant performance improvements on translation tasks.

Data Augmentation Machine Translation +3

A Reinforced Generation of Adversarial Examples for Neural Machine Translation

no code implementations ACL 2020 Wei Zou, Shu-Jian Huang, Jun Xie, Xin-yu Dai, Jia-Jun Chen

Despite their significant efficacy, neural machine translation systems tend to fail on less well-formed inputs, which may significantly harm their credibility; fathoming how and when neural-based systems fail in such cases is critical for industrial maintenance.

Machine Translation reinforcement-learning +1

Fine-grained Knowledge Fusion for Sequence Labeling Domain Adaptation

no code implementations IJCNLP 2019 Huiyun Yang, Shu-Jian Huang, Xin-yu Dai, Jia-Jun Chen

In sequence labeling, previous domain adaptation methods focus on the adaptation from the source domain to the entire target domain without considering the diversity of individual target domain samples, which may lead to negative transfer results for certain samples.

Domain Adaptation

Target-oriented Opinion Words Extraction with Target-fused Neural Sequence Labeling

1 code implementation NAACL 2019 Zhifang Fan, Zhen Wu, Xin-yu Dai, Shu-Jian Huang, Jia-Jun Chen

In this paper, we propose a novel sequence labeling subtask for ABSA named TOWE (Target-oriented Opinion Words Extraction), which aims at extracting the corresponding opinion words for a given opinion target.

Aspect-oriented Opinion Extraction target-oriented opinion words extraction

Dynamic Past and Future for Neural Machine Translation

1 code implementation IJCNLP 2019 Zaixiang Zheng, Shu-Jian Huang, Zhaopeng Tu, Xin-yu Dai, Jia-Jun Chen

Previous studies have shown that neural machine translation (NMT) models can benefit from explicitly modeling translated (Past) and untranslated (Future) contents, dynamically separating source words into groups of translated and untranslated contents through parts-to-wholes assignment.

Machine Translation Translation

Learning to Discriminate Noises for Incorporating External Information in Neural Machine Translation

no code implementations 24 Oct 2018 Zaixiang Zheng, Shu-Jian Huang, Zewei Sun, Rongxiang Weng, Xin-yu Dai, Jia-Jun Chen

Previous studies show that incorporating external information could improve the translation quality of Neural Machine Translation (NMT) systems.

Machine Translation Translation

Collaborative Filtering with Topic and Social Latent Factors Incorporating Implicit Feedback

no code implementations 26 Mar 2018 Guang-Neng Hu, Xin-yu Dai, Feng-Yu Qiu, Rui Xia, Tao Li, Shu-Jian Huang, Jia-Jun Chen

First, we propose a novel model, MR3, to jointly model three sources of information (i.e., ratings, item reviews, and social relations) effectively for rating prediction by aligning the latent factors and hidden topics.

Collaborative Filtering Recommendation Systems

Modeling Past and Future for Neural Machine Translation

1 code implementation TACL 2018 Zaixiang Zheng, Hao Zhou, Shu-Jian Huang, Lili Mou, Xin-yu Dai, Jia-Jun Chen, Zhaopeng Tu

The Past and Future contents are fed to both the attention model and the decoder states, which offers NMT systems the knowledge of translated and untranslated contents.

Machine Translation Translation

Dynamic Oracle for Neural Machine Translation in Decoding Phase

no code implementations LREC 2018 Zi-Yi Dou, Hao Zhou, Shu-Jian Huang, Xin-yu Dai, Jia-Jun Chen

However, there are certain limitations in Scheduled Sampling, and we propose two dynamic oracle-based methods to improve it.

Machine Translation Translation

Neural Machine Translation with Word Predictions

no code implementations EMNLP 2017 Rongxiang Weng, Shu-Jian Huang, Zaixiang Zheng, Xin-yu Dai, Jia-Jun Chen

In the encoder-decoder architecture for neural machine translation (NMT), the hidden states of the recurrent structures in the encoder and decoder carry the crucial information about the sentence. These vectors are generated by parameters which are updated by back-propagation of translation errors through time.

Machine Translation Translation

Integrating Reviews into Personalized Ranking for Cold Start Recommendation

no code implementations 31 Jan 2017 Guang-Neng Hu, Xin-yu Dai

On top of text features, we uncover the review dimensions that explain the variation in users' feedback; these review factors represent a prior preference of users.

Collaborative Filtering Word Embeddings

A Synthetic Approach for Recommendation: Combining Ratings, Social Relations, and Reviews

no code implementations 11 Jan 2016 Guang-Neng Hu, Xin-yu Dai, Yunya Song, Shu-Jian Huang, Jia-Jun Chen

Recommender systems (RSs) provide an effective way of alleviating the information overload problem by selecting personalized choices.

Recommendation Systems

Topic2Vec: Learning Distributed Representations of Topics

no code implementations 28 Jun 2015 Li-Qiang Niu, Xin-yu Dai

Latent Dirichlet Allocation (LDA), which mines the thematic structure of documents, plays an important role in natural language processing and machine learning.

Non-linear Learning for Statistical Machine Translation

no code implementations IJCNLP 2015 Shujian Huang, Huadong Chen, Xin-yu Dai, Jia-Jun Chen

The linear combination assumes that all the features are in a linear relationship and constrains each feature to interact with the rest in a linear manner, which might limit the expressive power of the model and lead to an under-fit model on the current data.

Machine Translation Translation
