Search Results for author: Yuexian Hou

Found 18 papers, 4 papers with code

CRFR: Improving Conversational Recommender Systems via Flexible Fragments Reasoning on Knowledge Graphs

no code implementations EMNLP 2021 Jinfeng Zhou, Bo Wang, Ruifang He, Yuexian Hou

Although the paths along which user interests shift in knowledge graphs (KGs) can benefit conversational recommender systems (CRS), explicit reasoning on KGs has not been well explored in CRS, due to the complexity of high-order and incomplete paths.

Knowledge Graphs Recommendation Systems +1

TopKG: Target-oriented Dialog via Global Planning on Knowledge Graph

no code implementations COLING 2022 Zhitong Yang, Bo Wang, Jinfeng Zhou, Yue Tan, Dongming Zhao, Kun Huang, Ruifang He, Yuexian Hou

We design a global reinforcement learning scheme over the planned paths to flexibly adjust the local response generation model toward the global target.

Response Generation

Mind vs. Mouth: On Measuring Re-judge Inconsistency of Social Bias in Large Language Models

no code implementations 24 Aug 2023 Yachao Zhao, Bo Wang, Dongming Zhao, Kun Huang, Yan Wang, Ruifang He, Yuexian Hou

We propose that this re-judge inconsistency is analogous to the inconsistency between humans' unconscious implicit social bias and their conscious explicit social bias.

Syntax-Aware Complex-Valued Neural Machine Translation

no code implementations 17 Jul 2023 Yang Liu, Yuexian Hou

The experimental results demonstrate that the proposed method can bring significant improvements in BLEU scores on two datasets.

Machine Translation NMT +1

Enhancing Personalized Dialogue Generation with Contrastive Latent Variables: Combining Sparse and Dense Persona

1 code implementation 19 May 2023 Yihong Tang, Bo Wang, Miao Fang, Dongming Zhao, Kun Huang, Ruifang He, Yuexian Hou

We design a Contrastive Latent Variable-based model (CLV) that clusters the dense persona descriptions into sparse categories, which are combined with the history query to generate personalized responses.

Dialogue Generation

Constructing Holistic Measures for Social Biases in Masked Language Models

no code implementations 12 May 2023 Yang Liu, Yuexian Hou

In this paper, the log-likelihoods that masked language models (MLMs) assign to stereotype and anti-stereotype bias samples are modeled as Gaussian distributions.
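As a rough illustration of the idea (not the paper's exact measures — the function names and the KL-based comparison here are assumptions), one can fit a Gaussian to each set of log-likelihoods and quantify how far apart the two distributions are:

```python
import math
import statistics

def gaussian_kl(mu0, sigma0, mu1, sigma1):
    # KL( N(mu0, sigma0^2) || N(mu1, sigma1^2) ) for univariate Gaussians
    return math.log(sigma1 / sigma0) + (sigma0**2 + (mu0 - mu1)**2) / (2 * sigma1**2) - 0.5

def bias_divergence(stereo_ll, anti_ll):
    # Fit a Gaussian to each set of MLM log-likelihoods and compare them.
    # A model with no stereotype preference would yield near-identical
    # distributions, i.e. a divergence close to 0.
    mu_s, sd_s = statistics.fmean(stereo_ll), statistics.stdev(stereo_ll)
    mu_a, sd_a = statistics.fmean(anti_ll), statistics.stdev(anti_ll)
    return gaussian_kl(mu_s, sd_s, mu_a, sd_a)
```

Treating the score populations as distributions, rather than counting pairwise wins, is what makes such a measure "holistic": it uses every sample's log-likelihood, not just the sign of each pairwise comparison.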

Empathetic Response Generation via Emotion Cause Transition Graph

no code implementations 23 Feb 2023 Yushan Qian, Bo Wang, Ting-En Lin, Yinhe Zheng, Ying Zhu, Dongming Zhao, Yuexian Hou, Yuchuan Wu, Yongbin Li

Empathetic dialogue is a human-like behavior that requires the perception of both affective factors (e.g., emotion status) and cognitive factors (e.g., the cause of the emotion).

Empathetic Response Generation Response Generation

Mining Effective Features Using Quantum Entropy for Humor Recognition

1 code implementation 7 Feb 2023 Yang Liu, Yuexian Hou

Humor recognition has been extensively studied with a variety of methods in recent years.

Think Twice: A Human-like Two-stage Conversational Agent for Emotional Response Generation

no code implementations 12 Jan 2023 Yushan Qian, Bo Wang, Shangzhao Ma, Wu Bin, Shuo Zhang, Dongming Zhao, Kun Huang, Yuexian Hou

Towards human-like dialogue systems, current emotional dialogue approaches jointly model emotion and semantics with a unified neural network.

Response Generation

Aligning Recommendation and Conversation via Dual Imitation

no code implementations 5 Nov 2022 Jinfeng Zhou, Bo Wang, Minlie Huang, Dongming Zhao, Kun Huang, Ruifang He, Yuexian Hou

Human recommendation conversations naturally involve shifts of interest, which can align the recommendation actions with the conversation process to produce accurate recommendations with rich explanations.

Recommendation Systems

Label-Based Diversity Measure Among Hidden Units of Deep Neural Networks: A Regularization Method

no code implementations 19 Sep 2020 Chenguang Zhang, Yuexian Hou, Dawei Song, Liangzhu Ge, Yaoshuai Yao

In this paper, from an information-theoretic perspective, we introduce a new definition of redundancy to describe the diversity of hidden units under supervised learning settings, formalizing the effect of hidden layers on the generalization capacity as mutual information.
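The definition above builds on mutual information between hidden-unit activations and labels. As a minimal sketch of the underlying quantity (a plug-in estimator for discrete variables only — not the paper's estimator or regularizer), mutual information from paired samples can be computed as:

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    # I(X;Y) in nats from paired samples of two discrete variables,
    # using empirical (plug-in) probabilities p(x), p(y), p(x, y).
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    return sum((c / n) * math.log((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())
```

Under this view, a label-based diversity regularizer would reward hidden units whose (discretized) activations are individually informative about the labels while remaining non-redundant with one another.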

Interpretable Network Structure for Modeling Contextual Dependency

no code implementations 25 Sep 2019 Xindian Ma, Peng Zhang, Xiaoliu Mao, Yehua Zhang, Nan Duan, Yuexian Hou, Ming Zhou

Then, we show that the lower bound of such a separation rank can reveal the quantitative relation between the network structure (e.g., depth/width) and the modeling ability for contextual dependency.

Language Modelling Sentence +1

A Tensorized Transformer for Language Modeling

1 code implementation NeurIPS 2019 Xindian Ma, Peng Zhang, Shuai Zhang, Nan Duan, Yuexian Hou, Dawei Song, Ming Zhou

In this paper, based on the ideas of tensor decomposition and parameter sharing, we propose a novel self-attention model (namely Multi-linear attention) with Block-Term Tensor Decomposition (BTD).

Language Modelling Machine Translation +2
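As a toy sketch of the decomposition idea behind this entry — a single Tucker-style block, where a block-term decomposition would sum several such blocks; the sizes and variable names are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def tucker_reconstruct(core, Q, K, V):
    # Rebuild a 3-way tensor from a Tucker-style factorization:
    #   T[i, j, k] = sum_{p,q,r} core[p, q, r] * Q[i, p] * K[j, q] * V[k, r]
    # A block-term decomposition sums several such (core, Q, K, V) terms.
    return np.einsum('pqr,ip,jq,kr->ijk', core, Q, K, V)

rng = np.random.default_rng(0)
n, r = 4, 2  # toy sizes: sequence length 4, rank 2
core = rng.standard_normal((r, r, r))
Q = rng.standard_normal((n, r))
K = rng.standard_normal((n, r))
V = rng.standard_normal((n, r))

T = tucker_reconstruct(core, Q, K, V)  # shape (4, 4, 4)
# The factorization stores r**3 + 3*n*r parameters (32 here)
# instead of the n**3 (64 here) entries of the full tensor.
```

The parameter saving is the point: sharing the factor matrices across blocks and heads is what lets a model of this kind compress the attention layers while keeping a full three-way interaction among queries, keys, and values.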

A Position-aware Bidirectional Attention Network for Aspect-level Sentiment Analysis

1 code implementation COLING 2018 Shuqin Gu, Lipeng Zhang, Yuexian Hou, Yin Song

PBAN not only concentrates on the position information of aspect terms, but also mutually models the relation between the aspect term and the sentence by employing a bidirectional attention mechanism.

Aspect-Based Sentiment Analysis (ABSA) Feature Engineering +2

A Confident Information First Principle for Parametric Reduction and Model Selection of Boltzmann Machines

no code implementations 5 Feb 2015 Xiaozhao Zhao, Yuexian Hou, Dawei Song, Wenjie Li

We then revisit Boltzmann machines (BM) from a model selection perspective and theoretically show that both the fully visible BM (VBM) and the BM with hidden units can be derived from the general binary multivariate distribution using the CIF principle.

Density Estimation Dimensionality Reduction +1

Understanding Boltzmann Machine and Deep Learning via A Confident Information First Principle

no code implementations 16 Feb 2013 Xiaozhao Zhao, Yuexian Hou, Qian Yu, Dawei Song, Wenjie Li

Typical dimensionality reduction methods focus on directly reducing the number of random variables while retaining maximal variations in the data.

Density Estimation Dimensionality Reduction
