Search Results for author: Liangchen Luo

Found 6 papers, 3 papers with code

Large-Scale Generative Data-Free Distillation

no code implementations · 10 Dec 2020 · Liangchen Luo, Mark Sandler, Zi Lin, Andrey Zhmoginov, Andrew Howard

Knowledge distillation is one of the most popular and effective techniques for knowledge transfer, model compression and semi-supervised learning.
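The listing above does not describe the paper's data-free generator, but the knowledge-distillation objective it builds on is standard: the student is trained to match the teacher's temperature-softened output distribution. A minimal sketch of that soft-target KL loss (the function names here are illustrative, not from the paper):

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution.
    z = np.asarray(logits, dtype=float) / T
    e = np.exp(z - z.max())
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # KL divergence between the teacher's and student's softened
    # output distributions, the classic distillation objective.
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)
    return float(np.sum(p * (np.log(p) - np.log(q))))

loss = distillation_loss([1.0, 2.0, 0.5], [1.2, 1.9, 0.4])
```

Data-free variants replace the real training inputs with synthesized ones, but the matching loss keeps this shape.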

Knowledge Distillation Model Compression +1

MUSE: Parallel Multi-Scale Attention for Sequence to Sequence Learning

2 code implementations · 17 Nov 2019 · Guangxiang Zhao, Xu Sun, Jingjing Xu, Zhiyuan Zhang, Liangchen Luo

In this work, we explore parallel multi-scale representation learning on sequence data, striving to capture both long-range and short-range language structures.

Machine Translation Representation Learning +1

Adaptive Gradient Methods with Dynamic Bound of Learning Rate

5 code implementations · ICLR 2019 · Liangchen Luo, Yuanhao Xiong, Yan Liu, Xu Sun

Recent work has put forward algorithms such as AMSGrad to tackle this issue, but they have failed to achieve considerable improvement over existing methods.
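The "dynamic bound" in the title refers to clipping each parameter's adaptive step size between a lower and an upper bound that both converge to a fixed final learning rate, so the optimizer behaves like Adam early on and like SGD late in training. A sketch of one such update step (signatures and defaults here are illustrative):

```python
import numpy as np

def adabound_step(param, grad, m, v, t, lr=1e-3, final_lr=0.1,
                  betas=(0.9, 0.999), gamma=1e-3, eps=1e-8):
    # One Adam-style update whose per-parameter learning rate is
    # clipped between bounds that both converge to final_lr, so the
    # method transitions smoothly from adaptive steps toward SGD.
    b1, b2 = betas
    m = b1 * m + (1 - b1) * grad            # first-moment estimate
    v = b2 * v + (1 - b2) * grad ** 2       # second-moment estimate
    step_size = lr * np.sqrt(1 - b2 ** t) / (1 - b1 ** t)
    lower = final_lr * (1 - 1 / (gamma * t + 1))  # rises toward final_lr
    upper = final_lr * (1 + 1 / (gamma * t))      # falls toward final_lr
    eta = np.clip(step_size / (np.sqrt(v) + eps), lower, upper)
    return param - eta * m, m, v

# Minimize f(p) = p**2 from p = 5.0.
p, m, v = 5.0, 0.0, 0.0
for t in range(1, 101):
    p, m, v = adabound_step(p, 2 * p, m, v, t)
```

Because the bounds squeeze together over time, late-stage updates use an effectively constant learning rate regardless of the second-moment statistics.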

Text Assisted Insight Ranking Using Context-Aware Memory Network

no code implementations · 13 Nov 2018 · Qi Zeng, Liangchen Luo, Wenhao Huang, Yang Tang

Extracting valuable facts or informative summaries from multi-dimensional tables, i.e., insight mining, is an important task in data analysis and business intelligence.

Learning Personalized End-to-End Goal-Oriented Dialog

no code implementations · 12 Nov 2018 · Liangchen Luo, Wenhao Huang, Qi Zeng, Zaiqing Nie, Xu Sun

Most existing works on dialog systems only consider conversation content while neglecting the personality of the user the bot is interacting with, which begets several unsolved issues.

Goal-Oriented Dialog

An Auto-Encoder Matching Model for Learning Utterance-Level Semantic Dependency in Dialogue Generation

1 code implementation · EMNLP 2018 · Liangchen Luo, Jingjing Xu, Junyang Lin, Qi Zeng, Xu Sun

Unlike conventional text generation tasks, the mapping between inputs and responses in conversations is more complicated, which demands an understanding of utterance-level semantic dependency, the relation between the whole meanings of inputs and outputs.

Dialogue Generation
