no code implementations • 18 Jun 2021 • Marco Fornoni, Chaochao Yan, Liangchen Luo, Kimberly Wilber, Alex Stark, Yin Cui, Boqing Gong, Andrew Howard
When interacting with objects through cameras or pictures, users often have a specific intent.
no code implementations • 10 Dec 2020 • Liangchen Luo, Mark Sandler, Zi Lin, Andrey Zhmoginov, Andrew Howard
Knowledge distillation is one of the most popular and effective techniques for knowledge transfer, model compression and semi-supervised learning.
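The distillation objective this entry refers to is commonly implemented as a KL divergence between temperature-softened teacher and student distributions. A minimal numpy sketch of that standard (Hinton-style) formulation, not necessarily the exact loss used in this paper:

```python
import numpy as np

def softmax(z, T=1.0):
    """Softmax with temperature T; higher T gives softer distributions."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) on temperature-softened logits,
    scaled by T^2 so gradients stay comparable across temperatures."""
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student predictions
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))
```

In practice this term is mixed with the ordinary cross-entropy on hard labels via a weighting coefficient.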
2 code implementations • 17 Nov 2019 • Guangxiang Zhao, Xu Sun, Jingjing Xu, Zhiyuan Zhang, Liangchen Luo
In this work, we explore parallel multi-scale representation learning on sequence data, striving to capture both long-range and short-range language structures.
Ranked #8 on Machine Translation on WMT2014 English-French
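The idea of capturing long-range and short-range structure in parallel can be illustrated with two branches over the same sequence: global self-attention for long-range dependencies and a local windowed operation for short-range ones. A hedged numpy sketch (local averaging stands in for a learned convolution; all names are illustrative, not the paper's actual architecture):

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def parallel_multiscale(x, window=3):
    """Two branches computed in parallel over x of shape (seq_len, dim):
    a global self-attention branch (long-range) and a sliding-window
    averaging branch (short-range), fused by summation."""
    # long-range branch: plain dot-product self-attention over the full sequence
    attn = softmax(x @ x.T / np.sqrt(x.shape[1]))
    global_out = attn @ x
    # short-range branch: mean over a local window around each position
    n = len(x)
    local_out = np.stack([
        x[max(0, i - window // 2): i + window // 2 + 1].mean(axis=0)
        for i in range(n)
    ])
    return global_out + local_out  # parallel fusion of the two scales
```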
5 code implementations • ICLR 2019 • Liangchen Luo, Yuanhao Xiong, Yan Liu, Xu Sun
Recent work has put forward algorithms such as AMSGrad to tackle this issue, but they fail to achieve considerable improvement over existing methods.
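This line of work on taming adaptive optimizers can be sketched as an Adam-style update whose per-parameter step size is clipped into bounds that tighten toward a constant, gradually blending adaptive behavior into SGD. A minimal numpy sketch; the bound schedules and hyperparameter names below are illustrative assumptions, not a definitive implementation of the paper's algorithm:

```python
import numpy as np

def adabound_step(w, g, m, v, t, lr=1e-3, final_lr=0.1,
                  beta1=0.9, beta2=0.999, gamma=1e-3, eps=1e-8):
    """One step of a dynamically bounded Adam-like update (t >= 1).
    The effective learning rate is clipped into [lower, upper],
    both of which converge to final_lr as t grows."""
    m = beta1 * m + (1 - beta1) * g          # first-moment (momentum) estimate
    v = beta2 * v + (1 - beta2) * g * g      # second-moment estimate
    # bias-corrected base step size, as in Adam
    step = lr * np.sqrt(1 - beta2 ** t) / (1 - beta1 ** t)
    lower = final_lr * (1 - 1 / (gamma * t + 1))  # rises toward final_lr
    upper = final_lr * (1 + 1 / (gamma * t))      # falls toward final_lr
    eta = np.clip(step / (np.sqrt(v) + eps), lower, upper)
    return w - eta * m, m, v
```

Early in training the bounds are loose (near-Adam behavior); late in training they pinch the step size toward `final_lr` (near-SGD behavior).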
no code implementations • 13 Nov 2018 • Qi Zeng, Liangchen Luo, Wenhao Huang, Yang Tang
Extracting valuable facts or informative summaries from multi-dimensional tables, i.e., insight mining, is an important task in data analysis and business intelligence.
no code implementations • 12 Nov 2018 • Liangchen Luo, Wenhao Huang, Qi Zeng, Zaiqing Nie, Xu Sun
Most existing work on dialog systems considers only conversation content, neglecting the personality of the user the bot is interacting with, which gives rise to several unresolved issues.
1 code implementation • EMNLP 2018 • Liangchen Luo, Jingjing Xu, Junyang Lin, Qi Zeng, Xu Sun
Unlike conventional text generation tasks, the mapping between inputs and responses in conversations is more complicated, demanding an understanding of utterance-level semantic dependency: the relation between the overall meanings of inputs and outputs.
Ranked #2 on Text Generation on DailyDialog