no code implementations • EMNLP 2021 • Yuanhang Zheng, Zhixing Tan, Meng Zhang, Mieradilijiang Maimaiti, Huanbo Luan, Maosong Sun, Qun Liu, Yang Liu
Quality estimation (QE) of machine translation (MT) aims to evaluate the quality of machine-translated sentences without references and is important in practical applications of MT.
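For readers unfamiliar with the task, here is a minimal, self-contained sketch of the generic sentence-level QE setup: predicting a quality score from a (source, MT output) pair without any reference translation. The surface features, toy data, and ridge regressor are illustrative assumptions and do not reflect this paper's model.

```python
import numpy as np

def qe_features(source: str, translation: str) -> np.ndarray:
    """Toy reference-free features for a (source, MT output) pair."""
    src, hyp = source.split(), translation.split()
    overlap = len(set(src) & set(hyp)) / max(len(set(src)), 1)
    length_ratio = len(hyp) / max(len(src), 1)
    return np.array([1.0, overlap, length_ratio])  # bias + 2 features

# Toy training pairs annotated with a quality score (e.g., 1 - HTER).
pairs = [("das haus ist klein", "the house is small", 0.9),
         ("das haus ist klein", "house house klein", 0.3)]
X = np.stack([qe_features(s, t) for s, t, _ in pairs])
y = np.array([score for *_, score in pairs])

# Closed-form ridge regression as the quality predictor.
lam = 0.1
w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

print(qe_features("das haus ist klein", "the small house") @ w)
```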
no code implementations • EMNLP 2021 • Mieradilijiang Maimaiti, Yang Liu, Yuanhang Zheng, Gang Chen, Kaiyu Huang, Ji Zhang, Huanbo Luan, Maosong Sun
Moreover, the robustness of previous neural methods is limited by their reliance on large-scale annotated data.
no code implementations • 7 Mar 2025 • Fengbin Zhu, Junfeng Li, Liangming Pan, Wenjie Wang, Fuli Feng, Chao Wang, Huanbo Luan, Tat-Seng Chua
Financial decision-making often relies on in-depth analysis across various data sources, including financial tables, news articles, and stock prices.
no code implementations • 25 Oct 2024 • Fengbin Zhu, Ziyang Liu, Xiang Yao Ng, Haohui Wu, Wenjie Wang, Fuli Feng, Chao Wang, Huanbo Luan, Tat-Seng Chua
Large Vision-Language Models (LVLMs) have achieved remarkable performance in many vision-language tasks, yet their capabilities in fine-grained visual understanding remain insufficiently evaluated.
1 code implementation • NeurIPS 2020 • Wenzheng Feng, Jie Zhang, Yuxiao Dong, Yu Han, Huanbo Luan, Qian Xu, Qiang Yang, Evgeny Kharlamov, Jie Tang
We study the problem of semi-supervised learning on graphs, for which graph neural networks (GNNs) have been extensively explored.
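As background for this entry, a small numpy sketch of the symmetric-normalized feature propagation at the core of GCN-style semi-supervised learning on graphs. It illustrates the generic GNN setup the abstract refers to, not the paper's specific random-propagation and consistency-regularization model; the graph, feature sizes, and weights are toy assumptions.

```python
import numpy as np

# Toy graph: 4 nodes, undirected edges as an adjacency matrix.
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)
X = np.random.randn(4, 8)            # node features
W = np.random.randn(8, 3)            # learnable weights (3 classes)

# Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}.
A_hat = A + np.eye(4)
d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
A_norm = d_inv_sqrt[:, None] * A_hat * d_inv_sqrt[None, :]

# One GCN-style layer; in semi-supervised training the loss is
# computed only on the few labeled nodes.
logits = np.maximum(A_norm @ X @ W, 0.0)
print(logits.shape)  # (4, 3)
```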
1 code implementation • 14 Jul 2020 • Xuancheng Huang, Jiacheng Zhang, Zhixing Tan, Derek F. Wong, Huanbo Luan, Jingfang Xu, Maosong Sun, Yang Liu
System combination is an important technique for combining the hypotheses of different machine translation systems to improve translation performance.
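To make the goal concrete, here is a toy consensus heuristic that picks, for each sentence, the hypothesis most similar (by token-level F1) to the other systems' outputs. This only illustrates what system combination is after; it is not the paper's learned neural combination model, and the similarity measure is an assumption for the sketch.

```python
def token_f1(a: str, b: str) -> float:
    """Symmetric token-overlap F1 between two hypotheses."""
    ta, tb = set(a.split()), set(b.split())
    common = len(ta & tb)
    if common == 0:
        return 0.0
    p, r = common / len(ta), common / len(tb)
    return 2 * p * r / (p + r)

def consensus_pick(hypotheses: list[str]) -> str:
    """Select the hypothesis that agrees most with the others."""
    def agreement(i: int) -> float:
        return sum(token_f1(hypotheses[i], o)
                   for j, o in enumerate(hypotheses) if j != i)
    return hypotheses[max(range(len(hypotheses)), key=agreement)]

systems = ["the house is small", "a house is small", "house small the"]
print(consensus_pick(systems))
```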
no code implementations • 5 Dec 2019 • Gang Chen, Yang Liu, Huanbo Luan, Meng Zhang, Qun Liu, Maosong Sun
While the use of neural networks has proven effective in improving story generation, how to learn to generate an explainable high-level plot remains a major challenge.
no code implementations • 26 Nov 2019 • Jiacheng Zhang, Huanbo Luan, Maosong Sun, FeiFei Zhai, Jingfang Xu, Yang Liu
The lack of alignment in NMT models leads to three problems: it is hard to (1) interpret the translation process, (2) impose lexical constraints, and (3) impose structural constraints.
2 code implementations • IJCNLP 2019 • Xuancheng Huang, Yang Liu, Huanbo Luan, Jingfang Xu, Maosong Sun
To better identify translation errors, our method learns the representations of source sentences and system outputs in an interactive way.
1 code implementation • IJCNLP 2019 • Shuo Wang, Yang Liu, Chao Wang, Huanbo Luan, Maosong Sun
While back-translation is simple and effective in exploiting abundant monolingual corpora to improve low-resource neural machine translation (NMT), the synthetic bilingual corpora generated by NMT models trained on limited authentic bilingual data are inevitably noisy.
Low-Resource Neural Machine Translation
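To make the back-translation setting above concrete, here is a minimal sketch of the standard back-translation loop. The target-to-source model is a dummy stub so the sketch runs as-is, and nothing below reflects the paper's method for coping with the noise in the synthetic half.

```python
def translate_tgt2src(sentence: str) -> str:
    # Stand-in for a target-to-source NMT model trained on the limited
    # authentic bilingual data; a dummy so the sketch runs as-is.
    return " ".join(reversed(sentence.split()))

# Small authentic bilingual corpus: (source, target) pairs.
authentic = [("das haus ist klein", "the house is small")]

# Abundant monolingual target-side text.
monolingual_tgt = ["a small red house", "the house is old"]

# Back-translate the monolingual data to obtain synthetic source sides.
synthetic = [(translate_tgt2src(t), t) for t in monolingual_tgt]

# The forward source-to-target model is then trained on the mix; the
# synthetic pairs are inevitably noisy, which is the problem the paper targets.
training_corpus = authentic + synthetic
print(training_corpus)
```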
1 code implementation • 21 Dec 2018 • Cheng Yang, Maosong Sun, Haoran Liu, Shiyi Han, Zhiyuan Liu, Huanbo Luan
Such strong assumptions oversimplify the complex diffusion mechanism and keep these models from fitting real-world cascade data well.
Social and Information Networks • Physics and Society
1 code implementation • ACL 2017 • Jiacheng Zhang, Yang Liu, Huanbo Luan, Jingfang Xu, Maosong Sun
Although neural machine translation has made significant progress recently, how to integrate multiple overlapping, arbitrary prior knowledge sources remains a challenge.
3 code implementations • EMNLP 2018 • Jiacheng Zhang, Huanbo Luan, Maosong Sun, FeiFei Zhai, Jingfang Xu, Min Zhang, Yang Liu
Although the Transformer translation model (Vaswani et al., 2017) has achieved state-of-the-art performance in a variety of translation tasks, how to use document-level context to handle discourse phenomena that are problematic for the Transformer remains a challenge.
no code implementations • EMNLP 2017 • Meng Zhang, Yang Liu, Huanbo Luan, Maosong Sun
By viewing word embedding spaces as distributions, we propose to minimize their earth mover's distance, a measure of divergence between distributions.
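For reference, the standard discrete earth mover's distance between two weighted point sets (here, word embeddings with their probability masses); this is the textbook formulation, not necessarily the exact objective optimized in the paper.

```latex
\[
\mathrm{EMD}(P, Q) \;=\; \min_{T \ge 0}\;
  \sum_{i=1}^{m}\sum_{j=1}^{n} T_{ij}\,\lVert x_i - y_j \rVert
\quad \text{s.t.} \quad
  \sum_{j=1}^{n} T_{ij} = p_i, \qquad \sum_{i=1}^{m} T_{ij} = q_j,
\]
% where P = {(x_i, p_i)} and Q = {(y_j, q_j)} are the two embedding sets
% with their masses, and T is the transport plan.
```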
no code implementations • ACL 2017 • Yanzhuo Ding, Yang Liu, Huanbo Luan, Maosong Sun
While neural machine translation (NMT) has made remarkable progress in recent years, it is hard to interpret its internal workings due to the continuous representations and non-linearity of neural networks.
no code implementations • ACL 2017 • Meng Zhang, Yang Liu, Huanbo Luan, Maosong Sun
In this work, we show that such cross-lingual connection can actually be established without any form of supervision.
6 code implementations • 20 Jun 2017 • Jiacheng Zhang, Yanzhuo Ding, Shiqi Shen, Yong Cheng, Maosong Sun, Huanbo Luan, Yang Liu
This paper introduces THUMT, an open-source toolkit for neural machine translation (NMT) developed by the Natural Language Processing Group at Tsinghua University.
no code implementations • COLING 2016 • Meng Zhang, Yang Liu, Huanbo Luan, Yiqun Liu, Maosong Sun
Being able to induce word translations from non-parallel data is often a prerequisite for cross-lingual processing in resource-scarce languages and domains.
1 code implementation • 22 Sep 2016 • Ruobing Xie, Zhiyuan Liu, Huanbo Luan, Maosong Sun
More specifically, we first construct representations for all images of an entity with a neural image encoder.
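As an illustration of turning several per-image embeddings into one entity-level representation, a small numpy sketch that attends over the images with respect to the entity's structure-based embedding and takes a weighted sum. The dimensions and the dot-product attention form are assumptions for illustration, not the paper's exact architecture.

```python
import numpy as np

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

d = 16
rng = np.random.default_rng(0)

# Pretend a neural image encoder already mapped an entity's 3 images
# into d-dimensional vectors (stand-ins for CNN features).
image_embeddings = rng.standard_normal((3, d))

# Structure-based embedding of the same entity (e.g., from a KB model).
entity_embedding = rng.standard_normal(d)

# Attention over the images w.r.t. the entity, then a weighted sum
# yields a single image-based representation of the entity.
weights = softmax(image_embeddings @ entity_embedding)
entity_image_repr = weights @ image_embeddings
print(weights, entity_image_repr.shape)
```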
no code implementations • ACL 2016 • Chunyang Liu, Yang Liu, Huanbo Luan, Maosong Sun, Heng Yu
We introduce an agreement-based approach to learning parallel lexicons and phrases from non-parallel corpora.
no code implementations • CVPR 2016 • Hanwang Zhang, Xindi Shang, Wenzhuo Yang, Huan Xu, Huanbo Luan, Tat-Seng Chua
Leveraging the structure of the proposed collaborative learning formulation, we develop an efficient online algorithm that can jointly learn the label embeddings and visual classifiers.
1 code implementation • EMNLP 2015 • Yankai Lin, Zhiyuan Liu, Huanbo Luan, Maosong Sun, Siwei Rao, Song Liu
Representation learning of knowledge bases (KBs) aims to embed both entities and relations into a low-dimensional space.
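For background, a minimal sketch of the classic translation-based scoring that this line of work builds on (TransE-style): a triple (h, r, t) is plausible when the head embedding plus the relation embedding lands near the tail embedding. The paper extends this direction, so the snippet below is only the shared starting point, not its full model; sizes and random embeddings are toy assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_entities, n_relations = 32, 5, 2

# Low-dimensional embeddings for entities and relations.
entity_emb = rng.standard_normal((n_entities, d))
relation_emb = rng.standard_normal((n_relations, d))

def score(h: int, r: int, t: int) -> float:
    """Higher is more plausible: negative distance of h + r from t."""
    return -float(np.linalg.norm(entity_emb[h] + relation_emb[r] - entity_emb[t]))

# Compare two candidate tails for the same (head, relation).
print(score(0, 1, 3), score(0, 1, 4))
```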