no code implementations • IJCNLP 2019 • Wenqiang Lei, Weiwen Xu, Ai Ti Aw, Yuanxin Xiang, Tat Seng Chua
While achieving great fluency, current machine translation (MT) techniques are bottlenecked by adequacy issues.
1 code implementation • NAACL 2021 • Weiwen Xu, Ai Ti Aw, Yang Ding, Kui Wu, Shafiq Joty
Neural Machine Translation (NMT) has achieved significant breakthroughs in performance but is known to be vulnerable to input perturbations.
1 code implementation • Findings (ACL) 2021 • Weiwen Xu, Huihui Zhang, Deng Cai, Wai Lam
Our framework contains three new ideas: (a) AMR-SG, an AMR-based Semantic Graph, constructed from candidate fact AMRs to uncover any-hop relations among the question, answer, and multiple facts.
1 code implementation • Findings (EMNLP) 2021 • Weiwen Xu, Yang Deng, Huihui Zhang, Deng Cai, Wai Lam
We propose a novel Chain Guided Retriever-reader (CGR) framework to model the reasoning chain for multi-hop Science Question Answering.
no code implementations • 28 Feb 2022 • Weiwen Xu, Bowei Zou, Wai Lam, Ai Ti Aw
Recent techniques in Question Answering (QA) have achieved remarkable performance improvements, with some QA models even surpassing human performance.
1 code implementation • 14 Apr 2022 • Yang Deng, Wenxuan Zhang, Weiwen Xu, Wenqiang Lei, Tat-Seng Chua, Wai Lam
In this work, we propose a novel Unified MultI-goal conversational recommeNDer system, namely UniMIND.
1 code implementation • 17 Oct 2022 • Weiwen Xu, Xin Li, Yang Deng, Wai Lam, Lidong Bing
Specifically, a novel Peer Data Augmentation (PeerDA) approach is proposed, which employs span pairs with the PR relation as augmentation data for training.
1 code implementation • 17 Oct 2022 • Weiwen Xu, Yang Deng, Wenqiang Lei, Wenlong Zhao, Tat-Seng Chua, Wai Lam
We study automatic Contract Clause Extraction (CCE) by modeling implicit relations in legal contracts.
1 code implementation • 9 Dec 2022 • Weiwen Xu, Xin Li, Wenxuan Zhang, Meng Zhou, Wai Lam, Luo Si, Lidong Bing
We present Pre-trained Machine Reader (PMR), a novel method for retrofitting pre-trained masked language models (MLMs) to pre-trained machine reading comprehension (MRC) models without acquiring labeled data.
1 code implementation • 23 May 2023 • Weiwen Xu, Xin Li, Wai Lam, Lidong Bing
mPMR aims to guide multilingual pre-trained language models (mPLMs) to perform natural language understanding (NLU) including both sequence classification and span extraction in multiple languages.
1 code implementation • 22 Dec 2023 • Weiwen Xu, Deng Cai, Zhisong Zhang, Wai Lam, Shuming Shi
As humans, we consistently engage in interactions with our peers and receive feedback in the form of natural language.