no code implementations • 2 Oct 2024 • Xingxuan Li, Weiwen Xu, Ruochen Zhao, Fangkai Jiao, Shafiq Joty, Lidong Bing
We validate CR-Planner on challenging domain-knowledge-intensive and reasoning-heavy tasks, including competitive programming, theorem-driven math reasoning, and complex domain retrieval problems.
1 code implementation • 29 Jul 2024 • Wenxuan Zhang, Hou Pong Chan, Yiran Zhao, Mahani Aljunied, Jianyu Wang, Chaoqun Liu, Yue Deng, Zhiqiang Hu, Weiwen Xu, Yew Ken Chia, Xin Li, Lidong Bing
Large Language Models (LLMs) have shown remarkable abilities across various tasks, yet their development has predominantly centered on high-resource languages like English and Chinese, leaving low-resource languages underserved.
no code implementations • 24 Jun 2024 • Deng Cai, Huayang Li, Tingchen Fu, Siheng Li, Weiwen Xu, Shuaiyi Li, Bowen Cao, Zhisong Zhang, Xinting Huang, Leyang Cui, Yan Wang, Lemao Liu, Taro Watanabe, Shuming Shi
Despite the general capabilities of pre-trained large language models (LLMs), they still need further adaptation to better serve practical applications.
1 code implementation • 22 Dec 2023 • Weiwen Xu, Deng Cai, Zhisong Zhang, Wai Lam, Shuming Shi
CUT (LLaMA2-chat-13b) can also align LLMs in an iterative fashion using up-to-date model-specific judgments, improving performance from 81.09 to 91.68 points on AlpacaEval.
1 code implementation • 23 May 2023 • Weiwen Xu, Xin Li, Wai Lam, Lidong Bing
mPMR aims to guide multilingual pre-trained language models (mPLMs) to perform natural language understanding (NLU) including both sequence classification and span extraction in multiple languages.
1 code implementation • 9 Dec 2022 • Weiwen Xu, Xin Li, Wenxuan Zhang, Meng Zhou, Wai Lam, Luo Si, Lidong Bing
We present Pre-trained Machine Reader (PMR), a novel method for retrofitting pre-trained masked language models (MLMs) to pre-trained machine reading comprehension (MRC) models without acquiring labeled data.
1 code implementation • 17 Oct 2022 • Weiwen Xu, Xin Li, Yang Deng, Wai Lam, Lidong Bing
Specifically, a novel Peer Data Augmentation (PeerDA) approach is proposed, which employs span pairs holding the PR relation as augmentation data for training.
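The span-pair augmentation described above can be illustrated with a minimal sketch. This is an assumption-laden toy version, not the paper's implementation: the function names, data layout, and the idea of turning each peer pair into an extra extraction instance are illustrative only.

```python
# Hypothetical sketch of peer-span data augmentation in the spirit of PeerDA.
# Data layout and function names are assumptions, not the paper's API.
from itertools import combinations

def build_peer_pairs(spans_by_category):
    """Pair up spans that share a category, i.e. hold a peer (PR) relation."""
    pairs = []
    for category, spans in spans_by_category.items():
        for a, b in combinations(spans, 2):
            pairs.append({"category": category, "anchor": a, "peer": b})
    return pairs

def augment(original_examples, spans_by_category):
    """Append peer-pair instances to the original span-extraction training data."""
    augmented = list(original_examples)
    for p in build_peer_pairs(spans_by_category):
        # Each peer pair becomes one extra training instance: given the anchor
        # span, the model is asked to extract its peer from the context.
        augmented.append({"query": p["anchor"], "target": p["peer"]})
    return augmented
```

For example, three spans of the same category yield three peer pairs, so one original example plus those pairs gives four training instances.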
1 code implementation • 17 Oct 2022 • Weiwen Xu, Yang Deng, Wenqiang Lei, Wenlong Zhao, Tat-Seng Chua, Wai Lam
We study automatic Contract Clause Extraction (CCE) by modeling implicit relations in legal contracts.
1 code implementation • 14 Apr 2022 • Yang Deng, Wenxuan Zhang, Weiwen Xu, Wenqiang Lei, Tat-Seng Chua, Wai Lam
In this work, we propose a novel Unified MultI-goal conversational recommeNDer system, namely UniMIND.
no code implementations • 28 Feb 2022 • Weiwen Xu, Bowei Zou, Wai Lam, Ai Ti Aw
Recent techniques in Question Answering (QA) have achieved remarkable performance improvements, with some QA models even surpassing human performance.
1 code implementation • Findings (EMNLP) 2021 • Weiwen Xu, Yang Deng, Huihui Zhang, Deng Cai, Wai Lam
We propose a novel Chain Guided Retriever-reader (CGR) framework to model the reasoning chain for multi-hop Science Question Answering.
1 code implementation • Findings (ACL) 2021 • Weiwen Xu, Huihui Zhang, Deng Cai, Wai Lam
Our framework contains three new ideas: (a) AMR-SG, an AMR-based Semantic Graph, constructed from candidate fact AMRs to uncover any-hop relations among the question, the answer, and multiple facts.
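The graph construction sketched above can be made concrete with a toy version: facts become nodes, an edge links two facts whose concept sets overlap, and multi-hop relations fall out as graph paths. The concept sets below are hand-written stand-ins (real AMR parsing is not shown), and all names are illustrative assumptions rather than the paper's code.

```python
# Illustrative sketch of an AMR-based semantic graph: facts whose AMR
# concept sets share a concept are connected, exposing multi-hop paths.
# Concept extraction is assumed done; inputs here are plain sets of strings.
from collections import defaultdict, deque

def build_amr_sg(fact_concepts):
    """fact_concepts: {fact_id: set of AMR concepts}. Returns adjacency sets."""
    graph = defaultdict(set)
    ids = list(fact_concepts)
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            if fact_concepts[a] & fact_concepts[b]:  # shared concept => edge
                graph[a].add(b)
                graph[b].add(a)
    return graph

def hop_distance(graph, start, goal):
    """BFS over the fact graph: number of hops between two nodes, or -1."""
    seen, frontier = {start}, deque([(start, 0)])
    while frontier:
        node, d = frontier.popleft()
        if node == goal:
            return d
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, d + 1))
    return -1
```

With a question node sharing a concept with fact f1, and f1 sharing another concept with fact f2, the question reaches f2 in two hops, which is the kind of multi-hop relation the graph is meant to surface.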
1 code implementation • NAACL 2021 • Weiwen Xu, Ai Ti Aw, Yang Ding, Kui Wu, Shafiq Joty
Neural Machine Translation (NMT) has achieved significant breakthroughs in performance but is known to be vulnerable to input perturbations.
no code implementations • IJCNLP 2019 • Wenqiang Lei, Weiwen Xu, Ai Ti Aw, Yuanxin Xiang, Tat-Seng Chua
While achieving great fluency, current machine translation (MT) techniques are bottlenecked by adequacy issues.