no code implementations • 28 May 2024 • Xiaocheng Yang, Bingsen Chen, Yik-Cheung Tam
We hypothesize that an LLM should focus on extracting predicates and generating symbolic formulas from the math problem description so that the underlying calculation can be done via an external code interpreter.
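A minimal sketch of this division of labor, assuming a hypothetical `call_llm` helper and using SymPy as the external interpreter (the paper targets Prolog-style predicates, so this is only illustrative):

```python
# Minimal sketch (not the paper's actual pipeline): the LLM is prompted to emit
# a symbolic formula only, and SymPy acts as the external interpreter that does
# the arithmetic. `call_llm` is a hypothetical stand-in for any completion API.
import sympy as sp

def call_llm(prompt: str) -> str:
    # Hypothetical LLM call; hard-coded here with the kind of output we assume
    # the model would emit for: "Tom has 3 boxes of 12 apples and eats 5."
    return "total = 3 * 12 - 5"

def solve_with_interpreter(problem: str) -> dict:
    formula = call_llm(f"Extract symbolic formulas for: {problem}")
    lhs, rhs = formula.split("=")
    # The external interpreter (SymPy) performs the calculation, not the LLM.
    return {lhs.strip(): sp.sympify(rhs)}

print(solve_with_interpreter("Tom has 3 boxes of 12 apples and eats 5. How many are left?"))
# {'total': 31}
```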
no code implementations • 7 Sep 2023 • Xiaocheng Yang, Yik-Cheung Tam
Consequently, we fine-tune LLaMA7B with chain-of-thought as a baseline model, and develop further fine-tuned LLaMA7B models for the generation of Prolog code, Prolog code + chain-of-thought, and chain-of-thought + Prolog code, respectively.
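For illustration, the four fine-tuning target formats could be assembled from one annotated example roughly as follows; the field names (`cot`, `prolog`) are hypothetical and not the paper's actual data schema:

```python
# Illustrative sketch of building the four supervision targets from one example.
def build_target(example: dict, mode: str) -> str:
    cot, prolog = example["cot"], example["prolog"]
    targets = {
        "cot": cot,                          # chain-of-thought baseline
        "prolog": prolog,                    # Prolog code only
        "prolog+cot": prolog + "\n" + cot,   # Prolog code followed by CoT
        "cot+prolog": cot + "\n" + prolog,   # CoT followed by Prolog code
    }
    return targets[mode]
```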
1 code implementation • 8 Nov 2022 • Yik-Cheung Tam, Jiacheng Xu, Jiakai Zou, Zecheng Wang, Tinglong Liao, Shuhan Yuan
Knowledge cluster classification is boosted from 0.7924 to 0.9333 in Recall@1.
Automatic Speech Recognition (ASR) +4
1 code implementation • 6 Nov 2022 • Zecheng Wang, Yik-Cheung Tam
SUREALM employs an embedding retriever to search for training sentences in a data store that share similar word history during sequence generation.
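A minimal sketch of that retrieval step, assuming a sentence-transformers encoder and an in-memory list standing in for the data store (not SUREALM's actual index):

```python
# Toy sketch: embed the word history generated so far and retrieve the most
# similar training sentences by cosine similarity over normalized embeddings.
import numpy as np
from sentence_transformers import SentenceTransformer

encoder = SentenceTransformer("all-MiniLM-L6-v2")
data_store = [
    "the meeting was rescheduled to friday afternoon",
    "the flight was delayed because of heavy snow",
    "she booked a table for two at the new restaurant",
]
store_emb = encoder.encode(data_store, normalize_embeddings=True)

def retrieve(word_history: str, k: int = 2) -> list:
    # Embed the partial sequence generated so far and return the k training
    # sentences whose embeddings are closest in cosine similarity.
    query = encoder.encode([word_history], normalize_embeddings=True)[0]
    scores = store_emb @ query
    return [data_store[i] for i in np.argsort(-scores)[:k]]

print(retrieve("the flight was"))
```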
1 code implementation • 7 Dec 2021 • Yichen Huang, Yuchen Wang, Yik-Cheung Tam
Our model ranks second in the official evaluation on the object coreference resolution task with an F1 score of 73.3% after model ensembling.
no code implementations • 25 Aug 2021 • Yuhao Ding, Yik-Cheung Tam
In multi-domain task-oriented dialog systems, user utterances and system responses may mention multiple named entities and attribute values.
1 code implementation • 11 Mar 2020 • Changyu Miao, Zhen Cao, Yik-Cheung Tam
Deep Semantic Matching is a crucial component in various natural language processing applications such as question answering (QA), where an input query is compared to each candidate question in a QA corpus in terms of relevance.
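As a toy illustration of scoring each candidate question against the input query, here is a TF-IDF cosine-similarity stand-in for the deep matching model itself:

```python
# Toy relevance ranking: every candidate question is scored against the query.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

candidates = [
    "how do I reset my password",
    "what is the refund policy",
    "how can I change my shipping address",
]
query = "I forgot my password, how do I recover it"

vectorizer = TfidfVectorizer().fit(candidates + [query])
cand_vecs = vectorizer.transform(candidates)
query_vec = vectorizer.transform([query])

scores = cosine_similarity(query_vec, cand_vecs)[0]
for cand, score in sorted(zip(candidates, scores), key=lambda x: -x[1]):
    print(f"{score:.3f}  {cand}")
```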
1 code implementation • ACL 2019 • Zhi-Qiang Liu, Zuohui Fu, Jie Cao, Gerard de Melo, Yik-Cheung Tam, Cheng Niu, Jie Zhou
Rhetoric is a vital element in modern poetry, and plays an essential role in improving its aesthetics.
no code implementations • NAACL 2018 • Haohui Deng, Yik-Cheung Tam
GA Reader makes two assumptions: (1) attention is uni-directional, using the input query to gate the token encodings of a document; (2) only the encoding at the cloze position of the input query is considered for answer prediction.
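The two assumptions can be sketched in plain NumPy as follows; the dimensions and cloze index are illustrative, not the paper's configuration:

```python
# Sketch of the two GA Reader assumptions: (1) the query gates document token
# encodings multiplicatively; (2) only the query encoding at the cloze position
# scores candidate answer tokens.
import numpy as np

rng = np.random.default_rng(0)
D = rng.standard_normal((20, 64))   # document token encodings (T_doc x d)
Q = rng.standard_normal((8, 64))    # query token encodings  (T_query x d)
cloze_pos = 3                       # index of the blank in the query

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# (1) Uni-directional gated attention: each document token attends over the
# query, and the resulting query summary gates it element-wise.
alpha = softmax(D @ Q.T, axis=1)          # (T_doc x T_query)
gated_D = D * (alpha @ Q)                 # element-wise gating

# (2) Answer prediction uses only the query encoding at the cloze position.
answer_scores = gated_D @ Q[cloze_pos]    # one score per document token
print(answer_scores.argmax())
```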
no code implementations • NeurIPS 2008 • Yik-Cheung Tam, Tanja Schultz
We propose using correlated bigram LSA for unsupervised LM adaptation for automatic speech recognition.
Automatic Speech Recognition (ASR) +2
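As background, a common way LSA-style topic marginals are folded into an n-gram LM is unigram rescaling of the background model; the sketch below is a simplified toy of that idea, not the paper's full correlated bigram LSA model:

```python
# Simplified marginal-adaptation sketch: rescale the background LM by
# (p_lsa / p_bg)^beta and renormalize; probabilities here are made-up toy values.
import numpy as np

vocab = ["the", "game", "match", "court"]
p_bg_given_h = np.array([0.5, 0.2, 0.2, 0.1])   # background p(w | history)
p_bg = np.array([0.4, 0.2, 0.2, 0.2])           # background unigram marginal
p_lsa = np.array([0.3, 0.1, 0.3, 0.3])          # LSA marginal inferred from the test document
beta = 0.7                                      # adaptation strength

adapted = p_bg_given_h * (p_lsa / p_bg) ** beta
adapted /= adapted.sum()
for w, p in zip(vocab, adapted):
    print(f"p({w} | history) = {p:.3f}")
```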