no code implementations • 4 Feb 2024 • Gaël Gendron, Bao Trung Nguyen, Alex Yuxuan Peng, Michael Witbrock, Gillian Dobbie
We show that such causal constraints can improve out-of-distribution performance on abstract and causal reasoning tasks.
1 code implementation • 13 Oct 2023 • Qiming Bao, Gael Gendron, Alex Yuxuan Peng, Wanjun Zhong, Neset Tan, Yang Chen, Michael Witbrock, Jiamou Liu
Despite their high performance on the original publicly available datasets, we find that all models perform poorly on these newly constructed datasets.
1 code implementation • 19 Sep 2023 • Qiming Bao, Juho Leinonen, Alex Yuxuan Peng, Wanjun Zhong, Gaël Gendron, Timothy Pistotti, Alice Huang, Paul Denny, Michael Witbrock, Jiamou Liu
When learnersourcing multiple-choice questions, creating explanations for the solution of a question is a crucial step; it helps other students understand the solution and promotes a deeper understanding of related concepts.
1 code implementation • 21 May 2023 • Qiming Bao, Alex Yuxuan Peng, Zhenyun Deng, Wanjun Zhong, Gael Gendron, Timothy Pistotti, Neset Tan, Nathan Young, Yang Chen, Yonghua Zhu, Paul Denny, Michael Witbrock, Jiamou Liu
Combining large language models with logical reasoning enhances their capacity to address problems robustly and reliably.
no code implementations • 14 Mar 2023 • Neşet Özkan Tan, Alex Yuxuan Peng, Joshua Bensemann, Qiming Bao, Tim Hartill, Mark Gahegan, Michael Witbrock
Because of the attention mechanism's high computational cost, transformer models usually have an input-length limit imposed by hardware constraints.
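The cost referred to here comes from self-attention scoring every token against every other token. A minimal sketch (illustrative only, not the paper's implementation) shows why memory grows quadratically with sequence length:

```python
import numpy as np

def attention_scores(q, k):
    """Scaled dot-product attention scores.

    The result is an (n, n) matrix, so memory grows
    quadratically with sequence length n.
    """
    d = q.shape[-1]
    return q @ k.T / np.sqrt(d)

n, d = 512, 64
rng = np.random.default_rng(0)
q = rng.standard_normal((n, d))
k = rng.standard_normal((n, d))
scores = attention_scores(q, k)
assert scores.shape == (n, n)  # O(n^2) memory in sequence length
```

Doubling the input length quadruples the size of this score matrix, which is why hardware memory limits translate directly into an input-length cap.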
1 code implementation • 28 Jul 2022 • Qiming Bao, Alex Yuxuan Peng, Tim Hartill, Neset Tan, Zhenyun Deng, Michael Witbrock, Jiamou Liu
In our model, reasoning is performed by an iterative RNN-based memory network with a gated attention mechanism.
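The general idea of one such reasoning step can be sketched as follows. This is a hypothetical, simplified illustration (the function and weight names are assumptions, not the authors' code): the current state attends over memory slots, and a sigmoid gate decides how much of the attended context replaces the old state.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def gated_attention_step(h, memory, W_g):
    """One iterative reasoning step (illustrative sketch).

    h:      current hidden state, shape (d,)
    memory: memory slots, shape (slots, d)
    W_g:    gate weights, shape (d, 2*d)
    """
    scores = memory @ h                    # attention logits over slots, (slots,)
    context = softmax(scores) @ memory     # attended memory summary, (d,)
    # Sigmoid gate over [state; context] controls the state update.
    gate = 1.0 / (1.0 + np.exp(-(W_g @ np.concatenate([h, context]))))
    return gate * context + (1.0 - gate) * h

d, slots = 8, 5
rng = np.random.default_rng(1)
h = rng.standard_normal(d)
memory = rng.standard_normal((slots, d))
W_g = rng.standard_normal((d, 2 * d))
h_next = gated_attention_step(h, memory, W_g)
assert h_next.shape == (d,)
```

Running the step repeatedly (feeding `h_next` back in) gives the iterative flavor described in the abstract: each pass re-attends over memory with an updated state.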