no code implementations • ICML 2020 • Kangrui Wang, Oliver Hamelijnck, Theodoros Damoulas, Mark Steel
We describe a framework for constructing non-separable non-stationary random fields that is based on an infinite mixture of convolved stochastic processes.
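A minimal sketch of the general process-convolution idea behind such constructions: a random field is built by smoothing white noise with a location-dependent kernel, so the resulting covariance is non-stationary. The spatially varying `lengthscale` function below is an illustrative assumption, not the paper's construction; the paper goes further by mixing infinitely many such convolved processes.

```python
import numpy as np

# Sketch of a process convolution on a 1-D grid (illustrative only):
# f(s) = sum_u k(s, u) * w(u) * du, where w is white noise and the
# smoothing kernel's width varies with s, giving non-stationarity.

rng = np.random.default_rng(0)
grid = np.linspace(0.0, 1.0, 500)          # locations u of the white noise
w = rng.standard_normal(grid.size)         # white noise on the grid

def lengthscale(s):
    # Hypothetical spatially varying lengthscale: smoother on the right.
    return 0.02 + 0.1 * s

def sample_field(points):
    # Convolve white noise with a Gaussian kernel whose width depends on s.
    du = grid[1] - grid[0]
    out = np.empty(points.size)
    for i, s in enumerate(points):
        ell = lengthscale(s)
        k = np.exp(-0.5 * ((s - grid) / ell) ** 2) / np.sqrt(2 * np.pi * ell**2)
        out[i] = np.sum(k * w) * du
    return out

f = sample_field(np.linspace(0.0, 1.0, 200))
print(f[:5])
```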
1 code implementation • 29 Mar 2024 • Peng Ding, Jiading Fang, Peng Li, Kangrui Wang, Xiaochen Zhou, Mo Yu, Jing Li, Matthew R. Walter, Hongyuan Mei
The task is question answering: for each maze, a large language model reads the walkthrough and answers hundreds of mapping and navigation questions such as "How should you go to Attic from West of House?"
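A minimal sketch of how such an evaluation can be driven, assuming a hypothetical `complete` function standing in for any LLM API call; the benchmark's actual data format and harness are not shown here.

```python
# Sketch of the evaluation protocol, not the benchmark's actual harness:
# prepend the maze walkthrough to each question and collect the model's
# free-form answers for later scoring.

def complete(prompt: str) -> str:
    # Hypothetical LLM call; replace with a real API client.
    raise NotImplementedError

def answer_questions(walkthrough: str, questions: list[str]) -> list[str]:
    answers = []
    for q in questions:
        prompt = (
            "Below is a walkthrough of a maze.\n\n"
            f"{walkthrough}\n\n"
            f"Question: {q}\nAnswer:"
        )
        answers.append(complete(prompt).strip())
    return answers
```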
2 code implementations • NeurIPS 2023 • Xiaoming Shi, Siqiao Xue, Kangrui Wang, Fan Zhou, James Y. Zhang, Jun Zhou, Chenhao Tan, Hongyuan Mei
Large language models have shown impressive performance on a wide range of reasoning tasks.
2 code implementations • 28 Mar 2023 • Hongyu Zhao, Kangrui Wang, Mo Yu, Hongyuan Mei
In this paper, we propose LEAP, a novel system that uses language models to perform multi-step logical reasoning and incorporates explicit planning into the inference procedure.
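A minimal skeleton of explicit planning during inference, as one reading of the general idea rather than LEAP's exact algorithm: at each step, sample several candidate reasoning steps, score each by lookahead, and keep the best. `propose_steps` and `lookahead_score` are hypothetical helpers that would be backed by language-model calls.

```python
# Sketch of planning-guided multi-step reasoning (illustrative skeleton).

def propose_steps(context: str, k: int = 4) -> list[str]:
    # Hypothetical: ask a language model for k candidate next reasoning steps.
    raise NotImplementedError

def lookahead_score(context: str, step: str) -> float:
    # Hypothetical: estimate how promising the partial derivation becomes
    # if this step is taken, e.g. by scoring a few rolled-out continuations.
    raise NotImplementedError

def plan_reasoning(question: str, max_steps: int = 5) -> str:
    context = question
    for _ in range(max_steps):
        candidates = propose_steps(context)
        best = max(candidates, key=lambda s: lookahead_score(context, s))
        context = context + "\n" + best
        if best.lower().startswith("therefore"):  # crude stop condition
            break
    return context
```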
no code implementations • 2 Dec 2021 • Dongrui Liu, Shaobo Wang, Jie Ren, Kangrui Wang, Sheng Yin, Huiqi Deng, Quanshi Zhang
In this paper, we focus on a typical two-phase phenomenon in the learning of multi-layer perceptrons (MLPs), and we aim to explain why feature diversity decreases during the first phase.
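A minimal PyTorch sketch of one way to track feature diversity during training: record the mean pairwise cosine similarity of hidden activations over a batch. This diversity measure is an illustrative choice, not necessarily the one analyzed in the paper.

```python
import torch
import torch.nn as nn

def feature_diversity(h: torch.Tensor) -> float:
    # h: (batch, dim) hidden activations; higher similarity = lower diversity.
    h = torch.nn.functional.normalize(h, dim=1)
    sim = h @ h.t()                             # pairwise cosine similarities
    n = h.size(0)
    off_diag = (sim.sum() - n) / (n * (n - 1))  # mean off-diagonal similarity
    return 1.0 - off_diag.item()

mlp = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))
x = torch.randn(128, 20)
with torch.no_grad():
    hidden = mlp[1](mlp[0](x))                  # activations after first layer
print(f"feature diversity: {feature_diversity(hidden):.3f}")
```

Logging this quantity once per epoch would expose the two-phase pattern the paper studies: diversity falling early in training before the second phase begins.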
1 code implementation • NeurIPS 2019 • Oliver Hamelijnck, Theodoros Damoulas, Kangrui Wang, Mark Girolami
We consider evidence integration from potentially dependent observation processes under varying spatio-temporal sampling resolutions and noise levels.
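A minimal sketch of the multi-resolution part of this idea under a single latent GP: a coarse observation is modeled as the average of the latent function over a support region, so its covariances are averages of the kernel. This illustrates only the varying-resolution aspect; the paper's model additionally handles dependence between observation processes and differing noise levels.

```python
import numpy as np

def rbf(a, b, ell=0.3):
    # Squared-exponential kernel between two sets of 1-D inputs.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

# Fine-resolution point observations of a latent function f.
x_fine = np.array([0.1, 0.2, 0.7])
y_fine = np.sin(2 * np.pi * x_fine)

# Coarse observation: average of f over [0.4, 0.6], approximated on a grid.
cell = np.linspace(0.4, 0.6, 50)
y_coarse = np.array([np.mean(np.sin(2 * np.pi * cell))])

# Joint covariance of [fine points, coarse average] under the latent GP.
K_ff = rbf(x_fine, x_fine)
K_fc = rbf(x_fine, cell).mean(axis=1, keepdims=True)   # Cov(f(x), avg f)
K_cc = np.array([[rbf(cell, cell).mean()]])            # Var(avg f)
K = np.block([[K_ff, K_fc], [K_fc.T, K_cc]]) + 1e-4 * np.eye(4)

# Posterior mean of f at new points, conditioning on both resolutions.
x_new = np.linspace(0, 1, 5)
K_star = np.hstack([rbf(x_new, x_fine),
                    rbf(x_new, cell).mean(axis=1, keepdims=True)])
mean = K_star @ np.linalg.solve(K, np.concatenate([y_fine, y_coarse]))
print(mean)
```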