no code implementations • 20 Feb 2022 • Ruoqi Shen, Liyao Gao, Yi-An Ma
We demonstrate experimentally that our theoretical results on the optimal early stopping time correspond to the training process of deep neural networks.
no code implementations • 22 Nov 2021 • Liyao Gao, Guang Lin, Wei Zhu
Incorporating group symmetry directly into the learning process has proved to be an effective guideline for model design.
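As a rough illustration of this idea (not the construction used in the paper), one simple way to build rotation symmetry into a model is to average its predictions over a discrete rotation group; the model and tensor shapes below are hypothetical.

```python
# Minimal sketch (illustrative only, not the paper's method):
# make an image classifier invariant to 90-degree rotations (the C4 group)
# by averaging its outputs over all four rotated copies of the input.
import torch

def c4_invariant_forward(model, x):
    """x: image batch of shape (N, C, H, W); returns C4-invariant predictions."""
    outs = [model(torch.rot90(x, k, dims=(2, 3))) for k in range(4)]
    return torch.stack(outs, dim=0).mean(dim=0)
```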
no code implementations • 29 Sep 2021 • Ruoqi Shen, Liyao Gao, Yian Ma
Early stopping is a simple and widely used method to prevent over-training of neural networks.
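For reference, a generic early-stopping loop looks like the hedged sketch below; the `train_step` and `val_loss_fn` callables are placeholders, and the patience-based rule is a common heuristic rather than the optimal stopping time studied in the paper.

```python
# Generic patience-based early stopping (a common heuristic, not the
# optimal stopping rule analyzed in the paper). `train_step` and
# `val_loss_fn` are assumed user-supplied callables; `model` is assumed
# to be a torch.nn.Module so that state_dict()/load_state_dict() exist.
import copy

def train_with_early_stopping(model, train_step, val_loss_fn,
                              max_epochs=100, patience=5):
    best_val, best_state, bad_epochs = float("inf"), None, 0
    for _ in range(max_epochs):
        train_step(model)                  # one epoch of training
        val_loss = val_loss_fn(model)      # held-out validation loss
        if val_loss < best_val:
            best_val, bad_epochs = val_loss, 0
            best_state = copy.deepcopy(model.state_dict())
        else:
            bad_epochs += 1
            if bad_epochs >= patience:     # stop once validation stalls
                break
    if best_state is not None:
        model.load_state_dict(best_state)  # restore the best checkpoint
    return best_val
```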
1 code implementation • 25 May 2021 • Dongxia Wu, Liyao Gao, Xinyue Xiong, Matteo Chinazzi, Alessandro Vespignani, Yi-An Ma, Rose Yu
Deep learning is gaining increasing popularity for spatiotemporal forecasting.
no code implementations • 12 Feb 2021 • Dongxia Wu, Liyao Gao, Xinyue Xiong, Matteo Chinazzi, Alessandro Vespignani, Yi-An Ma, Rose Yu
We introduce DeepGLEAM, a hybrid model for COVID-19 forecasting.
2 code implementations • ICML 2020 • Wei Deng, Qi Feng, Liyao Gao, Faming Liang, Guang Lin
Replica exchange Monte Carlo (reMC), also known as parallel tempering, is an important technique for accelerating the convergence of conventional Markov chain Monte Carlo (MCMC) algorithms.
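The core mechanism can be sketched in a few lines: run chains at different temperatures and occasionally propose to swap their states with a Metropolis correction. The toy example below targets a 1-D Gaussian and only illustrates the swap rule; it is not the method proposed in the paper.

```python
# Toy replica exchange (parallel tempering) on a 1-D Gaussian target.
# Illustrates the swap rule only; not the algorithm introduced in the paper.
import numpy as np

def energy(x):
    return 0.5 * x ** 2                      # -log density of N(0, 1), up to a constant

def metropolis_step(x, temp, rng, step=0.5):
    prop = x + step * rng.standard_normal()  # random-walk proposal
    accept = rng.random() < np.exp(-(energy(prop) - energy(x)) / temp)
    return prop if accept else x

rng = np.random.default_rng(0)
T_low, T_high = 1.0, 5.0                     # target chain and exploratory chain
x_low, x_high = 0.0, 0.0
samples = []
for _ in range(10_000):
    x_low = metropolis_step(x_low, T_low, rng)
    x_high = metropolis_step(x_high, T_high, rng)
    # Swap acceptance: min(1, exp((1/T_low - 1/T_high) * (E_low - E_high)))
    log_acc = (1.0 / T_low - 1.0 / T_high) * (energy(x_low) - energy(x_high))
    if np.log(rng.random()) < log_acc:
        x_low, x_high = x_high, x_low        # exchange replica states
    samples.append(x_low)                    # the T=1 chain targets the posterior
```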
no code implementations • 28 Apr 2020 • Liyao Gao, Yifan Du, Hongshan Li, Guang Lin
Rotation symmetry is a general property of most symmetric fluid systems.
no code implementations • 8 Jan 2019 • Liyao Gao, Zehua Cheng
CNNG is a hierarchy of neural networks that work cooperatively to handle different tasks independently within the same learning system.
no code implementations • 10 Apr 2018 • Liyao Gao
The Cortex Neural Network is a higher-level architecture of neural networks, motivated by the cerebral cortex in the brain, that handles different tasks within the same learning system.