The effects of regularisation on RNN models for time series forecasting: Covid-19 as an example

9 May 2021  ·  Marcus Carpenter, Chunbo Luo, Xiao-Si Wang ·

Many research papers that propose models to predict the course of the COVID-19 pandemic either use handcrafted statistical models or large neural networks. Even though large neural networks are more powerful than simpler statistical models, they are especially hard to train on small datasets. This paper presents a model that is both more flexible than the other proposed neural networks and effective on smaller datasets. To improve performance on small data, six regularisation methods were tested. The results show that the GRU combined with 20% Dropout achieved the lowest RMSE scores. The main finding was that models with less access to data relied more on the regulariser. Applying Dropout to a GRU model trained on only 28 days of data reduced the RMSE by 23%.
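As a rough illustration of the best-performing combination reported in the abstract (a GRU with 20% Dropout for univariate time series forecasting), the sketch below builds such a model in Keras. The window length, layer width, optimiser and training settings are illustrative assumptions, not values taken from the paper.

```python
import numpy as np
from tensorflow.keras import layers, models

def make_windows(series, window=7):
    """Sliding-window dataset: predict the next day's value from the previous `window` days.
    The window length of 7 is an assumption for illustration only."""
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i:i + window])
        y.append(series[i + window])
    return np.array(X)[..., None], np.array(y)

# GRU forecaster followed by 20% Dropout, the combination the abstract
# reports as achieving the lowest RMSE. Layer sizes are assumptions.
model = models.Sequential([
    layers.Input(shape=(7, 1)),
    layers.GRU(32),
    layers.Dropout(0.2),
    layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# Synthetic data standing in for a 28-day case-count series.
series = np.random.rand(28).astype("float32")
X, y = make_windows(series)
model.fit(X, y, epochs=10, verbose=0)
```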


