Short-term Load Forecasting Based on Hybrid Strategy Using Warm-start Gradient Tree Boosting

23 May 2020 · Yuexin Zhang, Jiahong Wang

A deep-learning-based hybrid strategy for short-term load forecasting is presented. The strategy introduces a novel tree-based ensemble method, Warm-start Gradient Tree Boosting (WGTB). Current strategies either ensemble submodels of a single type, which fails to exploit the statistical strengths of different inference models, or simply sum the outputs of completely different inference models, which does not fully realize the potential of ensembling. Inspired by the bias-variance trade-off, WGTB is designed around the large disparities among inference models in accuracy, volatility, and linearity. The complete strategy integrates four inference models of different capacities, and WGTB then ensembles their outputs through a warm start and a hybrid of bagging and boosting, lowering bias and variance concurrently. The strategy is validated on two real hourly-resolution datasets from the State Grid Corporation of China. The results demonstrate the effectiveness of the proposed strategy, which hybridizes the statistical strengths of both low-bias and low-variance inference models.
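The warm-start ensembling step can be pictured as a meta-learner: heterogeneous base forecasters are trained first, and a gradient tree boosting model is then grown incrementally on their stacked predictions. The sketch below is a minimal illustration under that reading, using scikit-learn's GradientBoostingRegressor with warm_start=True and three generic placeholder regressors in place of the paper's four inference models and its specific bagging/boosting hybrid; it is not the authors' implementation.

```python
# Minimal sketch (not the authors' code): a warm-started gradient tree
# boosting meta-model grown over the stacked outputs of heterogeneous
# base forecasters. Base models and data here are placeholders.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))            # stand-in load features (lagged loads, weather, ...)
y = 2.0 * X[:, 0] + np.sin(X[:, 1]) + rng.normal(scale=0.1, size=500)
X_tr, X_val = X[:350], X[350:]
y_tr, y_val = y[:350], y[350:]

# Stage 1: train base inference models of different capacities.
base_models = [LinearRegression(), KNeighborsRegressor(), SVR()]
for m in base_models:
    m.fit(X_tr, y_tr)

def stack(models, X):
    """Use the base models' predictions as meta-features for the booster."""
    return np.column_stack([m.predict(X) for m in models])

# Stage 2: warm-started gradient tree boosting over the stacked outputs.
# warm_start=True keeps previously fitted trees, so the ensemble grows
# incrementally while validation error is monitored after each batch.
gbt = GradientBoostingRegressor(n_estimators=20, warm_start=True,
                                learning_rate=0.05, max_depth=3, random_state=0)
best_err, best_n = np.inf, 0
for n in range(20, 401, 20):
    gbt.set_params(n_estimators=n)
    gbt.fit(stack(base_models, X_tr), y_tr)
    err = float(np.mean((gbt.predict(stack(base_models, X_val)) - y_val) ** 2))
    if err < best_err:
        best_err, best_n = err, n

print(f"best validation MSE {best_err:.4f} with {best_n} trees")
```

The sketch only captures the boosting half of the hybrid: the booster corrects residual bias in the stacked base predictions, while the bagging-style component described in the paper would additionally dampen variance.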
