Gradient boosting machines and careful pre-processing work best: ASHRAE Great Energy Predictor III lessons learned

7 Feb 2022  ·  Clayton Miller, Liu Hao, Chun Fu ·

The ASHRAE Great Energy Predictor III (GEPIII) competition, held in late 2019, was one of the largest machine learning competitions ever focused on building performance. It was hosted on the Kaggle platform and resulted in 39,402 prediction submissions, with the top five teams splitting $25,000 in prize money. This paper outlines lessons learned from participants, mainly from teams who scored in the top 5% of the competition. Insights were gathered through an online survey, analysis of publicly shared submissions and notebooks, and the documentation of the winning teams. The top-performing solutions mostly used ensembles of Gradient Boosting Machine (GBM) tree-based models, with the LightGBM package being the most popular. The survey participants indicated that the preprocessing and feature extraction phases were the most important aspects of creating the best modeling approach. All the survey respondents used Python as their primary modeling tool, and it was common to use Jupyter-style notebooks as development environments. These conclusions are essential to help steer the research and practical implementation of building energy meter prediction in the future.
