Interpretable User Models via Decision-rule Gaussian Processes: Preliminary Results on Energy Storage

Models of user behavior are critical inputs in many prescriptive settings and can be viewed as decision rules that transform state information available to the user into actions. Gaussian processes (GPs), as well as nonlinear extensions thereof, provide a flexible framework for learning user models in conjunction with approximate Bayesian inference. However, the resulting models may not be interpretable in general. We propose decision-rule GPs (DRGPs) that apply GPs in a transformed space defined by decision rules that have immediate interpretability to practitioners. We illustrate this modeling tool on a real energy storage application and show that structural variational inference techniques can be used with DRGPs. We find that DRGPs outperform the direct use of GPs in terms of out-of-sample performance.
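
The abstract does not spell out the decision rules, kernel, or variational scheme, so the following is only a minimal Python sketch of the core idea under invented assumptions: the raw state is mapped through hand-written, interpretable threshold rules (hypothetical "low price" and "battery nearly full" rules for an energy storage user), and a GP is fit on the resulting rule activations. An exact GP from scikit-learn is used here as a stand-in; the paper's structural variational inference is not implemented.

```python
# Hypothetical DRGP sketch: fit a GP on interpretable decision-rule features
# rather than on the raw state. All rules, thresholds, and data are invented.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Synthetic user data: state = (electricity price, battery state of charge),
# action = charge/discharge rate chosen by the user.
n = 200
price = rng.uniform(0.0, 1.0, n)
soc = rng.uniform(0.0, 1.0, n)
state = np.column_stack([price, soc])
action = np.where(price < 0.3, 1.0 - soc, -soc) + 0.05 * rng.standard_normal(n)

def decision_rule_features(s, price_thresholds=(0.3, 0.6), soc_thresholds=(0.5,)):
    """Map raw states to activations of interpretable (hypothetical) rules."""
    p, q = s[:, 0], s[:, 1]
    feats = [(p < t).astype(float) for t in price_thresholds]   # "price is low"
    feats += [(q > t).astype(float) for t in soc_thresholds]    # "battery is nearly full"
    feats += [(p < 0.3) * (1.0 - q)]                             # "charge headroom when cheap"
    return np.column_stack(feats)

Z = decision_rule_features(state)

# Standard GP regression in the rule-transformed space (exact inference here,
# in place of the structural variational inference used in the paper).
gp = GaussianProcessRegressor(
    kernel=RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-2),
    normalize_y=True,
)
gp.fit(Z, action)

# Predict the user's action for a new state, with predictive uncertainty.
new_state = np.array([[0.2, 0.4]])
mean, std = gp.predict(decision_rule_features(new_state), return_std=True)
print(f"predicted action: {mean[0]:.3f} +/- {std[0]:.3f}")
```

Because each feature is a named rule, the fitted model's behavior can be read off in terms of those rules, which is the interpretability benefit the abstract claims over applying a GP directly to the raw state.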
