Learning a Multi-Agent Controller for Shared Energy Storage System

16 Feb 2023 · Ruohong Liu, Yize Chen

Deployment of a shared energy storage system (SESS) allows users to draw on stored energy to meet their own demand and reduce energy costs without installing private storage equipment. In this paper, we consider a community of building users served by an SESS, where each user schedules power injections from both the grid and the SESS according to its demand and the real-time electricity price, aiming to minimize energy cost while satisfying its energy demand. The SESS is encouraged to charge when prices are low, so that it can supply as much energy as possible to users while achieving cost savings. However, due to the complex dynamics of buildings and time-varying external signals, finding high-performance power dispatch decisions in real time is challenging. By designing a multi-agent reinforcement learning framework with state-aware reward functions, the SESS and the users learn power scheduling policies that meet the users' energy demands and maintain the SESS's charging/discharging balance without additional communication, thereby optimizing energy cost. Compared with a baseline approach without SESS participation, energy costs are reduced by approximately 2.37% to 21.58%.
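As a rough illustration of the cost and storage dynamics described in the abstract, the sketch below computes each user's per-step energy cost (grid purchases at the real-time price) and a simple state-of-charge update for the shared storage. This is not the paper's implementation; all names, prices, efficiencies, and capacities (user_step_cost, sess_update, eta_c, eta_d, capacity) are assumptions introduced here for illustration only.

    import numpy as np

    def user_step_cost(price, p_grid, p_sess, sess_price=0.0):
        # Illustrative per-step cost for one building user:
        # pay the real-time price for grid energy; energy drawn from the
        # shared storage is priced at an assumed internal rate (0 here).
        return price * p_grid + sess_price * p_sess

    def sess_update(soc, p_charge, p_discharge,
                    eta_c=0.95, eta_d=0.95, capacity=100.0):
        # Simple state-of-charge update for the shared storage with
        # assumed charge/discharge efficiencies, clipped to [0, capacity].
        soc_next = soc + eta_c * p_charge - p_discharge / eta_d
        return float(np.clip(soc_next, 0.0, capacity))

    # Example: one scheduling step for two users and the SESS.
    price = 0.25                       # real-time electricity price ($/kWh), assumed
    demands = [4.0, 6.0]               # each user's energy demand this step (kWh)
    p_sess_alloc = [1.0, 2.0]          # energy each user draws from the SESS (kWh)
    p_grid = [d - s for d, s in zip(demands, p_sess_alloc)]
    costs = [user_step_cost(price, g, s) for g, s in zip(p_grid, p_sess_alloc)]
    soc = sess_update(soc=50.0, p_charge=0.0, p_discharge=sum(p_sess_alloc))
    print(costs, soc)

In a multi-agent reinforcement learning setup such as the one the paper describes, a per-agent reward could be built from the negative of this cost plus penalties for unmet demand or storage imbalance; the exact state-aware reward design used in the paper is not reproduced here.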
