DRL-Clusters: Buffer Management with Clustering based Deep Reinforcement Learning

The buffer cache is widely used in database systems to reduce disk I/O. Existing database systems typically rely on heuristic algorithms for buffer replacement, which cannot dynamically adapt to changing workload patterns. This paper proposes DRL-Clusters, a deep reinforcement learning-based approach to managing the buffer pool under changing workloads. Through page re-clustering and continuous interaction with the cache environment, DRL-Clusters dynamically adapts to different workload patterns without incurring high inference overhead or a high miss ratio. Our evaluation results demonstrate that DRL-Clusters achieves a lower or comparable miss ratio relative to heuristic policies while reducing page access overhead by 13.3%-26.8% under changing workloads.
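To illustrate the general idea of clustering-based RL buffer management, the minimal sketch below (not the authors' code; all class names, parameters, and the reward/state definitions are assumptions for illustration) groups cached pages into clusters by a simple recency statistic, lets a tabular RL agent pick which cluster to evict from on a miss, and periodically re-clusters so the grouping can follow workload drift. The paper's actual method uses a deep network and richer features; this only shows the control loop.

```python
# Hypothetical sketch of a clustering-based RL cache policy (not DRL-Clusters itself).
import random
from collections import defaultdict

CACHE_SIZE = 64
NUM_CLUSTERS = 4
RECLUSTER_EVERY = 500          # assumed re-clustering period
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1

class ClusteredRLCache:
    def __init__(self):
        self.cache = {}                 # page_id -> (last_access_time, frequency)
        self.cluster_of = {}            # page_id -> cluster id
        self.q = defaultdict(float)     # (state, cluster) -> Q-value
        self.clock = 0

    def _recluster(self):
        # Crude 1-D grouping by recency quantiles; a real system would cluster
        # richer per-page features (frequency, reuse distance, etc.).
        pages = sorted(self.cache, key=lambda p: self.cache[p][0])
        for i, p in enumerate(pages):
            self.cluster_of[p] = min(i * NUM_CLUSTERS // len(pages), NUM_CLUSTERS - 1)

    def _state(self):
        # Placeholder state: cache occupancy bucket.
        return len(self.cache) * 4 // CACHE_SIZE

    def _choose_cluster(self, state):
        # Epsilon-greedy choice of the cluster to evict from.
        if random.random() < EPS:
            return random.randrange(NUM_CLUSTERS)
        return max(range(NUM_CLUSTERS), key=lambda c: self.q[(state, c)])

    def access(self, page):
        self.clock += 1
        if self.clock % RECLUSTER_EVERY == 0 and self.cache:
            self._recluster()
        hit = page in self.cache
        if not hit and len(self.cache) >= CACHE_SIZE:
            state = self._state()
            cluster = self._choose_cluster(state)
            victims = [p for p in self.cache if self.cluster_of.get(p, 0) == cluster]
            victim = min(victims or self.cache, key=lambda p: self.cache[p][0])
            del self.cache[victim]
            # Miss penalty as reward signal; update the eviction decision.
            reward = -1.0
            best_next = max(self.q[(self._state(), c)] for c in range(NUM_CLUSTERS))
            self.q[(state, cluster)] += ALPHA * (
                reward + GAMMA * best_next - self.q[(state, cluster)])
        last, freq = self.cache.get(page, (self.clock, 0))
        self.cache[page] = (self.clock, freq + 1)
        return hit
```

Acting over clusters rather than individual pages is what keeps the decision space (and hence inference overhead) small, which is the design point the abstract emphasizes.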
