Applied Sciences, Vol. 15, Pages 10031: Multi-Source Energy Storage Day-Ahead and Intra-Day Scheduling Based on Deep Reinforcement Learning with Attention Mechanism

Applied Sciences, DOI: 10.3390/app151810031

Authors: Enren Liu, Song Gao, Xiaodi Chen, Jun Li, Yuntao Sun, Meng Zhang

With the rapid integration of high-penetration renewable energy, the inherent uncertainty of renewable generation complicates day-ahead and intra-day power system scheduling, leading to challenges such as wind curtailment and high operational costs. Existing methods either rely on inflexible physical models or apply deep reinforcement learning (DRL) without prioritizing critical variables or coordinating multi-source energy storage with demand response (DR). This study develops a multi-time-scale coordinated scheduling framework that balances cost minimization and renewable energy utilization while adapting to real-time uncertainties. The framework integrates a day-ahead optimization model with an intra-day rolling model driven by an attention-enhanced DRL Actor–Critic network, in which the attention mechanism dynamically focuses on critical variables to correct real-time deviations. Validated on an East China regional grid, the framework significantly enhances renewable energy absorption and system flexibility, providing a robust technical solution for the economical and stable operation of high-renewable power systems.
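To make the described architecture concrete, the following is a minimal, illustrative PyTorch sketch of an attention-enhanced Actor–Critic network of the kind outlined in the abstract. It is not the authors' implementation; the state layout (per-variable feature tokens such as wind forecast, load, storage state of charge, and price signals), layer sizes, and action dimension are all assumptions made for illustration.

```python
# Hypothetical sketch (not the paper's code): self-attention over scheduling
# variables feeding shared actor and critic heads, assuming a PyTorch setup.
import torch
import torch.nn as nn

class AttentionActorCritic(nn.Module):
    def __init__(self, n_tokens: int, feat_dim: int, embed_dim: int = 64,
                 n_heads: int = 4, n_actions: int = 3):
        super().__init__()
        self.embed = nn.Linear(feat_dim, embed_dim)            # per-variable-group embedding
        self.attn = nn.MultiheadAttention(embed_dim, n_heads,  # focuses on critical variables
                                          batch_first=True)
        self.actor = nn.Sequential(                            # intra-day dispatch corrections
            nn.Linear(embed_dim, 64), nn.ReLU(),
            nn.Linear(64, n_actions), nn.Tanh())
        self.critic = nn.Sequential(                           # state-value estimate
            nn.Linear(embed_dim, 64), nn.ReLU(),
            nn.Linear(64, 1))

    def forward(self, state_tokens: torch.Tensor):
        # state_tokens: (batch, n_tokens, feat_dim), one token per variable group
        x = self.embed(state_tokens)
        ctx, attn_weights = self.attn(x, x, x)                 # self-attention over variables
        pooled = ctx.mean(dim=1)                               # aggregate attended context
        return self.actor(pooled), self.critic(pooled), attn_weights

# Example forward pass with dummy intra-day state (illustrative shapes only)
net = AttentionActorCritic(n_tokens=6, feat_dim=8)
state = torch.randn(1, 6, 8)
action, value, weights = net(state)
```

The returned attention weights can be inspected to see which state variables the policy emphasizes at each rolling-horizon step, which is the role the abstract attributes to the attention mechanism.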



Source: www.mdpi.com