
Journal of Jilin University (Information Science Edition) ›› 2023, Vol. 41 ›› Issue (2): 251-258.


Seq2seq Short-Term Load Forecasting Based on Double Attention Mechanism

JIANG Jianguo, CHEN Peng, GUO Xiaoli, TONG Linge, WAN Chengde   

  1. (College of Electrical and Information Engineering, Northeast Petroleum University, Daqing 163318, China) 
  • Received: 2021-09-28  Online: 2023-04-13  Published: 2023-04-16

Abstract: To address the low accuracy of classical deep learning methods in multi-step load forecasting, a short-term load forecasting model based on a double-attention sequence-to-sequence architecture is proposed. The self-attention mechanism effectively extracts the hidden factors that influence the load data, enabling the model to better discover regularities in the data and adaptively learn the correlations among load features, while the temporal attention mechanism captures time-dependent sequential characteristics. Experiments on two real load datasets show that, for (t+12) prediction, the model achieves a MAPE (Mean Absolute Percentage Error) of 2.09%, which is 56.69% lower than that of the LSTM (Long Short-Term Memory) model, verifying the validity and feasibility of the model. Its prediction accuracy also exceeds that of linear regression, the LSTM model, and the Seq2Seq (Sequence to Sequence) model.
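The two core components named in the abstract, scaled dot-product self-attention over the input sequence and the MAPE evaluation metric, can be sketched in plain NumPy. This is a minimal illustrative sketch, not the authors' implementation: the projection matrices `Wq`, `Wk`, `Wv`, the sequence length, and the feature dimension are all hypothetical placeholders, and the full model would embed this inside a Seq2Seq encoder-decoder with a separate temporal attention over encoder hidden states.

```python
import numpy as np

rng = np.random.default_rng(0)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a load sequence.

    X: (T, d) array -- T time steps of d input features (load, weather, ...).
    Returns the attended sequence (T, d) and the (T, T) attention map,
    whose row t weights how much each time step contributes to step t.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])        # pairwise relevance, scaled
    scores = scores - scores.max(axis=1, keepdims=True)  # numerical stability
    A = np.exp(scores)
    A = A / A.sum(axis=1, keepdims=True)          # softmax over time steps
    return A @ V, A

def mape(y_true, y_pred):
    """Mean Absolute Percentage Error, in percent (the paper's metric)."""
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

# Illustrative shapes only: e.g. 24 hourly steps, 4 features per step.
T, d = 24, 4
X = rng.standard_normal((T, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))
ctx, attn = self_attention(X, Wq, Wk, Wv)
```

Each row of `attn` sums to 1, so `ctx` is a convex recombination of the (projected) time steps; this is how the model can adaptively emphasize the past steps most correlated with the step being predicted.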

Key words:  load forecasting, sequence to sequence (Seq2Seq), self-attention mechanism, temporal attention mechanism, multi-step prediction

CLC Number: 

  • TP391