Journal of Jilin University (Information Science Edition) ›› 2023, Vol. 41 ›› Issue (2): 251-258.
JIANG Jianguo, CHEN Peng, GUO Xiaoli, TONG Linge, WAN Chengde
Abstract: To address the low accuracy of classical deep learning methods in multi-step load forecasting, a short-term load forecasting model based on a double-attention sequence-to-sequence architecture is proposed. A self-attention mechanism extracts the hidden factors that influence the load data, allowing the model to better discover patterns in the load series and adaptively learn the correlations among load features, while a temporal-attention mechanism captures the time-dependent characteristics of the sequence. Experiments on two real load datasets show that, for (t+12) prediction, the model achieves a MAPE (Mean Absolute Percentage Error) of 2.09%, which is 56.69% lower than that of the LSTM (Long Short-Term Memory) model, verifying the model's validity and feasibility. Its prediction accuracy also exceeds that of linear regression, the LSTM model, and the plain Seq2Seq (Sequence to Sequence) model.
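The two attention mechanisms described in the abstract can be illustrated with a minimal numpy sketch. This is not the authors' implementation; the weight matrices are random stand-ins for trained parameters, and the encoder/decoder are reduced to single linear-plus-tanh layers, purely to show how self-attention re-weights the input load features, how temporal attention forms a context vector over encoder states at each of the 12 forecast steps, and how MAPE is computed.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Scaled dot-product self-attention over the input window.
    # X: (T, d) load features -> (T, d) features re-weighted by relatedness.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = softmax(Q @ K.T / np.sqrt(K.shape[1]), axis=-1)
    return scores @ V

def temporal_attention(h_dec, H_enc, Wa):
    # Decoder state queries all encoder hidden states; the attention
    # weights pick out the time steps most relevant to this forecast step.
    scores = softmax(H_enc @ Wa @ h_dec)   # (T,)
    return scores @ H_enc                  # context vector, (d_hid,)

def mape(y_true, y_pred):
    # Mean Absolute Percentage Error, in percent.
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

# Toy setup: 24 past steps, 4 features, 12-step-ahead (t+12) forecast.
T, d, d_hid, horizon = 24, 4, 8, 12
X = rng.normal(size=(T, d))

# Random matrices stand in for trained weights (illustration only).
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
W_enc = rng.normal(size=(d, d_hid))
Wa = rng.normal(size=(d_hid, d_hid))
W_dec = rng.normal(size=(d_hid, d_hid))
w_out = rng.normal(size=d_hid)

A = self_attention(X, Wq, Wk, Wv)     # feature-level attention
H_enc = np.tanh(A @ W_enc)            # simplified encoder states, (T, d_hid)

h_dec = H_enc[-1]
forecast = []
for _ in range(horizon):              # multi-step decoding with temporal attention
    c = temporal_attention(h_dec, H_enc, Wa)
    forecast.append(float(c @ w_out))
    h_dec = np.tanh(W_dec @ c)
```

Untrained, the outputs are meaningless numbers; the point is the data flow: self-attention acts across features within the input window before encoding, while temporal attention acts across time steps during each decoding step.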
Key words: load forecasting; sequence to sequence (Seq2Seq); self-attention mechanism; temporal attention mechanism; multi-step prediction
JIANG Jianguo, CHEN Peng, GUO Xiaoli, TONG Linge, WAN Chengde. Seq2seq Short-Term Load Forecasting Based on Double Attention Mechanism[J]. Journal of Jilin University (Information Science Edition), 2023, 41(2): 251-258.
URL: http://xuebao.jlu.edu.cn/xxb/EN/Y2023/V41/I2/251