
Journal of Jilin University (Information Science Edition) ›› 2023, Vol. 41 ›› Issue (2): 251-258.



Seq2Seq Short-Term Load Forecasting Based on a Double Attention Mechanism

JIANG Jianguo, CHEN Peng, GUO Xiaoli, TONG Linge, WAN Chengde   

  1. (College of Electrical and Information Engineering, Northeast Petroleum University, Daqing 163318, China) 
  • Received: 2021-09-28; Online: 2023-04-13; Published: 2023-04-16
  • About the first author: JIANG Jianguo (1966-), male, from Qitai, Xinjiang; professor and master's supervisor at Northeast Petroleum University; his research covers smart grids, electrical automation, and deep-learning-based load forecasting. (Tel) 86-13734583588; (E-mail) jjgnepu@163.com


Abstract: To address the low accuracy of classical deep learning methods in multi-step load forecasting, a short-term load forecasting model based on a double-attention sequence-to-sequence architecture is proposed. A self-attention mechanism extracts the hidden correlated factors that influence the load data, so the model can better discover regularities in the load series and adaptively learn the correlations among load features, while a temporal attention mechanism captures time-dependent sequential characteristics. Experiments on two real load datasets show that, for (t+12) prediction, the model achieves a MAPE (Mean Absolute Percentage Error) of 2.09%, a 56.69% reduction relative to the LSTM (Long Short-Term Memory) model. These results verify the validity and feasibility of the model, whose predictions are more accurate than those of linear regression, the LSTM model, and the Seq2Seq (Sequence to Sequence) model.
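The abstract describes the architecture only at a high level, so the sketch below shows one way such a double-attention Seq2Seq forecaster could be wired up: self-attention applied to the encoder inputs, a temporal (decoder-to-encoder) attention step inside the decoding loop, and MAPE as the evaluation metric. This is a minimal illustrative sketch, not the authors' implementation; the PyTorch/LSTM backbone, the class name DoubleAttentionSeq2Seq, the layer sizes, and the 12-step decoding loop are assumptions made only for clarity.

import torch
import torch.nn as nn


class DoubleAttentionSeq2Seq(nn.Module):
    """Sketch of a Seq2Seq forecaster with input self-attention and temporal attention."""

    def __init__(self, n_features: int, hidden: int = 64, horizon: int = 12):
        super().__init__()
        self.horizon = horizon
        # Self-attention over the input sequence: captures correlations among
        # the factors that influence the load data.
        self.self_attn = nn.MultiheadAttention(n_features, num_heads=1, batch_first=True)
        self.encoder = nn.LSTM(n_features, hidden, batch_first=True)
        self.decoder = nn.LSTMCell(1, hidden)
        # Temporal attention: scores each encoder step against the decoder state.
        self.attn_score = nn.Linear(2 * hidden, 1)
        self.out = nn.Linear(2 * hidden, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_features); assume the last feature is the load itself.
        x_attn, _ = self.self_attn(x, x, x)              # self-attention on inputs
        enc_out, (h, c) = self.encoder(x_attn)           # (batch, seq_len, hidden)
        h, c = h.squeeze(0), c.squeeze(0)
        y_prev = x[:, -1:, -1]                           # last observed load value
        preds = []
        for _ in range(self.horizon):                    # multi-step decoding
            h, c = self.decoder(y_prev, (h, c))
            # Temporal attention weights over the encoder outputs.
            score = self.attn_score(
                torch.cat([enc_out, h.unsqueeze(1).expand_as(enc_out)], dim=-1)
            )                                            # (batch, seq_len, 1)
            weights = torch.softmax(score, dim=1)
            context = (weights * enc_out).sum(dim=1)     # (batch, hidden)
            y_prev = self.out(torch.cat([h, context], dim=-1))   # next-step load
            preds.append(y_prev)
        return torch.cat(preds, dim=1)                   # (batch, horizon)


def mape(y_true: torch.Tensor, y_pred: torch.Tensor) -> torch.Tensor:
    """Mean Absolute Percentage Error, the evaluation metric quoted above."""
    return 100.0 * torch.mean(torch.abs((y_true - y_pred) / y_true))

For an input batch of shape (batch, 24, n_features), this sketch returns a (batch, 12) forecast, matching the (t+12) multi-step horizon quoted in the abstract; the per-step temporal attention is what lets each decoded step focus on different parts of the input history.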

Key words:  ')">

 , load forecasting, sequence to sequence( Seq2Seq), self-attention mechanism, temporal attention mechanism, multi step prediction

CLC number: 

  • TP391