Journal of Jilin University (Science Edition), 2024, Vol. 62, Issue 4: 933-942.


Oil Well Production Prediction Model Based on Improved Graph Attention Network

ZHANG Qiang, PENG Gu, XUE Chenbin

  1. School of Computer and Information Technology, Northeast Petroleum University, Daqing 163318, Heilongjiang Province, China
  • Received: 2023-06-29 Online: 2024-07-26 Published: 2024-07-26
  • Corresponding author: PENG Gu, E-mail: pg24095@163.com




Abstract: To address the weaknesses of graph attention networks in handling noisy and temporal data, as well as the gradient explosion and over-smoothing that arise after stacking multiple layers, we proposed an improved graph attention network model. Firstly, a Squeeze-and-Excitation module was used to attend to the feature information of the sample input data to different degrees, enhancing the model's ability to handle noise. Secondly, a multi-head attention mechanism computed a weighted sum of each sequence in the sequence data relative to the other sequences, extracting the temporal characteristics of the data. Thirdly, the node features extracted by the graph attention network were concatenated with the degree centrality of the nodes to obtain local node features, and global average pooling was used to extract global node features. Finally, the two were fused to obtain the final feature representation of the nodes, enhancing the representational ability of the model. To verify the effectiveness of the improved graph attention network, the model was compared with LSTM, GRU, and GGNN models. The experimental results show that the prediction performance of the proposed model is effectively improved, with higher prediction accuracy.
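The three building blocks named in the abstract (SE-style channel reweighting, multi-head attention over time steps, and fusion of degree centrality with globally average-pooled node features) can be sketched roughly as follows. This is a minimal NumPy illustration, not the authors' implementation: the weight shapes, the identity projections inside the attention heads, and the exact fusion order are all assumptions, since the abstract does not specify them.

```python
import numpy as np

def softmax(z, axis=-1):
    e = np.exp(z - z.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def se_block(x, w1, w2):
    """Squeeze-and-Excitation: reweight feature channels of the inputs.
    x: (n_samples, n_features); w1, w2: bottleneck weights (shapes assumed)."""
    z = x.mean(axis=0)                                         # squeeze: average over samples
    s = 1.0 / (1.0 + np.exp(-(np.maximum(z @ w1, 0.0) @ w2)))  # excitation: ReLU then sigmoid
    return x * s                                               # channel-wise recalibration

def multi_head_attention(x, n_heads):
    """Each time step is a weighted sum over all other steps, per head.
    Projections are omitted (q = k = v) purely to keep the sketch short."""
    t, d = x.shape
    dh = d // n_heads
    outs = []
    for h in range(n_heads):
        q = k = v = x[:, h * dh:(h + 1) * dh]
        a = softmax(q @ k.T / np.sqrt(dh))                     # (t, t) attention weights
        outs.append(a @ v)
    return np.concatenate(outs, axis=1)                        # concat heads back to d features

def degree_centrality(adj):
    """Fraction of possible neighbours each node is connected to."""
    n = adj.shape[0]
    return adj.sum(axis=1, keepdims=True) / (n - 1)

def fuse(h, adj):
    """Local features (node embedding + degree centrality) concatenated
    with a global-average-pooled summary shared by all nodes."""
    local = np.concatenate([h, degree_centrality(adj)], axis=1)
    global_feat = np.broadcast_to(h.mean(axis=0, keepdims=True), h.shape)
    return np.concatenate([local, global_feat], axis=1)
```

In this sketch `h` stands for the node embeddings produced by the graph attention layers; a full model would interleave these blocks with trained GAT layers and a regression head for the production forecast.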

Key words: graph attention network, multi-head attention, node degree centrality, global average pooling

CLC number:

  • TP18