
Journal of Jilin University (Information Science Edition) ›› 2023, Vol. 41 ›› Issue (6): 1128-1134.


  • About the author: WANG Haoyu (1984- ), male, from Songyuan, Jilin; engineer at Jilin University; research interests include computer networks and wireless communication. (Tel) 86-13644408500, (E-mail) wanghy963@mail.jlu.edu.cn

Residual Connected Deep GRU for Sequential Recommendation

WANG Haoyu, LI Yunhua    

  1. Big Data and Network Management Center, Jilin University, Changchun 130012, China
  • Received: 2023-04-20 Online: 2023-11-30 Published: 2023-12-01


Abstract: To address the vanishing and exploding gradients that destabilize training when RNN (Recurrent Neural Network)-based sequential recommenders process long sequences, a GRU (Gated Recurrent Unit)-based sequential recommender, DeepGRU, is proposed; it augments the standard GRU with residual connections, layer normalization, and a feed-forward network. The model is evaluated on three public datasets, and the experimental results show that DeepGRU outperforms several state-of-the-art sequential recommenders on all compared metrics, with an average improvement in recommendation accuracy of 8.68%. An ablation study confirms the effectiveness of the introduced residual connections, layer normalization, and feed-forward layer within the DeepGRU framework, and DeepGRU is empirically shown to alleviate training instability when dealing with long sequences.
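The abstract describes the building block only at a high level, so the following is a minimal numpy sketch of one plausible arrangement of the named components: layer normalization, a GRU recurrence, a residual connection around the GRU, and a position-wise feed-forward sub-layer with its own residual connection. All shapes, parameter names, and the exact sub-layer ordering are assumptions for illustration, not the paper's specification.

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    """Normalize the last dimension to zero mean and unit variance."""
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def gru_cell(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    """One standard GRU step: update gate z, reset gate r, candidate state."""
    sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))
    z = sigmoid(x @ Wz + h @ Uz)               # update gate
    r = sigmoid(x @ Wr + h @ Ur)               # reset gate
    h_tilde = np.tanh(x @ Wh + (r * h) @ Uh)   # candidate hidden state
    return (1 - z) * h + z * h_tilde

def residual_gru_block(seq, params, d):
    """Run a GRU over `seq` of shape (T, d) with a residual connection,
    then apply a position-wise feed-forward sub-layer, also residual."""
    W1, b1, W2, b2 = params["ffn"]
    h = np.zeros(d)
    out = np.zeros_like(seq)
    for t, x in enumerate(seq):
        h = gru_cell(layer_norm(x), h, *params["gru"])
        out[t] = x + h                          # residual around the GRU
    ffn = np.maximum(0, layer_norm(out) @ W1 + b1) @ W2 + b2
    return out + ffn                            # residual around the FFN

rng = np.random.default_rng(0)
d = 8  # hidden size (hypothetical)
params = {
    "gru": [rng.normal(scale=0.1, size=(d, d)) for _ in range(6)],
    "ffn": (rng.normal(scale=0.1, size=(d, 4 * d)), np.zeros(4 * d),
            rng.normal(scale=0.1, size=(4 * d, d)), np.zeros(d)),
}
y = residual_gru_block(rng.normal(size=(5, d)), params, d)
print(y.shape)  # (5, 8)
```

The residual paths give gradients a direct route past the recurrence, which is the mechanism the abstract credits for stabilizing training on long sequences; a real implementation would stack several such blocks and learn the parameters.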

Key words: sequential recommendation, recurrent neural network, gated recurrent unit, residual network

CLC number: 

  • TP3