
Journal of Jilin University (Information Science Edition) ›› 2023, Vol. 41 ›› Issue (6): 1128-1134.


Residual Connected Deep GRU for Sequential Recommendation

WANG Haoyu, LI Yunhua    

  1. Big Data and Network Management Center, Jilin University, Changchun 130012, China
  • Received: 2023-04-20   Online: 2023-11-30   Published: 2023-12-01

Abstract: To avoid the vanishing and exploding gradient problems in RNN (Recurrent Neural Network)-based sequential recommenders, a gated recurrent unit based sequential recommender, DeepGRU, is proposed, which introduces residual connections, layer normalization and a feedforward neural network. The proposed algorithm is verified on three public datasets, and the experimental results show that DeepGRU outperforms several state-of-the-art sequential recommenders on all compared metrics, with an average improvement of 8.68%. An ablation study verifies the effectiveness of the introduced residual connections, layer normalization and feedforward layer. It is further demonstrated empirically that DeepGRU effectively alleviates unstable training when dealing with long sequences.
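The abstract names the components of DeepGRU but not how they are wired together. The following PyTorch sketch shows one plausible arrangement, with each GRU layer wrapped by a residual connection, layer normalization and a position-wise feedforward sub-layer; the class names, layer sizes and stacking order are assumptions for illustration, not the authors' implementation.

import torch
import torch.nn as nn

class ResidualGRUBlock(nn.Module):
    """One GRU layer wrapped with residual connections, layer normalization
    and a position-wise feedforward network (illustrative sketch)."""

    def __init__(self, hidden_dim: int, ff_dim: int, dropout: float = 0.1):
        super().__init__()
        self.gru = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        self.norm1 = nn.LayerNorm(hidden_dim)
        self.norm2 = nn.LayerNorm(hidden_dim)
        self.ffn = nn.Sequential(
            nn.Linear(hidden_dim, ff_dim),
            nn.ReLU(),
            nn.Dropout(dropout),
            nn.Linear(ff_dim, hidden_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # GRU sub-layer with residual (skip) connection and layer normalization
        gru_out, _ = self.gru(x)
        x = self.norm1(x + gru_out)
        # Feedforward sub-layer, again with residual connection and normalization
        x = self.norm2(x + self.ffn(x))
        return x

class DeepGRURecommender(nn.Module):
    """Stack of residual GRU blocks over an item-embedding sequence; the last
    hidden state scores all candidate items for next-item prediction."""

    def __init__(self, num_items: int, hidden_dim: int = 64,
                 ff_dim: int = 256, num_blocks: int = 4):
        super().__init__()
        self.item_emb = nn.Embedding(num_items, hidden_dim, padding_idx=0)
        self.blocks = nn.ModuleList(
            ResidualGRUBlock(hidden_dim, ff_dim) for _ in range(num_blocks)
        )

    def forward(self, item_seq: torch.Tensor) -> torch.Tensor:
        # item_seq: (batch, seq_len) tensor of item ids
        x = self.item_emb(item_seq)
        for block in self.blocks:
            x = block(x)
        # Score every candidate item against the final step's representation
        logits = x[:, -1, :] @ self.item_emb.weight.T
        return logits

In this sketch the residual paths and layer normalization are what keep gradients well-scaled as more GRU layers are stacked, which is the mechanism the abstract credits for stable training on long sequences; the specific depth and hidden sizes shown are placeholders.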

Key words:

sequential recommendation, recurrent neural network, gated recurrent unit, residual network

CLC Number: 

  • TP3