Journal of Jilin University (Information Science Edition) ›› 2023, Vol. 41 ›› Issue (6): 1128-1134.
WANG Haoyu, LI Yunhua
Abstract: To avoid the gradient vanishing or exploding problem in RNN (Recurrent Neural Network)-based sequential recommenders, a gated recurrent unit based sequential recommender, DeepGRU, is proposed, which introduces residual connections, layer normalization, and a feed-forward neural network. The proposed algorithm is evaluated on three public datasets, and the experimental results show that DeepGRU outperforms several state-of-the-art sequential recommenders on all compared metrics (by 8.68% on average). An ablation study verifies the effectiveness of the introduced residual connections, layer normalization, and feed-forward layer. It is empirically demonstrated that DeepGRU effectively alleviates the unstable training issue when dealing with long sequences.
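The abstract describes an architecture that wraps a GRU layer with residual connections, layer normalization, and a feed-forward network. Below is a minimal sketch of such a block in PyTorch; the layer ordering, dimensions, and the class name ResidualGRUBlock are assumptions for illustration and are not taken from the paper.

```python
# Hypothetical sketch of a residual-connected GRU block with layer
# normalization and a feed-forward sublayer, as described in the abstract.
# Exact ordering and hyperparameters are assumptions, not the paper's design.
import torch
import torch.nn as nn


class ResidualGRUBlock(nn.Module):
    def __init__(self, hidden_size: int, ff_size: int = 256, dropout: float = 0.1):
        super().__init__()
        self.gru = nn.GRU(hidden_size, hidden_size, batch_first=True)
        self.norm1 = nn.LayerNorm(hidden_size)
        self.norm2 = nn.LayerNorm(hidden_size)
        self.feed_forward = nn.Sequential(
            nn.Linear(hidden_size, ff_size),
            nn.ReLU(),
            nn.Linear(ff_size, hidden_size),
        )
        self.dropout = nn.Dropout(dropout)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, hidden_size) sequence of item embeddings
        gru_out, _ = self.gru(x)
        x = self.norm1(x + self.dropout(gru_out))   # residual + layer norm around GRU
        ff_out = self.feed_forward(x)
        x = self.norm2(x + self.dropout(ff_out))    # residual + layer norm around FFN
        return x
```

The residual paths and normalization are the ingredients the abstract credits with stabilizing training on long sequences; the feed-forward sublayer adds per-position nonlinearity on top of the recurrent state.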
Key words: sequential recommendation, recurrent neural network, gated recurrent unit, residual network
WANG Haoyu, LI Yunhua. Residual Connected Deep GRU for Sequential Recommendation[J]. Journal of Jilin University (Information Science Edition), 2023, 41(6): 1128-1134.
URL: http://xuebao.jlu.edu.cn/xxb/EN/Y2023/V41/I6/1128