Journal of Jilin University Science Edition


LogReLU: A New Rectified Activation Unit Using Log Function

WANG Duomin, LIU Shufen   

  1. College of Computer Science and Technology, Jilin University, Changchun 130012, China
  • Received: 2016-07-26  Online: 2017-05-26  Published: 2017-05-31
  • Contact: LIU Shufen, E-mail: liusf@jlu.edu.cn

Abstract: We propose a new revised rectified activation function. The new function has an adjustable parameter, and a logarithmic function is used to rectify the gradient in the positive region, which addresses the problem of low prediction accuracy. The improved activation function uses two different parameters to control the gradient in the positive and negative regions, respectively. Simulation experiments on two different data sets show that both new methods are more effective than the original rectified linear unit (ReLU): with the two parameter improvements, the validation error rate is reduced by 0.14% and 5.33%, respectively.
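As described, the positive branch replaces ReLU's identity map with a logarithmic curve whose gradient is controlled by one parameter, while a second parameter controls the negative-region slope. Below is a minimal NumPy sketch of this idea; the function name logrelu, the a*ln(1+x) form of the positive branch, and the default values a=1.0 and b=0.01 are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def logrelu(x, a=1.0, b=0.01):
    """Sketch of a log-rectified unit (assumed form, not the paper's exact one):
    a * ln(1 + x) in the positive region, a leaky slope b * x elsewhere."""
    return np.where(x > 0, a * np.log1p(x), b * x)

def logrelu_grad(x, a=1.0, b=0.01):
    """Gradient of the sketch above: a / (1 + x) for x > 0, so the gradient
    is damped as activations grow; constant b in the negative region."""
    return np.where(x > 0, a / (1.0 + x), b)

# Example: unlike ReLU, whose positive-region gradient is always 1,
# the logarithmic branch's gradient decays with the input.
xs = np.array([-2.0, -0.5, 0.5, 2.0, 10.0])
print(logrelu(xs))       # [-0.02, -0.005, 0.405, 1.099, 2.398]
print(logrelu_grad(xs))  # [ 0.01,  0.01,  0.667, 0.333, 0.091]
```

In this reading, a rectifies the gradient in the positive region as the abstract describes, while b plays the same role as the slope parameter in a leaky or parametric ReLU.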

Key words: ReLU, convolutional neural network, activation function, artificial intelligence

CLC Number: TP301.6