Journal of Jilin University (Science Edition)

• Computer Science •


LogReLU: A New Rectified Activation Unit Using the Log Function

WANG Duomin, LIU Shufen   

  1. College of Computer Science and Technology, Jilin University, Changchun 130012, China
  • Received:2016-07-26 Online:2017-05-26 Published:2017-05-31
  • Contact: LIU Shufen E-mail:liusf@jlu.edu.cn


Abstract: We propose an improved rectified activation function. The new activation function has a tunable parameter and uses a logarithmic function to rectify the gradient in the positive region, addressing the problem of low prediction accuracy. A further improved variant uses two different parameters to control the gradients in the positive and negative regions, respectively. Simulation experiments on two different data sets show that both new methods outperform the original rectified linear unit; the two-parameter variant reduces the validation error rate by 0.14% and 5.33% on the two data sets, respectively.

Key words: ReLU, convolutional neural network, activation function, artificial intelligence

CLC number: 

  • TP301.6