Journal of Jilin University (Science Edition) ›› 2019, Vol. 57 ›› Issue (04): 857-859.

• Mathematics •

New Adaptive Activation Function for Deep Learning Neural Networks

LIU Yuqing, WANG Tianhao, XU Xu   

  1. College of Mathematics, Jilin University, Changchun 130012, China
  • Received:2019-01-17 Online:2019-07-26 Published:2019-07-11
  • Contact: XU Xu E-mail:xuxu@jlu.edu.cn

Abstract: A smooth activation function with a trainable parameter was constructed for deep learning neural networks. An online correction formula for the parameter was established based on the error back-propagation algorithm, which avoids the problems of vanishing gradients, non-smoothness, and overfitting. Comparative experiments with several popular activation functions show that the new activation function performs well on multiple data sets.

Key words: activation function, convolutional neural network, machine learning

CLC number: 

  • O175.1