Journal of Jilin University (Information Science Edition) ›› 2019, Vol. 37 ›› Issue (1): 107-112.

Improving Neural Network Method by Using Information Interaction Optimal Weight

XU Yang 1, ZHANG Zhongwei 1, LIU Ming 2

  1. School of Electrical Information and Engineering, Northeast Petroleum University, Daqing 163318, China;
    2. Pipeline Qinhuangdao Oil and Gas Branch, China National Petroleum Corporation, Qinhuangdao 066000, China
  • Online: 2019-01-24    Published: 2019-05-09

Abstract: Multilayer perceptrons in convolutional neural networks are trained with the gradient descent algorithm, which often suffers from slow convergence and local minima. To address these problems, we propose a method that uses information interaction to compute optimal initialization weights for the network structure, which effectively reduces training time and avoids the local minimum problem. We first derive a formula for the optimal initialization weight of the ReLU activation function. We then apply this method to a 2-channel network structure, substituting the data directly to obtain the optimal initial weights. Repeated training and testing on three data sets yields improved results: average matching accuracy on grayscale images increases by about 1.5%, and the average FPR95 drops from 5.23 to 4.65. The proposed weight initialization prevents neurons from entering the hard-saturation region and offers higher precision, stable performance, and fast convergence.
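The abstract does not reproduce the paper's information-interaction formula, so the sketch below only illustrates the general idea of a ReLU-aware weight initialization that keeps pre-activations out of the hard-saturation (zero-output) region; it uses the standard He-style variance scaling (variance = 2 / fan_in) as a hypothetical stand-in, not the authors' derived weights, and the layer sizes are assumptions.

```python
import numpy as np

def relu_aware_init(fan_in, fan_out, rng=None):
    """Draw an initial weight matrix scaled for a ReLU layer.

    Hypothetical stand-in for the paper's information-interaction
    formula: He-style scaling (std = sqrt(2 / fan_in)) keeps the
    pre-activation variance roughly constant across layers, so
    fewer neurons start in the ReLU's hard-saturated (always-zero)
    region.
    """
    rng = np.random.default_rng() if rng is None else rng
    std = np.sqrt(2.0 / fan_in)  # variance-preserving scale for ReLU
    return rng.normal(0.0, std, size=(fan_in, fan_out))

# Example (assumed sizes): initialize one branch of a 2-channel
# patch-matching network mapping 64x64 grayscale patches to 256 units.
W1 = relu_aware_init(64 * 64, 256)
print(W1.std())  # roughly sqrt(2 / 4096) ~= 0.022
```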

Key words: neural network, weight, information interaction

CLC Number: 

  • TP183