Journal of Jilin University Science Edition ›› 2025, Vol. 63 ›› Issue (3): 835-844.


Improved SHO Optimization Neural Network Model

LI Jian1,2, WANG Hairui1, WANG Zenghui3, FU Haitao1, YU Weilin1   

1. College of Information Technology, Jilin Agricultural University, Changchun 130118, China; 2. Jilin Bioinformatics Research Center, Changchun 130118, China; 3. School of Science and Technology, Changchun University of Humanities, Changchun 130117, China
Received: 2023-11-03  Online: 2025-05-26  Published: 2025-05-26

Abstract: To address the low recognition accuracy and poor sensitivity of the GoogLeNet model, we proposed a hyperparameter-optimized GoogLeNet model based on an improved sea-horse optimization (SASHO) algorithm. Firstly, the sea-horse optimization algorithm was improved by using a Sobol sequence and an adaptive weight strategy. Secondly, four basic neural networks were compared, and GoogLeNet was selected as the base recognition model best suited to this dataset. Finally, the improved SASHO algorithm was used to optimize the parameters of the GoogLeNet model, yielding a new model, SASHO-GoogLeNet. To verify the effectiveness of the SASHO-GoogLeNet model, it was compared on seven indicators with models optimized by four other swarm intelligence algorithms. The results show that the accuracy of the SASHO-GoogLeNet model is 95.36%, the sensitivity is 95.35%, the specificity is 95.39%, the precision is 96.47%, the recall is 95.35%, the F-measure is 95.90%, and the G-mean is 95.37%. The experimental results show that the SASHO-GoogLeNet model has the best overall performance.
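The abstract names two improvements to the sea-horse optimizer: a Sobol (low-discrepancy) sequence and an adaptive weight. A minimal sketch of both ideas, assuming the Sobol sequence is used to initialize the population and the weight decreases linearly over iterations; the paper's exact formulas are not given in the abstract, so `van_der_corput`, `sobol_init`, `adaptive_weight`, and the weight bounds 0.9/0.4 are illustrative assumptions:

```python
def van_der_corput(n, base=2):
    """n-th term of the base-2 van der Corput sequence; in one dimension
    this coincides with the Sobol sequence. Low-discrepancy points like
    these spread an initial population over the search range more evenly
    than independent uniform random draws."""
    q, bk = 0.0, 1.0 / base
    while n > 0:
        q += (n % base) * bk
        n //= base
        bk /= base
    return q

def sobol_init(pop_size, lower, upper):
    """Illustrative 1-D population initialization on [lower, upper]."""
    return [lower + (upper - lower) * van_der_corput(i + 1)
            for i in range(pop_size)]

def adaptive_weight(t, t_max, w_max=0.9, w_min=0.4):
    """Hypothetical linearly decreasing weight: large early in the run
    (exploration), small late in the run (exploitation)."""
    return w_max - (w_max - w_min) * t / t_max
```

The same two ingredients appear in many improved swarm algorithms: better-spread initial candidates reduce the chance of missing a basin entirely, and a shrinking weight shifts the balance from global search to local refinement.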
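The seven indicators reported above can all be derived from a binary confusion matrix. A self-contained sketch (the function and variable names are ours, not the paper's; sensitivity and recall are the same quantity, which is consistent with the identical 95.35% values reported):

```python
import math

def binary_metrics(tp, fp, tn, fn):
    """Compute the seven indicators from a binary confusion matrix:
    tp/fp/tn/fn = true/false positives and negatives."""
    sensitivity = tp / (tp + fn)                  # true-positive rate
    specificity = tn / (tn + fp)                  # true-negative rate
    precision = tp / (tp + fp)
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    recall = sensitivity                          # same quantity by definition
    f_measure = 2 * precision * recall / (precision + recall)
    g_mean = math.sqrt(sensitivity * specificity)
    return {
        "accuracy": accuracy, "sensitivity": sensitivity,
        "specificity": specificity, "precision": precision,
        "recall": recall, "f_measure": f_measure, "g_mean": g_mean,
    }

# example: 8 true positives, 1 false positive, 9 true negatives, 2 false negatives
m = binary_metrics(8, 1, 9, 2)
```

G-mean (the geometric mean of sensitivity and specificity) is often reported alongside accuracy because it penalizes models that trade one class off against the other, which plain accuracy can hide.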

Key words: artificial intelligence, deep learning, sea-horse optimization algorithm, parameter optimization
