Journal of Jilin University (Science Edition) ›› 2021, Vol. 59 ›› Issue (5): 1107-1116.


  • Corresponding author: XIA Shi, E-mail: 33164629@qq.com

Convergence of a Class of Modified Hager-Zhang Conjugate Gradient Method and Its Numerical Experiment

WANG Songhua, XIA Shi, LI Yong   

  1. School of Mathematics and Statistics, Baise University, Baise 533000, Guangxi Zhuang Autonomous Region, China
  • Received:2021-01-07 Online:2021-09-26 Published:2021-09-26


Abstract: In order to improve the computational efficiency of solving unconstrained optimization problems, we propose a new class of modified Hager-Zhang conjugate gradient methods, which possesses the sufficient descent property and the trust region property without relying on line search. Theoretical results show that, under standard assumptions, the new algorithm not only converges globally for general functions under the weak Wolfe-Powell line search, but also attains an R-linear convergence rate for uniformly convex functions. Numerical results show that the new algorithm outperforms the classical Hager-Zhang algorithm and two of its modified variants.
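For orientation, the classical Hager-Zhang scheme that the paper modifies updates the search direction as d_{k+1} = -g_{k+1} + β_k d_k with β_k = (y_k - 2 d_k ||y_k||² / (d_kᵀ y_k))ᵀ g_{k+1} / (d_kᵀ y_k), where y_k = g_{k+1} - g_k. The sketch below is a minimal illustration of this *classical* method paired with a simple backtracking Armijo line search; it is not the authors' modified algorithm (which, per the abstract, is designed to retain sufficient descent and the trust region property without depending on line search). All function names here are illustrative.

```python
import numpy as np

def hz_beta(g_new, d, y):
    """Classical Hager-Zhang conjugate parameter (Hager & Zhang, 2005):
    beta = (y - 2*d*||y||^2 / (d^T y))^T g_new / (d^T y)."""
    dy = d @ y
    return (y - 2.0 * d * (y @ y) / dy) @ g_new / dy

def hz_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Minimize f by the classical HZ conjugate gradient method.
    Uses a backtracking Armijo line search for illustration only;
    the paper's modified scheme avoids this dependence on line search."""
    x = x0.astype(float)
    g = grad(x)
    d = -g                      # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # backtracking Armijo line search along the descent direction d
        t, c = 1.0, 1e-4
        while f(x + t * d) > f(x) + c * t * (g @ d):
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g
        d = -g_new + hz_beta(g_new, d, y) * d
        if g_new @ d >= 0:      # safeguard: restart if descent is lost
            d = -g_new
        x, g = x_new, g_new
    return x
```

On a strictly convex quadratic such as f(x) = ½xᵀAx - bᵀx with A = diag(1, 10) and b = (1, 1)ᵀ, the iteration converges quickly to the minimizer A⁻¹b = (1, 0.1)ᵀ.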

Key words: unconstrained optimization, Hager-Zhang conjugate gradient method, sufficient descent property, trust region property, convergence

CLC number: 

  • O224.2