J4 ›› 2010, Vol. 48 ›› Issue (03): 396-400.

• Mathematics •

A SuperMemory Gradient Method with Wolfe Linear Search Rule and Its Convergence

TANG Jingyong1,2, DONG Li1   

  1. College of Mathematics and Information Science, Xinyang Normal University, Xinyang 464000, Henan Province, China;
    2. Department of Mathematics, Shanghai Jiaotong University, Shanghai 200240, China
  • Received: 2009-06-26 Online: 2010-05-26 Published: 2010-05-19
  • Contact: TANG Jingyong E-mail: tangjingyong@tom.com

Abstract:

A new supermemory gradient method for unconstrained optimization problems was presented. The globe convergence and linear convergence rate were proved under some mild conditions. Numerical experiments show that the method is efficient in practical coputation.

Key words: unconstrained optimization, supermemory gradient method, global convergence, linear convergence rate
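
As a minimal illustration of the ingredients named above (a memory-based descent direction plus a Wolfe line search), the sketch below combines them in Python using SciPy's Wolfe line search. It is not the authors' algorithm; the memory length m, the mixing weight eta, and the fallback step are illustrative choices only.

import numpy as np
from scipy.optimize import line_search

def supermemory_gradient_sketch(f, grad, x0, m=3, eta=0.2, tol=1e-6, max_iter=500):
    """Generic memory-gradient iteration with a Wolfe line search (illustrative only)."""
    x = np.asarray(x0, dtype=float)
    memory = []                                  # last m search directions
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        # Direction: steepest descent mixed with an average of the stored directions.
        if memory:
            d = -g + eta * np.mean(memory, axis=0)
        else:
            d = -g
        if g @ d >= 0:                           # safeguard: keep d a descent direction
            d = -g
        # Step size satisfying the (strong) Wolfe conditions via SciPy.
        alpha = line_search(f, grad, x, d)[0]
        if alpha is None:                        # line search failed: conservative fallback
            alpha = 1e-3
        x = x + alpha * d
        memory = (memory + [d])[-m:]             # keep only the last m directions
    return x

# Usage on a simple quadratic test function.
if __name__ == "__main__":
    f = lambda x: 0.5 * x @ x
    grad = lambda x: x
    print(supermemory_gradient_sketch(f, grad, np.array([3.0, -4.0])))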

CLC number: 

  • O221.2