A class of spectral conjugate gradient method with sufficient descent property

  • 1. College of Mathematics and Physics, Guangxi Minzu University, Nanning 530006, Guangxi, China;
    2. College of Mathematics and Information Science, Guangxi University, Nanning 530004, Guangxi, China

Received date: 2020-10-23

Online published: 2022-11-28

Funding

National Natural Science Foundation of China (No. 12171106), Natural Science Foundation of Guangxi (No. 2020GXNSFDA238017), Scientific Research Fund of Guangxi Minzu University (No. 2018KJQD02), Postgraduate Innovation Project of Guangxi Minzu University (No. gxun-chxzs2019034)

Abstract

The spectral conjugate gradient method is an important extension of the classical conjugate gradient method and one of the effective methods for solving large-scale unconstrained optimization problems; within it, the design of the spectral parameter is particularly important. In this paper, a new spectral parameter is constructed and, under a certain restriction on the conjugate parameter, a new framework of spectral conjugate gradient methods is established. Under standard assumptions, with step lengths generated by the strong Wolfe inexact line search, the new framework is shown to possess the sufficient descent property and to be globally convergent. Finally, existing conjugate parameters satisfying the restriction are selected for numerical tests within the new framework, and comparisons with other numerically competitive algorithms show that the algorithms built on the new framework are effective.

Cite this article

Liu Pengjie, Jiang Xianzhen, Song Dan. A class of spectral conjugate gradient method with sufficient descent property [J]. Operations Research Transactions, 2022, 26(4): 87-97. DOI: 10.15960/j.cnki.issn.1007-6093.2022.04.007
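The abstract describes methods of the general spectral conjugate gradient form d_0 = -g_0, d_{k+1} = -θ_{k+1} g_{k+1} + β_k d_k. The paper's new spectral parameter is not reproduced on this page, so the sketch below is only illustrative: it uses the classical Birgin–Martínez spectral parameter θ = sᵀs/sᵀy and the Fletcher–Reeves conjugate parameter as stand-ins, and a plain Armijo backtracking search in place of the strong Wolfe rule, with a steepest-descent safeguard whenever the direction fails to be a descent direction.

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def spectral_cg(f, grad, x0, tol=1e-8, max_iter=500):
    """Generic spectral conjugate gradient iteration (illustrative sketch):
        d_0 = -g_0,
        d_{k+1} = -theta_{k+1} * g_{k+1} + beta_k * d_k,
    where theta is the spectral parameter and beta the conjugate parameter."""
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]
    for _ in range(max_iter):
        if math.sqrt(dot(g, g)) < tol:
            break
        gtd = dot(g, d)
        if gtd >= 0:  # safeguard: fall back to steepest descent
            d = [-gi for gi in g]
            gtd = dot(g, d)
        # Backtracking Armijo line search (simplified stand-in for strong Wolfe)
        alpha, fx = 1.0, f(x)
        while f([xi + alpha * di for xi, di in zip(x, d)]) > fx + 1e-4 * alpha * gtd:
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x_new)
        s = [a - b for a, b in zip(x_new, x)]  # step s_k = x_{k+1} - x_k
        y = [a - b for a, b in zip(g_new, g)]  # gradient change y_k
        sty = dot(s, y)
        # Spectral parameter: Birgin-Martinez choice, used here as a placeholder
        theta = dot(s, s) / sty if sty > 1e-12 else 1.0
        # Conjugate parameter: Fletcher-Reeves, used here as a placeholder
        beta = dot(g_new, g_new) / dot(g, g)
        d = [-theta * gn + beta * di for gn, di in zip(g_new, d)]
        x, g = x_new, g_new
    return x

# Illustrative use on a simple strongly convex quadratic:
def f(x):
    return (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2

def grad(x):
    return [2.0 * (x[0] - 1.0), 20.0 * (x[1] + 2.0)]

x_star = spectral_cg(f, grad, [0.0, 0.0])  # approaches the minimizer (1, -2)
```

The safeguard restart mirrors the role of the sufficient descent property studied in the paper: whenever gᵀd fails to be negative, the iteration resets to -g so that every accepted step decreases f.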

References

[1] Hestenes M R, Stiefel E. Methods of conjugate gradients for solving linear systems [J]. Journal of Research of the National Bureau of Standards, 1952, 49(6): 409-436.
[2] Fletcher R, Reeves C M. Function minimization by conjugate gradients [J]. The Computer Journal, 1964, 7(2): 149-154.
[3] Polak E, Ribière G. Note sur la convergence de méthodes de directions conjuguées [J]. Revue Française d'Informatique et de Recherche Opérationnelle, 1969, 3(16): 35-43.
[4] Polyak B T. The conjugate gradient method in extremal problems [J]. USSR Computational Mathematics and Mathematical Physics, 1969, 9(4): 94-112.
[5] Dai Y H, Yuan Y X. A nonlinear conjugate gradient method with a strong global convergence property [J]. SIAM Journal on Optimization, 1999, 10(1): 177-182.
[6] Hager W W, Zhang H C. A new conjugate gradient method with guaranteed descent and an efficient line search [J]. SIAM Journal on Optimization, 2005, 16(1): 170-192.
[7] Hager W W, Zhang H C. A survey of nonlinear conjugate gradient methods [J]. Pacific Journal of Optimization, 2006, 2(1): 35-58.
[8] Jiang X Z, Jian J B. Two modified nonlinear conjugate gradient methods with disturbance factors for unconstrained optimization [J]. Nonlinear Dynamics, 2014, 77(1-2): 387-397.
[9] Jiang X Z, Jian J B, Ma G D. Two conjugate gradient methods with sufficient descent property [J]. Acta Mathematica Sinica, Chinese Series, 2014, 57(2): 365-372. (in Chinese)
[10] Tsegay G W, Zhang H B, Zhang X, et al. A sufficient descent conjugate gradient method for solving nonlinear unconstrained optimization problems [J]. Operations Research Transactions, 2018, 22(3): 59-68. (in Chinese)
[11] Mtagulwa P, Kaelo P. An efficient modified PRP-FR hybrid conjugate gradient method for solving unconstrained optimization problems [J]. Applied Numerical Mathematics, 2019, 145: 111-120.
[12] Zhang L, Zhou W J, Li D H. Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search [J]. Numerische Mathematik, 2006, 104(4): 561-572.
[13] Liu J K, Feng Y M, Zou L M. A spectral conjugate gradient method for solving large-scale unconstrained optimization [J]. Computers & Mathematics with Applications, 2019, 77(3): 731-739.
[14] Babaie-Kafaki S. A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization [J]. 4OR: A Quarterly Journal of Operations Research, 2013, 11(4): 361-374.
[15] Andrei N. A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization [J]. Applied Mathematics Letters, 2007, 20(6): 645-650.
[16] Jian J B, Chen Q, Jiang X Z, et al. A new spectral conjugate gradient method for large-scale unconstrained optimization [J]. Optimization Methods & Software, 2017, 32(3): 503-515.
[17] Deng S H, Wan Z, Chen X H. An improved spectral conjugate gradient algorithm for nonconvex unconstrained optimization problems [J]. Journal of Optimization Theory and Applications, 2013, 157(3): 820-842.
[18] Aminifard Z, Babaie-Kafaki S. A modified descent Polak-Ribière-Polyak conjugate gradient method with global convergence property for nonconvex functions [J]. Calcolo, 2019, 56(2): 16.
[19] Birgin E G, Martínez J M. A spectral conjugate gradient method for unconstrained optimization [J]. Applied Mathematics and Optimization, 2001, 43(2): 117-128.
[20] Dai Y H, Yuan Y X. An efficient hybrid conjugate gradient method for unconstrained optimization [J]. Annals of Operations Research, 2001, 103: 33-47.
[21] Yan H, Chen L P. An extended AS-GN hybrid conjugate gradient algorithm [J]. Operations Research Transactions, 2010, 14(3): 122-128. (in Chinese)
[22] Liu J K, Li S J. New hybrid conjugate gradient method for unconstrained optimization [J]. Applied Mathematics & Computation, 2014, 245: 36-43.
[23] Liu J K, Feng Y M, Zou L M. Some three-term conjugate gradient methods with the inexact line search condition [J]. Calcolo, 2018, 55(2): 16.
[24] Kou C X, Dai Y H. A modified self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method for unconstrained optimization [J]. Journal of Optimization Theory and Applications, 2015, 165(1): 209-224.
[25] Tang C M, Li S Y, Cui Z R. Least-squares-based three-term conjugate gradient methods [J]. Journal of Inequalities and Applications, 2020, 2020: 27.
[26] Li M, Liu H W, Liu Z X. A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization [J]. Numerical Algorithms, 2018, 79: 195-219.
[27] Diao X L, Liu H W, Liu Z X. A new subspace minimization conjugate gradient method based on modified secant equation for unconstrained optimization [J]. Computational & Applied Mathematics, 2020, 39: 251.
[28] Al-Baali M. Descent property and global convergence of the Fletcher-Reeves method with inexact line search [J]. IMA Journal of Numerical Analysis, 1985, 5(1): 121-124.
[29] Gilbert J C, Nocedal J. Global convergence properties of conjugate gradient methods for optimization [J]. SIAM Journal on Optimization, 1992, 2(1): 21-42.
[30] Zoutendijk G. Nonlinear programming, computational methods [J]. Integer and Nonlinear Programming, 1970, 143: 37-86.
[31] Bongartz I, Conn A R, Gould N I M, et al. CUTE: constrained and unconstrained testing environments [J]. ACM Transactions on Mathematical Software, 1995, 21(1): 123-160.
[32] Moré J J, Garbow B S, Hillstrom K E. Testing unconstrained optimization software [J]. ACM Transactions on Mathematical Software, 1981, 7(1): 17-41.
[33] Andrei N. An unconstrained optimization test functions collection [J]. Advanced Modeling and Optimization, 2008, 10(1): 147-161.
[34] Dolan E D, Moré J J. Benchmarking optimization software with performance profiles [J]. Mathematical Programming, 2002, 91(2): 201-213.