运筹学学报 (Operations Research Transactions), 2021, Vol. 25, Issue 2: 81-92. doi: 10.15960/j.cnki.issn.1007-6093.2021.02.006


A scaled incremental gradient method

Xiaohui QIAN1, Xiangmei WANG1,*

  1. School of Mathematics and Statistics, Guizhou University, Guiyang 550025, China
  • Received: 2019-12-30  Online: 2021-06-15  Published: 2021-05-06
  • Contact: Xiangmei WANG  E-mail: xmwang2@gzu.edu.cn
  • Supported by: the National Natural Science Foundation of China (11661019), the Guizhou Provincial Science and Technology Program (20161039), and the Guizhou Provincial Innovation Team of Data-Driven Modeling, Learning and Optimization (Qiankehe Platform Talents [2020]5016)


Abstract:

A scaled incremental gradient algorithm is proposed for minimizing a separable objective given by a sum of continuously differentiable functions. At each iteration, the iterate is updated incrementally through a sequence of sub-steps, each of which cyclically takes the normalized negative gradient of a single component function (or of several component functions) as the search direction. Under some moderate assumptions, convergence of the algorithm with divergent step sizes is established. As applications, the new algorithm and the (unscaled) incremental gradient algorithm proposed by Bertsekas D P and Tsitsiklis J N are applied to the robust estimation problem and the source localization problem. Numerical experiments show that the new algorithm is more effective and robust than the corresponding (unscaled) one.
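To make the update rule concrete, the following is a minimal sketch of the scaled (normalized) incremental iteration described above; it is not the authors' code. The function name scaled_incremental_gradient, the callable step_sizes, and the toy least-squares data are all illustrative assumptions, and the step sizes are assumed to follow the usual divergent rule αk > 0, αk → 0, Σk αk = ∞.

    import numpy as np

    def scaled_incremental_gradient(grads, x0, step_sizes, n_cycles, eps=1e-12):
        # grads      : list of callables; grads[i](x) returns the gradient of f_i at x
        # x0         : starting point (1-D numpy array)
        # step_sizes : callable k -> alpha_k; assumed to satisfy the divergent
        #              rule alpha_k > 0, alpha_k -> 0, sum_k alpha_k = infinity
        # n_cycles   : number of full passes (cycles) over the component functions
        # eps        : safeguard so we never divide by a (near-)zero gradient norm
        x = np.asarray(x0, dtype=float)
        for k in range(n_cycles):
            alpha = step_sizes(k)               # one step size per cycle
            for g in grads:                     # one sub-step per component function
                d = g(x)
                norm = np.linalg.norm(d)
                if norm > eps:
                    x = x - alpha * (d / norm)  # unit-length (scaled) gradient step
        return x

    # Toy usage on a consistent least-squares sum f(x) = sum_i (a_i^T x - b_i)^2,
    # loosely in the spirit of the robust-estimation experiments (data invented here):
    rng = np.random.default_rng(0)
    A = rng.normal(size=(50, 3))
    x_true = np.array([1.0, -2.0, 0.5])
    b = A @ x_true
    grads = [lambda x, a=a, bi=bi: 2.0 * a * (a @ x - bi) for a, bi in zip(A, b)]
    x_hat = scaled_incremental_gradient(grads, np.zeros(3),
                                        step_sizes=lambda k: 1.0 / (k + 1),
                                        n_cycles=2000)

Because each sub-step has length at most alpha_k, a single badly scaled component gradient cannot dominate the update; this is the practical motivation for normalizing the component gradients before stepping.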

Key words: separable optimization, scaled incremental gradient algorithm, incremental gradient algorithm, divergent step size rule

CLC Number: