Operations Research Transactions ›› 2021, Vol. 25 ›› Issue (2): 81-92.doi: 10.15960/j.cnki.issn.1007-6093.2021.02.006


A scaled incremental gradient method

Xiaohui QIAN1, Xiangmei WANG1,*()   

  1. School of Mathematics and Statistics, Guizhou University, Guiyang 550025, China
  • Received:2019-12-30 Online:2021-06-15 Published:2021-05-06
  • Contact: Xiangmei WANG E-mail:xmwang2@gzu.edu.cn

Abstract:

A scaled incremental gradient algorithm for minimizing a sum of continuously differentiable functions is presented. At each iteration, the iterate is updated incrementally through a sequence of steps, each of which cyclically evaluates the normalized gradient of a single component function (or of several component functions). Under moderate assumptions, convergence of the algorithm with divergent step sizes is established. As applications, the new algorithm and the (unscaled) one proposed by Bertsekas D P and Tsitsiklis J N are applied to the robust estimation problem and the source localization problem, respectively. Numerical experiments show that the new algorithm is more effective and robust than the corresponding (unscaled) one.

Key words: separable optimization, scaled incremental gradient algorithm, incremental gradient algorithm, divergent step size rule
