Operations Research Transactions ›› 2021, Vol. 25 ›› Issue (3): 119-132.doi: 10.15960/j.cnki.issn.1007-6093.2021.03.007


A brief review on gradient method

SUN Cong*, ZHANG Ya   

  1. School of Science, Beijing University of Posts and Telecommunications, Beijing 100876, China
  Received: 2021-03-16    Published: 2021-09-26

Abstract: The gradient method is a class of first-order optimization methods. Owing to its simplicity and low per-iteration complexity, it is widely used for large-scale problems. This paper reviews gradient methods for smooth unconstrained problems, detailing their algorithmic frameworks and convergence theory. The crucial factor in a gradient method is the stepsize, which determines the method's convergence properties. The paper surveys stepsize update strategies and the corresponding convergence results from four aspects: line search, approximation techniques, stochastic techniques, and alternating and constant stepsizes. Related topics, including gradient methods for nonsmooth and constrained optimization problems, acceleration techniques, and stochastic gradient methods, are also discussed.
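As a concrete illustration of the line-search strategy mentioned in the abstract, the following is a minimal sketch of the gradient method with Armijo backtracking line search. It is a generic textbook formulation, not taken from the paper itself; all parameter names (`alpha0`, `rho`, `c`) and the quadratic test problem are illustrative assumptions.

```python
import numpy as np

def gradient_descent_armijo(f, grad, x0, alpha0=1.0, rho=0.5, c=1e-4,
                            tol=1e-8, max_iter=1000):
    """Gradient method with Armijo backtracking line search.

    A generic sketch: starting from the trial stepsize alpha0, the
    stepsize is shrunk by the factor rho until the sufficient-decrease
    (Armijo) condition holds, then a gradient step is taken.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        alpha = alpha0
        # Backtrack until f(x - alpha*g) <= f(x) - c * alpha * ||g||^2.
        while f(x - alpha * g) > f(x) - c * alpha * g.dot(g):
            alpha *= rho
        x = x - alpha * g
    return x

# Example: minimize the strongly convex quadratic f(x) = 0.5 x'Ax - b'x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A.dot(x)) - b.dot(x)
grad = lambda x: A.dot(x) - b
x_star = gradient_descent_armijo(f, grad, np.zeros(2))
```

For this quadratic, the iterates converge to the solution of `A x = b`; other stepsize rules surveyed in the paper (e.g., Barzilai–Borwein approximation stepsizes or constant stepsizes) would replace the backtracking loop with a different update for `alpha`.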

Key words: gradient method, smooth unconstrained optimization, stepsize update, line search, approximation
