Operations Research Transactions (运筹学学报), 2021, Vol. 25, Issue (3): 119-132. doi: 10.15960/j.cnki.issn.1007-6093.2021.03.007


A brief review on gradient method

SUN Cong*, ZHANG Ya   

  1. School of Science, Beijing University of Posts and Telecommunications, Beijing 100876, China
  • Received: 2021-03-16 Published: 2021-09-26
  • Corresponding author: SUN Cong, E-mail: suncong86@bupt.edu.cn
  • Supported by: National Natural Science Foundation of China (Nos. 11771056, 11871115)


Abstract: The gradient method is a class of first-order methods for solving optimization problems. Owing to its simple form and low computational cost per iteration, it is widely used for large-scale problems. This paper reviews gradient methods for smooth unconstrained optimization, presenting the iteration scheme and the theoretical framework. The crucial parameter in a gradient method is the stepsize, whose choice directly determines the convergence properties and convergence rate of the method. The stepsize update strategies and the corresponding convergence results are reviewed from four aspects: line search frameworks, approximation techniques, stochastic techniques, and alternating and constant stepsizes. Extensions such as gradient methods for nonsmooth and constrained optimization problems, acceleration techniques, and stochastic gradient methods are also briefly discussed.
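
For concreteness, the iteration scheme described above can be sketched as follows; the notation is ours rather than the paper's, and the Armijo and Barzilai-Borwein rules below are shown only as representative instances of the line-search and approximation-type stepsizes the abstract enumerates. For smooth unconstrained minimization of $f$, the gradient method iterates

\[ x_{k+1} = x_k - \alpha_k \nabla f(x_k), \qquad k = 0, 1, 2, \dots \]

A line-search stepsize such as the Armijo backtracking rule takes the largest $\alpha \in \{s, s\beta, s\beta^2, \dots\}$, with $s > 0$ and $\beta, c \in (0, 1)$, satisfying

\[ f(x_k - \alpha \nabla f(x_k)) \le f(x_k) - c\, \alpha \|\nabla f(x_k)\|^2, \]

while an approximation-type stepsize such as the Barzilai-Borwein choice sets, with $s_{k-1} = x_k - x_{k-1}$ and $y_{k-1} = \nabla f(x_k) - \nabla f(x_{k-1})$,

\[ \alpha_k^{\mathrm{BB1}} = \frac{s_{k-1}^{\top} s_{k-1}}{s_{k-1}^{\top} y_{k-1}}, \qquad \alpha_k^{\mathrm{BB2}} = \frac{s_{k-1}^{\top} y_{k-1}}{y_{k-1}^{\top} y_{k-1}}. \]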

Key words: gradient method, smooth unconstrained optimization, stepsize update, line search, approximation

CLC Number: