Operations Research Transactions ›› 2021, Vol. 25 ›› Issue (1): 96-106. doi: 10.15960/j.cnki.issn.1007-6093.2021.01.009


Optimizing first-order methods for smooth convex minimization with Q-linearly convergent gradients

Jiaqing YE1, Qianzhu CHEN2, Haiping HU2,*

  1. School of Information Engineering, Huainan Union University, Huainan 232038, Anhui, China
  2. College of Sciences, Shanghai University, Shanghai 200444, China
  • Received: 2019-03-18  Online: 2021-03-15  Published: 2021-03-05
  • Corresponding author: Haiping HU, E-mail: hu_jack@staff.shu.edu.cn
  • Supported by: Key Teaching and Research Project of Anhui Provincial Quality Engineering (2018jyxm1429)


Abstract:

Inspired by the performance estimation problem (PEP) approach, this paper optimizes the step-size coefficients of first-order methods for smooth convex minimization in which the gradients at the iterates converge Q-linearly, by examining the convergence bound (i.e., the efficiency) on the worst-case function error. A new and efficient first-order method, called QGM, is introduced; it admits a computationally efficient form similar to that of the optimized gradient method (OGM).
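For context, the step-size coefficients in question are those of the general fixed-step first-order class commonly studied with PEP, in which each iterate moves along a weighted combination of all past gradients. The minimal Python sketch below only illustrates that general class; it is not the paper's QGM, and the function name, the coefficient array H, and the quadratic test function are assumptions introduced here for illustration.

    import numpy as np

    def fixed_step_first_order_method(grad_f, x0, L, H, n_iter):
        # Generic fixed-step scheme for an L-smooth convex f:
        #   x_{i+1} = x_i - (1/L) * sum_{k<=i} H[i][k] * f'(x_k)
        # The lower-triangular coefficients H[i][k] play the role of the
        # step-size coefficients that PEP-type analyses tune.
        xs = [np.asarray(x0, dtype=float)]
        grads = []
        for i in range(n_iter):
            grads.append(grad_f(xs[i]))
            step = sum(H[i][k] * grads[k] for k in range(i + 1))
            xs.append(xs[i] - step / L)
        return xs[-1]

    # Plain gradient descent is the special case H[i][k] = 1 if k == i else 0;
    # optimized methods such as OGM correspond to other coefficient choices.
    L = 2.0
    grad_f = lambda x: L * x  # gradient of f(x) = x**2, an L-smooth convex function
    H = [[1.0 if k == i else 0.0 for k in range(i + 1)] for i in range(10)]
    print(fixed_step_first_order_method(grad_f, np.array([5.0]), L, H, 10))

Under this parametrization, the question addressed in the abstract is which coefficients give the best worst-case bound when the gradients at the iterates are required to converge Q-linearly.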

Key words: first-order methods, smooth convex minimization, gradient method


CLC number: