Operations Research Transactions (运筹学学报)


Block Alternating Proximal Gradient Algorithm for Convex-Nonconcave Minimax Problems

Huiling Zhang1, Yang Xu1, Zi Xu2

  1. Shanghai University
    2. Department of Mathematics, College of Sciences, Shanghai University
  • Received: 2020-10-26 Revised: 2021-02-03 Online: 2021-02-26 Published: 2021-02-26
  • Corresponding author: Zi Xu
  • Supported by:
    National Natural Science Foundation of China; National Natural Science Foundation of China; Natural Science Foundation of Shanghai

Block Alternating Proximal Gradient Algorithm for Convex-Nonconcave Minimax Problems

  • Received:2020-10-26 Revised:2021-02-03 Online:2021-02-26 Published:2021-02-26
  • Contact: Zi Xu

Abstract (Chinese): This paper proposes a block alternating proximal gradient algorithm for solving block convex-nonconcave minimax optimization problems. The algorithm is a single-loop algorithm: at each iteration, the proximal gradient method is used to alternately update each block of variables in the objective function. We prove theoretically that the iteration complexity of the algorithm to reach an $\varepsilon$-stationary point is $\mathcal{O}\left( \varepsilon ^{-4} \right)$. To the best of our knowledge, this is the first single-loop algorithm with a complexity guarantee for solving block convex-nonconcave minimax optimization problems.

Keywords (Chinese): minimax optimization problem, machine learning, alternating proximal gradient method

Abstract: This paper proposes a block-alternating proximal gradient algorithm to solve block convex-nonconcave minimax optimization problems. The algorithm is a single-loop algorithm: at each iteration, the proximal gradient method is used to alternately update each block of variables in the objective function. We prove theoretically that the algorithm reaches an $\varepsilon$-stationary point within $\mathcal{O}\left( \varepsilon ^{-4} \right)$ iterations. To the best of our knowledge, this is the first single-loop algorithm with a complexity guarantee for solving block convex-nonconcave minimax optimization problems.
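The single-loop alternation described in the abstract can be sketched on a toy saddle-point instance. The sketch below is not the paper's algorithm or setting: the bilinear coupling, the $\ell_1$ regularizer, the strongly concave term in $y$, and the step size are all illustrative assumptions, chosen only to show one proximal gradient (descent) step in $x$ followed by one gradient (ascent) step in $y$ per iteration.

```python
import numpy as np

def prox_l1(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def alternating_prox_gradient(A, lam=1.0, eta=0.1, steps=2000):
    """Single-loop alternating proximal gradient sketch for the toy problem
        min_x max_y  x^T A y + lam * ||x||_1 - 0.5 * ||y||^2.
    Each iteration performs one proximal gradient descent step in the x block,
    then one gradient ascent step in the y block, mirroring the alternation
    over blocks described in the abstract (illustrative data, not the paper's)."""
    m, n = A.shape
    rng = np.random.default_rng(0)
    x, y = rng.standard_normal(m), rng.standard_normal(n)
    for _ in range(steps):
        x = prox_l1(x - eta * (A @ y), eta * lam)  # prox-gradient descent in x
        y = y + eta * (A.T @ x - y)                # gradient ascent in y
    return x, y

x, y = alternating_prox_gradient(np.array([[0.5]]))
# for this data the unique saddle point is (x, y) = (0, 0)
```

For this toy instance the coupling term is small relative to the $\ell_1$ threshold, so the iterates contract to the saddle point at the origin; in the block setting of the paper, the $x$ update above would be repeated over each convex block in turn.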

Key words: minimax optimization problem, machine learning, alternating proximal gradient method