Operations Research Transactions ›› 2024, Vol. 28 ›› Issue (2): 131-142.doi: 10.15960/j.cnki.issn.1007-6093.2024.02.010


An accelerated proximal stochastic gradient method with variance reduction based on Polyak step size

Fusheng WANG1,*, Luyu SHI1

  1. School of Mathematics and Statistics, Taiyuan Normal University, Jinzhong 030619, Shanxi, China
  • Received: 2023-07-01  Online: 2024-06-15  Published: 2024-06-07
  • Contact: Fusheng WANG  E-mail: fswang2005@163.com

Abstract:

To solve stochastic composite optimization problems arising in machine learning, we propose a new accelerated proximal variance-reduced gradient algorithm, Acc-Prox-SVRG-Polyak, which combines the Acc-Prox-SVRG algorithm with the Polyak step size. Compared with existing algorithms, the new algorithm fully exploits the advantages of acceleration and the Polyak step size to improve accuracy. Convergence of the algorithm is established under standard assumptions, and its complexity is analyzed. Finally, numerical experiments on standard data sets verify the effectiveness of the new algorithm.

Key words: Polyak step size, variance reduction, machine learning, stochastic gradient
