Operations Research Transactions, 2022, Vol. 26, Issue (1): 1-22. DOI: 10.15960/j.cnki.issn.1007-6093.2022.01.001


Mini-batch stochastic block coordinate descent algorithm

Jia HU1, Tiande GUO1,2, Congying HAN1,2,*

  1. School of Mathematical Sciences, University of Chinese Academy of Sciences, Beijing 100049, China
  2. Key Laboratory of Big Data Mining and Knowledge Management, Chinese Academy of Sciences, Beijing 100190, China
  • Received: 2021-06-14; Online: 2022-03-15; Published: 2022-03-14
  • Contact: Congying HAN, E-mail: hancy@ucas.ac.cn

Abstract:

We study the mini-batch stochastic block coordinate descent algorithm (mSBD) for a class of structured stochastic optimization problems widely used in machine learning, where "structured" means that the feasible region of the problem has a block structure and the nonsmooth regularized part of the objective function is separable across the variable blocks. We present the basic mSBD and a variant of it for solving non-composite and composite problems, respectively. For the non-composite problem, we analyze the convergence properties of the algorithm without assuming uniformly bounded gradient variance. For the composite problem, we establish convergence of the algorithm without the usual Lipschitz gradient continuity assumption. Finally, we verify the effectiveness of mSBD through numerical experiments.
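To make the setting concrete, the structured problem described above can be written as minimizing E_ξ[F(x; ξ)] + Σ_{i=1}^b r_i(x_i) over x ∈ X_1 × ⋯ × X_b, and each mSBD iteration updates one randomly chosen block with a mini-batch stochastic proximal gradient step. Below is a minimal Python sketch of this scheme, assuming a least-squares loss with an ℓ1 regularizer (so the blockwise proximal map is soft-thresholding); the sampling oracle sample_batch, the constant step size, and the uniform block partition are illustrative assumptions, not the paper's exact algorithm or step-size rules.

import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1; the l1 norm is separable, so the
    # prox applies blockwise (and even coordinatewise).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def msbd(sample_batch, dim, n_blocks, lam=0.1, step=0.01,
         n_iters=2000, batch_size=32, seed=0):
    # Illustrative mini-batch stochastic block coordinate descent.
    # sample_batch(m) is a hypothetical oracle returning a mini-batch
    # (A, y) of m samples; the objective assumed here is
    #   E[0.5 * (a^T x - y)^2] + lam * ||x||_1.
    rng = np.random.default_rng(seed)
    x = np.zeros(dim)
    blocks = np.array_split(np.arange(dim), n_blocks)  # block structure of the variables
    for _ in range(n_iters):
        i = rng.integers(n_blocks)        # pick one coordinate block at random
        A, y = sample_batch(batch_size)   # draw a fresh mini-batch of samples
        idx = blocks[i]
        # Stochastic partial gradient of the smooth part w.r.t. block i only.
        g = A[:, idx].T @ (A @ x - y) / batch_size
        # Proximal step on block i; by separability of the regularizer
        # the prox touches only this block.
        x[idx] = soft_threshold(x[idx] - step * g, step * lam)
    return x

# Hypothetical usage on synthetic sparse-regression data:
rng = np.random.default_rng(1)
x_true = np.zeros(100); x_true[:5] = 1.0
def sample_batch(m):
    A = rng.standard_normal((m, 100))
    return A, A @ x_true + 0.01 * rng.standard_normal(m)
x_hat = msbd(sample_batch, dim=100, n_blocks=10)

Because the regularizer is separable across blocks, the per-iteration proximal computation involves only the selected block, which is exactly the structural property the abstract highlights and what keeps the per-iteration cost proportional to the block size rather than the full dimension.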

Key words: block coordinate descent, stochastic approximation, stochastic (composite) optimization, Hölder continuity, nonsmooth, nonconvex optimization
