Operations Research Transactions ›› 2026, Vol. 30 ›› Issue (1): 217-234.doi: 10.15960/j.cnki.issn.1007-6093.2026.01.016


Convergence analysis of an adaptive proximal gradient-subgradient algorithm for square-root-loss regression problems

YANG Jinji, SHEN Chungen†, YU Zhensheng   

  1. College of Sciences, University of Shanghai for Science and Technology, Shanghai 200093, China
  • Received: 2022-11-01; Published: 2026-03-16

Abstract: Square-root-loss regression problems have attracted great attention because the choice of the regularization parameter does not require prior knowledge of the standard deviation of the noise. However, the square-root loss function has a point of non-differentiability, which causes difficulties for numerical algorithms. In this paper, building on the work of Li et al. (2020), we refine the proofs of the local smoothness and locally restricted strong convexity of the square-root loss function. To overcome the numerical difficulties caused by the nonsmoothness of the loss function, we develop an adaptive proximal gradient-subgradient algorithm (APGSA). Under certain assumptions, the global convergence of the proposed algorithm is guaranteed with high probability. In addition, we prove that the algorithm identifies the active manifold in finitely many iterations, and we then establish a local linear convergence rate that holds with high probability. Finally, simulation experiments verify both the effectiveness and the fast local linear convergence of APGSA.
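To make the setting concrete, the square-root-loss regression problem discussed above is commonly written as min_x ||Ax - b||_2 + λ||x||_1 (the square-root lasso). The sketch below is an illustrative, non-adaptive proximal gradient-subgradient iteration for this model: it takes a gradient step of the square-root loss when the residual is nonzero, falls back to a subgradient (here, zero) when the residual vanishes, and applies the soft-thresholding proximal map of the ℓ1 term. The step-size rule, data, and parameter values are assumptions for illustration; this is not the paper's APGSA, whose adaptive step-size and convergence guarantees are developed in the text.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (componentwise shrinkage).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def sqrt_lasso_prox_grad(A, b, lam, step=0.005, n_iter=3000, tol=1e-12):
    """Illustrative proximal gradient-subgradient sketch for
        min_x ||Ax - b||_2 + lam * ||x||_1.
    A fixed step size is used here for simplicity; the paper's APGSA
    chooses the step adaptively."""
    _, n = A.shape
    x = np.zeros(n)
    for _ in range(n_iter):
        r = A @ x - b
        nr = np.linalg.norm(r)
        if nr > tol:
            g = A.T @ r / nr      # gradient of ||Ax - b||_2 away from r = 0
        else:
            g = np.zeros(n)       # 0 is a valid subgradient at r = 0
        x = soft_threshold(x - step * g, step * lam)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 5))
    x_true = np.array([1.0, -2.0, 0.0, 0.0, 0.0])   # sparse ground truth
    b = A @ x_true + 0.1 * rng.standard_normal(20)
    x_hat = sqrt_lasso_prox_grad(A, b, lam=0.5)
    obj = lambda x: np.linalg.norm(A @ x - b) + 0.5 * np.abs(x).sum()
    print(obj(x_hat) < obj(np.zeros(5)))            # objective improved over x = 0
```

Because soft-thresholding returns exact zeros, iterates of this kind of scheme eventually lie on the active manifold {x : x_i = 0 for i outside the support}, which is the finite identification behavior analyzed in the paper.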

Key words: local smoothness, locally restricted strong convexity, proximal gradient algorithm, proximal subgradient algorithm, partial smoothness
