Operations Research Transactions ›› 2026, Vol. 30 ›› Issue (1): 217-234. doi: 10.15960/j.cnki.issn.1007-6093.2026.01.016


  • Corresponding author: SHEN Chungen, E-mail: shenchungen@usst.edu.cn
  • Funding: National Natural Science Foundation of China (No. 12371308)

Convergence analysis of an adaptive proximal gradient-subgradient algorithm for square-root-loss regression problems

YANG Jinji, SHEN Chungen†, YU Zhensheng   

  1. College of Sciences, University of Shanghai for Science and Technology, Shanghai 200093, China
  • Received:2022-11-01 Published:2026-03-16



Abstract: Square-root-loss regression problems have attracted wide attention because the choice of the regularization parameter does not rely on an estimate of the noise variance. However, the square-root loss function has a non-differentiable point, which complicates the design of numerical algorithms for the SQRT-Lasso model. Building on the work of Li et al. (2020), we improve the proofs of local smoothness and locally restricted strong convexity of the square-root loss function. To overcome the numerical difficulty caused by the nonsmoothness of the loss function, we develop an adaptive proximal gradient-subgradient algorithm (APGSA). Under some assumptions, the global convergence of the proposed algorithm is guaranteed with high probability. We further prove that the algorithm identifies the active manifold in finitely many iterations, from which a local linear rate of convergence with high probability is established. Finally, simulation experiments verify both the effectiveness and the fast local linear convergence of APGSA.
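The abstract describes a proximal gradient-subgradient scheme for the square-root Lasso. As a minimal illustration only, the sketch below applies a plain proximal subgradient step with a fixed step size to min_x ||Ax - b||_2 + lam·||x||_1; it does not reproduce the adaptive step-size rule or convergence guarantees of APGSA, and the function names are hypothetical:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sqrt_lasso_prox_subgrad(A, b, lam, step=1e-2, max_iter=500):
    """Illustrative proximal subgradient iteration for the SQRT-Lasso model
        min_x ||A x - b||_2 + lam * ||x||_1.
    A simplified sketch, not the adaptive APGSA method of the paper."""
    _, p = A.shape
    x = np.zeros(p)
    for _ in range(max_iter):
        r = A @ x - b
        nr = np.linalg.norm(r)
        if nr > 1e-12:
            g = A.T @ r / nr      # gradient of ||A x - b||_2 away from r = 0
        else:
            g = np.zeros(p)       # 0 is a valid subgradient at r = 0
        # Forward (sub)gradient step on the loss, proximal step on the l1 term.
        x = soft_threshold(x - step * g, step * lam)
    return x
```

The non-differentiable point of the loss at Ax = b is exactly where the iteration falls back to a subgradient (here the zero vector), which is the computational difficulty the abstract refers to.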

Key words: local smoothness, locally restricted strong convexity, proximal gradient algorithm, proximal subgradient algorithm, partial smoothness

CLC number: