Operations Research

Iterative projection gradient hard thresholding pursuit algorithm for sparse optimization

  • 1. Key Laboratory of Discrete Mathematics with Applications (Ministry of Education), Fuzhou University, Fuzhou 350108, China;
    2. Fuzhou No.1 High School, Fuzhou 350116, China

Received date: 2017-01-24

Online published: 2019-03-15

Funding

National Natural Science Foundation of China (No. A11501115); Fujian Provincial Program for Cultivating Outstanding Young Talents in Universities (No. SX2016-21)

Cite this article

陈薪蓓, 朱明康, 陈建利. 基于迭代投影的梯度硬阈值追踪算法[J]. 运筹学学报, 2019, 23(1): 1-14. DOI: 10.15960/j.cnki.issn.1007-6093.2019.01.001

Abstract

The gradient hard thresholding pursuit algorithm is one of the most effective algorithms for solving sparse optimization problems. Since the greedy projection operator affects the performance of the algorithm, it is desirable to design a projection strategy that improves on the greedy one. In this paper, we first formulate the projection in the general sparsity-constrained optimization problem as an integer programming subproblem. Then, by iteratively solving this subproblem, we obtain a better support set for the sparse optimization problem, on which the original problem is then solved, improving the performance of the gradient hard thresholding pursuit algorithm. Finally, the convergence of the improved algorithm is proved, and its effectiveness is verified by numerical experiments.
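To make the baseline template concrete, the sketch below (Python, not the authors' code) instantiates a generic gradient hard thresholding pursuit iteration on the sparse least-squares problem min 0.5*||Ax - b||^2 subject to ||x||_0 <= k, using the standard greedy projection that keeps the k largest-magnitude entries, followed by a least-squares debiasing step on the selected support. The function names hard_threshold and grahtp, the step-size rule, and the demo data are illustrative assumptions; the iterative integer-programming projection subproblem proposed in the paper is not reproduced here.

# A minimal sketch of gradient hard thresholding pursuit (GraHTP) with the
# usual greedy top-k projection; illustrative only, not the authors' method.
import numpy as np


def hard_threshold(x, k):
    """Greedy projection onto {x : ||x||_0 <= k}: keep the k entries of
    largest magnitude and zero out the rest."""
    z = np.zeros_like(x)
    support = np.argsort(np.abs(x))[-k:]          # indices of the k largest |x_i|
    z[support] = x[support]
    return z, support


def grahtp(A, b, k, step=None, max_iter=200, tol=1e-8):
    """GraHTP for min 0.5*||Ax - b||^2 s.t. ||x||_0 <= k, with a
    least-squares debiasing step on the selected support."""
    m, n = A.shape
    if step is None:
        # Conservative step size: 1 / (largest eigenvalue of A^T A).
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(n)
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)                  # gradient of 0.5*||Ax - b||^2
        _, support = hard_threshold(x - step * grad, k)
        x_new = np.zeros(n)
        # Debias: solve the least-squares problem restricted to the support.
        x_new[support] = np.linalg.lstsq(A[:, support], b, rcond=None)[0]
        if np.linalg.norm(x_new - x) <= tol * max(np.linalg.norm(x), 1.0):
            x = x_new
            break
        x = x_new
    return x


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    m, n, k = 80, 200, 5
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    x_true = np.zeros(n)
    x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
    b = A @ x_true
    x_hat = grahtp(A, b, k)
    print("recovery error:", np.linalg.norm(x_hat - x_true))

The modification studied in the paper would replace the argsort-based support selection in hard_threshold with the support set obtained by iteratively solving the integer programming subproblem described in the abstract.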

