Some advances in low-rank matrix optimization

  • School of Science, Beijing Jiaotong University, Beijing 100044, China

Received date: 2020-04-22

  Online published: 2020-06-13

Funding

National Natural Science Foundation of China (Nos. 11971052, 11771038)


Cite this article

Li Xinrong, Xiu Naihua, Luo Ziyan. Some advances in low-rank matrix optimization[J]. Operations Research Transactions, 2020, 24(2): 23-41. DOI: 10.15960/j.cnki.issn.1007-6093.2020.02.003

Abstract

Low-rank matrix optimization is a class of matrix optimization problems with a rank-minimization objective or a rank constraint. It has wide applications in statistics and machine learning, signal and image processing, communication and quantum computing, system identification and control, and economics and finance, and is currently a key research direction in optimization and related fields. However, because the rank function is intrinsically non-convex and discontinuous, low-rank matrix optimization is in general NP-hard; the existing results are still far from complete, and further study is urgently needed. In this paper, we summarize and review some of the latest theoretical and algorithmic results on low-rank matrix optimization, together with the related key references.
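As a concrete illustration (not taken from the paper itself): the nuclear norm is the standard convex surrogate for the rank function, and its proximal operator — soft-thresholding of the singular values, as in the singular value thresholding (SVT) algorithm of reference [77] — is a basic building block of many of the algorithms surveyed. A minimal NumPy sketch:

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: the proximal operator of the
    nuclear norm tau * ||X||_*, computed by soft-thresholding the
    singular values of M by tau."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)
    return U @ np.diag(s_shrunk) @ Vt

# A rank-1 matrix plus small noise: the dominant singular value
# survives the threshold, the noise singular values do not.
rng = np.random.default_rng(0)
u = rng.normal(size=(20, 1))
v = rng.normal(size=(1, 20))
M = 5.0 * u @ v + 0.01 * rng.normal(size=(20, 20))

X = svt(M, tau=1.0)
print(np.linalg.matrix_rank(X, tol=1e-6))  # prints 1
```

By Weyl's inequality, all singular values of the noise term lie well below the threshold tau, so the output is exactly rank one; this shrink-then-reconstruct step is iterated inside SVT-type methods for matrix completion and affine rank minimization.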

References

[1] Anderson T W. Estimating linear restrictions on regression coefficients for multivariate normal distributions[J]. Annals of Mathematical Statistics, 1951, 22(3):327-351.
[2] El Ghaoui L, Gahinet P. Rank minimization under LMI constraints:A framework for output feedback problems[C]//Proceedings European Control Conference, 1993.
[3] Fazel M. Matrix Rank Minimization with Applications[D]. Stanford:Stanford University, 2002.
[4] Natarajan B. Sparse approximate solutions to linear systems[J]. SIAM Journal on Computing, 1995, 24(2):227-234.
[5] David J. Algorithms for Analysis and Design of Robust Controllers[D]. Leuven:University of Leuven, 1994.
[6] Delgado R A, Agüero J C, Goodwin G C. A rank-constrained optimization approach:Application to factor analysis[J]. IFAC Proceedings Volumes, 2014, 47(3):10373-10378.
[7] Chen Y, Wainwright M J. Fast low-rank estimation by projected gradient descent:General statistical and algorithmic guarantees[J]. arXiv:1509.03025.
[8] She Y, Chen K. Robust reduced-rank regression[J]. Biometrika, 2017, 104(3):633-647.
[9] Delvaux S, Van Barel M. A Givens-weight representation for rank structured matrices[J]. SIAM Journal on Matrix Analysis and Applications, 2007, 29(4):1147-1170.
[10] Wright J, Ma Y, Mairal J, et al. Sparse representation for computer vision and pattern recognition[J]. Proceedings of the IEEE, 2010, 98(6):1031-1044.
[11] Xue S, Jin X. Robust classwise and projective low-rank representation for image classification[J]. Signal, Image and Video Processing, 2018, 12(1):107-115.
[12] Xing E P, Jordan M I, Russell S J, et al. Distance metric learning with application to clustering with side-information[C]//International Conference on Neural Information Processing Systems, 2002, 521-528.
[13] Amit Y, Fink M, Srebro N, et al. Uncovering shared structures in multiclass classification[C]//Proceedings of the 24th International Conference on Machine Learning, 2007, 17-24.
[14] Gillis N, Glineur F. Low-rank matrix approximation with weights or missing data is NP-hard[J]. SIAM Journal on Matrix Analysis and Applications, 2011, 32(4):1149-1165.
[15] Kobayashi T. Low-rank bilinear classification:Efficient convex optimization and extensions[J]. International Journal of Computer Vision, 2014, 110(3):308-327.
[16] Zeng W J, So H C. Outlier-robust matrix completion via lp-minimization[J]. IEEE Transactions on Signal Processing, 2017, 66(5):1125-1140.
[17] Shechtman Y, Eldar Y C, Cohen O, et al. Phase retrieval with application to optical imaging:A contemporary overview[J]. IEEE Signal Processing Magazine, 2015, 32(3):87-109.
[18] Ahmed A, Romberg J. Compressive multiplexing of correlated signals[J]. IEEE Transactions on Information Theory, 2014, 61(1):479-498.
[19] Fallat S M, Hogben L. The minimum rank of symmetric matrices described by a graph:A survey[J]. Linear Algebra and its Applications, 2007, 426(2-3):558-582.
[20] Ames B P W, Vavasis S A. Nuclear norm minimization for the planted clique and biclique problems[J]. Mathematical Programming, 2009, 129(1):69-89.
[21] Dong W, Shi G, Li X, et al. Compressive sensing via nonlocal low-rank regularization[J]. IEEE Transactions on Image Processing, 2014, 23(8):3618-3632.
[22] Argyriou A, Evgeniou T, Pontil M. Multi-task feature learning[C]//Advances in Neural Information Processing Systems, 2007, 41-48.
[23] Cheng B, Liu G, Wang J, et al. Multi-task low-rank affinity pursuit for image segmentation[C]//2011 International Conference on Computer Vision, 2011, 2439-2446.
[24] Gross D, Liu Y K, Flammia S T. Quantum state tomography via compressed sensing[J]. Physical Review Letters, 2010, 105(15):150401.
[25] Arslan G, Demirkol M F, Song Y. Equilibrium efficiency improvement in MIMO interference systems:A decentralized stream control approach[J]. IEEE Transactions on Wireless Communications, 2007, 6(8):2984-2993.
[26] Flammia S T, Gross D, Liu Y K, et al. Quantum tomography via compressed sensing:Error bounds, sample complexity and efficient estimators[J]. New Journal of Physics, 2012, 14(9):095022.
[27] Kalev A, Kosut R L, Deutsch I H. Quantum tomography protocols with positivity are compressed sensing protocols[J]. NPJ Quantum Information, 2015, 1(1):1-6.
[28] Wang B, Hu Y, Gao J, et al. Localized LRR on Grassmann manifold:An extrinsic view[J]. IEEE Transactions on Circuits and Systems for Video Technology, 2017, 28(10):2524-2536.
[29] Orsi R, Helmke U, Moore J B. A Newton-like method for solving rank constrained linear matrix inequalities[J]. Automatica, 2006, 42(11):1875-1882.
[30] Mesbahi M, Papavassilopoulos G P. On the rank minimization problem over a positive semidefinite linear matrix inequality[J]. IEEE Transactions on Automatic Control, 1997, 42(2):239-243.
[31] Fazel M, Hindi H, Boyd S P. A rank minimization heuristic with application to minimum order system approximation[C]//Proceedings of the 2001 American Control Conference, 2001, 6:4734-4739.
[32] Fazel M, Pong T K, Sun D, et al. Hankel matrix rank minimization with applications to system identification and realization[J]. SIAM Journal on Matrix Analysis and Applications, 2013, 34(3):946-977.
[33] Ankelhed D. On Design of Low Order H-Infinity Controllers[D]. Linkoping:Linkoping University, 2011.
[34] Kim S J, Moon Y H. Structurally constrained H2 and H∞ control:A rank-constrained LMI approach[J]. Automatica, 2006, 42(9):1583-1588.
[35] Zare A, Chen Y, Jovanovic M R, et al. Low-complexity modeling of partially available secondorder statistics:Theory and an efficient matrix completion algorithm[J]. IEEE Transactions on Automatic Control, 2016, 62(3):1368-1383.
[36] Pietersz R, Groenen P J F. Rank reduction of correlation matrices by majorization[J]. Quantitative Finance, 2004, 4(6):649-662.
[37] Brigo D, Mercurio F. Interest Rate Models-Theory and Practice:With Smile, Inflation and Credit[M]. Berlin:Springer, 2006.
[38] Rebonato R. Modern Pricing of Interest-Rate Derivatives:The LIBOR Market Model and Beyond[M]. Princeton:Princeton University Press, 2012.
[39] Wu L. Fast at-the-money calibration of the LIBOR market model through Lagrange multipliers[J]. Journal of Computational Finance, 2002, 6(2):39-77.
[40] Zhang Z, Wu L. Optimal low-rank approximation to a correlation matrix[J]. Linear Algebra and its Applications, 2003, 364(1):161-187.
[41] Lillo F, Mantegna R. Spectral density of the correlation matrix of factor models:A random matrix theory approach[J]. Physical Review E, 2005, 72(1):016219.
[42] Negahban S, Wainwright M J. Estimation of (near) low-rank matrices with noise and high dimensional scaling[J]. Annals of Statistics, 2011, 39(2):1069-1097.
[43] Bach F R. Consistency of trace norm minimization[J]. Journal of Machine Learning Research, 2008, 9(Jun):1019-1048.
[44] Bunea F, She Y, Wegkamp M H. Joint variable and rank selection for parsimonious estimation of high-dimensional matrices[J]. Annals of Statistics, 2012, 40(5):2359-2388.
[45] Zhou H, Li L. Regularized matrix regression. Journal of the Royal Statistical Society[J]. 2014, 76(2):463-483.
[46] Wainwright M J. Structured regularizers for high-dimensional problems:Statistical and computational issues[J]. Annual Review of Statistics and its Application, 2014, 1(1):233-253.
[47] Bing X, Wegkamp M H. Adaptive estimation of the rank of the coefficient matrix in high dimensional multivariate response regression models[J]. Annals of Statistics, 2019, 47(6):3157-3184.
[48] Wright J, Ganesh A, Rao S, et al. Robust principal component analysis:Exact recovery of corrupted low-rank matrices via convex optimization[C]//Advances in Neural Information Processing Systems, 2009, 2080-2088.
[49] Lin Z, Ganesh A, Wright J, et al. Fast convex optimization algorithms for exact recovery of a corrupted low-rank matrix[C]//IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP), 2009.
[50] Candès E J, Li X, Ma Y, et al. Robust principal component analysis[J]. Journal of the ACM, 2011, 58(3):1-37.
[51] Chandrasekaran V, Sanghavi S, Parrilo P A, et al. Sparse and low-rank matrix decompositions[J]. IFAC Proceedings Volumes, 2009, 42(10):1493-1498.
[52] Chandrasekaran V, Sanghavi S, Parrilo P A, et al. Rank-sparsity incoherence for matrix decomposition[J]. SIAM Journal on Optimization, 2011, 21(2):572-596.
[53] Berge J, Kiers H. A numerical approach to the approximate and the exact minimum rank of a covariance matrix[J]. Psychometrika, 1991, 56(2):309-315.
[54] Bertsimas D, Copenhaver M S, Mazumder R. Certifiably optimal low rank factor analysis[J]. Journal of Machine Learning Research, 2016, 18(29):1-53.
[55] Khamaru K, Mazumder R. Computation of the maximum likelihood estimator in low-rank factor analysis[J]. Mathematical Programming, 2019, 176(1-2):279-310.
[56] Silverman L. Realization of linear dynamical systems[J]. IEEE Transactions on Automatic Control, 1971, 16(6):554-567.
[57] Grussler C, Rantzer A, Giselsson P. Low-rank optimization with convex constraints[J]. IEEE Transactions on Automatic Control, 2018, 63(11):4000-4007.
[58] Qi H, Shen J, Xiu N. A sequential majorization method for approximating weighted time series of finite rank[J]. Statistics and its Interface, 2018, 11(4):615-630.
[59] Xing E P, Jordan M I, Russell S J, et al. Distance metric learning with application to clustering with side-information[C]//International Conference on Neural Information Processing Systems, 2002, 521-528.
[60] Weinberger K Q, Saul L K. Distance metric learning for large margin nearest neighbor classification[J]. Journal of Machine Learning Research, 2009, 10(Feb):207-244.
[61] Davis J V, Kulis B, Jain P, et al. Information-theoretic metric learning[C]//Proceedings of the 24th International Conference on Machine Learning, 2007, 209-216.
[62] Luo L, Xie Y, Zhang Z, et al. Support matrix machines[C]//In International Conference on Machine Learning, 2015, 938-947.
[63] Qian C, Tran-Dinh Q, Fu S, et al. Robust multicategory support matrix machines[J]. Mathematical Programming, 2019, 176(1-2):429-463.
[64] Razzak I, Blumenstein M, Xu G. Multiclass support matrix machines by maximizing the interclass margin for single trial EEG classification[J]. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2019, 27(6):1117-1127.
[65] Duan B, Yuan J, Liu Y, et al. Quantum algorithm for support matrix machines[J]. Physical Review A, 2017, 96(3):032301.
[66] Markovsky I. Low Rank Approximation[M]. London:Springer, 2012.
[67] Davenport M A, Romberg J. An overview of low-rank matrix recovery from incomplete observations[J]. IEEE Journal of Selected Topics in Signal Processing, 2016, 10(4):608-622.
[68] Gao Y. Structured Low Rank Matrix Optimization Problems:A Penalty Approach[D]. Singapore:National University of Singapore, 2010.
[69] Ding C. An Introduction to a Class of Matrix Optimization Problems[D]. Singapore:National University of Singapore, 2012.
[70] Miao W. Matrix Completion Models with Fixed Basis Coefficients and Rank Regularized Problems with Hard Constraints[D]. Singapore:National University of Singapore, 2013.
[71] Bi S. Study for Multi-Stage Convex Relaxation Approach to Low-Rank Optimization Problems[D]. Guangzhou:South China University of Technology, 2014.
[72] Zheng F. Study on Models and Algorithms for Matrix Rank Minimization Problem[D]. Wuhan:Wuhan University, 2015.
[73] Chen Y. Algorithms for Low-Rank Semidefinite Matrix Recovery Problems[D]. Beijing:Beijing Jiaotong University, 2015.
[74] Recht B, Fazel M, Parrilo P A. Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization[J]. SIAM Review, 2010, 52(3):471-501.
[75] Recht B, Xu W, Hassibi B. Null space conditions and thresholds for rank minimization[J]. Mathematical Programming, 2011, 127(1):175-202.
[76] Kong L, Sun J, Xiu N. S-semigoodness for low-rank semidefinite matrix recovery[J]. Pacific Journal of Optimization, 2014, 10(1):73-83.
[77] Cai J F, Candès E J, Shen Z. A singular value thresholding algorithm for matrix completion[J]. SIAM Journal on Optimization, 2010, 20(4):1956-1982.
[78] Liu Z, Vandenberghe L. Interior-point method for nuclear norm approximation with application to system identification[J]. SIAM Journal on Matrix Analysis and Applications, 2010, 31(3):1235-1256.
[79] Ma S, Goldfarb D, Chen L. Fixed point and Bregman iterative methods for matrix rank minimization[J]. Mathematical Programming, 2011, 128(1-2):321-353.
[80] Toh K C, Yun S. An accelerated proximal gradient algorithm for nuclear norm regularized linear least squares problems[J]. Pacific Journal of Optimization, 2010, 6(3):615-640.
[81] Yang J, Yuan X. Linearized augmented Lagrangian and alternating direction methods for nuclear norm minimization[J]. Mathematics of Computation, 2013, 82(281):301-329.
[82] Huang B, Ma S, Goldfarb D. Accelerated linearized Bregman method[J]. Journal of Scientific Computing, 2013, 54(2-3):428-453.
[83] Liu Y J, Sun D, Toh K C. An implementable proximal point algorithmic framework for nuclear norm minimization[J]. Mathematical Programming, 2012, 133(1-2):399-436.
[84] Marjanovic G, Solo V. On lq optimization and matrix completion[J]. IEEE Transactions on Signal Processing, 2012, 60(11):5714-5724.
[85] Nie F, Wang H, Cai X, et al. Robust matrix completion via joint Schatten p-norm and lp-norm minimization[C]//IEEE International Conference on Data Mining, 2012, 566-574.
[86] Mohan K, Fazel M. Iterative reweighted algorithms for matrix rank minimization[J]. Journal of Machine Learning Research, 2012, 13(Nov):3441-3473.
[87] Chen Y, Xiu N, Peng D. Global solutions of non-Lipschitz s2-sp minimization over the positive semidefinite cone[J]. Optimization Letters, 2014, 8(7):2053-2064.
[88] Lu Z, Zhang Y, Lu J. lp regularized low-rank approximation via iterative reweighted singular value minimization[J]. Computational Optimization and Applications, 2017, 68(3):1-24.
[89] Kong L, Xiu N. Exact low-rank matrix recovery via nonconvex Schatten p-minimization[J]. Asia-Pacific Journal of Operational Research, 2013, 30(3):1340010.
[90] Oymak S, Mohan K, Fazel M, et al. A simplified approach to recovery conditions for low rank matrices[C]//2011 IEEE International Symposium on Information Theory Proceedings, 2011, 2318-2322.
[91] Fornasier M, Rauhut H, Ward R. Low-rank matrix recovery via iteratively reweighted least squares minimization[J]. SIAM Journal on Optimization, 2011, 21(4):1614-1640.
[92] Lu Y, Zhang L, Wu J. A smoothing majorization method for matrix minimization[J]. Optimization Methods and Software, 2015, 30(4):682-705.
[93] Peng D, Xiu N, Yu J. s1/2 regularization methods and fixed point algorithms for affine rank minimization problems[J]. Computational Optimization and Applications, 2017, 67(3):543-569.
[94] Fazel M, Hindi H, Boyd S P. Log-det heuristic for matrix rank minimization with applications to Hankel and Euclidean distance matrices[J]. American Control Conference, 2003, 3:2156-2162.
[95] Mohan K, Fazel M. New restricted isometry results for noisy low-rank recovery[C]//IEEE International Symposium on Information Theory, 2010, 1573-1577.
[96] Ghasemi H, Malek-Mohammadi M, Babaie-Zadeh M, et al. Matrix completion based on smoothed rank function[C]//IEEE International Conference on Acoustics, 2011, 3672-3675.
[97] Lai M, Xu Y, Yin W. Improved iteratively reweighted least squares for unconstrained smoothed lq minimization[J]. SIAM Journal on Numerical Analysis, 2013, 51(2):927-957.
[98] Malek-Mohammadi M, Babaie-Zadeh M, Amini A, et al. Recovery of low-rank matrices under affine constraints via a smoothed rank function[J]. IEEE Transactions on Signal Processing, 2014, 62(4):981-992.
[99] Miao W, Pan S, Sun D. A rank-corrected procedure for matrix completion with fixed basis coefficients[J]. Mathematical Programming, 2016, 159(1-2):289-338.
[100] Bi S, Pan S. Multistage convex relaxation approach to rank regularized minimization problems based on equivalent mathematical program with a generalized complementarity constraint[J]. SIAM Journal on Control and Optimization, 2017, 55(4):2493-2518.
[101] Bi S, Pan S, Sun D. A multi-stage convex relaxation approach to noisy structured low-rank matrix recovery[J]. Mathematical Programming Computation, 2020, DOI:10.1007/s12532-020-00177-4.
[102] Le H Y. Generalized subdifferentials of the rank function[J]. Optimization Letters, 2013, 7(4):731-743.
[103] Hiriart-Urruty J B, Le H Y. From Eckart and Young approximation to Moreau envelopes and vice versa[J]. RAIRO-Operations Research, 2013, 47(3):299-310.
[104] Hiriart-Urruty J B, Le H Y. The viscosity subdifferential of the rank function via the corresponding subdifferential of its Moreau envelopes[J]. Acta Mathematica Vietnamica, 2015, 40(4):735-746.
[105] Hiriart-Urruty J B. When only global optimization matters[J]. Journal of Global Optimization, 2013, 56(3):761-763.
[106] Lu Z, Zhang Y, Li X. Penalty decomposition methods for rank minimization[J]. Optimization Methods and Software, 2015, 30(3):531-558.
[107] Vandereycken B, Vandewalle S. A Riemannian optimization approach for computing low rank solutions of Lyapunov equations[J]. SIAM Journal on Matrix Analysis and Applications, 2010, 31(5):2553-2579.
[108] Shalit U, Weinshall D, Chechik G. Online learning in the embedded manifold of low-rank matrices[J]. Journal of Machine Learning Research, 2012, 13(Feb):429-458.
[109] Vandereycken B. Low-rank matrix completion by Riemannian optimization[J]. SIAM Journal on Optimization, 2013, 23(2):1214-1236.
[110] Meyer G. Geometric Optimization Algorithms for Linear Regression on Fixed-Rank Matrices[D]. Liege:University of Liege, 2011.
[111] Vandereycken B. Riemannian and Multilevel Optimization for Rank-Constrained Matrix Problems with Applications to Lyapunov Equations[D]. Leuven:Katholieke Universiteit Leuven, 2010.
[112] Ye J. Generalized low rank approximations of matrices[J]. Machine Learning, 2005, 61(1-3):167-191.
[113] Mishra B, Meyer G, Bach F, et al. Low-rank optimization with trace norm penalty[J]. SIAM Journal on Optimization, 2013, 23(4):2124-2149.
[114] Journee M, Bach F, Absil P A, et al. Low-rank optimization on the cone of positive semidefinite matrices[J]. SIAM Journal on Optimization, 2010, 20(5):2327-2351.
[115] Meyer G, Bonnabel S, Sepulchre R. Regression on fixed-rank positive semidefinite matrices:A Riemannian approach[J]. Journal of Machine Learning Research, 2011, 12(Feb):593-625.
[116] Boumal N. Optimization and estimation on manifolds[D]. Louvain:Universite catholique de Louvain, 2014.
[117] Huang W. Optimization algorithms on Riemannian manifolds with applications[D]. Florida:Florida State University, 2013.
[118] Hu J, Liu X, Wen Z, et al. A brief introduction to manifold optimization[J]. Journal of the Operations Research Society of China, 2020. DOI: 10.1007/s40305-020-00295-9.
[119] Absil P A, Mahony R, Sepulchre R. Optimization Algorithms on Matrix Manifolds[M]. Princeton:Princeton University Press, 2008.
[120] Wen Z, Yin W. A feasible method for optimization with orthogonality constraints[J]. Mathematical Programming, 2013, 142(1-2):397-434.
[121] Gao B, Liu X, Chen X, et al. A new first-order algorithmic framework for optimization problems with orthogonality constraints[J]. SIAM Journal on Optimization, 2018, 28(1):302-332.
[122] Jiang B, Dai Y H. A framework of constraint preserving update schemes for optimization on Stiefel Manifold[J]. Mathematical Programming, 2015, 153(2):535-575.
[123] Gao B, Liu X, Yuan Y. Parallelizable algorithms for optimization problems with orthogonality constraints[J]. SIAM Journal on Scientific Computing, 2019, 41(3):A1949-A1983.
[124] Hu J, Jiang B, Lin L, Wen Z, Yuan Y. Structured quasi-Newton methods for optimization with orthogonality constraints[J]. SIAM Journal on Scientific Computing, 2019, 41(4):A2239-A2269.
[125] Cai T T, Zhou W. Matrix completion via max-norm constrained optimization[J]. Electronic Journal of Statistics, 2016, 10:1493-1525.
[126] Grussler C, Giselsson P. Local convergence of proximal splitting methods for rank constrained problems[C]//2017 IEEE 56th Annual Conference on Decision and Control, 2017, 702-708.
[127] Burer S, Monteiro R D. A nonlinear programming algorithm for solving semidefinite programs via low-rank factorization[J]. Mathematical Programming, 2003, 95(2):329-357.
[128] Wen Z, Yin W, Zhang Y. Solving a low-rank factorization model for matrix completion by a nonlinear successive over-relaxation algorithm[J]. Mathematical Programming Computation, 2012, 4(4):333-361.
[129] Mardani M, Mateos G, Giannakis G B. Rank minimization for subspace tracking from incomplete data[C]//2013 IEEE International Conference on Acoustics, Speech and Signal Processing, 2013, 5681-5685.
[130] Sun R, Luo Z Q. Guaranteed matrix completion via non-convex factorization[J]. IEEE Transactions on Information Theory, 2016, 62(11):6535-6579.
[131] Zhu Z, Li Q, Tang G, et al. Global optimality in low-rank matrix optimization[J]. IEEE Transactions on Signal Processing, 2018, 66(13):3614-3628.
[132] Li Q, Zhu Z, Tang G. The non-convex geometry of low-rank matrix optimization[J]. Information and Inference:A Journal of the IMA, 2019, 8(1):51-96.
[133] Chi Y, Lu Y M, Chen Y. Nonconvex optimization meets low-rank matrix factorization:An overview[J]. IEEE Transactions on Signal Processing, 2019, 67(20):5239-5269.
[134] Tao P D. Duality in D.C. (difference of convex functions) optimization:Subgradient methods[J]. International Series of Numerical Mathematics, 1998, 84(1):277-293.
[135] Li Q, Qi H D. A sequential semismooth Newton method for the nearest low-rank correlation matrix problem[J]. SIAM Journal on Optimization, 2011, 21(4):1641-1666.
[136] Thi H A L, Pham Dinh T, Le H M, et al. DC approximation approaches for sparse optimization[J]. European Journal of Operational Research, 2015, 244(1):26-46.
[137] Bi S, Pan S. Error bounds for rank constrained optimization problems and applications[J]. Operations Research Letters, 2016, 44(3):336-341.
[138] Gotoh J Y, Takeda A, Tono K. DC formulations and algorithms for sparse optimization problems[J]. Mathematical Programming, 2018, 169(1):141-176.
[139] Liu T, Lu Z, Chen X, et al. An exact penalty method for semidefinite-box-constrained low-rank matrix optimization problems[J]. IMA Journal of Numerical Analysis, 2020, 40(1):563-586.
[140] Overton M L, Womersley R S. Optimality conditions and duality theory for minimizing sums of the largest eigenvalues of symmetric matrices[J]. Mathematical Programming, 1993, 62(1-3):321-357.
[141] Dattorro J. Convex Optimization and Euclidean Distance Geometry[M]. New York:Meboo Publishing, 2005.
[142] Hempel A B, Goulart P J. A novel method for modelling cardinality and rank constraints[C]//2014 IEEE 53rd Annual Conference on Decision and Control, 2014, 4322-4327.
[143] Delgado R A, Agüero J C, Goodwin G C. A novel representation of rank constraints for real matrices[J]. Linear Algebra and its Applications, 2016, 496:452-462.
[144] Zhou S, Xiu N, Qi H. A fast matrix majorization-projection method for penalized stress minimization with box constraints[J]. IEEE Transactions on Signal Processing, 2018, 66(16):4331-4346.
[145] Bouligand G. Sur les surfaces depourvues de points hyperlimites[J]. Ann. Soc. Polon. Math, 1930, 9:32-41.
[146] Clarke F H. Necessary Conditions for Nonsmooth Problems in Optimal Control and the Calculus of Variations[D]. Washington:University of Washington, 1973.
[147] Mordukhovich B S. Maximum principle in a problem of time optimal control with nonsmooth constraints[J]. Prikladnaia Matematika I Mekhanika, 1976, 40(6):1014-1023.
[148] Rockafellar R T, Wets R J B. Variational Analysis[M]. New York:Springer, 2013.
[149] Attouch H, Jerome B, Benar F S. Convergence of descent methods for semi-algebraic and tame problems:Proximal algorithms, forward-backward splitting, and regularized Gauss Seidel methods[J]. Mathematical Programming, 2013, 137(1-2):91-129.
[150] Pan L, Xiu N, Zhou S. On solutions of sparsity constrained optimization[J]. Journal of the Operations Research Society of China, 2015, 3(4):421-439.
[151] Beck A, Eldar Y C. Sparsity constrained nonlinear optimization:Optimality conditions and algorithms[J]. SIAM Journal on Optimization, 2013, 23(3):1480-1509.
[152] Li X, Xiu N, Zhou S. Matrix optimization over low-rank spectral sets:Stationary points and local and global minimizers[J]. Journal of Optimization Theory and Applications, 2020, 184(3):895-930.
[153] Horn R A, Johnson C R. Matrix Analysis[M]. Cambridge:Cambridge University Press, 2013.
[154] Helmke U, Shayman M A. Critical points of matrix least squares distance functions[J]. Linear Algebra and its Applications, 1995, 215(2):1-19.
[155] Schneider R, Uschmajew A. Convergence results for projected line-search methods on varieties of low-rank matrices via Lojasiewicz inequality[J]. SIAM Journal on Optimization, 2015, 25(1):622-646.