Operations Research

Funding

National Natural Science Foundation of China (Nos. 11431002, 11671029)

High-dimensional constrained matrix regression problems

  • 1. School of Science, Beijing Jiaotong University, Beijing 100044, China; 2. School of Mathematics, University of Southampton, Highfield, Southampton SO17 1BJ, UK

Received date: 2017-03-24

  Online published: 2017-06-15


Cite this article

Kong Lingchen, Chen Bingzhen, Xiu Naihua, Qi Houduo. High-dimensional constrained matrix regression problems [J]. Operations Research Transactions, 2017, 21(2): 31-38. DOI: 10.15960/j.cnki.issn.1007-6093.2017.02.004

Abstract

High-dimensional constrained matrix regression refers to statistical regression with multivariate responses and multivariate predictors under non-convex constraints in the high-dimensional setting. Its mathematical model is a matrix optimization problem that is generally NP-hard, and it has a wide range of applications in areas such as machine learning and artificial intelligence, medical imaging and diagnosis, gene expression analysis, brain networks, and risk management. This paper briefly reviews recent results on the optimization theory and algorithms for high-dimensional constrained matrix regression and lists the corresponding important references.
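To make the problem class concrete, the following sketch solves one representative instance: rank-constrained least-squares matrix regression, min ||Y − XB||_F² subject to rank(B) ≤ r, by projected gradient descent, in the spirit of the approach surveyed in reference [15]. The rank constraint is non-convex (hence the NP-hardness noted above), so this is a heuristic local method, not a guaranteed global solver; the function name, step-size rule, and parameters below are illustrative assumptions, not code from the paper.

```python
import numpy as np

def projected_gradient_matrix_regression(X, Y, r, step=None, iters=500):
    """Solve min_B ||Y - X B||_F^2 s.t. rank(B) <= r by projected
    gradient descent: a gradient step on the smooth least-squares loss,
    followed by projection onto the non-convex rank ball via truncated SVD."""
    n, p = X.shape
    q = Y.shape[1]
    if step is None:
        # 1/L step, where L = 2 * sigma_max(X)^2 is the gradient's
        # Lipschitz constant for the loss ||Y - XB||_F^2
        step = 1.0 / (2.0 * np.linalg.norm(X, 2) ** 2)
    B = np.zeros((p, q))
    for _ in range(iters):
        grad = 2.0 * X.T @ (X @ B - Y)          # gradient of the loss at B
        Z = B - step * grad                     # gradient step
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        s[r:] = 0.0                             # keep only the top-r singular values
        B = (U * s) @ Vt                        # projected iterate, rank(B) <= r
    return B

# usage: recover a planted rank-2 coefficient matrix from noisy responses
rng = np.random.default_rng(0)
n, p, q, r = 200, 30, 20, 2
X = rng.standard_normal((n, p))
B_true = rng.standard_normal((p, r)) @ rng.standard_normal((r, q))
Y = X @ B_true + 0.01 * rng.standard_normal((n, q))
B_hat = projected_gradient_matrix_regression(X, Y, r)
print(np.linalg.norm(B_hat - B_true) / np.linalg.norm(B_true))  # small relative error
```

The projection step relies on the Eckart-Young theorem: truncating the SVD to the top r singular values gives the closest rank-r matrix in Frobenius norm, so each iterate stays feasible while the loss decreases.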

References

[1] Zhou H, Li L. Regularized matrix regression [J]. Journal of the Royal Statistical Society, Series B, 2014, 76: 463-483.
[2] Wainwright M J. Structured regularizers for high-dimensional problems: statistical and computational issues [J]. Annual Review of Statistics and Its Application, 2014, 1: 233-253.
[3] Obozinski G, Wainwright M J, Jordan M I. Support union recovery in high-dimensional multivariate regression [J]. Annals of Statistics, 2011, 39(1): 1-47.
[4] Negahban S, Wainwright M J. Estimation of (near) low-rank matrices with noise and high-dimensional scaling [J]. Annals of Statistics, 2011, 39(2): 1069-1097.
[5] Peng J, Zhu J, Bergamaschi A, et al. Regularized multivariate regression for identifying master predictors with application to integrative genomics study of breast cancer [J]. The Annals of Applied Statistics, 2010, 4(1): 53-77.
[6] Yin J, Li H. Model selection and estimation in the matrix normal graphical model [J]. Journal of Multivariate Analysis, 2012, 107: 119-140.
[7] Reiss P, Ogden R. Functional generalized linear models with images as predictors [J]. Biometrics, 2010, 66: 61-69.
[8] Xia Y, Li L. Hypothesis testing of matrix graph model with application to brain connectivity analysis [J]. arXiv:1511.00718, 2015.
[9] Leng C, Tang C Y. Sparse matrix graphical models [J]. Journal of the American Statistical Association, 2012, 107(499): 1187-1200.
[10] Jordan M I, Mitchell T M. Machine learning: trends, perspectives, and prospects [J]. Science, 2015, 349: 255-260.
[11] Pollack J R, Sørlie T, Perou C M, et al. Microarray analysis reveals a major direct role of DNA copy number alteration in the transcriptional program of human breast tumors [J]. Proceedings of the National Academy of Sciences of the United States of America, 2002, 99: 12963-12968.
[12] Jeong H, Mason S P, Barabási A L, et al. Lethality and centrality in protein networks [J]. Nature, 2001, 411: 41-42.
[13] Gardner T S, di Bernardo D, Lorenz D, et al. Inferring genetic networks and identifying compound mode of action via expression profiling [J]. Science, 2003, 301: 102-105.
[14] Zhang X L, Begleiter H, Porjesz B, et al. Event related potentials during object recognition tasks [J]. Brain Research Bulletin, 1995, 38: 531-538.
[15] Chen Y, Wainwright M J. Fast low-rank estimation by projected gradient descent: general statistical and algorithmic guarantees [J]. arXiv:1509.03025, 2015.
[16] Wang W, Liang Y, Xing E P. Block regularized lasso for multivariate multi-response linear regression [C]// Proceedings of the 16th International Conference on Artificial Intelligence and Statistics, 2013, 31: 608-617.
[17] Li Y, Nan B, Zhu J. Multivariate sparse group lasso for the multivariate multiple linear regression with an arbitrary group structure [J]. Biometrics, 2015, 71(2): 354-363.
[18] Katayama S, Imori S. Lasso penalized model selection criteria for high-dimensional multivariate linear regression analysis [J]. Journal of Multivariate Analysis, 2014, 132: 138-150.
[19] Chen B Z, Kong L C. High-dimensional least square matrix regression via elastic net penalty [J]. Pacific Journal of Optimization, 2017, 13(2): 185-196.
[20] Cook R D, Li B, Chiaromonte F. Envelope models for parsimonious and efficient multivariate linear regression (with discussion) [J]. Statistica Sinica, 2010, 20: 927-1010.
[21] Su Z, Cook R D. Inner envelopes: efficient estimation in multivariate linear regression [J]. Biometrika, 2012, 99: 687-702.
[22] Cook R D, Su Z. Scaled envelopes: Scale invariant and efficient estimation in multivariate linear regression [J]. Biometrika, 2013, 100: 921-938.
[23] Li L, Zhang X. Parsimonious tensor response regression [J]. arXiv:1501.07815, 2015.