Operations Research Transactions ›› 2025, Vol. 29 ›› Issue (3): 34-60. doi: 10.15960/j.cnki.issn.1007-6093.2025.03.002
HU Shenglong*
Received: 2025-04-10
Published: 2025-09-09
Corresponding author: HU Shenglong, E-mail: hushenglong@nudt.edu.cn
Abstract: The uniqueness of tensor decomposition is a key foundation for modeling low-rank tensor decomposition and low-rank tensor approximation optimization problems in many applications, and it provides a powerful theory for identifying system parameters. This paper briefly surveys the basic concepts of the uniqueness theory for tensor decomposition, classical results such as Kruskal's theorem, necessary conditions for uniqueness, the Jennrich-Harshman theory and its extensions, partial-uniqueness theory, uniqueness of block decompositions, and uniqueness in the statistical sense. An understanding of these basic properties lays a theoretical foundation for further research on the modeling, analysis, solution, and verification of the corresponding low-rank tensor decomposition and low-rank tensor approximation optimization models.
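The Jennrich-Harshman theory mentioned in the abstract admits a short computational illustration: for an exact rank-r third-order tensor whose first two factor matrices have full column rank and whose third factor is generic, a generalized eigendecomposition of two random slice mixtures recovers the rank-one terms. Below is a minimal NumPy sketch under those assumptions; the function names (`jennrich`, `khatri_rao`) are illustrative, not from the paper.

```python
import numpy as np

def khatri_rao(A, B):
    """Column-wise Kronecker product of A (I x r) and B (J x r)."""
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, A.shape[1])

def jennrich(T, r, seed=0):
    """Recover CP factors of an exact rank-r tensor T = sum_r a_r o b_r o c_r
    by simultaneous diagonalization (Jennrich/Harshman-style), assuming the
    first two factor matrices have full column rank and the slices are generic."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    x = rng.standard_normal(K)
    y = rng.standard_normal(K)
    M1 = np.einsum('ijk,k->ij', T, x)   # equals A diag(C^T x) B^T
    M2 = np.einsum('ijk,k->ij', T, y)   # equals A diag(C^T y) B^T
    # M1 pinv(M2) = A diag(lam) A^+ with lam_r = (c_r^T x)/(c_r^T y):
    # its eigenvectors give the columns of A; the transposed analogue gives B.
    wA, VA = np.linalg.eig(M1 @ np.linalg.pinv(M2))
    wB, VB = np.linalg.eig(M1.T @ np.linalg.pinv(M2.T))
    # keep the r dominant eigenpairs and pair A- and B-eigenvectors by
    # matching their (generically distinct) eigenvalues
    iA = np.argsort(-np.abs(wA))[:r]
    iB = np.argsort(-np.abs(wB))[:r]
    wA, A = wA[iA], VA[:, iA]
    wB, B = wB[iB], VB[:, iB]
    perm = [int(np.argmin(np.abs(wB - lam))) for lam in wA]
    A, B = np.real(A), np.real(B[:, perm])
    # recover C (absorbing all scaling ambiguity) by least squares from the
    # mode-3 unfolding: T_(3) = C * khatri_rao(A, B)^T
    T3 = T.reshape(I * J, K).T
    C = T3 @ np.linalg.pinv(khatri_rao(A, B).T)
    return A, B, C
```

The recovered factors match the original ones only up to the trivial permutation and scaling ambiguities, which is exactly the sense in which the decomposition is called unique; reconstructing the tensor from `(A, B, C)` reproduces `T`.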
HU Shenglong. Uniqueness of tensor canonical polyadic decomposition[J]. Operations Research Transactions, 2025, 29(3): 34-60.