A tensor completion method based on tensor train decomposition and its application in image restoration

  • 1. School of Sciences, Hangzhou Dianzi University, Hangzhou 310018, Zhejiang, China
凌晨, E-mail: macling@hdu.edu.cn

Received date: 2022-01-18

Online published: 2022-09-07

Funding

National Natural Science Foundation of China (11971138)



Cite this article

谢文蕙, 凌晨, 潘晨健. A tensor completion method based on tensor train decomposition and its application in image restoration[J]. Operations Research Transactions, 2022, 26(3): 31-43. DOI: 10.15960/j.cnki.issn.1007-6093.2022.03.003

Abstract

Low-rank tensor completion is widely used in data recovery, and tensor completion models based on the tensor train (TT) decomposition perform well in recovering color images, videos, and internet data. This paper proposes a completion model based on the TT decomposition of third-order tensors. In the model, a sparse regularization term and a spatio-temporal regularization term are introduced to characterize the sparsity of the core tensor and the inherent block similarity of the data, respectively. Exploiting the structure of the problem, auxiliary variables are introduced to equivalently reformulate the model in a separable form, which is then solved by a method combining proximal alternating minimization (PAM) with the alternating direction method of multipliers (ADMM). Numerical experiments show that the two regularization terms improve both the stability and the practical quality of data recovery, and that the proposed method outperforms the other methods compared; it is particularly effective when the sampling rate is low or the image exhibits structured missing entries.
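The model builds on the TT decomposition of a third-order tensor, which factors X of shape (n1, n2, n3) into three cores G1 (n1 × r1), G2 (r1 × n2 × r2), and G3 (r2 × n3). As a minimal illustration of that decomposition only (a standard TT-SVD sketch in NumPy, not the paper's completion algorithm; function names and the rank choices are assumptions for illustration):

```python
import numpy as np

def tt_svd(X, ranks):
    """Sketch of TT-SVD for a third-order tensor X of shape (n1, n2, n3).

    ranks = (r1, r2) are the TT ranks used for truncation; when they
    equal the ranks of the corresponding unfoldings, the decomposition
    is exact.
    """
    n1, n2, n3 = X.shape
    r1, r2 = ranks
    # First unfolding: n1 x (n2*n3); a truncated SVD yields the first core.
    U, S, Vt = np.linalg.svd(X.reshape(n1, n2 * n3), full_matrices=False)
    G1 = U[:, :r1]                                    # core 1: (n1, r1)
    # Carry S @ Vt forward and refold it into an (r1*n2) x n3 matrix.
    W = (np.diag(S[:r1]) @ Vt[:r1]).reshape(r1 * n2, n3)
    # A second SVD splits off the middle and last cores.
    U, S, Vt = np.linalg.svd(W, full_matrices=False)
    G2 = U[:, :r2].reshape(r1, n2, r2)                # core 2: (r1, n2, r2)
    G3 = np.diag(S[:r2]) @ Vt[:r2]                    # core 3: (r2, n3)
    return G1, G2, G3

def tt_reconstruct(G1, G2, G3):
    # Contract the TT cores back into a full third-order tensor.
    return np.einsum('ia,ajb,bk->ijk', G1, G2, G3)
```

With full ranks the reconstruction is exact; choosing smaller r1, r2 gives the low-TT-rank approximation that completion models of this kind exploit, and the regularization terms in the paper act on the resulting cores.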
