Enhanced Sparsity Prior Model for Low-Rank Tensor Completion
Published in: IEEE Transactions on Neural Networks and Learning Systems, Vol. 31, No. 11, pp. 4567-4581
Main Authors: , , , ,
Format: Journal Article
Language: English
Published: United States: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01-11-2020
Summary: Conventional tensor completion (TC) methods generally assume that the sparsity of tensor-valued data lies in the global subspace, and this so-called global sparsity prior is measured by the tensor nuclear norm. Such an assumption is not reliable for recovering low-rank (LR) tensor data, especially when a considerable portion of the data is missing. To mitigate this weakness, this article presents an enhanced sparsity prior model for LR tensor completion (LRTC) that uses both local and global sparsity information in a latent LR tensor. Specifically, a doubly weighted strategy is adopted for the nuclear norm along each mode to characterize the global sparsity prior of the tensor. Unlike traditional tensor-based local sparsity descriptions, the proposed factor gradient sparsity prior in the Tucker decomposition model describes the underlying subspace local smoothness of real-world tensor objects, simultaneously characterizing local piecewise structure over all dimensions. Moreover, the proposed local sparsity prior does not require minimizing the rank of a tensor. Extensive experiments on synthetic data, real-world hyperspectral images, and face modeling data demonstrate that the proposed model outperforms state-of-the-art techniques in prediction capability and efficiency.
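The summary mentions a weighted nuclear norm along each mode (the global prior) and a factor gradient sparsity prior on Tucker factors (the local prior). The following is a minimal NumPy sketch of those two generic ingredients: mode-k unfolding with weighted singular value thresholding, and an l1 penalty on first-order differences of a factor matrix. All function names and the specific weight rule are illustrative assumptions, not the paper's algorithm.

import numpy as np

def mode_unfold(tensor, mode):
    # Matricize: rows index the chosen mode, columns the remaining modes.
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def mode_fold(matrix, mode, shape):
    # Inverse of mode_unfold for a tensor of the given original shape.
    lead = (shape[mode],) + tuple(s for i, s in enumerate(shape) if i != mode)
    return np.moveaxis(matrix.reshape(lead), 0, mode)

def weighted_svt(matrix, tau, eps=1e-6):
    # Proximal step of a weighted nuclear norm: each singular value is
    # shrunk by a weight inversely proportional to its magnitude, so large,
    # informative singular values are penalized less (a common reweighting
    # rule; the paper's doubly weighted scheme may differ in its details).
    u, s, vt = np.linalg.svd(matrix, full_matrices=False)
    s_shrunk = np.maximum(s - tau / (s + eps), 0.0)
    return (u * s_shrunk) @ vt

def factor_gradient_l1(factor):
    # Illustrative local prior: l1 norm of first-order differences along
    # the rows of a Tucker factor matrix (a total-variation-like penalty).
    return np.abs(np.diff(factor, axis=0)).sum()

# Toy usage: one weighted shrinkage pass per mode of a random 3-way tensor.
shape = (10, 12, 14)
x = np.random.randn(*shape)
for k in range(x.ndim):
    x = mode_fold(weighted_svt(mode_unfold(x, k), tau=0.5), k, shape)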
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2019.2956153