Self-Similarity Constrained Sparse Representation for Hyperspectral Image Super-Resolution

Bibliographic Details
Published in: IEEE Transactions on Image Processing, Vol. 27, No. 11, pp. 5625-5637
Main Authors: Han, Xian-Hua; Shi, Boxin; Zheng, Yinqiang
Format: Journal Article
Language: English
Published: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), United States, 01-11-2018
Description
Summary: Fusing a low-resolution hyperspectral (HS) image with the corresponding high-resolution multispectral image to obtain a high-resolution HS image is an important technique for capturing comprehensive scene information in both the spatial and spectral domains. Existing approaches adopt a sparsity-promoting strategy and encode the spectral information of each pixel independently, which results in noisy sparse representations. We propose a novel HS image super-resolution method based on a self-similarity constrained sparse representation. We exploit similar patch structures across the whole image, as well as pixels with similar appearance in local regions, to build global-structure groups and local-spectral super-pixels. By enforcing similarity among the sparse representations of pixels belonging to the same group or super-pixel, we alleviate the effect of outliers in the learned sparse codes. Experimental results on benchmark datasets validate that the proposed method outperforms state-of-the-art methods in both quantitative metrics and visual quality.
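
To make the fusion idea in the summary concrete, the following is a minimal, illustrative Python sketch, not the authors' implementation: it learns a spectral dictionary from the LR-HS pixels, codes each HR-MS pixel on that dictionary projected through an assumed spectral response matrix srf, shrinks each code toward the mean code of its group/super-pixel as a simple stand-in for the self-similarity constraint, and reconstructs the HR-HS image. All array shapes, the ridge-based coder, and the precomputed labels argument are illustrative assumptions.

import numpy as np

def learn_spectral_dictionary(lr_hs, n_atoms=64, n_iter=20, thresh=0.01, seed=0):
    # Toy alternating least-squares dictionary learner over the LR-HS pixels;
    # soft-thresholding the codes is a crude stand-in for sparse coding.
    pixels = lr_hs.reshape(-1, lr_hs.shape[-1]).T          # bands x pixels
    rng = np.random.default_rng(seed)
    D = rng.standard_normal((pixels.shape[0], n_atoms))
    D /= np.linalg.norm(D, axis=0, keepdims=True)
    for _ in range(n_iter):
        A = np.linalg.lstsq(D, pixels, rcond=None)[0]        # codes (atoms x pixels)
        A = np.sign(A) * np.maximum(np.abs(A) - thresh, 0.0) # promote sparsity
        D = np.linalg.lstsq(A.T, pixels.T, rcond=None)[0].T
        D /= np.linalg.norm(D, axis=0, keepdims=True) + 1e-12
    return D                                               # bands x atoms

def fuse(lr_hs, hr_ms, srf, labels, lam=0.5):
    # srf: (ms_bands x hs_bands) spectral response matrix (assumed known);
    # labels: one group / super-pixel id per HR pixel (assumed precomputed).
    D = learn_spectral_dictionary(lr_hs)                   # hs_bands x atoms
    D_ms = srf @ D                                         # dictionary as seen by the MS sensor
    Y = hr_ms.reshape(-1, hr_ms.shape[-1]).T               # ms_bands x pixels
    # Ridge-regularized coding stands in for the paper's sparse coding step.
    K = D_ms.shape[1]
    A = np.linalg.solve(D_ms.T @ D_ms + 1e-3 * np.eye(K), D_ms.T @ Y)
    # Self-similarity constraint: pull each code toward the mean code of its
    # group, suppressing outlier codes within a group / super-pixel.
    flat = np.asarray(labels).ravel()
    for g in np.unique(flat):
        idx = flat == g
        A[:, idx] = (1 - lam) * A[:, idx] + lam * A[:, idx].mean(axis=1, keepdims=True)
    return (D @ A).T.reshape(hr_ms.shape[0], hr_ms.shape[1], -1)

# Toy usage: 8x8 LR-HS with 31 bands, 32x32 HR-MS with 3 bands, block labels.
lr_hs = np.random.rand(8, 8, 31)
hr_ms = np.random.rand(32, 32, 3)
srf = np.random.rand(3, 31); srf /= srf.sum(axis=1, keepdims=True)
labels = np.arange(32 * 32) // 64                          # fake super-pixel ids
print(fuse(lr_hs, hr_ms, srf, labels).shape)               # (32, 32, 31)

Averaging codes within a group is the simplest way to realize "forcing the similarity of the sparse representations"; the paper formulates this as a constraint inside the sparse coding objective rather than as a post-processing step.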
ISSN: 1057-7149 (print); 1941-0042 (online)
DOI: 10.1109/TIP.2018.2855418