Pan-Sharpening With Color-Aware Perceptual Loss And Guided Re-Colorization

Bibliographic Details
Published in: 2020 IEEE International Conference on Image Processing (ICIP), pp. 908-912
Main Authors: Bello, Juan Luis Gonzalez; Seo, Soomin; Kim, Munchurl
Format: Conference Proceeding
Language: English
Published: IEEE, 01-10-2020
Description
Summary: In remote sensing, "pan-sharpening" is the task of enhancing the spatial resolution of a multi-spectral (MS) image by exploiting the high-frequency information in a panchromatic (PAN) reference image. We present a novel color-aware perceptual (CAP) loss for learning the task of pan-sharpening. Our CAP loss is designed to focus on the deep features of a pre-trained VGG network that are more sensitive to spatial details and ignore color information, which allows the network to extract structural information from the PAN image while keeping the color from the lower-resolution MS image. Additionally, we propose "guided re-colorization", which generates a pan-sharpened image with real colors from the MS input by "picking" the closest MS pixel color for each pan-sharpened pixel, as a human operator would do in manual colorization. Such a re-colorized (RC) image is completely aligned with the pan-sharpened (PS) network output and can be used as a self-supervision signal during training, or to enhance the colors in the PS image at test time. We present several experiments in which our network, trained with the CAP loss, generates natural-looking pan-sharpened images with fewer artifacts and outperforms state-of-the-art methods on the WorldView3 dataset in terms of the ERGAS, SCC, and QNR metrics.
ISSN: 2381-8549
DOI: 10.1109/ICIP40778.2020.9190785
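
Note: The summary above describes two techniques, the CAP loss and guided re-colorization. The sketch below illustrates both under stated assumptions rather than reproducing the paper's exact method: it uses PyTorch and torchvision's pretrained VGG-19, and the chosen feature layers, the reduction of the pan-sharpened bands to a single luminance-like channel, the L1 feature distance, and the 3x3 re-colorization search window are all illustrative choices.

import torch
import torch.nn.functional as F
from torchvision.models import vgg19

# Assumed layer indices (relu2_2, relu3_4, relu4_4 in torchvision's vgg19.features).
# The paper selects VGG features that respond to spatial detail rather than color;
# the exact layers and weights used there are not reproduced here.
FEATURE_LAYERS = (8, 17, 26)


class ColorAwarePerceptualLoss(torch.nn.Module):
    """Sketch of a color-aware perceptual (CAP) loss.

    Compares deep VGG features of the pan-sharpened output against the PAN
    reference, so structure is supervised by the PAN image while color
    supervision comes from the low-resolution MS input (a separate term,
    not shown here).
    """

    def __init__(self):
        super().__init__()
        vgg = vgg19(pretrained=True).features.eval()
        for p in vgg.parameters():
            p.requires_grad_(False)
        self.vgg = vgg

    def _features(self, x):
        # VGG expects 3-channel input; replicate single-channel (PAN / luminance) images.
        if x.shape[1] == 1:
            x = x.repeat(1, 3, 1, 1)
        feats = []
        for i, layer in enumerate(self.vgg):
            x = layer(x)
            if i in FEATURE_LAYERS:
                feats.append(x)
        return feats

    def forward(self, pan_sharpened, pan):
        # Reduce the pan-sharpened MS bands to one luminance-like channel so the
        # comparison emphasizes structure, not color (an assumption of this sketch).
        ps_gray = pan_sharpened.mean(dim=1, keepdim=True)
        return sum(F.l1_loss(f_ps, f_pan)
                   for f_ps, f_pan in zip(self._features(ps_gray), self._features(pan)))


def guided_recolorization(ps, ms_up, window=3):
    """Sketch of guided re-colorization.

    For every pan-sharpened (PS) pixel, pick the color of the closest MS pixel
    (smallest spectral distance) within a local window of the upsampled MS image,
    yielding a re-colorized (RC) image spatially aligned with the PS output.
    The window size and distance metric are assumptions of this sketch.
    """
    b, c, h, w = ps.shape
    pad = window // 2
    # Candidate MS colors in a (window x window) neighborhood of each pixel.
    cand = F.unfold(ms_up, kernel_size=window, padding=pad)   # (B, C*K, H*W)
    cand = cand.view(b, c, window * window, h * w)            # (B, C, K, N)
    ps_flat = ps.view(b, c, 1, h * w)                         # (B, C, 1, N)
    # Pick, per pixel, the candidate whose color is closest to the PS color.
    dist = ((cand - ps_flat) ** 2).sum(dim=1)                 # (B, K, N)
    idx = dist.argmin(dim=1).view(b, 1, 1, h * w).expand(b, c, 1, h * w)
    return torch.gather(cand, 2, idx).view(b, c, h, w)

In this reading, the RC image produced by guided_recolorization could serve either as an additional training target for the PS output or as a test-time color enhancement, matching the two uses described in the summary.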