Deep gradient prior network for DEM super-resolution: Transfer learning from image to DEM

Bibliographic Details
Published in: ISPRS Journal of Photogrammetry and Remote Sensing, Vol. 150, pp. 80-90
Main Authors: Xu, Zekai; Chen, Zixuan; Yi, Weiwei; Gui, Qiuling; Hou, Wenguang; Ding, Mingyue
Format: Journal Article
Language: English
Published: Elsevier B.V., 01-04-2019
Summary: Digital elevation model (DEM) super-resolution (SR) aims to increase the spatial resolution of a DEM through data processing rather than through higher-accuracy sensors. Inspired by the success of convolutional neural networks (CNNs) in image SR, this study introduces the CNN into DEM SR. However, directly training a robust network for DEM SR remains a challenge because sufficient high-resolution DEM samples are difficult to obtain. We therefore propose a novel method with two measures to address this issue. The first is to design a deep CNN that acquires gradient prior knowledge; based on this prior, high-resolution gradient maps of the studied DEM can be estimated. The second is to introduce transfer learning, which applies knowledge learned from natural images to the DEM SR problem: the CNN is pretrained on the gradients of numerous high-resolution natural images and then fine-tuned on gradient maps derived from DEM training data. Finally, the high-resolution DEM is reconstructed under the constraints of the estimated gradient maps and the original low-resolution DEM. Experiments indicate that the proposed framework is superior to current state-of-the-art methods.
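The abstract outlines a three-stage pipeline: compute gradient maps, super-resolve them with a CNN pretrained on natural-image gradients and fine-tuned on DEM gradients, then reconstruct the high-resolution DEM under gradient and low-resolution data constraints. The PyTorch sketch below illustrates that overall idea only; the network GradientSRNet (its depth, width, and residual design), the loss weight lam, and the iterative reconstruction solver are illustrative assumptions, not the authors' published design.

    # Minimal sketch of the pipeline described in the abstract.
    # Architecture, hyperparameters, and the solver are assumptions.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def gradient_maps(dem: torch.Tensor) -> torch.Tensor:
        """Finite-difference gradients (dx, dy) of an (N, 1, H, W) height field."""
        dx = dem[..., :, 1:] - dem[..., :, :-1]
        dy = dem[..., 1:, :] - dem[..., :-1, :]
        dx = F.pad(dx, (0, 1, 0, 0))  # pad back to H x W
        dy = F.pad(dy, (0, 0, 0, 1))
        return torch.cat([dx, dy], dim=1)  # (N, 2, H, W)

    class GradientSRNet(nn.Module):
        """Small CNN mapping upsampled LR gradient maps to HR gradient maps
        (a hypothetical stand-in for the paper's deep gradient prior network)."""
        def __init__(self, channels: int = 2, features: int = 64, depth: int = 8):
            super().__init__()
            layers = [nn.Conv2d(channels, features, 3, padding=1), nn.ReLU(inplace=True)]
            for _ in range(depth - 2):
                layers += [nn.Conv2d(features, features, 3, padding=1), nn.ReLU(inplace=True)]
            layers += [nn.Conv2d(features, channels, 3, padding=1)]
            self.body = nn.Sequential(*layers)

        def forward(self, x):
            # Residual design: predict only the missing high-frequency detail.
            return x + self.body(x)

    def reconstruct(dem_lr, grad_hr, scale=4, iters=200, lam=1.0, step=0.1):
        """Recover an HR DEM so that (a) downsampling it matches the LR DEM and
        (b) its gradients match the CNN-estimated HR gradient maps."""
        dem_hr = F.interpolate(dem_lr, scale_factor=scale, mode="bicubic",
                               align_corners=False).clone().requires_grad_(True)
        opt = torch.optim.Adam([dem_hr], lr=step)
        for _ in range(iters):
            opt.zero_grad()
            data_term = F.mse_loss(F.avg_pool2d(dem_hr, scale), dem_lr)  # LR fidelity
            grad_term = F.mse_loss(gradient_maps(dem_hr), grad_hr)       # gradient fidelity
            (data_term + lam * grad_term).backward()
            opt.step()
        return dem_hr.detach()

In this framing, pretraining would fit GradientSRNet on LR/HR gradient pairs from natural images, and fine-tuning would repeat the same training loop on DEM gradient pairs before calling reconstruct; the residual connection reflects a common SR design choice rather than a detail confirmed by the abstract.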
ISSN: 0924-2716; 1872-8235
DOI: 10.1016/j.isprsjprs.2019.02.008