URBAN CHANGE DETECTION BASED ON SEMANTIC SEGMENTATION AND FULLY CONVOLUTIONAL LSTM NETWORKS


Bibliographic Details
Published in: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol. V-2-2020, pp. 541-547
Main Authors: Papadomanolaki, M., Vakalopoulou, M., Karantzalos, K.
Format: Journal Article
Language: English
Published: Copernicus GmbH (Copernicus Publications), Göttingen, 03-08-2020
Description
Summary: Change detection is an important problem for the remote sensing community. Among the many approaches proposed in recent years, deep learning provides methods and tools that achieve state-of-the-art performance. In this paper, we tackle the problem of urban change detection by constructing a fully convolutional multi-task deep architecture. We present a framework based on the UNet model, with fully convolutional LSTM blocks integrated on top of every encoding level, thereby capturing the temporal dynamics of the spatial feature representations at each resolution. The proposed network is modular: shared encoder weights allow it to exploit multiple (more than two) dates simultaneously. Moreover, our framework produces building segmentation maps through a multi-task scheme that extracts additional feature attributes, which can reduce the number of false-positive pixels. We performed extensive experiments comparing our method with other state-of-the-art approaches on very high resolution images of urban areas. Quantitative and qualitative results reveal the great potential of the proposed scheme, with the F1 score outperforming the compared methods by almost 2.2%.
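
The record contains no code, but the architecture the summary describes (a UNet encoder with shared weights applied to every date, a fully convolutional LSTM on top of each encoding level, and a multi-task decoder with separate change-detection and building-segmentation heads) can be sketched. Below is a minimal, illustrative PyTorch version; the class names (ConvLSTMCell, MultiTemporalUNet), the depth, the channel widths, and the two-class heads are assumptions made for illustration, not the authors' implementation.

import torch
import torch.nn as nn

class ConvLSTMCell(nn.Module):
    # A single fully convolutional LSTM cell: all four gates are produced
    # by one convolution over the concatenated input and hidden state.
    def __init__(self, in_ch, hid_ch, k=3):
        super().__init__()
        self.gates = nn.Conv2d(in_ch + hid_ch, 4 * hid_ch, k, padding=k // 2)

    def forward(self, x, state):
        h, c = state
        i, f, o, g = torch.chunk(self.gates(torch.cat([x, h], dim=1)), 4, dim=1)
        c = f.sigmoid() * c + i.sigmoid() * g.tanh()
        h = o.sigmoid() * c.tanh()
        return h, c

class MultiTemporalUNet(nn.Module):
    # Shared-weight encoder applied to each date; one ConvLSTM per encoding
    # level accumulates temporal dynamics; a UNet-style decoder with skip
    # connections feeds two heads (change detection and building segmentation).
    def __init__(self, in_ch=3, base=16):
        super().__init__()
        chs = [base, 2 * base, 4 * base]
        self.encoders, self.lstms = nn.ModuleList(), nn.ModuleList()
        prev = in_ch
        for ch in chs:
            self.encoders.append(nn.Sequential(
                nn.Conv2d(prev, ch, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(inplace=True)))
            self.lstms.append(ConvLSTMCell(ch, ch))
            prev = ch
        self.pool = nn.MaxPool2d(2)
        self.up2 = nn.ConvTranspose2d(chs[2], chs[1], 2, stride=2)
        self.dec2 = nn.Sequential(
            nn.Conv2d(2 * chs[1], chs[1], 3, padding=1), nn.ReLU(inplace=True))
        self.up1 = nn.ConvTranspose2d(chs[1], chs[0], 2, stride=2)
        self.dec1 = nn.Sequential(
            nn.Conv2d(2 * chs[0], chs[0], 3, padding=1), nn.ReLU(inplace=True))
        self.change_head = nn.Conv2d(chs[0], 2, 1)    # change / no change
        self.building_head = nn.Conv2d(chs[0], 2, 1)  # building / background

    def forward(self, x):  # x: (batch, time, channels, height, width)
        states = None
        for t in range(x.shape[1]):        # same encoder weights for every date
            feats, inp = [], x[:, t]
            for enc in self.encoders:
                f = enc(inp)
                feats.append(f)
                inp = self.pool(f)
            if states is None:             # zero-initialise (h, c) at each level
                states = [(torch.zeros_like(f), torch.zeros_like(f)) for f in feats]
            states = [lstm(f, s) for lstm, f, s in zip(self.lstms, feats, states)]
        h0, h1, h2 = (s[0] for s in states)  # final hidden state per level
        d2 = self.dec2(torch.cat([self.up2(h2), h1], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), h0], dim=1))
        return self.change_head(d1), self.building_head(d1)

# Usage: two sequences of three RGB dates, 64x64 patches (H, W divisible by 4).
model = MultiTemporalUNet()
change, buildings = model(torch.randn(2, 3, 3, 64, 64))
print(change.shape, buildings.shape)  # both torch.Size([2, 2, 64, 64])

Because the LSTM state is carried across dates rather than across pairs, the same model accepts sequences of any length, which is how shared weights make more than two dates usable at once.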
ISSN: 2194-9050, 2194-9042
DOI: 10.5194/isprs-annals-V-2-2020-541-2020