DeepCrack: Learning Hierarchical Convolutional Features for Crack Detection

Bibliographic Details
Published in: IEEE Transactions on Image Processing, Vol. 28, No. 3, pp. 1498-1512
Main Authors: Zou, Qin, Zhang, Zheng, Li, Qingquan, Qi, Xianbiao, Wang, Qian, Wang, Song
Format: Journal Article
Language: English
Published: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), United States, 01-03-2019
Description
Summary: Cracks are typical line structures that are of interest in many computer-vision applications. In practice, many cracks, e.g., pavement cracks, show poor continuity and low contrast, which brings great challenges to image-based crack detection using low-level features. In this paper, we propose DeepCrack, an end-to-end trainable deep convolutional neural network for automatic crack detection that learns high-level features for crack representation. In this method, multi-scale deep convolutional features learned at hierarchical convolutional stages are fused together to capture the line structures: larger-scale feature maps yield more detailed representations, while smaller-scale feature maps yield more holistic representations. We build the DeepCrack net on the encoder-decoder architecture of SegNet and pairwise-fuse the convolutional features generated in the encoder network and in the decoder network at the same scale. We train DeepCrack net on one crack dataset and evaluate it on three others. The experimental results demonstrate that DeepCrack achieves an F-measure above 0.87 on average across the three challenging datasets and outperforms current state-of-the-art methods.
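The scale-wise fusion described in the summary can be pictured with a small sketch. The following PyTorch snippet is a hypothetical illustration, not the authors' released implementation; the channel counts, the 1x1-convolution fusion, and the final sigmoid fusion layer are assumptions made only to show the idea of pairing encoder and decoder features at the same scale and then fusing the per-scale side outputs into one prediction map.

```python
# Hypothetical sketch of scale-wise encoder/decoder feature fusion.
# NOT the authors' code; channel sizes and fusion layers are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ScalePairFusion(nn.Module):
    """Fuse an encoder and a decoder feature map of the same spatial scale
    into a single-channel crack-evidence map."""

    def __init__(self, enc_channels: int, dec_channels: int):
        super().__init__()
        # 1x1 convolution reduces the concatenated features to one channel.
        self.reduce = nn.Conv2d(enc_channels + dec_channels, 1, kernel_size=1)

    def forward(self, enc_feat: torch.Tensor, dec_feat: torch.Tensor) -> torch.Tensor:
        return self.reduce(torch.cat([enc_feat, dec_feat], dim=1))


class MultiScaleFusion(nn.Module):
    """Upsample the per-scale maps to the input resolution and fuse them
    into the final crack prediction."""

    def __init__(self, num_scales: int):
        super().__init__()
        self.fuse = nn.Conv2d(num_scales, 1, kernel_size=1)

    def forward(self, side_maps, out_size):
        upsampled = [
            F.interpolate(m, size=out_size, mode="bilinear", align_corners=False)
            for m in side_maps
        ]
        return torch.sigmoid(self.fuse(torch.cat(upsampled, dim=1)))


if __name__ == "__main__":
    # Toy example with two scales and made-up channel counts.
    pair1 = ScalePairFusion(64, 64)
    pair2 = ScalePairFusion(128, 128)
    fuse = MultiScaleFusion(num_scales=2)

    enc1, dec1 = torch.randn(1, 64, 256, 256), torch.randn(1, 64, 256, 256)
    enc2, dec2 = torch.randn(1, 128, 128, 128), torch.randn(1, 128, 128, 128)

    side_maps = [pair1(enc1, dec1), pair2(enc2, dec2)]
    crack_map = fuse(side_maps, out_size=(256, 256))
    print(crack_map.shape)  # torch.Size([1, 1, 256, 256])
```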
ISSN: 1057-7149, 1941-0042
DOI: 10.1109/TIP.2018.2878966