Improvement of Concrete Crack Segmentation Performance Using Stacking Ensemble Learning

Bibliographic Details
Published in: Applied Sciences, Vol. 13, No. 4, p. 2367
Main Authors: Lee, Taehee, Kim, Jung-Ho, Lee, Sung-Jin, Ryu, Seung-Ki, Joo, Bong-Chul
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01-02-2023
Description
Summary: Signs of functional loss due to the deterioration of structures are primarily identified from cracks on their surfaces, and continuous monitoring of structural cracks is essential for socially important structures. Recently, many structural crack monitoring technologies have been developed alongside advances in deep-learning artificial intelligence (AI). In this study, stacking ensemble learning was applied to predict structural cracks more precisely. A semantic segmentation model was primarily used for crack detection with a deep-learning AI model. Crack-detection performance was studied by training UNet, DeepLabV3, DeepLabV3+, DANet, and FCN-8s. Owing to the unsuitable crack-segmentation performance of FCN-8s, stacking ensemble learning was conducted with the remaining four models. Individual models yielded intersection over union (IoU) scores ranging from approximately 0.4 to 0.6 on the test dataset. However, the metamodel produced by stacking ensemble learning achieved an IoU score of 0.74, a substantial performance improvement. A total of 1235 test images were acquired with drones on a sea bridge, and the stacking ensemble model showed an IoU of 0.5 or higher for 64.4% of the images.
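As an illustrative sketch only (not code from the paper): the IoU metric reported in the abstract, and one common way per-pixel stacking is set up, can be expressed as below. The function names and the idea of flattening base-model probability maps into metamodel features are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def iou(pred: np.ndarray, target: np.ndarray) -> float:
    """Intersection over union between two binary crack masks."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    inter = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    # Define IoU as 1.0 when both masks are empty (no crack, none predicted).
    return float(inter / union) if union > 0 else 1.0

def stack_features(prob_maps):
    """Hypothetical stacking step: turn each base model's (H, W)
    probability map into one feature column, giving a per-pixel
    feature matrix a metamodel (e.g. a pixel-wise classifier)
    could be trained on."""
    return np.stack(prob_maps, axis=-1).reshape(-1, len(prob_maps))
```

For example, with four base segmentation models, `stack_features` yields an `(H*W, 4)` matrix per image; the metamodel then predicts a crack/no-crack label for each pixel row.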
ISSN: 2076-3417
DOI: 10.3390/app13042367