Classification and quantification of cracks in concrete structures using deep learning image-based techniques

Bibliographic Details
Published in: Cement and Concrete Composites, Vol. 114, Article 103781
Main Authors: Flah, Majdi; Suleiman, Ahmed R.; Nehdi, Moncef L.
Format: Journal Article
Language: English
Published: Elsevier Ltd, 01-11-2020
Description
Summary: Visual inspection has been the most widely used technique for monitoring concrete structures in service. Inspectors visually evaluate defects based on experience, skill, and engineering judgment. However, this process is subjective, laborious, time-consuming, and hampered by the difficulty of accessing numerous parts of complex structures. Accordingly, the present study proposes a nearly automated inspection model based on image processing and deep learning for detecting defects in typically inaccessible areas of concrete structures. Results indicate that a Keras classifier combined with Otsu image processing achieved classification accuracies of 97.63%, 96.5%, and 96.17% on the training, validation, and testing data, respectively, along with low quantification errors of 1.5%, 5%, and 2% for crack length, width, and angle of orientation, respectively. The type of structural damage and its severity are identified by comparing the measured crack width against the allowable ranges specified for different structures, including buildings and bridges, in various international standards and codes. The proposed method can be deployed with unmanned aerial vehicle image acquisition to offer a nearly automated inspection platform for the colossal backlog of aging concrete structures.
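The abstract names the two building blocks, a Keras classifier for crack detection and Otsu thresholding for crack quantification, without implementation details. The sketches below illustrate one plausible reading of such a pipeline; the architecture, patch size, and all function and variable names are illustrative assumptions, not the authors' code.

```python
# A minimal binary crack/no-crack patch classifier in Keras, sketched under
# the assumption of 224x224 RGB patches; the paper's actual architecture and
# hyperparameters are not given in this record.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_classifier(input_shape=(224, 224, 3)) -> tf.keras.Model:
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Rescaling(1.0 / 255),            # normalize pixel values
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.GlobalAveragePooling2D(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),  # crack probability
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model
```

Patches the classifier flags as cracked could then pass to a quantification stage. One simple way to recover length, mean width, and orientation from an Otsu-segmented mask, again as an assumed sketch rather than the authors' method:

```python
# Otsu segmentation plus rough crack geometry with OpenCV. The conversion
# factor mm_per_px stands in for a calibrated camera scale (hypothetical).
import cv2
import numpy as np

def quantify_crack(gray: np.ndarray, mm_per_px: float = 1.0) -> dict:
    # Otsu's method picks the global threshold that minimizes intra-class
    # variance; cracks appear darker than concrete, hence THRESH_BINARY_INV.
    _, mask = cv2.threshold(gray, 0, 255,
                            cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)

    # Keep only the largest connected component as the crack region.
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    if n < 2:
        return {}  # nothing segmented
    crack_id = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))
    crack = (labels == crack_id).astype(np.uint8)
    pts = cv2.findNonZero(crack)

    # Length: major axis of the minimum-area bounding rectangle (a
    # skeleton-based measure would track a curved crack more faithfully).
    (_, _), (w, h), _ = cv2.minAreaRect(pts)
    length_px = max(w, h)

    # Mean width: segmented crack area divided by its length.
    width_px = float(crack.sum()) / length_px if length_px else 0.0

    # Orientation: angle of the least-squares line through the crack pixels.
    vx, vy, _, _ = cv2.fitLine(pts, cv2.DIST_L2, 0, 0.01, 0.01)
    angle_deg = np.degrees(np.arctan2(vy.item(), vx.item())) % 180.0

    return {"length_mm": length_px * mm_per_px,
            "width_mm": width_px * mm_per_px,
            "angle_deg": angle_deg}
```

With a known camera scale, the measured width in millimetres can then be checked against the allowable crack-width ranges that codes specify for buildings and bridges, which is how the abstract describes damage type and severity being assigned.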
ISSN: 0958-9465, 1873-393X
DOI: 10.1016/j.cemconcomp.2020.103781