Open Access Research Article

Crack Detection in Environments with Complex Backgrounds Using Deep Convolution Neural Nets

Mohamed Elmorsy*1,2, Hazem Elanwar1, Ahmed Elbanna3 and Sherif A Mourad1

1Department of Structural Engineering, Faculty of Engineering, Cairo University, Egypt

2Department of Civil Engineering, McMaster University, Canada

3Department of Civil and Environmental Engineering, University of Illinois at Urbana-Champaign, USA

*Corresponding author: Mohamed Elmorsy

Received Date: March 30, 2022; Published Date: April 05, 2022

Abstract

Structural health monitoring has become one of the most important fields of research in the engineering community for mitigating the risk of human and economic losses due to structural damage or deterioration. In recent years, with advancements in computational power, computer vision techniques have been used effectively for structural monitoring. The widespread availability of consumer-grade and smartphone cameras makes computer vision a superior data collection tool, especially in large-scale events such as earthquakes. While deep convolution neural networks are promising, robust techniques in computer vision and machine learning, one of their challenges is acquiring a reliable and sufficiently large dataset for training, which significantly affects the accuracy of the results. In this paper, a deep convolution neural network for crack detection is developed, and its robustness is assessed on images of environments with complex backgrounds. A new approach is adopted to overcome the limited training dataset by creating synthetic images of cracked beams using the extended finite element method. The proposed approach effectively increased training and validation accuracies and achieved an average testing accuracy of 98.2%. The network is combined with a reporting system that uses a sliding window technique so that detection is independent of the overall image size.
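The sliding-window reporting step mentioned in the abstract can be sketched as follows. This is a minimal illustration only: the window size, stride, and the toy patch classifier are assumptions for demonstration, not the authors' trained network or exact parameters.

```python
import numpy as np

def sliding_window_scan(image, classify, window=64, stride=32):
    """Scan a grayscale image of arbitrary size with a fixed-size window.

    `classify` maps a (window, window) patch to True (crack) / False,
    so the overall image size no longer constrains the detector input.
    Returns the top-left (row, col) of every window flagged as cracked.
    """
    h, w = image.shape[:2]
    hits = []
    for y in range(0, h - window + 1, stride):
        for x in range(0, w - window + 1, stride):
            patch = image[y:y + window, x:x + window]
            if classify(patch):
                hits.append((y, x))
    return hits

# Toy example: a dark horizontal band stands in for a crack, and the
# "classifier" simply flags any patch containing near-black pixels.
img = np.full((128, 128), 255, dtype=np.uint8)
img[40:60, :] = 0
found = sliding_window_scan(img, lambda p: p.min() < 50)
```

In the paper's pipeline the `classify` callback would be the trained convolution network's per-patch prediction; the scan then aggregates patch-level decisions into a report over the whole image.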

Keywords: Crack detection; Structural health monitoring; Deep convolution neural networks
