Structural Damage Assessment Using Augmented Reality
Abstract
- Civil infrastructure worldwide is aging and enduring increasingly adverse weather conditions. Traditional structural health monitoring (SHM) relies on the expensive and time-consuming installation of contact sensors. Inspectors, for example, use costly large-scale equipment to reach different areas and heights of a structure, which can pose a risk to their safety. Moreover, because traditional SHM offers little support for real-time visualization during inspection, inspectors rely only on the batch data acquired during the inspection period, which engineers analyze at a later time. To address these challenges, this paper proposes an Augmented Reality (AR)-based methodology for automated multiclass damage identification and quantification. The interactive visualization framework of AR is integrated with the autonomous decision-making of Artificial Intelligence (AI) in a unified fashion to incorporate human-sensor interaction. The proposed system uses an AI model, trained and optimized on the YOLOv5 architecture, to detect and classify four types of anomalies/damage (i.e., cracks, spalls, pitting, and joints). The model is then extended with segmentation to quantify the length, area, and perimeter of detected damage and thereby assess its severity. Once developed, the model is deployed on the AR device and tested through its interactive environment for SHM of various structures. The paper concludes that the proposed approach classifies the four damage types with an accuracy of more than 90% at distances of up to 2 m and quantifies length, area, and perimeter with less than 2% error.
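The segmentation-based quantification step summarized above can be sketched in a minimal form: given a binary damage mask (such as one produced by a YOLOv5 segmentation head) and a pixel-to-millimetre scale, geometric measures follow from simple pixel counting. The function name, the `mm_per_px` parameter, and the boundary-counting approach below are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def quantify_damage(mask: np.ndarray, mm_per_px: float):
    """Estimate area, perimeter, and length of a damage region from a
    binary segmentation mask (1 = damage pixel).

    mm_per_px is an assumed ground-sampling scale; in an AR setting it
    would be derived from the device's distance/depth estimate.
    """
    ys, xs = np.nonzero(mask)
    if ys.size == 0:                     # no damage detected
        return 0.0, 0.0, 0.0

    # Area: damage-pixel count scaled to physical units.
    area_mm2 = float(mask.sum()) * mm_per_px ** 2

    # Perimeter: damage pixels with at least one background 4-neighbour
    # (a first-order boundary approximation).
    padded = np.pad(mask, 1).astype(bool)
    core = padded[1:-1, 1:-1]
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1]
                & padded[1:-1, :-2] & padded[1:-1, 2:])
    boundary = core & ~interior
    perimeter_mm = float(boundary.sum()) * mm_per_px

    # Length: longest side of the axis-aligned bounding box.
    length_mm = (max(np.ptp(ys), np.ptp(xs)) + 1) * mm_per_px
    return area_mm2, perimeter_mm, length_mm
```

For example, a 10 × 20 px rectangular spall at a scale of 1 mm/px yields an area of 200 mm², a boundary of 56 px, and a length of 20 mm. A production system would typically refine this with contour-based measures (e.g., polygonal perimeter approximation) rather than raw pixel counts.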