AN APPROACH BASED ON DEEP LEARNING AND MULTISENSORY DATA FUSION TO DETECT AND CLASSIFY DEFECTS IN VESSEL HULLS
Keywords: biofouling; robotization; non-destructive tests; rotating brush; ship hull cleaning; Taguchi method
Inspecting vessel hulls requires using more than one non-destructive inspection technique to detect different defect types. Depending on the age and size of the inspected asset, a massive volume of inspection data can be generated, making the analysis laborious and susceptible to errors due to human fatigue.
This work presents an approach to speed up data analysis and reduce human error through multisensory data fusion for detecting and classifying defects in vessel hulls. To validate the developed approach, an experiment was conducted in a controlled environment in which a test plate was inspected by three systems of different physical natures: camera, eddy current, and ultrasound.
The data collected from this experiment are used to develop three classifiers, one per technique, two of which are based on artificial neural networks given the complexity of the data. The classifiers are then fused at the decision level, generating an easy-to-interpret colored defect segmentation map in which each color represents a specific defect type. Finally, a technical report model is also proposed, containing information on each detection, such as type, location, and area, which can be helpful for structural health monitoring purposes.
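As a rough illustration of the pipeline described above, the sketch below fuses three per-pixel class predictions at the decision level via majority vote, colors the fused map, and emits report entries with type, location, and area. The defect taxonomy, color assignments, tie-breaking rule, and pixel area are hypothetical placeholders, not taken from the paper.

```python
import numpy as np

# Hypothetical defect classes and colors; the paper's actual taxonomy is not
# given in the abstract. Class 0 denotes sound (defect-free) material.
CLASSES = ["sound", "corrosion", "crack", "coating_loss"]
COLORS = {0: (0, 0, 0), 1: (255, 0, 0), 2: (0, 255, 0), 3: (0, 0, 255)}

def fuse_decisions(pred_camera, pred_eddy, pred_ultrasound):
    """Decision-level fusion by per-pixel majority vote over three classifiers.

    Each input is an integer class map of shape (H, W). Ties (no class with
    at least two votes) fall back to the camera prediction -- an illustrative
    choice, not necessarily the paper's rule.
    """
    stacked = np.stack([pred_camera, pred_eddy, pred_ultrasound])  # (3, H, W)
    h, w = pred_camera.shape
    fused = np.empty((h, w), dtype=np.int64)
    for i in range(h):
        for j in range(w):
            votes = np.bincount(stacked[:, i, j], minlength=len(CLASSES))
            best = int(votes.argmax())
            fused[i, j] = best if votes[best] >= 2 else pred_camera[i, j]
    return fused

def colorize(fused):
    """Render the fused class map as an RGB defect segmentation image."""
    h, w = fused.shape
    rgb = np.zeros((h, w, 3), dtype=np.uint8)
    for cls, color in COLORS.items():
        rgb[fused == cls] = color
    return rgb

def report(fused, pixel_area_mm2=1.0):
    """Summarize each detected defect class: type, bounding box, and area."""
    entries = []
    for cls in range(1, len(CLASSES)):
        mask = fused == cls
        if not mask.any():
            continue
        ys, xs = np.nonzero(mask)
        entries.append({
            "type": CLASSES[cls],
            "location": (int(ys.min()), int(xs.min()), int(ys.max()), int(xs.max())),
            "area_mm2": float(mask.sum() * pixel_area_mm2),
        })
    return entries
```

In this sketch, a defect is reported only when at least two of the three sensing modalities agree, which is one simple way a decision-level fusion can mitigate single-sensor classification doubts.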
This work represents a step forward in the inspection technology of large structures by exploiting the complementarity of multisensory systems. Although the strategy is exemplified in a vessel inspection scenario, it applies to other systems that inspect large areas. The fusion of classifiers saves analysis time, reduces operational costs, and increases safety by mitigating doubts about the classification of defects.