MobileNetV2-based Transfer Learning Model with Edge Computing for Automatic Fabric Defect Detection
Abstract
In textile manufacturing, fabric defect detection is an essential quality control step and a challenging task. Traditionally, defects were detected manually during fabric production. Human fatigue, time consumption, and lapses in concentration are the main problems of manual defect detection. Machine vision systems based on deep learning play a vital role in the Industrial Internet of Things (IIoT) and fully automated production processes. Deep learning models based on Convolutional Neural Networks (CNNs) have been widely used for fabric defect detection, but most of these models require high computing resources. This work presents a lightweight MobileNetV2-based transfer learning model for an automatic visual inspection system with edge computing, supporting defect detection with low power consumption, low latency, easy upgradability, and high efficiency. First, different image transformation techniques were applied as data augmentation on four fabric datasets to improve the model's adaptability to various fabrics. Second, fine-tuning the hyperparameters of MobileNetV2 with transfer learning yields a lightweight, adaptable, and scalable model suited to resource-constrained edge devices. Finally, the trained model is deployed on an NVIDIA Jetson Nano developer kit edge device to speed up detection. We assessed the model in terms of accuracy, precision, recall, and F1-score. The numerical simulation reveals that the model achieves 96.52% accuracy, 96.52% precision, 96.75% recall, and a 96.52% F1-score.
Keyword(s)
Deep learning, Edge devices, Industrial IoT, Modeling, MobileNetV2
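Illustrative sketch
The abstract describes a pipeline of data augmentation, MobileNetV2 transfer learning, and deployment to an edge device. The following is a minimal sketch of how such a setup could look, assuming TensorFlow/Keras; the input size, augmentation choices, classifier head, class count, and hyperparameters are illustrative assumptions and are not taken from the paper.

import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 2  # assumption: defective vs. defect-free fabric

# Data augmentation: simple image transformations applied during training.
augment = tf.keras.Sequential([
    layers.RandomFlip("horizontal_and_vertical"),
    layers.RandomRotation(0.1),
    layers.RandomZoom(0.1),
])

# MobileNetV2 backbone pre-trained on ImageNet, with its original classifier removed.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False  # freeze the backbone for the transfer-learning phase

inputs = layers.Input(shape=(224, 224, 3))
x = augment(inputs)
x = tf.keras.applications.mobilenet_v2.preprocess_input(x)
x = base(x, training=False)
x = layers.GlobalAveragePooling2D()(x)
x = layers.Dropout(0.2)(x)
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)
model = models.Model(inputs, outputs)

model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # train_ds/val_ds are hypothetical datasets
# After training, the model can be exported (e.g. to TensorFlow Lite or ONNX/TensorRT)
# for low-latency inference on an edge device such as the NVIDIA Jetson Nano.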