Online real-time gender identification method for silkworm chrysalises based on Fast-SCNN-mp model
Abstract
China is the birthplace of the global sericulture industry and holds a core leading position in worldwide silk production. The accuracy of gender identification of silkworm pupae directly determines breeding efficiency and quality. Currently, gender identification relies mainly on manual observation of the gonad characteristics at the tail of the pupa, which must be completed within one week after pupation. As labor shortages intensify, manual identification can no longer meet the needs of large-scale industrial development. Machine vision technology has become a research hotspot in automatic silkworm pupa identification owing to its low cost, easy integration, and suitability for online detection. However, existing studies build their models on ideal pupa images with intact gonads, overlooking the gonad feature defects caused by practical working conditions in online detection, such as deviations in pupa placement angle and pupa curling, which severely limits identification accuracy. To address this problem, this study proposed Fast-SCNN-mp, an improved lightweight real-time semantic segmentation model. Starting from the basic Fast-SCNN model, performance was improved through multi-dimensional optimization: a multi-scale convolutional attention module was introduced in the feature extraction stage to strengthen discriminative focus on gonadal regions; cascaded depthwise separable convolutions and a bottleneck residual module were adopted to achieve efficient feature compression and enhancement; and a pyramid pooling module was integrated in the feature aggregation stage to fuse multi-scale contextual information and improve feature representation. Experiments were conducted on a gonad-defective dataset of 875 images covering 5 silkworm pupa varieties and the full roll angle range: 0~18°, >18°~45°, >45°~72°, and >72°~90°.
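The efficiency ingredient named above, the depthwise separable convolution, factors a standard convolution into a per-channel (depthwise) filtering step followed by a 1x1 (pointwise) channel-mixing step. The sketch below is a minimal pure-Python illustration of that factorization (stride 1, no padding); it is a conceptual example, not the paper's Fast-SCNN-mp implementation:

```python
def depthwise_separable_conv(x, dw_kernels, pw_kernels):
    """Depthwise separable convolution (stride 1, no padding).

    x          -- input feature map: C_in channels, each an H x W list of lists
    dw_kernels -- one k x k depthwise filter per input channel (length C_in)
    pw_kernels -- pointwise 1x1 filters: C_out rows of C_in channel weights
    Returns C_out output channels, each (H-k+1) x (W-k+1).
    """
    c_in = len(x)
    h, w = len(x[0]), len(x[0][0])
    k = len(dw_kernels[0])
    oh, ow = h - k + 1, w - k + 1

    # Depthwise stage: filter each input channel independently
    # (C_in * k * k multiply-accumulates per output position).
    dw = [[[sum(x[c][i + u][j + v] * dw_kernels[c][u][v]
                for u in range(k) for v in range(k))
            for j in range(ow)]
           for i in range(oh)]
          for c in range(c_in)]

    # Pointwise stage: a 1x1 convolution mixes the filtered channels
    # (C_in * C_out multiply-accumulates per output position).
    return [[[sum(pw_kernels[o][c] * dw[c][i][j] for c in range(c_in))
              for j in range(ow)]
             for i in range(oh)]
            for o in range(len(pw_kernels))]
```

A standard k x k convolution costs C_in * C_out * k^2 multiply-accumulates per output position; the separable form costs only C_in * k^2 + C_in * C_out, which is the main source of Fast-SCNN's small parameter count and high inference speed.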
The results showed that the precision, recall, F1-score, and accuracy of the Fast-SCNN-mp model reached 98.57%, 98.65%, 98.61%, and 98.61%, respectively, which were 2.79, 2.73, 2.76, and 2.79 percentage points higher than the corresponding indicators of the basic Fast-SCNN model. Comparative experiments with 2 traditional classification algorithms and 5 mainstream semantic segmentation algorithms further verified the model's advantages. On the subset with a roll angle of >72°~90°, Fast-SCNN-mp achieved an accuracy of 96.3%, comparable to that of Mask2Former, the state-of-the-art mainstream semantic segmentation algorithm, whereas the best traditional classification algorithm, the Compact Convolutional Transformer (CCT), reached only 81.48%. In terms of model size and inference speed, Fast-SCNN-mp had only 2.17 M parameters, the fewest among all models, while achieving an inference speed of 68.10 FPS, outperforming all comparative algorithms and running 33.54 times faster than the top-accuracy model Mask2Former. In conclusion, while maintaining high discrimination accuracy, the Fast-SCNN-mp model realizes a lightweight design and high inference efficiency, effectively balancing discrimination performance against real-time requirements. It provides an efficient and reliable technical solution for online intelligent gender discrimination of silkworm pupae, and offers a valuable reference for model optimization and deployment in real-time classification tasks in agriculture.
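For reference, the four reported evaluation metrics follow directly from binary (male/female) confusion-matrix counts. The sketch below shows the standard definitions; the counts in the comments are hypothetical, not the paper's data:

```python
def binary_metrics(tp, fp, fn, tn):
    """Precision, recall, F1-score, and accuracy from binary
    confusion-matrix counts, e.g. tp = male pupae correctly
    identified as male (counts here are illustrative only)."""
    precision = tp / (tp + fp)          # of all "male" predictions, how many were male
    recall = tp / (tp + fn)             # of all true males, how many were found
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
    accuracy = (tp + tn) / (tp + fp + fn + tn)          # overall hit rate
    return precision, recall, f1, accuracy

# Hypothetical example: 8 true positives, 2 false positives,
# 2 false negatives, 8 true negatives -> all four metrics are 0.8.
p, r, f, a = binary_metrics(8, 2, 2, 8)
```

The F1-score is the harmonic mean of precision and recall, so the near-identical values of the four metrics reported above indicate a classifier whose errors are balanced across the two genders.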