Acta Entomologica Sinica ›› 2021, Vol. 64 ›› Issue (12): 1444-1454. doi: 10.16380/j.kcxb.2021.12.010

• RESEARCH PAPERS •

An automatic identification and counting method of Spodoptera frugiperda (Lepidoptera: Noctuidae) adults based on sex pheromone trapping and deep learning

QIU Rong-Zhou1, ZHAO Jian2, HE Yu-Xian1, CHEN Shao-Ping1, HUANG Mei-Ling3, CHI Mei-Xiang1, LIANG Yong2, WENG Qi-Yong1,*   

  1. Fujian Key Laboratory for Monitoring and Integrated Management of Crop Pests, Institute of Plant Protection, Fujian Academy of Agricultural Sciences, Fuzhou 350013, China; 2. Institute of Digital Agriculture, Fujian Academy of Agricultural Sciences, Fuzhou 350003, China; 3. Plant Protection Station of Changtai District of Zhangzhou City, Zhangzhou, Fujian 363900, China
  • Online: 2021-12-20  Published: 2021-11-26

Abstract: 【Aim】 To explore the feasibility of using deep learning for the automatic recognition and counting of Spodoptera frugiperda adults, and to evaluate the recognition and counting precision of the resulting models, so as to provide an image-based recognition and counting method for intelligent pest monitoring equipment. 【Methods】 A self-designed pest image monitoring device based on sex pheromone attraction was used to automatically and periodically collect images of trapped S. frugiperda adults. Combined with images of S. frugiperda adults collected on sticky coloured cards in ship-shaped traps, a dataset was constructed. The YOLOv5 deep learning object detection model was used for feature learning. Four models, Yolov5s-A1, Yolov5s-A2, Yolov5s-AB and Yolov5s-ABC, were obtained by training on different image datasets: the original images of S. frugiperda adults; the images with incomplete objects at image edges removed; the images with a morphologically similar species (S. litura adults) added; and the images with negative samples (containing no detection objects) added. The detection results of test samples under different occlusion gradients were compared across models. Precision (P), recall (R), F1-measure, average precision (AP) and counting accuracy (CA) were used to evaluate the models. 【Results】 Yolov5s-A1, trained on the original image set, achieved a recognition precision of 87.37%, a recall of 90.24% and an F1-measure of 88.78%. Yolov5s-A2, trained on images with incomplete edge objects removed, achieved a precision of 93.15%, a recall of 84.77% and an F1-measure of 88.76%. Yolov5s-AB, trained with S. litura adult samples added, achieved a precision of 96.23%, a recall of 91.85% and an F1-measure of 93.99%. Yolov5s-ABC, trained with negative samples added, achieved a precision of 94.76%, a recall of 88.23% and an F1-measure of 91.38%.
In average precision, the four models ranked from high to low as Yolov5s-AB > Yolov5s-ABC > Yolov5s-A2 > Yolov5s-A1, with Yolov5s-AB and Yolov5s-ABC performing similarly. The counting accuracy of the four models followed the same order: Yolov5s-AB > Yolov5s-ABC > Yolov5s-A2 > Yolov5s-A1. 【Conclusion】 The method developed in this study is applicable to the recognition and counting of S. frugiperda adults both on the pest image monitoring equipment and on sticky coloured cards with traps under controlled conditions, demonstrating that deep learning is effective for the identification and counting of this species. The automatic recognition and counting method based on deep learning is robust to changes in insect body posture, interference from debris, and similar disturbances, and can automatically count S. frugiperda adults with various body postures and damaged bodies. The method has broad application prospects in pest population monitoring.
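The evaluation metrics above (P, R, F1 and counting accuracy) can all be derived from per-image detection counts. The sketch below shows one standard way to compute them; the function names are illustrative, and the counting-accuracy formula (mean per-image relative accuracy) is an assumption about a common definition, not the exact formula used in the paper.

```python
def detection_metrics(tp, fp, fn):
    """Precision, recall and F1-measure (all in %) from detection counts.

    tp: true positives (correctly detected adults)
    fp: false positives (spurious detections)
    fn: false negatives (missed adults)
    """
    precision = 100.0 * tp / (tp + fp) if (tp + fp) else 0.0
    recall = 100.0 * tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return precision, recall, f1


def counting_accuracy(predicted_counts, true_counts):
    """Mean per-image counting accuracy (%): 1 - |predicted - true| / true.

    This is one common definition of CA, assumed here for illustration.
    Images with a true count of zero are skipped to avoid division by zero.
    """
    accs = [1.0 - abs(p - t) / t
            for p, t in zip(predicted_counts, true_counts) if t > 0]
    return 100.0 * sum(accs) / len(accs) if accs else 0.0
```

For example, 9 correct detections with 1 false positive and 1 missed insect give P = R = F1 = 90%, and predicting 8 adults on an image that actually holds 10 gives a per-image counting accuracy of 80%.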

Key words: Spodoptera frugiperda, machine vision, deep learning, YOLO algorithm, population monitoring, image recognition, automatic counting