A Deep-Learning-Based Detection Approach for the Identification of Insect Species of Economic Importance
Simple Summary
Abstract
1. Introduction
2. Materials and Methods
2.1. Ethics Statement
2.2. Animal Rearing
2.3. Image Acquisition Setup
2.4. Dataset Creation and Pre-Processing
2.5. YOLO Network
2.6. Network Training and Testing Results
3. Results
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Conflicts of Interest
References
| Dataset Composition | Dataset Size | Dataset Type |
|---|---|---|
| C. capitata | 309 | Training set |
| B. oleae | 330 | Training set |
| C. capitata | 66 | Validation set |
| B. oleae | 117 | Validation set |
| C. capitata | 63 | Testing set |
| B. oleae | 27 | Testing set |
| C. capitata and B. oleae | 914 | Performance set |
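The per-species split above can be summarized programmatically. A minimal sketch using only the counts reported in the table (the dictionary layout is illustrative, not from the paper):

```python
# Per-species image counts as reported in the dataset table.
counts = {
    "C. capitata": {"train": 309, "val": 66, "test": 63},
    "B. oleae": {"train": 330, "val": 117, "test": 27},
}

for species, split in counts.items():
    total = sum(split.values())
    # Fraction of each species' images assigned to each subset.
    shares = {name: round(n / total, 2) for name, n in split.items()}
    print(f"{species}: {total} images, split {shares}")
```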
| | | Predicted Class | |
|---|---|---|---|
| | | Positive | Negative |
| Actual class | Positive | TP | FN |
| | Negative | FP | TN |
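The TP, FN, FP, and TN cells defined above yield the standard evaluation metrics (precision, recall, accuracy, F1). A minimal sketch with illustrative counts, not values from the paper:

```python
def detection_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Compute standard metrics from binary confusion-matrix counts."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return {"precision": precision, "recall": recall,
            "accuracy": accuracy, "f1": f1}

# Illustrative counts only.
print(detection_metrics(tp=90, fp=10, fn=5, tn=95))
```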
| Parameter | Value | Parameter | Value |
|---|---|---|---|
| lr0 | 0.01 | lrf | 0.2 |
| momentum | 0.973 | weight_decay | 0.0005 |
| warmup_epochs | 3.0 | warmup_momentum | 0.8 |
| warmup_bias_lr | 0.1 | box | 0.05 |
| cls | 0.5 | cls_pw | 1.0 |
| obj | 1.0 | obj_pw | 1.0 |
| iou_t | 0.2 | anchor_t | 4.0 |
| hsv_h | 0.015 | hsv_s | 0.7 |
| hsv_v | 0.4 | translate | 0.1 |
| scale | 0.5 | fliplr | 0.5 |
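These parameters follow the Ultralytics YOLOv5 hyperparameter-file convention. A hedged sketch writing the values from the table into a `hyp.custom.yaml` (the key names assume YOLOv5's `hyp.*.yaml` schema; the dataset file named in the comment is hypothetical):

```python
# YOLOv5-style hyperparameters mirroring the values in the table above.
# Key names follow the Ultralytics YOLOv5 hyp.*.yaml convention; see the
# repository's data/hyps/ files for the authoritative schema.
hyp = {
    "lr0": 0.01, "lrf": 0.2, "momentum": 0.973, "weight_decay": 0.0005,
    "warmup_epochs": 3.0, "warmup_momentum": 0.8, "warmup_bias_lr": 0.1,
    "box": 0.05, "cls": 0.5, "cls_pw": 1.0, "obj": 1.0, "obj_pw": 1.0,
    "iou_t": 0.2, "anchor_t": 4.0, "hsv_h": 0.015, "hsv_s": 0.7,
    "hsv_v": 0.4, "translate": 0.1, "scale": 0.5, "fliplr": 0.5,
}

# Emit a flat YAML file without requiring PyYAML.
with open("hyp.custom.yaml", "w") as f:
    for key, value in hyp.items():
        f.write(f"{key}: {value}\n")

# Typical invocation (hypothetical dataset file):
#   python train.py --data flies.yaml --hyp hyp.custom.yaml --img 640
```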
| | | True Condition | |
|---|---|---|---|
| | Training | Mediterranean Fruit Fly | Olive Fruit Fly |
| Predicted Condition | Mediterranean Fruit Fly | | |
| | Olive Fruit Fly | | |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Tannous, M.; Stefanini, C.; Romano, D. A Deep-Learning-Based Detection Approach for the Identification of Insect Species of Economic Importance. Insects 2023, 14, 148. https://doi.org/10.3390/insects14020148