Agricultural Robot-Centered Recognition of Early-Developmental Pest Stage Based on Deep Learning: A Case Study on Fall Armyworm (Spodoptera frugiperda)
Abstract
1. Introduction
2. Related Works
3. Proposed Methodology
3.1. Site Location
3.2. Data Collection and Image Sensors
3.3. Proposed Robot-Centered Deep-Learning-Based Insect Pest Scouting Method
3.4. Deep-Learning Architecture for Insect Pest Classification and Detection
Method | Description |
---|---|
AlexNet [55] | An eight-layer-deep convolutional neural network. It takes an input image of size 227 × 227 × 3. |
GoogleNet [59] | A compact architecture relative to AlexNet and VGGNet, built from micro-architectures such as Inception modules. It expects an input image of size 224 × 224 × 3. |
Inception V3 [60] | A 48-layer-deep convolutional neural network. It takes an input image of size 299 × 299 × 3. |
ResNet50 [61] | A 50-layer convolutional neural network that leverages residual modules to train networks to depths that were previously impractical. It was trained with an input image size of 224 × 224 × 3. |
ResNet101 [62] | A 101-layer-deep variant of ResNet50. It takes an input image of size 224 × 224 × 3. |
SqueezeNet [63] | An 18-layer-deep network with an input image size of 227 × 227 × 3. |
VGG16 [64] | A 16-layer-deep convolutional neural network. It expects an input image of size 224 × 224 × 3. |
VGG19 [64] | A 19-layer-deep variant of VGG16. It expects an input image of size 224 × 224 × 3. |
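Each backbone in the table expects a fixed input size, so images must be resized before inference. The following is a minimal, illustrative sketch of that preprocessing bookkeeping in plain Python; the dictionary values come from the table above, while the function and variable names are hypothetical, not from the paper's implementation:

```python
# Expected input sizes (H, W, C) for the pretrained backbones in the table.
INPUT_SIZES = {
    "alexnet":     (227, 227, 3),
    "googlenet":   (224, 224, 3),
    "inceptionv3": (299, 299, 3),
    "resnet50":    (224, 224, 3),
    "resnet101":   (224, 224, 3),
    "squeezenet":  (227, 227, 3),
    "vgg16":       (224, 224, 3),
    "vgg19":       (224, 224, 3),
}

def resize_plan(model, image_hw):
    """Return the target size and scale factors needed to fit a raw
    camera frame of shape (height, width) to the chosen backbone."""
    h, w, _ = INPUT_SIZES[model]
    ih, iw = image_hw
    return {"target": (h, w), "scale_h": h / ih, "scale_w": w / iw}

# A 600 x 800 frame must be scaled to 299 x 299 for Inception V3.
plan = resize_plan("inceptionv3", (600, 800))
print(plan["target"])  # (299, 299)
```

Note that nonuniform scaling like this distorts the aspect ratio; a real pipeline would typically crop or pad instead, but the size lookup is the same.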
4. Experimental Results
4.1. Insect Classifier and Detector Performance
4.2. Cosimulation Results
4.3. Discussion
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
1. Pimentel, D. Pest control in world agriculture. Agric. Sci. 2009, 2, 272–293.
2. Mesterházy, Á.; Oláh, J.; Popp, J. Losses in the Grain Supply Chain: Causes and Solutions. Sustainability 2020, 12, 2342.
3. Samways, M.J.; Barton, P.S.; Birkhofer, K.; Chichorro, F.; Deacon, C.; Fartmann, T.; Fukushima, C.S.; Gaigher, R.; Habel, J.C.; Hallmann, C.A.; et al. Solutions for humanity on how to conserve insects. Biol. Conserv. 2020, 242, 108427.
4. Coleman, D.C.; Crossley, D.; Hendrix, P.F. 4-Secondary Production: Activities of Heterotrophic Organisms—The Soil Fauna. In Fundamentals of Soil Ecology, 2nd ed.; Coleman, D.C., Crossley, D., Hendrix, P.F., Eds.; Academic Press: Burlington, NJ, USA, 2004; pp. 79–185.
5. Aktar, M.W.; Sengupta, D.; Chowdhury, A. Impact of pesticides use in agriculture: Their benefits and hazards. Interdiscip. Toxicol. 2009, 2, 1–12.
6. Ratan, M.; Rafiq, L.; Javid, M.; Razia, S. Imbalance due to Pesticide Contamination in Different Ecosystems. Int. J. Theor. Appl. Sci. 2018, 10, 239–246.
7. Mavridou, E.; Vrochidou, E.; Papakostas, G.A.; Pachidis, T.; Kaburlasos, V.G. Machine Vision Systems in Precision Agriculture for Crop Farming. J. Imaging 2019, 5, 89.
8. Giles, D.K.; Slaughter, D.C.; Downey, D.; Brevis-Acuna, J.C.; Lanini, W.T. Application design for machine vision guided selective spraying of weeds in high value crops. Asp. Appl. Biol. 2004, 71, 75–81.
9. Jeon, H.Y.; Tian, L.F. Direct application end effector for a precise weed control robot. Biosyst. Eng. 2009, 104, 458–464.
10. Li, Y.; Xia, C.; Lee, J. Vision-based pest detection and automatic spray of greenhouse plant. In Proceedings of the 2009 IEEE International Symposium on Industrial Electronics, Seoul, Republic of Korea, 5–8 July 2009; pp. 920–925.
11. Midtiby, H.S.; Mathiassen, S.K.; Andersson, K.J.; Jørgensen, R.N. Performance evaluation of a crop/weed discriminating microsprayer. Comput. Electron. Agric. 2011, 77, 35–40.
12. Underwood, J.P.; Calleija, M.; Taylor, Z.; Hung, C.; Nieto, J.; Fitch, R.; Sukkarieh, S. Real-time target detection and steerable spray for vegetable crops. In Proceedings of the Workshop on Robotics in Agriculture at the International Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 26–30 May 2015. Available online: https://www.semanticscholar.org/paper/Real-time-target-detection-and-steerable-spray-for-Underwood-Calleija/4bc35e6aa29eaf318739ad83986411a873f2d73e (accessed on 25 January 2023).
13. Cantelli, L.; Bonaccorso, F.; Longo, D.; Melita, C.D.; Schillaci, G.; Muscato, G. A small versatile electrical robot for autonomous spraying in agriculture. AgriEngineering 2019, 1, 391–402.
14. Ruckelhausen, A.; Biber, P.; Dorna, M.; Gremmes, H.; Klose, R.; Linz, A.; Rahe, R.; Resch, R.; Thiel, M.; Trautz, D.; et al. BoniRob: An Autonomous Field Robot Platform for Individual Plant Phenotyping. In Precision Agriculture ’09; 7th European Conference on Precision Agriculture (ECPA); Academic Publishers: Wageningen, The Netherlands, 2009; pp. 841–847.
15. Bangert, W.; Kielhorn, A.; Rahe, F.; Albert, A.; Biber, P.; Grzonka, S.; Hänsel, M.; Haug, S.; Michaels, A.; Mentrup, D.; et al. Field-Robot-Based Agriculture: “RemoteFarming.1” and “BoniRob-Apps”; VDI-Verlag: Düsseldorf, Germany, 2013.
16. Unisydneyacfr. RIPPA Demonstrating Autonomous Crop Interaction; Australian Centre for Field Robotics (ACFR): Sydney, Australia, 2016.
17. Jaratly, M. Insecticide Damage to Human Health and the Environment; Green-Studies: Dubai, United Arab Emirates, 2018; pp. 8–11.
18. Sammons, P.J.; Furukawua, T.; Bulgin, A. Autonomous pesticide spraying robot for use in a greenhouse. In Proceedings of the Australian Conference on Robotics and Automation, Sydney, Australia, 5–7 December 2005; pp. 1–9. ISBN 0-9587583-7-9.
19. Hu, Z.; Liu, B.; Zhao, Y. Agricultural Robot for Intelligent Detection of Pyralidae Insects. In Agricultural Robots—Fundamentals and Applications; IntechOpen: Beijing/Shanghai, China, 2018.
20. Cubero, S.; Marco-Noales, E.; Aleixos, N.; Barbé, S.; Blasco, J. RobHortic: A Field Robot to Detect Pests and Diseases in Horticultural Crops by Proximal Sensing. Agriculture 2020, 10, 276.
21. Lucet, E.; Lacotte, V.; Nguyen, T.; Sempere, J.D.; Novales, V.; Dufour, V.; Moreau, R.; Pham, M.T.; Rabenorosoa, K.; Peignier, S.; et al. Pesticide-Free Robotic Control of Aphids as Crop Pests. AgriEngineering 2022, 4, 903–921.
22. Meshram, A.T.; Vanalkar, A.V.; Kalambe, K.B.; Badar, A.M. Pesticide spraying robot for precision agriculture: A categorical literature review and future trends. J. Field Robot. 2022, 39, 153–171.
23. Capinera, J.L. Relationships between insect pests and weeds: An evolutionary perspective. Weed Sci. 2005, 53, 892–901.
24. Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You Only Look Once: Unified, Real-Time Object Detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016.
25. Liu, W.; Anguelov, D.; Erhan, D.; Szegedy, C.; Reed, S.; Fu, C.Y.; Berg, A.C. SSD: Single Shot MultiBox Detector. In Computer Vision—ECCV 2016; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2016.
26. Redmon, J.; Farhadi, A. YOLO9000: Better, faster, stronger. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 7263–7271.
27. Yin, Y.; Li, H.; Fu, W. Faster-YOLO: An accurate and faster object detection method. Digit. Signal Process. 2020, 102, 102756.
28. Girshick, R.; Donahue, J.; Darrell, T.; Malik, J. Rich feature hierarchies for accurate object detection and semantic segmentation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Columbus, OH, USA, 23–28 June 2014; pp. 580–587.
29. Girshick, R. Fast R-CNN. arXiv 2015, arXiv:1504.08083.
30. Ren, S.; He, K.; Girshick, R.; Sun, J. Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks. arXiv 2015, arXiv:1506.01497.
31. Lin, T.Y.; Dollar, P.; Girshick, R.; He, K.; Hariharan, B.; Belongie, S. Feature Pyramid Networks for Object Detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017.
32. Fuentes, A.; Yoon, S.; Kim, S.C.; Park, D.S. A Robust Deep-Learning-Based Detector for Real-Time Tomato Plant Diseases and Pests Recognition. Sensors 2017, 17, 2022.
33. Nieuwenhuizen, A.; Hemming, J.; Suh, H. Detection and Classification of Insects on Stick-Traps in a Tomato Crop Using Faster R-CNN; Technical Report; 2018. Available online: https://library.wur.nl/WebQuery/wurpubs/542509 (accessed on 25 January 2023).
34. Liu, J.; Wang, X. Tomato Diseases and Pests Detection Based on Improved YOLO V3 Convolutional Neural Network. Front. Plant Sci. 2020, 11, 898.
35. Rahman, C.R.; Arko, P.S.; Ali, M.E.; Iqbal Khan, M.A.; Apon, S.H.; Nowrin, F.; Wasif, A. Identification and recognition of rice diseases and pests using convolutional neural networks. Biosyst. Eng. 2020, 194, 112–120.
36. Li, D.; Wang, R.; Xie, C.; Liu, L.; Zhang, J.; Li, R.; Wang, F.; Zhou, M.; Liu, W. A Recognition Method for Rice Plant Diseases and Pests Video Detection Based on Deep Convolutional Neural Network. Sensors 2020, 20, 578.
37. Dawei, W.; Limiao, D.; Jiangong, N.; Jiyue, G.; Hongfei, Z.; Zhongzhi, H. Recognition pest by image-based transfer learning. J. Sci. Food Agric. 2019, 99, 4524–4531.
38. Alfarisy, A.A.; Chen, Q.; Guo, M. Deep-learning-based classification for paddy pests & diseases recognition. In Proceedings of the ACM International Conference Proceeding Series; Association for Computing Machinery: New York, NY, USA, 2018; pp. 21–25.
39. Kounalakis, T.; Malinowski, M.J.; Chelini, L.; Triantafyllidis, G.A.; Nalpantidis, L. A robotic system employing deep learning for visual recognition and detection of weeds in grasslands. In Proceedings of the IST 2018—IEEE International Conference on Imaging Systems and Techniques, Krakow, Poland, 16–18 October 2018; IEEE: Piscataway, NJ, USA, 2018.
40. Kounalakis, T.; Triantafyllidis, G.A.; Nalpantidis, L. Deep learning-based visual recognition of rumex for robotic precision farming. Comput. Electron. Agric. 2019, 165, 104973.
41. Kounalakis, T.; Triantafyllidis, G.A.; Nalpantidis, L. Weed recognition framework for robotic precision farming. In Proceedings of the IST 2016—IEEE International Conference on Imaging Systems and Techniques; IEEE: Piscataway, NJ, USA, 2016; pp. 466–471.
42. Gogo, E.O.; Saidi, M.; Ochieng, J.M.; Martin, T.; Baird, V.; Ngouajio, M. Microclimate modification and insect pest exclusion using agronet improve pod yield and quality of French bean. HortScience 2014, 49, 1298–1304.
43. Dara, S.K.; Peck, D.; Murray, D. Chemical and non-chemical options for managing twospotted spider mite, western tarnished plant bug and other arthropod pests in strawberries. Insects 2018, 9, 156.
44. Curry, C. The Life Cycle of Fall Armyworm. The Plantwise Blog, 2017. Available online: https://blog.plantwise.org/2017/07/17/the-life-cycle-of-fall-armyworm/ (accessed on 25 January 2023).
45. Hardke, J.T.; Lorenz, G.M., III; Leonard, B.R. Fall Armyworm (Lepidoptera: Noctuidae) Ecology in Southeastern Cotton. J. Integr. Pest Manag. 2015, 6, 10.
46. FAO. Integrated Management of the Fall Armyworm on Maize; Technical Report; Food and Agriculture Organization of the United Nations: Rome, Italy, 2018.
47. FAO. The Global Action for Fall Armyworm Control: Action Framework 2020–2022, Working Together to Tame the Global Threat; Technical Report; Food and Agriculture Organization of the United Nations: Rome, Italy, 2018.
48. Kok, K.Y.; Rajendran, P. A review on stereo vision algorithms: Challenges and solutions. ECTI Trans. Comput. Inf. Technol. 2019, 13, 134–151.
49. Wang, Q.; Fu, L.; Liu, Z. Review on camera calibration. In Proceedings of the 2010 Chinese Control and Decision Conference (CCDC 2010), Xuzhou, China, 26–28 May 2010; pp. 3354–3358.
50. Lowe, D.G. Distinctive Image Features from Scale-Invariant Keypoints. Int. J. Comput. Vis. 2004, 60, 91–110.
51. Vedaldi, A.; Fulkerson, B. VLFeat: An Open and Portable Library of Computer Vision Algorithms. In Proceedings of the 18th ACM International Conference on Multimedia (MM ’10), New York, NY, USA, 25 October 2010.
52. da Silva, S.L.A.; Tommaselli, A.M.G.; Artero, A.O. Utilização de alvos codificados do tipo aruco na automação do processo de calibração de câmaras. Bol. Ciências Geodésicas 2014, 20, 626–646.
53. Dinsmore, J.J. Foraging Success of Cattle Egrets, Bubulcus ibis. Am. Midl. Nat. 1973, 89, 242–246.
54. Gould, S.; Arfvidsson, J.; Kaehler, A.; Sapp, B.; Messner, M.; Bradski, G.; Baumstarck, P.; Chung, S.; Ng, A.Y. Peripheral-Foveal Vision for Real-Time Object Recognition and Tracking in Video. In Proceedings of the 20th International Joint Conference on Artificial Intelligence, Hyderabad, India, 6–12 January 2007; Morgan Kaufmann Publishers Inc.: San Francisco, CA, USA, 2007; pp. 2115–2121.
55. Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet classification with deep convolutional neural networks. In Proceedings of the Advances in Neural Information Processing Systems, Lake Tahoe, NV, USA, 3–6 December 2012; pp. 1097–1105.
56. Robbins, H.; Monro, S. A stochastic approximation method. Ann. Math. Stat. 1951, 22, 400–407.
57. Kingma, D.P.; Ba, J. Adam: A Method for Stochastic Optimization. arXiv 2017, arXiv:1412.6980.
58. Nair, V.; Hinton, G.E. Rectified linear units improve restricted Boltzmann machines. In Proceedings of the ICML, Haifa, Israel, 21–24 June 2010.
59. Szegedy, C.; Liu, W.; Jia, Y.; Sermanet, P.; Reed, S.; Anguelov, D.; Erhan, D.; Vanhoucke, V.; Rabinovich, A. Going Deeper with Convolutions. arXiv 2014, arXiv:1409.4842.
60. Szegedy, C.; Vanhoucke, V.; Ioffe, S.; Shlens, J.; Wojna, Z. Rethinking the Inception Architecture for Computer Vision. arXiv 2015, arXiv:1512.00567.
61. He, K.; Zhang, X.; Ren, S.; Sun, J. Deep Residual Learning for Image Recognition. arXiv 2015, arXiv:1512.03385.
62. He, K.; Zhang, X.; Ren, S.; Sun, J. Deep Residual Learning for Image Recognition. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 26 June–1 July 2016; pp. 770–778.
63. Iandola, F.N.; Han, S.; Moskewicz, M.W.; Ashraf, K.; Dally, W.J.; Keutzer, K. SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <0.5 MB model size. arXiv 2016, arXiv:1602.07360.
64. Simonyan, K.; Zisserman, A. Very Deep Convolutional Networks for Large-Scale Image Recognition. arXiv 2014, arXiv:1409.1556.
65. Horn, G.V.; Aodha, O.M.; Song, Y.; Shepard, A.; Adam, H.; Perona, P.; Belongie, S.J. The iNaturalist Challenge 2017 Dataset. arXiv 2017, arXiv:1707.06642.
66. Everingham, M.; Van Gool, L.; Williams, C.K.I.; Winn, J.; Zisserman, A. The Pascal Visual Object Classes (VOC) Challenge. Int. J. Comput. Vis. 2009, 88, 303–338.
67. Russakovsky, O.; Deng, J.; Su, H.; Krause, J.; Satheesh, S.; Ma, S.; Huang, Z.; Karpathy, A.; Khosla, A.; Bernstein, M.; et al. ImageNet Large Scale Visual Recognition Challenge. Int. J. Comput. Vis. 2015, 115, 211–252.
68. Obasekore, H.; Fanni, M.; Ahmed, S.M. Insect Killing Robot for Agricultural Purposes. In Proceedings of the 2019 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), Hong Kong, 8–12 July 2019; pp. 1068–1074.
69. Rohmer, E.; Singh, S.P.N.; Freese, M. CoppeliaSim (formerly V-REP): A Versatile and Scalable Robot Simulation Framework. In Proceedings of the International Conference on Intelligent Robots and Systems (IROS), Tokyo, Japan, 3–8 November 2013.
70. ELP 1.3 Megapixels OV9715 MJPEG 60 fps Dual Lens Synchronous Stereo Camera Module USB for Robot VR Camera (960P2CAM-LC1100). Available online: https://www.svpro.cc/product/elp-1-3megapixels-ov9715-mjpeg-60fps-dual-lens-synchronous-stereo-camera-module-usb-for-robot-vr-camera-elp-960p2cam-lc1100/ (accessed on 25 January 2023).
71. Kim, J.; Seol, J.; Lee, S.; Hong, S.W.; Son, H.I. An Intelligent Spraying System with Deep Learning-based Semantic Segmentation of Fruit Trees in Orchards. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Virtual, 31 May–31 August 2020; pp. 3923–3929.
72. Zhang, Z.; Khanal, S.; Raudenbush, A.; Tilmon, K.; Stewart, C. Assessing the efficacy of machine learning techniques to characterize soybean defoliation from unmanned aerial vehicles. Comput. Electron. Agric. 2022, 193, 106682.
73. Vyavasahaaya. Fall ArmyWorm Digital Technology Challenge. 2018. Available online: https://vyavasahaaya.com/innovator/challenges/9/tabs?mode_number=1#invitationstab (accessed on 25 January 2023).
Model | Accuracy (%) | Speed on PC (fps) | Speed on Embedded Device (fps) |
---|---|---|---|
AlexNet | 95.95 | 16 | 13 |
GoogleNet | 94.99 | 14 | 14 |
SqueezeNet | 96.92 | 34 | 23 |
Inception V3 | 93.26 | 2 | 7 |
VGG16 | 94.03 | 82 | 19 |
VGG19 | 99.04 | 87 | 24 |
ResNet101 | 93.83 | 3 | 6 |
ResNet50 | 94.03 | 4 | 10 |
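The fps figures in the table are throughput measurements: frames processed divided by elapsed wall-clock time. A minimal sketch of how such a benchmark is typically taken, with a dummy stand-in for the classifier (the function names and the warm-up count are illustrative assumptions, not the paper's benchmarking code):

```python
import time

def measure_fps(infer, frames, warmup=3):
    """Estimate throughput (frames per second) of an inference callable.
    A few warm-up calls are excluded from timing, since first runs often
    pay one-time costs (caching, lazy initialization)."""
    for f in frames[:warmup]:
        infer(f)
    start = time.perf_counter()
    for f in frames:
        infer(f)
    elapsed = time.perf_counter() - start
    return len(frames) / elapsed

# Stand-in 'model': a dummy callable that sleeps ~10 ms per frame,
# so the measured throughput is bounded above by 100 fps.
fps = measure_fps(lambda f: time.sleep(0.01), list(range(20)))
print(f"{fps:.1f} fps")
```

Averaging over many frames, as here, smooths out per-frame jitter; reporting the same callable on both a PC and an embedded board yields the two speed columns of the table.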
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Obasekore, H.; Fanni, M.; Ahmed, S.M.; Parque, V.; Kang, B.-Y. Agricultural Robot-Centered Recognition of Early-Developmental Pest Stage Based on Deep Learning: A Case Study on Fall Armyworm (Spodoptera frugiperda). Sensors 2023, 23, 3147. https://doi.org/10.3390/s23063147