A Convolutional Neural Network Algorithm for Pest Detection Using GoogleNet
Abstract
1. Introduction
2. Materials and Methods
2.1. Data Collection
2.2. Data Preprocessing
- Width Shift
- The image was shifted horizontally (left and right) to provide variation for the model to learn from. The width shift range was set to 0.2.
- Height Shift
- The image was shifted vertically (up and down) to provide variation for the model to learn from. The height shift range was set to 0.2.
- Shear
- The image was tilted with respect to the x-axis and y-axis to provide variation for the model to learn from. The shear range was set to 0.2.
- Zoom
- The image was enlarged or reduced to provide variation for the model to learn from. The zoom range was set to 0.2.
- Horizontal Flip
- The image was mirrored along its vertical axis (flipped horizontally) to provide variation for the model to learn from. All of these augmentations were applied using the image data generator utility provided by TensorFlow, which simplifies the process.
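To make the geometry of these augmentations concrete, here is a minimal NumPy sketch of the shift operation; zero-filling at the border is an assumption for illustration, not necessarily the fill mode the paper used:

```python
import numpy as np

def shift_image(img, dx_frac, dy_frac):
    """Shift an H x W image by a fraction of its width/height.

    Mimics the spirit of Keras' width_shift_range / height_shift_range
    (the paper used a range of 0.2). Vacated pixels are zero-filled,
    which is one of several possible fill modes (an assumption here).
    """
    h, w = img.shape[:2]
    dy = int(round(dy_frac * h))  # vertical shift in pixels
    dx = int(round(dx_frac * w))  # horizontal shift in pixels
    out = np.zeros_like(img)
    out[max(dy, 0):h + min(dy, 0), max(dx, 0):w + min(dx, 0)] = \
        img[max(-dy, 0):h + min(-dy, 0), max(-dx, 0):w + min(-dx, 0)]
    return out

# Draw a random width shift within the paper's range of 0.2
rng = np.random.default_rng(0)
img = np.arange(1, 17, dtype=float).reshape(4, 4)
shifted = shift_image(img, rng.uniform(-0.2, 0.2), 0.0)
flipped = img[:, ::-1]  # horizontal flip: mirror along the vertical axis
```

In practice, the equivalent Keras configuration would be along the lines of `ImageDataGenerator(width_shift_range=0.2, height_shift_range=0.2, shear_range=0.2, zoom_range=0.2, horizontal_flip=True)`, which handles random sampling and filling internally.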
2.3. Classification
- The input is an image of 224 × 224 pixels.
- The image data pass through five convolutional stages before reaching the fully connected layer.
- In the first stage, the data undergo a 7 × 7 convolution followed by 3 × 3 max pooling.
- The data then pass through two convolution layers, 1 × 1 and 3 × 3, followed by a 3 × 3 max pooling layer.
- In the third stage, the data enter two Inception modules and are then processed by 3 × 3 max pooling.
- In the fourth stage, the data are processed by an Inception module and then by the first auxiliary classifier. They then pass through three Inception modules and the second auxiliary classifier, and finally through one more Inception module before 3 × 3 max pooling.
- In the fifth stage, the data pass through two Inception modules and then undergo 7 × 7 average pooling. After the fully connected layer, which constitutes the sixth stage of the architecture, the data proceed to the output.
- In stages 2 through 5, 1 × 1 convolution layers are used to reduce the dimensionality of the feature maps.
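The spatial sizes implied by these stages can be checked with the standard output-size formula, size_out = ⌊(size_in + 2·pad − kernel)/stride⌋ + 1. The sketch below assumes the standard GoogLeNet strides and paddings, which the text does not spell out:

```python
def out_size(size, kernel, stride, pad):
    # Standard convolution / pooling output-size formula
    return (size + 2 * pad - kernel) // stride + 1

size = 224                       # input image (224 x 224)
size = out_size(size, 7, 2, 3)   # stage 1: 7x7 conv, stride 2      -> 112
size = out_size(size, 3, 2, 1)   # stage 1: 3x3 max pool, stride 2  -> 56
# stage 2: the 1x1 and 3x3 convs preserve the size (stride 1, 'same' padding)
size = out_size(size, 3, 2, 1)   # stage 2: 3x3 max pool            -> 28
# stage 3: two Inception modules preserve the size
size = out_size(size, 3, 2, 1)   # stage 3: 3x3 max pool            -> 14
# stage 4: Inception modules preserve the size
size = out_size(size, 3, 2, 1)   # stage 4: 3x3 max pool            -> 7
# stage 5: two Inception modules, then 7x7 average pooling
size = out_size(size, 7, 1, 0)   # stage 5: 7x7 average pool        -> 1
```

The final 1 × 1 feature map is what feeds the fully connected layer and the output.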
2.4. Evaluation
2.5. Implementation of Mobile Applications
3. Results
4. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Dropout | Number of Dense Layers | Number of Neurons per Layer | Accuracy
---|---|---|---
0.3 | 0 | 0 | 92.22% |
0.3 | 1 | 256 | 91.33% |
0.3 | 3 | 512; 256; 128 | 89.33% |
0.3 | 5 | 512; 256; 128; 64; 32 | 93.11% |
0.4 | 0 | 0 | 92.88% |
0.4 | 1 | 256 | 91.77% |
0.4 | 3 | 512; 256; 128 | 92.44% |
0.4 | 5 | 512; 256; 128; 64; 32 | 92.22% |
0.5 | 0 | 0 | 93.78% |
0.5 | 1 | 256 | 93.33% |
0.5 | 3 | 512; 256; 128 | 93.55% |
0.5 | 5 | 512; 256; 128; 64; 32 | 93.78% |
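The twelve rows of the table above correspond to a small grid over dropout rates and dense-layer stacks. A hypothetical sketch of how such a grid might be enumerated, with the stack sizes taken directly from the "Number of Neurons" column:

```python
from itertools import product

dropout_rates = [0.3, 0.4, 0.5]
dense_stacks = [
    [],                        # no additional dense layer
    [256],
    [512, 256, 128],
    [512, 256, 128, 64, 32],
]

# Each (dropout, stack) pair is one classifier-head configuration
# to be trained and evaluated; 3 x 4 = 12 rows of the table.
configs = list(product(dropout_rates, dense_stacks))
```

In the table, the best accuracy (93.78%) occurs at a dropout of 0.5, both with no extra dense layers and with the five-layer stack.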
No. | Question | Mean | Median | Variance | Min. | Max.
---|---|---|---|---|---|---
1 | Do you think this application will be effective in detecting pests? | 4.36 | 4.00 | 0.13 | 3 | 5 |
2 | Will this application help in identifying pest problems in plants? | 4.31 | 4.00 | 0.15 | 3 | 5 |
3 | Are you satisfied with the performance of this application in detecting pests? | 4.08 | 4.00 | 0.19 | 2 | 5 |
4 | Do you like the features of this app? | 4.11 | 4.00 | 0.18 | 2 | 5 |
5 | Is this application fast and accurate in detecting pests? | 3.89 | 4.00 | 0.19 | 2 | 5 |
6 | Can this app detect pests better than humans? | 3.78 | 4.00 | 0.19 | 3 | 5 |
7 | Will this application increase efficiency or productivity in agriculture? | 4.31 | 4.00 | 0.17 | 3 | 5 |
8 | Will this application make a positive contribution to overcoming pest problems in plants or the surrounding environment? | 4.36 | 4.50 | 0.16 | 3 | 5 |
9 | Is this application easy to use? | 4.36 | 5.00 | 0.17 | 3 | 5 |
10 | Would you recommend this app to others, especially farmers? | 4.33 | 5.00 | 0.19 | 3 | 5 |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Yulita, I.N.; Rambe, M.F.R.; Sholahuddin, A.; Prabuwono, A.S. A Convolutional Neural Network Algorithm for Pest Detection Using GoogleNet. AgriEngineering 2023, 5, 2366-2380. https://doi.org/10.3390/agriengineering5040145