Convolutional Neural Networks in Computer Vision for Grain Crop Phenotyping: A Review
Abstract
1. Introduction
2. Computer Vision (CV) and Convolutional Neural Networks (CNNs)
2.1. CV
2.2. CNN
2.3. CNNs Combined with CV Tasks
2.3.1. Image Classification
2.3.2. Object Detection
2.3.3. Semantic and Instance Segmentations
3. Advances in Phenotyping of Four Grain Crops Based on CV and CNN
3.1. Crop Organ Detection and Counting
3.2. Weed and Crop Recognition and Segmentation
3.3. Crop Disease Detection and Classification
3.4. Crop Insect Infestation Detection
3.5. Abiotic Crop Stress Phenotype Assessment
3.6. Crop Seed Variety Classification
4. Discussion
5. Challenges of CV and CNNs in Grain Crops and Future Trends
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Tilman, D.; Balzer, C.; Hill, J.; Befort, B.L. Global food demand and the sustainable intensification of agriculture. Proc. Natl. Acad. Sci. USA 2011, 108, 20260–20264.
- Steensland, A.; Thompson, T.L. 2020 Global Agricultural Productivity Report: Productivity in a Time of Pandemics; College of Agriculture and Life Sciences: Blacksburg, VA, USA, 2020.
- Yu, Q.Y.; Xiang, M.T.; Wu, W.B.; Tang, H.J. Changes in global cropland area and cereal production: An inter-country comparison. Agric. Ecosyst. Environ. 2019, 269, 140–147.
- Pan, Y.H. Analysis of concepts and categories of plant phenome and phenomics. Acta Agron. Sin. 2015, 41, 175–186.
- Vithu, P.; Moses, J.A. Machine vision system for food grain quality evaluation: A review. Trends Food Sci. Technol. 2016, 56, 13–20.
- Patrício, D.I.; Rieder, R. Computer vision and artificial intelligence in precision agriculture for grain crops: A systematic review. Comput. Electron. Agric. 2018, 153, 69–81.
- Ngugi, L.C.; Abelwahab, M.; Abo-Zahhad, M. Recent advances in image processing techniques for automated leaf pest and disease recognition–A review. Inf. Process. Agric. 2021, 8, 27–51.
- Wang, A.C.; Zhang, W.; Wei, X.H. A review on weed detection using ground-based machine vision and image processing techniques. Comput. Electron. Agric. 2019, 158, 226–240.
- Sun, D.; Robbins, K.; Morales, N.; Shu, Q.; Cen, H. Advances in optical phenotyping of cereal crops. Trends Plant Sci. 2022, 27, 191–208.
- Khan, A.; Sohail, A.; Zahoora, U.; Qureshi, A.S. A survey of the recent architectures of deep convolutional neural networks. Artif. Intell. Rev. 2020, 53, 5455–5516.
- Dhillon, A.; Verma, G.K. Convolutional neural network: A review of models, methodologies and applications to object detection. Prog. Artif. Intell. 2020, 9, 85–112.
- Mo, Y.; Wu, Y.; Yang, X.; Liu, F.; Liao, Y. Review the state-of-the-art technologies of semantic segmentation based on deep learning. Neurocomputing 2022, 493, 626–646.
- Singh, A.K.; Ganapathysubramanian, B.; Sarkar, S.; Singh, A. Deep Learning for Plant Stress Phenotyping: Trends and Future Perspectives. Trends Plant Sci. 2018, 23, 883–898.
- Diba, A.; Sharma, V.; Pazandeh, A.; Pirsiavash, H.; van Gool, L. Weakly supervised cascaded convolutional networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 914–922.
- Alzubaidi, L.; Zhang, J.; Humaidi, A.J.; Al-Dujaili, A.; Duan, Y.; Al-Shamma, O.; Santamaria, J.; Fadhel, M.A.; Al-Amidie, M.; Farhan, L. Review of deep learning: Concepts, CNN architectures, challenges, applications, future directions. J. Big Data 2021, 8, 53.
- Liu, J.; Wang, X. Plant diseases and pests detection based on deep learning: A review. Plant Methods 2021, 17, 22.
- Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet classification with deep convolutional neural networks. NIPS 2012, 25, 84–90.
- Szegedy, C.; Vanhoucke, V.; Ioffe, S.; Shlens, J.; Wojna, Z. Rethinking the inception architecture for computer vision. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 2818–2826.
- Simonyan, K.; Zisserman, A. Very deep convolutional networks for large-scale image recognition. arXiv 2014, arXiv:1409.1556.
- Huang, G.; Liu, Z.; van der Maaten, L.; Weinberger, K.Q. Densely connected convolutional networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 4700–4708.
- Zoph, B.; Le, Q.V. Neural architecture search with reinforcement learning. arXiv 2016, arXiv:1611.01578.
- Montavon, G.; Samek, W.; Muller, K.R. Methods for interpreting and understanding deep neural networks. Digit. Signal Process. 2018, 73, 1–15.
- Arrieta, A.B.; Diaz-Rodriguez, N.; del Ser, J.; Bennetot, A.; Tabik, S.; Barbado, A.; Garcia, S.; Gil-Lopez, S.; Molina, D.; Benjamins, R.; et al. Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI. Inf. Fusion 2020, 58, 82–115.
- Sermanet, P.; Eigen, D.; Zhang, X.; Mathieu, M.; Fergus, R.; LeCun, Y. Overfeat: Integrated recognition, localization and detection using convolutional networks. arXiv 2013, arXiv:1312.6229.
- Girshick, R.; Donahue, J.; Darrell, T.; Malik, J. Rich feature hierarchies for accurate object detection and semantic segmentation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Columbus, OH, USA, 23–28 June 2014; pp. 580–587.
- Girshick, R. Fast R-CNN. In Proceedings of the IEEE International Conference on Computer Vision, Santiago, Chile, 7–13 December 2015; pp. 1440–1448.
- Ren, S.Q.; He, K.M.; Girshick, R.; Sun, J. Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks. In Proceedings of the 28th International Conference on Neural Information Processing Systems, Montreal, QC, Canada, 7–12 December 2015.
- Long, J.; Shelhamer, E.; Darrell, T. Fully convolutional networks for semantic segmentation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015; pp. 3431–3440.
- Chen, L.C.; Papandreou, G.; Kokkinos, I.; Murphy, K.; Yuille, A.L. DeepLab: Semantic Image Segmentation with Deep Convolutional Nets, Atrous Convolution, and Fully Connected CRFs. IEEE Trans. Pattern Anal. Mach. Intell. 2018, 40, 834–848.
- Ronneberger, O.; Fischer, P.; Brox, T. U-Net: Convolutional networks for biomedical image segmentation. In Proceedings of the International Conference on Medical Image Computing and Computer-Assisted Intervention, Munich, Germany, 5–9 October 2015; pp. 234–241.
- Hariharan, B.; Arbeláez, P.; Girshick, R.; Malik, J. Simultaneous detection and segmentation. In Proceedings of the European Conference on Computer Vision, Zurich, Switzerland, 6–12 September 2014; pp. 297–312.
- He, K.; Gkioxari, G.; Dollár, P.; Girshick, R. Mask R-CNN. In Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy, 22–29 October 2017; pp. 2961–2969.
- Jiang, Y.; Li, C. Convolutional Neural Networks for Image-Based High-Throughput Plant Phenotyping: A Review. Plant Phenomics 2020, 2020, 4152816.
- Watt, M.; Fiorani, F.; Usadel, B.; Rascher, U.; Muller, O.; Schurr, U. Phenotyping: New Windows into the Plant for Breeders. Annu. Rev. Plant Biol. 2020, 71, 689–712.
- Crossa, J.; Perez-Rodriguez, P.; Cuevas, J.; Montesinos-Lopez, O.; Jarquin, D.; de los Campos, G.; Burgueno, J.; Gonzalez-Camacho, J.M.; Perez-Elizalde, S.; Beyene, Y.; et al. Genomic Selection in Plant Breeding: Methods, Models, and Perspectives. Trends Plant Sci. 2017, 22, 961–975.
- Furbank, R.T.; Tester, M. Phenomics–technologies to relieve the phenotyping bottleneck. Trends Plant Sci. 2011, 16, 635–644.
- Furbank, R.T.; Jimenez-Berni, J.A.; George-Jaeggli, B.; Potgieter, A.B.; Deery, D.M. Field crop phenomics: Enabling breeding for radiation use efficiency and biomass in cereal crops. New Phytol. 2019, 223, 1714–1727.
- Kolhar, S.; Jagtap, J. Plant trait estimation and classification studies in plant phenotyping using machine vision–A review. Inf. Process. Agric. 2021.
- Kamilaris, A.; Prenafeta-Boldu, F.X. A review of the use of convolutional neural networks in agriculture. J. Agric. Sci. 2018, 156, 312–322.
- Deng, R.; Tao, M.; Huang, X.; Bangura, K.; Jiang, Q.; Jiang, Y.; Qi, L. Automated Counting Grains on the Rice Panicle Based on Deep Learning Method. Sensors 2021, 21, 281.
- Li, J.; Li, C.; Fei, S.; Ma, C.; Chen, W.; Ding, F.; Wang, Y.; Li, Y.; Shi, J.; Xiao, Z. Wheat Ear Recognition Based on RetinaNet and Transfer Learning. Sensors 2021, 21, 4845.
- Pratama, M.T.; Kim, S.; Ozawa, S.; Ohkawa, T.; Chona, Y.; Tsuji, H.; Murakami, N. Deep Learning-based Object Detection for Crop Monitoring in Soybean Fields. In Proceedings of the 2020 International Joint Conference on Neural Networks (IJCNN), Glasgow, UK, 19–24 July 2020; pp. 1–7.
- Gong, B.; Ergu, D.; Cai, Y.; Ma, B. Real-Time Detection for Wheat Head Applying Deep Neural Network. Sensors 2020, 21, 191.
- Zou, H.; Lu, H.; Li, Y.; Liu, L.; Cao, Z. Maize tassels detection: A benchmark of the state of the art. Plant Methods 2020, 16, 108.
- Lu, H.; Cao, Z.; Xiao, Y.; Fang, Z.; Zhu, Y.; Xian, K. Fine-grained maize tassel trait characterization with multi-view representations. Comput. Electron. Agric. 2015, 118, 143–158.
- Lu, H.; Cao, Z.; Xiao, Y.; Li, Y.; Zhu, Y. Region-based colour modelling for joint crop and maize tassel segmentation. Biosyst. Eng. 2016, 147, 139–150.
- Sadeghi-Tehran, P.; Virlet, N.; Ampe, E.M.; Reyns, P.; Hawkesford, M.J. DeepCount: In-Field Automatic Quantification of Wheat Spikes Using Simple Linear Iterative Clustering and Deep Convolutional Neural Networks. Front. Plant Sci. 2019, 10, 1176.
- Xiong, H.; Cao, Z.; Lu, H.; Madec, S.; Liu, L.; Shen, C. TasselNetv2: In-field counting of wheat spikes with context-augmented local regression networks. Plant Methods 2019, 15, 150.
- Kienbaum, L.; Abondano, M.C.; Blas, R.; Schmid, K. DeepCob: Precise and high-throughput analysis of maize cob geometry using deep learning with an application in genebank phenomics. Plant Methods 2021, 17, 91.
- Khaki, S.; Pham, H.; Han, Y.; Kuhl, A.; Kent, W.; Wang, L.Z. DeepCorn: A semi-supervised deep learning method for high-throughput image-based corn kernel counting and yield estimation. Knowl. Based Syst. 2021, 218, 106874.
- Yang, S.; Zheng, L.; He, P.; Wu, T.; Sun, S.; Wang, M. High-throughput soybean seeds phenotyping with convolutional neural networks and transfer learning. Plant Methods 2021, 17, 50.
- Machefer, M.; Lemarchand, F.; Bonnefond, V.; Hitchins, A.; Sidiropoulos, P. Mask R-CNN refitting strategy for plant counting and sizing in UAV imagery. Remote Sens. 2020, 12, 3015.
- Li, S.; Yan, Z.; Guo, Y.; Su, X.; Cao, Y.; Jiang, B.; Yang, F.; Zhang, Z.; Xin, D.; Chen, Q.; et al. SPM-IS: An auto-algorithm to acquire a mature soybean phenotype based on instance segmentation. Crop J. 2021, 10, 1412–1423.
- Tan, C.; Li, C.; He, D.; Song, H. Anchor-free deep convolutional neural network for plant and plant organ detection and counting. In Proceedings of the 2021 ASABE Annual International Virtual Meeting, Online, 12–16 July 2021; p. 1.
- Li, Y.; Jia, J.D.; Zhang, L.; Khattak, A.M.; Sun, S.; Gao, W.L.; Wang, M.J. Soybean Seed Counting Based on Pod Image Using Two-Column Convolution Neural Network. IEEE Access 2019, 7, 64177–64185.
- Ying, W.; Yue, L.; Tingting, W.; Shi, S.; Minjuan, W. Fast Counting Method of Soybean Seeds Based on Density Estimation and VGG-Two. Smart Agric. 2021, 3, 111.
- Korav, S.; Dhaka, A.K.; Singh, R.; Premaradhya, N.; Reddy, G.C. A study on crop weed competition in field crops. J. Pharmacogn. Phytochem. 2018, 7, 3235–3240.
- Agrawal, K.N.; Singh, K.; Bora, G.C.; Lin, D. Weed recognition using image-processing technique based on leaf parameters. J. Agric. Sci. Technol. B 2012, 2, 899.
- Lin, S.M.; Jiang, Y.; Chen, X.S.; Biswas, A.; Li, S.; Yuan, Z.H.; Wang, H.L.; Qi, L. Automatic Detection of Plant Rows for a Transplanter in Paddy Field Using Faster R-CNN. IEEE Access 2020, 8, 147231–147240.
- Wang, S.S.; Zhang, W.Y.; Wang, X.S.; Yu, S.S. Recognition of rice seedling rows based on row vector grid classification. Comput. Electron. Agric. 2021, 190, 106454.
- Quan, L.; Feng, H.; Lv, Y.; Wang, Q.; Zhang, C.; Liu, J.; Yuan, Z. Maize seedling detection under different growth stages and complex field environments based on an improved Faster R–CNN. Biosyst. Eng. 2019, 184, 1–23.
- Jiang, H.H.; Zhang, C.Y.; Qiao, Y.L.; Zhang, Z.; Zhang, W.J.; Song, C.Q. CNN feature based graph convolutional network for weed and crop recognition in smart farming. Comput. Electron. Agric. 2020, 174, 105450.
- Kim, Y.H.; Park, K.R. MTS-CNN: Multi-task semantic segmentation-convolutional neural network for detecting crops and weeds. Comput. Electron. Agric. 2022, 199, 107146.
- Weirong, Z.; Haojun, W.; Chaofan, Q.; Guangyan, W. Maize Seedling and Core Detection Method Based on Mask R-CNN. Xinjiang Agric. Sci. 2021, 58, 1918–1928.
- Zhang, R.; Wang, C.; Hu, X.; Liu, Y.; Chen, S. Weed location and recognition based on UAV imaging and deep learning. Int. J. Precis. Agric. Aviat. 2020, 3, 23–29.
- Haq, M.A. CNN Based Automated Weed Detection System Using UAV Imagery. Comput. Syst. Sci. Eng. 2022, 42, 837–849.
- Babu, V.S.; Ram, N.V. Deep Residual CNN with Contrast Limited Adaptive Histogram Equalization for Weed Detection in Soybean Crops. Traitement Signal 2022, 39, 717–722.
- Hu, K.; Coleman, G.; Zeng, S.; Wang, Z.; Walsh, M. Graph weeds net: A graph-based deep learning method for weed recognition. Comput. Electron. Agric. 2020, 174, 105520.
- Olsen, A.; Konovalov, D.A.; Philippa, B.; Ridd, P.; Wood, J.C.; Johns, J.; Banks, W.; Girgenti, B.; Kenny, O.; Whinney, J.; et al. DeepWeeds: A multiclass weed species image dataset for deep learning. Sci. Rep. 2019, 9, 2058.
- Zhang, J.-L.; Su, W.-H.; Zhang, H.-Y.; Peng, Y. SE-YOLOv5x: An Optimized Model Based on Transfer Learning and Visual Attention Mechanism for Identifying and Localizing Weeds and Vegetables. Agronomy 2022, 12, 2061.
- Zhang, Z.Z.; Kayacan, E.; Thompson, B.; Chowdhary, G. High precision control and deep learning-based corn stand counting algorithms for agricultural robot. Auton. Robot. 2020, 44, 1289–1302.
- Fina, F.; Birch, P.; Young, R.; Obu, J.; Faithpraise, B.; Chatwin, C. Automatic plant pest detection and recognition using k-means clustering algorithm and correspondence filters. Int. J. Adv. Biotechnol. Res. 2013, 4, 189–199.
- Sharma, R.; Kukreja, V.; Kadyan, V. Hispa Rice Disease Classification using Convolutional Neural Network. In Proceedings of the 2021 3rd International Conference on Signal Processing and Communication (ICPSC), Tamil Nadu, India, 13–14 May 2021; pp. 377–381.
- Krishnamoorthy, N.; Prasad, L.V.N.; Kumar, C.S.P.; Subedi, B.; Abraha, H.B.; Sathishkumar, V.E. Rice leaf diseases prediction using deep neural networks with transfer learning. Environ. Res. 2021, 198, 111275.
- Singh, A.; Arora, M. CNN Based Detection of Healthy and Unhealthy Wheat Crop. In Proceedings of the 2020 International Conference on Smart Electronics and Communication (ICOSEC), Trichy, India, 10–12 September 2020; pp. 121–125.
- Kumar, D.; Kukreja, V. N-CNN Based Transfer Learning Method for Classification of Powdery Mildew Wheat Disease. In Proceedings of the 2021 International Conference on Emerging Smart Computing and Informatics (ESCI), Pune, India, 5–7 March 2021; pp. 707–710.
- Jiang, J.L.; Liu, H.Y.; Zhao, C.; He, C.; Ma, J.F.; Cheng, T.; Zhu, Y.; Cao, W.X.; Yao, X. Evaluation of Diverse Convolutional Neural Networks and Training Strategies for Wheat Leaf Disease Identification with Field-Acquired Photographs. Remote Sens. 2022, 14, 3446.
- Bari, B.S.; Islam, M.N.; Rashid, M.; Hasan, M.J.; Razman, M.A.M.; Musa, R.M.; Ab Nasir, A.F.; Majeed, A.P.P.A. A real-time approach of diagnosing rice leaf disease using deep learning-based faster R-CNN framework. PeerJ Comput. Sci. 2021, 7, e432.
- Zhou, G.X.; Zhang, W.Z.; Chen, A.B.; He, M.F.; Ma, X.S. Rapid Detection of Rice Disease Based on FCM-KM and Faster R-CNN Fusion. IEEE Access 2019, 7, 143190–143206.
- Zhang, K.K.; Wu, Q.F.; Chen, Y.P. Detecting soybean leaf disease from synthetic image using multi-feature fusion faster R-CNN. Comput. Electron. Agric. 2021, 183, 106064.
- Shrivastava, S.; Singh, S.K.; Hooda, D.S. Soybean plant foliar disease detection using image retrieval approaches. Multimed. Tools Appl. 2017, 76, 26647–26674.
- Pires, R.D.L.; Gonçalves, D.N.; Oruê, J.P.M.; Kanashiro, W.E.S.; Rodrigues, J.F., Jr.; Machado, B.B.; Gonçalves, W.N. Local descriptors for soybean disease recognition. Comput. Electron. Agric. 2016, 125, 48–55.
- Ennadifi, E.; Laraba, S.; Vincke, D.; Mercatoris, B.; Gosselin, B. Wheat Diseases Classification and Localization Using Convolutional Neural Networks and GradCAM Visualization. In Proceedings of the 2020 International Conference on Intelligent Systems and Computer Vision (ISCV), Fez, Morocco, 9–11 June 2020; pp. 1–5.
- Lin, Z.Q.; Mu, S.M.; Huang, F.; Mateen, K.A.; Wang, M.J.; Gao, W.L.; Jia, J.D. A Unified Matrix-Based Convolutional Neural Network for Fine-Grained Image Classification of Wheat Leaf Diseases. IEEE Access 2019, 7, 11570–11590.
- Su, W.-H.; Zhang, J.; Yang, C.; Page, R.; Szinyei, T.; Hirsch, C.D.; Steffenson, B. Automatic evaluation of wheat resistance to fusarium head blight using dual mask-RCNN deep learning frameworks in computer vision. Remote Sens. 2020, 13, 26.
- Gao, Y.; Wang, H.; Li, M.; Su, W.-H. Automatic Tandem Dual BlendMask Networks for Severity Assessment of Wheat Fusarium Head Blight. Agriculture 2022, 12, 1493.
- Krishnamoorthi, M.; Sankavi, R.S.; Aishwarya, V.; Chithra, B. Maize Leaf Diseases Identification using Data Augmentation and Convolutional Neural Network. In Proceedings of the 2021 2nd International Conference on Smart Electronics and Communication (ICOSEC), Trichy, India, 7–9 October 2021; pp. 1672–1677.
- Zhang, Y.; Wa, S.Y.; Liu, Y.T.; Zhou, X.Y.; Sun, P.S.; Ma, Q. High-Accuracy Detection of Maize Leaf Diseases CNN Based on Multi-Pathway Activation Function Module. Remote Sens. 2021, 13, 4218.
- Hasan, M.J.; Alom, M.S.; Dina, U.F.; Moon, M.H. Maize Diseases Image Identification and Classification by Combining CNN with Bi-Directional Long Short-Term Memory Model. In Proceedings of the 2020 IEEE Region 10 Symposium (TENSYMP), Dhaka, Bangladesh, 5–7 June 2020; pp. 1804–1807.
- Arora, J.; Agrawal, U. Classification of Maize leaf diseases from healthy leaves using Deep Forest. J. Artif. Intell. Syst. 2020, 2, 14–26.
- Karlekar, A.; Seal, A. SoyNet: Soybean leaf diseases classification. Comput. Electron. Agric. 2020, 172, 105342.
- Bao, W.X.; Yang, X.H.; Liang, D.; Hu, G.S.; Yang, X.J. Lightweight convolutional neural network model for field wheat ear disease identification. Comput. Electron. Agric. 2021, 189, 106367.
- Pan, Q.; Gao, M.; Wu, P.; Yan, J.; Li, S. A Deep-Learning-Based Approach for Wheat Yellow Rust Disease Recognition from Unmanned Aerial Vehicle Images. Sensors 2021, 21, 6540.
- Baliyan, A.; Kukreja, V.; Salonki, V.; Kaswan, K.S. Detection of Corn Gray Leaf Spot Severity Levels using Deep Learning Approach. In Proceedings of the 2021 9th International Conference on Reliability, Infocom Technologies and Optimization (Trends and Future Directions) (ICRITO), Noida, India, 3–4 September 2021; pp. 1–5.
- Samanta, R.; Ghosh, I. Tea insect pests classification based on artificial neural networks. Int. J. Comput. Eng. Sci. 2012, 2, 1–13.
- Geiger, F.; Bengtsson, J.; Berendse, F.; Weisser, W.W.; Emmerson, M.; Morales, M.B.; Ceryngier, P.; Liira, J.; Tscharntke, T.; Winqvist, C.; et al. Persistent negative effects of pesticides on biodiversity and biological control potential on European farmland. Basic Appl. Ecol. 2010, 11, 97–105.
- Clement, A.; Verfaille, T.; Lormel, C.; Jaloux, B. A new colour vision system to quantify automatically foliar discolouration caused by insect pests feeding on leaf cells. Biosyst. Eng. 2015, 133, 128–140.
- Liu, T.; Chen, W.; Wu, W.; Sun, C.; Guo, W.; Zhu, X. Detection of aphids in wheat fields using a computer vision technique. Biosyst. Eng. 2016, 141, 82–93.
- Ishengoma, F.S.; Rai, I.A.; Said, R.N. Identification of maize leaves infected by fall armyworms using UAV-based imagery and convolutional neural networks. Comput. Electron. Agric. 2021, 184, 106124.
- Tetila, E.C.; Machado, B.B.; Menezes, G.V.; de Belete, N.A.S.; Astolfi, G.; Pistori, H. A deep-learning approach for automatic counting of soybean insect pests. IEEE Geosci. Remote Sens. Lett. 2019, 17, 1837–1841.
- Abade, A.; Porto, L.F.; Ferreira, P.A.; de Vidal, F.B. NemaNet: A convolutional neural network model for identification of soybean nematodes. Biosyst. Eng. 2022, 213, 39–62.
- Li, R.; Wang, R.J.; Zhang, J.; Xie, C.J.; Liu, L.; Wang, F.Y.; Chen, H.B.; Chen, T.J.; Hu, H.Y.; Jia, X.F.; et al. An Effective Data Augmentation Strategy for CNN-Based Pest Localization and Recognition in the Field. IEEE Access 2019, 7, 160274–160283.
- Sheema, D.; Ramesh, K.; Renjith, P.N.; Lakshna, A. Comparative Study of Major Algorithms for Pest Detection in Maize Crop. In Proceedings of the 2021 International Conference on Intelligent Technologies (CONIT), Hubli, India, 25–27 June 2021; pp. 1–7.
- Verma, S.; Tripathi, S.; Singh, A.; Ojha, M.; Saxena, R.R. Insect Detection and Identification using YOLO Algorithms on Soybean Crop. In Proceedings of the TENCON 2021–2021 IEEE Region 10 Conference (TENCON), Auckland, New Zealand, 7–10 December 2021; pp. 272–277.
- Chen, P.; Li, W.L.; Yao, S.J.; Ma, C.; Zhang, J.; Wang, B.; Zheng, C.H.; Xie, C.J.; Liang, D. Recognition and counting of wheat mites in wheat fields by a three-step deep learning method. Neurocomputing 2021, 437, 21–30.
- Anwar, A.; Kim, J.K. Transgenic Breeding Approaches for Improving Abiotic Stress Tolerance: Recent Progress and Future Perspectives. Int. J. Mol. Sci. 2020, 21, 2695.
- Singh, A.; Ganapathysubramanian, B.; Singh, A.K.; Sarkar, S. Machine learning for high-throughput stress phenotyping in plants. Trends Plant Sci. 2016, 21, 110–124.
- Sethy, P.K.; Barpanda, N.K.; Rath, A.K.; Behera, S.K. Nitrogen Deficiency Prediction of Rice Crop Based on Convolutional Neural Network. J. Ambient. Intell. Humaniz. Comput. 2020, 11, 5703–5711.
- Wang, C.; Ye, Y.; Tian, Y.; Yu, Z. Classification of nutrient deficiency in rice based on CNN model with Reinforcement Learning augmentation. In Proceedings of the 2021 International Symposium on Artificial Intelligence and Its Application on Media (ISAIAM), Xi’an, China, 21–23 May 2021; pp. 107–111.
- Rizal, S.; Pratiwi, N.K.C.; Ibrahim, N.; Syalomta, N.; Nasution, M.I.K.; Mz, I.M.U.; Oktavia, D.A.P. Classification of Nutrition Deficiency in Rice Plant Using CNN. In Proceedings of the 2022 1st International Conference on Information System & Information Technology (ICISIT), Yogyakarta, Indonesia, 26–27 July 2022; pp. 382–385.
- Zhuang, S.; Wang, P.; Jiang, B.R.; Li, M.S. Learned features of leaf phenotype to monitor maize water status in the fields. Comput. Electron. Agric. 2020, 172, 105347.
- Das, S.; Christopher, J.; Apan, A.; Choudhury, M.R.; Chapman, S.; Menzies, N.W.; Dang, Y.P. Evaluation of water status of wheat genotypes to aid prediction of yield on sodic soils using UAV-thermal imaging and machine learning. Agric. For. Meteorol. 2021, 307, 108477.
- Shouche, S.P.; Rastogi, R.; Bhagwat, S.G.; Sainis, J.K. Shape analysis of grains of Indian wheat varieties. Comput. Electron. Agric. 2001, 33, 55–76.
- Laabassi, K.; Belarbi, M.A.; Mahmoudi, S.; Mahmoudi, S.A.; Ferhat, K. Wheat varieties identification based on a deep learning approach. J. Saudi Soc. Agric. Sci. 2021, 20, 281–289.
- Javanmardi, S.; Ashtiani, S.H.M.; Verbeek, F.J.; Martynenko, A. Computer-vision classification of corn seed varieties using deep convolutional neural network. J. Stored Prod. Res. 2021, 92, 101800.
- Gao, J.; Liu, C.; Han, J.; Lu, Q.; Wang, H.; Zhang, J.; Bai, X.; Luo, J. Identification Method of Wheat Cultivars by Using a Convolutional Neural Network Combined with Images of Multiple Growth Periods of Wheat. Symmetry 2021, 13, 2012.
- Velesaca, H.O.; Mira, R.; Suárez, P.L.; Larrea, C.X.; Sappa, A.D. Deep learning based corn kernel classification. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, Seattle, WA, USA, 14–19 June 2020; pp. 66–67.
- Xu, P.; Tan, Q.; Zhang, Y.; Zha, X.; Yang, S.; Yang, R. Research on Maize Seed Classification and Recognition Based on Machine Vision and Deep Learning. Agriculture 2022, 12, 232.
- ElMasry, G.; ElGamal, R.; Mandour, N.; Gou, P.; Al-Rejaie, S.; Belin, E.; Rousseau, D. Emerging thermal imaging techniques for seed quality evaluation: Principles and applications. Food Res. Int. 2020, 131, 109025. [Google Scholar] [CrossRef]
- Zhu, S.; Zhang, J.; Chao, M.; Xu, X.; Song, P.; Zhang, J.; Huang, Z. A Rapid and Highly Efficient Method for the Identification of Soybean Seed Varieties: Hyperspectral Images Combined with Transfer Learning. Molecules 2019, 25, 152. [Google Scholar] [CrossRef] [Green Version]
- Khosrokhani, M.; Nasr, A.H. Applications of the Remote Sensing Technology to Detect and Monitor the Rust Disease in the Wheat–A Literature Review. Geocarto Int. 2022, 1–27, accepted. [Google Scholar] [CrossRef]
- Murthy, V.N.; Maji, S.; Manmatha, R. Automatic image annotation using deep learning representations. In Proceedings of the 5th ACM on International Conference on Multimedia Retrieval, Shanghai, China, 23–26 June 2015; pp. 603–606. [Google Scholar]
- Blok, P.M.; Kootstra, G.; Elghor, H.E.; Diallo, B.; van Evert, F.K.; van Henten, E.J. Active learning with MaskAL reduces annotation effort for training Mask R-CNN. Comput. Electron. Agric. 2021, 197, 106917. [Google Scholar] [CrossRef]
- Li, J.; Jia, J.; Xu, D. Unsupervised representation learning of image-based plant disease with deep convolutional generative adversarial networks. In Proceedings of the 2018 37th Chinese Control Conference (CCC), Wuhan, China, 25–27 July 2018; pp. 9159–9163. [Google Scholar]
- Eckardt, J.-N.; Wendt, K.; Bornhäuser, M.; Middeke, J.M. Reinforcement learning for precision oncology. Cancers 2021, 13, 4624. [Google Scholar] [CrossRef]
- Wang, X.; Qi, G.-J. Contrastive learning with stronger augmentations. IEEE Trans. Pattern Anal. Mach. Intell. 2022, 1–12. [Google Scholar] [CrossRef] [PubMed]
- Khaled, A.Y.; Aziz, S.A.; Bejo, S.K.; Nawi, N.M.; Seman, I.A.; Onwude, D.I. Early detection of diseases in plant tissue using spectroscopy–applications and limitations. Appl. Spectrosc. Rev. 2018, 53, 36–64. [Google Scholar] [CrossRef]
- Zhang, S.M.; Li, X.H.; Ba, Y.X.; Lyu, X.G.; Zhang, M.Q.; Li, M.Z. Banana Fusarium Wilt Disease Detection by Supervised and Unsupervised Methods from UAV-Based Multispectral Imagery. Remote Sens. 2022, 14, 1231. [Google Scholar] [CrossRef]
- Allmendinger, A.; Spaeth, M.; Saile, M.; Peteinatos, G.G.; Gerhards, R. Precision Chemical Weed Management Strategies: A Review and a Design of a New CNN-Based Modular Spot Sprayer. Agronomy 2022, 12, 1620. [Google Scholar] [CrossRef]
- Bouguettaya, A.; Zarzour, H.; Kechida, A.; Taberkit, A.M. Recent Advances on UAV and Deep Learning for Early Crop Diseases Identification: A Short Review. In Proceedings of the 2021 International Conference on Information Technology (ICIT), Amman, Jordan, 14–15 July 2021; pp. 334–339. [Google Scholar]
- Ang, K.L.-M.; Seng, J.K.P. Big data and machine learning with hyperspectral information in agriculture. IEEE Access 2021, 9, 36699–36718. [Google Scholar] [CrossRef]
- Khan, S.; Naseer, M.; Hayat, M.; Zamir, S.W.; Khan, F.S.; Shah, M. Transformers in vision: A survey. ACM Comput. Surv. 2022, 54, 1–41. [Google Scholar] [CrossRef]
- Zhu, W.; Sun, J.; Wang, S.; Shen, J.; Yang, K.; Zhou, X. Identifying Field Crop Diseases Using Transformer-Embedded Convolutional Neural Network. Agriculture 2022, 12, 1083. [Google Scholar] [CrossRef]
- Wang, H.; Chen, X.; Zhang, T.; Xu, Z.; Li, J. CCTNet: Coupled CNN and Transformer Network for Crop Segmentation of Remote Sensing Images. Remote Sens. 2022, 14, 1956. [Google Scholar] [CrossRef]
- Su, W.-H.; He, H.-J.; Sun, D.-W. Non-destructive and rapid evaluation of staple foods quality by using spectroscopic techniques: A review. Crit. Rev. Food Sci. Nutr. 2017, 57, 1039–1051. [Google Scholar] [CrossRef] [PubMed]
- Su, W.-H.; Bakalis, S.; Sun, D.-W. Fingerprinting study of tuber ultimate compressive strength at different microwave drying times using mid-infrared imaging spectroscopy. Dry. Technol. 2019, 37, 1113–1130. [Google Scholar] [CrossRef]
- Liu, B.-Y.; Fan, K.-J.; Su, W.-H.; Peng, Y. Two-Stage Convolutional Neural Networks for Diagnosing the Severity of Alternaria Leaf Blotch Disease of the Apple Tree. Remote Sens. 2022, 14, 2519. [Google Scholar] [CrossRef]
- Su, W.-H.; Xue, H. Imaging Spectroscopy and Machine Learning for Intelligent Determination of Potato and Sweet Potato Quality. Foods 2021, 10, 2146. [Google Scholar] [CrossRef]
- Fan, K.J.; Su, W.H. Applications of Fluorescence Spectroscopy, RGB- and Multispectral Imaging for Quality Determinations of White Meat: A Review. Biosensors 2022, 12, 76. [Google Scholar] [CrossRef]
- Su, W.-H.; Sheng, J.; Huang, Q.-Y. Development of a Three-Dimensional Plant Localization Technique for Automatic Differentiation of Soybean from Intra-Row Weeds. Agriculture 2022, 12, 195. [Google Scholar] [CrossRef]
- Su, W.-H.; Sun, D.-W. Advanced analysis of roots and tubers by hyperspectral techniques. In Advances in Food and Nutrition Research; Fidel, T., Ed.; Elsevier: Amsterdam, The Netherlands, 2019; Volume 87, pp. 255–303. [Google Scholar]
- Su, W.-H. Advanced Machine Learning in Point Spectroscopy, RGB- and Hyperspectral-Imaging for Automatic Discriminations of Crops and Weeds: A Review. Smart Cities 2020, 3, 39. [Google Scholar] [CrossRef]
| Vision Task | Crop | Phenotyping Task | Image Type | Model | Total Samples | Accuracy | Reference |
|---|---|---|---|---|---|---|---|
| Object detection | Rice | Counting of grains per panicle | RGB | Faster R-CNN with FPN | 796 | 99.4% | Deng et al. [40] |
| Object detection | Wheat | Ear recognition | RGB | RetinaNet with FPN | 52,920 | 98.6% | Li et al. [41] |
| Object detection | Wheat | Head detection | RGB | YOLOv4 with dual SPP | 3432 | 94.5% | Gong et al. [43] |
| Object detection | Maize | Tassel counting | RGB | TasselNet (ResNet-34) | 361 | 88.97% | Zou et al. [44] |
| Object detection | Soybean | Flower and seedpod detection | RGB | Cascade R-CNN, RetinaNet, Faster R-CNN | 76,524 | AP = 89.6%; 83.3%; 88.7% | Pratama et al. [42] |
| Object detection | Soybean | Seed counting | RGB | TCNN | 32,126 | MAE = 13.21; MSE = 17.62 | Li et al. [55] |
| Object detection | Soybean | Seed counting | RGB | VGG-Two | 37,563 | MAE = 0.6; MSE = 0.6 | Ying et al. [56] |
| Semantic segmentation | Wheat | Spike counting | RGB | TasselNetv2 | 675,322 | 91.01% | Xiong et al. [48] |
| Semantic segmentation | Wheat | Spike quantification | RGB | VGG | 580,000 | 98% | Sadeghi-Tehran et al. [47] |
| Semantic segmentation | Maize | Kernel counting | RGB | VGG-16 | 19,848 | 90.48% | Khaki et al. [50] |
| Semantic segmentation | Maize | Cob geometry analysis | RGB | Mask R-CNN | 19,867 | 100% for length; 99% for diameter | Kienbaum et al. [49] |
| Instance segmentation | Soybean | Phenotype measurement | RGB | ResNet-101 with FPN | 3207 | mAP = 95.7% | Li et al. [53] |
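Several counting studies in the table above (e.g., Li et al. [55]; Ying et al. [56]) report mean absolute error (MAE) and mean squared error (MSE) rather than percentage accuracy. A minimal sketch of how those metrics are computed from per-image counts; the counts below are hypothetical, not taken from any of the cited studies:

```python
def count_errors(predicted, actual):
    """Return (MAE, MSE) between predicted and ground-truth object counts."""
    errors = [p - a for p, a in zip(predicted, actual)]
    mae = sum(abs(e) for e in errors) / len(errors)
    mse = sum(e * e for e in errors) / len(errors)
    return mae, mse

predicted = [102, 98, 110, 95]   # model counts per image (hypothetical)
actual    = [100, 100, 105, 97]  # manual counts per image (hypothetical)
print(count_errors(predicted, actual))  # → (2.75, 9.25)
```

MAE reads in the same units as the count itself, while MSE penalizes occasional large miscounts more heavily, which is why the two are usually reported together.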
| Vision Task | Crop | Phenotyping Task | Image Type | Model | Total Samples | Accuracy | Reference |
|---|---|---|---|---|---|---|---|
| Image classification | Wheat | Weed species identification | RGB | YOLOv3-tiny | 2000 | mAP = 72.5% | Zhang et al. [65] |
| Image classification | Soybean | Weed detection | RGB | CNN-LVQ | 15,000 | 99.44% | Haq [66] |
| Image classification | Soybean | Weed classification | RGB | DRCNN | 15,336 | 97.25% | Babu and Ram [67] |
| Object detection | Rice | Seedling row recognition | RGB | ResNet-18 | 4500 | 88.54% | Wang et al. [60] |
| Object detection | Rice | Seedling localization | RGB | Faster R-CNN | 240 | 89.8% | Lin et al. [59] |
| Object detection | Maize | Seedling detection | RGB | Faster R-CNN (VGG-19) | 32,354 | 97.71% | Quan et al. [61] |
| Object detection | Maize | Weed recognition | RGB | GCN-ResNet-101 | 6000 | 97.8% | Jiang et al. [62] |
| Object detection | Maize | Plant detection | RGB | Faster R-CNN | 211 | 99.8% at IoU = 0.5 | Zhang et al. [71] |
| Semantic segmentation | Rice | Weed segmentation | RGB | MTS-CNN | 224 | 96.48% | Kim and Park [63] |
| Semantic segmentation | Maize | Seedling and core detection | RGB | Mask R-CNN (ResNet-50/101-FPN) | 1800 | 94.7% | Weirong et al. [64] |
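Detection results such as Zhang et al. [71] are scored at an Intersection-over-Union (IoU) threshold: a predicted box counts as a correct detection only if its overlap with a ground-truth box reaches the threshold (here 0.5). A minimal IoU computation for axis-aligned boxes; the coordinates are illustrative only:

```python
def iou(box_a, box_b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)  # zero if boxes are disjoint
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

# A prediction shifted half a box-width from the ground truth:
print(iou((0, 0, 10, 10), (5, 0, 15, 10)))  # → 0.333..., below the 0.5 threshold
```

Raising the IoU threshold tightens the localization requirement, so the same detector typically scores lower at IoU = 0.75 than at IoU = 0.5.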
| Vision Task | Crop | Phenotyping Task | Image Type | Model | Total Samples | Accuracy | Reference |
|---|---|---|---|---|---|---|---|
| Image classification | Rice | Hispa disease classification | RGB | CNN | 1000 | 94% | Sharma et al. [73] |
| Image classification | Rice | Leaf disease recognition | RGB | InceptionResNetV2 | 5200 | 95.67% | Krishnamoorthy et al. [74] |
| Image classification | Wheat | Powdery mildew classification | RGB | N-CNN | 450 | 89.9% | Kumar and Kukreja [76] |
| Image classification | Wheat | Detection of healthy and unhealthy wheat | RGB | ResNet-101, VGG-19, AlexNet | 750 | 98.6%; 96.6%; 92.6% | Singh and Arora [75] |
| Image classification | Maize | Leaf disease identification | RGB | GoogLeNet | 8604 | 98.55% | Krishnamoorthi et al. [87] |
| Image classification | Maize | Leaf disease detection | RGB | MAF-ResNet50 | 59,778 | 97.41% | Zhang et al. [88] |
| Image classification | Maize | Disease classification | RGB | CNN with BiLSTM | 29,065 | 99.02% | Hasan et al. [89] |
| Image classification | Maize | Leaf disease classification | RGB | DeepForest (gcForest) | 400 | 96.25% | Arora et al. [90] |
| Image classification | Soybean | Leaf disease classification | RGB | SoyNet | 17,240 | 98.14% | Karlekar and Seal [91] |
| Object detection | Rice | Leaf disease detection | RGB | Faster R-CNN with RPN | 16,800 | 98.09% for blast; 98.85% for brown spot; 99.17% for hispa | Bari et al. [78] |
| Object detection | Rice | Disease detection | RGB | Faster R-CNN with FCM-KM | 3010 | 96.71% for blast; 97.53% for bacterial blight; 98.3% for blight | Zhou et al. [79] |
| Object detection | Soybean | Leaf disease detection | RGB | Multi-feature fusion Faster R-CNN | 2230 | 96.43% for virus; 87.76% for frogeye spot; 65.63% for bacterial spot | Zhang et al. [80] |
| Semantic segmentation | Wheat | Leaf disease classification | RGB | M-bCNN | 83,260 | 96.5% | Lin et al. [84] |
| Semantic segmentation | Wheat | Disease classification and localization | RGB | Mask R-CNN (DenseNet-121) | 1163 | 93.47% | Ennadifi et al. [83] |
| Semantic segmentation | Wheat | Ear disease identification | RGB | SimpleNet | 1205 | 93% for blotch; 93% for scab | Bao et al. [92] |
| Semantic segmentation | Wheat | Yellow rust recognition | RGB | PSPNet | 5580 | 98% | Pan et al. [93] |
| Semantic segmentation | Maize | Gray leaf spot severity detection | RGB | CNN | 1500 | 95.33% | Baliyan et al. [94] |
| Instance segmentation | Wheat | Evaluation of resistance to fusarium head blight (FHB) | RGB | Mask R-CNN (ResNet-101, FPN, RPN) | 17,340 | 77.76% for spike; 98.81% for diseased area; 77.19% for FHB severity | Su et al. [85] |
| Instance segmentation | Wheat | FHB severity assessment | RGB | Tandem Dual BlendMask (ResNet-50, FPN) | 3754 | 85.56% for spike; 99.32% for diseased area; 91.80% for FHB severity | Gao et al. [86] |
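The FHB studies above (Su et al. [85]; Gao et al. [86]) derive severity from two segmentation outputs: the spike region and the diseased region within it, with severity given by the diseased-to-spike pixel ratio. A sketch of that final step with tiny hypothetical binary masks:

```python
def fhb_severity(spike_mask, disease_mask):
    """Severity = diseased pixels / spike pixels, for binary masks (2-D lists)."""
    spike_px = sum(sum(row) for row in spike_mask)
    disease_px = sum(sum(row) for row in disease_mask)
    return disease_px / spike_px if spike_px else 0.0

spike = [
    [0, 1, 1, 0],
    [1, 1, 1, 1],
    [0, 1, 1, 0],
]  # 8 spike pixels (hypothetical)
disease = [
    [0, 0, 1, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 0],
]  # 2 diseased pixels (hypothetical)
print(fhb_severity(spike, disease))  # → 0.25
```

Because the ratio compounds errors from both masks, severity accuracy can trail the accuracy of either mask on its own, as the spike/diseased-area/severity triples in the table suggest.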
| Vision Task | Crop | Phenotyping Task | Image Type | Model | Total Samples | Accuracy | Reference |
|---|---|---|---|---|---|---|---|
| Image classification | Maize | Identification of leaves infested by fall armyworm | RGB | InceptionV3, MobileNetV2 | 11,280 | 100%; 100% | Ishengoma et al. [99] |
| Image classification | Soybean | Insect pest classification and counting | RGB | DenseNet-201, ResNet-50, Inception-ResNet-v2 | 10,000 | 94.89%; 93.78%; 93.40% | Tetila et al. [100] |
| Image classification | Soybean | Nematode identification | RGB | NemaNet | 3063 | 96.76% for FS; 98.82% for TL | Abade et al. [101] |
| Object detection | Wheat | Pest localization | RGB | ResNet-50 with RPN | 519,752 | 83.23% | Li et al. [102] |
| Object detection | Wheat | Mite recognition and counting | RGB | ZFNet-5, VGG-16 | 546 | 94.6%; 96.4% | Chen et al. [105] |
| Object detection | Maize | Pest detection | RGB | Faster R-CNN with RPN | 15,000 | 97.53% | Sheema et al. [103] |
| Object detection | Soybean | Insect identification | RGB | YOLOv3, YOLOv4, YOLOv5 | 3710 | Best AP = 99.5% | Verma et al. [104] |
| Vision Task | Crop | Phenotyping Task | Image Type | Model | Total Samples | Accuracy | Reference |
|---|---|---|---|---|---|---|---|
| Image classification | Rice | Nitrogen deficiency prediction | RGB | ResNet-50 + SVM | 5790 | 99.84% | Sethy et al. [108] |
| Image classification | Rice | Nutrient deficiency classification | RGB | DenseNet-121 | 1500 | 97% | Wang et al. [109] |
| Image classification | Rice | Nutrient deficiency evaluation | RGB | ResNet-50 | 1156 | 98% | Rizal et al. [110] |
| Image classification | Maize | Water stress recognition | RGB | CNN + SVM | 18,040 | 88.41% | Zhuang et al. [111] |
| Vision Task | Crop | Phenotyping Task | Image Type | Model | Total Samples | Accuracy | Reference |
|---|---|---|---|---|---|---|---|
| Image classification | Wheat | Variety identification | RGB | DenseNet, InceptionV3, MobileNet | 31,606 | 95.68%; 95.62%; 95.49% | Laabassi et al. [114] |
| Image classification | Wheat | Cultivar identification | RGB | ResNet-50, SE-ResNet, SE-ResNeXt | 4540 | 92.07% | Gao et al. [116] |
| Image classification | Maize | Seed classification | RGB | P-ResNet | 8080 | 99.7% | Xu et al. [118] |
| Object detection | Maize | Seed variety classification | RGB | VGG-16 | 9000 | 98.1% | Javanmardi et al. [115] |
| Instance segmentation | Maize | Seed variety classification | RGB | Mask R-CNN, VGG-16, ResNet-50, CK-CNN | 16,500 | 64.7%; 89%; 92.5%; 95.6% | Velesaca et al. [117] |
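Most classification entries in these tables report top-1 accuracy: the fraction of images whose highest-scoring class matches the true label. A minimal sketch over hypothetical class scores (none of the numbers come from the cited studies):

```python
def top1_accuracy(scores, labels):
    """Top-1 accuracy: argmax of each score row vs. the true class index."""
    preds = [max(range(len(row)), key=row.__getitem__) for row in scores]
    return sum(p == y for p, y in zip(preds, labels)) / len(labels)

# Four images, three seed-variety classes (scores are hypothetical):
scores = [[0.7, 0.2, 0.1], [0.1, 0.8, 0.1], [0.3, 0.3, 0.4], [0.5, 0.4, 0.1]]
labels = [0, 1, 1, 0]  # third image is misclassified (predicted class 2)
print(top1_accuracy(scores, labels))  # → 0.75
```

Comparing top-1 accuracies is only meaningful on a shared dataset and class set, which is one reason the per-study figures above should not be ranked against each other directly.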
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Wang, Y.-H.; Su, W.-H. Convolutional Neural Networks in Computer Vision for Grain Crop Phenotyping: A Review. Agronomy 2022, 12, 2659. https://doi.org/10.3390/agronomy12112659