Classification of Fluorescently Labelled Maize Kernels Using Convolutional Neural Networks
Abstract
1. Introduction
2. Materials and Methods
2.1. Maize Kernel Samples
2.2. Machine Vision System
2.2.1. Image Acquisition Hardware
2.2.2. Excitation Light Source and Filter Selection
2.3. Maize Kernel Sorting System
2.3.1. Structural Design
2.3.2. Control System Design
2.4. Image Acquisition and Dataset Production
2.5. Kernel Sorting Based on YOLOv5s Network
2.5.1. YOLOv5s Network
2.5.2. Enhancing the YOLOv5s Kernel Detection Network
- Data augmentation via gamma transformation
- Addition of the CBAM module
- Improving predicted bounding box accuracy with CIOU_Loss
2.5.3. Model Parameters and Assessment Indicators
3. Results and Discussion
3.1. Comparison of Model Results
3.2. Experiments and Discussion
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
| Category | Type | Identifier |
|---|---|---|
| Fertile line fluorescent kernels | mCherry fluorescent proteins | mCherry-1 |
| Fertile line fluorescent kernels | DsRed fluorescent proteins | DsRed-2 |
| Sterile line non-fluorescent kernels | No fluorescent proteins | Non-3 |
| Excitation Light | Filter Central Wavelength | Identifier |
|---|---|---|
| Yellow LED light source | 645 nm | Yellow+645 |
| Yellow LED light source | 650 nm | Yellow+650 |
Colour moments measured under two light source combinations: the 645 nm narrow-band filter with the yellow LED, and the 650 nm narrow-band filter with the yellow LED.

| Colour Moment | mCherry Fluorescent (645 nm) | DsRed Fluorescent (645 nm) | Non-Fluorescent (645 nm) | mCherry Fluorescent (650 nm) | DsRed Fluorescent (650 nm) | Non-Fluorescent (650 nm) |
|---|---|---|---|---|---|---|
| R mean | 2.9193 | 0.3407 | 0.2607 | 3.3229 | 0.9551 | 0.6344 |
| G mean | 0.9948 | 0.0070 | 0.0008 | 1.0079 | 0.3876 | 0.3271 |
| B mean | 1.6094 | 0.0931 | 0.1490 | 1.7842 | 0.2043 | 0.0909 |
| R second moment | 26.04787 | 3.9312 | 2.4627 | 28.0582 | 8.8454 | 6.3863 |
| G second moment | 39.2998 | 0.0075 | 0.0532 | 9.1578 | 3.5720 | 3.2688 |
| B second moment | 60.1261 | 1.0997 | 1.4021 | 15.4567 | 1.9155 | 0.9588 |
| R third moment | 117.3413 | 9.3319 | 5.3579 | 57.0579 | 19.0368 | 14.1845 |
| G third moment | 39.2998 | 0.0075 | 0.0532 | 9.1578 | 3.5720 | 3.2687 |
| B third moment | 73.0132 | 2.6273 | 3.0805 | 32.1373 | 4.1688 | 2.1865 |
| H mean | 1.7845 | 0.1232 | 0.0043 | 1.4268 | 0.2006 | 0.0909 |
| S mean | 1.0079 | 0.0009 | 0.0002 | 0.8058 | 0.3839 | 0.3271 |
| V mean | 3.3229 | 0.2473 | 0.0019 | 2.8280 | 0.9426 | 0.6344 |
| H second moment | 15.4567 | 1.1981 | 0.2997 | 13.2123 | 1.8943 | 0.9587 |
| S second moment | 9.1578 | 0.0219 | 0.0189 | 7.8973 | 3.5609 | 3.2687 |
| V second moment | 28.0582 | 2.4308 | 0.1334 | 25.0905 | 8.7925 | 6.3862 |
| H third moment | 32.1373 | 2.6146 | 1.2391 | 28.5106 | 4.1333 | 2.1864 |
| S third moment | 15.7378 | 0.1438 | 0.1438 | 17.7489 | 7.6404 | 7.2276 |
| V third moment | 57.0579 | 5.3356 | 0.5823 | 52.1457 | 18.9703 | 14.1846 |
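The colour moments tabulated above are low-order channel statistics. As a minimal sketch (not the authors' code), assuming the common definitions — first moment = channel mean, second moment = standard deviation, third moment = signed cube root of the third central moment — such features can be computed for the R, G, B and H, S, V channels with OpenCV and NumPy; the file name `kernel.png` and the function names are placeholders:

```python
import cv2            # OpenCV, assumed available for colour-space conversion
import numpy as np

def colour_moments(channel: np.ndarray) -> tuple[float, float, float]:
    """First three colour moments of one channel: mean, standard deviation,
    and the signed cube root of the third central moment (skewness-like)."""
    c = channel.astype(np.float64).ravel()
    mean = float(c.mean())
    second = float(np.sqrt(((c - mean) ** 2).mean()))
    third_central = float(((c - mean) ** 3).mean())
    third = float(np.sign(third_central) * abs(third_central) ** (1.0 / 3.0))
    return mean, second, third

def kernel_colour_features(bgr_image: np.ndarray) -> dict[str, tuple[float, float, float]]:
    """18 colour-moment features: three moments for each of B, G, R, H, S, V."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    features = {}
    for name, ch in zip("BGR", cv2.split(bgr_image)):
        features[name] = colour_moments(ch)
    for name, ch in zip("HSV", cv2.split(hsv)):
        features[name] = colour_moments(ch)
    return features

if __name__ == "__main__":
    img = cv2.imread("kernel.png")   # placeholder path to a segmented kernel image
    if img is None:
        raise FileNotFoundError("kernel.png not found; substitute a real image path")
    print("R mean / 2nd / 3rd moment:", kernel_colour_features(img)["R"])
```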
| Algorithm | AP50 | F1 Score | Inference Time |
|---|---|---|---|
| Fast R-CNN | 85.41% | 86.15% | 22.6 ms |
| YOLOv3 | 84.92% | 89.56% | 18.3 ms |
| YOLOv5s | 85.67% | 91.31% | 8.6 ms |
| Improved YOLOv5s | 95.71% | 96.26% | 9.1 ms |
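For reference, AP50 in the table above is the average precision evaluated at an IoU threshold of 0.5, and the F1 score is the harmonic mean of precision and recall (standard definitions, not restated from the paper):

```latex
F_1 = \frac{2 \cdot \mathrm{Precision} \cdot \mathrm{Recall}}{\mathrm{Precision} + \mathrm{Recall}}
```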
| Algorithm | TP | TN | FP | FN | Precision | Recall | Accuracy |
|---|---|---|---|---|---|---|---|
| YOLOv3 | 436 | 456 | 44 | 64 | 90.83% | 87.20% | 89.20% |
| YOLOv5s | 446 | 466 | 34 | 54 | 92.92% | 89.20% | 91.20% |
| Improved YOLOv5s | 480 | 480 | 20 | 20 | 96.00% | 96.00% | 96.00% |
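The precision, recall, and accuracy columns follow the usual confusion-matrix definitions; substituting the Improved YOLOv5s counts reproduces the tabulated values:

```latex
\begin{aligned}
\mathrm{Precision} &= \frac{TP}{TP + FP} = \frac{480}{480 + 20} = 96.00\%\\[2pt]
\mathrm{Recall}    &= \frac{TP}{TP + FN} = \frac{480}{480 + 20} = 96.00\%\\[2pt]
\mathrm{Accuracy}  &= \frac{TP + TN}{TP + TN + FP + FN} = \frac{480 + 480}{1000} = 96.00\%
\end{aligned}
```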
| Algorithm | Kernel Interval Time (s) | Kernel Drop Speed (kernels/min) | Conveyor Belt Speed (m/s) | TP | TN | FP | FN | Precision | Recall | Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|
| Improved YOLOv5s | 0.5 | 360 | 0.060 | 476 | 468 | 32 | 24 | 93.70% | 95.20% | 94.40% |
| Improved YOLOv5s | 0.45 | 405 | 0.067 | 470 | 465 | 35 | 30 | 93.07% | 94.00% | 93.50% |
| Improved YOLOv5s | 0.4 | 450 | 0.075 | 461 | 460 | 40 | 39 | 92.02% | 92.20% | 92.10% |
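As an arithmetic consistency check on the sorting-experiment settings (an observation from the tabulated values, not a claim made in the paper), the product of kernel interval time and conveyor belt speed stays close to 0.030 m in all three settings, which suggests the belt speed is scaled to keep the spacing between successive kernels roughly constant:

```latex
0.5\,\mathrm{s} \times 0.060\,\tfrac{\mathrm{m}}{\mathrm{s}} = 0.030\,\mathrm{m},\qquad
0.45\,\mathrm{s} \times 0.067\,\tfrac{\mathrm{m}}{\mathrm{s}} \approx 0.030\,\mathrm{m},\qquad
0.4\,\mathrm{s} \times 0.075\,\tfrac{\mathrm{m}}{\mathrm{s}} = 0.030\,\mathrm{m}
```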
Wang, Z.; Guan, B.; Tang, W.; Wu, S.; Ma, X.; Niu, H.; Wan, X.; Zang, Y. Classification of Fluorescently Labelled Maize Kernels Using Convolutional Neural Networks. Sensors 2023, 23, 2840. https://doi.org/10.3390/s23052840