From Organelle Morphology to Whole-Plant Phenotyping: A Phenotypic Detection Method Based on Deep Learning
Abstract
1. Introduction
- Deep learning was used to classify Arabidopsis thaliana at scales ranging from the macro (whole-plant) level to the micro (organelle) level.
- A multi-output (multi-task) model was used to study Arabidopsis thaliana phenotypes, classifying genotypes and predicting the growth state of the plant (a minimal architectural sketch follows this list).
- Arabidopsis organelles (chloroplasts, mitochondria, and peroxisomes) were identified with an improved multi-task model to test the transferability of the method.
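
As an illustration of the multi-output design described in the highlights, the sketch below pairs a shared convolutional backbone with a classification head (genotype) and a regression head (growth days). This is a minimal sketch under assumed choices (ResNet-18 backbone, layer sizes, equal loss weighting); it is not the authors' exact architecture.

```python
# Minimal sketch of a multi-output (multi-task) CNN: a shared convolutional backbone
# with one classification head (genotype) and one regression head (growth days).
# The backbone, layer sizes, and loss weighting are illustrative assumptions.
import torch
import torch.nn as nn
from torchvision import models


class MultiOutputPhenotypeNet(nn.Module):
    def __init__(self, num_genotypes: int = 4):
        super().__init__()
        # Shared feature extractor (assumed ResNet-18 backbone, no pretrained weights).
        backbone = models.resnet18(weights=None)
        feat_dim = backbone.fc.in_features
        backbone.fc = nn.Identity()
        self.backbone = backbone
        # Task 1: genotype classification (e.g., Cvi, Col-0, Ler-1, Sf-2).
        self.cls_head = nn.Linear(feat_dim, num_genotypes)
        # Task 2: growth-state regression (e.g., days after sowing).
        self.reg_head = nn.Linear(feat_dim, 1)

    def forward(self, x):
        feats = self.backbone(x)
        return self.cls_head(feats), self.reg_head(feats).squeeze(1)


def multitask_loss(logits, days_pred, genotype_true, days_true, w_reg: float = 1.0):
    """Joint loss: cross-entropy for classification plus weighted MSE for regression."""
    loss_cls = nn.functional.cross_entropy(logits, genotype_true)
    loss_reg = nn.functional.mse_loss(days_pred, days_true)
    return loss_cls + w_reg * loss_reg


if __name__ == "__main__":
    model = MultiOutputPhenotypeNet()
    images = torch.randn(2, 3, 320, 320)  # matches the 320 x 320 inputs used for the plant images
    logits, days = model(images)
    loss = multitask_loss(logits, days, torch.tensor([0, 2]), torch.tensor([10.0, 24.0]))
    print(logits.shape, days.shape, loss.item())
```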
2. Materials and Methods
2.1. Materials
2.2. CNN
2.3. Experiment Process
2.3.1. Image Data Preprocessing
2.3.2. Experimental Data Division
2.3.3. Multiple Output Model Structure
2.3.4. Network Training Parameters
2.3.5. Model Performance Metrics
3. Results
3.1. The Performance of Multi-Output Model Identification in Arabidopsis (with Background)
3.2. The Performance of Multi-Output Model Identification in Arabidopsis (without Background)
3.3. Performance of Multiple Output Models in Organelle Recognition
4. Discussion
4.1. Advantages of the Multiple-Output Model
4.2. Effects of Images with and without Background on Model Performance
4.3. The Effect of Increasing Interval Days on the Model
4.4. Effect of Fluorescence on Image Visualization
4.5. Discussion of Image Data and Deep Learning
4.6. Discussion of Deep Learning Models
4.7. Future Work
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
| Category | Resolution | Training Set | Validation Set | Total |
|---|---|---|---|---|
| Cvi | 320 × 320 | 385 | 165 | 550 |
| Col-0 | 320 × 320 | 370 | 158 | 528 |
| Ler-1 | 320 × 320 | 400 | 172 | 572 |
| Sf-2 | 320 × 320 | 339 | 145 | 484 |
| Category | Resolution | Training Set | Validation Set | Total |
|---|---|---|---|---|
| Chloroplast | 1024 × 1024 | 210 | 90 | 300 |
| Mitochondria | 1024 × 1024 | 210 | 90 | 300 |
| Peroxisome | 1024 × 1024 | 210 | 90 | 300 |
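
The two dataset tables above reflect a roughly 70/30 split of each category into training and validation sets. The snippet below is a minimal sketch of how such a split might be reproduced, assuming a folder-per-category image layout; the directory path, file extensions, split ratio, and random seed are illustrative assumptions rather than the authors' exact procedure.

```python
# Sketch of a per-category 70/30 training/validation split for a folder-per-class
# image dataset. Paths, extensions, and the seed are hypothetical.
import random
from pathlib import Path


def split_dataset(root: str, train_frac: float = 0.7, seed: int = 42):
    """Return {category: (train_paths, val_paths)} for a folder-per-class image set."""
    rng = random.Random(seed)
    splits = {}
    for class_dir in sorted(Path(root).iterdir()):
        if not class_dir.is_dir():
            continue
        images = sorted(class_dir.glob("*.png")) + sorted(class_dir.glob("*.jpg"))
        rng.shuffle(images)
        n_train = int(len(images) * train_frac)
        splits[class_dir.name] = (images[:n_train], images[n_train:])
    return splits


if __name__ == "__main__":
    # Hypothetical dataset location; substitute the actual path.
    for name, (train, val) in split_dataset("data/arabidopsis").items():
        print(f"{name}: {len(train)} training / {len(val)} validation")
```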
Classification metrics: Accuracy, F1 Score, FPR; regression metrics: RMSE, R², RPD.

| Model | Accuracy (%) | F1 Score (%) | FPR | RMSE | R² | RPD | Training Time |
|---|---|---|---|---|---|---|---|
| Model 1 | 99.92 | 99.85 | 0.0051 | 1.5631 | 0.9377 | 3.8974 | 3676 |
| Model 2 | 93.36 | 86.62 | 0.0444 | 2.4109 | 0.8518 | 2.4697 | 3253 |
| Task 1 | 98.97 | 97.91 | 0.0067 | 2.0202 | 0.8962 | 2.7532 | 356 |
| Task 2 | 98.55 | 97.17 | 0.0098 | 2.6659 | 0.8389 | 2.4060 | 261 |
| Task 3 | 97.43 | 94.93 | 0.0168 | 2.9951 | 0.8135 | 1.8728 | 204 |
| Task 4 | 94.88 | 90.01 | 0.0343 | 2.7598 | 0.8374 | 2.1739 | 159 |
| Task 5 | 96.00 | 91.98 | 0.0274 | 2.5517 | 0.8441 | 2.1249 | 127 |
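
The regression columns of the table above (RMSE, R², RPD) follow standard definitions; the sketch below shows how they are conventionally computed. Whether the authors use the sample or population standard deviation for RPD is an assumption, and the toy values are purely illustrative.

```python
# Conventional computation of the regression metrics reported above (RMSE, R2, RPD).
# The exact variants used by the authors (e.g., sample vs. population standard
# deviation for RPD) are assumptions.
import numpy as np


def regression_metrics(y_true, y_pred):
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    rmse = float(np.sqrt(np.mean((y_true - y_pred) ** 2)))
    ss_res = float(np.sum((y_true - y_pred) ** 2))
    ss_tot = float(np.sum((y_true - y_true.mean()) ** 2))
    r2 = 1.0 - ss_res / ss_tot
    rpd = float(np.std(y_true, ddof=1)) / rmse  # ratio of performance to deviation
    return {"RMSE": rmse, "R2": r2, "RPD": rpd}


if __name__ == "__main__":
    # Toy growth-day values (days after sowing), purely for demonstration.
    true_days = [10, 12, 14, 16, 18, 20, 22, 24]
    pred_days = [11, 12, 13, 17, 17, 21, 22, 25]
    print(regression_metrics(true_days, pred_days))
```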