Deep Learning in Controlled Environment Agriculture: A Review of Recent Advancements, Challenges and Prospects
Abstract
1. Introduction
1.1. Review Scope
1.2. Paper Organization
2. Research Methodology
2.1. Review Protocol
2.2. Research Questions
- RQ.1: What are the most frequently used DL models in CEA, and what are their benefits and drawbacks?
- RQ.2: What are the main application domains of DL in CEA?
- RQ.3: What evaluation parameters are used for DL models in CEA?
- RQ.4: What are the DL backbone networks used in CEA applications?
- RQ.5: What are the optimization methods used for CEA applications?
- RQ.6: What are the primary growing media and plants used for DL in CEA?
2.3. Search Method
- Science Direct: (“controlled environment agriculture” OR “greenhouse” OR “plant factory” OR “vertical farm”) AND (“Deep Learning”) NOT (“Internet of Things” OR “GREENHOUSE GAS” OR “gas emissions” OR “Machine learning”)
- Wiley: (“controlled environment agriculture” OR “greenhouse” OR “plant factory” OR “vertical farm*”) AND (“deep learning”) NOT (“Internet of Things” OR “greenhouse gas” OR “Gas emissions” OR “machine learning” OR “Review”)
- Web of Science: (AB = (((“controlled environment agriculture” OR “vertical farm” OR “greenhouse” OR “plant factory”) AND (“deep learning” ) NOT ( “Gas Emissions” OR “Internet of Things” OR “Greenhouse Gas” OR “machine learning” OR “Review”))))
- Springer Link: (“deep learning”) AND (“Greenhouse” OR “controlled environment agriculture” OR “vertical farm” OR “plant factory”) NOT (“Internet of things” OR “review” OR “survey” OR “greenhouse gas” OR “IoT” OR “machine learning” OR “gas emissions”)
- Google Scholar: “greenhouse” OR “vertical farm” OR “controlled environment agriculture” OR “plant factory” “deep learning” -“Internet of Things” -“IoT” -“greenhouse gas” -“review” -“survey” -“greenhouse gases” -“Gas Emissions” -“machine learning”
- Scopus: TITLE-ABS-KEY ((“deep learning”) AND (“vertical farm*” OR “controlled environment agriculture” OR “plant factory” OR “greenhouse”)) AND (LIMIT-TO (PUBYEAR, 2022 ) OR LIMIT-TO (PUBYEAR, 2021) OR LIMIT-TO ( PUBYEAR, 2020) OR LIMIT-TO ( PUBYEAR, 2019 )) AND (LIMIT-TO (LANGUAGE, “English” )) AND (EXCLUDE (EXACTKEYWORD, “Greenhouse Gases”) OR EXCLUDE ( EXACTKEYWORD, “Gas Emissions”) OR EXCLUDE (EXACTKEYWORD, “Machine Learning”) OR EXCLUDE (EXACTKEYWORD, “Internet of Things”))
- IEEEXplore: (“controlled environment agriculture” OR “greenhouse” OR “plant factory” OR “vertical farm”) AND (“Deep Learning”) NOT (“Internet of Things” OR “GREENHOUSE GAS” OR “gas emissions” OR “Machine learning”)
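These strings share one include/exclude vocabulary, adapted to each engine's operator syntax. A minimal Python sketch (a hypothetical helper, not part of the published protocol) of how such strings can be assembled consistently:

```python
# Hypothetical helper assembling the shared include/exclude vocabulary into
# a boolean query string (Science Direct / IEEE Xplore style); quoting and
# operator syntax differ per database and would need per-engine variants.

INCLUDE = ['"controlled environment agriculture"', '"greenhouse"',
           '"plant factory"', '"vertical farm"']
EXCLUDE = ['"Internet of Things"', '"greenhouse gas"',
           '"gas emissions"', '"machine learning"']

def build_query(topic: str = '"deep learning"') -> str:
    """Compose (INCLUDE terms) AND (topic) NOT (EXCLUDE terms)."""
    return (f"({' OR '.join(INCLUDE)}) AND ({topic}) "
            f"NOT ({' OR '.join(EXCLUDE)})")

if __name__ == "__main__":
    print(build_query())
```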
2.4. Selection/Inclusion Criteria
- IC.1: Peer-reviewed journal publications and conference papers.
- IC.2: Studies published between 2019 and April 2022.
- IC.3: Studies should offer answers to the RQs.
- EC.1: Study unrelated to DL for CEA.
- EC.2: Full text not accessible.
- EC.3: Duplicate or obtained from another database.
- EC.4: Publication is a review or survey article.
- EC.5: Publications that are not peer reviewed, such as book reviews, editorials, and summaries of conferences and seminars.
- EC.6: Studies published before 2019.
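Applied to the retrieved records, these criteria amount to a simple screening filter. A minimal sketch of that pass (the record fields are assumptions for illustration; relevance judgments such as EC.1 were made manually in the review):

```python
# Illustrative screening pass mirroring IC.1-IC.3 and EC.1-EC.6; field names
# are assumptions, not part of the review protocol.
from dataclasses import dataclass

@dataclass
class Record:
    title: str
    year: int
    peer_reviewed: bool   # IC.1 / EC.5
    full_text: bool       # EC.2
    doi: str              # used to drop duplicates (EC.3)
    is_review: bool       # EC.4
    dl_for_cea: bool      # EC.1, judged manually against the RQs (IC.3)

def screen(records):
    seen, kept = set(), []
    for r in records:
        if not (r.peer_reviewed and r.full_text and r.dl_for_cea):
            continue
        if r.is_review or r.year < 2019:   # EC.4, EC.6 / IC.2
            continue
        if r.doi in seen:                  # EC.3
            continue
        seen.add(r.doi)
        kept.append(r)
    return kept
```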
2.5. Data Extraction
3. Deep Learning in CEA
3.1. Deep Learning in Greenhouses
3.1.1. Microclimate Condition Prediction
3.1.2. Yield Estimation
3.1.3. Disease Detection and Classification
3.1.4. Growth Monitoring
3.1.5. Nutrient Detection and Estimation
3.1.6. Small Insect Detection
3.1.7. Robotic Harvesting
3.1.8. Others
3.2. Deep Learning in Indoor Farms
3.2.1. Stress-Level Monitoring
3.2.2. Growth Monitoring
3.2.3. Yield Estimation
4. Discussion
4.1. Summary of Reviewed Studies
4.2. Evaluation Parameters
4.3. DL Backbone Networks
4.4. Optimizer
4.5. Growing Medium and Plant Distribution
4.6. Challenges and Future Directions
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
AE | Autoencoder |
AI | Artificial Intelligence |
Adam | Adaptive Moment Estimation |
AGA | Average Gripping Accuracy |
ANN | Artificial Neural Networks |
AP | Average Precision |
CEA | Controlled Environment Agriculture |
CNN | Convolutional Neural Networks |
DBN | Deep Belief Network |
DCNN | Deep Convolutional Neural Networks |
DL | Deep Learning |
DM | Downy Mildew |
DNN | Deep Neural Networks |
FPN | Feature Pyramid Networks |
GRU | Gated Recurrent Unit |
HSI | Hue, Saturation, Intensity |
HSV | Hue, Saturation, Value |
IoU | Intersection Over Union |
LDPE | Low-density Polyethylene |
LiDAR | Light Detection and Ranging |
LiPo | Lithium-ion Polymer |
LSTM | Long Short-Term Memory |
mAP | Mean Average Precision |
MAPE | Mean Absolute Percentage Error |
NS | Not Specified |
NTM | Neural Turing Machines |
P | Precision |
PM | Powdery Mildew |
PSNR | Peak Signal-to-Noise Ratio |
R | Recall |
R2 | R-Square |
RBM | Restricted Boltzmann machine |
R-CNN | Region-Based Convolutional Neural Networks |
ResNet | Residual Networks |
RMSE | Root Mean Square Error |
RMSProp | Root Mean Squared Propagation |
RNN | Recurrent Neural Networks |
RPN | Region Proposal Network |
RQ | Research Questions |
SEP | Standard Error of Prediction |
SGD | Stochastic Gradient Descent |
SLR | Systematic Literature Review |
SSD | Single Shot Multibox Detector |
SSIM | Structural Similarity Index Measure |
STN | Spatial Transformer Network |
SVM | Support Vector Machine |
TCN | Temporal Convolutional Networks |
VGG | Visual Geometry Group |
VPD | Vapor Pressure Deficit |
YOLO | You Only Look Once |
References
- World Health Organization. The State of Food Security and Nutrition in the World 2018: Building Climate Resilience for Food Security and Nutrition; Food and Agriculture Organization: Rome, Italy, 2018. [Google Scholar]
- Avtar, R.; Tripathi, S.; Aggarwal, A.K.; Kumar, P. Population–Urbanization–Energy Nexus: A Review. Resources 2019, 8, 136. [Google Scholar] [CrossRef] [Green Version]
- Benke, K.; Tomkins, B. Future Food-Production Systems: Vertical Farming and Controlled-Environment Agriculture. Sustain. Sci. Pract. Policy 2017, 13, 13–26. [Google Scholar] [CrossRef] [Green Version]
- Saad, M.H.M.; Hamdan, N.M.; Sarker, M.R. State of the Art of Urban Smart Vertical Farming Automation System: Advanced Topologies, Issues and Recommendations. Electronics 2021, 10, 1422. [Google Scholar] [CrossRef]
- Fortune Business Insights. Vertical Farming Market to Rise at 25.2% CAGR by 2028; Increasing Number of Product Launches Will Aid Growth, Says Fortune Business Insights™. Available online: https://www.globenewswire.com/news-release/2021/06/08/2243245/0/en/vertical-farming-market-to-rise-at-25-2-cagr-by-2028-increasing-number-of-product-launches-will-aid-growth-says-fortune-business-insights.html (accessed on 18 July 2022).
- Cision. United States $3 Billion Vertical Farming Market to 2024: Growing Popularity of Plug & Play Farms Scope for Automation Using Big Data and AI. Based on Report, Vertical Farming Market in the U.S.—Industry Outlook and Forecast 2019–2024”, by Research and Markets. Available online: https://www.prnewswire.com/news-releases/united-states-3-billion-vertical-farming-market-to-2024-growing-popularity-of-plug--play-farms--scope-for-automation-using-big-data-and-ai-300783042.html (accessed on 18 July 2022).
- Asseng, S.; Guarin, J.R.; Raman, M.; Monje, O.; Kiss, G.; Despommier, D.D.; Meggers, F.M.; Gauthier, P.P. Wheat Yield Potential in Controlled-Environment Vertical Farms. Proc. Natl. Acad. Sci. USA 2020, 117, 19131–19135. [Google Scholar] [CrossRef] [PubMed]
- Naus, T. Is Vertical Farming Really Sustainable. EIT Food. Available online: https://www.eitfood.eu/blog/post/is-vertical-farming-really-sustainable (accessed on 18 July 2022).
- Chia, T.-C.; Lu, C.-L. Design and Implementation of the Microcontroller Control System for Vertical-Garden Applications. In Proceedings of the 2011 Fifth International Conference on Genetic and Evolutionary Computing, Xiamen, China, 29 August–1 September 2011; pp. 139–141. [Google Scholar]
- Michael, G.; Tay, F.; Then, Y. Development of Automated Monitoring System for Hydroponics Vertical Farming. J. Phys. Conf. 2021, 1844, 012024. [Google Scholar] [CrossRef]
- Bhowmick, S.; Biswas, B.; Biswas, M.; Dey, A.; Roy, S.; Sarkar, S.K. Application of IoT-Enabled Smart Agriculture in Vertical Farming. In Advances in Communication, Devices and Networking, Lecture Notes in Electrical Engineering; Springer: Singapore, 2019; Volume 537, pp. 521–528. [Google Scholar]
- Monteiro, J.; Barata, J.; Veloso, M.; Veloso, L.; Nunes, J. Towards Sustainable Digital Twins for Vertical Farming. In Proceedings of the 2018 Thirteenth International Conference on Digital Information Management (ICDIM), Berlin, Germany, 24–26 September 2018; pp. 234–239. [Google Scholar]
- Siregar, R.R.A.; Palupiningsih, P.; Lailah, I.S.; Sangadji, I.B.; Sukmajati, S.; Pahiyanti, A.N.G. Automatic Watering Systems in Vertical Farming Using the Adaline Algorithm. In Proceedings of the International Seminar of Science and Applied Technology (ISSAT 2020), Virtual, 24 November 2020; pp. 429–435. [Google Scholar]
- Ruscio, F.; Paoletti, P.; Thomas, J.; Myers, P.; Fichera, S. Low-cost Monitoring System for Hydroponic Urban Vertical Farms. Int. J. Agric. Biosyst. Eng. 2019, 13, 267–271. [Google Scholar]
- Leblanc, R. What You Should Know about Vertical Farming. Available online: https://www.thebalancesmb.com/what-you-should-know-about-vertical-farming-4144786 (accessed on 18 July 2022).
- Statista-Research. Labor Demand for Indoor Farming Worldwide as of 2016, by Farm Size. Available online: https://www.statista.com/statistics/752196/labor-demand-for-indoor-farming-by-farm-size/ (accessed on 18 July 2022).
- Iron-OX. Available online: https://ironox.com/technology/ (accessed on 18 July 2022).
- Bac, C.W.; Hemming, J.; Henten, E.J.V. Stem Localization of Sweet-Pepper Plants using the Support Wire as a Visual Cue. Comput. Electron. Agric. 2014, 105, 111–120. [Google Scholar] [CrossRef]
- Feng, Q.; Zou, W.; Fan, P.; Zhang, C.; Wang, X. Design and Test of Robotic Harvesting System for Cherry Tomato. Int. J. Agric. Biol. Eng. 2018, 11, 96–100. [Google Scholar] [CrossRef]
- Yaguchi, H.; Nagahama, K.; Hasegawa, T.; Inaba, M. Development of an Autonomous Tomato Harvesting Robot with Rotational Plucking Gripper. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea, 9–14 October 2016; pp. 652–657. [Google Scholar]
- Tsoulias, N.; Paraforos, D.S.; Xanthopoulos, G.; Zude-Sasse, M. Apple Shape Detection Based on Geometric and Radiometric Features using a LiDAR Laser Scanner. Remote Sens. 2020, 12, 2481. [Google Scholar] [CrossRef]
- Kamilaris, A.; Prenafeta-Boldú, F.X. Deep Learning in Agriculture: A Survey. Comput. Electron. Agric. 2018, 147, 70–90. [Google Scholar] [CrossRef] [Green Version]
- Koirala, A.; Walsh, K.B.; Wang, Z.; McCarthy, C. Deep Learning–Method Overview and Review of Use for Fruit Detection and Yield Estimation. Comput. Electron. Agric. 2019, 162, 219–234. [Google Scholar] [CrossRef]
- Saleem, M.H.; Potgieter, J.; Arif, K.M. Plant Disease Detection and Classification by Deep Learning. Plants 2019, 8, 468. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Zhang, Q.; Liu, Y.; Gong, C.; Chen, Y.; Yu, H. Applications of Deep Learning for Dense Scenes Analysis in Agriculture: A Review. Sensors 2020, 20, 1520. [Google Scholar] [CrossRef] [Green Version]
- Li, L.; Zhang, S.; Wang, B. Plant Disease Detection and Classification by Deep Learning—A Review. IEEE Access 2021, 9, 56683–56698. [Google Scholar] [CrossRef]
- Hasan, A.M.; Sohel, F.; Diepeveen, D.; Laga, H.; Jones, M.G. A Survey of Deep Learning Techniques for Weed Detection from Images. Comput. Electron. Agric. 2021, 184, 106067. [Google Scholar] [CrossRef]
- Darwin, B.; Dharmaraj, P.; Prince, S.; Popescu, D.E.; Hemanth, D.J. Recognition of Bloom/Yield in Crop Images Using Deep Learning Models for Smart Agriculture: A Review. Agronomy 2021, 11, 646. [Google Scholar] [CrossRef]
- Okoli, C.; Schabram, K. A Guide to Conducting a Systematic Literature Review of Information Systems Research; Elsevier: Amsterdam, The Netherlands, 2010. [Google Scholar]
- Nam, D.S.; Moon, T.; Lee, J.W.; Son, J.E. Estimating Transpiration Rates of Hydroponically-Grown Paprika Via an Artificial Neural Network Using Aerial and Root-Zone Environments and Growth Factors in Greenhouses. Hortic. Environ. Biotechnol. 2019, 60, 913–923. [Google Scholar] [CrossRef]
- Gong, L.; Yu, M.; Jiang, S.; Cutsuridis, V.; Pearson, S. Deep Learning Based Prediction on Greenhouse Crop Yield Combined TCN and RNN. Sensors 2021, 21, 4537. [Google Scholar] [CrossRef]
- Jung, D.-H.; Kim, H.S.; Jhin, C.; Kim, H.-J.; Park, S.H. Time-Serial Analysis of Deep Neural Network Models for Prediction of Climatic Conditions inside a Greenhouse. Comput. Electron. Agric. 2020, 173, 105402. Available online: https://www.sciencedirect.com/science/article/pii/S0168169919317326 (accessed on 15 September 2022). [CrossRef]
- Ali, A.; Hassanein, H.S. Wireless Sensor Network and Deep Learning for Prediction Greenhouse Environments. In Proceedings of the 2019 International Conference on Smart Applications, Communications and Networking (SmartNets), Sharm El Sheikh, Egypt, 17–19 December 2019; pp. 1–5. [Google Scholar]
- Liu, Y.; Li, D.; Wan, S.; Wang, F.; Dou, W.; Xu, X.; Li, S.; Ma, R.; Qi, L. A Long Short-Term Memory-Based Model for Greenhouse Climate Prediction. Int. J. Intell. Syst. 2022, 37, 135–151. [Google Scholar] [CrossRef]
- Picon, A.; San-Emeterio, M.G.; Bereciartua-Perez, A.; Klukas, C.; Eggers, T.; Navarra-Mestre, R. Deep Learning-based Segmentation of Multiple Species of Weeds and Corn Crop Using Synthetic and Real Image Datasets. Comput. Electron. Agric. 2022, 194, 106719. [Google Scholar] [CrossRef]
- Li, X.; Pan, J.; Xie, F.; Zeng, J.; Li, Q.; Huang, X.; Liu, D.; Wang, X. Fast and Accurate Green Pepper Detection in Complex Backgrounds Via an Improved YOLOv4-tiny Model. Comput. Electron. Agric. 2021, 191, 106503. [Google Scholar] [CrossRef]
- Tenorio, G.L.; Caarls, W. Automatic Visual Estimation of Tomato Cluster Maturity in Plant Rows. Mach. Vis. Appl. 2021, 32, 1–18. [Google Scholar] [CrossRef]
- Sun, J.; He, X.; Wu, M.; Wu, X.; Shen, J.; Lu, B. Detection of Tomato Organs based on Convolutional Neural Network under the Overlap and Occlusion Backgrounds. Mach. Vis. Appl. 2020, 31, 1–13. [Google Scholar] [CrossRef]
- Rong, J.; Wang, P.; Yang, Q.; Huang, F. A Field-Tested Harvesting Robot for Oyster Mushroom in Greenhouse. Agronomy 2021, 11, 1210. [Google Scholar] [CrossRef]
- Fonteijn, H.; Afonso, M.; Lensink, D.; Mooij, M.; Faber, N.; Vroegop, A.; Polder, G.; Wehrens, R. Automatic Phenotyping of Tomatoes in Production Greenhouses using Robotics and Computer Vision: From Theory to Practice. Agronomy 2021, 11, 1599. [Google Scholar] [CrossRef]
- Lu, C.-P.; Liaw, J.-J.; Wu, T.-C.; Hung, T.-F. Development of a Mushroom Growth Measurement System Applying Deep Learning for Image Recognition. Agronomy 2019, 9, 32. [Google Scholar] [CrossRef] [Green Version]
- Seo, D.; Cho, B.-H.; Kim, K. Development of Monitoring Robot System for Tomato Fruits in Hydroponic Greenhouses. Agronomy 2021, 11, 2211. [Google Scholar] [CrossRef]
- Yuan, T.; Lv, L.; Zhang, F.; Fu, J.; Gao, J.; Zhang, J.; Li, W.; Zhang, C.; Zhang, W. Robust Cherry Tomatoes Detection Algorithm in Greenhouse Scene Based on SSD. Agriculture 2020, 10, 160. [Google Scholar] [CrossRef]
- Islam, M.P.; Nakano, Y.; Lee, U.; Tokuda, K.; Kochi, N. TheLNet270v1–A Novel Deep-Network Architecture for the Automatic Classification of Thermal Images for Greenhouse Plants. Front. Plant Sci. 2021, 12, 630425. [Google Scholar] [CrossRef]
- Afonso, M.; Fonteijn, H.; Fiorentin, F.S.; Lensink, D.; Mooij, M.; Faber, N.; Polder, G.; Wehrens, R. Tomato Fruit Detection and Counting in Greenhouses Using Deep Learning. Front. Plant Sci. 2020, 11, 571299. [Google Scholar] [CrossRef]
- Zhang, P.; Li, D. YOLO-VOLO-LS: A Novel Method for Variety Identification of Early Lettuce Seedlings. Front. Plant Sci. 2022, 13, 806878. [Google Scholar] [CrossRef]
- Lawal, O.M.; Zhao, H. YOLOFig Detection Model Development Using Deep Learning. IET Image Process. 2021, 15, 3071–3079. [Google Scholar] [CrossRef]
- Zhou, C.; Hu, J.; Xu, Z.; Yue, J.; Ye, H.; Yang, G. A Novel Greenhouse-Based System for the Detection and Plumpness Assessment of Strawberry using an Improved Deep Learning Technique. Front. Plant Sci. 2020, 11, 559. [Google Scholar] [CrossRef]
- Arad, B.; Kurtser, P.; Barnea, E.; Harel, B.; Edan, Y.; Ben-Shahar, O. Controlled Lighting and Illumination-Independent Target Detection for Real-Time Cost-Efficient Applications. The Case Study of Sweet Pepper Robotic Harvesting. Sensors 2019, 19, 1390. [Google Scholar] [CrossRef] [Green Version]
- Mu, Y.; Chen, T.-S.; Ninomiya, S.; Guo, W. Intact Detection of Highly Occluded Immature Tomatoes on Plants using Deep Learning Techniques. Sensors 2020, 20, 2984. [Google Scholar] [CrossRef] [PubMed]
- Moreira, G.; Magalhaes, S.A.; Pinho, T.; Santos, F.N.d.; Cunha, M. Benchmark of Deep Learning and a Proposed HSV Colour Space Models for the Detection and Classification of Greenhouse Tomato. Agronomy 2022, 12, 356. [Google Scholar] [CrossRef]
- Lawal, O.M. YOLOMuskmelon: Quest for Fruit Detection Speed and Accuracy using Deep Learning. IEEE Access 2021, 9, 15221–15227. [Google Scholar] [CrossRef]
- Magalhaes, S.A.; Castro, L.; Moreira, G.; Santos, F.N.D.; Cunha, M.; Dias, J.; Moreira, A.P. Evaluating the Single-Shot Multibox Detector and YOLO Deep Learning Models for the Detection of Tomatoes in a Greenhouse. Sensors 2021, 21, 3569. [Google Scholar] [CrossRef]
- Lyu, B.; Smith, S.D.; Cherkauer, K.A. Fine-Grained Recognition in High-Throughput Phenotyping. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, Seattle, WA, USA, 14–19 June 2020; pp. 72–73. [Google Scholar]
- Zhou, J.; Li, J.; Wang, C.; Wu, H.; Zhao, C.; Wang, Q. A Vegetable Disease Recognition Model for Complex Background based on Region Proposal and Progressive Learning. Comput. Electron. Agric. 2021, 184, 106101. [Google Scholar] [CrossRef]
- Qi, J.; Liu, X.; Liu, K.; Xu, F.; Guo, H.; Tian, X.; Li, M.; Bao, Z.; Li, Y. An Improved YOLOv5 Model Based on Visual Attention Mechanism: Application to Recognition of Tomato Virus Disease. Comput. Electron. Agric. 2022, 194, 106780. [Google Scholar] [CrossRef]
- Zhang, P.; Yang, L.; Li, D. Efficientnet-B4-Ranger: A Novel Method for Greenhouse Cucumber Disease Recognition under Natural Complex Environment. Comput. Electron. Agric. 2020, 176, 105652. [Google Scholar] [CrossRef]
- Wang, C.; Zhou, J.; Zhao, C.; Li, J.; Teng, G.; Wu, H. Few-shot Vegetable Disease Recognition Model Based on Image Text Collaborative Representation Learning. Comput. Electron. Agric. 2021, 184, 106098. [Google Scholar] [CrossRef]
- Fernando, S.; Nethmi, R.; Silva, A.; Perera, A.; Silva, R.D.; Abeygunawardhana, P.K. Intelligent Disease Detection System for Greenhouse with a Robotic Monitoring System. In Proceedings of the 2nd International Conference on Advancements in Computing (ICAC), Colombo, Sri Lanka, 10–11 December 2020; Volume 1, pp. 204–209. [Google Scholar]
- Nieuwenhuizen, A.; Kool, J.; Suh, H.; Hemming, J. Automated Spider Mite Damage Detection on Tomato Leaves in Greenhouses. In Proceedings of the XI International Symposium on Protected Cultivation in Mild Winter Climates and I International Symposium on Nettings and Screens in Horticulture (Acta Hortic. 1268), Tenerife, Canary Islands, Spain, 27–31 January 2019; pp. 165–172. [Google Scholar]
- Liu, K.; Zhang, C.; Yang, X.; Diao, M.; Liu, H.; Li, M. Development of an Occurrence Prediction Model for Cucumber Downy Mildew in Solar Greenhouses Based on Long Short-Term Memory Neural Network. Agronomy 2022, 12, 442. [Google Scholar] [CrossRef]
- Fuentes, A.; Yoon, S.; Lee, M.H.; Park, D.S. Improving Accuracy of Tomato Plant Disease Diagnosis Based on Deep Learning With Explicit Control of Hidden Classes. Front. Plant Sci. 2021, 12, 682230. [Google Scholar] [CrossRef]
- Wang, X.; Liu, J. Tomato Anomalies Detection in Greenhouse Scenarios Based on YOLO-Dense. Front. Plant Sci. 2021, 12, 533. [Google Scholar] [CrossRef]
- Zhang, Z.; Flores, P.; Friskop, A.; Liu, Z.; Igathinathane, C.; Jahan, N.; Mathew, J.; Shreya, S. Enhancing Wheat Disease Diagnosis in a Greenhouse Using Image Deep Features and Parallel Feature Fusion. Front. Plant Sci. 2022, 13, 834447. [Google Scholar] [CrossRef]
- Li, W.; Wang, D.; Li, M.; Gao, Y.; Wu, J.; Yang, X. Field detection of tiny pests from sticky trap images using deep learning in agricultural greenhouse. Comput. Electron. Agric. 2021, 183, 106048. [Google Scholar] [CrossRef]
- Tureček, T.; Vařacha, P.; Turečková, A.; Psota, V.; Janků, P.; Štěpánek, V.; Viktorin, A.; Šenkeřík, R.; Jašek, R.; Chramcov, B.; et al. Scouting of Whiteflies in Tomato Greenhouse Environment Using Deep Learning. In Agriculture Digitalization and Organic Production; Springer: Singapore, 2022; pp. 323–335. [Google Scholar]
- Wang, D.; Wang, Y.; Li, M.; Yang, X.; Wu, J.; Li, W. Using an Improved YOLOv4 Deep Learning Network for Accurate Detection of Whitefly and Thrips on Sticky Trap Images. Trans. ASABE 2021, 64, 919–927. [Google Scholar] [CrossRef]
- Rustia, D.J.A.; Chao, J.-J.; Chiu, L.-Y.; Wu, Y.-F.; Chung, J.-Y.; Hsu, J.-C.; Lin, T.-T. Automatic Greenhouse Insect Pest Detection and Recognition Based on a Cascaded Deep Learning Classification Method. J. Appl. Entomol. 2021, 145, 206–222. [Google Scholar] [CrossRef]
- Zhou, X.; Sun, J.; Tian, Y.; Yao, K.; Xu, M. Detection of Heavy Metal Lead in Lettuce Leaves Based on Fluorescence Hyperspectral Technology Combined with Deep Learning Algorithm. Spectrochim. Acta Part Mol. Biomol. Spectrosc. 2022, 266, 120460. [Google Scholar] [CrossRef] [PubMed]
- da Silva, L.A.; Bressan, P.O.; Gonçalves, D.N.; Freitas, D.M.; Machado, B.B.; Gonçalves, W.N. Estimating Soybean Leaf Defoliation using Convolutional Neural Networks and Synthetic Images. Comput. Electron. Agric. 2019, 156, 360–368. [Google Scholar] [CrossRef]
- Qu, Y.; Clausen, A.; Jørgensen, B.N. Application of Deep Neural Network on Net Photosynthesis Modeling. In Proceedings of the IEEE 19th International Conference on Industrial Informatics (INDIN), Palma de Mallorca, Spain, 21–23 July 2021; pp. 1–7. [Google Scholar]
- Ahsan, M.; Eshkabilov, S.; Cemek, B.; Küçüktopcu, E.; Lee, C.W.; Simsek, H. Deep Learning Models to Determine Nutrient Concentration in Hydroponically Grown Lettuce Cultivars (Lactuca sativa L.). Sustainability 2021, 14, 416. [Google Scholar] [CrossRef]
- Kusanur, V.; Chakravarthi, V.S. Using Transfer Learning for Nutrient Deficiency Prediction and Classification in Tomato Plant. Int. J. Adv. Comput. Sci. Appl. 2021, 12, 784–790. [Google Scholar]
- Sun, J.; Wu, M.; Hang, Y.; Lu, B.; Wu, X.; Chen, Q. Estimating Cadmium Content in Lettuce Leaves Based on Deep Brief Network and Hyperspectral Imaging Technology. J. Food Process Eng. 2019, 42, e13293. [Google Scholar] [CrossRef]
- Tran, T.-T.; Choi, J.-W.; Le, T.-T.H.; Kim, J.-W. A Comparative Study of Deep CNN in Forecasting and Classifying the Macronutrient Deficiencies on Development of Tomato Plant. Appl. Sci. 2019, 9, 1601. [Google Scholar] [CrossRef] [Green Version]
- Vit, A.; Shani, G.; Bar-Hillel, A. Length Phenotyping with Interest Point Detection. Comput. Electron. Agric. 2020, 176, 105629. Available online: https://www.sciencedirect.com/science/article/pii/S0168169919318939 (accessed on 15 September 2022). [CrossRef]
- Boogaard, F.P.; Rongen, K.S.; Kootstra, G.W. Robust node detection and tracking in fruit-vegetable crops using deep learning and multi-view imaging. Biosyst. Eng. 2020, 192, 117–132. [Google Scholar] [CrossRef]
- Xhimitiku, I.; Bianchi, F.; Proietti, M.; Tocci, T.; Marini, A.; Menculini, L.; Termite, L.F.; Pucci, E.; Garinei, A.; Marconi, M.; et al. Anomaly Detection in Plant Growth in a Controlled Environment using 3D Scanning Techniques and Deep Learning. In Proceedings of the 2021 IEEE International Workshop on Metrology for Agriculture and Forestry (MetroAgriFor), Trento, Italy, 3–5 November 2021; pp. 86–91. [Google Scholar]
- Lauguico, S.; Concepcion, R.; Tobias, R.R.; Alejandrino, J.; Guia, J.D.; Guillermo, M.; Sybingco, E.; Dadios, E. Machine Vision-Based Prediction of Lettuce Phytomorphological Descriptors using Deep Learning Networks. In Proceedings of the IEEE 12th International Conference on Humanoid, Nanotechnology, Information Technology, Communication and Control, Environment, and Management (HNICEM), Manila, Philippines, 3–7 December 2020; pp. 1–6. [Google Scholar]
- Zhu, F.; He, M.; Zheng, Z. Data Augmentation using Improved cDCGAN for Plant Vigor Rating. Comput. Electron. Agric. 2020, 175, 105603. [Google Scholar] [CrossRef]
- Ullah, S.; Henke, M.; Narisetti, N.; Panzarová, K.; Trtílek, M.; Hejatko, J.; Gladilin, E. Towards Automated Analysis of Grain Spikes in Greenhouse Images Using Neural Network Approaches: A Comparative Investigation of Six Methods. Sensors 2021, 21, 7441. [Google Scholar] [CrossRef]
- Choi, K.; Park, K.; Jeong, S. Classification of Growth Conditions in Paprika Leaf Using Deep Neural Network and Hyperspectral Images. In Proceedings of the 2021 Twelfth International Conference on Ubiquitous and Future Networks (ICUFN), Jeju Island, Korea, 17–20 August 2021; pp. 93–95. [Google Scholar]
- Baar, S.; Kobayashi, Y.; Horie, T.; Sato, K.; Suto, H.; Watanabe, S. Non-destructive Leaf Area Index Estimation Via Guided Optical Imaging for Large Scale Greenhouse Environments. Comput. Electron. Agric. 2022, 197, 106911. [Google Scholar] [CrossRef]
- Xiong, Y.; Ge, Y.; From, P.J. An Obstacle Separation Method for Robotic Picking of Fruits in Clusters. Comput. Electron. Agric. 2020, 175, 105397. [Google Scholar] [CrossRef]
- Jin, Y.; Liu, J.; Wang, J.; Xu, Z.; Yuan, Y. Far-near Combined Positioning of Picking-point based on Depth Data Features for Horizontal-Trellis Cultivated Grape. Comput. Electron. Agric. 2022, 194, 106791. [Google Scholar] [CrossRef]
- Zhang, F.; Gao, J.; Zhou, H.; Zhang, J.; Zou, K.; Yuan, T. Three-Dimensional Pose Detection method Based on Keypoints Detection Network for Tomato Bunch. Comput. Electron. Agric. 2022, 195, 106824. [Google Scholar] [CrossRef]
- Gong, L.; Wang, W.; Wang, T.; Liu, C. Robotic Harvesting of the Occluded Fruits with a Precise Shape and Position Reconstruction Approach. J. Field Robot. 2022, 39, 69–84. [Google Scholar] [CrossRef]
- Lahcene, A.; Amine, D.M.; Abdelkader, D. A Hybrid Deep Learning Model for Predicting Lifetime and Mechanical Performance Degradation of Multilayer Greenhouse Polyethylene Films. Polym. Sci. Ser. B 2021, 63, 964–977. [Google Scholar] [CrossRef]
- Zhang, P.; Li, D. EPSA-YOLO-V5s: A Novel Method for Detecting the Survival Rate of Rapeseed in a Plant Factory Based on Multiple Guarantee Mechanisms. Comput. Electron. Agric. 2022, 193, 106714. [Google Scholar] [CrossRef]
- Xu, P.; Fang, N.; Liu, N.; Lin, F.; Yang, S.; Ning, J. Visual Recognition of Cherry Tomatoes in Plant Factory Based on Improved Deep Instance Segmentation. Comput. Electron. Agric. 2022, 197, 106991. [Google Scholar] [CrossRef]
- Wu, Z.; Yang, R.; Gao, F.; Wang, W.; Fu, L.; Li, R. Segmentation of Abnormal Leaves of Hydroponic Lettuce Based on DeepLabV3+ for Robotic Sorting. Comput. Electron. Agric. 2021, 190, 106443. [Google Scholar] [CrossRef]
- Hendrawan, Y.; Damayanti, R.; Riza, D.F.A.; Hermanto, M.B. Classification of Water Stress in Cultured Sunagoke Moss Using Deep Learning. Telkomnika 2021, 19, 1594–1604. [Google Scholar] [CrossRef]
- Gozzovelli, R.; Franchetti, B.; Bekmurat, M.; Pirri, F. Tip-Burn Stress Detection of Lettuce Canopy Grown in Plant Factories. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Montreal, QC, Canada, 11–17 October 2021; pp. 1259–1268. [Google Scholar]
- Hao, X.; Jia, J.; Gao, W.; Guo, X.; Zhang, W.; Zheng, L.; Wang, M. MFC-CNN: An Automatic Grading Scheme for Light Stress Levels of Lettuce (Lactuca sativa L.) leaves. Comput. Electron. Agric. 2020, 179, 105847. [Google Scholar] [CrossRef]
- Rizkiana, A.; Nugroho, A.; Salma, N.; Afif, S.; Masithoh, R.; Sutiarso, L.; Okayasu, T. Plant Growth Prediction Model for Lettuce (Lactuca sativa) in Plant Factories Using Artificial Neural Network. In Proceedings of the IOP Conference Series: Earth and Environmental Science, Miass, Russia, 20–23 September 2021; Volume 733, p. 012027. [Google Scholar]
- Kim, T.; Lee, S.-H.; Kim, J.-O. A Novel Shape Based Plant Growth Prediction Algorithm Using Deep Learning and Spatial Transformation. IEEE Access 2022, 10, 731–737. [Google Scholar] [CrossRef]
- Chang, S.; Lee, U.; Hong, M.J.; Jo, Y.D.; Kim, J.-B. Time-Series Growth Prediction Model Based on U-Net and Machine Learning in Arabidopsis. Front. Plant Sci. 2021, 12, 721512. [Google Scholar] [CrossRef] [PubMed]
- Franchetti, B.; Ntouskos, V.; Giuliani, P.; Herman, T.; Barnes, L.; Pirri, F. Vision Based Modeling of Plants Phenotyping in Vertical Farming under Artificial Lighting. Sensors 2019, 19, 4378. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Buxbaum, N.; Lieth, J.; Earles, M. Non-Destructive Plant Biomass Monitoring With High Spatio-Temporal Resolution via Proximal RGB-D Imagery and End-to-End Deep Learning. Front. Plant Sci. 2022, 13, 758818. [Google Scholar] [CrossRef] [PubMed]
- Vorapatratorn, S. Development of Automatic Plant Factory Control Systems with AI-Based Artificial Lighting. In Proceedings of the 13th International Conference on Information Technology and Electrical Engineering (ICITEE), Chiang Mai, Thailand, 14–15 October 2021; pp. 69–73. [Google Scholar]
- Hwang, Y.; Lee, S.; Kim, T.; Baik, K.; Choi, Y. Crop Growth Monitoring System in Vertical Farms Based on Region-of-Interest Prediction. Agriculture 2022, 12, 656. [Google Scholar] [CrossRef]
- Tao, X.; Zhang, D.; Wang, Z.; Liu, X.; Zhang, H.; Xu, D. Detection of Power Line Insulator Defects using Aerial Images Analyzed with Convolutional Neural Networks. IEEE Trans. Syst. Man Cybern. Syst. 2018, 50, 1486–1498. [Google Scholar] [CrossRef]
- Aljubury, I.M.A.; Ridha, H.D. Enhancement of Evaporative Cooling System in a Greenhouse using Geothermal Energy. Renew. Energy 2017, 111, 321–331. [Google Scholar] [CrossRef]
- Philipp, G.; Song, D.; Carbonell, J.G. Gradients Explode-Deep Networks are Shallow-ResNet Explained. In Proceedings of the 6th International Conference on Learning Representations ICLR Workshop Track, Vancouver, BC, Canada, 30 April–3 May 2018; Available online: https://openreview.net/forum?id=rJjcdFkPM (accessed on 15 September 2022).
- Howard, A.G.; Zhu, M.; Chen, B.; Kalenichenko, D.; Wang, W.; Weyand, T.; Andreetto, M.; Adam, H. Mobilenets: Efficient Convolutional Neural Networks for Mobile Vision Applications. arXiv 2017, arXiv:1704.04861. [Google Scholar]
- Ioffe, S.; Szegedy, C. Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift. In Proceedings of the International Conference on Machine Learning, Lille, France, 6–11 July 2015; pp. 448–456. [Google Scholar]
- Simonyan, K.; Zisserman, A. Very Deep Convolutional Networks for Large-Scale Image Recognition. arXiv 2014, arXiv:1409.1556. [Google Scholar]
- Krizhevsky, A.; Sutskever, I.; Hinton, G.E. Imagenet Classification with Deep Convolutional Neural Networks. Commun. ACM 2017, 60, 84–90. [Google Scholar] [CrossRef] [Green Version]
- Szegedy, C.; Liu, W.; Jia, Y.; Sermanet, P.; Reed, S.; Anguelov, D.; Erhan, D.; Vanhoucke, V.; Rabinovich, A. Going Deeper with Convolutions. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015; pp. 1–9. [Google Scholar]
- Bochkovskiy, A.; Wang, C.-Y.; Liao, H.-Y.M. YOLOv4: Optimal Speed and Accuracy of Object Detection. arXiv 2020, arXiv:2004.10934. [Google Scholar]
- Wang, C.-Y.; Liao, H.-Y.M.; Wu, Y.-H.; Chen, P.-Y.; Hsieh, J.-W.; Yeh, I.-H. Cspnet: A New Backbone that can Enhance Learning Capability of CNN. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, Seattle, WA, USA, 14–19 June 2020; pp. 390–391. [Google Scholar]
- Chollet, F. Xception: Deep Learning with Depthwise Separable Convolutions. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 1251–1258. [Google Scholar]
- Tan, M.; Le, Q. Efficientnet: Rethinking Model Scaling for Convolutional Neural Networks. In Proceedings of the International Conference on Machine Learning, Long Beach, CA, USA, 9–15 June 2019; pp. 6105–6114. [Google Scholar]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep Residual Learning for Image Recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 26 June–1 July 2016; pp. 770–778. [Google Scholar]
- Kingma, D.P.; Ba, J. Adam: A Method for Stochastic Optimization. arXiv 2014, arXiv:1412.6980. [Google Scholar]
- Xu, D.; Zhang, S.; Zhang, H.; Mandic, D.P. Convergence of the RMSProp Deep Learning Method with Penalty for Nonconvex Optimization. Neural Netw. 2021, 139, 17–23. [Google Scholar] [CrossRef] [PubMed]
- Tong, Q.; Liang, G.; Bi, J. Calibrating the Adaptive Learning Rate to improve Convergence of ADAM. Neurocomputing 2022, 481, 333–356. [Google Scholar] [CrossRef]
- Cutkosky, A.; Mehta, H. Momentum Improves Normalized SGD. In Proceedings of the International Conference on Machine Learning, Vienna, Austria, 12–18 July 2020; pp. 2260–2268. [Google Scholar]
- Nesterov, Y. A Method for Unconstrained Convex Minimization Problem with the Rate of Convergence O(1/k^2). Dokl. USSR 1983, 269, 543–547. [Google Scholar]
- Miller, T. Explanation in Artificial Intelligence: Insights from the Social Sciences. Artif. Intell. 2019, 267, 1–38. [Google Scholar] [CrossRef]
- Hiriyannaiah, S.; Srinivas, A.; Shetty, G.K.; Siddesh, G.; Srinivasa, K. A Computationally Intelligent Agent for Detecting Fake News Using Generative Adversarial Networks. In Hybrid Computational Intelligence; Elsevier: Amsterdam, The Netherlands, 2020; pp. 69–96. [Google Scholar]
Ref. | Year | Focus of Study | Highlights |
---|---|---|---|
[22] | 2018 | Deep learning in agriculture | 40 papers were identified and examined in the context of deep learning in the agricultural domain. |
[23] | 2019 | Fruit detection and yield estimation | The development of various deep learning models in fruit detection and localization to support tree crop load estimation was reviewed. |
[24] | 2019 | Plant disease detection and classification | A thorough analysis of deep learning models used to visualize various plant diseases was reviewed. |
[25] | 2020 | Dense image analysis | Reviewed deep learning applications for dense agricultural scenes, including recognition and classification, detection, counting, and yield estimation.
[26] | 2021 | Plant disease detection and classification | Current trends and limitations for detecting plant leaf disease using deep learning and cutting-edge imaging techniques. |
[27] | 2021 | Weed detection | Reviewed 70 existing deep learning-based weed detection and classification techniques, covering four main procedures: data acquisition, dataset preparation, DL techniques, and evaluation metrics.
[28] | 2021 | Bloom/Yield recognition | Diverse automation approaches with computer vision and deep learning models for crop yield detection were presented. |
Our Paper | 2022 | Deep learning applications in CEA | Reviews developments of deep learning models for various applications in CEA.
Source | Number of Papers in the Initial Search | Eligible Papers with Duplicates |
---|---|---|
Google Scholar | 330 | 27 |
Scopus | 127 | 25 |
Science Direct | 119 | 19 |
Wiley | 40 | 4 |
IEEEXplore | 51 | 9 |
SpringerLink | 44 | 4 |
Web of Science | 40 | 17 |
Total | 751 | 105 |
Application Classification | Tasks | Growing Medium | DL Model | Networks | Preprocessing Augmentation | Optimizer | Dataset Type | Imaging Method | Performance | Ref. |
---|---|---|---|---|---|---|---|---|---|---|
Climate Condition Prediction | Transpiration rate | hydroponic | ANN | ANN | NS | Adam | 31,033 data points | NS | RMSE = 0.07–0.10-gm min, R2 = 0.95–0.96 | [30] |
temp. (°C), humidity deficit (g/kg), relative humidity (%), radiation, CO2 conc. | soil-based | RNN-TCN | LSTM-RNN | NS | Adam | NS | NS | RMSE (Dataset 1): 10.45 (±0.94), RMSE (Dataset 2): 6.76 (±0.45), RMSE (Dataset 3): 7.40 (±1.88) | [31] |
temperature, humidity, CO2 concentration | soil-based | ANN | NS | NS | Adam | NS | NS | ANN at 30 min, R2 = (temp: 0.94, humidity: 0.78, CO2: 0.70), RMSEP = (temp: 0.94, humidity: 5.44, CO2: 32.12), %SEP = (temp: 4.22, humidity: 8.18, CO2: 6.49) | [32] |
NARX | NARX at 30 min, R2 = (temp: 0.86, humidity: 0.71, CO2: 0.81), RMSEP = (temp: 1.32, humidity: 6.27, CO2: 28.30), %SEP = (temp: 5.86, humidity: 9.42, CO2: 7.74)
RNN-LSTM | RNN-LSTM at 30 min, R2 = (temp: 0.96, humidity: 0.8, CO2: 0.81), RMSEP = (temp: 0.71, humidity: 5.23, CO2: 28.30), %SEP = (temp: 3.15, humidity: 7.85, CO2: 5.72)
temp., humidity, pressure, dew point | soil-based | RNN-LSTM | NS | NS | NS | NS | NS | Temperature, RMSE = 0.067163 | [33] | |
temp., humidity, illumination, CO2 conc., soil temp. and soil moisture | soil-based | LSTM | NS | NS | NS | NS | NS | Temp., RMSE = 0.38 (tomato), 0.55 (cucumber), 0.42 (pepper) | [34] |
Humidity, RMSE = 1.25 (tomato), 1.95 (cucumber), 1.78 (pepper) | ||||||||||
Illumination, RMSE = 78 (tomato), 80 (cucumber), 30 (pepper) | ||||||||||
CO2, RMSE = 3.2 (tomato), 4.1 (cucumber), 3.9 (pepper)
Soil temp., RMSE = 0.07 (tomato), 0.08 (cucumber), 0.045 (pepper) | ||||||||||
Soil moisture, RMSE = 0.14 (tomato), 0.30 (cucumber), 0.15 (pepper) | ||||||||||
Yield Estimation | corn crop and leaf weeds classification | soil-based | Dual PSPNet | ResNet-50 | rotation, shift (height, width, vertical, horizontal, pixel intensity), zoom and Gaussian blur | SGD with Nesterov Momentum | 6906 images | RGB | Balanced Accuracy (BAC) = 75.76%, Dice-Sorensen Coefficient (DSC) = 47.97% (for dataset A+C) | [35] |
green pepper detection | soil-based | Improved YOLOv4-tiny | CSP DarkNet53 | Gaussian noise addition, HSL adjustment, scaling and rotation | NS | 1500 images | RGB | P: 96.91%, R: 93.85%, AP: 95.11%, F1 Score: 0.95 | [36] | |
cherry tomato clusters location detection, tomato’s maturity estimation | soil-based | SSD | MobileNet V1 | horizontal flip and random crop | Adam or RMSprop | 254 images | RGB | IoU = 0.892 (for tomato’s cluster location detection), RMSE: 0.2522 (for tomato’s maturity estimation) | [37] | |
tomato organs detection | soil-based | Improved FPN | ResNet-101 | NS | SGD | 8929 images | RGB | mAP: 99.5% | [38] | |
mushroom recognition | soil-based | Improved SSD | MobileNet V2 | flip, random rotation, random cropping, and random size, brightness and tone conversion, random erasure, mixup | NS | 4600 images | RGB | P: 94.4%, R: 93%, mAP: 93.2%, F1 Score: 0.937, Speed: 0.0032s | [39] | |
tomato detection | soil-based | Mask R-CNN | ResNext-101 | NS | SGD | 123 images | RGB | P: 93%, R: 93%, F1 Score: 0.93 | [40] | |
mushroom localization | soil-based | YOLOv3 | DarkNet53 | NS | NS | 500 images | RGB | Average prediction error = 3.7 h, Average detection = 46.6 | [41] | |
tomato detection | hydroponic | Faster R-CNN | ResNet-101 | gamma correction | momentum | 895 images | RGB, HSV | detection accuracy: 88.6% | [42] | |
cherry tomato detection | soil-based | SSD | MobileNet | rotating, brightness adjustment and noising | RMSProp | 1730 images | RGB | AP = 97.98% | [43] | |
InceptionV2 | AP = 98.85% | |||||||||
SSD300 | Adam | AP = 92.73% | ||||||||
SSD512 | AP = 93.87% | |||||||||
plant classification | soil-based | The LNet270v1 | custom | random reflection (X and Y), Shear (X and Y), Scale (X and Y), Translation (X and Y), rotation | Adam | 13,766 images | RGB | mean accuracy: 91.99%, mIoU: 86.5%, mean BFScore: 86.42% | [44] | |
tomato detection | soil-based | Mask R-CNN | ResNet-50 | None used | SGD | 123 images | RGB | Average result @ 0.5, (ResNet-50, P = 84.5%, R = 90.5%, F1 Score = 0.87) | [45] | |
ResNet-101 | Average result @ 0.5, (ResNet-101, P = 82.5%, R = 90%, F1 Score = 0.86)
ResNext-101 | Average result @ 0.5, (ResNext-101, P = 92%, R = 93%, F1 Score = 0.925) | |||||||||
Lettuce seedlings identification | hydroponic | YOLO-VOLO-LS | VOLO | rotation, flipping, and contrast adjustment | NS | 6900 images | RGB | Average results = (recall: 96.059%, Precision: 96.014%, F1-score: 0.96039) | [46] | |
Fig detection | soil-based | YOLOFig | ResNet43 | NS | NS | 412 images | RGB | P = 74%, R = 88%, F1-score = 0.80 | [47] | |
strawberry detection | soil-based | Improved Faster-RCNN | ResNet-50 | brightness, chroma, contrast, and sharpness augmentation and attenuation | NS | 400 images | RGB | Accuracy = 86%, ART = 0.158s, IoU = 0.892 | [48] | |
sweet pepper detection | soil-based | SSD | custom | NS | NS | 468 images | RGB, HSV | Average Precision = (Flash-only: 84%, Flash-No-Flash image: 83.6%) | [49] | |
tomato detection | soil-based | Faster R-CNN | ResNet-50, ResNet-101, Inception-ResNet-v2 | resizing, crop, rotating, random horizontal flip | NS | 640 images | RGB | F1 score = 83.67% and AP = 87.83% for tomato detection using Faster R-CNN with ResNet-101, R2 = 0.87 for tomato counting | [50] | |
tomato detection | soil-based | SSD | MobileNetv2 | rotation, translate, flip, multiply, noise addition, scale, blur | NS | 1029 images | RGB, HSV | mAP = 65.38%, P = 70.12%, R = 84.9%, F1-score = 85.81% | [51] |
YOLOv4 | CSP DarkNet53 | mAP = 65.38%, P = 70.12%, R = 84.9%, F1-score = 85.81% | ||||||||
muskmelon detection | soil-based | YOLO Muskmelon | ResNet43 | NS | NS | 410 images | RGB | IoU = 70.9%, P = 85%, R = 82%, AP = 89.6%, F1 = 84%, FPS = 96.3 | [52] | |
tomato detection | soil-based | SSD | MobileNet V2 | rotation, scaling, translation, flip, blur (Gaussian Filter), Gaussian Noise | NS | 5365 | RGB | mAP = 51.56%, P = 84.37%, R = 54.40%, F1 = 66.15%, I = 16.44 ms | [53] | |
InceptionV2 | mAP = 48.54%, P = 85.31%, R = 50.93%, F1 = 63.78%, I = 24.75 ms | |||||||||
ResNet-50 | mAP = 42.62%, P = 92.51%, R = 43.59%, F1 = 59.26%, I = 47.78 ms | |||||||||
ResNet-101 | mAP = 36.32%, P = 88.63%, R = 38.13%, F1 = 53.32%, I = 59.78 ms | |||||||||
YOLOv4-tiny | CSP DarkNet53 | mAP = 47.48%, P = 88.39%, R = 49.33%, F1 = 63.32%, I = 4.87 ms | ||||||||
Arabidopsis, Bean, Komatsuna recognition | soil-based | CNN | ResNet-18 | scaling, rotation and translation | Adam | 2694 images | RGB | mA = 0.922 (Arabidopsis), mA = 1 (Bean), mA = 1 (Komatsuna) | [54] |
Disease Detection and Classification | Tomato (powdery mildew (PM), early blight) and cucumber (PM, downy mildew (DM)) recognition | soil-based | CNN | PRP-Net | ShiftScaleRotate, RandomSizedCrop, HorizontalFlip | SGD | 4284 images | RGB | Average results (Accuracy = 98.26%, Precision = 92.60%, Sensitivity = 93.60%, Specificity = 99.01%) | [55] |
tomato virus disease recognition | soil-based | SE-YOLOv5 | CSPNet | Gaussian noise addition, rotation, mirroring, intensity random adjustment | NS | 150 images | RGB, HSV | P = 86.75%, R = 92.19%, mAP@(0.5) = 94.1%, mAP@(0.5:0.95) = 75.98, prediction accuracy = 91.07% | [56] | |
cucumber PM, DM and the combination of PM and DM recognition | soil-based | Efficient Net | EfficientNet-B4 | flip (horizontal, vertical), rotation | Ranger | 2816 images | RGB | Train Accuracy = 99.22%, Verification accuracy = 96.38%, Test accuracy = 96.39% | [57] | |
Tomato (PM, early blight), cucumber (PM, DM, virus disease) recognition | soil-based | ITC-Net | ResNet18 and TextRCNN | Cropping, Normalization, word segmentation, word list construction, text vectorization | Adam | 1516 images | RGB | Accuracy: 99.48%, Precision: 98.90%, Sensitivity: 98.78%, Specificity: 99.66% | [58] | |
leaf mold, tomato yellow leaf curl detection | soil-based | CNN | ResNet-50, ResNet-101 | filtering, histogram | NS | 115 images | RGB, HSV | Testing Accuracy = 98.61%, Validation accuracy = 99% | [59] | |
spider mite detection | soil-based | CNN | ResNet18 | NS | NS | 850 images | multi-spectral, RGB | accuracy: 90% | [60] | |
cucumber DM prediction | soil-based | LSTM | NS | Min-Max normalization | Adam | 11,827 images | RGB | A = 90%, R = 89%, P = 94%, F1-Score = 0.91 | [61] | |
tomato disease detection | soil-based | Faster R-CNN | VGG16 | resizing, cropping, rotation, flipping, contrast, brightness, color, noise | NS | 59,717 images | RGB | mAP = 89.04% | [62] | |
ResNet-50 | mAP = 90.19% | |||||||||
ResNet-50 FPN | mAP = 92.58% | |||||||||
various tomato diseases (i.e., leaf mold, gray mold, early blight, late blight, leaf curl virus, brown spot) detection | soil-based | YOLO-Dense | DarkNet53 | NS | NS | 15,000 images | RGB | mAP: 96.41% | [63] | |
wheat disease detection | soil-based | CNN | ResNet-101 | cropping | NS | 160 plants | NIR, RGB | Accuracy = 84% for tan spot disease, 75% for leaf rust disease | [64] | |
Small Insect Detection | Pests (whitefly and Thrips) detection | soil-based | TPest-RCNN | VGG16 | Resizing, Splitting | NS | 1941 images | RGB | AP: 95.2%, F1 Score: 0.944 | [65]
whiteflies (greenhouse whitefly and cotton whitefly) detection | hydroponic | Faster R-CNN | ResNet-50 | mirroring | SGD | 1161 images | RGB | RMSE = 5.83, Precision = 0.5794, Recall = 0.7892 | [66] | |
whitefly detection | soil-based | YOLOV4 | CSP DarkNet53 | cropping | Adam | 1200 images | RGB | Whitefly: (precision = 97.4%, recall = 95.7%), mAP = 95.1% | [67] | |
Thrips detection | Thrips: (precision = 97.9%, recall = 94.5%), mAP = 95.1% | |||||||||
flies, gnats, thrips, whiteflies detection | soil-based | YOLOv3-tiny | DarkNet53 | cropping | Adam | NS | RGB | average F1-score: 0.92, mean counting accuracy: 0.91 | [68] | |
Nutrient Estimation and Detection | lead content detection | soil-based | WT-MC-stacked auto-encoders | NS | standard normal variate (SNV), 1st Der, 2nd Der, 3rd Der, 4th Der | NS | 2800 images | hyper-spectral data | Pb content detection = 0.067∼1.400 mg/kg, RMSEC = 0.02321 mg/kg, RMSEP = 0.04017 mg/kg, R2C = 0.9802, R2P = 0.9467 | [69]
soybean leaf defoliation estimation | soil-based | CNN | AlexNet | Resizing, Binarization, Rotation | NS | 10,000 images | RGB | RMSE (AlexNet) = 4.57 (±5.8) | [70] |
VGGNet | RMSE (VGGNet): 4.65 (±6.4) | |||||||||
ResNet | RMSE (ResNet): 14.60 (±18.8) | |||||||||
net photosynthesis (Pn) prediction (light level, CO2 concentration, temperature) | soil-based | DNN | custom | NS | Adam | 33,000 images | NS | accuracy: 96.20% (7 hidden layers with 128 units per hidden layer), accuracy: 96.30% (8 hidden layers with 64 units per hidden layer) | [71] |
nutrient concentration estimation | hydroponic | CNN | VGG16 | width, height shift, shear, flipping, zoom, scaling, cropping | Adam | 779 images | RGB | Average Classification Accuracy (ACA) = 97.9% | [72] | |
VGG19 | Average Classification Accuracy (ACA) = 97.8% | |||||||||
Calcium and Magnesium deficiencies prediction | soil-based | SVM, Random Forest (RF) Classifier | Inception V3 | NS | RMSProp | 880 images | RGB | Accuracy = 98.71% (for InceptionV3 with SVM) and 97.85% (for InceptionV3 with RF classifier) | [73] |
VGG16 | Adam | Accuracy = 99.14% (for VGG16 with SVM) and 95.71% (for VGG16 with RF classifier) | ||||||||
ResNet-50 | Adam | Accuracy = 88.84% (for ResNet50 with SVM) and 84.12% (for ResNet-50 with RF classifier) | ||||||||
cadmium content estimation | soil-based | PSO-DBN | NS | Savitzky–Golay (SG) filtering to remove spectral noise | NS | 1260 images | hyper-spectral data | With 3 hidden layers: R2 = 0.8976, RMSE = 0.6890, RPD = 2.8367 | [74] |
Nutrient deficiencies (Calcium/Ca2+, Potassium/K+, Nitrogen/N) classification | soil-based | CNN | Inception-ResNetV2 | shift, rotation, resizing | NS | 571 images | RGB | Average Accuracy = 87.27%, Average Precision = 100%, Recall = Ca2+: 100%, K+: 100%, N: 100% | [75] | |
Auto-Encoder | NS | Average Accuracy = 79.09%, Average Precision = 94.2%, Recall = Ca2+: 97.6%, K+: 92.45%, N: 95.23% | ||||||||
Growth Monitoring | length estimation and interest point detection | soil-based | Mask R-CNN | ResNet-101 | NS | NS | 2574 images | RGB | Results in 2D (Banana Tree, AP: 92.5%, Banana Leaves, AP: 90%, Cucumber fruit, AP: 60.2%) | [76] |
internode length detection | soil-based | YOLOv3 | DarkNet53 | NS | NS | 9990 images | RGB | R: 92%, AP: 95%, F1 Score: 0.94 | [77] |
plant growth anomalies detection | soil-based | LSTM | NS | filtering, cropping | Adam | NS | RGB, HSV | 2D (P: 42%, R: 71%, F1: 0.52), 3D photogrammetry with high-resolution camera (P: 57%, R: 57%, F1: 0.57), 3D low-cost photogrammetry system (P: 44%, R: 79%, F1: 0.56), LiDAR (P: 50%, R: 86%, F1: 0.63) | [78] |
Phytomorphological descriptor prediction | aquaponics | CNN | DarkNet53 | Scaling and Resizing | SGD with Momentum | 300 images | RGB | R2(Area-DarkNet53) = 0.9858, R2(Diameter-DarkNet53) = 0.9836 | [79] | |
Xception | R2(Centroid x-Xception) = 0.6390, R2(Centroid-y-Xception) = 0.7239 | |||||||||
Inception ResNetv2 | R2(Major Axis-InceptionResNetv2) = 0.8197, R2(Minor Axis-InceptionResNetv2) = 0.7460 | |||||||||
orchid seedlings vigor rating | soil-based | CNN | ResNet-50 | Cropping, Resizing | Adam | 1700 images | RGB, HSV | A = 95.5%, R = 97%, P = 94.17%, F1-Score = 0.9557 | [80] | |
spike detection | soil-based | SSD | Inception-ResNetv2 | NS | SGD | 292 images | RGB | mAP@0.5 = 0.780, mAP@0.75 = 0.551, mAP@0.5:0.95 = 0.470 | [81] |
YOLOv3 | DarkNet53 | NS | mAP@0.5 = 0.941, mAP@0.75 = 0.680, mAP@0.5:0.95 = 0.604
YOLOv4 | CSP DarkNet53 | CutOut, MixUp, CutMix, RandomErase | mAP@0.5 = 0.941, mAP@0.75 = 0.700, mAP@0.5:0.95 = 0.610
Faster R-CNN | InceptionV2 | NS | Adam | mAP@0.5 = 0.950, mAP@0.75 = 0.822, mAP@0.5:0.95 = 0.660
spike segmentation | ANN | NS | NS | NS | AP = 0.61 | |||||
U-Net | VGG16 | rotation [−30 30], horizontal flip, and brightness | Adam | AP = 0.84 | ||||||
Deep-LabV3+ | ResNet-101 | NS | AP = 0.922 | |||||||
Paprika leaves growth conditions classification | soil-based | DNN | Improved VGG-16 | rotation | NS | 227 images | hyper-spectral data | Accuracy = 90.9% | [82] | |
VGG-16 | Accuracy = 86.4% | |||||||||
ConvNet | Accuracy = 82.3% | |||||||||
leaf shape estimation | hydroponic | encoder-decoder CNNs | U-Net | random rotation, and random horizontal spatial flipping | Adam | NS | RGB | Deviation of U-Net based estimation is less than 10% of the manual LAI estimation | [83] | |
Robotic Harvesting | Obstacle Separation | soil-based | Mask R-CNN | ResNet-101 | 3D HSI color thresholding | NS | NS | RGB | Success Rate = 65.1% (whole process) | [84] |
picking-point positioning | soil-based | CNN | custom | NS | NS | 100 images | RGB | Success rate: 100% | [85] | |
keypoints detection | soil-based | TPM | custom | Rotation and brightness adjustment | RMSprop | 2500 images | RGB | Qualified rate: 94.02%, Accuracy: 85.77% | [86] | |
pose detection | Adam | Accuracy: 70.05% | ||||||||
target positioning estimation | soil-based | Mask-RCNN | ResNet | cropping | NS | NS | RGB, Infrared | Average Gripping Accuracy (AGA): 8.21mm, APSR: 73.04% | [87] | |
Others | LDPE film lifetime prediction | NS | SVM-CNN | NS | NS | Adam | 4072 images | NS | NS | [88]
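Most detection studies in the table above report precision, recall, F1, IoU, and mAP. A minimal sketch of how these box-level metrics are typically computed (the 0.5 IoU matching threshold mirrors the mAP@0.5 values reported, but the matching rule is an assumption, not taken from any reviewed study):

```python
# Minimal detection-metric sketch: IoU between axis-aligned boxes and
# precision/recall/F1 from TP/FP/FN counts. A prediction is usually counted
# as a true positive when its IoU with a ground-truth box exceeds 0.5.

def iou(a, b):
    """Boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter + 1e-9)

def prf1(tp, fp, fn):
    p = tp / (tp + fp + 1e-9)   # precision
    r = tp / (tp + fn + 1e-9)   # recall
    return p, r, 2 * p * r / (p + r + 1e-9)

print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # ~0.143
print(prf1(tp=90, fp=10, fn=5))             # P=0.90, R~0.947, F1~0.923
```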
Application Classification | Tasks | Growing Medium | DL Model | Networks | Preprocessing Augmentation | Optimizer | Dataset Type | Imaging Method | Performance | Ref. |
---|---|---|---|---|---|---|---|---|---|---|
Yield Estimation | rapeseed detection | hydroponic | EPSA-YOLO-V5s | CSP DarkNet | rotating, flipping (horizontal, vertical) | NS | 6616 images | RGB | P = 94.5%, R = 99.6%, F1-score = 0.970, mAP@0.5 = 0.996 | [89]
tomato prediction | hydroponic | Improved Mask R-CNN | ResNet | random translation, random brightness change, Gaussian noise addition | NS | 1078 images | RGB | Accuracy = 93.91% (Fruit), Accuracy = 88.13% (Stem) | [90] | |
Stress Level Monitoring | lettuce abnormal leaves (yellow, withered, decay) | hydroponic | DeepLabV3+ | Xception-65 | rotating, mirroring, flipping | NS | 500 images | RGB | Xception-65 (mIoU = 0.4803, PA = 95.10%, speed = 243.4 ± 4.8a) | [91] |
Xception-71 | Xception-71 (mIoU = 0.7894, PA = 99.06%, speed = 248.9 ± 4.1a) | |||||||||
ResNet-50 | ResNet-50 (mIoU = 0.7998, PA = 99.20%, speed = 154.0 ± 3.8c) | |||||||||
ResNet-101 | ResNet-101 (mIoU = 0.8326, PA = 99.24%, speed = 193.4 ± 4.0b) | |||||||||
water stress classification | NS | CNN | ResNet50 | rotation, re-scaling | SGD with momentum / Adam / RMSProp | 800 images | RGB | Average Accuracy: ResNet-50 with (Adam = 94.15%, RMSProp = 88.75%, SGDm = 83.77%) | [92] |
GoogLeNet | GoogLeNet with (Adam = 78.3%, RMSProp = 80.4%) | |||||||||
patch-level detection | NS | YOLOv2 | DarkNet19 | NS | SGD with Nesterov Momentum | 60,000 images | RGB | Accuracy = 87.05% | [93] | |
pixel-level segmentation | U-Net | NS | cropping, random jittering | Adam | mAP = 87.00%, IoU = 77.20%, Dice score = 75.02% | |||||
light stress grading | hydroponic | MFC-CNN | custom | 90, 180, and 270-degree rotation, mirror rotation, salt and pepper noise, and image sharpening | SGD | 1113 images | RGB | Accuracy = 87.95% Average F1-score = 0.8925 | [94] | |
Growth Monitoring | plant growth prediction | NS | NS | NS | NS | NS | 45 data samples | RGB | RMSE = 0.987, R2 = 0.728 for 4-7-1 network architecture | [95] |
leaf shape estimation | NS | custom | Spatial transformer network | rotation, scaling, translation | Adam | NS | RGB | PSNR = 30.61, SSIM = 0.8431 | [96] | |
PSNR = 26.55, SSIM = 0.9065 | ||||||||||
PSNR = 23.03, SSIM = 0.8154 | ||||||||||
growth prediction | soil-based | U-Net | SE-ResXt101 | cropping, scaling and padding | NS | 232 plant samples | RGB | F1-score = 97% | [97] | |
plant behaviour prediction | hydroponic | Mask R-CNN | NS | rotation and scaling | NS | 1728 images | RGB | leaf area accuracy = 100% | [98] | |
lettuce plant biomass prediction | hydroponic | DCNN | ResNet-50 | rotation, brightness, contrast, saturation, hue, grayscale | Adam | 864 plants | RGB | For RGBD (MAPE = 7.3%, RMSE = 1.13g), For RGB (MAPE = 9.6%, RMSE = 1.03g), For Depth (MAPE = 12.4%, RMSE = 2.04g) | [99] | |
growth prediction | hydroponic | ANN | NS | NS | NS | NS | NS | ANN: Accuracy (%) = 98.3235, F-measure (%) = 97.5413, Training time (sec) = 121.78 | [100] | |
SVM | SVM: Accuracy (%) = 96.0886, F-measure(%) = 93.4589, Training time (sec) = 202.48 | |||||||||
growth prediction | hydroponic | Mask R-CNN | ResNet-50 | flipping, cropping and rotation | NS | 600 images | NS | mAP = 76.9%, AP = 92.6% | [101] |
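The growth- and climate-prediction studies above are evaluated with regression metrics (RMSE, R2, MAPE). A minimal NumPy sketch of these formulas (the sample arrays are placeholders, not data from any reviewed study):

```python
# Sketch of the regression metrics reported in the tables above.
import numpy as np

def rmse(y_true, y_pred):
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def r2(y_true, y_pred):
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return float(1 - ss_res / ss_tot)

def mape(y_true, y_pred):
    return float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100)

y_true = np.array([1.0, 2.0, 3.0, 4.0])   # placeholder measurements
y_pred = np.array([1.1, 1.9, 3.2, 3.8])   # placeholder model outputs
print(rmse(y_true, y_pred), r2(y_true, y_pred), mape(y_true, y_pred))
```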
Model | Ref. | Advantages | Disadvantages |
---|---|---|---|
AE | [69,75] | Unsupervised feature learning; well suited to compressing high-dimensional (e.g., hyperspectral) data. | Reconstruction-driven training may discard task-relevant detail; sensitive to noise and hyperparameter choices. |
DBN | [74] | Layer-wise unsupervised pretraining; usable when labeled data are scarce. | Training is slow and computationally expensive; largely superseded by CNNs for visual tasks. |
LSTM | [31,61] | Captures long-range temporal dependencies in sensor time series; mitigates vanishing gradients. | Parameter-heavy and slow to train; requires long historical records. |
ANN | [30,32] | Simple to implement and fast to train on low-dimensional tabular/sensor data. | Limited representational capacity; relies on manual feature engineering; prone to overfitting. |
CNN | [45,62] | Learns spatial image features automatically; strongest reported accuracy for detection and classification. | Requires large labeled datasets and GPU resources; limited interpretability. |
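To make the LSTM trade-offs above concrete, a minimal PyTorch sketch of the kind of recurrent regressor used for greenhouse microclimate prediction (layer sizes, feature count, and horizon are illustrative assumptions, not taken from any reviewed study):

```python
# Illustrative LSTM regressor for multivariate climate sequences; all
# hyperparameters here are arbitrary assumptions for the sketch.
import torch
import torch.nn as nn

class ClimateLSTM(nn.Module):
    def __init__(self, n_features=6, hidden=64, horizon=1):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, horizon)  # e.g., temperature at t+30 min

    def forward(self, x):                 # x: (batch, seq_len, n_features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])   # predict from the last time step

model = ClimateLSTM()
x = torch.randn(8, 48, 6)                 # 8 sequences of 48 sensor readings
print(model(x).shape)                     # torch.Size([8, 1])
```

A CNN counterpart would swap the recurrent layers for convolutional feature extraction over images, at the cost of the larger labeled datasets noted in the table.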