Review

Deep Learning in Controlled Environment Agriculture: A Review of Recent Advancements, Challenges and Prospects

Department of Biological and Agricultural Engineering, Texas A&M AgriLife Research, Texas A&M University System, Dallas, TX 75252, USA
*
Author to whom correspondence should be addressed.
Sensors 2022, 22(20), 7965; https://doi.org/10.3390/s22207965
Submission received: 20 September 2022 / Revised: 12 October 2022 / Accepted: 12 October 2022 / Published: 19 October 2022
(This article belongs to the Special Issue Smart Decision Systems for Digital Farming)

Abstract

Controlled environment agriculture (CEA) is an unconventional production system that is resource efficient, uses less space, and produces higher yields. Deep learning (DL) has recently been introduced in CEA for different applications including crop monitoring, detecting biotic and abiotic stresses, irrigation, microclimate prediction, energy efficient controls, and crop growth prediction. However, no review study has assessed the state of the art of DL for solving the diverse problems in CEA. To fill this gap, we systematically reviewed DL methods applied to CEA. The review framework was established by following a series of inclusion and exclusion criteria. After extensive screening, we reviewed a total of 72 studies to extract the relevant information. The key contributions of this article are the following: an overview of DL applications in different CEA facilities, including greenhouses, plant factories, and vertical farms, is presented. We found that the majority of the studies focused on DL applications in greenhouses (82%), with yield estimation (31%) and growth monitoring (21%) as the primary applications. We also analyzed commonly used DL models, evaluation parameters, and optimizers in CEA production. From the analysis, we found that the convolutional neural network (CNN) is the most widely used DL model (79%), Adaptive Moment Estimation (Adam) is the most widely used optimizer (53%), and accuracy is the most widely used evaluation parameter (21%). Interestingly, all studies that focused on DL for the microclimate of CEA used RMSE as a model evaluation parameter. Finally, we discuss the current challenges and future research directions in this domain.

1. Introduction

Sustainable access to high-quality food is a problem in both developed and developing countries. Rapid urbanization, climate change, and depleting natural resources have raised concerns over global food security. Additionally, rapid population growth further aggravates the food insecurity challenge. According to the World Health Organization, food production needs to increase by 70% to meet the demand of about 10 billion people by 2050 [1], of which about 6.5 billion will be living in urban areas [2]. A significant amount of food is produced in open fields using traditional agricultural practices, which results in low yields per sq. ft of land used. Simply increasing the agricultural land is not a long-term option because of the associated risks of land degradation, deforestation, and increased emissions due to transportation to urban areas [3]. Thus, alternative production systems are essential to offset these challenges and establish a sustainable food supply chain.
Controlled environment agriculture (CEA), including greenhouses, high tunnels, vertical farms (vertical or horizontal plane), and plant factories, is increasingly considered an important strategy to address global food challenges [4]. CEA is further categorized based on the growing medium and production technology (hydroponics, aquaponics, aeroponics, and soil-based). CEA integrates knowledge across multiple disciplines to optimize crop quality and production efficiency even where arable land is insufficient. Globally, the CEA market grew by about 19% in 2020 and is projected to grow at a compound annual growth rate of 25% during the 2021–28 period [5]. The CEA market in the US is predicted to reach $3 billion by 2024, with an annual growth of about 24% [6]. Advocates of CEA claim that the system is more than 90% efficient in water use, produces 10–250 times higher yields per unit area, and generates 80% less waste than traditional field production, while also reducing food transportation miles in urban areas [3,7,8].
Despite all these benefits, the CEA industry struggles to achieve economic sustainability due to inefficient microclimate and rootzone-environment controls and high costs. Microclimate control, including light, temperature, airflow, carbon dioxide, and humidity, is a major challenge in CEA and is essential for producing uniform, high-quantity, and high-quality crops [9]. In the last decade, substantial research has been carried out on implementing intelligent systems in CEA facilities, such as nutrient solution management for hydroponic farms [10] and cloud-based micro-environment monitoring and control systems for vertical farms [11]. Further, the use of artificial intelligence (AI) algorithms has created new opportunities for intelligent predictions and self-learning [12]. DL has gained significant attention in the last few years due to its massive footprint in many modern-day technologies. DL algorithms applied across all units of CEA have provided insights that support growers' decisions and actions. Computer vision and DL algorithms have been implemented to automate irrigation in vertical stack farms [13] and microclimate control [14], enabling growers to carry out quantitative assessments for high-level decision-making.
CEA is an intensive production system: labor is required year-round, and the labor requirement is also significantly higher than in traditional agriculture [15]. A small indoor farm of less than 1500 sq. ft requires at least three full-time workers [16]. Intelligent automation, however, could address these challenges. Furthermore, the crop cycle in CEA is relatively short; therefore, timely decisions to perform specific operations are critical. For instance, the harvest decision requires information about crop maturity, which can be obtained using an optical sensor integrated with DL-based prediction models [17]. In recent years, research has been carried out to develop robotic systems for indoor agriculture [18,19,20]. For target detection, various sensors are implemented, such as cameras [19] or LiDAR [21]. This increasing popularity of DL applications in CEA motivated us to conduct a systematic review of recent advances in this domain.

1.1. Review Scope

Table 1 presents the existing review articles covering DL applications in different sectors of agriculture [22,23,24,25,26,27,28]. From the table, it is evident that the reported studies (to the best of the authors' knowledge) lack a critical overview of recent advancements in DL methodologies for CEA. Thus, a review of recent works in CEA is needed to determine the state of the art, identify current challenges, and provide future recommendations. Figure 1 shows the bibliometric network and co-occurrence map of the author-supplied keywords.

1.2. Paper Organization

The article is organized as follows: Section 2 describes the methodology of the review process, including establishing the review protocol, keyword selection, research question formulation, and data extraction. Section 3 presents the results of the review, including data synthesis and answers to the core research questions. Existing challenges and future recommendations are discussed in Section 4. The overall conclusions of the review are presented in Section 5.

2. Research Methodology

2.1. Review Protocol

In this research, we adhered to the standard systematic literature review (SLR) approach described by Okoli and Schabram [29]. Using this approach, we identified, specified, and analyzed all the publications on DL for CEA applications from 2019 to date, in order to present a response to each research question (RQ) and identify any gaps. We divided the SLR process into three parts: planning, conducting, and reporting the review. Figure 2 depicts the actions taken at each level of the SLR. During the planning phase, we identified the RQs, relevant keywords, and databases. After the RQs were prepared, the search protocol was created, specifying which databases and search strings should be used. A search string for each database was generated using the selected keywords. Wiley, Web of Science, IEEE Xplore, Springer Link, Google Scholar, Scopus, and Science Direct were the databases used in this study. The databases were chosen to ensure adequate coverage of the target sector and to increase the scope of the assessment. In the conducting stage, all eligible studies were screened and pertinent studies were chosen. Significant information was retrieved from the publications that met the selection/inclusion criteria in response to the RQs. In the reporting stage, the extracted data from the selected publications were used to answer the RQs, and the outcomes were presented using accompanying visuals and summary tables. This type of literature analysis demonstrates the most recent findings of DL research in CEA.

2.2. Research Questions

Identifying RQs is essential to a systematic review. At the start of the study, we established the RQs to guide the review procedure. The searched articles were examined from a variety of perspectives, and the following RQs were established.
  • RQ.1: What are the most frequently used DL models in CEA, and what are their benefits and drawbacks?
  • RQ.2: What are the main application domains of DL in CEA?
  • RQ.3: What evaluation parameters are used for DL models in CEA?
  • RQ.4: What are the DL backbone networks used in CEA applications?
  • RQ.5: What are the optimization methods used for CEA applications?
  • RQ.6: What are the primary growing media and plants used for DL in the CEA?

2.3. Search Method

To focus the search results on papers that were specifically relevant to the SLR’s scope, a methodical approach was taken. The original search was conducted using a generalized search equation that included the necessary keywords “deep learning” AND “controlled environment agriculture” OR “greenhouse” OR “plant factory” OR “vertical farm” to obtain the expanded search results. From the search results, a few studies were selected to extract the author-supplied keywords and synonyms. The discovered keywords produced the general search string/equation: (“controlled environment agriculture” OR “greenhouse” OR “plant factory” OR “vertical farm” OR “indoor farm”) AND (“deep learning” OR “deep neural network”). All seven databases were searched using the same keywords. The following search strings were used for the different databases:
  • Science Direct: (“controlled environment agriculture” OR “greenhouse” OR “plant factory” OR “vertical farm”) AND (“Deep Learning”) NOT (“Internet of Things” OR “GREENHOUSE GAS” OR “gas emissions” OR “Machine learning”)
  • Wiley: (“controlled environment agriculture” OR “greenhouse” OR “plant factory” OR “vertical farm*”) AND (“deep learning”) NOT (“Internet of Things” OR “greenhouse gas” OR “Gas emissions” OR “machine learning” OR “Review”)
  • Web of Science: (AB = (((“controlled environment agriculture” OR “vertical farm” OR “greenhouse” OR “plant factory”) AND (“deep learning” ) NOT ( “Gas Emissions” OR “Internet of Things” OR “Greenhouse Gas” OR “machine learning” OR “Review”))))
  • Springer Link: (“deep learning”) AND (“Greenhouse” OR “controlled environment agriculture” OR “vertical farm” OR “plant factory”) NOT (“Internet of things” OR “review” OR “survey” OR “greenhouse gas” OR “IoT” OR “machine learning” OR “gas emissions”)
  • Google Scholar: “greenhouse” OR “vertical farm” OR “controlled environment agriculture” OR “plant factory” “deep learning”—“Internet of Things”—“IoT”—“greenhouse gas”—“review”—“survey”—“greenhouse gases”—“Gas Emissions”—“machine learning”
  • Scopus: TITLE-ABS-KEY ((“deep learning”) AND (“vertical farm*” OR “controlled environment agriculture” OR “plant factory” OR “greenhouse”)) AND (LIMIT-TO (PUBYEAR, 2022 ) OR LIMIT-TO (PUBYEAR, 2021) OR LIMIT-TO ( PUBYEAR, 2020) OR LIMIT-TO ( PUBYEAR, 2019 )) AND (LIMIT-TO (LANGUAGE, “English” )) AND (EXCLUDE (EXACTKEYWORD, “Greenhouse Gases”) OR EXCLUDE ( EXACTKEYWORD, “Gas Emissions”) OR EXCLUDE (EXACTKEYWORD, “Machine Learning”) OR EXCLUDE (EXACTKEYWORD, “Internet of Things”))
  • IEEEXplore: (“controlled environment agriculture” OR “greenhouse” OR “plant factory” OR “vertical farm”) AND (“Deep Learning”) NOT (“Internet of Things” OR “GREENHOUSE GAS” OR “gas emissions” OR “Machine learning”)
After all the results were processed, a total of 751 studies were found using the aforementioned search strings.

2.4. Selection/Inclusion Criteria

To establish the limits of the SLR, inclusion criteria (IC) and exclusion criteria (EC) were defined. To choose the pertinent research based on the IC and EC, the studies obtained from all databases were carefully examined. The search outcomes from the several databases were combined in a spreadsheet and compared against all of the IC and EC. A study had to satisfy all of the ICs and none of the ECs to be considered for the review. Upon passing this screening, all studies that could respond to the RQs were deemed pertinent and chosen. The ICs and ECs are presented below:
  • IC.1: Peer-reviewed journal publications and conference papers.
  • IC.2: Studies published during the period between 2019 and April 2022.
  • IC.3: Studies should offer answers to the RQs.
  • EC.1: Study unrelated to DL for CEA.
  • EC.2: Full text not accessible.
  • EC.3: Duplicate or obtained from another database.
  • EC.4: Publication is a review or survey article.
  • EC.5: Publications such as book reviews, editorials, and summaries of conferences and seminars are not subjected to peer review.
  • EC.6: Studies published before 2019.
Applying the ICs and ECs produced a total of 72 eligible articles, which were then shortlisted for additional examination. An overview of the article search and selection procedure is shown in Figure 3. The distribution of selected papers from the different databases is shown in Table 2.

2.5. Data Extraction

Table 3 and Table 4 present the summary of studies that fulfilled the selection criteria. The data required to answer the RQs were extracted from the selected studies and summarized using a spreadsheet application. In the spreadsheet, each study was assigned a separate row, and each parameter a separate column. Tasks, DL model, training networks, imaging system, optimizer, pre-processing augmentation, application domain, performance parameters, growing medium, publication year, journal, and country, as well as challenges, were retrieved from the selected studies. To properly respond to the RQs, all of the extracted data were categorized and synthesized into various classifications. The following sections present the results of this SLR.

3. Deep Learning in CEA

RQ.1: What are the most frequently used DL models in CEA, and what are their benefits and drawbacks?
In CEA, DL models have been applied to a variety of tasks, such as crop phenotyping, disease and small insect detection, growth monitoring, nutrient status and stress level monitoring, microclimatic condition prediction, and robotic harvesting, all of which require large amounts of data for the machine to learn from. Various architectures have been implemented, including the deep belief network (DBN), convolutional neural network (CNN), recurrent neural network (RNN), stacked auto-encoder, long short-term memory (LSTM), and hybrid approaches. CNN, which has three primary benefits, namely parameter sharing, sparse interactions, and equivariant representations, is a popular and commonly used approach in deep learning. A CNN feature map is computed by k filters shared across the spatial dimensions of multi-channel input [102]. The feature map’s width and height are reduced using the pooling technique. CNNs use convolutional filters to capture semantic correlations in multi-dimensional data, pooling layers for spatial scaling, and shared weights for memory reduction, allowing them to uncover hidden patterns. As a result, the CNN architecture has a significant advantage in comprehending spatial data, and the network’s accuracy improves as the number of convolutional layers rises.
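To make these building blocks concrete, the following minimal sketch (our illustration, not drawn from any reviewed study) assembles a small CNN classifier in PyTorch; the 3 × 64 × 64 input size and four output classes are hypothetical placeholders.

```python
# Minimal CNN sketch (illustrative only): convolutions share weights across
# space, pooling halves the feature map's width and height, and a linear
# head classifies the flattened features.
import torch
import torch.nn as nn

class SimpleCNN(nn.Module):
    def __init__(self, num_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),  # 16 shared filters
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 32x32 -> 16x16
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

model = SimpleCNN()
logits = model(torch.randn(1, 3, 64, 64))  # -> shape (1, 4)
```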
RNN and LSTM are very useful for processing time-series data, which are frequently encountered in CEA. The most well-known RNN variants include Neural Turing Machines (NTM), Gated Recurrent Units (GRU), and Long Short-Term Memory (LSTM), with LSTM being the most popular for CEA applications. Autoencoders (AE) are typically used for data dimensionality reduction, compression, and fusion, automatically learning representations of the unlabeled input data. An autoencoder performs two operations: encoding and decoding. Encoding an input image yields a code, which is subsequently decoded to produce an output. The back-propagation technique is used to train the network so that the output reconstructs the input. A DBN is created by stacking a number of distinct unsupervised networks, such as restricted Boltzmann machines (RBMs), so that each layer is connected to both the previous and subsequent layers. As a result, DBNs are often constructed by stacking two or more RBMs. Notably, DBNs have been used in CEA applications [74]. The benefits and drawbacks of various DL models are listed in Table 5. Table 5 reveals that the identified drawbacks of DL methods prevent them from becoming canonical approaches in CEA. Each DL approach has features that make it better suited than the others to certain applications in CEA. Hybrid models are said to address the shortcomings of some of the single DL methods; the hybrid approach integrates several deep learning techniques. In the publications we reviewed, we discovered some studies that made use of the hybrid approach. Figure 4 shows a visual breakdown of the most frequently used DL approaches in CEA along with how frequently they are applied.
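As a concrete illustration of the time-series use case, the sketch below (our own, with assumed dimensions rather than values from any reviewed study) feeds a window of multi-sensor readings, such as greenhouse microclimate records, through an LSTM and predicts the next value from the final hidden state.

```python
# LSTM regressor sketch for time-series prediction (illustrative only).
# n_features, window length, and output size are assumptions.
import torch
import torch.nn as nn

class LSTMForecaster(nn.Module):
    def __init__(self, n_features: int = 5, hidden: int = 64, n_outputs: int = 1):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_outputs)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out, _ = self.lstm(x)          # out: (batch, time, hidden)
        return self.head(out[:, -1])   # predict from the last time step

model = LSTMForecaster()
window = torch.randn(8, 24, 5)         # 8 sequences, 24 steps, 5 sensor channels
pred = model(window)                   # -> (8, 1), e.g., next-hour temperature
```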
The following subsections classify CEA into two categories: (1) greenhouses and (2) indoor farms.

3.1. Deep Learning in Greenhouses

RQ.2: What are the main application domains of DL in CEA?
In this subsection, we present the applications of DL models in greenhouse production. Table 3 presents the application domain, tasks, DL model, network, optimizer, datasets, pre-processing augmentation, imaging method, growing medium, and performance of DL in greenhouses.

3.1.1. Microclimate Condition Prediction

Maintaining the greenhouse at its ideal operating conditions throughout all phases of plant growth requires an understanding of the microclimate and its characteristics. A greenhouse can increase crop yield by operating at the optimal temperature, humidity, carbon dioxide (CO2) concentration, and other microclimate parameters at each stage of plant growth. For instance, greater indoor air temperatures, which can be achieved by preserving the greenhouse effect or using the right heating technology, are necessary for maximum plant growth in cold climates. In very hot areas, by contrast, the greenhouse effect is only needed for a brief period of around 2–3 months, while suitable cooling systems are needed otherwise [103]. Accurate prediction of a greenhouse’s internal environmental factors using DL approaches is one of the recent trends in CEA. In our survey, we found five studies [30,31,32,33,34] that addressed microclimate condition prediction in the greenhouse.

3.1.2. Yield Estimation

Crop detection, one of the most important topics in smart agriculture and especially in greenhouse production, is critical for matching crop supply with demand and for crop management to boost productivity. Many of the surveyed articles demonstrate the application of DL models to crop yield estimation. The Single Shot MultiBox Detector (SSD) method was used in [37,43,51,53] to detect tomato crops in the greenhouse environment, followed by robotic harvesting. Other applications of SSD include detecting oyster mushrooms in [39] and sweet pepper in [49]. Another DL model, You Only Look Once (YOLO), has been utilized with different modifications for crop yield estimation, as demonstrated in [36,41,46,47,51,52,53]. As described in [40,42,45,48,50,61], R-CNN models such as Mask R-CNN and Faster R-CNN, two of the most widely used DL models, are applied in crop yield prediction, especially for tomato and strawberry. Other custom DL models for detecting crops have been proposed in [35,38,44,54].
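To illustrate how such detectors are typically applied, the hypothetical sketch below runs an off-the-shelf Faster R-CNN from torchvision (pretrained on COCO, not on crop images, and requiring torchvision 0.13 or later) and counts confident detections; none of the reviewed studies used this exact pipeline, and the 0.8 score threshold is an arbitrary assumption.

```python
# Counting confident detections with a pretrained detector (illustrative only;
# a real yield-estimation system would be fine-tuned on crop images).
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()  # COCO weights

image = torch.rand(3, 480, 640)             # stand-in for a greenhouse RGB image
with torch.no_grad():
    detections = model([image])[0]          # dict of boxes, labels, scores

keep = detections["scores"] > 0.8           # assumed confidence threshold
print(f"objects above threshold: {int(keep.sum())}")
```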

3.1.3. Disease Detection and Classification

Disease control in greenhouse environments is one of the most pressing issues in agriculture. Spraying pesticides/insecticides evenly over the agricultural area is the most common disease control method. Although effective, this approach comes at a tremendous financial cost. Image recognition techniques using DL can dramatically increase efficiency and speed while reducing recognition cost. As indicated in Table 3, the evaluated publications almost exclusively addressed diseases of tomato and cucumber. For tomato, we identified diseases such as powdery mildew (PM) in [55,58,62], early blight in [55,58,63], leaf mold in [59,62,63], yellow leaf curl in [59,63], gray mold in [62,63], spider mite in [60], and virus disease in [56]. Similarly, for cucumber, powdery mildew (PM) in [55,57,58], downy mildew (DM) in [55,57,58,61], and virus disease in [58] are the sole diseases discussed. The wheat disease stated in [64] is the only other disease reported in the examined articles.

3.1.4. Growth Monitoring

Plant growth monitoring is one of the applications where DL techniques have been applied in greenhouse production. It encompasses various areas such as length estimation at all crop growth stages, as demonstrated in [76,77], and detection of anomalies in plant growth in [78,82]. Other areas where plant growth monitoring is applied are the prediction of phytomorphological descriptors, as demonstrated in [79], seedling vigor rating in [80], leaf-shape estimation in [83], and spike detection and segmentation in [81].

3.1.5. Nutrient Detection and Estimation

Accurately diagnosing the nutritional state of crops is crucial for crop management in greenhouses because both an excess and a lack of nutrients can result in severe damage and decreased output. The goal of automatically identifying nutritional deficiencies is comparable to that of automatically recognizing diseases, in that both involve finding the visual signs that characterize the disorder of concern. Based on our survey, we found few works dedicated to DL for nutrient estimation compared to nutrient detection. The goal of nutrient detection is to identify one of these pertinent deficiencies; therefore, symptoms that do not seem to be connected to the targeted disorders are disregarded. The studies [69,75] employed the autoencoder approach to detect nutrient deficiencies and lead content, respectively. CNN models were also frequently used in applications for nutrient detection, as demonstrated for soybean leaf defoliation in [70], nutrient concentration in [72], nutrient deficiencies in [75], net photosynthesis modeling in [71], and calcium and magnesium deficiencies in [73]. As shown in [74], the cadmium concentration of lettuce leaves was estimated using a different DL model, a DBN optimized using particle swarm optimization.
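For readers unfamiliar with the autoencoder approach mentioned above, the following sketch (ours, not the cited studies' architectures) defines a small convolutional autoencoder trained to reconstruct its input; the learned bottleneck code could then feed a downstream deficiency classifier. The layer sizes and 64 × 64 input are assumptions.

```python
# Convolutional autoencoder sketch (illustrative only): the encoder compresses
# the image into a bottleneck code, and the decoder reconstructs the input.
import torch
import torch.nn as nn

class ConvAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(16, 8, 3, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(8, 16, 2, stride=2), nn.ReLU(),     # 16 -> 32
            nn.ConvTranspose2d(16, 3, 2, stride=2), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))

model = ConvAutoencoder()
x = torch.rand(4, 3, 64, 64)
loss = nn.functional.mse_loss(model(x), x)   # train output to match input
```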

3.1.6. Small Insect Detection

The intricate nature of pest control in greenhouses calls for a methodical approach to early and accurate pest detection. Using an automatic detection approach (i.e., DL) for small insects in a greenhouse is even more critical for quickly and efficiently obtaining trap counts. The most prevalent greenhouse insects in the reviewed studies are whiteflies and thrips [65,66,67,68]. Our survey identified four studies applying DL models (mostly CNN architectures) to tiny pest detection.

3.1.7. Robotic Harvesting

Robotics has evolved into a new “agricultural tool” in an era of advanced smart agriculture technology. The development of agricultural robots has been hastened by the integration of digital tools, sensors, and control technologies, exhibiting tremendous potential and advantages in modern farming. These developments span from rapidly digitizing plants with precise, detailed temporal and spatial information to completing challenging nonlinear control tasks for robot navigation. High-value crops planted in CEA (i.e., tomato, sweet pepper, cucumber, and strawberry) ripen heterogeneously and require selective harvesting of only the ripe fruits. According to the reviewed papers, few works have utilized DL for robotic harvesting applications, such as picking-point positioning for grapes [85], obstacle separation using robots in tomato harvesting [84], 3D-pose detection for tomato bunches [86], and target tomato position estimation [87].

3.1.8. Others

Other DL applications in CEA include predicting the life and mechanical properties of low-density polyethylene (LDPE) film in greenhouses using a hybrid model integrating both SVM and CNN [88].

3.2. Deep Learning in Indoor Farms

This subsection presents the main applications of the reviewed works that utilized DL in indoor farms (vertical farms, shipping containers, plant factories, etc.). Table 4 presents the application domain, tasks, DL model, network, optimizer, datasets, preprocessing augmentation, imaging method, growing medium, and performance of DL in indoor farms.

3.2.1. Stress-Level Monitoring

To reduce both acute and chronic productivity loss, early detection of plant stress is crucial in CEA production. Rapid detection and decision-making are necessary when stress manifests in plants in order to manage the stress and prevent economic loss. We found that a few DL papers on stress-level monitoring have been reported for plant factories. Stress-level monitoring encompasses various areas such as water stress classification [92], tip-burn stress detection [93], lettuce light stress grading [94], and abnormal leaf sorting [91].

3.2.2. Growth Monitoring

In an indoor farm, it is critical to maintain a climate that promotes crop development through ongoing monitoring of farm conditions. Crop states are critical for determining the optimal cultivation environment, and by continuously monitoring crop status, a proper crop-optimized farm environment can feasibly be maintained. In contrast to traditional methods, which are time-consuming, DL models are required to automate the monitoring system and increase measurement accuracy. We found that several studies used DL models for growth monitoring in indoor farms, including plant biomass monitoring [99], growth prediction models for arabidopsis [97] and lettuce [95], vision-based plant phenotyping [98], plant growth prediction algorithms [96,101], and the development of an automatic plant factory control system [100].

3.2.3. Yield Estimation

Due to its advantages over traditional methods in terms of accuracy, speed, robustness, and even resolving complicated agricultural scenarios, DL methods have been applied to yield estimation and counting research applications in indoor farming systems. The domains covered by yield estimation and counting from the examined publications include the identification of rapeseed [89] and cherry tomatoes [90].
The application distribution of DL techniques in CEA is shown in Figure 5.

4. Discussion

4.1. Summary of Reviewed Studies

We observed a rapid advancement in CEA using DL techniques between 2019 and 2022, as demonstrated in Figure 6. With work rising since 2019, this illustrates the relevance of DL in CEA. Figure 7 shows the distribution of published articles by journal; the journal Computers and Electronics in Agriculture published the most DL-for-CEA articles (19). We also present the country-by-country distribution of the evaluated articles in Figure 8, with China accounting for 40% of the total, the highest number of publications. Korea and the Netherlands account for 10% and 7% of the papers, respectively.

4.2. Evaluation Parameters

Our survey found that various evaluation parameters were employed in the selected publications (RQ.3). Precision, recall, intersection-over-union (IoU), root mean square error (RMSE), mean average precision (mAP), F1-score, R-Square, peak signal-to-noise ratio (PSNR), Jaccard index, success rate, sensitivity, specificity, accuracy, structural similarity index measure (SSIM), errors, standard error of prediction (SEP), and inference time were the most commonly used evaluation parameters for DL analysis in CEA. Figure 9 depicts the frequency with which the assessment parameters are used. Accuracy was the most frequently utilized evaluation measure, appearing 29 times. Precision, recall, mAP, F1-score, and RMSE were used at least 10 times; IoU and R-Square were used 5 times, while the rest were used fewer than 5 times. We noticed that RMSE and R-Square were utilized as evaluation metrics in all microclimate prediction studies. Success rate and accuracy were used as evaluation measures for robotic harvesting applications. With the exception of a few cases of recall, precision, mAP, and F1-score, works related to growth monitoring applications used accuracy, RMSE, and R-Square. RMSE, precision, recall, mAP, F1-score, and accuracy were commonly utilized in the other applications in the examined studies.
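For reference, the sketch below shows how several of the most common evaluation parameters named above are computed; mAP is omitted because it requires ranked per-class detections. The numeric inputs are made-up examples, not values from any reviewed study.

```python
# Common evaluation parameters (illustrative implementations).
import numpy as np

def precision_recall_f1(tp: int, fp: int, fn: int):
    p = tp / (tp + fp)
    r = tp / (tp + fn)
    return p, r, 2 * p * r / (p + r)

def rmse(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def iou(box_a, box_b) -> float:
    """Boxes as (x1, y1, x2, y2); intersection-over-union in [0, 1]."""
    x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

print(precision_recall_f1(tp=90, fp=10, fn=20))           # (0.9, 0.818, 0.857)
print(rmse(np.array([1.0, 2.0]), np.array([1.5, 2.5])))   # 0.5
print(iou((0, 0, 10, 10), (5, 5, 15, 15)))                # ~0.143
```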

4.3. DL Backbone Networks

RQ.4: What are the DL backbone networks used in CEA applications?
There are many backbone networks, but this article focuses only on those used in the reviewed papers, which include ResNet, EfficientNet, DarkNet, Xception, InceptionResNet, MobileNet, VGG, GoogleNet, and PRPNet. These network structures are fine-tuned or combined with other backbone structures.
ResNet was the most frequently utilized network in CEA applications, according to the survey, as illustrated in Figure 10. The ResNet architecture can overcome the vanishing/exploding gradient problem [104]. When using gradient-based learning and backpropagation to train a deep neural network with n hidden layers, the gradient involves a product of n derivatives. The vanishing gradient problem occurs when these derivatives are small: the gradient rapidly diminishes as it propagates through the model until it vanishes. Conversely, when the derivatives are large, the gradient grows exponentially, resulting in the exploding gradient problem. ResNet utilizes a skip connection strategy to skip some training layers and connect directly to the output. The benefit of the skipping approach is that if any layer degrades the performance of the network, it can be bypassed through the shortcut, mitigating the exploding/vanishing gradient problems.
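The residual block described above can be written down in a few lines; the sketch below is a generic illustration with an assumed channel count, not the exact block of any reviewed network.

```python
# Residual block sketch: the identity shortcut adds the input to the
# convolutional branch, letting gradients flow around the layers.
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = torch.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return torch.relu(out + x)     # skip connection

block = ResidualBlock(64)
y = block(torch.randn(1, 64, 32, 32))  # output shape matches input shape
```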
The main feature of MobileNet [105] is that it replaces the standard convolutions of traditional network structures with depth-wise separable convolutions. Its significant advantages are high computational efficiency and a small number of convolutional parameters. MobileNet v1 and v2 are used in the reviewed articles, with v2 performing faster than v1. ResNet, on the other hand, adds a structure made up of multiple network layers that features a shortcut connection, known as a residual block. Mask R-CNN uses ResNet and FPN to combine and extract multi-layer information. Many variants of the ResNet architecture were encountered in the reviewed articles, i.e., the same concept but with a different number of layers. ResNeXt replicates a building block that aggregates a set of transformations with the same topology; it exposes a new dimension compared to ResNet and requires minimal extra effort in designing each path.
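The parameter saving from depth-wise separable convolution can be verified directly; the sketch below contrasts the two operations for assumed channel counts.

```python
# Standard vs. depth-wise separable convolution (illustrative channel counts).
import torch.nn as nn

standard = nn.Conv2d(32, 64, kernel_size=3, padding=1)        # 32*64*9 = 18,432 weights

separable = nn.Sequential(
    nn.Conv2d(32, 32, kernel_size=3, padding=1, groups=32),   # depthwise: 32*9 = 288
    nn.Conv2d(32, 64, kernel_size=1),                         # pointwise: 32*64 = 2,048
)
# ~7.9x fewer weights here, approaching k^2 = 9x as the output width grows.
```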
The Inception network [106] uses many tricks, such as dimension reduction, to push performance in terms of both speed and accuracy. The versions of the Inception network used in the reviewed papers are InceptionV2, InceptionV3, Inception-ResNetV2, and SSD InceptionV2; each version is an upgrade that increases accuracy and reduces computational complexity. Inception-ResNetV2 can achieve higher accuracies at a lower epoch count. With the advantage of expanding network depth while using a small convolution filter size, VGG [107] can significantly boost model performance. VGGNet inherits some of its framework from AlexNet [108]. GoogleNet [109] has an inception module inspired by sparse matrices, which can be clustered into dense sub-matrices to boost computation speed; this contrasts with AlexNet and VGGNet, which increase network depth to improve training results. Contrary to VGG-nets, the Inception model family has shown that correctly constructed topologies can produce compelling accuracy with minimal theoretical complexity.
DarkNet, the backbone network of You Only Look Once (YOLO), has been enhanced in its most recent editions. YOLOv2 and YOLOv3 introduced DarkNet19 and DarkNet53, respectively, while YOLOv4 proposed CSPDarkNet [110]. CSPNet [111] was proposed to mitigate the problem of heavy inference computation from the network architecture perspective and is used in a recent YOLO structure, SE-YOLOv5 [56]. Other backbone network structures include Xception [112] with 65 and 71 layers, EfficientNet [113], and PRPNet [55].

4.4. Optimizer

RQ.5: What are the optimization methods used for CEA applications?
In contrast to the increasing complexity of neural network topologies [114], the training methods remain very straightforward. For a neural network to be effective, it must first be trained, as most neural networks produce random outputs without training. Optimizers, which modify the properties of the neural network such as its weights and learning rate, have long been recognized as a fundamental component of DL, and a robust optimizer can dramatically increase the performance of a given architecture.
Stochastic gradient descent (SGD) is an optimization approach and one of the variants of gradient descent commonly used in neural networks. It updates the parameters one training example at a time, eliminating redundancy. As a hyperparameter, the learning rate of SGD is often difficult to tune because the magnitudes of different parameters vary greatly and adjustment is required during the training process. Several adaptive gradient descent variants have been created to address this problem, including Adaptive Moment Estimation (Adam) [115], RMSprop [116], Ranger [117], Momentum [118], and Nesterov [119]. These algorithms automatically adapt the learning rate to different parameters based on gradient statistics, leading to faster convergence and simpler learning strategies, and they appear in many neural networks applied to CEA applications, as demonstrated in Figure 11.
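Swapping optimizers is typically a one-line change in modern frameworks; the toy loop below (our sketch, with illustrative hyperparameters rather than values from the reviewed studies) shows Adam in place, with SGD and RMSprop as drop-in alternatives.

```python
# Optimizer sketch: Adam adapts per-parameter learning rates from gradient
# statistics, while SGD uses a single global rate (optionally with momentum).
import torch
import torch.nn as nn

model = nn.Linear(10, 1)
loss_fn = nn.MSELoss()

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
# optimizer = torch.optim.SGD(model.parameters(), lr=1e-2, momentum=0.9)
# optimizer = torch.optim.RMSprop(model.parameters(), lr=1e-3)

x, y = torch.randn(32, 10), torch.randn(32, 1)   # toy data
for _ in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
```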

4.5. Growing Medium and Plant Distribution

RQ.6: What are the primary growing media and plants used for DL in the CEA?
We note that the most common growing medium in the evaluated studies is soil-based (78%), as shown in Figure 12. Among soil-less growing media, there are 14 publications on hydroponics, one on aquaponics, and none on aeroponics. This suggests that DL research on these soil-less growing media is still in its infancy. We also show the distribution of the plants used in the evaluated papers in Figure 13: tomatoes represent 39% of all plants grown in CEA, corresponding to the highest number of publications. The percentages of papers that planted lettuce, pepper, and cucumber are 16%, 9%, and 8%, respectively. According to the reviewed publications, indoor farms used soil-less techniques (hydroponics and aquaponics) more frequently than greenhouse systems, which typically used soil-based growing media.

4.6. Challenges and Future Directions

To the best of our knowledge, the paragraphs below provide a brief description of some specific aspects of the challenges and potential directions of DL applications in CEA.
For DL models to be effective, learning typically requires a lot of data. Such huge training datasets are difficult to gather, not publicly available for some CEA applications, and may even be problematic owing to privacy laws. Even though data augmentation can partially compensate for the shortage of large labeled datasets, it is difficult to fully meet the demand for hundreds or thousands of high-quality data points. DL models may fail to generalize in situations where the data are insufficient. However, we discovered a number of studies that used smaller datasets and attained high accuracy, as shown in [40,45,56,59,82]. These studies demonstrated various strategies for handling this circumstance by carefully choosing the features that ensure the method performs at its peak. Additionally, to ensure optimal performance and streamline the processing of the learning algorithms, the dimensionality of the input vectors for the classification and detection algorithms must be reduced.
DL algorithms are also sensitive to the quality of the data used to train them. Overfitting can occur when an algorithm “learns” the noise and excessive detail in the training set, which has a detrimental effect on the resulting model’s ability to generalize: the model performs admirably on the training dataset but poorly on new data. Regularization techniques to combat overfitting include weight decay/regularization, altering the network’s complexity (i.e., the number of weights and their values), early stopping, and activity regularization.
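Two of the regularization techniques named above, weight decay and early stopping, are shown in the self-contained toy sketch below; the patience value, model, and data are assumptions for illustration.

```python
# Weight decay (via the optimizer) and early stopping on validation loss.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)

x, y = torch.randn(256, 10), torch.randn(256, 1)         # toy training data
val_x, val_y = torch.randn(64, 10), torch.randn(64, 1)   # toy validation data

best_val, patience, bad_epochs = float("inf"), 5, 0
for epoch in range(200):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    optimizer.step()
    with torch.no_grad():
        val_loss = nn.functional.mse_loss(model(val_x), val_y).item()
    if val_loss < best_val:
        best_val, bad_epochs = val_loss, 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:   # stop before the model overfits further
            break
```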
In the future, we expect to see more combinations of two time-series models for temporal sequence processing, as demonstrated in [31]. It is also anticipated that more methods will use LSTM or other RNN models, exploiting the time dimension to make more accurate predictions, especially in climatic condition prediction. Such models also help gauge the reliability of time-series prediction by offering an explicable result. As a result, improving interpretability will receive a lot of attention in the future [120].
The majority of the evaluated studies focused on supervised learning, while just a small number used semi-supervised learning. Future works that incorporate unsupervised learning into CEA applications will rely heavily on tools such as the generative adversarial network (GAN). GAN is a generative modeling method that learns to replicate a specific data distribution. The lack of data is a major barrier to creating effective deep neural network models, and GANs offer a solution [121]. The realistic images created by GANs, which differ from the original training data, are appealing for data augmentation in DL-based computer vision and help lessen model overfitting.
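The adversarial training idea can be sketched compactly; the toy example below (ours, with made-up dimensions and fully connected networks standing in for the convolutional ones a real image GAN would use) alternates discriminator and generator updates.

```python
# Minimal GAN training loop sketch (illustrative only).
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 100), nn.Tanh())
D = nn.Sequential(nn.Linear(100, 64), nn.ReLU(), nn.Linear(64, 1))

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()
real = torch.rand(32, 100)                  # stand-in for flattened real images

for step in range(100):
    fake = G(torch.randn(32, 16))
    # Discriminator update: push real toward 1 and fake toward 0.
    d_loss = bce(D(real), torch.ones(32, 1)) + bce(D(fake.detach()), torch.zeros(32, 1))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()
    # Generator update: fool the discriminator into predicting 1 on fakes.
    g_loss = bce(D(fake), torch.ones(32, 1))
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()
```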
Another area worth noting is the clear interest in the use of AI and computer vision in CEA applications. With the use of DL-based computer vision, a number of difficult CEA issues are being resolved. However, DL-based computer vision does face significant difficulties, one of which is its enormous processing requirement. Adopting cloud-based solutions with auto-scaling, load-balancing, and high-availability characteristics is one way to deal with this issue. Real-time video analysis and real-time inference are limitations of cloud solutions, but edge devices with features such as GPU accelerators can perform them. Utilizing computer vision solutions on edge hardware helps lessen latency restrictions. Few works have addressed the need for proper security to ensure data integrity and dependability in the rapidly expanding field of computer vision in CEA; additional research into this area is needed in subsequent works.
There is an imperative need to apply deep learning in the next few years to areas such as developing more microclimate models for monitoring and maintaining the microclimatic parameters within the desired range for optimal plant growth and development, thus helping with irrigation and fertigation management of the crops. The need for AI, particularly DL, to derive an empirical and non-linear “growth response function” that maps microclimate conditions to crop growth stages is critical because, according to the reviewed papers, this has not been extensively studied. This calls for the optimization of microclimate control set points at various growth stages of crops. There are currently very few publications that have developed prediction models for the microclimate parameters in CEA.

In addition to microclimate prediction models, more microclimate control systems also need to be developed, such as (1) automatic shading systems to protect crops from harsh sunlight in greenhouses; (2) pad-fan systems and fogging systems based on vapor pressure deficit (VPD) control, which is an effective way to simultaneously maintain ideal ranges of temperature and relative humidity, thus significantly enhancing plant photosynthesis and productivity in greenhouse production; and (3) photoperiod control systems based on light spectrum and intensity control. Despite the paucity of studies on microclimate prediction and control, extensive research is needed on the use of edge-AI systems for precise monitoring at various phases of crop growth. Lastly, it is crucial to investigate the use of DL for nutrient solution management in soilless cultures (influenced by both microclimate conditions and crop growth). We anticipate that further research on monitoring, predicting, controlling, and optimizing microclimate factors in CEA will become available in the near future as advancements in accuracy, efficiency, and architectures are put forth.

Additionally, labor availability and associated costs are a growing concern for the sustainability and profitability of the CEA industry. Some research has been reported on developing robotic systems, but the majority of it is focused on field production. However, CEA is a unique production environment, and indoor-grown crops have different requirements for automation based on the production technology employed (greenhouse, vertical tower, vertical tier, hydroponic, dutch bucket, pot/tray, etc.). Further, CEA crops are more densely planted (plants per unit area), which makes robotics applications more challenging. Thus, extensive efforts are required to develop DL-driven automation and robotic systems for these different production environments.

5. Conclusions

Today, it is evident that prediction and optimization procedures are essential in many industries. This study presented a comprehensive review of DL-based research efforts in CEA, motivated by the most recent breakthroughs in computational neuroscience. The study examined various application areas, described the tasks, listed technical details such as DL models and networks, and described the preprocessing augmentation, the optimizer used, and the performance of each method.
The results of this study demonstrate that applications of DL models have attracted a lot of interest recently as a result of their ability to recognize distinctive object features and offer greater precision. There is no way to determine which DL model is the best; each is suited to different applications. However, we found that RNN-LSTM was frequently used for predicting microclimate conditions in CEA due to its suitability for time-series prediction. We noticed that prediction of microclimate conditions, a crucial issue in CEA, was the subject of relatively little of the reported research. Based on the reviewed papers, CNN models, the most widely used DL models, have high applicability and universality. CNN and ResNet are the most widely adopted DL model and backbone network, respectively, while other models and networks are also implemented in this domain. To generate constructive discussion of the limitations of DL techniques in the CEA domain, critical challenges and future research prospects were presented. We believe these studies will serve as a roadmap for future work towards creating intelligent systems for various CEA applications.

Author Contributions

Conceptualization, M.O.O. and A.Z.; methodology, M.O.O.; investigation, M.O.O.; writing—original draft preparation, M.O.O.; writing—review and editing, A.Z.; visualization, M.O.O.; funding acquisition, A.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research was partially supported in part by the United States Department of Agriculture (USDA)’s National Institute of Food and Agriculture (NIFA) Federal Appropriations under TEX09954 and Accession No. 7002248. This publication was also supported by AgriLife Research, VFIC, and the Hatch program of the National Institute of Food and Agriculture, U.S. Department of Agriculture.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

AE: Autoencoder
AI: Artificial Intelligence
Adam: Adaptive Moment Estimation
AGA: Average Gripping Accuracy
ANN: Artificial Neural Networks
AP: Average Precision
CEA: Controlled Environment Agriculture
CNN: Convolutional Neural Networks
DBN: Deep Belief Network
DCNN: Deep Convolutional Neural Networks
DL: Deep Learning
DM: Downy Mildew
DNN: Deep Neural Networks
FPN: Feature Pyramid Networks
HSI: Hue, Saturation, Intensity
HSV: Hue, Saturation, Value
GRU: Gated Recurrent Unit
IoU: Intersection Over Union
LDPE: Low-Density Polyethylene
LiDAR: Light Detection and Ranging
LiPo: Lithium-ion Polymer
LSTM: Long Short-Term Memory
mAP: Mean Average Precision
MAPE: Mean Average Percent Error
NS: Not Specified
NTM: Neural Turing Machines
P: Precision
PM: Powdery Mildew
PSNR: Peak Signal-to-Noise Ratio
R: Recall
R2: R-Square
RBM: Restricted Boltzmann Machine
R-CNN: Region-Based Convolutional Neural Networks
ResNet: Residual Networks
RMSE: Root Mean Square Error
RNN: Recurrent Neural Networks
RMSProp: Root Mean Squared Propagation
RPN: Region Proposal Network
RQ: Research Question
SGD: Stochastic Gradient Descent
SEP: Standard Error of Prediction
SSD: Single Shot MultiBox Detector
SSIM: Structural Similarity Index Measure
SLR: Systematic Literature Review
STN: Spatial Transformer Network
SVM: Support Vector Machine
TCN: Temporal Convolutional Networks
VGG: Visual Geometry Group
VPD: Vapor Pressure Deficit
YOLO: You Only Look Once

References

  1. World Health Organization. The State of Food Security and Nutrition in the World 2018: Building Climate Resilience for Food Security and Nutrition; Food and Agriculture Organization: Rome, Italy, 2018. [Google Scholar]
  2. Avtar, R.; Tripathi, S.; Aggarwal, A.K.; Kumar, P. Population–Urbanization–Energy Nexus: A Review. Resources 2019, 8, 136. [Google Scholar] [CrossRef] [Green Version]
  3. Benke, K.; Tomkins, B. Future Food-Production Systems: Vertical Farming and Controlled-Environment Agriculture. Sustain. Sci. Pract. Policy 2017, 13, 13–26. [Google Scholar] [CrossRef] [Green Version]
  4. Saad, M.H.M.; Hamdan, N.M.; Sarker, M.R. State of the Art of Urban Smart Vertical Farming Automation System: Advanced Topologies, Issues and Recommendations. Electronics 2021, 10, 1422. [Google Scholar] [CrossRef]
  5. Fortune Business Insights. Vertical Farming Market to Rise at 25.2% CAGR by 2028; Increasing Number of Product Launches Will Aid Growth, Says Fortune Business Insights™. Available online: https://www.globenewswire.com/news-release/2021/06/08/2243245/0/en/vertical-farming-market-to-rise-at-25-2-cagr-by-2028-increasing-number-of-product-launches-will-aid-growth-says-fortune-business-insights.html (accessed on 18 July 2022).
  6. Cision. United States $3 Billion Vertical Farming Market to 2024: Growing Popularity of Plug & Play Farms Scope for Automation Using Big Data and AI. Based on Report, Vertical Farming Market in the U.S.—Industry Outlook and Forecast 2019–2024”, by Research and Markets. Available online: https://www.prnewswire.com/news-releases/united-states-3-billion-vertical-farming-market-to-2024-growing-popularity-of-plug--play-farms--scope-for-automation-using-big-data-and-ai-300783042.html (accessed on 18 July 2022).
  7. Asseng, S.; Guarin, J.R.; Raman, M.; Monje, O.; Kiss, G.; Despommier, D.D.; Meggers, F.M.; Gauthier, P.P. Wheat Yield Potential in Controlled-Environment Vertical Farms. Proc. Natl. Acad. Sci. USA 2020, 117, 19131–19135. [Google Scholar] [CrossRef] [PubMed]
  8. Naus, T. Is Vertical Farming Really Sustainable. EIT Food. Available online: https://www.eitfood.eu/blog/post/is-vertical-farming-really-sustainable (accessed on 18 July 2022).
  9. Chia, T.-C.; Lu, C.-L. Design and Implementation of the Microcontroller Control System for Vertical-Garden Applications. In Proceedings of the 2011 Fifth International Conference on Genetic and Evolutionary Computing, Xiamen, China, 29 August–1 September 2011; pp. 139–141. [Google Scholar]
  10. Michael, G.; Tay, F.; Then, Y. Development of Automated Monitoring System for Hydroponics Vertical Farming. J. Phys. Conf. 2021, 1844, 012024. [Google Scholar] [CrossRef]
  11. Bhowmick, S.; Biswas, B.; Biswas, M.; Dey, A.; Roy, S.; Sarkar, S.K. Application of IoT-Enabled Smart Agriculture in Vertical Farming. In Advances in Communication, Devices and Networking, Lecture Notes in Electrical Engineering; Springer: Sinngapore, 2019; Volume 537, pp. 521–528. [Google Scholar]
  12. Monteiro, J.; Barata, J.; Veloso, M.; Veloso, L.; Nunes, J. Towards Sustainable Digital Twins for Vertical Farming. In Proceedings of the 2018 Thirteenth International Conference on Digital Information Management (ICDIM), Berlin, Germany, 24–26 September 2018; pp. 234–239. [Google Scholar]
  13. Siregar, R.R.A.; Palupiningsih, P.; Lailah, I.S.; Sangadji, I.B.; Sukmajati, S.; Pahiyanti, A.N.G. Automatic Watering Systems in Vertical Farming Using the Adaline Algorithm. In Proceedings of the International Seminar of Science and Applied Technology (ISSAT 2020), Virtual, 24 November 2020; pp. 429–435. [Google Scholar]
  14. Ruscio, F.; Paoletti, P.; Thomas, J.; Myers, P.; Fichera, S. Low-cost Monitoring System for Hydroponic Urban Vertical Farms. Int. J. Agric. Biosyst. Eng. 2019, 13, 267–271. [Google Scholar]
  15. Leblanc, R. What You Should Know about Vertical Farming. Available online: https://www.thebalancesmb.com/what-you-should-know-about-vertical-farming-4144786 (accessed on 18 July 2022).
  16. Statista-Research. Labor Demand for Indoor Farming Worldwide as of 2016, by Farm Size. Available online: https://www.statista.com/statistics/752196/labor-demand-for-indoor-farming-by-farm-size/ (accessed on 18 July 2022).
  17. Iron-OX. Available online: https://ironox.com/technology/ (accessed on 18 July 2022).
  18. Bac, C.W.; Hemming, J.; Henten, E.J.V. Stem Localization of Sweet-Pepper Plants using the Support Wire as a Visual Cue. Comput. Electron. Agric. 2014, 105, 111–120. [Google Scholar] [CrossRef]
  19. Feng, Q.; Zou, W.; Fan, P.; Zhang, C.; Wang, X. Design and Test of Robotic Harvesting System for Cherry Tomato. Int. J. Agric. Biol. 2018, 11, 96–100. [Google Scholar] [CrossRef]
  20. Yaguchi, H.; Nagahama, K.; Hasegawa, T.; Inaba, M. Development of an Autonomous Tomato Harvesting Robot with Rotational Plucking Gripper. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea, 9–14 October 2016; pp. 652–657. [Google Scholar]
  21. Tsoulias, N.; Paraforos, D.S.; Xanthopoulos, G.; Zude-Sasse, M. Apple Shape Detection Based on Geometric and Radiometric Features using a LiDAR Laser Scanner. Remote Sens. 2020, 12, 2481. [Google Scholar] [CrossRef]
  22. Kamilaris, A.; Prenafeta-Boldú, F.X. Deep Learning in Agriculture: A Survey. Comput. Electron. Agric. 2018, 147, 70–90. [Google Scholar] [CrossRef] [Green Version]
  23. Koirala, A.; Walsh, K.B.; Wang, Z.; McCarthy, C. Deep Learning–Method Overview and Review of Use for Fruit Detection and Yield Estimation. Comput. Electron. Agric. 2019, 162, 219–234. [Google Scholar] [CrossRef]
  24. Saleem, M.H.; Potgieter, J.; Arif, K.M. Plant Disease Detection and Classification by Deep Learning. Plants 2019, 8, 468. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  25. Zhang, Q.; Liu, Y.; Gong, C.; Chen, Y.; Yu, H. Applications of Deep Learning for Dense Scenes Analysis in Agriculture: A Review. Sensors 2020, 20, 1520. [Google Scholar] [CrossRef] [Green Version]
  26. Li, L.; Zhang, S.; Wang, B. Plant Disease Detection and Classification by Deep Learning—A Review. IEEE Access 2021, 9, 56683–56698. [Google Scholar] [CrossRef]
  27. Hasan, A.M.; Sohel, F.; Diepeveen, D.; Laga, H.; Jones, M.G. A Survey of Deep Learning Techniques for Weed Detection from Images. Comput. Electron. Agric. 2021, 184, 106067. [Google Scholar] [CrossRef]
  28. Darwin, B.; Dharmaraj, P.; Prince, S.; Popescu, D.E.; Hemanth, D.J. Recognition of Bloom/Yield in Crop Images Using Deep Learning Models for Smart Agriculture: A Review. Agronomy 2021, 11, 646. [Google Scholar] [CrossRef]
  29. Okoli, C.; Schabram, K. A Guide to Conducting a Systematic Literature Review of Information Systems Research; Elsevier: Amsterdam, The Netherlands, 2010. [Google Scholar]
  30. Nam, D.S.; Moon, T.; Lee, J.W.; Son, J.E. Estimating Transpiration Rates of Hydroponically-Grown Paprika Via an Artificial Neural Network Using Aerial and Root-Zone Environments and GrowthFactors in Greenhouses. Hortic. Environ. Biotechnol. 2019, 60, 913–923. [Google Scholar] [CrossRef]
  31. Gong, L.; Yu, M.; Jiang, S.; Cutsuridis, V.; Pearson, S. Deep Learning Based Prediction on Greenhouse Crop Yield Combined TCN and RNN. Sensors 2021, 21, 4537. [Google Scholar] [CrossRef]
  32. Jung, D.-H.; Kim, H.S.; Jhin, C.; Kim, H.-J.; Park, S.H. Time-Serial Analysis of Deep Neural Network Models for Prediction of Climatic Conditions inside a Greenhouse. Comput. Electron. Agric. 2020, 173, 105402. Available online: https://www.sciencedirect.com/science/article/pii/S0168169919317326 (accessed on 15 September 2022). [CrossRef]
  33. Ali, A.; Hassanein, H.S. Wireless Sensor Network and Deep Learning for Prediction Greenhouse Environments. In Proceedings of the 2019 International Conference on Smart Applications, Communications and Networking (SmartNets), Sharm El Sheikh, Egypt, 17–19 December 2019; pp. 1–5. [Google Scholar]
  34. Liu, Y.; Li, D.; Wan, S.; Wang, F.; Dou, W.; Xu, X.; Li, S.; Ma, R.; Qi, L. A Long Short-Term Memory-Based Model for Greenhouse Climate Prediction. Int. J. Intell. Syst. 2022, 37, 135–151. [Google Scholar] [CrossRef]
  35. Picon, A.; San-Emeterio, M.G.; Bereciartua-Perez, A.; Klukas, C.; Eggers, T.; Navarra-Mestre, R. Deep Learning-based Segmentation of Multiple Species of Weeds and Corn Crop Using Synthetic and Real Image Datasets. Comput. Electron. Agric. 2022, 194, 106719. [Google Scholar] [CrossRef]
  36. Li, X.; Pan, J.; Xie, F.; Zeng, J.; Li, Q.; Huang, X.; Liu, D.; Wang, X. Fast and Accurate Green Pepper Detection in Complex Backgrounds Via an Improved YOLOv4-tiny Model. Comput. Electron. Agric. 2021, 191, 106503. [Google Scholar] [CrossRef]
  37. Tenorio, G.L.; Caarls, W. Automatic Visual Estimation of Tomato Cluster Maturity in Plant Rows. Mach. Vis. Appl. 2021, 32, 1–18. [Google Scholar] [CrossRef]
  38. Sun, J.; He, X.; Wu, M.; Wu, X.; Shen, J.; Lu, B. Detection of Tomato Organs based on Convolutional Neural Network under the Overlap and Occlusion Backgrounds. Mach. Vis. Appl. 2020, 31, 1–13. [Google Scholar] [CrossRef]
  39. Rong, J.; Wang, P.; Yang, Q.; Huang, F. A Field-Tested Harvesting Robot for Oyster Mushroom in Greenhouse. Agronomy 2021, 11, 1210. [Google Scholar] [CrossRef]
  40. Fonteijn, H.; Afonso, M.; Lensink, D.; Mooij, M.; Faber, N.; Vroegop, A.; Polder, G.; Wehrens, R. Automatic Phenotyping of Tomatoes in Production Greenhouses using Robotics and Computer Vision: From Theory to Practice. Agronomy 2021, 11, 1599. [Google Scholar] [CrossRef]
  41. Lu, C.-P.; Liaw, J.-J.; Wu, T.-C.; Hung, T.-F. Development of a Mushroom Growth Measurement System Applying Deep Learning for Image Recognition. Agronomy 2019, 9, 32. [Google Scholar] [CrossRef] [Green Version]
  42. Seo, D.; Cho, B.-H.; Kim, K. Development of Monitoring Robot System for Tomato Fruits in Hydroponic Greenhouses. Agronomy 2021, 11, 2211. [Google Scholar] [CrossRef]
  43. Yuan, T.; Lv, L.; Zhang, F.; Fu, J.; Gao, J.; Zhang, J.; Li, W.; Zhang, C.; Zhang, W. Robust Cherry Tomatoes Detection Algorithm in Greenhouse Scene Based on SSD. Agriculture 2020, 10, 160. [Google Scholar] [CrossRef]
  44. Islam, M.P.; Nakano, Y.; Lee, U.; Tokuda, K.; Kochi, N. TheLNet270v1–A Novel Deep-Network Architecture for the Automatic Classification of Thermal Images for Greenhouse Plants. Front. Plant Sci. 2021, 12, 630425. [Google Scholar] [CrossRef]
  45. Afonso, M.; Fonteijn, H.; Fiorentin, F.S.; Lensink, D.; Mooij, M.; Faber, N.; Polder, G.; Wehrens, R. Tomato Fruit Detection and Counting in Greenhouses Using Deep Learning. Front. Plant Sci. 2020, 11, 571299. [Google Scholar] [CrossRef]
  46. Zhang, P.; Li, D. YOLO-VOLO-LS: A Novel Method for Variety Identification of Early Lettuce Seedlings. Front. Plant Sci. 2022, 13, 806878. [Google Scholar] [CrossRef]
  47. Lawal, O.M.; Zhao, H. YOLOFig Detection Model Development Using Deep Learning. IET Image Process. 2021, 15, 3071–3079. [Google Scholar] [CrossRef]
  48. Zhou, C.; Hu, J.; Xu, Z.; Yue, J.; Ye, H.; Yang, G. A Novel Greenhouse-Based System for the Detection and Plumpness Assessment of Strawberry using an Improved Deep Learning Technique. Front. Plant Sci. 2020, 11, 559. [Google Scholar] [CrossRef]
  49. Arad, B.; Kurtser, P.; Barnea, E.; Harel, B.; Edan, Y.; Ben-Shahar, O. Controlled Lighting and Illumination-Independent Target Detection for Real-Time Cost-Efficient Applications. The Case Study of Sweet Pepper Robotic Harvesting. Sensors 2019, 19, 1390. [Google Scholar] [CrossRef] [Green Version]
  50. Mu, Y.; Chen, T.-S.; Ninomiya, S.; Guo, W. Intact Detection of Highly Occluded Immature Tomatoes on Plants using Deep Learning Techniques. Sensors 2020, 20, 2984. [Google Scholar] [CrossRef] [PubMed]
  51. Moreira, G.; Magalhaes, S.A.; Pinho, T.; Santos, F.N.d.; Cunha, M. Benchmark of Deep Learning and a Proposed HSV Colour Space Models for the Detection and Classification of Greenhouse Tomato. Agronomy 2022, 12, 356. [Google Scholar] [CrossRef]
  52. Lawal, O.M. YOLOMuskmelon: Quest for Fruit Detection Speed and Accuracy using Deep Learning. IEEE Access 2021, 9, 15221–15227. [Google Scholar] [CrossRef]
  53. Magalhaes, S.A.; Castro, L.; Moreira, G.; Santos, F.N.D.; Cunha, M.; Dias, J.; Moreira, A.P. Evaluating the Single-Shot Multibox Detector and YOLO Deep Learning Models for the Detection of Tomatoes in a Greenhouse. Sensors 2021, 21, 3569. [Google Scholar] [CrossRef]
  54. Lyu, B.; Smith, S.D.; Cherkauer, K.A. Fine-Grained Recognition in High-Throughput Phenotyping. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, Seattle, WA, USA, 14–19 June 2020; pp. 72–73. [Google Scholar]
  55. Zhou, J.; Li, J.; Wang, C.; Wu, H.; Zhao, C.; Wang, Q. A Vegetable Disease Recognition Model for Complex Background based on Region Proposal and Progressive Learning. Comput. Electron. Agric. 2021, 184, 106101. [Google Scholar] [CrossRef]
  56. Qi, J.; Liu, X.; Liu, K.; Xu, F.; Guo, H.; Tian, X.; Li, M.; Bao, Z.; Li, Y. An Improved YOLOv5 Model Based on Visual Attention Mechanism: Application to Recognition of Tomato Virus Disease. Comput. Electron. Agric. 2022, 194, 106780. [Google Scholar] [CrossRef]
  57. Zhang, P.; Yang, L.; Li, D. Efficientnet-B4-Ranger: A Novel Method for Greenhouse Cucumber Disease Recognition under Natural Complex Environment. Comput. Electron. Agric. 2020, 176, 105652. [Google Scholar] [CrossRef]
  58. Wang, C.; Zhou, J.; Zhao, C.; Li, J.; Teng, G.; Wu, H. Few-shot Vegetable Disease Recognition Model Based on Image Text Collaborative Representation Learning. Comput. Electron. Agric. 2021, 184, 106098. [Google Scholar] [CrossRef]
  59. Fernando, S.; Nethmi, R.; Silva, A.; Perera, A.; Silva, R.D.; Abeygunawardhana, P.K. Intelligent Disease Detection System for Greenhouse with a Robotic Monitoring System. In Proceedings of the 2nd International Conference on Advancements in Computing (ICAC), Colombo, Sri Lanka, 10–11 December 2020; Volume 1, pp. 204–209. [Google Scholar]
  60. Nieuwenhuizen, A.; Kool, J.; Suh, H.; Hemming, J. Automated Spider Mite Damage Detection on Tomato Leaves in Greenhouses. In Proceedings of the XI International Symposium on Protected Cultivation in Mild Winter Climates and I International Symposium on Nettings and Screens in Horticulture (Acta Horticulturae 1268), Tenerife, Canary Islands, Spain, 27–31 January 2019; pp. 165–172. [Google Scholar]
  61. Liu, K.; Zhang, C.; Yang, X.; Diao, M.; Liu, H.; Li, M. Development of an Occurrence Prediction Model for Cucumber Downy Mildew in Solar Greenhouses Based on Long Short-Term Memory Neural Network. Agronomy 2022, 12, 442. [Google Scholar] [CrossRef]
  62. Fuentes, A.; Yoon, S.; Lee, M.H.; Park, D.S. Improving Accuracy of Tomato Plant Disease Diagnosis Based on Deep Learning With Explicit Control of Hidden Classes. Front. Plant Sci. 2021, 12, 682230. [Google Scholar] [CrossRef]
  63. Wang, X.; Liu, J. Tomato Anomalies Detection in Greenhouse Scenarios Based on YOLO-Dense. Front. Plant Sci. 2021, 12, 533. [Google Scholar] [CrossRef]
  64. Zhang, Z.; Flores, P.; Friskop, A.; Liu, Z.; Igathinathane, C.; Jahan, N.; Mathew, J.; Shreya, S. Enhancing Wheat Disease Diagnosis in a Greenhouse Using Image Deep Features and Parallel Feature Fusion. Front. Plant Sci. 2022, 13, 834447. [Google Scholar] [CrossRef]
  65. Li, W.; Wang, D.; Li, M.; Gao, Y.; Wu, J.; Yang, X. Field Detection of Tiny Pests from Sticky Trap Images Using Deep Learning in Agricultural Greenhouse. Comput. Electron. Agric. 2021, 183, 106048. [Google Scholar] [CrossRef]
  66. Tureček, T.; Vařacha, P.; Turečková, A.; Psota, V.; Janků, P.; Štěpánek, V.; Viktorin, A.; Šenkeřík, R.; Jašek, R.; Chramcov, B.; et al. Scouting of Whiteflies in Tomato Greenhouse Environment Using Deep Learning. In Agriculture Digitalization and Organic Production; Springer: Singapore, 2022; pp. 323–335. [Google Scholar]
  67. Wang, D.; Wang, Y.; Li, M.; Yang, X.; Wu, J.; Li, W. Using an Improved YOLOv4 Deep Learning Network for Accurate Detection of Whitefly and Thrips on Sticky Trap Images. Trans. ASABE 2021, 64, 919–927. [Google Scholar] [CrossRef]
  68. Rustia, D.J.A.; Chao, J.-J.; Chiu, L.-Y.; Wu, Y.-F.; Chung, J.-Y.; Hsu, J.-C.; Lin, T.-T. Automatic Greenhouse Insect Pest Detection and Recognition Based on a Cascaded Deep Learning Classification Method. J. Appl. Entomol. 2021, 145, 206–222. [Google Scholar] [CrossRef]
  69. Zhou, X.; Sun, J.; Tian, Y.; Yao, K.; Xu, M. Detection of Heavy Metal Lead in Lettuce Leaves Based on Fluorescence Hyperspectral Technology Combined with Deep Learning Algorithm. Spectrochim. Acta Part A Mol. Biomol. Spectrosc. 2022, 266, 120460. [Google Scholar] [CrossRef] [PubMed]
  70. da Silva, L.A.; Bressan, P.O.; Gonçalves, D.N.; Freitas, D.M.; Machado, B.B.; Gonçalves, W.N. Estimating Soybean Leaf Defoliation using Convolutional Neural Networks and Synthetic Images. Comput. Electron. Agric. 2019, 156, 360–368. [Google Scholar] [CrossRef]
  71. Qu, Y.; Clausen, A.; Jørgensen, B.N. Application of Deep Neural Network on Net Photosynthesis Modeling. In Proceedings of the IEEE 19th International Conference on Industrial Informatics (INDIN), Palma de Mallorca, Spain, 21–23 July 2021; pp. 1–7. [Google Scholar]
  72. Ahsan, M.; Eshkabilov, S.; Cemek, B.; Küçüktopcu, E.; Lee, C.W.; Simsek, H. Deep Learning Models to Determine Nutrient Concentration in Hydroponically Grown Lettuce Cultivars (Lactuca sativa L.). Sustainability 2021, 14, 416. [Google Scholar] [CrossRef]
  73. Kusanur, V.; Chakravarthi, V.S. Using Transfer Learning for Nutrient Deficiency Prediction and Classification in Tomato Plant. Int. J. Adv. Comput. Sci. Appl. 2021, 12, 784–790. [Google Scholar]
  74. Sun, J.; Wu, M.; Hang, Y.; Lu, B.; Wu, X.; Chen, Q. Estimating Cadmium Content in Lettuce Leaves Based on Deep Brief Network and Hyperspectral Imaging Technology. J. Food Process Eng. 2019, 42, e13293. [Google Scholar] [CrossRef]
  75. Tran, T.-T.; Choi, J.-W.; Le, T.-T.H.; Kim, J.-W. A Comparative Study of Deep CNN in Forecasting and Classifying the Macronutrient Deficiencies on Development of Tomato Plant. Appl. Sci. 2019, 9, 1601. [Google Scholar] [CrossRef] [Green Version]
  76. Vit, A.; Shani, G.; Bar-Hillel, A. Length Phenotyping with Interest Point Detection. Comput. Electron. Agric. 2020, 176, 105629. Available online: https://www.sciencedirect.com/science/article/pii/S0168169919318939 (accessed on 15 September 2022). [CrossRef]
  77. Boogaard, F.P.; Rongen, K.S.; Kootstra, G.W. Robust Node Detection and Tracking in Fruit-Vegetable Crops Using Deep Learning and Multi-View Imaging. Biosyst. Eng. 2020, 192, 117–132. [Google Scholar] [CrossRef]
  78. Xhimitiku, I.; Bianchi, F.; Proietti, M.; Tocci, T.; Marini, A.; Menculini, L.; Termite, L.F.; Pucci, E.; Garinei, A.; Marconi, M.; et al. Anomaly Detection in Plant Growth in a Controlled Environment using 3D Scanning Techniques and Deep Learning. In Proceedings of the 2021 IEEE International Workshop on Metrology for Agriculture and Forestry (MetroAgriFor), Trento, Italy, 3–5 November 2021; pp. 86–91. [Google Scholar]
  79. Lauguico, S.; Concepcion, R.; Tobias, R.R.; Alejandrino, J.; Guia, J.D.; Guillermo, M.; Sybingco, E.; Dadios, E. Machine Vision-Based Prediction of Lettuce Phytomorphological Descriptors using Deep Learning Networks. In Proceedings of the IEEE 12th International Conference on Humanoid, Nanotechnology, Information Technology, Communication and Control, Environment, and Management (HNICEM), Manila, Philippines, 3–7 December 2020; pp. 1–6. [Google Scholar]
  80. Zhu, F.; He, M.; Zheng, Z. Data Augmentation using Improved cDCGAN for Plant Vigor Rating. Comput. Electron. Agric. 2020, 175, 105603. [Google Scholar] [CrossRef]
  81. Ullah, S.; Henke, M.; Narisetti, N.; Panzarová, K.; Trtílek, M.; Hejatko, J.; Gladilin, E. Towards Automated Analysis of Grain Spikes in Greenhouse Images Using Neural Network Approaches: A Comparative Investigation of Six Methods. Sensors 2021, 21, 7441. [Google Scholar] [CrossRef]
  82. Choi, K.; Park, K.; Jeong, S. Classification of Growth Conditions in Paprika Leaf Using Deep Neural Network and Hyperspectral Images. In Proceedings of the 2021 Twelfth International Conference on Ubiquitous and Future Networks (ICUFN), Jeju Island, Korea, 17–20 August 2021; pp. 93–95. [Google Scholar]
  83. Baar, S.; Kobayashi, Y.; Horie, T.; Sato, K.; Suto, H.; Watanabe, S. Non-destructive Leaf Area Index Estimation Via Guided Optical Imaging for Large Scale Greenhouse Environments. Comput. Electron. Agric. 2022, 197, 106911. [Google Scholar] [CrossRef]
  84. Xiong, Y.; Ge, Y.; From, P.J. An Obstacle Separation Method for Robotic Picking of Fruits in Clusters. Comput. Electron. Agric. 2020, 175, 105397. [Google Scholar] [CrossRef]
  85. Jin, Y.; Liu, J.; Wang, J.; Xu, Z.; Yuan, Y. Far-near Combined Positioning of Picking-point based on Depth Data Features for Horizontal-Trellis Cultivated Grape. Comput. Electron. Agric. 2022, 194, 106791. [Google Scholar] [CrossRef]
  86. Zhang, F.; Gao, J.; Zhou, H.; Zhang, J.; Zou, K.; Yuan, T. Three-Dimensional Pose Detection method Based on Keypoints Detection Network for Tomato Bunch. Comput. Electron. Agric. 2022, 195, 106824. [Google Scholar] [CrossRef]
  87. Gong, L.; Wang, W.; Wang, T.; Liu, C. Robotic Harvesting of the Occluded Fruits with a Precise Shape and Position Reconstruction Approach. J. Field Robot. 2022, 39, 69–84. [Google Scholar] [CrossRef]
  88. Lahcene, A.; Amine, D.M.; Abdelkader, D. A Hybrid Deep Learning Model for Predicting Lifetime and Mechanical Performance Degradation of Multilayer Greenhouse Polyethylene Films. Polym. Sci. Ser. B 2021, 63, 964–977. [Google Scholar] [CrossRef]
  89. Zhang, P.; Li, D. EPSA-YOLO-V5s: A Novel Method for Detecting the Survival Rate of Rapeseed in a Plant Factory Based on Multiple Guarantee Mechanisms. Comput. Electron. Agric. 2022, 193, 106714. [Google Scholar] [CrossRef]
  90. Xu, P.; Fang, N.; Liu, N.; Lin, F.; Yang, S.; Ning, J. Visual Recognition of Cherry Tomatoes in Plant Factory Based on Improved Deep Instance Segmentation. Comput. Electron. Agric. 2022, 197, 106991. [Google Scholar] [CrossRef]
  91. Wu, Z.; Yang, R.; Gao, F.; Wang, W.; Fu, L.; Li, R. Segmentation of Abnormal Leaves of Hydroponic Lettuce Based on DeepLabV3+ for Robotic Sorting. Comput. Electron. Agric. 2021, 190, 106443. [Google Scholar] [CrossRef]
  92. Hendrawan, Y.; Damayanti, R.; Riza, D.F.A.; Hermanto, M.B. Classification of Water Stress in Cultured Sunagoke Moss Using Deep Learning. Telkomnika 2021, 19, 1594–1604. [Google Scholar] [CrossRef]
  93. Gozzovelli, R.; Franchetti, B.; Bekmurat, M.; Pirri, F. Tip-Burn Stress Detection of Lettuce Canopy Grown in Plant Factories. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Montreal, QC, Canada, 11–17 October 2021; pp. 1259–1268. [Google Scholar]
  94. Hao, X.; Jia, J.; Gao, W.; Guo, X.; Zhang, W.; Zheng, L.; Wang, M. MFC-CNN: An Automatic Grading Scheme for Light Stress Levels of Lettuce (Lactuca sativa L.) leaves. Comput. Electron. Agric. 2020, 179, 105847. [Google Scholar] [CrossRef]
  95. Rizkiana, A.; Nugroho, A.; Salma, N.; Afif, S.; Masithoh, R.; Sutiarso, L.; Okayasu, T. Plant Growth Prediction Model for Lettuce (Lactuca sativa) in Plant Factories Using Artificial Neural Network. In Proceedings of the IOP Conference Series: Earth and Environmental Science, Miass, Russia, 20–23 September 2021; Volume 733, p. 012027. [Google Scholar]
  96. Kim, T.; Lee, S.-H.; Kim, J.-O. A Novel Shape Based Plant Growth Prediction Algorithm Using Deep Learning and Spatial Transformation. IEEE Access 2022, 10, 731–737. [Google Scholar] [CrossRef]
  97. Chang, S.; Lee, U.; Hong, M.J.; Jo, Y.D.; Kim, J.-B. Time-Series Growth Prediction Model Based on U-Net and Machine Learning in Arabidopsis. Front. Plant Sci. 2021, 12, 721512. [Google Scholar] [CrossRef] [PubMed]
  98. Franchetti, B.; Ntouskos, V.; Giuliani, P.; Herman, T.; Barnes, L.; Pirri, F. Vision Based Modeling of Plants Phenotyping in Vertical Farming under Artificial Lighting. Sensors 2019, 19, 4378. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  99. Buxbaum, N.; Lieth, J.; Earles, M. Non-Destructive Plant Biomass Monitoring With High Spatio-Temporal Resolution via Proximal RGB-D Imagery and End-to-End Deep Learning. Front. Plant Sci. 2022, 13, 758818. [Google Scholar] [CrossRef] [PubMed]
  100. Vorapatratorn, S. Development of Automatic Plant Factory Control Systems with AI-Based Artificial Lighting. In Proceedings of the 13th International Conference on Information Technology and Electrical Engineering (ICITEE), Chiang Mai, Thailand, 14–15 October 2021; pp. 69–73. [Google Scholar]
  101. Hwang, Y.; Lee, S.; Kim, T.; Baik, K.; Choi, Y. Crop Growth Monitoring System in Vertical Farms Based on Region-of-Interest Prediction. Agriculture 2022, 12, 656. [Google Scholar] [CrossRef]
  102. Tao, X.; Zhang, D.; Wang, Z.; Liu, X.; Zhang, H.; Xu, D. Detection of Power Line Insulator Defects using Aerial Images Analyzed with Convolutional Neural Networks. IEEE Trans. Syst. Man Cybern. Syst. 2018, 50, 1486–1498. [Google Scholar] [CrossRef]
  103. Aljubury, I.M.A.; Ridha, H.D. Enhancement of Evaporative Cooling System in a Greenhouse using Geothermal Energy. Renew. Energy 2017, 111, 321–331. [Google Scholar] [CrossRef]
  104. Philipp, G.; Song, D.; Carbonell, J.G. Gradients Explode-Deep Networks are Shallow-ResNet Explained. In Proceedings of the 6th International Conference on Learning Representations ICLR Workshop Track, Vancouver, BC, Canada, 30 April–3 May 2018; Available online: https://openreview.net/forum?id=rJjcdFkPM (accessed on 15 September 2022).
  105. Howard, A.G.; Zhu, M.; Chen, B.; Kalenichenko, D.; Wang, W.; Weyand, T.; Andreetto, M.; Adam, H. Mobilenets: Efficient Convolutional Neural Networks for Mobile Vision Applications. arXiv 2017, arXiv:1704.04861. [Google Scholar]
  106. Ioffe, S.; Szegedy, C. Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift. In Proceedings of the International Conference on Machine Learning, Lille, France, 6–11 July 2015; pp. 448–456. [Google Scholar]
  107. Simonyan, K.; Zisserman, A. Very Deep Convolutional Networks for Large-Scale Image Recognition. arXiv 2014, arXiv:1409.1556. [Google Scholar]
  108. Krizhevsky, A.; Sutskever, I.; Hinton, G.E. Imagenet Classification with Deep Convolutional Neural Networks. Commun. ACM 2017, 60, 84–90. [Google Scholar] [CrossRef] [Green Version]
  109. Szegedy, C.; Liu, W.; Jia, Y.; Sermanet, P.; Reed, S.; Anguelov, D.; Erhan, D.; Vanhoucke, V.; Rabinovich, A. Going Deeper with Convolutions. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015; pp. 1–9. [Google Scholar]
  110. Bochkovskiy, A.; Wang, C.-Y.; Liao, H.-Y.M. YOLOv4: Optimal Speed and Accuracy of Object Detection. arXiv 2020, arXiv:2004.10934. [Google Scholar]
  111. Wang, C.-Y.; Liao, H.-Y.M.; Wu, Y.-H.; Chen, P.-Y.; Hsieh, J.-W.; Yeh, I.-H. Cspnet: A New Backbone that can Enhance Learning Capability of CNN. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, Seattle, WA, USA, 14–19 June 2020; pp. 390–391. [Google Scholar]
  112. Chollet, F. Xception: Deep Learning with Depthwise Separable Convolutions. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 1251–1258. [Google Scholar]
  113. Tan, M.; Le, Q. Efficientnet: Rethinking Model Scaling for Convolutional Neural Networks. In Proceedings of the International Conference on Machine Learning, Long Beach, CA, USA, 9–15 June 2019; pp. 6105–6114. [Google Scholar]
  114. He, K.; Zhang, X.; Ren, S.; Sun, J. Deep Residual Learning for Image Recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 26 June–1 July 2016; pp. 770–778. [Google Scholar]
  115. Kingma, D.P.; Ba, J. Adam: A Method for Stochastic Optimization. arXiv 2014, arXiv:1412.6980. [Google Scholar]
  116. Xu, D.; Zhang, S.; Zhang, H.; Mandic, D.P. Convergence of the RMSProp Deep Learning Method with Penalty for Nonconvex Optimization. Neural Netw. 2021, 139, 17–23. [Google Scholar] [CrossRef] [PubMed]
  117. Tong, Q.; Liang, G.; Bi, J. Calibrating the Adaptive Learning Rate to improve Convergence of ADAM. Neurocomputing 2022, 481, 333–356. [Google Scholar] [CrossRef]
  118. Cutkosky, A.; Mehta, H. Momentum Improves Normalized SGD. In Proceedings of the International Conference on Machine Learning, Vienna, Austria, 12–18 July 2020; pp. 2260–2268. [Google Scholar]
  119. Nesterov, Y. A Method for Unconstrained Convex Minimization Problem with the Rate of Convergence O(1/k²). Dokl. USSR 1983, 269, 543–547. [Google Scholar]
  120. Miller, T. Explanation in Artificial Intelligence: Insights from the Social Sciences. Artif. Intell. 2019, 267, 1–38. [Google Scholar] [CrossRef]
  121. Hiriyannaiah, S.; Srinivas, A.; Shetty, G.K.; Siddesh, G.; Srinivasa, K. A Computationally Intelligent Agent for Detecting Fake News Using Generative Adversarial Networks. In Hybrid Computational Intelligence; Elsevier: Amsterdam, The Netherlands, 2020; pp. 69–96. [Google Scholar]
Figure 1. Bibliometric visualization produced with the VOSviewer software using the authors' specified keywords.
Figure 2. Planning and reporting process of systematic literature review (SLR).
Figure 3. Article inclusion and exclusion process flowchart.
Figure 4. Visual illustration of the deep learning techniques applied to controlled environment agriculture from 2019 to 2022 (focusing on the reviewed papers).
Figure 5. Application distribution of deep learning in controlled environment agriculture.
Figure 6. Year-wise distribution of the publications from 2019 to April 2022.
Figure 7. Publication distribution for deep learning applications in controlled environment agriculture.
Figure 8. Country-wise distribution of the reviewed papers in controlled environment agriculture.
Figure 9. Distribution of evaluation parameters for deep learning models in controlled environment agriculture.
Figure 10. Distribution of different deep learning training networks used in controlled environment agriculture.
Figure 11. Distribution of different deep learning optimizers used in controlled environment agriculture.
Figure 12. Growing medium distribution in controlled environment agriculture.
Figure 13. Distribution of plants studied in the reviewed papers on deep learning applications in controlled environment agriculture.
Table 1. Summary of the recent important related reviews.

| Ref. | Year | Focus of Study | Highlights |
| --- | --- | --- | --- |
| [22] | 2018 | Deep learning in agriculture | 40 papers were identified and examined in the context of deep learning in the agricultural domain. |
| [23] | 2019 | Fruit detection and yield estimation | The development of various deep learning models for fruit detection and localization to support tree crop load estimation was reviewed. |
| [24] | 2019 | Plant disease detection and classification | A thorough analysis of deep learning models used to visualize various plant diseases. |
| [25] | 2020 | Dense image analysis | Reviewed deep learning applications for dense agricultural scenes, including recognition and classification, detection, counting, and yield estimation. |
| [26] | 2021 | Plant disease detection and classification | Current trends and limitations for detecting plant leaf disease using deep learning and cutting-edge imaging techniques. |
| [27] | 2021 | Weed detection | 70 existing deep learning-based weed detection and classification techniques, covering four main procedures: data acquisition, dataset preparation, DL techniques, and evaluation metrics. |
| [28] | 2021 | Bloom/yield recognition | Diverse automation approaches with computer vision and deep learning models for crop yield detection were presented. |
| Our paper | 2022 | Deep learning applications in CEA | Reviews developments of deep learning models for various applications in CEA. |
Table 2. Distribution of papers selected from different databases.

| Source | Number of Papers in the Initial Search | Eligible Papers with Duplicates |
| --- | --- | --- |
| Google Scholar | 330 | 27 |
| Scopus | 127 | 25 |
| Science Direct | 119 | 19 |
| Wiley | 40 | 4 |
| IEEE Xplore | 51 | 9 |
| SpringerLink | 44 | 4 |
| Web of Science | 40 | 17 |
| Total | 751 | 105 |
Table 3. Summary of studies for deep learning applications in greenhouses.

| Application | Tasks | Growing Medium | DL Model | Networks | Preprocessing/Augmentation | Optimizer | Dataset Type | Imaging Method | Performance | Ref. |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Climate condition prediction | transpiration rate | hydroponic | ANN | ANN | NS | Adam | 31,033 data points | NS | RMSE = 0.07–0.10 g m⁻² min⁻¹, R² = 0.95–0.96 | [30] |
| | temp. (°C), humidity deficit (g/kg), relative humidity (%), radiation (W/m²), CO₂ conc. | soil-based | RNN-TCN | LSTM-RNN | NS | Adam | NS | NS | RMSE = 10.45 (±0.94) (dataset 1), 6.76 (±0.45) (dataset 2), 7.40 (±1.88) (dataset 3) | [31] |
| | temperature, humidity, CO₂ concentration | soil-based | ANN | NS | NS | Adam | NS | NS | ANN at 30 min: R² = (temp: 0.94, humidity: 0.78, CO₂: 0.70); RMSEP = (temp: 0.94, humidity: 5.44, CO₂: 32.12); %SEP = (temp: 4.22, humidity: 8.18, CO₂: 6.49) | [32] |
| | | | NARX | | | | | | NARX at 30 min: R² = (temp: 0.86, humidity: 0.71, CO₂: 0.81); RMSEP = (temp: 1.32, humidity: 6.27, CO₂: 28.30); %SEP = (temp: 5.86, humidity: 9.42, CO₂: 7.74) | |
| | | | RNN-LSTM | | | | | | RNN-LSTM at 30 min: R² = (temp: 0.96, humidity: 0.80, CO₂: 0.81); RMSEP = (temp: 0.71, humidity: 5.23, CO₂: 28.30); %SEP = (temp: 3.15, humidity: 7.85, CO₂: 5.72) | |
| | temp., humidity, pressure, dew point | soil-based | RNN-LSTM | NS | NS | NS | NS | NS | temperature RMSE = 0.067163 | [33] |
| | temp., humidity, illumination, CO₂ conc., soil temp., soil moisture | soil-based | LSTM | NS | NS | NS | NS | NS | RMSE (tomato/cucumber/pepper): temp. 0.38/0.55/0.42; humidity 1.25/1.95/1.78; illumination 78/80/30; CO₂ 3.2/4.1/3.9; soil temp. 0.07/0.08/0.045; soil moisture 0.14/0.30/0.15 | [34] |
| Yield estimation | corn crop and leaf weeds classification | soil-based | dual PSPNet | ResNet-50 | rotation; shift (height, width, vertical, horizontal, pixel intensity); zoom; Gaussian blur | SGD with Nesterov momentum | 6906 images | RGB | balanced accuracy (BAC) = 75.76%, Dice–Sørensen coefficient (DSC) = 47.97% (dataset A+C) | [35] |
| | green pepper detection | soil-based | improved YOLOv4-tiny | CSP DarkNet53 | Gaussian noise addition, HSL adjustment, scaling, rotation | NS | 1500 images | RGB | P: 96.91%, R: 93.85%, AP: 95.11%, F1: 0.95 | [36] |
| | cherry tomato cluster location detection, tomato maturity estimation | soil-based | SSD | MobileNet V1 | horizontal flip and random crop | Adam or RMSprop | 254 images | RGB | IoU = 0.892 (cluster location detection), RMSE = 0.2522 (maturity estimation) | [37] |
| | tomato organ detection | soil-based | improved FPN | ResNet-101 | NS | SGD | 8929 images | RGB | mAP: 99.5% | [38] |
| | mushroom recognition | soil-based | improved SSD | MobileNet V2 | flip; random rotation; random cropping and sizing; brightness and tone conversion; random erasure; mixup | NS | 4600 images | RGB | P: 94.4%, R: 93%, mAP: 93.2%, F1: 0.937, speed: 0.0032 s | [39] |
| | tomato detection | soil-based | Mask R-CNN | ResNext-101 | NS | SGD | 123 images | RGB | P: 93%, R: 93%, F1: 0.93 | [40] |
| | mushroom localization | soil-based | YOLOv3 | DarkNet53 | NS | NS | 500 images | RGB | average prediction error = 3.7 h, average detection = 46.6 | [41] |
| | tomato detection | hydroponic | Faster R-CNN | ResNet-101 | gamma correction | momentum | 895 images | RGB, HSV | detection accuracy: 88.6% | [42] |
| | cherry tomato detection | soil-based | SSD | MobileNet | rotating, brightness adjustment, noising | RMSProp | 1730 images | RGB | AP = 97.98% | [43] |
| | | | | InceptionV2 | | | | | AP = 98.85% | |
| | | | SSD300 | | | Adam | | | AP = 92.73% | |
| | | | SSD512 | | | | | | AP = 93.87% | |
| | plant classification | soil-based | TheLNet270v1 | custom | random reflection (X, Y); shear (X, Y); scale (X, Y); translation (X, Y); rotation | Adam | 13,766 images | RGB | mean accuracy: 91.99%, mIoU: 86.5%, mean BFScore: 86.42% | [44] |
| | tomato detection | soil-based | Mask R-CNN | ResNet-50 | none used | SGD | 123 images | RGB | average result @ IoU 0.5 (ResNet-50): P = 84.5%, R = 90.5%, F1 = 0.87 | [45] |
| | | | | ResNet-101 | | | | | average result @ IoU 0.5 (ResNet-101): P = 82.5%, R = 90%, F1 = 0.86 | |
| | | | | ResNext-101 | | | | | average result @ IoU 0.5 (ResNext-101): P = 92%, R = 93%, F1 = 0.925 | |
| | lettuce seedling identification | hydroponic | YOLO-VOLO-LS | VOLO | rotation, flipping, contrast adjustment | NS | 6900 images | RGB | average: R = 96.059%, P = 96.014%, F1 = 0.96039 | [46] |
| | fig detection | soil-based | YOLOFig | ResNet43 | NS | NS | 412 images | RGB | P = 74%, R = 88%, F1 = 0.80 | [47] |
| | strawberry detection | soil-based | improved Faster R-CNN | ResNet-50 | brightness, chroma, contrast, and sharpness augmentation and attenuation | NS | 400 images | RGB | accuracy = 86%, ART = 0.158 s, IoU = 0.892 | [48] |
| | sweet pepper detection | soil-based | SSD | custom | NS | NS | 468 images | RGB, HSV | AP = 84% (flash-only), 83.6% (flash/no-flash image) | [49] |
| | tomato detection | soil-based | Faster R-CNN | ResNet-50, ResNet-101, Inception-ResNet-v2 | resizing, cropping, rotating, random horizontal flip | NS | 640 images | RGB | F1 = 83.67% and AP = 87.83% for Faster R-CNN with ResNet-101; R² = 0.87 for tomato counting | [50] |
| | tomato detection | soil-based | SSD | MobileNetv2 | rotation, translation, flip, multiply, noise addition, scale, blur | NS | 1029 images | RGB, HSV | mAP = 65.38%, P = 70.12%, R = 84.9%, F1 = 85.81% | [51] |
| | | | YOLOv4 | CSP DarkNet53 | | | | | mAP = 65.38%, P = 70.12%, R = 84.9%, F1 = 85.81% | |
| | muskmelon detection | soil-based | YOLOMuskmelon | ResNet43 | NS | NS | 410 images | RGB | IoU = 70.9%, P = 85%, R = 82%, AP = 89.6%, F1 = 84%, FPS = 96.3 | [52] |
| | tomato detection | soil-based | SSD | MobileNet V2 | rotation, scaling, translation, flip, blur (Gaussian filter), Gaussian noise | NS | 5365 images | RGB | mAP = 51.56%, P = 84.37%, R = 54.40%, F1 = 66.15%, I = 16.44 ms | [53] |
| | | | | InceptionV2 | | | | | mAP = 48.54%, P = 85.31%, R = 50.93%, F1 = 63.78%, I = 24.75 ms | |
| | | | | ResNet-50 | | | | | mAP = 42.62%, P = 92.51%, R = 43.59%, F1 = 59.26%, I = 47.78 ms | |
| | | | | ResNet-101 | | | | | mAP = 36.32%, P = 88.63%, R = 38.13%, F1 = 53.32%, I = 59.78 ms | |
| | | | YOLOv4-tiny | CSP DarkNet53 | | | | | mAP = 47.48%, P = 88.39%, R = 49.33%, F1 = 63.32%, I = 4.87 ms | |
| | Arabidopsis, bean, komatsuna recognition | soil-based | CNN | ResNet-18 | scaling, rotation, translation | Adam | 2694 images | RGB | mA = 0.922 (Arabidopsis), 1.0 (bean), 1.0 (komatsuna) | [54] |
| Disease detection and classification | tomato (powdery mildew (PM), early blight) and cucumber (PM, downy mildew (DM)) recognition | soil-based | CNN | PRP-Net | ShiftScaleRotate, RandomSizedCrop, HorizontalFlip | SGD | 4284 images | RGB | average: accuracy = 98.26%, precision = 92.60%, sensitivity = 93.60%, specificity = 99.01% | [55] |
| | tomato virus disease recognition | soil-based | SE-YOLOv5 | CSPNet | Gaussian noise addition, rotation, mirroring, random intensity adjustment | NS | 150 images | RGB, HSV | P = 86.75%, R = 92.19%, mAP@0.5 = 94.1%, mAP@0.5:0.95 = 75.98%, prediction accuracy = 91.07% | [56] |
| | cucumber PM, DM, and combined PM+DM recognition | soil-based | EfficientNet | EfficientNet-B4 | flip (horizontal, vertical), rotation | Ranger | 2816 images | RGB | training accuracy = 99.22%, verification accuracy = 96.38%, test accuracy = 96.39% | [57] |
| | tomato (PM, early blight) and cucumber (PM, DM, virus disease) recognition | soil-based | ITC-Net | ResNet18 and TextRCNN | cropping, normalization, word segmentation, word list construction, text vectorization | Adam | 1516 images | RGB | accuracy: 99.48%, precision: 98.90%, sensitivity: 98.78%, specificity: 99.66% | [58] |
| | leaf mold and tomato yellow leaf curl detection | soil-based | CNN | ResNet-50, ResNet-101 | filtering, histogram | NS | 115 images | RGB, HSV | testing accuracy = 98.61%, validation accuracy = 99% | [59] |
| | spider mite detection | soil-based | CNN | ResNet18 | NS | NS | 850 images | multi-spectral, RGB | accuracy: 90% | [60] |
| | cucumber DM prediction | soil-based | LSTM | NS | min–max normalization | Adam | 11,827 images | RGB | A = 90%, R = 89%, P = 94%, F1 = 0.91 | [61] |
| | tomato disease detection | soil-based | Faster R-CNN | VGG16 | resizing, cropping, rotation, flipping, contrast, brightness, color, noise | NS | 59,717 images | RGB | mAP = 89.04% | [62] |
| | | | | ResNet-50 | | | | | mAP = 90.19% | |
| | | | | ResNet-50 FPN | | | | | mAP = 92.58% | |
| | various tomato diseases (leaf mold, gray mold, early blight, late blight, leaf curl virus, brown spot) detection | soil-based | YOLO-Dense | DarkNet53 | NS | NS | 15,000 images | RGB | mAP: 96.41% | [63] |
| | wheat disease detection | soil-based | CNN | ResNet-101 | cropping | NS | 160 plants | NIR, RGB | accuracy = 84% (tan spot), 75% (leaf rust) | [64] |
| Small insect detection | pest (whitefly and thrips) detection | soil-based | TPest-RCNN | VGG16 | resizing, splitting | NS | 1941 images | RGB | AP: 95.2%, F1: 0.944 | [65] |
| | whitefly (greenhouse whitefly and cotton whitefly) detection | hydroponic | Faster R-CNN | ResNet-50 | mirroring | SGD | 1161 images | RGB | RMSE = 5.83, P = 0.5794, R = 0.7892 | [66] |
| | whitefly detection | soil-based | YOLOv4 | CSP DarkNet53 | cropping | Adam | 1200 images | RGB | whitefly: P = 97.4%, R = 95.7%; mAP = 95.1% | [67] |
| | thrips detection | | | | | | | | thrips: P = 97.9%, R = 94.5%; mAP = 95.1% | |
| | flies, gnats, thrips, whiteflies detection | soil-based | YOLOv3-tiny | DarkNet53 | cropping | Adam | NS | RGB | average F1: 0.92, mean counting accuracy: 0.91 | [68] |
| Nutrient estimation and detection | lead content detection | soil-based | WT-MC stacked auto-encoders | NS | standard normal variate (SNV); 1st, 2nd, 3rd, and 4th derivatives | NS | 2800 images | hyperspectral data | Pb detection range = 0.067–1.400 mg/kg, RMSEC = 0.02321 mg/kg, RMSEP = 0.04017 mg/kg, R²C = 0.9802, R²P = 0.9467 | [69] |
| | soybean leaf defoliation estimation | soil-based | CNN | AlexNet | resizing, binarization, rotation | NS | 10,000 images | RGB | RMSE (AlexNet) = 4.57 (±5.8) | [70] |
| | | | | VGGNet | | | | | RMSE (VGGNet) = 4.65 (±6.4) | |
| | | | | ResNet | | | | | RMSE (ResNet) = 14.60 (±18.8) | |
| | net photosynthesis (Pn) prediction from light level, CO₂ concentration, and temperature | soil-based | DNN | custom | NS | Adam | 33,000 images | NS | accuracy: 96.20% (7 hidden layers, 128 units each); 96.30% (8 hidden layers, 64 units each) | [71] |
| | nutrient concentration estimation | hydroponic | CNN | VGG16 | width and height shift, shear, flipping, zoom, scaling, cropping | Adam | 779 images | RGB | average classification accuracy (ACA) = 97.9% | [72] |
| | | | | VGG19 | | | | | ACA = 97.8% | |
| | calcium and magnesium deficiency prediction | soil-based | SVM, random forest (RF) classifier | Inception V3 | NS | RMSProp | 880 images | RGB | accuracy = 98.71% (InceptionV3 + SVM), 97.85% (InceptionV3 + RF) | [73] |
| | | | | VGG16 | | Adam | | | accuracy = 99.14% (VGG16 + SVM), 95.71% (VGG16 + RF) | |
| | | | | ResNet-50 | | Adam | | | accuracy = 88.84% (ResNet-50 + SVM), 84.12% (ResNet-50 + RF) | |
| | cadmium content estimation | soil-based | PSO-DBN | NS | Savitzky–Golay (SG) filtering to remove spectral noise | NS | 1260 images | hyperspectral data | with 3 hidden layers: R² = 0.8976, RMSE = 0.6890, RPD = 2.8367 | [74] |
| | nutrient deficiency (calcium/Ca²⁺, potassium/K⁺, nitrogen/N) classification | soil-based | CNN | Inception-ResNetV2 | shift, rotation, resizing | NS | 571 images | RGB | average accuracy = 87.27%, average precision = 100%; recall = 100% (Ca²⁺, K⁺, N) | [75] |
| | | | Auto-encoder | NS | | | | | average accuracy = 79.09%, average precision = 94.2%; recall: Ca²⁺ 97.6%, K⁺ 92.45%, N 95.23% | |
| Growth monitoring | length estimation and interest point detection | soil-based | Mask R-CNN | ResNet-101 | NS | NS | 2574 images | RGB | 2D results: banana tree AP = 92.5%, banana leaves AP = 90%, cucumber fruit AP = 60.2% | [76] |
| | internode length detection | soil-based | YOLOv3 | DarkNet53 | NS | NS | 9990 images | RGB | R: 92%, AP: 95%, F1: 0.94 | [77] |
| | plant growth anomaly detection | soil-based | LSTM | NS | filtering, cropping | Adam | NS | RGB, HSV | 2D: P = 42%, R = 71%, F1 = 0.52; 3D photogrammetry with high-resolution camera: P = 57%, R = 57%, F1 = 0.57; 3D low-cost photogrammetry: P = 44%, R = 79%, F1 = 0.56; LiDAR: P = 5%, R = 86%, F1 = 0.63 | [78] |
| | phytomorphological descriptor prediction | aquaponics | CNN | DarkNet53 | scaling and resizing | SGD with momentum | 300 images | RGB | R² (area, DarkNet53) = 0.9858; R² (diameter, DarkNet53) = 0.9836 | [79] |
| | | | | Xception | | | | | R² (centroid-x, Xception) = 0.6390; R² (centroid-y, Xception) = 0.7239 | |
| | | | | Inception-ResNetv2 | | | | | R² (major axis, Inception-ResNetv2) = 0.8197; R² (minor axis, Inception-ResNetv2) = 0.7460 | |
| | orchid seedling vigor rating | soil-based | CNN | ResNet-50 | cropping, resizing | Adam | 1700 images | RGB, HSV | A = 95.5%, R = 97%, P = 94.17%, F1 = 0.9557 | [80] |
| | spike detection | soil-based | SSD | Inception-ResNetv2 | NS | SGD | 292 images | RGB | AP@0.5 = 0.780, AP@0.75 = 0.551, AP@0.5:0.95 = 0.470 | [81] |
| | | | YOLOv3 | DarkNet53 | NS | | | | AP@0.5 = 0.941, AP@0.75 = 0.680, AP@0.5:0.95 = 0.604 | |
| | | | YOLOv4 | CSP DarkNet53 | CutOut, MixUp, CutMix, RandomErase | | | | AP@0.5 = 0.941, AP@0.75 = 0.700, AP@0.5:0.95 = 0.610 | |
| | | | Faster R-CNN | InceptionV2 | NS | Adam | | | AP@0.5 = 0.950, AP@0.75 = 0.822, AP@0.5:0.95 = 0.660 | |
| | spike segmentation | | ANN | NS | NS | NS | | | AP = 0.61 | |
| | | | U-Net | VGG16 | rotation [−30, 30], horizontal flip, brightness | Adam | | | AP = 0.84 | |
| | | | DeepLabV3+ | ResNet-101 | NS | | | | AP = 0.922 | |
| | paprika leaf growth condition classification | soil-based | DNN | improved VGG-16 | rotation | NS | 227 images | hyperspectral data | accuracy = 90.9% | [82] |
| | | | | VGG-16 | | | | | accuracy = 86.4% | |
| | | | | ConvNet | | | | | accuracy = 82.3% | |
| | leaf shape estimation | hydroponic | encoder–decoder CNN | U-Net | random rotation and random horizontal spatial flipping | Adam | NS | RGB | deviation of the U-Net-based estimate is less than 10% of the manual LAI estimation | [83] |
| Robotic harvesting | obstacle separation | soil-based | Mask R-CNN | ResNet-101 | 3D HSI color thresholding | NS | NS | RGB | success rate = 65.1% (whole process) | [84] |
| | picking-point positioning | soil-based | CNN | custom | NS | NS | 100 images | RGB | success rate: 100% | [85] |
| | keypoint detection | soil-based | TPM | custom | rotation and brightness adjustment | RMSprop | 2500 images | RGB | qualified rate: 94.02%, accuracy: 85.77% | [86] |
| | pose detection | | | | | Adam | | | accuracy: 70.05% | |
| | target positioning estimation | soil-based | Mask R-CNN | ResNet | cropping | NS | NS | RGB, infrared | average gripping accuracy (AGA) = 8.21 mm, APSR = 73.04% | [87] |
| Others | LDPE film lifetime prediction | NS | SVM-CNN | NS | NS | Adam | 4072 images | NS | NS | [88] |

NS: Not Specified.
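Most performance figures in Table 3 are either regression metrics (RMSE, R²) for microclimate and nutrient-content prediction or detection metrics (precision, recall, F1) for fruit, disease, and pest detection. As a quick reference, the following minimal NumPy sketch shows how these common evaluation parameters are computed; the function names are our own illustrative choices, not code from any reviewed study.

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean square error, the parameter reported by all reviewed microclimate studies."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def r_squared(y_true, y_pred):
    """Coefficient of determination (R²): 1 minus residual over total sum of squares."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return float(1.0 - ss_res / ss_tot)

def precision_recall_f1(tp, fp, fn):
    """Detection metrics computed from true positives, false positives, and false negatives."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1
```

The AP and mAP values reported for object detectors additionally average precision over recall levels at fixed IoU thresholds (e.g., the AP@0.5 and AP@0.5:0.95 entries for [81]).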
Table 4. Summary of studies for deep learning applications in indoor farms.

| Application | Tasks | Growing Medium | DL Model | Networks | Preprocessing/Augmentation | Optimizer | Dataset Type | Imaging Method | Performance | Ref. |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Yield estimation | rapeseed detection | hydroponic | EPSA-YOLO-V5s | CSP DarkNet | rotating, flipping (horizontal, vertical) | NS | 6616 images | RGB | P = 94.5%, R = 99.6%, F1 = 0.970, mAP@0.5 = 0.996 | [89] |
| | tomato prediction | hydroponic | improved Mask R-CNN | ResNet | random translation, random brightness change, Gaussian noise addition | NS | 1078 images | RGB | accuracy = 93.91% (fruit), 88.13% (stem) | [90] |
| Stress level monitoring | lettuce abnormal leaves (yellow, withered, decayed) | hydroponic | DeepLabV3+ | Xception-65 | rotating, mirroring, flipping | NS | 500 images | RGB | Xception-65: mIoU = 0.4803, PA = 95.10%, speed = 243.4 ± 4.8a | [91] |
| | | | | Xception-71 | | | | | Xception-71: mIoU = 0.7894, PA = 99.06%, speed = 248.9 ± 4.1a | |
| | | | | ResNet-50 | | | | | ResNet-50: mIoU = 0.7998, PA = 99.20%, speed = 154.0 ± 3.8c | |
| | | | | ResNet-101 | | | | | ResNet-101: mIoU = 0.8326, PA = 99.24%, speed = 193.4 ± 4.0b | |
| | water stress classification | NS | CNN | ResNet50 | rotation, re-scaling | SGD with momentum/Adam/RMSProp | 800 images | RGB | average accuracy, ResNet-50 with (Adam = 94.15%, RMSProp = 88.75%, SGDm = 83.77%) | [92] |
| | | | | GoogLeNet | | | | | GoogLeNet with (Adam = 78.3%, RMSProp = 80.4%) | |
| | patch-level detection | NS | YOLOv2 | DarkNet19 | NS | SGD with Nesterov momentum | 60,000 images | RGB | accuracy = 87.05% | [93] |
| | pixel-level segmentation | | U-Net | NS | cropping, random jittering | Adam | | | mAP = 87.00%, IoU = 77.20%, Dice score = 75.02% | |
| | light stress grading | hydroponic | MFC-CNN | custom | 90°, 180°, and 270° rotation, mirror rotation, salt-and-pepper noise, image sharpening | SGD | 1113 images | RGB | accuracy = 87.95%, average F1 = 0.8925 | [94] |
| Growth monitoring | plant growth prediction | NS | NS | NS | NS | NS | 45 data samples | RGB | RMSE = 0.987, R² = 0.728 (4-7-1 network architecture) | [95] |
| | leaf shape estimation | NS | custom | spatial transformer network | rotation, scaling, translation | Adam | NS | RGB | PSNR = 30.61, SSIM = 0.8431 | [96] |
| | | | | | | | | | PSNR = 26.55, SSIM = 0.9065 | |
| | | | | | | | | | PSNR = 23.03, SSIM = 0.8154 | |
| | growth prediction | soil-based | U-Net | SE-ResXt101 | cropping, scaling, padding | NS | 232 plant samples | RGB | F1 = 97% | [97] |
| | plant behaviour prediction | hydroponic | Mask R-CNN | NS | rotation and scaling | NS | 1728 images | RGB | leaf area accuracy = 100% | [98] |
| | lettuce plant biomass prediction | hydroponic | DCNN | ResNet-50 | rotation, brightness, contrast, saturation, hue, grayscale | Adam | 864 plants | RGB | RGB-D: MAPE = 7.3%, RMSE = 1.13 g; RGB: MAPE = 9.6%, RMSE = 1.03 g; depth: MAPE = 12.4%, RMSE = 2.04 g | [99] |
| | growth prediction | hydroponic | ANN | NS | NS | NS | NS | NS | ANN: accuracy = 98.3235%, F-measure = 97.5413%, training time = 121.78 s | [100] |
| | | | SVM | | | | | | SVM: accuracy = 96.0886%, F-measure = 93.4589%, training time = 202.48 s | |
| | growth prediction | hydroponic | Mask R-CNN | ResNet-50 | flipping, cropping, rotation | NS | 600 images | NS | mAP = 76.9%, AP = 92.6% | [101] |
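Several of the studies in Tables 3 and 4 (e.g., [45,84,90,101]) build on Mask R-CNN with a ResNet backbone and fine-tune it for fruit or leaf instances. The snippet below is a minimal sketch of how such a detector is commonly assembled with torchvision's detection API; the class count, learning rate, and function name are illustrative assumptions, not settings taken from any reviewed paper.

```python
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor
from torchvision.models.detection.mask_rcnn import MaskRCNNPredictor

def build_fruit_segmenter(num_classes: int = 2):  # illustrative: background + one fruit class
    # COCO-pretrained Mask R-CNN with a ResNet-50 FPN backbone
    # (weights="DEFAULT" requires torchvision >= 0.13; older releases use pretrained=True).
    model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
    # Swap the box-classification head for the new class count.
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)
    # Swap the mask-prediction head likewise.
    in_channels = model.roi_heads.mask_predictor.conv5_mask.in_channels
    model.roi_heads.mask_predictor = MaskRCNNPredictor(in_channels, 256, num_classes)
    return model

model = build_fruit_segmenter()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # Adam: the most common optimizer in the reviewed studies
```

Training then follows the standard torchvision detection loop, with per-image dictionaries of boxes, labels, and masks as targets.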
Table 5. Common DL architectures with their benefits and drawbacks.

| Model | Ref. | Advantages | Disadvantages |
| --- | --- | --- | --- |
| AE | [69,75] | Excellent performance for deep feature extraction; does not need labeled data for training; saves a significant amount of time by avoiding labeling of large datasets | Lengthy processing and fine-tuning time; training may be hampered by vanishing gradients |
| DBN | [74] | Unsupervised training; high efficiency in handling high-dimensional hyperspectral data; can simplify redundant and complex characteristics by training the network layer by layer | Unable to process multi-dimensional data; training can be prolonged and inefficient |
| LSTM | [31,61] | Able to capture abstract temporal features; alleviates the vanishing gradient problem | Poor spatial feature representation, resulting in classification errors; difficult implementation |
| ANN | [30,32] | Excellent for obtaining significant findings from complex nonlinear data; can approximate a vast class of functions with high accuracy; quite robust to noise in the training data | Weak stability in heavily interconnected and complex systems; requires many training sets |
| CNN | [45,62] | Ability to learn robust discriminative features; ability to capture spatial correlations; high generalization potential | High computational cost; difficult parameter tuning |
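To make the temporal-modeling trade-offs in Table 5 concrete, the sketch below shows a minimal many-to-one LSTM regressor of the kind used for greenhouse microclimate prediction [31,32,33,34], trained with MSE so that RMSE (the evaluation parameter used by all reviewed microclimate studies) can be reported at test time. The layer sizes, channel counts, and window length are illustrative assumptions only, not the configurations of any reviewed study.

```python
import torch
import torch.nn as nn

class ClimateLSTM(nn.Module):
    """Illustrative many-to-one LSTM: a window of past sensor readings -> next-step values."""
    def __init__(self, n_features: int = 6, hidden: int = 64, n_targets: int = 6):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, n_targets)

    def forward(self, x):              # x: (batch, time_steps, n_features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])   # regress from the last hidden state

model = ClimateLSTM()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# One illustrative training step on synthetic data:
x = torch.randn(8, 24, 6)              # e.g., 24 time steps of 6 sensor channels
y = torch.randn(8, 6)
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()
print(f"RMSE = {torch.sqrt(loss).item():.3f}")  # RMSE = sqrt(MSE), as reported in Tables 3 and 4
```

Such a model captures temporal dependencies across the input window (the LSTM advantage noted in Table 5) but, unlike a CNN, encodes no spatial structure.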
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
