Review

Research Progress of Spectral Imaging Techniques in Plant Phenotype Studies

Institute of Data Science and Agricultural Economics, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China
* Authors to whom correspondence should be addressed.
Plants 2024, 13(21), 3088; https://doi.org/10.3390/plants13213088
Submission received: 2 September 2024 / Revised: 25 October 2024 / Accepted: 31 October 2024 / Published: 2 November 2024
(This article belongs to the Section Plant Modeling)

Abstract

Spectral imaging techniques have been widely applied in plant phenotyping to improve plant trait selection and genetic gain. This review surveys the latest developments and applications of various optical imaging techniques in plant phenotyping and compares their advantages and applicability. X-ray computed tomography (X-ray CT) and light detection and ranging (LiDAR) are better suited to the three-dimensional reconstruction of plant surfaces, tissues, and organs. Chlorophyll fluorescence imaging (ChlF) and thermal imaging (TI) can measure the physiological phenotypic characteristics of plants. Specific symptoms caused by nutrient deficiency can be detected by hyperspectral and multispectral imaging, LiDAR, and ChlF. Future plant phenotyping research based on spectral imaging can be integrated more closely with plant physiological processes, can more effectively support related disciplines such as metabolomics and genomics, and can focus on micro-scale activities such as oxygen transport and intercellular chlorophyll transmission.

1. Introduction

Plant phenotype is the quantitative description of the individual development, physiological characteristics, and biochemical characteristics of plants. The acquisition and analysis of plant phenotype data are important in irrigation management, disease prevention and control, breeding research, and yield increases [1]. High-throughput and precise phenotype acquisition greatly promotes the screening of breeding materials and significantly improves breeding efficiency. There are significant differences in plant phenotypes, such as morphology, physiology, and biochemistry, between different plants and between growth stages. Moreover, the growth environment of plants under natural conditions is complex, which poses significant challenges to plant phenotype analysis. Traditional plant phenotype observation is mainly conducted through the manual measurement of phenotypic parameters; it is time-consuming and labor-intensive and suffers from large errors and strong subjectivity [2]. Rapid, batch, and repeatable measurement of plant phenotypes has become a bottleneck in the fields of breeding and precision cultivation.
Spectral imaging, which is non-destructive, high-throughput, and reliable and is capable of real-time operation and repeatable measurement, has good potential for plant phenotype analysis. It is widely used in precision agriculture and breeding [3]. A variety of spectral sensors have been developed and applied in plant phenotype analysis and trait observation [4]. In this field, vegetation indices (VIs) [5], remote sensing [6], unmanned aerial vehicles (UAVs) [7], high-throughput plant phenotyping platforms (HTPPs) [8], and machine learning [9] are current research hotspots [10].
Previous studies have discussed the application of particular spectral imaging techniques to plant phenotyping, such as UAV [7], near-infrared (NIR) [11], remote sensing [9], and LiDAR [12], as well as their application to particular plants or phenotypes, such as grains [13], fruits [14], and drought stress [3]. Phenotyping approaches at different scales, and the spectral imaging techniques they employ, are worth examining together. This article reviews the research progress on plant phenotyping based on spectral imaging as indexed in Web of Science, and systematically compiles the applications of spectral imaging to plant phenotype analysis at different scales, organized by natural-light spectral bands. It aims to provide a detailed reference and inspiration for future research.

2. Spectral Imaging

Spectral features are observable changes in phenotypic images caused by the electromagnetic properties of one or more pixels. They arise from the emission or absorption of photons whose energy corresponds to the difference between the initial and final states of a transition. Spectral features can indicate regions of interest, such as diseases or abiotic/biotic stresses [15]. Although the photosynthetically active radiation used by plants covers only the 400–700 nm range, plants can interact with light across the 350–2500 nm range [8]. From short to long wavelength, the spectral bands can be divided into the following types: gamma-ray, X-ray, ultraviolet (UV), visible light (VL), infrared, microwave, and radio. The infrared is usually divided into four zones: NIR, short-wave infrared (SWIR), mid-wave infrared, and long-wave infrared.
Depending on the frequency or wavelength, electromagnetic waves can be reflected, absorbed, or transmitted by materials, which exposes many potentially quantifiable quality characteristics. This non-destructive analysis technique depends on the resolution of the sensors and on mathematical chemometric models [16]. Consequently, an array of optical sensors and sensing methods have been developed based on the different electromagnetic responses at different wavelengths [4]. As represented in Figure 1, in order of spectral wavelength from short to long, spectral imaging mainly includes the following types:
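To make the relationship between wavelength and transition energy concrete, the photon energy E = hc/λ can be computed directly. The sketch below is illustrative only (it is not drawn from any of the reviewed studies) and prints the photon energies for a few of the bands listed in this section.

```python
# Sketch: relating wavelength to photon energy via E = h*c / lambda.
# Illustrative only; the example wavelengths follow the band ranges in the text.

H = 6.62607015e-34    # Planck constant, J*s
C = 2.99792458e8      # speed of light, m/s
EV = 1.602176634e-19  # joules per electron-volt

def photon_energy_ev(wavelength_nm: float) -> float:
    """Photon energy in eV for a given wavelength in nanometres."""
    wavelength_m = wavelength_nm * 1e-9
    return H * C / wavelength_m / EV

# Shorter wavelengths carry more energy: X-ray photons can probe inner-shell
# transitions, visible-light photons drive valence-electron transitions, and
# thermal-infrared photons match molecular vibrational transitions.
for name, wl in [("X-ray (1 nm)", 1.0),
                 ("Visible (550 nm)", 550.0),
                 ("Thermal IR (10,000 nm)", 10_000.0)]:
    print(f"{name}: {photon_energy_ev(wl):.3g} eV")
```

This ordering of photon energies is what underlies the statement above that different imaging modalities sense different kinds of transitions.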
  • Positron emission tomography (PET). This labels certain substances essential to plant metabolism (e.g., glucose, protein, and nucleic acid) with short-lived radioactive isotopes and reflects crop metabolic activity through where these labeled substances accumulate [17].
  • X-ray computed tomography (X-ray CT). This has a wavelength range of 10 pm–10 nm. It is used to detect the differences in energy absorption before and after scanning from different angles to visualize the external and internal three-dimensional (3D) structures of plants [18].
  • Hyperspectral imaging (HSI). This has a wavelength range of 200–2500 nm. It is used to detect the two-dimensional geometric space and one-dimensional spectral information of targets. It is based on continuous, narrow-band image data with high spectral resolution [15].
  • Multispectral imaging (MSI). This has a wavelength range of 200–2500 nm. It contains many discrete spectral bands, typically ranging from three to hundreds, or a set of customized wavelength bands [10].
  • Raman mapping (Raman). This has a wavelength range of 285–50,000 nm. It is a scattering spectrum used to analyze the molecular structure and chemical composition of substances based on the Raman scattering effect [19].
  • Visible imaging (VI). This has a wavelength range of 380–780 nm. It creates an RGB color image with three channels: red, green, and blue [20].
  • Chlorophyll fluorescence imaging (ChlF). Although chlorophyll fluorescence is emitted at around 600–750 nm, it can be excited across roughly 400–730 nm. ChlF maps the emitted fluorescence signal pixel by pixel across the sample and is used to estimate photosynthetic performance and detect the effects of various stresses on plants [21].
  • Light detection and ranging (LiDAR). This sends light pulses to objects, receives the reflected pulses, and measures the distance between the object and the sensor from the time elapsed between transmission and reception. It can obtain parameters such as distance, orientation, altitude, velocity, attitude, and even the shape of the target [22].
  • Near-infrared imaging (NIRI). This has a wavelength range of 780–1300 nm. It mainly records the infrared radiation reflected by objects [11].
  • Thermal imaging (TI). This has a wavelength range of 1000–14,000 nm. It receives the infrared radiation energy distribution of the object under test and forms an infrared thermal image corresponding to the thermal distribution field on the object's surface [23].
  • Magnetic resonance imaging (MRI). This has a wavelength range of 1 mm–1 dm. Under an applied gradient magnetic field, the electromagnetic waves emitted by the object are detected; the positions and types of atomic nuclei are resolved, and an image of the internal structure is reconstructed [24].
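The time-of-flight principle behind LiDAR ranging, described in the list above, reduces to distance = c·t/2, since the pulse travels to the target and back. A minimal sketch with a hypothetical timing value:

```python
# Sketch: LiDAR time-of-flight ranging. The pulse's round-trip time is halved
# because the light travels out to the target and back to the sensor.
# The 20 ns example below is a hypothetical measurement, not real sensor data.

C = 2.99792458e8  # speed of light, m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Range to target from the measured round-trip time of a laser pulse."""
    return C * round_trip_s / 2.0

# A pulse returning after 20 ns corresponds to a target ~3 m away,
# e.g. a canopy surface below a UAV-mounted scanner.
print(tof_distance_m(20e-9))
```

Repeating this measurement across millions of pulses per second is what produces the dense 3D point clouds discussed in Section 3.1.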
There are slight differences in spectral band division among imaging methods in different research fields, and different spectral bands may also overlap; for example, MSI and HSI cover the VL, NIR, and SWIR bands.

3. Spectral Imaging Application in Plant Phenotypes

In this study, plants refer to the objects involved in agricultural production, such as grain crops, economic crops, feed crops, and green manure crops. Phenotype refers to external traits, such as shape, structure, size, and color, which are determined by genes, the environment, and related physiological and biochemical characteristics. Plant phenotype analysis is the study of the various phenotypic information related to plant growth in complex environments; it characterizes the structure, performance, and stress tolerance of individual plants or plant groups [15]. Plant phenotypes can be divided into four types: morphological, physiological, biochemical, and performance trait phenotypes (Table 1).
As light propagates through a plant, its interaction with plant tissues causes photons to be scattered, reflected, transmitted, and absorbed. Spectral techniques have been widely used in crop phenotype analysis, and sensors with different characteristics have different specific applications in agriculture [58,85,86,87]. Spectral data collection relies heavily on these sensors and affects the final phenotype results [88]. Different spectral imaging techniques can monitor changes in substances on different scales. PET and X-ray CT detect the energy released by electron transitions in the inner shells of atoms. RGB and ChlF capture the energy changes caused by atomic valence electron transitions. HSI and MSI respond to energy changes caused by atomic electron and molecular vibrational transitions. TI, Raman, and MRI respond to molecular vibrational transitions. LiDAR generates precise 3D shapes by measuring variable distances (ranges) with laser pulses. NMR collects only spatial information, while infrared and Raman collect only spectral information [13]. The following subsections summarize the application of spectral imaging techniques to plant phenotypes.

3.1. Morphological Phenotype

The morphological phenotype is the most widely used in phenotype analysis because it is the basis for plant selection in breeding programs. Related research covers almost all plant species. Common research objects include seeds, fruits, leaves, flowers, plants, and canopies, whose common characteristics are shown in Table 1. Some morphological characteristics are unique to specific crops, such as the tiller number and main stem number for gramineous crops [89,90], the sword leaf length and width for rice, and the number of grains and pods per plant for soybeans [91]. As phenotype research deepens and becomes more refined, more unique phenotypic features will be proposed. As represented in Figure 2, many spectral imaging techniques can be used to extract morphological traits.
HSI and MSI provide various types of spectral information related to plant physiological and biochemical properties, which can reflect the interaction between light, crops, and their biochemical components from cellular to landscape scales. MSI is considered a type of HSI. HSI and MSI can be used to detect and identify the leaf area index (LAI), diseases, and pests [69], as well as to classify plants, fruits, and seeds. Multi-temporal MSI based on UAV is used for detecting banana plants and individual plant counting [55]. MSI and HSI passive sensing are used to assess early plant vigor in winter wheat, and dry weight, nitrogen uptake, and nitrogen content are tested based on canopy cover images [89]. MSI is also used for the phenotypic analysis of ornamental greening plants’ leaf color changes and genetic analyses [92].
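Many of the HSI/MSI applications above rest on band-ratio vegetation indices. As an illustrative sketch (not tied to any specific study cited here), the widely used NDVI can be computed per pixel from the red and NIR bands; the arrays below are synthetic stand-ins for sensor reflectances.

```python
# Sketch: NDVI, a common vegetation index derived from the red and NIR bands
# of multispectral imagery. The 2x2 arrays are made-up reflectance values.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red), in [-1, 1]; high for dense canopy."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + 1e-12)  # epsilon avoids divide-by-zero

# Healthy vegetation reflects strongly in NIR and absorbs red light,
# so vegetated pixels score high while bare soil scores near zero.
nir_band = np.array([[0.50, 0.45], [0.20, 0.18]])  # top row: canopy; bottom: soil
red_band = np.array([[0.08, 0.10], [0.15, 0.16]])
print(ndvi(nir_band, red_band))
```

Indices such as LAI proxies and the canopy-cover measures mentioned above are built from band combinations in the same per-pixel fashion.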
The RGB image is the most common form of VI. The combination of RGB images and a deep learning convolutional neural network (CNN) can achieve plant classification, fruit detection [32], and image segmentation [93]. It is widely used for variety classification and identification, weed and canopy segmentation [94], defect and germination detection, and growth stage and maturity evaluation [85,95]. RGB combined with CNN is used for apple 3D detection and location in orchards, and it has an F1-score of 0.881 [30]. It can also be used to obtain morphological parameters, such as color [79], LAI [78], canopy area, and other geometric attributes, and to detect plant leaves, tree crowns, and fruits. With the use of RGB imagery, an attention-based recurrent CNN has been proposed for accurate vegetable mapping from multi-temporal UAVs, and it has an overall accuracy of 92.80% [57].
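Before deep learning, canopy segmentation from RGB imagery was often done with simple color indices. A minimal sketch of Excess Green (ExG) thresholding on a made-up 2×2 image follows; it is illustrative only and is not the CNN-based method of the studies cited above.

```python
# Sketch: canopy segmentation from an RGB image using the Excess Green index
# (ExG = 2G - R - B), a classic baseline for separating vegetation from soil.
# The tiny array below is a hypothetical 2x2 "image" with values in [0, 1].
import numpy as np

def excess_green(rgb: np.ndarray) -> np.ndarray:
    """ExG per pixel from an (H, W, 3) float RGB array in [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 2.0 * g - r - b

def canopy_mask(rgb: np.ndarray, threshold: float = 0.1) -> np.ndarray:
    """Boolean mask: True where the pixel looks like green vegetation."""
    return excess_green(rgb) > threshold

img = np.array([[[0.2, 0.6, 0.2],    # green leaf pixel
                 [0.5, 0.4, 0.3]],   # brownish soil pixel
                [[0.1, 0.5, 0.1],    # green leaf pixel
                 [0.6, 0.6, 0.6]]])  # grey background pixel
print(canopy_mask(img))
```

CNN-based segmentation replaces this hand-crafted index with learned features, but the input/output shape of the task (RGB in, per-pixel mask out) is the same.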
LiDAR is a laser active sensing technique based on the time-of-flight principle, which generates high-density 3D point clouds through photon counting [12]. It can obtain the horizontal and vertical spatial structural characteristics of plants, as well as the plant height, biomass, and 3D distribution of the canopy. Plant growth and development can be examined by plant size and morphology. LiDAR can generate 3D points along the laser path to reveal some information below the canopy, which provides unique structural features for classifying crops at the canopy or patch level. LiDAR can be used for the construction of canopy 3D point cloud [22], extraction of plant height, biomass, size, and shape of various crops [96], assessment of plant growth and development, description of crop structure, shape, light interception [97], and response to irrigation [98], and detection of male spikes and lodging.
MRI captures and depicts detailed 3D images of anatomical structures by detecting the energy released when protons align with magnetic fields. MRI can provide spatial information for nuclear examination. It can visualize the structural and dimensional characteristics of the maize stem vascular system [99]. MRI can be used for examining the internal and anatomical characteristics of seeds, roots [100], internal damage to fruits, embryos, and endosperms, and for assessing crop quality and phenotype. It can also obtain 3D plant structures [101] and use them for seed growth in soil, and provide detailed images of plant and root structures [24]. MRI is also used for the early detection of verticillium wilt based on cotton roots [102].
X-ray CT can obtain detailed 3D structural information inside objects and is an excellent tool for 3D imaging and the quantitative analysis of plant tissues and organs [18]. Due to the density dependence of X-ray attenuation, it is particularly suitable for plant imaging, as the space between cells is ubiquitous in many plant organs [103]. X-ray CT can be used for extracting crop morphological traits, such as tillering, spike length, leaf veins, and intercellular spaces [18]; inspecting the internal and anatomical characteristics of fruits and seeds [104]; assessing the quality, maturity, and phenotype of fruits and seeds [105,106]; and for the 3D reconstruction of root systems [103].

3.2. Physiological Phenotype

The physiological phenotype is related to the physical state and function of plants, including their growth rate, reproductive ability, and stress resistance. These traits are usually examined with biochemical traits to elucidate the functions of cells, tissues, and organs. The basic principle of plant physiological phenotype analysis based on spectral technique is shown in Figure 3. Physiological traits are influenced by various factors, including temperature, light, moisture, and nutrition. Spectral imaging, combined with advanced machine learning (ML), has become the ideal tool for high-throughput crop physiological phenotype analysis [15]. Because physiological traits involve the functions of various parts of plants under specific environmental conditions, they are important in stress responses.
HSI and MSI can simultaneously measure spectral characteristics across a wide range of continuous bands, which are usually used to explain the structural, chemical, and physical properties of plants. They have stronger phenotypic detection capabilities and can reflect plant performance and interactions with the environment in more detail [63]. In the early stages of biotic or abiotic stress, when the naked eye cannot yet recognize the stress, HSI and MSI can still detect it [107]. They therefore have significant advantages in the early identification and monitoring of plant diseases and pests, as well as in stress response evaluation [69]. HSI can also be used to identify haploids and polyploids and to predict seed and germination abilities [28]. MSI is widely applied in seed phenotyping and quality monitoring, such as physicochemical quality traits, defect detection, pest infestation, and seed health [16].
NIRI uses electromagnetic waves located between VL and mid-infrared light and is widely used for real-time physical and chemical analyses of plants [11]. NIR wavelengths penetrate plant tissue more deeply than visible light, making NIRI highly sensitive to the presence of water, water stress, and cellular physical structure. It can assess the state of turgid cell structures and perform phenotypic analysis of roots in the dark. NIRI can be used to measure plant water distribution and drought stress, detect external defects [108], and perform quality control [109]. NIRI is also used to monitor invasive insect pests (e.g., brown marmorated stink bugs) against different vegetal backgrounds [72].
TI measures the surface temperature of plants by detecting their infrared radiation and can generate analysis data based on time series or single time points. As it can monitor subtle changes in plant canopy temperature at different growth stages and its response to the environment, it is used to select genes and assist in breeding by comparing the differences in leaf or canopy temperatures. TI is mainly used for estimating plant drought stress and transpiration [23] and moisture condition index calculation, including the crop water stress index (CWSI) and stomatal conductance indices [107]. It is also used for the quantification of crop osmotic stress response to salinity and the detection and monitoring of diseases and pathogen infections [110]. TI can assess the water status of a wide range of individuals and is used to test the water stress of apples [111].
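The CWSI mentioned above normalizes canopy temperature between wet and dry reference temperatures. A minimal sketch with hypothetical temperatures (in practice, the canopy temperature comes from the thermal image and the references from artificial surfaces or empirical baselines):

```python
# Sketch: the crop water stress index (CWSI), computed from thermal-image
# canopy temperature against wet and dry references. Values are hypothetical.

def cwsi(t_canopy: float, t_wet: float, t_dry: float) -> float:
    """CWSI = (Tc - Twet) / (Tdry - Twet); 0 = unstressed, 1 = fully stressed."""
    return (t_canopy - t_wet) / (t_dry - t_wet)

# A canopy at 28 degrees C between a 24 degree wet reference and a 34 degree
# dry reference is mildly water-stressed:
print(cwsi(t_canopy=28.0, t_wet=24.0, t_dry=34.0))  # 0.4
```

Because a stressed canopy closes its stomata, transpires less, and warms toward the dry reference, higher CWSI values indicate greater water stress.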
ChlF can reflect the spatial and temporal heterogeneity of fluorescence spectra on multiple scales, such as cells, leaves, and plants, thus making it the most accurate and appropriate method for screening the effects of environmental stress on plants. It can monitor plant metabolic information, detect plant diseases, and invert chlorophyll and nitrogen content, nitrogen–carbon ratio, and LAI. Non-modulated ChlF can explore in depth the intrinsic photosynthetic physiological information of plants. ChlF is mainly used for detecting biological or abiotic stresses related to photosynthesis [75] and for analyzing the nutritional and physiological status of plants [112], such as photosynthetic capacity, non-photochemical quenching, and other physiological characteristics. It obtains the photosynthetic activity of crops in real time. ChlF and MSI are used to monitor the drought stress of common beans [21].
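A standard ChlF-derived parameter is the maximum quantum yield of photosystem II, Fv/Fm, measured on dark-adapted leaves. The sketch below uses hypothetical fluorescence readings to illustrate the calculation.

```python
# Sketch: Fv/Fm, a standard chlorophyll-fluorescence parameter used to screen
# for stress. F0 and Fm below are hypothetical readings from a dark-adapted
# leaf, not measurements from any study cited in this review.

def fv_fm(f0: float, fm: float) -> float:
    """Fv/Fm = (Fm - F0) / Fm; healthy leaves typically score around 0.8."""
    return (fm - f0) / fm

# Minimal (F0) and maximal (Fm) fluorescence after dark adaptation:
print(fv_fm(f0=0.2, fm=1.0))  # 0.8 -> healthy
print(fv_fm(f0=0.4, fm=1.0))  # 0.6 -> stressed
```

A drop in Fv/Fm below the healthy range is the kind of photosynthesis-related stress signal that the ChlF applications above detect, here reduced to its simplest per-pixel form.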
PET is a nuclear imaging technique that can generate 3D images or images of functional processes [17]. It can perform non-invasive imaging of the distribution of biomarkers, evaluate processes at the molecular and cellular levels, detect plant stress [113], and provide quantitative data in a non-destructive and dynamic manner [114]. A novel design of dedicated plant PET scanners specifically developed to address agronomic issues has been proposed [17].
Other spectroscopic techniques also have many applications. RGB is mainly used for pests and disease detection and identification, as well as quality evaluation and grading [26]. MRI is mainly used for detecting plants’ germination ability, dormancy, survival, vitality, and pest infestation [100], as well as the water status and transportation in plant cells [99]. X-ray CT is mainly used for quantifying the degree of drought or salt stress and detecting germination ability, dormancy, survival, vitality, and pest infestation [115,116].

3.3. Biochemical Phenotype

The biochemical phenotype characterizes the presence, composition, and quantity of specific chemical and biochemical markers under steady-state conditions. These traits are related to various aspects of biological processes, such as the leaf nitrogen, protein, carbohydrate, carotenoid, fatty acid, and chlorophyll content involved in photosynthesis, metabolism, and hydraulics (Figure 4). Photosynthetic pigments are important indicators of plant photosynthesis; chlorophyll [64] and carotenoids can be evaluated at the leaf and canopy levels using RGB, HSI, ChlF, and NIR. The use of spectroscopic techniques to observe and identify targeted biochemical markers has promoted various omics and breeding research. Early and long-term deficiencies of nutrients, such as nitrogen, phosphorus, potassium, magnesium, and iron, can be monitored using ChlF and MSI.
HSI and MSI spectra are usually divided into several regions: the VIS region shows strong absorption by photosynthetic pigments, such as lutein, chlorophyll, and carotenoids; the VIS–SWIR region can extract information on general nutrients, such as protein, nitrogen, and sugar; the NIR and SWIR regions are sensitive to water and nitrogen content; and the SWIR region quantifies plant characteristics, such as phosphorus, hemicellulose, protein, and mineral content. HSI can track changes in specific substances within the plant body and capture the reflectance of leaf sponge tissue, leaf biochemical components, and the main vegetation indices of the canopy. HSI and MSI can be used to examine water content, plant nutrients, canopy chlorophyll content, nitrogen content, many VIs, and other biochemical parameters. HSI is used to characterize cotton photosynthesis at the canopy level, with suitable spatial resolution and scanning throughput at the canopy and sub-canopy levels [63]. HSI and RGB images are used for the high-throughput analysis of leaf chlorophyll content in hydroponic lettuce [64].
NIRI relies on the absorption of NIR radiation by components such as proteins, lipids, carbohydrates, and water, which generates unique spectral patterns. It can detect energy absorption associated with overtone and combination vibrations of the C-H, N-H, and O-H functional groups [16]. NIRI can be used for the non-destructive testing and variety identification of agricultural products [108,117] and has been used to detect and visualize the distribution of sugar content in agricultural products in over 1,000 packaging factories in Japan. NIRI is also used for evaluating fruit maturity and hardness, quality and nutritional analysis, and the detection of survival and vitality.
ChlF comes with a standard filter wheel that can achieve multispectral fluorescence imaging. Fluorescence imaging is mainly used to measure the optical properties of chlorophyll. ChlF is commonly applied to assess the spatial patterns, photosynthesis, and metabolic status of crops. It can effectively retrieve chlorophyll content, nitrogen content, and the nitrogen–carbon ratio, detect plant metabolic information and diseases, and analyze the vertical heterogeneity of canopy biochemical parameters [118]. ChlF and HSI are used to determine shikimic acid concentrations in transgenic maize exhibiting glyphosate tolerance, which provides a new data-driven method.
Raman is a scattering spectrum that exposes a sample to the spectrum and measures the degree of light scattering caused by molecular bond vibrational transitions. It is commonly used in molecular structure research. As a non-invasive technique, it is popularly used in biochemical and structural analyses, which provide insights into the structure, concentration, and interaction of biochemical molecules within an organism’s cells and tissues [119,120]. Raman is mainly used for the analysis of plant biochemistry and structure; the detection of biochemical molecules, cells, and tissues [121]; the assessment of plants’ nutritional content [122]; and disease detection [123,124].

3.4. Performance Phenotype

The performance phenotype describes the overall performance of crops in terms of biomass, yield, and quality, and it is the most complex but most interesting trait for crop breeders. The basic principle of plant performance phenotype analysis based on spectral techniques is shown in Figure 5. Its common parameters include the harvest index, number of grains per spike, thousand-grain weight, number of grains per plant, hundred-grain weight, number of spikes per mu (a Chinese unit of area, about 667 m²), and theoretical yield. Although the feasibility and accuracy of performance phenotyping have been demonstrated, its stability and repeatability cannot be strictly guaranteed, because performance traits are shaped jointly by genotype, environmental factors, and agronomic practices.
HSI and MSI can be used for biomass and yield prediction and growth stage and maturity evaluation. HSI is used to train a CNN classification model to estimate corn grain yield, with a classification accuracy of 75.50% at five corn growth stages [61]. RGB and depth images are used to estimate various growth indices of four varieties of greenhouse lettuce, and the normalized root mean square error of fresh weight is 6.09% [85]. Researchers [80] have proposed a CNN approach using UAV-RGB imaging to estimate dry matter yield traits in a guinea grass breeding program.
Other spectral imaging techniques can also be applied to crop phenotype research, such as synthetic aperture radar (SAR) and laser light backscattering imaging (LLBI). SAR can work in weather conditions with very low visibility (e.g., cloud cover) and has been widely explored for crop classification, crop growth monitoring, and soil moisture monitoring [125]. LLBI is a low-cost imaging technique that uses the principles of light absorption, scattering, and image processing in the visible and NIR electromagnetic spectra to detect and analyze targets [126].

4. Comparative Analysis of Spectral Imaging

4.1. Spectral Imaging Comparison

Different spectral imaging techniques are suitable for different crop phenotype analysis tasks, as shown in Figure 6. X-ray CT and LiDAR can measure the 3D morphological features of crops and are appropriate for the 3D representation of crop surfaces, tissues, and organs. Specific symptoms caused by plant nutrient deficiency can be easily detected using HSI, MSI, LiDAR, and ChlF measurements. HSI, IR, and Raman provide multi-component information that helps overcome the low sensitivity of single-component measurements. In addition, HSI and Raman are highly sensitive for the detection of trace components. ChlF surpasses HSI in characterizing photosynthetic activity on the micro-scale. Table 2 compares the different imaging techniques.
Crop phenotype analysis can be divided into different scale levels: the cell and tissue level, organ level (root, stem, leaf, flower, fruit, seed, and harvest), plant level, and population and plot level. X-ray CT and MRI can provide anatomical details of crop tissues, while Raman can visualize cell walls, which requires magnified imaging, such as with microscopes. ChlF is used to examine the function and nutritional information of seeds. NIR effectively evaluates the internal quality attributes of fruits in a non-contact manner. LiDAR can show the 3D spatial structure of leaves and plants. The MSI phenotype has potential in the genetic analysis of seasonal leaf color changes in green crops [92]. HSI, MSI, and RGB are suitable for large-scale, high-throughput crop phenotype analysis and can be combined with agricultural machinery, drones, satellites, and other devices.

4.2. High-Throughput Plant Phenotyping Platform (HTPP)

HTPP applies modern information techniques for the quick, automatic, and non-destructive acquisition, analysis, and in situ monitoring of complex plant traits [128], thus making it possible to simultaneously measure many plant traits [112]. HTPP typically uses multiple sensors to measure various traits of crops, determine nutrient, water, and pesticide requirements, and detect various biological stresses. It is usually divided into two types: outdoor and indoor.
Outdoor HTPP is typically conducted on farms or in natural ecosystems using only natural light sources. Spectral sensing equipment is always installed in agricultural machinery, drones, fixed-wing aircraft, vehicles, and satellites for large-scale phenotype analysis. Aerial platforms provide remote sensing techniques for monitoring crop growth and various stresses and estimating large-scale crop yields [110]. In recent years, the HTPP for unmanned aerial vehicles equipped with multiple sensors has provided a large-scale, efficient, non-invasive, flexible, and low-cost solution for large-scale breeding [50,129,130]. It still faces many challenges, such as cross-platform data acquisition, sensor calibration, data processing methods [131], image interpretation, and the reliable and accurate extraction of crop phenotype information [132].
Indoor HTPP includes greenhouse, growth chamber, and laboratory scenarios. Crops can be fully or partially illuminated by artificial light sources. Typically, automated systems and handheld devices are used for imaging, which is suitable for a limited number of crops. Phenotypic analysis includes plant growth rate, crop stress detection, biomass estimation, seed viability, root characteristics, and physiological and biochemical measurements. Portable devices and their isolation measurement rooms are widely used. Handheld devices are simple, easy to use, and cost-effective, but they have small coverage and slow measurement speed and are labor-intensive and time-consuming.

5. Research Trends

5.1. Multimodal Data Application

Multimodal data application is a future research trend. The multimodal outputs of TI and MSI can be compared and analyzed to provide complementary insights and to develop VIs effectively, offering new methods for studying plant stress physiological responses. Software has been developed to calculate the CWSI and the green–red nutritional index by fusing NIR and RGB images. Multimodal data can improve existing yield prediction models by combining VIs, weather, soil, number of fruits/flowers, and canopy height. TI combined with RGB can accurately segment crop images across different VIs [133]. There is still room for the fusion and application of various types of remote sensing images, and combining large amounts of physical and spectral data with biochemical data on crop growth has great value [134].

5.2. 3D Image Application

Spatial 3D reconstruction of crops is important for high-throughput phenotype acquisition, plant architecture evaluation, and phenotype correlation analysis. Because the transport of water, gases, and nutrients in living organisms is constrained by physical space, the 3D analysis of plant structures is of great significance. Many crop physiological processes, such as photosynthesis, respiration, and growth, are controlled by the transport of water, metabolic gases, and nutrients, which is essentially three-dimensional. Research on the 3D structure of crop phenotypes mainly covers the spatial structure of individual plants or populations, stem and vein networks, mechanisms of spatial nutrient transport, plant posture estimation, lodging monitoring, leaf orientation distribution, and the spatial arrangement of leaves, inflorescences, and fruits. The 3D imaging of crop organs and tissues helps clarify their roles in water, gas, and nutrient transport. The 3D images obtained from X-ray CT can be used for the quantitative modeling of crop physiological processes, precise numerical calculation of heat and mass transfer in crops, and other biophysical processes. By incorporating the realistic 3D arrangement of cells and tissues within plant organs, complex phenomena can be simulated more accurately; in respiration and photosynthesis, for example, the connectivity of the relevant transport structures can be studied using networks of pores or vascular systems.

5.3. Micro-Scale Applications

Spectral phenotype analysis increasingly needs to be extended to the substructural level, and imaging techniques should continue to push beyond their current spatial and spectral resolution. Spectral techniques have already been applied to many microscopic phenotypic studies, such as pollen, stomata, maize vascular bundles, mesoporous structures in fruit tissues, and cellular stress. X-ray CT has opened up possibilities for the high-throughput phenotyping of plant organs [18]. For example, it can visualize the fluid and solute transport structures in plant organs; examine the structural morphology of the xylem in tomato roots and the cellular structure of the vascular system in tomato petioles; separate different tissues, such as apple ovaries, from 3D images; and distinguish cells from pore spaces at the microscopic scale. The rapid development of spectral imaging techniques is helping to connect genotype with phenotype. Although micro-scale phenotype research is currently limited, it is an inevitable direction of future development.

5.4. Low-Cost Portable Imaging Device

The most important attributes of a crop phenotype analysis platform are ease of use and affordability. The miniaturization of optical sensors can further improve their performance and the flexibility with which they can be integrated into different phenotyping platforms. Breeding work requires simple phenotypic tools, such as handheld devices for collecting wheat canopy reflectance data [135]. The handheld NDVI meter is simple, practical, and low cost, and it can assess plant vigor and biomass. The main characteristics of future phenotype devices include a low requirement for professional knowledge, simple operation and data input, multiple sensors integrated into a single portable device, and algorithms that remain stable and robust in complex environments.
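The quantity such a handheld meter reports is straightforward to reproduce from any two-band reflectance measurement. A minimal NumPy sketch (example reflectance values are illustrative only):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Per-pixel NDVI = (NIR - Red) / (NIR + Red).

    nir, red: reflectance in the near-infrared and red bands (0-1).
    eps guards against division by zero over dark pixels.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Healthy vegetation reflects strongly in NIR and absorbs red light,
# so a vigorous pixel scores much higher than a sparse one
print(ndvi([0.50, 0.30], [0.08, 0.25]))
```

Values near 1 indicate dense, vigorous canopy; values near 0 or below indicate soil, water, or senescent tissue.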

5.5. Application of Machine Learning

ML is the future development direction of multi-omics, data integration, and systems biology [59]. Linking the accumulating volume of genomic information with trait expression remains challenging, and remote sensing combined with ML can address this association in the future [52]. By uniting computer science, biology, remote sensing, statistics, and genomics, ML may eventually associate complex plant traits with gene expression. ML is increasingly applied to the processing of HSI data owing to its nonparametric nature and its flexibility in handling the nonlinear relationship between hyperspectral reflectance and target parameters. ML can already accurately predict the biochemical pathways of tomatoes from metabolite data [136]. Future ML platforms must be robust, flexible, and able to distinguish multiple disease symptoms on a single leaf or within the same plant canopy. Automated machine learning (AutoML) automates ML for large and complex multivariate datasets; this automation saves time and effort and improves model quality, and it is a future development trend.
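To make the "nonparametric" point concrete, a toy sketch of one of the simplest nonparametric regressors applied to spectra (k-nearest neighbors, implemented here in plain NumPy on synthetic data; real pipelines would use an established ML library and measured reflectance):

```python
import numpy as np

def knn_predict(train_spectra, train_targets, query_spectra, k=3):
    """Predict a trait (e.g., chlorophyll content) from reflectance
    spectra by averaging the targets of the k nearest training spectra.
    No functional form is assumed between reflectance and the trait."""
    train = np.asarray(train_spectra, dtype=float)
    targets = np.asarray(train_targets, dtype=float)
    preds = []
    for q in np.atleast_2d(query_spectra):
        d = np.linalg.norm(train - q, axis=1)  # spectral distance
        nearest = np.argsort(d)[:k]            # indices of k closest spectra
        preds.append(targets[nearest].mean())
    return np.array(preds)

rng = np.random.default_rng(0)
bands = rng.random((20, 50))      # 20 samples x 50 synthetic spectral bands
trait = bands[:, 10] * 2.0        # toy trait driven by a single band
print(knn_predict(bands, trait, bands[:1], k=1))
```

The appeal for HSI is exactly this: the model lets the data define the reflectance–trait mapping, however nonlinear, instead of imposing one.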

6. Conclusions and Outlook

This article systematically analyzes research progress in spectral imaging techniques for crop phenotype analysis. Currently, research on physiological and biochemical phenotypes, especially biochemical phenotypes, remains insufficient. Research is abundant at the macro scale, especially field remote sensing, unmanned aerial vehicles, and vehicle-mounted devices, but relatively scarce at the micro scale, especially at the substructural levels of molecules, cells, and organs. Future crop phenotype research based on spectral imaging should be more closely integrated with plant physiological processes, focusing on oxygen transport, intercellular chlorophyll transmission, and other micro-scale activities, and should more effectively support research in related disciplines, such as metabolomics and genomics.
Currently, the greatest challenge in phenotype research is quickly obtaining high-dimensional, high-density, high-precision, large-scale plant phenotype data spanning from individual molecules to the entire organism. How to effectively define and extract complex traits and how to improve accuracy and throughput remain key issues. HSI and MSI still hold great development potential in areas such as multi-device collaboration, spectral fusion, airborne equipment improvement, and real-time image processing. The potential of sensors for obtaining new phenotypic information still needs to be explored, and new sensors and sensing methods for complex traits should be developed. Spectral imaging techniques should be improved in hardware modes, image reconstruction and analysis, resolution and contrast enhancement, mathematical modeling, and data sharing. Integrating a universal metabolomics platform would be ideal, and whole-plant physicochemical phenotype analysis is the next key goal.

Author Contributions

Q.Z. and J.Z. analyzed the data, prepared the tables, and drafted the manuscript. F.Y. and M.W. designed the project and finalized the manuscript. R.L., Y.P. and L.Q. assisted with reference collection and reorganization, and partial data analysis. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Beijing Smart Agriculture Innovation Consortium Project (BAIC10-2024); the Beijing Science and Technology Plan (Z231100003923005); and the Youth Fund of the Beijing Academy of Agriculture and Forestry Sciences (CN) (QNJJ202213).

Data Availability Statement

The original contributions presented in the study are included in the article. Further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Huang, Y.; Ren, Z.; Li, D.; Liu, X. Phenotypic techniques and applications in fruit trees: A review. Plant Methods 2020, 16, 107. [Google Scholar] [CrossRef] [PubMed]
  2. Li, Z.; Zhou, X.; Cheng, Q.; Fei, S.; Chen, Z. A Machine-Learning Model Based on the Fusion of Spectral and Textural Features from UAV Multi-Sensors to Analyse the Total Nitrogen Content in Winter Wheat. Remote Sens. 2023, 15, 2152. [Google Scholar] [CrossRef]
  3. Al-Tamimi, N.; Langan, P.; Bernád, V.; Walsh, J.; Mangina, E.; Negrão, S. Capturing crop adaptation to abiotic stress using image-based technologies. Open Biol. 2022, 12, 210353. [Google Scholar] [CrossRef]
  4. Sun, D.; Xu, Y.; Cen, H. Optical sensors: Deciphering plant phenomics in breeding factories. Trends Plant Sci. 2022, 27, 209–210. [Google Scholar] [CrossRef] [PubMed]
  5. Radocaj, D.; Siljeg, A.; Marinovic, R.; Jurisic, M. State of Major Vegetation Indices in Precision Agriculture Studies Indexed in Web of Science: A Review. Agriculture 2023, 13, 707. [Google Scholar] [CrossRef]
  6. Fan, J.; Li, Y.; Yu, S.; Gou, W.; Guo, X.; Zhao, C. Application of Internet of Things to Agriculture—The LQ-FieldPheno Platform: A High-Throughput Platform for Obtaining Crop Phenotypes in Field. Research 2023, 6, 0059. [Google Scholar] [CrossRef]
  7. Feng, L.; Chen, S.; Zhang, C.; Zhang, Y.; He, Y. A comprehensive review on recent applications of unmanned aerial vehicle remote sensing with various sensors for high-throughput plant phenotyping. Comput. Electron. Agric. 2021, 182, 106033. [Google Scholar] [CrossRef]
  8. Fu, P.; Montes, C.M.; Siebers, M.H.; Gomez-Casanovas, N.; McGrath, J.M.; Ainsworth, E.A.; Bernacchi, C.J.; Alistair, M. Advances in field-based high-throughput photosynthetic phenotyping. J. Exp. Bot. 2022, 73, 3157–3172. [Google Scholar] [CrossRef]
  9. Zheng, C.; Abd-Elrahman, A.; Whitaker, V. Remote Sensing and Machine Learning in Crop Phenotyping and Management, with an Emphasis on Applications in Strawberry Farming. Remote Sens. 2021, 13, 531. [Google Scholar] [CrossRef]
  10. Zhang, Y.; Zhao, D.; Liu, H.; Huang, X.; Deng, J.; Jia, R.; He, X.; Tahir, M.N.; Lan, Y. Research hotspots and frontiers in agricultural multispectral technology: Bibliometrics and scientometrics analysis of the Web of Science. Front. Plant Sci. 2022, 13, 955340. [Google Scholar] [CrossRef]
  11. Reddy, P.; Guthridge, K.M.; Panozzo, J.; Ludlow, E.J.; Spangenberg, G.C.; Rochfort, S.J. Near-Infrared Hyperspectral Imaging Pipelines for Pasture Seed Quality Evaluation: An Overview. Sensors 2022, 22, 1981. [Google Scholar] [CrossRef] [PubMed]
  12. Lin, Y. LiDAR: An important tool for next-generation phenotyping technology of high potential for plant phenomics? Comput. Electron. Agric. 2015, 119, 61–73. [Google Scholar] [CrossRef]
  13. Sun, D.; Robbins, K.; Morales, N.; Shu, Q.; Cen, H. Advances in optical phenotyping of cereal crops. Trends Plant Sci. 2022, 27, 191–208. [Google Scholar] [CrossRef] [PubMed]
  14. Liu, X.; Li, N.; Huang, Y.; Lin, X.; Ren, Z. A comprehensive review on acquisition of phenotypic information of Prunoideae fruits: Image technology. Front. Plant Sci. 2023, 13, 1084847. [Google Scholar] [CrossRef] [PubMed]
  15. Sarić, R.; Nguyen, V.D.; Burge, T.; Berkowitz, O.; Trtílek, M.; Whelan, J.; Lewsey, M.G.; Čustović, E. Applications of hyperspectral imaging in plant phenotyping. Trends Plant Sci. 2022, 27, 301–315. [Google Scholar] [CrossRef]
  16. ElMasry, G.; Mandour, N.; Al-Rejaie, S.; Belin, E.; Rousseau, D. Recent Applications of Multispectral Imaging in Seed Phenotyping and Quality Monitoring—An Overview. Sensors 2019, 19, 1090. [Google Scholar] [CrossRef]
  17. Antonecchia, E.; Backer, M.; Cafolla, D.; Ciardiello, M.; Kuhl, C.; Pagnani, G.; Wang, J.; Wang, S.; Zhou, F.; D’Ascenzo, N.; et al. Design Study of a Novel Positron Emission Tomography System for Plant Imaging. Front. Plant Sci. 2022, 13, 736221. [Google Scholar] [CrossRef]
  18. Piovesan, A.; Vancauwenberghe, V.; Van De Looverbosch, T.; Verboven, P.; Nicolaï, B. X-ray computed tomography for 3D plant imaging. Trends Plant Sci. 2021, 26, 1171–1185. [Google Scholar] [CrossRef]
  19. Yu, H.; Ding, D.; Huang, Y.; Yuan, Y.; Song, J.; Yin, Y. Characteristic information analysis of Raman spectrum of cucumber chlorophyll content and hardness and detection model construction. J. Food Meas. Charact. 2024, 18, 3492–3501. [Google Scholar] [CrossRef]
  20. Fu, L.; Majeed, Y.; Zhang, X.; Karkee, M.; Zhang, Q. Faster R–CNN–based apple detection in dense-foliage fruiting-wall trees using RGB and depth features for robotic harvesting. Biosyst. Eng. 2020, 197, 245–256. [Google Scholar] [CrossRef]
  21. Javornik, T.; Carović-Stanko, K.; Gunjača, J.; Vidak, M.; Lazarević, B. Monitoring Drought Stress in Common Bean Using Chlorophyll Fluorescence and Multispectral Imaging. Plants 2023, 12, 1386. [Google Scholar] [CrossRef] [PubMed]
  22. Jayakumari, R.; Nidamanuri, R.R.; Ramiya, A.M. Object-level classification of vegetable crops in 3D LiDAR point cloud using deep learning convolutional neural networks. Precis. Agric. 2021, 22, 1617–1633. [Google Scholar] [CrossRef]
  23. Ramirez, D.A.; Gruneberg, W.; Andrade, M.; De Boeck, B.; Loayza, H.; Makunde, G.; Ninanya, J.; Rinza, J.; Heck, S.; Campos, H. Phenotyping of productivity and resilience in sweetpotato under water stress through UAV-based multispectral and thermal imagery in Mozambique. J. Agron. Crop Sci. 2023, 209, 41–55. [Google Scholar] [CrossRef]
  24. Collewet, G.; Moussaoui, S.; Quellec, S.; Hajjar, G.; Leport, L.; Musse, M. Characterization of Potato Tuber Tissues Using Spatialized MRI T2 Relaxometry. Biomolecules 2023, 13, 286. [Google Scholar] [CrossRef]
  25. Hansen, M.A.E.; Hay, F.R.; Carstensen, J.M. A virtual seed file: The use of multispectral image analysis in the management of genebank seed accessions. Plant Genet. Resour.-Charact. Util. 2016, 14, 238–241. [Google Scholar] [CrossRef]
  26. Taner, A.; Öztekin, Y.B.; Duran, H. Performance Analysis of Deep Learning CNN Models for Variety Classification in Hazelnut. Sustainability 2021, 13, 6527. [Google Scholar] [CrossRef]
  27. Li, X.; Fan, X.; Zhao, L.; Huang, S.; He, Y.; Suo, X. Discrimination of Pepper Seed Varieties by Multispectral Imaging Combined with Machine Learning. Appl. Eng. Agric. 2020, 36, 743–749. [Google Scholar] [CrossRef]
  28. Yu, Z.; Fang, H.; Zhangjin, Q.; Mi, C.; Feng, X.; He, Y. Hyperspectral imaging technology combined with deep learning for hybrid okra seed identification. Biosyst. Eng. 2021, 212, 46–61. [Google Scholar] [CrossRef]
  29. Taheri-Garavand, A.; Nasiri, A.; Fanourakis, D.; Fatahi, S.; Omid, M.; Nikoloudakis, N. Automated In Situ Seed Variety Identification via Deep Learning: A Case Study in Chickpea. Plants 2021, 10, 1406. [Google Scholar] [CrossRef]
  30. Gené-Mola, J.; Sanz-Cortiella, R.; Rosell-Polo, J.R.; Morros, J.-R.; Ruiz-Hidalgo, J.; Vilaplana, V.; Gregorio, E. Fruit detection and 3D location using instance segmentation neural networks and structure-from-motion photogrammetry. Comput. Electron. Agric. 2020, 169, 105165. [Google Scholar] [CrossRef]
  31. Lin, Z.; Guo, W. Sorghum Panicle Detection and Counting Using Unmanned Aerial System Images and Deep Learning. Front. Plant Sci. 2020, 11, 534853. [Google Scholar] [CrossRef] [PubMed]
  32. Fu, L.; Feng, Y.; Wu, J.; Liu, Z.; Gao, F.; Majeed, Y.; Al-Mallahi, A.; Zhang, Q.; Li, R.; Cui, Y. Fast and accurate detection of kiwifruit in orchard using improved YOLOv3-tiny model. Precis. Agric. 2020, 22, 754–776. [Google Scholar] [CrossRef]
  33. Seo, D.; Cho, B.-H.; Kim, K.-C. Development of Monitoring Robot System for Tomato Fruits in Hydroponic Greenhouses. Agronomy 2021, 11, 2211. [Google Scholar] [CrossRef]
  34. Li, C.; Lin, J.; Li, B.; Zhang, S.; Li, J. Partition harvesting of a column-comb litchi harvester based on 3D clustering. Comput. Electron. Agric. 2022, 197, 106975. [Google Scholar] [CrossRef]
  35. Peng, Y.; Zhao, S.; Liu, J. Fused Deep Features-Based Grape Varieties Identification Using Support Vector Machine. Agriculture 2021, 11, 869. [Google Scholar] [CrossRef]
  36. Vayssade, J.-A.; Jones, G.; Gée, C.; Paoli, J.-N. Pixelwise instance segmentation of leaves in dense foliage. Comput. Electron. Agric. 2022, 195, 106797. [Google Scholar] [CrossRef]
  37. Liu, Y.; Su, J.; Shen, L.; Lu, N.; Fang, Y.; Liu, F.; Song, Y.; Su, B. Development of a mobile application for identification of grapevine (Vitis vinifera L.) cultivars via deep learning. Int. J. Agric. Biol. Eng. 2021, 14, 172–179. [Google Scholar] [CrossRef]
  38. Gautam, V.; Rani, J. Mango Leaf Stress Identification Using Deep Neural Network. Intell. Autom. Soft Comput. 2022, 34, 849–864. [Google Scholar] [CrossRef]
  39. Zhang, L.; Xia, C.; Xiao, D.; Weckler, P.; Lan, Y.; Lee, J.M. A coarse-to-fine leaf detection approach based on leaf skeleton identification and joint segmentation. Biosyst. Eng. 2021, 206, 94–108. [Google Scholar] [CrossRef]
  40. Wang, X.; Liu, J. Tomato Anomalies Detection in Greenhouse Scenarios Based on YOLO-Dense. Front. Plant Sci. 2021, 12, 634103. [Google Scholar] [CrossRef]
  41. Zhang, C.; Craine, W.; McGee, R.; Vandemark, G.; Davis, J.; Brown, J.; Hulbert, S.; Sankaran, S. Image-Based Phenotyping of Flowering Intensity in Cool-Season Crops. Sensors 2020, 20, 1450. [Google Scholar] [CrossRef] [PubMed]
  42. Wu, D.; Lv, S.; Jiang, M.; Song, H. Using channel pruning-based YOLO v4 deep learning algorithm for the real-time and accurate detection of apple flowers in natural environments. Comput. Electron. Agric. 2020, 178, 105742. [Google Scholar] [CrossRef]
  43. Wei, P.; Jiang, T.; Peng, H.; Jin, H.; Sun, H.; Chai, D.; Huang, J. Coffee Flower Identification Using Binarization Algorithm Based on Convolutional Neural Network for Digital Images. Plant Phenomics 2020, 2020, 6323965. [Google Scholar] [CrossRef] [PubMed]
  44. Lin, P.; Lee, W.S.; Chen, Y.M.; Peres, N.; Fraisse, C. A deep-level region-based visual representation architecture for detecting strawberry flowers in an outdoor field. Precis. Agric. 2019, 21, 387–402. [Google Scholar] [CrossRef]
  45. Wang, Z.; Wang, K.; Wang, X.; Pan, S.; Qiao, X. Dynamic ensemble selection of convolutional neural networks and its application in flower classification. Int. J. Agric. Biol. Eng. 2022, 15, 216–223. [Google Scholar] [CrossRef]
  46. Korchagin, S.A.; Gataullin, S.T.; Osipov, A.V.; Smirnov, M.V.; Suvorov, S.V.; Serdechnyi, D.V.; Bublikov, K.V. Development of an Optimal Algorithm for Detecting Damaged and Diseased Potato Tubers Moving along a Conveyor Belt Using Computer Vision Systems. Agronomy 2021, 11, 1980. [Google Scholar] [CrossRef]
  47. Xie, W.; Wei, S.; Zheng, Z.; Yang, D. A CNN-based lightweight ensemble model for detecting defective carrots. Biosyst. Eng. 2021, 208, 287–299. [Google Scholar] [CrossRef]
  48. Joseph Fernando, E.A.; Gomez Selvaraj, M.; Delgado, A.; Rabbi, I.; Kulakow, P. Frontline remote sensing tool to locate hidden traits in root and tuber crops. Mol. Plant 2022, 15, 1500–1502. [Google Scholar] [CrossRef]
  49. Zhao, D.; Eyre, J.X.; Wilkus, E.; de Voil, P.; Broad, I.; Rodriguez, D. 3D characterization of crop water use and the rooting system in field agronomic research. Comput. Electron. Agric. 2022, 202, 107409. [Google Scholar] [CrossRef]
  50. Han, L.; Yang, G.; Yang, H.; Xu, B.; Li, Z.; Yang, X. Clustering Field-Based Maize Phenotyping of Plant-Height Growth and Canopy Spectral Dynamics Using a UAV Remote-Sensing Approach. Front. Plant Sci. 2018, 9, 1638. [Google Scholar] [CrossRef]
  51. Pearse, G.D.; Tan, A.Y.S.; Watt, M.S.; Franz, M.O.; Dash, J.P. Detecting and mapping tree seedlings in UAV imagery using convolutional neural networks and field-verified data. ISPRS J. Photogramm. Remote Sens. 2020, 168, 156–169. [Google Scholar] [CrossRef]
  52. Tseng, H.-H.; Yang, M.-D.; Saminathan, R.; Hsu, Y.-C.; Yang, C.-Y.; Wu, D.-H. Rice Seedling Detection in UAV Images Using Transfer Learning and Machine Learning. Remote Sens. 2022, 14, 2837. [Google Scholar] [CrossRef]
  53. Zhang, L.; Xu, Z.; Xu, D.; Ma, J.; Chen, Y.; Fu, Z. Growth monitoring of greenhouse lettuce based on a convolutional neural network. Hortic. Res. 2020, 7, 124. [Google Scholar] [CrossRef] [PubMed]
  54. Sun, Z.; Guo, X.; Xu, Y.; Zhang, S.; Cheng, X.; Hu, Q.; Wang, W.; Xue, X. Image Recognition of Male Oilseed Rape (Brassica napus) Plants Based on Convolutional Neural Network for UAAS Navigation Applications on Supplementary Pollination and Aerial Spraying. Agriculture 2022, 12, 62. [Google Scholar] [CrossRef]
  55. Aeberli, A.; Johansen, K.; Robson, A.; Lamb, D.W.; Phinn, S. Detection of Banana Plants Using Multi-Temporal Multispectral UAV Imagery. Remote Sens. 2021, 13, 2123. [Google Scholar] [CrossRef]
  56. Zhu, T.; Ma, X.; Guan, H.; Wu, X.; Wang, F.; Yang, C.; Jiang, Q. A calculation method of phenotypic traits based on three-dimensional reconstruction of tomato canopy. Comput. Electron. Agric. 2023, 204, 107515. [Google Scholar] [CrossRef]
  57. Feng, Q.; Yang, J.; Liu, Y.; Ou, C.; Zhu, D.; Niu, B.; Liu, J.; Li, B. Multi-Temporal Unmanned Aerial Vehicle Remote Sensing for Vegetable Mapping Using an Attention-Based Recurrent Convolutional Neural Network. Remote Sens. 2020, 12, 1668. [Google Scholar] [CrossRef]
  58. Yarak, K.; Witayangkurn, A.; Kritiyutanont, K.; Arunplod, C.; Shibasaki, R. Oil Palm Tree Detection and Health Classification on High-Resolution Imagery Using Deep Learning. Agriculture 2021, 11, 183. [Google Scholar] [CrossRef]
  59. Tan, S.; Liu, J.; Lu, H.; Lan, M.; Yu, J.; Liao, G.; Wang, Y.; Li, Z.; Qi, L.; Ma, X. Machine Learning Approaches for Rice Seedling Growth Stages Detection. Front. Plant Sci. 2022, 13, 914771. [Google Scholar] [CrossRef]
  60. Quiroz, I.A.; Alférez, G.H. Image recognition of Legacy blueberries in a Chilean smart farm through deep learning. Comput. Electron. Agric. 2020, 168, 105044. [Google Scholar] [CrossRef]
  61. Yang, W.; Nigon, T.; Hao, Z.; Dias Paiao, G.; Fernández, F.G.; Mulla, D.; Yang, C. Estimation of corn yield based on hyperspectral imagery and convolutional neural network. Comput. Electron. Agric. 2021, 184, 106092. [Google Scholar] [CrossRef]
  62. Sun, G.; Wang, X.; Sun, Y.; Ding, Y.; Lu, W. Measurement Method Based on Multispectral Three-Dimensional Imaging for the Chlorophyll Contents of Greenhouse Tomato Plants. Sensors 2019, 19, 5295. [Google Scholar] [CrossRef] [PubMed]
  63. Jiang, Y.; Snider, J.L.; Li, C.; Rains, G.C.; Paterson, A.H. Ground Based Hyperspectral Imaging to Characterize Canopy-Level Photosynthetic Activities. Remote Sens. 2020, 12, 315. [Google Scholar] [CrossRef]
  64. Taha, M.F.; Mao, H.; Wang, Y.; Elmanawy, A.I.; Elmasry, G.; Wu, L.; Memon, M.S.; Niu, Z.; Huang, T.; Qiu, Z. High-Throughput Analysis of Leaf Chlorophyll Content in Aquaponically Grown Lettuce Using Hyperspectral Reflectance and RGB Images. Plants 2024, 13, 392. [Google Scholar] [CrossRef] [PubMed]
  65. Sapkota, B.B.; Hu, C.; Bagavathiannan, M.V. Evaluating Cross-Applicability of Weed Detection Models Across Different Crops in Similar Production Environments. Front. Plant Sci. 2022, 13, 837726. [Google Scholar] [CrossRef] [PubMed]
  66. Su, D.; Kong, H.; Qiao, Y.; Sukkarieh, S. Data augmentation for deep learning based semantic segmentation and crop-weed classification in agricultural robotics. Comput. Electron. Agric. 2021, 190, 106418. [Google Scholar] [CrossRef]
  67. Nasiri, A.; Omid, M.; Taheri-Garavand, A.; Jafari, A. Deep learning-based precision agriculture through weed recognition in sugar beet fields. Sustain. Comput. Inform. Syst. 2022, 35, 100759. [Google Scholar] [CrossRef]
  68. Yang, J.; Wang, Y.; Chen, Y.; Yu, J. Detection of Weeds Growing in Alfalfa Using Convolutional Neural Networks. Agronomy 2022, 12, 1459. [Google Scholar] [CrossRef]
  69. Bendel, N.; Kicherer, A.; Backhaus, A.; Klueck, H.-C.; Seiffert, U.; Fischer, M.; Voegele, R.T.; Toepfer, R. Evaluating the suitability of hyper- and multispectral imaging to detect foliar symptoms of the grapevine trunk disease Esca in vineyards. Plant Methods 2020, 16, 142. [Google Scholar] [CrossRef]
  70. Blasch, G.; Anberbir, T.; Negash, T.; Tilahun, L.; Belayineh, F.Y.; Alemayehu, Y.; Mamo, G.; Hodson, D.P.; Rodrigues, F.A., Jr. The potential of UAV and very high-resolution satellite imagery for yellow and stem rust detection and phenotyping in Ethiopia. Sci. Rep. 2023, 13, 16768. [Google Scholar] [CrossRef]
  71. McNish, I.G.; Smith, K.P. Oat Crown Rust Disease Severity Estimated at Many Time Points Using Multispectral Aerial Photos. Phytopathology 2022, 112, 682–690. [Google Scholar] [CrossRef] [PubMed]
  72. Ferrari, V.; Calvini, R.; Boom, B.; Menozzi, C.; Rangarajan, A.K.; Maistrello, L.; Offermans, P.; Ulrici, A. Evaluation of the potential of near infrared hyperspectral imaging for monitoring the invasive brown marmorated stink bug. Chemom. Intell. Lab. Syst. 2023, 234, 104751. [Google Scholar] [CrossRef]
  73. Nabwire, S.; Wakholi, C.; Faqeerzada, M.A.; Arief, M.A.A.; Kim, M.S.; Baek, I.; Cho, B.-K. Estimation of Cold Stress, Plant Age, and Number of Leaves in Watermelon Plants Using Image Analysis. Front. Plant Sci. 2022, 13, 847225. [Google Scholar] [CrossRef] [PubMed]
  74. Rossi, R.; Costafreda-Aumedes, S.; Leolini, L.; Leolini, C.; Bindi, M.; Moriondo, M. Implementation of an algorithm for automated phenotyping through plant 3D-modeling: A practical application on the early detection of water stress. Comput. Electron. Agric. 2022, 197, 106937. [Google Scholar] [CrossRef]
  75. Lazarević, B.; Šatović, Z.; Nimac, A.; Vidak, M.; Gunjača, J.; Politeo, O.; Carović-Stanko, K. Application of Phenotyping Methods in Detection of Drought and Salinity Stress in Basil (Ocimum basilicum L.). Front. Plant Sci. 2021, 12, 629441. [Google Scholar] [CrossRef] [PubMed]
  76. Carvalho, L.C.; Gonçalves, E.F.; Marques da Silva, J.; Costa, J.M. Potential Phenotyping Methodologies to Assess Inter- and Intravarietal Variability and to Select Grapevine Genotypes Tolerant to Abiotic Stress. Front. Plant Sci. 2021, 12, 718202. [Google Scholar] [CrossRef]
  77. Li, J.; Shi, Y.; Veeranampalayam-Sivakumar, A.-N.; Schachtman, D.P. Elucidating Sorghum Biomass, Nitrogen and Chlorophyll Contents With Spectral and Morphological Traits Derived From Unmanned Aircraft System. Front. Plant Sci. 2018, 9, 1406. [Google Scholar] [CrossRef]
  78. Jay, S.; Maupas, F.; Bendoula, R.; Gorretta, N. Retrieving LAI, chlorophyll and nitrogen contents in sugar beet crops from multi-angular optical remote sensing: Comparison of vegetation indices and PROSAIL inversion for field phenotyping. Field Crop. Res. 2017, 210, 33–46. [Google Scholar] [CrossRef]
  79. Luisa Buchaillot, M.; Gracia-Romero, A.; Vergara-Diaz, O.; Zaman-Allah, M.A.; Tarekegne, A.; Cairns, J.E.; Prasanna, B.M.; Luis Araus, J.; Kefauver, S.C. Evaluating Maize Genotype Performance under Low Nitrogen Conditions Using RGB UAV Phenotyping Techniques. Sensors 2019, 19, 1815. [Google Scholar] [CrossRef]
  80. de Oliveira, G.S.; Marcato Junior, J.; Polidoro, C.; Osco, L.P.; Siqueira, H.; Rodrigues, L.; Jank, L.; Barrios, S.; Valle, C.; Simeão, R.; et al. Convolutional Neural Networks to Estimate Dry Matter Yield in a Guineagrass Breeding Program Using UAV Remote Sensing. Sensors 2021, 21, 3971. [Google Scholar] [CrossRef]
  81. Zhou, Z.; Song, Z.; Fu, L.; Gao, F.; Li, R.; Cui, Y. Real-time kiwifruit detection in orchard using deep learning on Android™ smartphones for yield estimation. Comput. Electron. Agric. 2020, 179, 105856. [Google Scholar] [CrossRef]
  82. Khaki, S.; Pham, H.; Han, Y.; Kuhl, A.; Kent, W.; Wang, L. Convolutional Neural Networks for Image-Based Corn Kernel Detection and Counting. Sensors 2020, 20, 2721. [Google Scholar] [CrossRef] [PubMed]
  83. Li, Q.; Jin, S.; Zang, J.; Wang, X.; Sun, Z.; Li, Z.; Xu, S.; Ma, Q.; Su, Y.; Guo, Q.; et al. Deciphering the contributions of spectral and structural data to wheat yield estimation from proximal sensing. Crop J. 2022, 10, 1334–1345. [Google Scholar] [CrossRef]
  84. Lu, W.; Du, R.; Niu, P.; Xing, G.; Luo, H.; Deng, Y.; Shu, L. Soybean Yield Preharvest Prediction Based on Bean Pods and Leaves Image Recognition Using Deep Learning Neural Network Combined With GRNN. Front. Plant Sci. 2022, 12, 791256. [Google Scholar] [CrossRef]
  85. Gang, M.-S.; Kim, H.-J.; Kim, D.-W. Estimation of Greenhouse Lettuce Growth Indices Based on a Two-Stage CNN Using RGB-D Images. Sensors 2022, 22, 5499. [Google Scholar] [CrossRef]
  86. Chen, J.; Wang, Z.; Wu, J.; Hu, Q.; Zhao, C.; Tan, C.; Teng, L.; Luo, T. An improved Yolov3 based on dual path network for cherry tomatoes detection. J. Food Process Eng. 2021, 44, e13803. [Google Scholar] [CrossRef]
  87. Al-Badri, A.H.; Ismail, N.A.; Al-Dulaimi, K.; Salman, G.A.; Khan, A.R.; Al-Sabaawi, A.; Salam, M.S.H. Classification of weed using machine learning techniques: A review—Challenges, current and future potential techniques. J. Plant Dis. Prot. 2022, 129, 745–768. [Google Scholar] [CrossRef]
  88. Fan, J.; Zhang, Y.; Wen, W.; Gu, S.; Lu, X.; Guo, X. The future of Internet of Things in agriculture: Plant high-throughput phenotypic platform. J. Clean. Prod. 2021, 280, 123651. [Google Scholar] [CrossRef]
  89. Prey, L.; von Bloh, M.; Schmidhalter, U. Evaluating RGB Imaging and Multispectral Active and Hyperspectral Passive Sensing for Assessing Early Plant Vigor in Winter Wheat. Sensors 2018, 18, 2931. [Google Scholar] [CrossRef]
  90. Liang, T.; Duan, B.; Luo, X.; Ma, Y.; Yuan, Z.; Zhu, R.; Peng, Y.; Gong, Y.; Fang, S.; Wu, X. Identification of High Nitrogen Use Efficiency Phenotype in Rice (Oryza sativa L.) Through Entire Growth Duration by Unmanned Aerial Vehicle Multispectral Imagery. Front. Plant Sci. 2021, 12, 740414. [Google Scholar] [CrossRef]
  91. He, W.; Ye, Z.; Li, M.; Yan, Y.; Lu, W.; Xing, G. Extraction of soybean plant trait parameters based on SfM-MVS algorithm combined with GRNN. Front. Plant Sci. 2023, 14, 1181322. [Google Scholar] [CrossRef]
  92. Koji, T.; Iwata, H.; Ishimori, M.; Takanashi, H.; Yamasaki, Y.; Tsujimoto, H. Multispectral Phenotyping and Genetic Analyses of Spring Appearance in Greening Plant, Phedimus spp. Plant Phenomics 2023, 5, 0063. [Google Scholar] [CrossRef]
  93. Yu, F.; Zhang, Q.; Xiao, J.; Ma, Y.; Wang, M.; Luan, R.; Liu, X.; Ping, Y.; Nie, Y.; Tao, Z.; et al. Progress in the Application of CNN-Based Image Classification and Recognition in Whole Crop Growth Cycles. Remote Sens. 2023, 15, 2988. [Google Scholar] [CrossRef]
  94. Jiang, H.; Zhang, C.; Qiao, Y.; Zhang, Z.; Zhang, W.; Song, C. CNN feature based graph convolutional network for weed and crop recognition in smart farming. Comput. Electron. Agric. 2020, 174, 105450.
  95. Ayankojo, I.T.; Thorp, K.R.; Thompson, A.L. Advances in the Application of Small Unoccupied Aircraft Systems (sUAS) for High-Throughput Plant Phenotyping. Remote Sens. 2023, 15, 2623.
  96. Briechle, S.; Krzystek, P.; Vosselman, G. Silvi-Net—A dual-CNN approach for combined classification of tree species and standing dead trees from remote sensing data. Int. J. Appl. Earth Obs. Geoinf. 2021, 98, 102292.
  97. Coupel-Ledru, A.; Pallas, B.; Delalande, M.; Segura, V.; Guitton, B.; Muranty, H.; Durel, C.E.; Regnard, J.L.; Costes, E. Tree architecture, light interception and water-use related traits are controlled by different genomic regions in an apple tree core collection. New Phytol. 2022, 234, 209–226.
  98. Singh, A.; Jones, S.; Ganapathysubramanian, B.; Sarkar, S.; Mueller, D.; Sandhu, K.; Nagasubramanian, K. Challenges and Opportunities in Machine-Augmented Plant Stress Phenotyping. Trends Plant Sci. 2021, 26, 53–69.
  99. Zubairova, U.S.; Kravtsova, A.Y.; Romashchenko, A.V.; Pushkareva, A.A.; Doroshkov, A.V. Particle-Based Imaging Tools Revealing Water Flows in Maize Nodal Vascular Plexus. Plants 2022, 11, 1533.
  100. Pflugfelder, D.; Kochs, J.; Koller, R.; Jahnke, S.; Mohl, C.; Pariyar, S.; Fassbender, H.; Nagel, K.A.; Watt, M.; van Dusschoten, D.; et al. The root system architecture of wheat establishing in soil is associated with varying elongation rates of seminal roots: Quantification using 4D magnetic resonance imaging. J. Exp. Bot. 2022, 73, 2050–2060.
  101. Mayer, S.; Munz, E.; Hammer, S.; Wagner, S.; Guendel, A.; Rolletschek, H.; Jakob, P.M.; Borisjuk, L.; Neuberger, T. Quantitative monitoring of paramagnetic contrast agents and their allocation in plant tissues via DCE-MRI. Plant Methods 2022, 18, 47.
  102. Tang, W.; Wu, N.; Xiao, Q.; Chen, S.; Gao, P.; He, Y.; Feng, L. Early detection of cotton verticillium wilt based on root magnetic resonance images. Front. Plant Sci. 2023, 14, 1135718.
  103. Wang, J.; Liu, H.; Yao, Q.; Gillbanks, J.; Zhao, X. Research on High-Throughput Crop Root Phenotype 3D Reconstruction Using X-ray CT in 5G Era. Electronics 2023, 12, 276.
  104. Munné-Bosch, S.; Villadangos, S. Cheap, cost-effective, and quick stress biomarkers for drought stress detection and monitoring in plants. Trends Plant Sci. 2023, 28, 527–536.
  105. Wang, C.; Hou, D.; Yu, J.; Yang, Y.; Zhu, B.; Jing, S.; Liu, L.; Bai, J.; Xu, H.; Kou, L. X-ray irradiation maintains quality and delays the reduction of energy charge of fresh figs (Ficus carica L. Siluhongyu). Food Control 2024, 160, 110318.
  106. Ye, L.; Niu, Y.; Wang, Y.; Shi, Y.; Liu, Y.; Yu, J.; Bai, J.; Luo, A. Effect of X-ray irradiation on quality, cell ultrastructure and electrical parameters of postharvest kiwifruit. Innov. Food Sci. Emerg. Technol. 2023, 89, 103483.
  107. Bai, G.; Blecha, S.; Ge, Y.; Walia, H.; Phansak, P. Characterizing wheat response to water limitation using multispectral and thermal imaging. Trans. ASABE 2017, 60, 1457–1466.
  108. Singh, T.; Garg, N.M.; Iyengar, S.R.S. Nondestructive identification of barley seeds variety using near-infrared hyperspectral imaging coupled with convolutional neural network. J. Food Process Eng. 2021, 44, e13821.
  109. Li, H.; Zhang, L.; Sun, H.; Rao, Z.; Ji, H. Discrimination of unsound wheat kernels based on deep convolutional generative adversarial network and near-infrared hyperspectral imaging technology. Spectrochim. Acta Part A Mol. Biomol. Spectrosc. 2022, 268, 120722.
  110. Herr, A.W.; Adak, A.; Carroll, M.E.; Elango, D.; Kar, S.; Li, C.; Jones, S.E.; Carter, A.H.; Murray, S.C.; Paterson, A.; et al. Unoccupied aerial systems imagery for phenotyping in cotton, maize, soybean, and wheat breeding. Crop Sci. 2023, 63, 1722–1749.
  111. Virlet, N.; Lebourgeois, V.; Martinez, S.; Costes, E.; Labbe, S.; Regnard, J.-L. Stress indicators based on airborne thermal imagery for field phenotyping a heterogeneous tree population for response to water constraints. J. Exp. Bot. 2014, 65, 5429–5442.
  112. Lazarević, B.; Carović-Stanko, K.; Živčak, M.; Vodnik, D.; Javornik, T.; Safner, T. Classification of high-throughput phenotyping data for differentiation among nutrient deficiency in common bean. Front. Plant Sci. 2022, 13, 931877.
  113. Galieni, A.; D’Ascenzo, N.; Stagnari, F.; Pagnani, G.; Xie, Q.; Pisante, M. Past and Future of Plant Stress Detection: An Overview From Remote Sensing to Positron Emission Tomography. Front. Plant Sci. 2021, 11, 609155.
  114. Huang, L.; Zhang, Y.; Guo, J.; Peng, Q.; Zhou, Z.; Duan, X.; Tanveer, M.; Guo, Y. High-throughput root phenotyping of crop cultivars tolerant to low N in waterlogged soils. Front. Plant Sci. 2023, 14, 1271539.
  115. Musaev, F.; Priyatkin, N.; Potrakhov, N.; Beletskiy, S.; Chesnokov, Y. Assessment of Brassicaceae Seeds Quality by X-ray Analysis. Horticulturae 2022, 8, 29.
  116. Hong, S.-J.; Park, S.; Lee, C.-H.; Kim, S.; Roh, S.-W.; Nurhisna, N.I.; Kim, G. Application of X-ray Imaging and Convolutional Neural Networks in the Prediction of Tomato Seed Viability. IEEE Access 2023, 11, 38061–38071.
  117. Jin, B.; Zhang, C.; Jia, L.; Tang, Q.; Gao, L.; Zhao, G.; Qi, H. Identification of rice seed varieties based on near-infrared hyperspectral imaging technology combined with deep learning. ACS Omega 2022, 7, 4735–4749.
  118. Lazarević, B.; Carović-Stanko, K.; Safner, T.; Poljak, M. Study of high-temperature-induced morphological and physiological changes in potato using nondestructive plant phenotyping. Plants 2022, 11, 3534.
  119. Park, M.; Somborn, A.; Schlehuber, D.; Keuter, V.; Deerberg, G. Raman spectroscopy in crop quality assessment: Focusing on sensing secondary metabolites: A review. Hortic. Res. 2023, 10, uhad074.
  120. Saletnik, A.; Saletnik, B.; Puchalski, C. Raman Method in Identification of Species and Varieties, Assessment of Plant Maturity and Crop Quality—A Review. Molecules 2022, 27, 4454.
  121. Xu, S.; Huang, X.; Lu, H. Advancements and Applications of Raman Spectroscopy in Rapid Quality and Safety Detection of Fruits and Vegetables. Horticulturae 2023, 9, 843.
  122. Payne, W.Z.; Kurouski, D. Raman spectroscopy enables phenotyping and assessment of nutrition values of plants: A review. Plant Methods 2021, 17, 78.
  123. Wang, H.; Liu, M.; Zhao, H.; Ren, X.; Lin, T.; Zhang, P.; Zheng, D. Rapid detection and identification of fungi in grain crops using colloidal Au nanoparticles based on surface-enhanced Raman scattering and multivariate statistical analysis. World J. Microbiol. Biotechnol. 2023, 39, 26.
  124. Zhang, X.; Bian, F.; Wang, Y.; Hu, L.; Yang, N.; Mao, H. A Method for Capture and Detection of Crop Airborne Disease Spores Based on Microfluidic Chips and Micro Raman Spectroscopy. Foods 2022, 11, 3462.
  125. Sabir, A.; Kumar, A. Study of integrated optical and synthetic aperture radar-based temporal indices database for specific crop mapping using fuzzy machine learning model. J. Appl. Remote Sens. 2023, 17, 014502.
  126. Bai, J.-W.; Zhang, L.; Cai, J.-R.; Wang, Y.-C.; Tian, X.-Y. Laser light backscattering image to predict moisture content of mango slices with different ripeness during drying process. J. Food Process Eng. 2021, 44, e13900.
  127. Gutierrez, S.; Wendel, A.; Underwood, J. Spectral filter design based on in-field hyperspectral imaging and machine learning for mango ripeness estimation. Comput. Electron. Agric. 2019, 164, 104890.
  128. Cao, X.-F.; Yu, K.-Q.; Zhao, Y.-R.; Zhang, H.-H. Current Status of High-Throughput Plant Phenotyping for Abiotic Stress by Imaging Spectroscopy: A Review. Spectrosc. Spectr. Anal. 2020, 40, 3365–3372.
  129. Wang, X.; Silva, P.; Bello, N.M.; Singh, D.; Evers, B.; Mondal, S.; Espinosa, F.P.; Singh, R.P.; Poland, J. Improved Accuracy of High-Throughput Phenotyping From Unmanned Aerial Systems by Extracting Traits Directly From Orthorectified Images. Front. Plant Sci. 2020, 11, 587093.
  130. Hall, R.D.; D’Auria, J.C.; Silva Ferreira, A.C.; Gibon, Y.; Kruszka, D.; Mishra, P.; van de Zedde, R. High-throughput plant phenotyping: A role for metabolomics? Trends Plant Sci. 2022, 27, 549–563.
  131. Xu, R.; Li, C.; Bernardes, S. Development and Testing of a UAV-Based Multi-Sensor System for Plant Phenotyping and Precision Agriculture. Remote Sens. 2021, 13, 3517.
  132. Wan, L.; Zhang, J.; Dong, X.; Du, X.; Zhu, J.; Sun, D.; Liu, Y.; He, Y.; Cen, H. Unmanned aerial vehicle-based field phenotyping of crop biomass using growth traits retrieved from PROSAIL model. Comput. Electron. Agric. 2021, 187, 106304.
  133. Henke, M.; Junker, A.; Neumann, K.; Altmann, T.; Gladilin, E. A two-step registration-classification approach to automated segmentation of multimodal images for high-throughput greenhouse plant phenotyping. Plant Methods 2020, 16, 95.
  134. Zhang, S.-H.; He, L.; Duan, J.-Z.; Zang, S.-L.; Yang, T.-C.; Schulthess, U.R.S.; Guo, T.-C.; Wang, C.-Y.; Feng, W. Aboveground wheat biomass estimation from a low-altitude UAV platform based on multimodal remote sensing data fusion with the introduction of terrain factors. Precis. Agric. 2023, 25, 119–145.
  135. Zheng, F.; Wang, X.; Ji, J.; Ma, H.; Cui, H.; Shi, Y.; Zhao, S. Synchronous Retrieval of LAI and Cab from UAV Remote Sensing: Development of Optimal Estimation Inversion Framework. Agronomy 2023, 13, 1119.
  136. Ampatzidis, Y.; Partel, V. UAV-Based High Throughput Phenotyping in Citrus Utilizing Multispectral Imaging and Artificial Intelligence. Remote Sens. 2019, 11, 410.
Figure 1. Spectral band division and various imaging techniques.
Figure 2. Application of spectral imaging in morphological phenotypes of plants.
Figure 3. Application of spectral imaging in physiological phenotypes of plants.
Figure 4. Application of spectral imaging in biochemical phenotypes of plants.
Figure 5. Application of spectral imaging in performance phenotype of plants.
Figure 6. Application of spectral imaging to crop phenotypes.
Table 1. Main characteristics of plant phenotype monitoring by spectral technique.
| Phenotype Category | Research Object | Phenotypic Characteristics | Imaging Techniques | References |
|---|---|---|---|---|
| Morphological Phenotype | Seed | size, shape, quantity, incompleteness | HSI, MSI, RGB, PET, MRI, X-ray CT | soybeans [16], rice [25], hazelnut [26], pepper seed [27], hybrid okra seed [28], chickpea [29] |
| Morphological Phenotype | Fruit/Ear | size, shape, quantity, color, ear length, ear thickness, symmetry, incompleteness, maturity | HSI, MSI, RGB, PET, X-ray CT | apple [30], sorghum panicle [31], kiwifruit [32], tomato [33], litchi [34], grape [35] |
| Morphological Phenotype | Leaves | leaf area, length, width, number, inclination angle, color, veins, texture, symmetry | HSI, MSI, RGB, PET | leaves [36], grape [37], mango [38], sweet potato [39], tomato [40] |
| Morphological Phenotype | Flower | quantity, color, number of petals, degree of openness | HSI, MSI, RGB | cool-season crops [41], apple [42], coffee [43], strawberry [44], flower [45] |
| Morphological Phenotype | Root | root morphology, tuberous roots, lesions, defects | MRI, X-ray CT | potato [46], carrot [47], root [48], sorghum [49] |
| Morphological Phenotype | Plant | plant height, stem thickness, number of leaves, leaf area ratio, canopy coverage, stem length, plant spacing | HSI, MSI, RGB, LiDAR | maize [50], tree seedling [51], rice seedling [52], lettuce [53], oilseed [54], banana [55] |
| Morphological Phenotype | Canopy | biomass, canopy coverage, coverage rate, average leaf angle, 3D spatial structure | HSI, MSI, RGB, LiDAR | tomato [56], vegetables [57], oil palm tree [58], rice [59], blueberries [60], corn [61] |
| Physiological Phenotype | Plant | leaf texture, leaf surface temperature, photosynthetic capacity, seed hardness, fruit hardness, canopy temperature, texture, density | HSI, MSI, RGB, TI, MRI, X-ray CT | tomato [62], cotton [63], lettuce [64], cucumber [19] |
| Physiological Phenotype | Biotic Stress | disease stress, disease spots, disease severity, pest stress, weed stress | HSI, MSI, RGB, NIRI, MRI, X-ray CT | mango [38], weed [65,66,67,68], grape [69], wheat [70], oats [71], plants [72] |
| Physiological Phenotype | Abiotic Stress | drought stress, high/low-temperature stress, salt stress, nutritional stress | HSI, MSI, ChlF, NIRI, TI, PET, MRI, X-ray CT | common bean [21], watermelon [73], tomato [74], basil [75], grape [76] |
| Biochemical Phenotype | Plant | protein, carbohydrates, nitrogen content, carotenoids, fatty acids, chlorophyll content, water content, anthocyanins, starch, sugar | HSI, MSI, RGB, ChlF, NIRI, Raman | sorghum [77], sugar beet [78], corn [79] |
| Performance Phenotype | Plant | yield, quality, biomass, fresh weight, dry weight | HSI, MSI, RGB, NIRI | guinea grass [80], kiwifruit [81], corn [61,82], wheat [83], soybean [84] |
Table 2. Comparison of different spectral imaging techniques.
| Techniques | Advantages | Disadvantages |
|---|---|---|
| HSI, MSI | non-invasive, fast, and high-throughput; MSI provides higher spatial resolution than HSI; captures stress signals before they become visible | high cost and heavy weight compared with RGB sensors; high data dimensionality requires greater computing power, time, and resources; unsuitable for online applications [95]; limited for plant research at the small-scale or patch level [127]; greater challenges for data mining and machine learning |
| RGB | qualitative, reliable, inexpensive, convenient, and widely used; advantages in spatial resolution, signal-to-noise ratio, throughput, and repeatability | limited image accuracy due to inherent size distortion between 2D planes and 3D plants; obtains only surface features because it cannot penetrate the crop canopy; unsuitable for complex environments with variable lighting, observation angles, object orientations, and various occlusions [9] |
| NIRI | penetrates deeper than other instruments at the same wavelength; highly sensitive for identifying the presence of water, water stress, and cellular physical structure | unable to provide reliable data on plant chemical composition; relatively large relative error due to interference between adjacent peaks in the spectrum; depends on mathematical models for analysis; sensitive to temperature and humidity |
| TI | monitors plant stress responses more simply and cheaply; higher spatial resolution, targeting, and sensitivity to certain environmental factors in a constantly changing environment | obtains only features related to surface temperature; relatively poor spatial resolution and repeatability; higher cost and more difficult to deploy than infrared thermometers; very limited effectiveness for small temperature differences [76]; easily disturbed by soil, air, and canopy temperature |
| LiDAR | durable, with high accuracy, data resolution, reading speed, and sensitivity to small distance changes; suitable for various lighting conditions, such as nighttime and field measurements | high cost, large data volume, and narrow band; unsuitable for complex leaf angles and flat canopies; 3D point cloud generation is time-consuming and computationally heavy; scanning noise is easily generated by wind and rain interference; low accuracy in large-scale phenotype analysis; difficult data analysis |
| ChlF | changes in ChlF can occur before most other signs of stress; fast, non-invasive, easy to operate, low-cost, and highly sensitive; short measurement time, large measurement area, and high throughput | vulnerable to interference from uneven lighting, wind, and rain; unable to distinguish the underlying causes of stress; difficult to separate temperature signals from light signals outdoors; unable to measure soluble solids content, fruit pH, and plant maturity; requires dark-adapted measurements |
| MRI | provides spatial information on nuclei; suitable for obtaining plant morphological characteristics under limited throughput and spatial resolution | can only be operated in the laboratory; no suitable portable devices for in-field crops; long data-collection time and limited throughput; cannot be used on aerial platforms due to the size and weight of the equipment; very high cost |
| X-ray CT | high spatial resolution, signal-to-noise ratio, and repeatability; multi-spectral X-ray provides higher sensitivity for plant identification | high cost, long scan time, and low throughput; poor environmental adaptability; not suitable for airborne use; scans only roots with a diameter of 1 mm or more and therefore misses many fine roots; achieves only 3D visualization and qualitative interpretation of plant organs and tissues; low automation |
| Raman | high spectral resolution; highly sensitive for detecting minor components; surface-enhanced Raman offers better sensitivity | risk of tissue burns under laser irradiation due to small sample volumes and high autofluorescence; Raman signals are very weak and unstable, so the technique should be combined with other methods; strong interference from background biological fluorescence; high cost |

Zhang, Q.; Luan, R.; Wang, M.; Zhang, J.; Yu, F.; Ping, Y.; Qiu, L. Research Progress of Spectral Imaging Techniques in Plant Phenotype Studies. Plants 2024, 13, 3088. https://doi.org/10.3390/plants13213088
