UAV Multisensory Data Fusion and Multi-Task Deep Learning for High-Throughput Maize Phenotyping
Abstract
1. Introduction
2. Materials and Preprocessing
2.1. Test Site and UAV Data Acquisition
2.2. Data Acquisition
2.2.1. Field Data Collection
2.2.2. UAV Data Acquisition
2.3. Post-Collection Hyperspectral Imagery Processing
2.4. Post-Collection LiDAR Point Cloud Processing
2.5. Post-Collection Thermal Imagery Processing
2.6. Image Co-Registration
3. Methods
3.1. Ground-Truth Data Exploration
3.2. Plot-Level Chip Image Segmentation and Feature Scaling
3.3. Extended Normalized Difference Spectral Indices (NDSIs) as a Simple Fusion
3.4. Feature Engineering and Traditional Machine Learning
3.5. Multimodal Fusion and Multi-Task Deep Learning
3.5.1. Deep Learning and the Need for Data Augmentation
3.5.2. Convolutional Neural Network for Imagery Representation Learning
3.5.3. Multimodal Fusion and Multi-Task Prediction Block
3.5.4. Loss Function
3.6. Model Evaluation and Performance Metrics
4. Results
4.1. Results of a Naïve Fusion NDSI Method
4.2. Machine Learning and Deep Learning Performance on Multisensory Fused Data
4.3. Spatial Distribution Maps of Predicted Results
5. Discussion
5.1. Remote Sensing Data for High-Throughput Maize Phenotyping
5.2. Contribution of Different Data Modalities for Phenotyping Predictions
5.3. Feature- and Imagery-Based Prediction Comparison
5.4. Mono-Task and Multi-Task Learning Comparison
5.5. Impacts of Data Augmentation on Deep Learning Regression
5.6. Performance of Different Methods over Space
6. Conclusions
- The success of UAV multisensory data for high-throughput maize phenotyping varies from trait to trait because each trait responds to the experimental and environmental conditions through different mechanisms. Grain density was the least predictable trait (R2 = 0.34), in contrast with the highly predictable plant total nitrogen content and grain nitrogen content (R2 = 0.85). RMSE and MAE were congruent in high-R2 models and diverged in low-R2 models, which signals extreme values in the ground-truth dataset (a short sketch of this metric behavior follows this list). Expanding observations and collecting more data are highly recommended in future research, particularly for grain density, grain NutE, and harvest index.
- Each data modality (hyperspectral, thermal, LiDAR canopy height, LiDAR canopy intensity) contributed differently to the phenotyping predictions, both individually and in fusion. Hyperspectral data contributed the most to virtually all eight trait estimations, especially dry grain yield and nitrogen content in plants and grains. LiDAR canopy height predicted stalk biomass more accurately than any other single modality. The superiority of multisensory data fusion was evident across all phenotype predictions because fusion helps overcome the limitations of any single modality, for example, the vegetation saturation effect in optical remote sensing.
- Feature-based and imagery-based predictions are comparable, when the latter is not outright superior. Image-based deep learning with convolutional neural networks (CNNs) automates feature extraction, neither relying on human expertise nor being prone to human error. This is concretely evidenced by the outperformance of image-based deep learning when thermal or LiDAR intensity data were fed into the CNNs across maize trait predictions. Image-based deep learning also remained stable, as indicated by smaller deviations across dataset shufflings.
- Mono-task and multi-task learning are comparable, when the latter is not outright superior. Multi-task deep learning leverages latent relatedness among maize traits while optimizing the weights and biases of each network node. The sharing protocol of multi-task models reaches its full potential when combined with multisensory data fusion, yielding multi-input multi-output models (a minimal model sketch follows this list). Multi-task models also require only a fraction of the computational resources and time needed by mono-task models, while accelerating high-throughput phenotyping through simultaneous predictions.
- Data augmentation for deep learning regression alleviates the intrinsic issue of small sample sizes in remote sensing research (i.e., the Hughes effect). Augmented data also improve the robustness and reliability of deep learning models through faster convergence and less overfitting (see the augmentation sketch below).
- The spatial randomness of the prediction residuals in the Global Moran's I analysis implies that no confounding variables implicitly biased the predictive performance for maize traits (see the Moran's I sketch below). Small, spatially random regression errors also reinforce the versatility of UAV multisensory data fusion within the multi-task deep learning framework. Cob biomass is the only trait showing a clustered pattern of prediction errors in all models, which warrants further investigation in future research.
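The divergence between RMSE and MAE noted above can be reproduced with a short sketch; this is not code from the study, and all values are synthetic:

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    """Return R2, RMSE, and MAE for one predicted trait."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    resid = y_true - y_pred
    r2 = 1.0 - np.sum(resid**2) / np.sum((y_true - y_true.mean())**2)
    rmse = np.sqrt(np.mean(resid**2))
    mae = np.mean(np.abs(resid))
    return r2, rmse, mae

rng = np.random.default_rng(0)
y = rng.normal(7000, 2000, 100)        # synthetic yield-like values (kg/ha)
pred = y + rng.normal(0, 500, 100)     # well-behaved residuals
print(regression_metrics(y, pred))     # RMSE and MAE stay congruent
pred[0] -= 15000                       # one extreme value in the dataset
print(regression_metrics(y, pred))     # RMSE inflates much faster than MAE
```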
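A minimal sketch of the multi-input multi-output idea, written with the TensorFlow/Keras functional API. The chip size, branch depths, and layer widths are illustrative assumptions rather than the architecture reported here; only the band counts follow the sensor table in the appendix, and the robust Huber loss mirrors the loss function section:

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

def branch(name, channels):
    """One small CNN branch per modality; depths/widths are assumptions."""
    inp = layers.Input(shape=(32, 32, channels), name=name)
    x = layers.Conv2D(16, 3, activation="relu", padding="same")(inp)
    x = layers.Conv2D(32, 3, activation="relu", padding="same")(x)
    x = layers.GlobalAveragePooling2D()(x)
    return inp, x

hsi_in, hsi = branch("hyperspectral", 270)   # 270 VNIR bands
tir_in, tir = branch("thermal", 1)
chm_in, chm = branch("lidar_height", 1)
int_in, lin = branch("lidar_intensity", 1)

# Fusion: concatenate per-modality representations into a shared trunk
shared = layers.Dense(128, activation="relu")(layers.Concatenate()([hsi, tir, chm, lin]))
shared = layers.Dropout(0.3)(shared)

traits = ["stalk_biomass", "cob_biomass", "grain_yield", "harvest_index",
          "grain_nute", "grain_n", "plant_n", "grain_density"]
outputs = [layers.Dense(1, name=t)(shared) for t in traits]  # one head per trait

model = Model([hsi_in, tir_in, chm_in, int_in], outputs)
model.compile(optimizer="adam", loss=tf.keras.losses.Huber())
```

Because the trunk is shared, a single forward pass yields all eight trait predictions, which is where the computational savings over eight separate mono-task models come from.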
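For regression, augmentation must leave the plot-level label untouched. A label-preserving geometric sketch (illustrative only, not necessarily the exact pipeline used in the study):

```python
import numpy as np

def augment_plot(chip, label, rng):
    """Geometric augmentation of a (H, W, C) plot chip.
    Flips and rotations rearrange pixels but do not change
    the plot-level trait value, so the label passes through."""
    if rng.random() < 0.5:
        chip = np.flip(chip, axis=0)          # vertical flip
    if rng.random() < 0.5:
        chip = np.flip(chip, axis=1)          # horizontal flip
    chip = np.rot90(chip, k=rng.integers(4))  # random 90-degree rotation
    return chip, label                        # regression target unchanged
```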
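Global Moran's I of the residuals can be computed with a generic NumPy sketch; the k-nearest-neighbor weight matrix below is an assumption for illustration, not necessarily the spatial weighting used in the study:

```python
import numpy as np

def global_morans_i(residuals, coords, k=8):
    """Global Moran's I with row-standardized k-NN weights.
    residuals: (n,) model errors; coords: (n, 2) plot centroids."""
    z = np.asarray(residuals, dtype=float)
    z = z - z.mean()
    n = z.size
    # Pairwise distances between plot centroids
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    w = np.zeros((n, n))
    rows = np.arange(n)[:, None]
    w[rows, np.argsort(d, axis=1)[:, :k]] = 1.0   # binary k-NN neighbors
    w /= w.sum(axis=1, keepdims=True)             # row-standardize
    # I = (n / S0) * sum_ij w_ij z_i z_j / sum_i z_i^2
    return (n / w.sum()) * (z @ w @ z) / (z @ z)

# Values near 0 indicate spatially random residuals; positive values, clustering.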
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A
Appendix B
Model performance for Stalk Biomass (kg/ha), Cob Biomass (kg/ha), and Dry Grain Yield (kg/ha). SVR and RFR are hand-crafted feature-based models; Mono and Multi denote the imagery-based mono-task and multi-task deep learning models.

Datasets | Metrics | Stalk SVR Train | Stalk SVR Test | Stalk RFR Train | Stalk RFR Test | Stalk Mono Train | Stalk Mono Test | Stalk Multi Train | Stalk Multi Test | Cob SVR Train | Cob SVR Test | Cob RFR Train | Cob RFR Test | Cob Mono Train | Cob Mono Test | Cob Multi Train | Cob Multi Test | Yield SVR Train | Yield SVR Test | Yield RFR Train | Yield RFR Test | Yield Mono Train | Yield Mono Test | Yield Multi Train | Yield Multi Test
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
Thermal | R2 | 0.06 | 0 | 0.1 | 0.02 | 0.15 | 0.12 | 0.15 | 0.11 | 0.05 | 0 | 0.08 | 0 | 0.22 | 0.22 | 0.2 | 0.2 | 0.07 | 0.02 | 0.13 | 0.05 | 0.28 | 0.26 | 0.28 | 0.26
 | MAE | 1522 | 1518 | 1519 | 1486 | 1419 | 1403 | 1422 | 1404 | 377 | 401 | 377 | 393 | 336 | 335 | 339 | 337 | 2672 | 2784 | 2603 | 2743 | 2272 | 2351 | 2273 | 2326
 | RMSE | 2120 | 1969 | 2088 | 1943 | 2015 | 1844 | 2016 | 1855 | 485 | 507 | 474 | 498 | 438 | 443 | 444 | 448 | 3170 | 3252 | 3081 | 3211 | 2786 | 2833 | 2795 | 2830
LiDAR Intensity | R2 | 0.2 | 0.09 | 0.16 | 0.04 | 0.15 | 0.16 | 0.14 | 0.13 | 0.3 | 0.09 | 0.16 | 0.03 | 0.19 | 0.22 | 0.18 | 0.21 | 0.37 | 0.04 | 0.17 | 0 | 0.14 | 0.17 | 0.13 | 0.15
 | MAE | 1394 | 1440 | 1463 | 1502 | 1454 | 1355 | 1473 | 1360 | 316 | 363 | 362 | 383 | 352 | 341 | 356 | 346 | 2108 | 2615 | 2560 | 2807 | 2484 | 2462 | 2510 | 2493
 | RMSE | 1961 | 1867 | 2003 | 1926 | 2019 | 1805 | 2034 | 1824 | 416 | 474 | 454 | 492 | 446 | 443 | 449 | 446 | 2603 | 3209 | 2993 | 3311 | 3047 | 3008 | 3065 | 3023
LiDAR Height | R2 | 0.55 | 0.48 | 0.58 | 0.45 | 0.37 | 0.38 | 0.31 | 0.31 | 0.37 | 0.27 | 0.45 | 0.33 | 0.28 | 0.3 | 0.29 | 0.30 | 0.47 | 0.33 | 0.41 | 0.37 | 0.25 | 0.25 | 0.26 | 0.26
 | MAE | 996 | 1040 | 1052 | 1108 | 1254 | 1191 | 1326 | 1266 | 276 | 307 | 274 | 300 | 325 | 313 | 323 | 313 | 1698 | 1886 | 1925 | 1956 | 2269 | 2249 | 2254 | 2250
 | RMSE | 1472 | 1414 | 1419 | 1464 | 1744 | 1550 | 1823 | 1632 | 393 | 423 | 369 | 410 | 422 | 419 | 419 | 418 | 2390 | 2668 | 2526 | 2611 | 2857 | 2851 | 2843 | 2840
Hyperspectral | R2 | 0.54 | 0.5 | 0.48 | 0.36 | 0.43 | 0.36 | 0.4 | 0.32 | 0.57 | 0.5 | 0.56 | 0.46 | 0.45 | 0.37 | 0.45 | 0.39 | 0.81 | 0.77 | 0.81 | 0.78 | 0.76 | 0.72 | 0.76 | 0.73
 | MAE | 953 | 1000 | 1076 | 1146 | 1131 | 1167 | 1145 | 1171 | 223 | 244 | 238 | 262 | 270 | 291 | 268 | 284 | 1061 | 1206 | 1082 | 1175 | 1225 | 1323 | 1218 | 1276
 | RMSE | 1488 | 1392 | 1582 | 1576 | 1651 | 1568 | 1703 | 1613 | 324 | 353 | 329 | 368 | 369 | 398 | 368 | 390 | 1430 | 1578 | 1438 | 1554 | 1602 | 1722 | 1616 | 1688
Hyper + LiDAR Height | R2 | 0.6 | 0.53 | 0.64 | 0.47 | 0.5 | 0.46 | 0.4 | 0.37 | 0.54 | 0.43 | 0.58 | 0.47 | 0.47 | 0.4 | 0.44 | 0.41 | 0.84 | 0.73 | 0.81 | 0.78 | 0.76 | 0.73 | 0.73 | 0.72
 | MAE | 911 | 1008 | 989 | 1068 | 1069 | 1096 | 1164 | 1156 | 221 | 265 | 233 | 259 | 264 | 281 | 271 | 279 | 914 | 1297 | 1083 | 1176 | 1223 | 1287 | 1274 | 1312
 | RMSE | 1375 | 1343 | 1321 | 1428 | 1545 | 1443 | 1694 | 1549 | 335 | 376 | 322 | 364 | 363 | 387 | 372 | 384 | 1301 | 1682 | 1437 | 1555 | 1609 | 1691 | 1708 | 1718
Hyper + LiDAR Height + LiDAR Intensity | R2 | 0.57 | 0.49 | 0.64 | 0.47 | 0.47 | 0.47 | 0.42 | 0.39 | 0.48 | 0.41 | 0.57 | 0.47 | 0.44 | 0.41 | 0.45 | 0.41 | 0.86 | 0.72 | 0.81 | 0.77 | 0.73 | 0.72 | 0.74 | 0.72
 | MAE | 946 | 1028 | 990 | 1069 | 1116 | 1088 | 1147 | 1135 | 248 | 271 | 236 | 261 | 272 | 281 | 267 | 279 | 833 | 1380 | 1082 | 1179 | 1282 | 1323 | 1268 | 1315
 | RMSE | 1437 | 1393 | 1321 | 1430 | 1594 | 1430 | 1674 | 1530 | 357 | 383 | 326 | 365 | 372 | 386 | 368 | 384 | 1197 | 1748 | 1436 | 1558 | 1705 | 1730 | 1692 | 1723
Hyper + LiDAR Height + LiDAR Intensity + Thermal | R2 | 0.57 | 0.49 | 0.64 | 0.47 | 0.47 | 0.46 | 0.45 | 0.43 | 0.48 | 0.41 | 0.57 | 0.46 | 0.44 | 0.40 | 0.46 | 0.43 | 0.85 | 0.72 | 0.81 | 0.77 | 0.74 | 0.72 | 0.73 | 0.71
 | MAE | 939 | 1038 | 989 | 1070 | 1113 | 1083 | 1107 | 1108 | 248 | 271 | 237 | 263 | 269 | 284 | 262 | 274 | 923 | 1361 | 1082 | 1179 | 1277 | 1338 | 1267 | 1321
 | RMSE | 1432 | 1402 | 1321 | 1430 | 1599 | 1448 | 1623 | 1478 | 357 | 383 | 326 | 367 | 370 | 388 | 365 | 380 | 1228 | 1734 | 1436 | 1558 | 1696 | 1752 | 1701 | 1747
Model performance for Harvest Index (HI), Grain Nitrogen Utilization Efficiency (NutE), and Grain Nitrogen Content (Grain N, kg/ha). SVR and RFR are hand-crafted feature-based models; Mono and Multi denote the imagery-based mono-task and multi-task deep learning models.

Datasets | Metrics | HI SVR Train | HI SVR Test | HI RFR Train | HI RFR Test | HI Mono Train | HI Mono Test | HI Multi Train | HI Multi Test | NutE SVR Train | NutE SVR Test | NutE RFR Train | NutE RFR Test | NutE Mono Train | NutE Mono Test | NutE Multi Train | NutE Multi Test | Grain N SVR Train | Grain N SVR Test | Grain N RFR Train | Grain N RFR Test | Grain N Mono Train | Grain N Mono Test | Grain N Multi Train | Grain N Multi Test
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
Thermal | R2 | 0.02 | 0 | 0.07 | 0 | 0.14 | 0.13 | 0.13 | 0.13 | 0 | 0 | 0.04 | 0 | 0.12 | 0.1 | 0.11 | 0.09 | 0.03 | 0 | 0.12 | 0.03 | 0.33 | 0.3 | 0.33 | 0.31
 | MAE | 0.07 | 0.07 | 0.07 | 0.07 | 0.06 | 0.06 | 0.06 | 0.06 | 8.3 | 7.97 | 8.42 | 8.17 | 7.78 | 7.41 | 7.78 | 7.43 | 41.08 | 43.28 | 40.6 | 43.28 | 33.41 | 34.7 | 33.22 | 34.66
 | RMSE | 0.09 | 0.09 | 0.09 | 0.09 | 0.08 | 0.08 | 0.08 | 0.08 | 11.14 | 10.89 | 10.94 | 10.76 | 10.49 | 10.15 | 10.53 | 10.24 | 48.44 | 50.67 | 46.29 | 48.97 | 40.25 | 41.34 | 40.33 | 41.30
LiDAR Intensity | R2 | 0.12 | 0 | 0.1 | 0 | 0.04 | 0.04 | 0.02 | 0.02 | 0.14 | 0.08 | 0.17 | 0.02 | 0.11 | 0.11 | 0.09 | 0.09 | 0.38 | 0.05 | 0.2 | 0.02 | 0.19 | 0.2 | 0.17 | 0.18
 | MAE | 0.06 | 0.07 | 0.07 | 0.07 | 0.07 | 0.06 | 0.07 | 0.07 | 7.76 | 7.58 | 7.81 | 7.91 | 7.89 | 7.64 | 8.08 | 7.66 | 31.29 | 39.66 | 39.14 | 43.55 | 37.98 | 38.0 | 38.94 | 39.1
 | RMSE | 0.08 | 0.09 | 0.08 | 0.09 | 0.09 | 0.09 | 0.09 | 0.09 | 10.36 | 10.27 | 10.15 | 10.60 | 10.52 | 10.14 | 10.69 | 10.22 | 38.58 | 48.06 | 44.09 | 49.09 | 44.39 | 44.26 | 44.99 | 45.04
LiDAR Height | R2 | 0.23 | 0.02 | 0.28 | 0.12 | 0.05 | 0.04 | 0.04 | 0.03 | 0.33 | 0.2 | 0.35 | 0.26 | 0.18 | 0.17 | 0.17 | 0.19 | 0.54 | 0.36 | 0.47 | 0.42 | 0.3 | 0.3 | 0.3 | 0.3
 | MAE | 0.05 | 0.06 | 0.06 | 0.06 | 0.07 | 0.06 | 0.07 | 0.06 | 6.48 | 6.96 | 6.81 | 6.96 | 7.52 | 7.37 | 7.63 | 7.41 | 23.24 | 27.93 | 27.35 | 28.63 | 33.98 | 34.33 | 34.26 | 34.53
 | RMSE | 0.08 | 0.09 | 0.08 | 0.08 | 0.09 | 0.09 | 0.09 | 0.09 | 9.15 | 9.59 | 9.03 | 9.24 | 10.12 | 9.74 | 10.16 | 9.65 | 33.27 | 39.52 | 35.74 | 37.60 | 41.1 | 41.55 | 41.31 | 41.57
Hyperspectral | R2 | 0.53 | 0.49 | 0.53 | 0.42 | 0.45 | 0.41 | 0.45 | 0.41 | 0.39 | 0.29 | 0.44 | 0.3 | 0.36 | 0.29 | 0.33 | 0.27 | 0.87 | 0.82 | 0.88 | 0.85 | 0.85 | 0.81 | 0.84 | 0.81
 | MAE | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 5.57 | 5.85 | 5.86 | 6.01 | 6.13 | 6.05 | 6.23 | 6.15 | 12.79 | 15.87 | 12.67 | 13.97 | 14.66 | 16.13 | 14.69 | 16.33
 | RMSE | 0.06 | 0.06 | 0.06 | 0.07 | 0.07 | 0.07 | 0.07 | 0.07 | 8.75 | 9.05 | 8.35 | 8.97 | 8.92 | 9.04 | 9.13 | 9.18 | 17.66 | 20.95 | 17.22 | 19.24 | 19.26 | 21.37 | 19.63 | 21.56
Hyper + LiDAR Height | R2 | 0.55 | 0.48 | 0.6 | 0.56 | 0.47 | 0.41 | 0.42 | 0.43 | 0.42 | 0.32 | 0.55 | 0.49 | 0.38 | 0.3 | 0.27 | 0.25 | 0.87 | 0.81 | 0.88 | 0.85 | 0.84 | 0.81 | 0.81 | 0.79
 | MAE | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.05 | 0.04 | 5.36 | 5.90 | 5.51 | 5.55 | 6.01 | 6.12 | 6.74 | 6.51 | 12.69 | 16.62 | 12.67 | 13.99 | 14.63 | 16.21 | 16.03 | 16.77
 | RMSE | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 | 0.07 | 0.07 | 0.07 | 8.48 | 8.82 | 7.48 | 7.62 | 8.79 | 8.97 | 9.54 | 9.27 | 17.77 | 21.76 | 17.22 | 19.27 | 19.58 | 21.73 | 21.42 | 22.43
Hyper + LiDAR Height + LiDAR Intensity | R2 | 0.56 | 0.47 | 0.6 | 0.55 | 0.46 | 0.42 | 0.42 | 0.43 | 0.44 | 0.33 | 0.56 | 0.48 | 0.26 | 0.24 | 0.27 | 0.27 | 0.85 | 0.79 | 0.88 | 0.85 | 0.81 | 0.79 | 0.81 | 0.79
 | MAE | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 5.37 | 6.02 | 5.55 | 5.57 | 6.86 | 6.67 | 6.71 | 6.44 | 14.4 | 16.71 | 12.66 | 14.00 | 15.99 | 16.96 | 15.86 | 16.89
 | RMSE | 0.06 | 0.06 | 0.06 | 0.06 | 0.07 | 0.07 | 0.07 | 0.07 | 8.34 | 8.77 | 7.43 | 7.70 | 9.62 | 9.37 | 9.53 | 9.17 | 19.33 | 22.36 | 17.21 | 19.27 | 21.25 | 22.62 | 21.22 | 22.65
Hyper + LiDAR Height + LiDAR Intensity + Thermal | R2 | 0.55 | 0.46 | 0.6 | 0.55 | 0.45 | 0.42 | 0.44 | 0.45 | 0.45 | 0.31 | 0.56 | 0.48 | 0.29 | 0.23 | 0.29 | 0.29 | 0.86 | 0.8 | 0.88 | 0.85 | 0.81 | 0.78 | 0.82 | 0.79
 | MAE | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 5.21 | 6.08 | 5.55 | 5.57 | 6.71 | 6.65 | 6.61 | 6.36 | 13.47 | 16.65 | 12.66 | 14.00 | 16.24 | 17.01 | 15.51 | 16.7
 | RMSE | 0.06 | 0.06 | 0.06 | 0.06 | 0.07 | 0.07 | 0.07 | 0.07 | 8.24 | 8.90 | 7.43 | 7.69 | 9.44 | 9.39 | 9.39 | 9.02 | 18.57 | 22.20 | 17.21 | 19.28 | 21.66 | 23.10 | 21.15 | 22.50
Model performance for Total Plant Nitrogen (Plant N, kg/ha) and Grain Density. SVR and RFR are hand-crafted feature-based models; Mono and Multi denote the imagery-based mono-task and multi-task deep learning models.

Datasets | Metrics | Plant N SVR Train | Plant N SVR Test | Plant N RFR Train | Plant N RFR Test | Plant N Mono Train | Plant N Mono Test | Plant N Multi Train | Plant N Multi Test | Density SVR Train | Density SVR Test | Density RFR Train | Density RFR Test | Density Mono Train | Density Mono Test | Density Multi Train | Density Multi Test
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
Thermal | R2 | 0.03 | 0 | 0.13 | 0.05 | 0.34 | 0.32 | 0.32 | 0.29 | 0.01 | 0 | 0.03 | 0 | 0.14 | 0.09 | 0.12 | 0.11
 | MAE | 58.2 | 60.79 | 57.97 | 60.30 | 46.8 | 47.91 | 48.32 | 49.19 | 0.03 | 0.03 | 0.03 | 0.03 | 0.03 | 0.02 | 0.03 | 0.03
 | RMSE | 68.96 | 71.64 | 65.48 | 68.07 | 56.93 | 57.49 | 57.74 | 58.52 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.03
LiDAR Intensity | R2 | 0.42 | 0.04 | 0.21 | 0.04 | 0.21 | 0.23 | 0.18 | 0.19 | 0.11 | 0.05 | 0.13 | 0.03 | 0.1 | 0.12 | 0.1 | 0.13
 | MAE | 43.12 | 54.85 | 55.63 | 61.01 | 52.5 | 51.9 | 55.19 | 54.67 | 0.03 | 0.03 | 0.03 | 0.03 | 0.03 | 0.02 | 0.03 | 0.03
 | RMSE | 53.19 | 67.95 | 62.47 | 68.23 | 62.17 | 60.97 | 63.65 | 62.65 | 0.04 | 0.04 | 0.04 | 0.04 | 0.04 | 0.03 | 0.04 | 0.03
LiDAR Height | R2 | 0.58 | 0.44 | 0.52 | 0.47 | 0.36 | 0.36 | 0.34 | 0.34 | 0.29 | 0.15 | 0.28 | 0.19 | 0.14 | 0.13 | 0.15 | 0.14
 | MAE | 32.49 | 37.95 | 37.19 | 39.17 | 46.26 | 46.61 | 46.77 | 47.39 | 0.02 | 0.03 | 0.02 | 0.02 | 0.03 | 0.03 | 0.03 | 0.03
 | RMSE | 45.58 | 51.94 | 48.53 | 50.80 | 56.26 | 55.94 | 56.86 | 56.57 | 0.03 | 0.03 | 0.03 | 0.03 | 0.04 | 0.03 | 0.04 | 0.03
Hyperspectral | R2 | 0.88 | 0.85 | 0.88 | 0.86 | 0.87 | 0.85 | 0.85 | 0.83 | 0.4 | 0.29 | 0.38 | 0.28 | 0.34 | 0.32 | 0.33 | 0.32
 | MAE | 17.37 | 20.39 | 17.31 | 18.98 | 18.97 | 20.25 | 20.01 | 21.56 | 0.02 | 0.02 | 0.02 | 0.02 | 0.02 | 0.02 | 0.02 | 0.02
 | RMSE | 24.51 | 26.91 | 24.24 | 26.16 | 25.45 | 27.06 | 27.18 | 28.74 | 0.03 | 0.03 | 0.03 | 0.03 | 0.03 | 0.03 | 0.03 | 0.03
Hyper + LiDAR Height | R2 | 0.88 | 0.85 | 0.88 | 0.85 | 0.86 | 0.84 | 0.82 | 0.82 | 0.47 | 0.34 | 0.44 | 0.34 | 0.34 | 0.32 | 0.31 | 0.31
 | MAE | 17.27 | 20.38 | 17.41 | 19.16 | 19.68 | 20.83 | 22.25 | 22.62 | 0.02 | 0.02 | 0.02 | 0.02 | 0.02 | 0.02 | 0.02 | 0.02
 | RMSE | 24.3 | 27.22 | 24.15 | 26.64 | 26.25 | 27.43 | 29.53 | 29.40 | 0.03 | 0.03 | 0.03 | 0.03 | 0.03 | 0.03 | 0.03 | 0.03
Hyper + LiDAR Height + LiDAR Intensity | R2 | 0.87 | 0.83 | 0.88 | 0.85 | 0.84 | 0.83 | 0.83 | 0.82 | 0.38 | 0.31 | 0.46 | 0.35 | 0.3 | 0.28 | 0.3 | 0.3
 | MAE | 17.9 | 21.82 | 17.39 | 19.22 | 21.37 | 22.24 | 21.98 | 22.5 | 0.02 | 0.02 | 0.02 | 0.02 | 0.03 | 0.03 | 0.03 | 0.02
 | RMSE | 24.86 | 28.31 | 24.11 | 26.74 | 28.4 | 28.68 | 29.17 | 29.37 | 0.03 | 0.03 | 0.03 | 0.03 | 0.03 | 0.03 | 0.03 | 0.03
Hyper + LiDAR Height + LiDAR Intensity + Thermal | R2 | 0.88 | 0.83 | 0.88 | 0.85 | 0.83 | 0.83 | 0.83 | 0.82 | 0.38 | 0.32 | 0.46 | 0.35 | 0.29 | 0.25 | 0.31 | 0.29
 | MAE | 17.12 | 21.93 | 17.39 | 19.23 | 21.63 | 21.85 | 21.65 | 22.65 | 0.02 | 0.02 | 0.02 | 0.02 | 0.03 | 0.03 | 0.02 | 0.02
 | RMSE | 23.95 | 28.54 | 24.11 | 26.75 | 28.76 | 28.39 | 28.94 | 29.40 | 0.03 | 0.03 | 0.03 | 0.03 | 0.03 | 0.03 | 0.03 | 0.03
References
- Noureldin, N.; Aboelghar, M.; Saudy, H.; Ali, A. Rice yield forecasting models using satellite imagery in Egypt. Egypt. J. Remote Sens. Space Sci. 2013, 16, 125–131. [Google Scholar] [CrossRef]
- Wang, L.; Tian, Y.; Yao, X.; Zhu, Y.; Cao, W. Predicting grain yield and protein content in wheat by fusing multi-sensor and multi-temporal remote-sensing images. Field Crops Res. 2014, 164, 178–188. [Google Scholar] [CrossRef]
- Reynolds, C.A.; Yitayew, M.; Slack, D.C.; Hutchinson, C.F.; Huete, A.; Petersen, M.S. Estimating crop yields and production by integrating the FAO Crop Specific Water Balance model with real-time satellite data and ground-based ancillary data. Int. J. Remote Sens. 2000, 21, 3487–3508. [Google Scholar] [CrossRef]
- Schut, A.G.; Traore, P.C.S.; Blaes, X.; Rolf, A. Assessing yield and fertilizer response in heterogeneous smallholder fields with UAVs and satellites. Field Crops Res. 2018, 221, 98–107. [Google Scholar] [CrossRef]
- Maimaitijiang, M.; Sagan, V.; Sidike, P.; Hartling, S.; Esposito, F.; Fritschi, F.B. Soybean yield prediction from UAV using multimodal data fusion and deep learning. Remote Sens. Environ. 2020, 237, 111599. [Google Scholar] [CrossRef]
- Geipel, J.; Link, J.; Claupein, W. Combined spectral and spatial modeling of corn yield based on aerial images and crop surface models acquired with an unmanned aircraft system. Remote Sens. 2014, 6, 10335–10355. [Google Scholar] [CrossRef]
- Vega, F.A.; Ramirez, F.C.; Saiz, M.P.; Rosúa, F.O. Multi-temporal imaging using an unmanned aerial vehicle for monitoring a sunflower crop. Biosyst. Eng. 2015, 132, 19–27. [Google Scholar] [CrossRef]
- Yang, Q.; Shi, L.; Han, J.; Zha, Y.; Zhu, P. Deep convolutional neural networks for rice grain yield estimation at the ripening stage using UAV-based remotely sensed images. Field Crops Res. 2019, 235, 142–153. [Google Scholar] [CrossRef]
- Anderson, S.L.; Murray, S.C.; Malambo, L.; Ratcliff, C.; Popescu, S.; Cope, D.; Chang, A.; Jung, J.; Thomasson, J.A. Prediction of maize grain yield before maturity using improved temporal height estimates of unmanned aerial systems. Plant Phenome J. 2019, 2, 1–15. [Google Scholar] [CrossRef]
- Ballester, C.; Hornbuckle, J.; Brinkhoff, J.; Smith, J.; Quayle, W. Assessment of in-season cotton nitrogen status and lint yield prediction from unmanned aerial system imagery. Remote Sens. 2017, 9, 1149. [Google Scholar] [CrossRef] [Green Version]
- Uto, K.; Seki, H.; Saito, G.; Kosugi, Y. Characterization of rice paddies by a UAV-mounted miniature hyperspectral sensor system. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2013, 6, 851–860. [Google Scholar] [CrossRef]
- Aasen, H.; Burkart, A.; Bolten, A.; Bareth, G. Generating 3D hyperspectral information with lightweight UAV snapshot cameras for vegetation monitoring: From camera calibration to quality assurance. ISPRS J. Photogramm. Remote Sens. 2015, 108, 245–259. [Google Scholar] [CrossRef]
- Quemada, M.; Gabriel, J.L.; Zarco-Tejada, P. Airborne hyperspectral images and ground-level optical sensors as assessment tools for maize nitrogen fertilization. Remote Sens. 2014, 6, 2940–2962. [Google Scholar] [CrossRef]
- Honkavaara, E.; Saari, H.; Kaivosoja, J.; Pölönen, I.; Hakala, T.; Litkey, P.; Mäkynen, J.; Pesonen, L. Processing and Assessment of Spectrometric, Stereoscopic Imagery Collected Using a Lightweight UAV Spectral Camera for Precision Agriculture. Remote Sens. 2013, 5, 5006–5039. [Google Scholar] [CrossRef]
- Zarco-Tejada, P.J.; González-Dugo, V.; Berni, J.A. Fluorescence, temperature and narrow-band indices acquired from a UAV platform for water stress detection using a micro-hyperspectral imager and a thermal camera. Remote Sens. Environ. 2012, 117, 322–337. [Google Scholar] [CrossRef]
- Maimaitiyiming, M.; Sagan, V.; Sidike, P.; Maimaitijiang, M.; Miller, A.J.; Kwasniewski, M. Leveraging Very-High Spatial Resolution Hyperspectral and Thermal UAV Imageries for Characterizing Diurnal Indicators of Grapevine Physiology. Remote Sens. 2020, 12, 3216. [Google Scholar] [CrossRef]
- Kumar, A.; Lee, W.S.; Ehsani, R.J.; Albrigo, L.G.; Yang, C.; Mangan, R.L. Citrus greening disease detection using aerial hyperspectral and multispectral imaging techniques. J. Appl. Remote Sens. 2012, 6, 063542. [Google Scholar]
- Nguyen, C.; Sagan, V.; Maimaitiyiming, M.; Maimaitijiang, M.; Bhadra, S.; Kwasniewski, M.T. Early Detection of Plant Viral Disease Using Hyperspectral Imaging and Deep Learning. Sensors 2021, 21, 742. [Google Scholar] [CrossRef]
- Kanning, M.; Kühling, I.; Trautz, D.; Jarmer, T. High-resolution UAV-based hyperspectral imagery for LAI and chlorophyll estimations from wheat for yield prediction. Remote Sens. 2018, 10, 2000. [Google Scholar] [CrossRef]
- Feng, L.; Zhang, Z.; Ma, Y.; Du, Q.; Williams, P.; Drewry, J.; Luck, B. Alfalfa Yield Prediction Using UAV-Based Hyperspectral Imagery and Ensemble Learning. Remote Sens. 2020, 12, 2028. [Google Scholar] [CrossRef]
- Zhang, X.; Zhao, J.; Yang, G.; Liu, J.; Cao, J.; Li, C.; Zhao, X.; Gai, J. Establishment of Plot-Yield Prediction Models in Soybean Breeding Programs Using UAV-Based Hyperspectral Remote Sensing. Remote Sens. 2019, 11, 2752. [Google Scholar] [CrossRef] [Green Version]
- Zhang, Y.; Qin, Q.; Ren, H.; Sun, Y.; Li, M.; Zhang, T.; Ren, S. Optimal Hyperspectral Characteristics Determination for Winter Wheat Yield Prediction. Remote Sens. 2018, 10, 2015. [Google Scholar] [CrossRef]
- Hughes, G. On the mean accuracy of statistical pattern recognizers. IEEE Trans. Inf. Theory 1968, 14, 55–63. [Google Scholar] [CrossRef]
- Bhadra, S.; Sagan, V.; Maimaitijiang, M.; Maimaitiyiming, M.; Newcomb, M.; Shakoor, N.; Mockler, T.C. Quantifying Leaf Chlorophyll Concentration of Sorghum from Hyperspectral Data Using Derivative Calculus and Machine Learning. Remote Sens. 2020, 12, 2082. [Google Scholar] [CrossRef]
- Maimaitiyiming, M.; Sagan, V.; Sidike, P.; Kwasniewski, M.T. Dual Activation Function-Based Extreme Learning Machine (ELM) for Estimating Grapevine Berry Yield and Quality. Remote Sens. 2019, 11, 740. [Google Scholar] [CrossRef]
- Bravo, C.; Moshou, D.; West, J.; McCartney, A.; Ramon, H. Early disease detection in wheat fields using spectral reflectance. Biosyst. Eng. 2003, 84, 137–145. [Google Scholar] [CrossRef]
- Xie, C.; He, Y. Spectrum and image texture features analysis for early blight disease detection on eggplant leaves. Sensors 2016, 16, 676. [Google Scholar] [CrossRef]
- Huang, L.; Zhang, H.; Ruan, C.; Huang, W.; Hu, T.; Zhao, J. Detection of scab in wheat ears using in situ hyperspectral data and support vector machine optimized by genetic algorithm. Int. J. Agric. Biol. Eng. 2020, 13, 182–188. [Google Scholar] [CrossRef]
- Liu, F.; Xiao, Z. Disease Spots Identification of Potato Leaves in Hyperspectral Based on Locally Adaptive 1D-CNN. In Proceedings of the 2020 IEEE International Conference on Artificial Intelligence and Computer Applications (ICAICA), Dalian, China, 27–29 June 2020; pp. 355–358. [Google Scholar]
- Jin, X.; Jie, L.; Wang, S.; Qi, H.J.; Li, S.W. Classifying wheat hyperspectral pixels of healthy heads and Fusarium head blight disease using a deep neural network in the wild field. Remote Sens. 2018, 10, 395. [Google Scholar] [CrossRef]
- Hruška, J.; Adão, T.; Pádua, L.; Marques, P.; Peres, E.; Sousa, A.; Morais, R.; Sousa, J.J. Deep Learning-Based Methodological Approach for Vineyard Early Disease Detection Using Hyperspectral Data. In Proceedings of the IGARSS 2018-2018 IEEE International Geoscience and Remote Sensing Symposium, Valencia, Spain, 22–27 July 2018; pp. 9063–9066. [Google Scholar]
- Wu, D.; Sun, D.-W. Advanced applications of hyperspectral imaging technology for food quality and safety analysis and assessment: A review—Part I: Fundamentals. Innov. Food Sci. Emerg. Technol. 2013, 19, 1–14. [Google Scholar] [CrossRef]
- Mulla, D.J. Twenty five years of remote sensing in precision agriculture: Key advances and remaining knowledge gaps. Biosyst. Eng. 2013, 114, 358–371. [Google Scholar] [CrossRef]
- Gómez-Candón, D.; Virlet, N.; Labbé, S.; Jolivot, A.; Regnard, J.-L. Field phenotyping of water stress at tree scale by UAV-sensed imagery: New insights for thermal acquisition and calibration. Precis. Agric. 2016, 17, 786–800. [Google Scholar] [CrossRef]
- Zúñiga Espinoza, C.; Khot, L.R.; Sankaran, S.; Jacoby, P.W. High resolution multispectral and thermal remote sensing-based water stress assessment in subsurface irrigated grapevines. Remote Sens. 2017, 9, 961. [Google Scholar] [CrossRef]
- Park, S.; Ryu, D.; Fuentes, S.; Chung, H.; Hernández-Montes, E.; O’Connell, M. Adaptive estimation of crop water stress in nectarine and peach orchards using high-resolution imagery from an unmanned aerial vehicle (UAV). Remote Sens. 2017, 9, 828. [Google Scholar] [CrossRef]
- Gonzalez-Dugo, V.; Goldhamer, D.; Zarco-Tejada, P.J.; Fereres, E. Improving the precision of irrigation in a pistachio farm using an unmanned airborne thermal system. Irrig. Sci. 2015, 33, 43–52. [Google Scholar] [CrossRef]
- Ludovisi, R.; Tauro, F.; Salvati, R.; Khoury, S.; Mugnozza Scarascia, G.; Harfouche, A. UAV-based thermal imaging for high-throughput field phenotyping of black poplar response to drought. Front. Plant Sci. 2017, 8, 1681. [Google Scholar] [CrossRef]
- Sagan, V.; Maimaitijiang, M.; Sidike, P.; Eblimit, K.; Peterson, K.T.; Hartling, S.; Esposito, F.; Khanal, K.; Newcomb, M.; Pauli, D.; et al. UAV-Based High Resolution Thermal Imaging for Vegetation Monitoring, and Plant Phenotyping Using ICI 8640 P, FLIR Vue Pro R 640, and thermoMap Cameras. Remote Sens. 2019, 11, 330. [Google Scholar] [CrossRef]
- Da Luz, B.R.; Crowley, J.K. Spectral reflectance and emissivity features of broad leaf plants: Prospects for remote sensing in the thermal infrared (8.0–14.0 μm). Remote Sens. Environ. 2007, 109, 393–405. [Google Scholar] [CrossRef]
- Maimaitijiang, M.; Ghulam, A.; Sidike, P.; Hartling, S.; Maimaitiyiming, M.; Peterson, K.; Shavers, E.; Fishman, J.; Peterson, J.; Kadam, S. Unmanned Aerial System (UAS)-based phenotyping of soybean using multi-sensor data fusion and extreme learning machine. ISPRS J. Photogramm. Remote Sens. 2017, 134, 43–58. [Google Scholar] [CrossRef]
- García, M.; Saatchi, S.; Ustin, S.; Balzter, H. Modelling forest canopy height by integrating airborne LiDAR samples with satellite Radar and multispectral imagery. Int. J. Appl. Earth Obs. Geoinf. 2018, 66, 159–173. [Google Scholar] [CrossRef]
- Shi, Y.; Wang, T.; Skidmore, A.K.; Heurich, M. Improving LiDAR-based tree species mapping in Central European mixed forests using multi-temporal digital aerial colour-infrared photographs. Int. J. Appl. Earth Obs. Geoinf. 2020, 84, 101970. [Google Scholar] [CrossRef]
- Blomley, R.; Hovi, A.; Weinmann, M.; Hinz, S.; Korpela, I.; Jutzi, B. Tree species classification using within crown localization of waveform LiDAR attributes. ISPRS J. Photogramm. Remote Sens. 2017, 133, 142–156. [Google Scholar] [CrossRef]
- Qin, Y.; Li, S.; Vu, T.-T.; Niu, Z.; Ban, Y. Synergistic application of geometric and radiometric features of LiDAR data for urban land cover mapping. Opt. Express 2015, 23, 13761–13775. [Google Scholar] [CrossRef] [PubMed]
- Andújar, D.; Moreno, H.; Bengochea-Guevara, J.M.; de Castro, A.; Ribeiro, A. Aerial imagery or on-ground detection? An economic analysis for vineyard crops. Comput. Electron. Agric. 2019, 157, 351–358. [Google Scholar] [CrossRef]
- Wang, D.; Xin, X.; Shao, Q.; Brolly, M.; Zhu, Z.; Chen, J. Modeling Aboveground Biomass in Hulunber Grassland Ecosystem by Using Unmanned Aerial Vehicle Discrete Lidar. Sensors 2017, 17, 180. [Google Scholar] [CrossRef]
- Sankey, T.; Donager, J.; McVay, J.; Sankey, J.B. UAV lidar and hyperspectral fusion for forest monitoring in the southwestern USA. Remote Sens. Environ. 2017, 195, 30–43. [Google Scholar] [CrossRef]
- Qin, H.; Zhou, W.; Yao, Y.; Wang, W. Individual tree segmentation and tree species classification in subtropical broadleaf forests using UAV-based LiDAR, hyperspectral, and ultrahigh-resolution RGB data. Remote Sens. Environ. 2022, 280, 113143. [Google Scholar] [CrossRef]
- Yu, R.; Luo, Y.; Zhou, Q.; Zhang, X.; Wu, D.; Ren, L. A machine learning algorithm to detect pine wilt disease using UAV-based hyperspectral imagery and LiDAR data at the tree level. Int. J. Appl. Earth Obs. Geoinf. 2021, 101, 102363. [Google Scholar] [CrossRef]
- Dilmurat, K.; Sagan, V.; Maimaitijiang, M.; Moose, S.; Fritschi, F.B. Estimating Crop Seed Composition Using Machine Learning from Multisensory UAV Data. Remote Sens. 2022, 14, 4786. [Google Scholar] [CrossRef]
- Jones, D.B. Factors for Converting Percentages of Nitrogen in Foods and Feeds into Percentages of Proteins; US Department of Agriculture: Washington, DC, USA, 1931. [Google Scholar]
- Brede, B.; Lau, A.; Bartholomeus, H.M.; Kooistra, L. Comparing RIEGL RiCOPTER UAV LiDAR derived canopy height and DBH with terrestrial LiDAR. Sensors 2017, 17, 2371. [Google Scholar] [CrossRef]
- Gómez-Chova, L.; Alonso, L.; Guanter, L.; Camps-Valls, G.; Calpe, J.; Moreno, J. Correction of systematic spatial noise in push-broom hyperspectral sensors: Application to CHRIS/PROBA images. Appl. Opt. 2008, 47, F46–F60. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Barreto, M.A.P.; Johansen, K.; Angel, Y.; McCabe, M.F. Radiometric assessment of a UAV-based push-broom hyperspectral camera. Sensors 2019, 19, 4699. [Google Scholar] [CrossRef] [PubMed]
- Liu, Y.; Wang, T.; Ma, L.; Wang, N. Spectral calibration of hyperspectral data observed from a hyperspectrometer loaded on an unmanned aerial vehicle platform. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 2630–2638. [Google Scholar] [CrossRef]
- Falco, G.; Pini, M.; Marucco, G. Loose and tight GNSS/INS integrations: Comparison of performance assessed in real urban scenarios. Sensors 2017, 17, 255. [Google Scholar] [CrossRef] [PubMed]
- Dong, Y.; Wang, D.; Zhang, L.; Li, Q.; Wu, J. Tightly coupled GNSS/INS integration with robust sequential kalman filter for accurate vehicular navigation. Sensors 2020, 20, 561. [Google Scholar] [CrossRef]
- Han, Y.; Choi, J.; Jung, J.; Chang, A.; Oh, S.; Yeom, J. Automated coregistration of multisensor orthophotos generated from unmanned aerial vehicle platforms. J. Sens. 2019, 2019, 2962734. [Google Scholar] [CrossRef]
- Maimaitijiang, M.; Sagan, V.; Bhadra, S.; Nguyen, C.; Mockler, T.; Shakoor, N. A fully automated and fast approach for canopy cover estimation using super high-resolution remote sensing imagery. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2021, 5, 219–226. [Google Scholar] [CrossRef]
- Arthur, D.; Vassilvitskii, S. k-means++: The advantages of careful seeding. In Proceedings of the Eighteenth Annual ACM-SIAM Symposium on Discrete Algorithms, New Orleans, LA, USA, 7–9 January 2007; pp. 1027–1035. [Google Scholar]
- Raschka, S. Python Machine Learning; Packt Publishing Ltd.: Birmingham, UK, 2015. [Google Scholar]
- Gosselin, N.; Sagan, V.; Maimaitiyiming, M.; Fishman, J.; Belina, K.; Podleski, A.; Maimaitijiang, M.; Bashir, A.; Balakrishna, J.; Dixon, A. Using Visual Ozone Damage Scores and Spectroscopy to Quantify Soybean Responses to Background Ozone. Remote Sens. 2020, 12, 93. [Google Scholar] [CrossRef]
- Maimaitiyiming, M.; Ghulam, A.; Bozzolo, A.; Wilkins, J.L.; Kwasniewski, M.T. Early Detection of Plant Physiological Responses to Different Levels of Water Stress Using Reflectance Spectroscopy. Remote Sens. 2017, 9, 745. [Google Scholar] [CrossRef]
- Dilmurat, K.; Sagan, V.; Moose, S. AI-driven maize yield forecasting using unmanned aerial vehicle-based hyperspectral and lidar data fusion. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2022, 5, 193–199. [Google Scholar] [CrossRef]
- Maimaitijiang, M.; Sagan, V.; Erkbol, H.; Adrian, J.; Newcomb, M.; LeBauer, D.; Pauli, D.; Shakoor, N.; Mockler, T.C. UAV-based sorghum growth monitoring: A comparative analysis of lidar and photogrammetry. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2020, 5, 489–496. [Google Scholar] [CrossRef]
- Cortes, C.; Vapnik, V. Support-vector networks. Mach. Learn. 1995, 20, 273–297. [Google Scholar] [CrossRef]
- Breiman, L. Random forests. Mach. Learn. 2001, 45, 5–32. [Google Scholar] [CrossRef]
- Kohavi, R. A study of cross-validation and bootstrap for accuracy estimation and model selection. In Proceedings of the IJCAI, Montreal, QC, Canada, 20–25 August 1995; pp. 1137–1145. [Google Scholar]
- Gitelson, A.A.; Gritz, Y.; Merzlyak, M.N. Relationships between leaf chlorophyll content and spectral reflectance and algorithms for non-destructive chlorophyll assessment in higher plant leaves. J. Plant Physiol. 2003, 160, 271–282. [Google Scholar] [CrossRef] [PubMed]
- Gitelson, A.; Merzlyak, M.N. Quantitative estimation of chlorophyll-a using reflectance spectra: Experiments with autumn chestnut and maple leaves. J. Photochem. Photobiol. B Biol. 1994, 22, 247–252. [Google Scholar] [CrossRef]
- Rondeaux, G.; Steven, M.; Baret, F. Optimization of soil-adjusted vegetation indices. Remote Sens. Environ. 1996, 55, 95–107. [Google Scholar] [CrossRef]
- Zarco-Tejada, P.J.; Berjón, A.; López-Lozano, R.; Miller, J.R.; Martín, P.; Cachorro, V.; González, M.; De Frutos, A. Assessing vineyard condition with hyperspectral indices: Leaf and canopy reflectance simulation in a row-structured discontinuous canopy. Remote Sens. Environ. 2005, 99, 271–287. [Google Scholar] [CrossRef]
- Penuelas, J.; Baret, F.; Filella, I. Semi-empirical indices to assess carotenoids/chlorophyll a ratio from leaf spectral reflectance. Photosynthetica 1995, 31, 221–230. [Google Scholar]
- Haboudane, D.; Miller, J.R.; Tremblay, N.; Zarco-Tejada, P.J.; Dextraze, L. Integrated narrow-band vegetation indices for prediction of crop chlorophyll content for application to precision agriculture. Remote Sens. Environ. 2002, 81, 416–426. [Google Scholar] [CrossRef]
- Bausch, W.C.; Duke, H.R. Remote Sensing of Plant Nitrogen Status in Corn. Trans. ASAE 1996, 39, 1869–1875. [Google Scholar] [CrossRef]
- Haboudane, D.; Miller, J.R.; Pattey, E.; Zarco-Tejada, P.J.; Strachan, I.B. Hyperspectral vegetation indices and novel algorithms for predicting green LAI of crop canopies: Modeling and validation in the context of precision agriculture. Remote Sens. Environ. 2004, 90, 337–352. [Google Scholar] [CrossRef]
- Gamon, J.A.; Peñuelas, J.; Field, C.B. A narrow-waveband spectral index that tracks diurnal changes in photosynthetic efficiency. Remote Sens. Environ. 1992, 41, 35–44. [Google Scholar] [CrossRef]
- Chappelle, E.W.; Kim, M.S.; McMurtrey, J.E. Ratio analysis of reflectance spectra (RARS): An algorithm for the remote estimation of the concentrations of chlorophyll A, chlorophyll B, and carotenoids in soybean leaves. Remote Sens. Environ. 1992, 39, 239–247. [Google Scholar] [CrossRef]
- Blackburn, G.A. Spectral indices for estimating photosynthetic pigment concentrations: A test using senescent tree leaves. Int. J. Remote Sens. 1998, 19, 657–675. [Google Scholar] [CrossRef]
- Peñuelas, J.; Gamon, J.A.; Fredeen, A.L.; Merino, J.; Field, C.B. Reflectance indices associated with physiological changes in nitrogen- and water-limited sunflower leaves. Remote Sens. Environ. 1994, 48, 135–146. [Google Scholar] [CrossRef]
- Metternicht, G. Vegetation indices derived from high-resolution airborne videography for precision crop management. Int. J. Remote Sens. 2003, 24, 2855–2877. [Google Scholar] [CrossRef]
- Rouse, J.W.; Haas, R.H.; Schell, J.A.; Deering, D.W. Monitoring vegetation systems in the Great Plains with ERTS. NASA Spec. Publ. 1973, 351, 309. [Google Scholar]
- Gitelson, A.A.; Merzlyak, M.N. Remote estimation of chlorophyll content in higher plant leaves. Int. J. Remote Sens. 1997, 18, 2691–2697. [Google Scholar] [CrossRef]
- Jordan, C.F. Derivation of leaf-area index from quality of light on the forest floor. Ecology 1969, 50, 663–666. [Google Scholar] [CrossRef]
- Barnes, E.; Clarke, T.; Richards, S.; Colaizzi, P.; Haberland, J.; Kostrzewski, M.; Waller, P.; Choi, C.; Riley, E.; Thompson, T. Coincident detection of crop water stress, nitrogen status and canopy density using ground based multispectral data. In Proceedings of the Fifth International Conference on Precision Agriculture, Bloomington, MN, USA, 16–19 July 2000. [Google Scholar]
- Broge, N.H.; Leblanc, E. Comparing prediction power and stability of broadband and hyperspectral vegetation indices for estimation of green leaf area index and canopy chlorophyll density. Remote Sens. Environ. 2001, 76, 156–172. [Google Scholar] [CrossRef]
- Zarco-Tejada, P.J.; Miller, J.R.; Mohammed, G.H.; Noland, T.L.; Sampson, P.H. Chlorophyll fluorescence effects on vegetation apparent reflectance: II. Laboratory and airborne canopy-level measurements with hyperspectral data. Remote Sens. Environ. 2000, 74, 596–608. [Google Scholar] [CrossRef]
- Zarco-Tejada, P.J.; Miller, J.R.; Mohammed, G.H.; Noland, T.L. Chlorophyll fluorescence effects on vegetation apparent reflectance: I. Leaf-level measurements and model simulation. Remote Sens. Environ. 2000, 74, 582–595. [Google Scholar] [CrossRef]
- Dobrowski, S.; Pushnik, J.; Zarco-Tejada, P.J.; Ustin, S. Simple reflectance indices track heat and water stress-induced changes in steady-state chlorophyll fluorescence at the canopy scale. Remote Sens. Environ. 2005, 97, 403–414. [Google Scholar] [CrossRef]
- Sims, D.A.; Gamon, J.A. Relationships between leaf pigment content and spectral reflectance across a wide range of species, leaf structures and developmental stages. Remote Sens. Environ. 2002, 81, 337–354. [Google Scholar] [CrossRef]
- Barnes, J.D.; Balaguer, L.; Manrique, E.; Elvira, S.; Davison, A.W. A reappraisal of the use of DMSO for the extraction and determination of chlorophylls a and b in lichens and higher plants. Environ. Exp. Bot. 1992, 32, 85–100. [Google Scholar] [CrossRef]
- Merton, R. Monitoring community hysteresis using spectral shift analysis and the red-edge vegetation stress index. In Proceedings of the Seventh Annual JPL Airborne Earth Science Workshop, Pasadena, CA, USA, 12–16 January 1998; pp. 12–16. [Google Scholar]
- Peñuelas, J.; Filella, I.; Biel, C.; Serrano, L.; Save, R. The reflectance at the 950–970 nm region as an indicator of plant water status. Int. J. Remote Sens. 1993, 14, 1887–1905. [Google Scholar] [CrossRef]
- Babar, M.; Reynolds, M.; Van Ginkel, M.; Klatt, A.; Raun, W.; Stone, M. Spectral reflectance to estimate genetic variation for in-season biomass, leaf chlorophyll, and canopy temperature in wheat. Crop Sci. 2006, 46, 1046–1057. [Google Scholar] [CrossRef]
- Elsayed, S.; Rischbeck, P.; Schmidhalter, U. Comparing the performance of active and passive reflectance sensors to assess the normalized relative canopy temperature and grain yield of drought-stressed barley cultivars. Field Crops Res. 2015, 177, 148–160. [Google Scholar] [CrossRef]
- Yu, X.; Wu, X.; Luo, C.; Ren, P. Deep learning in remote sensing scene classification: A data augmentation enhanced convolutional neural network framework. GISci. Remote Sens. 2017, 54, 741–758. [Google Scholar] [CrossRef]
- Li, W.; Chen, C.; Zhang, M.; Li, H.; Du, Q. Data augmentation for hyperspectral image classification with deep CNN. IEEE Geosci. Remote Sens. Lett. 2018, 16, 593–597. [Google Scholar] [CrossRef]
- Glorot, X.; Bengio, Y. Understanding the difficulty of training deep feedforward neural networks. In Proceedings of the Thirteenth International Conference on Artificial Intelligence And Statistics, Sardinia, Italy, 13–15 May 2010; pp. 249–256. [Google Scholar]
- Nair, V.; Hinton, G.E. Rectified linear units improve restricted boltzmann machines. In Proceedings of the ICML, Haifa, Israel, 21–24 June 2010. [Google Scholar]
- Srivastava, N.; Hinton, G.; Krizhevsky, A.; Sutskever, I.; Salakhutdinov, R. Dropout: A simple way to prevent neural networks from overfitting. J. Mach. Learn. Res. 2014, 15, 1929–1958. [Google Scholar]
- Huber, P.J. Robust Estimation of a Location Parameter. Ann. Math. Stat. 1964, 35, 73–101. [Google Scholar] [CrossRef]
- Sagan, V.; Maimaitijiang, M.; Bhadra, S.; Maimaitiyiming, M.; Brown, D.R.; Sidike, P.; Fritschi, F.B. Field-scale crop yield prediction using multi-temporal WorldView-3 and PlanetScope satellite data and deep learning. ISPRS J. Photogramm. Remote Sens. 2021, 174, 265–281. [Google Scholar] [CrossRef]
- Fan, J.; Zhou, J.; Wang, B.; de Leon, N.; Kaeppler, S.M.; Lima, D.C.; Zhang, Z. Estimation of Maize Yield and Flowering Time Using Multi-Temporal UAV-Based Hyperspectral Data. Remote Sens. 2022, 14, 3052. [Google Scholar] [CrossRef]
- van Klompenburg, T.; Kassahun, A.; Catal, C. Crop yield prediction using machine learning: A systematic literature review. Comput. Electron. Agric. 2020, 177, 105709. [Google Scholar] [CrossRef]
- Maresma, Á.; Ariza, M.; Martínez, E.; Lloveras, J.; Martínez-Casasnovas, J.A. Analysis of Vegetation Indices to Determine Nitrogen Application and Yield Prediction in Maize (Zea mays L.) from a Standard UAV Service. Remote Sens. 2016, 8, 973. [Google Scholar] [CrossRef]
- López-Calderón, M.J.; Estrada-Ávalos, J.; Rodríguez-Moreno, V.M.; Mauricio-Ruvalcaba, J.E.; Martínez-Sifuentes, A.R.; Delgado-Ramírez, G.; Miguel-Valle, E. Estimation of Total Nitrogen Content in Forage Maize (Zea mays L.) Using Spectral Indices: Analysis by Random Forest. Agriculture 2020, 10, 451. [Google Scholar] [CrossRef]
- Zhu, Y.; Zhao, C.; Yang, H.; Yang, G.; Han, L.; Li, Z.; Feng, H.; Xu, B.; Wu, J.; Lei, L. Estimation of maize above-ground biomass based on stem-leaf separation strategy integrated with LiDAR and optical remote sensing data. PeerJ 2019, 7, e7593. [Google Scholar] [CrossRef]
- Meiyan, S.; Mengyuan, S.; Qizhou, D.; Xiaohong, Y.; Baoguo, L.; Yuntao, M. Estimating the maize above-ground biomass by constructing the tridimensional concept model based on UAV-based digital and multi-spectral images. Field Crops Res. 2022, 282, 108491. [Google Scholar] [CrossRef]
- Thenkabail, P.S.; Lyon, J.G. Hyperspectral Remote Sensing of Vegetation; CRC Press: Boca Raton, FL, USA, 2016. [Google Scholar]
- Xue, J.; Su, B. Significant remote sensing vegetation indices: A review of developments and applications. J. Sens. 2017, 2017, 1353691. [Google Scholar] [CrossRef]
- Tilly, N.; Aasen, H.; Bareth, G. Fusion of Plant Height and Vegetation Indices for the Estimation of Barley Biomass. Remote Sens. 2015, 7, 11449–11480. [Google Scholar] [CrossRef]
- Ahamed, T.; Tian, L.; Zhang, Y.; Ting, K.C. A review of remote sensing methods for biomass feedstock production. Biomass Bioenergy 2011, 35, 2455–2469. [Google Scholar] [CrossRef]
- Freeman, K.W.; Girma, K.; Arnall, D.B.; Mullen, R.W.; Martin, K.L.; Teal, R.K.; Raun, W.R. By-plant prediction of corn forage biomass and nitrogen uptake at various growth stages using remote sensing and plant height. Agron. J. 2007, 99, 530–536. [Google Scholar] [CrossRef]
- Li, B.; Xu, X.; Zhang, L.; Han, J.; Bian, C.; Li, G.; Liu, J.; Jin, L. Above-ground biomass estimation and yield prediction in potato by using UAV-based RGB and hyperspectral imaging. ISPRS J. Photogramm. Remote Sens. 2020, 162, 161–172. [Google Scholar] [CrossRef]
- Ciurczak, E.W.; Igne, B.; Workman, J., Jr.; Burns, D.A. Handbook of Near-Infrared Analysis; CRC Press: Boca Raton, FL, USA, 2021. [Google Scholar]
- Slaton, M.R.; Hunt, E.R., Jr.; Smith, W.K. Estimating near-infrared leaf reflectance from leaf structural characteristics. Am. J. Bot. 2001, 88, 278–284. [Google Scholar] [CrossRef] [PubMed]
- Gates, D.M.; Keegan, H.J.; Schleter, J.C.; Weidner, V.R. Spectral properties of plants. Appl. Opt. 1965, 4, 11–20. [Google Scholar] [CrossRef]
- Curran, P.J. Remote sensing of foliar chemistry. Remote Sens. Environ. 1989, 30, 271–278. [Google Scholar] [CrossRef]
- Thenkabail, P.S.; Smith, R.B.; De Pauw, E. Hyperspectral vegetation indices and their relationships with agricultural crop characteristics. Remote Sens. Environ. 2000, 71, 158–182. [Google Scholar] [CrossRef]
- Höfle, B. Radiometric correction of terrestrial LiDAR point cloud data for individual maize plant detection. IEEE Geosci. Remote Sens. Lett. 2013, 11, 94–98. [Google Scholar] [CrossRef]
- Wang, L.; Chen, S.; Li, D.; Wang, C.; Jiang, H.; Zheng, Q.; Peng, Z. Estimation of paddy rice nitrogen content and accumulation both at leaf and plant levels from UAV hyperspectral imagery. Remote Sens. 2021, 13, 2956. [Google Scholar] [CrossRef]
- Yin, S.; Zhou, K.; Cao, L.; Shen, X. Estimating the Horizontal and Vertical Distributions of Pigments in Canopies of Ginkgo Plantation Based on UAV-Borne LiDAR, Hyperspectral Data by Coupling PROSAIL Model. Remote Sens. 2022, 14, 715. [Google Scholar] [CrossRef]
- Xu, J.-L.; Gobrecht, A.; Héran, D.; Gorretta, N.; Coque, M.; Gowen, A.A.; Bendoula, R.; Sun, D.-W. A polarized hyperspectral imaging system for in vivo detection: Multiple applications in sunflower leaf analysis. Comput. Electron. Agric. 2019, 158, 258–270. [Google Scholar] [CrossRef]
- Moudrý, V.; Moudrá, L.; Barták, V.; Bejček, V.; Gdulová, K.; Hendrychová, M.; Moravec, D.; Musil, P.; Rocchini, D.; Šťastný, K. The role of the vegetation structure, primary productivity and senescence derived from airborne LiDAR and hyperspectral data for birds diversity and rarity on a restored site. Landsc. Urban Plan. 2021, 210, 104064. [Google Scholar] [CrossRef]
- Neupane, K.; Baysal-Gurel, F. Automatic identification and monitoring of plant diseases using unmanned aerial vehicles: A review. Remote Sens. 2021, 13, 3841. [Google Scholar] [CrossRef]
- Zhang, L.; Niu, Y.; Zhang, H.; Han, W.; Li, G.; Tang, J.; Peng, X. Maize canopy temperature extracted from UAV thermal and RGB imagery and its application in water stress monitoring. Front. Plant Sci. 2019, 10, 1270. [Google Scholar] [CrossRef]
- Hartling, S.; Sagan, V.; Maimaitijiang, M. Urban tree species classification using UAV-based multi-sensor data fusion and machine learning. GIScience Remote Sens. 2021, 58, 1250–1275. [Google Scholar] [CrossRef]
- Li, Y.; Zhang, H.; Shen, Q. Spectral–spatial classification of hyperspectral imagery with 3D convolutional neural network. Remote Sens. 2017, 9, 67. [Google Scholar] [CrossRef]
- He, M.; Li, B.; Chen, H. Multi-scale 3D deep convolutional neural network for hyperspectral image classification. In Proceedings of the 2017 IEEE International Conference on Image Processing (ICIP), Beijing, China, 17–20 September 2017; pp. 3904–3908. [Google Scholar]
- Carrio, A.; Sampedro, C.; Rodriguez-Ramos, A.; Campoy, P. A review of deep learning methods and applications for unmanned aerial vehicles. J. Sens. 2017, 2017, 3296874. [Google Scholar] [CrossRef]
- Osco, L.P.; Junior, J.M.; Ramos, A.P.M.; de Castro Jorge, L.A.; Fatholahi, S.N.; de Andrade Silva, J.; Matsubara, E.T.; Pistori, H.; Gonçalves, W.N.; Li, J. A review on deep learning in UAV remote sensing. Int. J. Appl. Earth Obs. Geoinf. 2021, 102, 102456. [Google Scholar] [CrossRef]
- Feng, L.; Zhang, Z.; Ma, Y.; Sun, Y.; Du, Q.; Williams, P.; Drewry, J.; Luck, B. Multitask Learning of Alfalfa Nutritive Value From UAV-Based Hyperspectral Images. IEEE Geosci. Remote Sens. Lett. 2021, 19, 1–5. [Google Scholar] [CrossRef]
- Sun, X.; Panda, R.; Feris, R.; Saenko, K. Adashare: Learning what to share for efficient deep multi-task learning. Adv. Neural Inf. Process. Syst. 2020, 33, 8728–8740. [Google Scholar]
- Vandenhende, S.; Georgoulis, S.; Van Gansbeke, W.; Proesmans, M.; Dai, D.; Van Gool, L. Multi-task learning for dense prediction tasks: A survey. IEEE Trans. Pattern Anal. Mach. Intell. 2021, 44, 3614–3633. [Google Scholar] [CrossRef] [PubMed]
Phenotypic Traits | Unit | Calculation | Measuring Description |
---|---|---|---|
Cob Biomass | kg/ha | [Cob Biomass (g/plant) × Standing Plants]/Plot Size (hectare) | Average of five plants sampled from the center of the row at the R6 growth stage.
Dry Grain Yield | kg/ha | [Dry Grain Biomass (g/plant) × Standing Plants]/Plot Size (hectare) | Average of five corn ears sampled from the center of the row at the R6 growth stage. Dry grain biomass was normalized to a moisture content of 15.5%.
Dry Stalk Biomass | kg/ha | [Stalk Biomass (g/plant) × Standing Plants]/Plot Size (hectare) | Average of five plants from the center of the row, cut at ground level at the R6 growth stage, weighed, and shredded; a subsample was weighed fresh and dry.
Harvest Index | / | Dry Grain Biomass (g/plant)/[Dry Stalk Biomass (g/plant) + Cob Biomass (g/plant) + Dry Grain Biomass (g/plant)] | /
Grain Density | / | / | Measured with a near-infrared (NIR) spectroscopy Perten DA7200 analyzer (Perten Instruments, Springfield, IL, USA) on kernels sampled from five ears per plot.
Grain Nitrogen Content | kg/ha | [[Grain Protein (%)/6.25] × Dry Grain Biomass (g/plant)]/Plot Size (hectare) | /
Grain Nitrogen Utilization Efficiency (Grain NutE) | / | Dry Grain Biomass (g/plant)/[Stalk N (%) × Stalk Biomass (g/plant) + [Grain Protein (%)/6.25] × Dry Grain Biomass (g/plant)] | Describes how the plant uses the nitrogen it acquires to produce grain: the ratio of dry grain biomass to the total nitrogen content of the plant.
Plant Nitrogen Content | kg/ha | [Stalk N (%) × Stalk Biomass (g/plant) + [Grain Protein (%)/6.25] × Dry Grain Biomass (g/plant)]/Plot Size (hectare) | The amount of nitrogen of all standing plants normalized to their plot area. The total nitrogen of each plant is the sum of the amounts in the stalk and in the grain. Stalk nitrogen content was measured by combustion analysis of dry stover. Grain protein percentage was determined by a lab-based NIR spectrometer and converted to grain nitrogen content using the Jones factor of 6.25 for maize [52].
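As an illustration, the formulas in the table combine into one plot-level computation. This sketch is not from the paper; treating percentages as values such as 1.2 for 1.2% (hence the division by 100) is an assumption about units:

```python
def plot_traits(stalk_g, cob_g, grain_g, stalk_n_pct, grain_protein_pct,
                standing_plants, plot_ha):
    """Plot-level maize traits from per-plant averages, following the table above."""
    to_kg_ha = lambda g_per_plant: g_per_plant * standing_plants / 1000.0 / plot_ha
    grain_n_g = (grain_protein_pct / 100.0) / 6.25 * grain_g  # Jones factor for maize
    stalk_n_g = (stalk_n_pct / 100.0) * stalk_g               # from combustion analysis
    return {
        "stalk_biomass_kg_ha": to_kg_ha(stalk_g),
        "cob_biomass_kg_ha": to_kg_ha(cob_g),
        "grain_yield_kg_ha": to_kg_ha(grain_g),
        "harvest_index": grain_g / (stalk_g + cob_g + grain_g),
        "grain_n_kg_ha": to_kg_ha(grain_n_g),
        "plant_n_kg_ha": to_kg_ha(stalk_n_g + grain_n_g),
        "grain_nute": grain_g / (stalk_n_g + grain_n_g),  # grain biomass per unit plant N
    }
```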
UAV Platform | Data Format | Sensor | Stabilizer | Recorded Information | Spectral Properties | GSD
---|---|---|---|---|---|---
DJI M600 Pro hexacopter (DJI Corporation, Shenzhen, China) | Hyperspectral Imagery | Headwall Nano-Hyperspec | DJI Ronin MX gimbal | 270 VNIR spectral bands | 400–1000 nm with FWHM of 6 nm | 3 cm
 | FLIR Thermal Imagery | FLIR Vue Pro R 640 | / | / | / | /
 | GPS/IMU | Applanix APX-15 | / | / | / | /
DJI M600 Pro hexacopter (DJI Corporation, Shenzhen, China) | LiDAR point cloud | Velodyne HDL-32 | Hard mount | LiDAR point cloud and attributes | / | 900 pts/m2
 | RGB Imagery | Sony A7R II | / | Blue, Green, Red bands | / | 2.4 cm
DJI M600 Pro hexacopter (DJI Corporation, Shenzhen, China) | ICI Thermal Imagery | ICI 8640 P-series | Gremsy T3 gimbal | 1 thermal IR band | 7–14 μm | 8 cm
 | RGB Imagery | Sony RX10 | / | / | / | /
 | Multispectral Imagery | Micasense Altum | Hard mount | 5 spectral bands: Blue, Green, Red, Red-edge, NIR | / | /
Phenotypes | Count | Mean | Std | CV (%) | Min | 25% | 50% | 75% | Max
---|---|---|---|---|---|---|---|---|---|
Dry Stalk Biomass (kg/ha) | 369 | 6510.82 | 2153.74 | 33.1 | 1477 | 5033 | 6315 | 7756 | 22,035 |
Cob Biomass (kg/ha) | 369 | 1470.71 | 498.90 | 33.9 | 415 | 1091 | 1432 | 1822 | 3853 |
Dry Grain Yield (kg/ha) | 369 | 7176.92 | 3300.98 | 46 | 425 | 4282 | 7038 | 9848 | 17,450 |
Harvest Index | 369 | 0.45 | 0.09 | 19.4 | 0.03 | 0.40 | 0.46 | 0.52 | 0.75 |
Grain NutE | 369 | 55.92 | 11.10 | 19.9 | 5 | 50 | 57 | 63 | 77 |
Grain N (kg/ha) | 369 | 91.70 | 49.48 | 53.9 | 9 | 44 | 90 | 136 | 218 |
Total Plant N (kg/ha) | 369 | 135.88 | 70.18 | 51.7 | 26 | 68 | 141 | 198 | 314 |
Grain Density | 369 | 1.27 | 0.038 | 3 | 1.02 | 1.25 | 1.27 | 1.3 | 1.35 |
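A table like the one above can be regenerated from the plot records with pandas; the CSV file name below is hypothetical:

```python
import pandas as pd

df = pd.read_csv("ground_truth_traits.csv")    # hypothetical file of 369 plot records
stats = df.describe().T                        # count, mean, std, min, quartiles, max
stats["cv_pct"] = 100 * stats["std"] / stats["mean"]  # coefficient of variation
print(stats[["count", "mean", "std", "cv_pct", "min", "25%", "50%", "75%", "max"]])
```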
No. | Vegetation Index | Acronym | Equation | References |
---|---|---|---|---|
Hyperspectral-derived metrics | ||||
1 | Anthocyanin (Gitelson) | AntGitelson | AntGitelson = (1/R550 − 1/R700) × R780 | [70]
2 | Chlorophyll Index | CI | CI = (R750 − R705)/(R750 + R705) | [71]
3 | Optimized Soil-Adjusted Vegetation Index | OSAVI | OSAVI = (1 + 0.16) × (R800 − R670)/(R800 + R670 + 0.16) | [72]
4 | Red Green Index | RGI | RGI = R690/R550 | [73]
5 | Structure Intensive Pigment Index | SIPI | SIPI = (R800 − R450)/(R800 + R650) | [74]
6 | Transformed Chlorophyll Absorption in Reflectance Index | TCARI | TCARI = 3 × ((R700 − R670) − 0.2 × (R700 − R550) × (R700/R670)) | [75]
7 | Nitrogen Reflectance Index | NRI | NRI = (R570 − R670)/(R570 + R670) | [76]
8 | Modified Chlorophyll Absorption in Reflectance Index | mCARI | mCARI = 1.2 × (2.5 × (R761 − R651) − 1.3 × (R761 − R581)) | [77]
9 | Photochemical Reflectance Index | PRI | PRI = (R531 − R570)/(R531 + R570) | [78]
10 | Ratio Analysis of Reflectance Spectra Chlorophyll a | RARSa | RARSa = R675/R700 | [79]
11 | Ratio Analysis of Reflectance Spectra Chlorophyll b | RARSb | RARSb = R675/(R700 × R650) | [79]
12 | Ratio Analysis of Reflectance Spectra Carotenoids | RARSc | RARSc = R760/R500 | [79]
13 | Pigment Specific Simple Ratio | PSSR | PSSR = R800/R680 | [80]
14 | Plant Senescence Reflectance Index | PSRI | PSRI = (R660 − R510)/R760 | [81]
15 | Normalized Chlorophyll Pigment Ratio Index | NCPI | NCPI = (R670 − R450)/(R670 + R450) | [74]
16 | Plant Pigment Ratio | PPR | PPR = (R550 − R450)/(R550 + R450) | [82]
17 | Normalized Difference Vegetation Index | NDVI | NDVI = (R860 − R670)/(R860 + R670) | [83]
18 | Greenness Index | GI | GI = R554/R677 | [73]
19 | Green NDVI | GNDVI | GNDVI = (R750 − R540 + R570)/(R750 + R540 − R570) | [84]
20 | Simple Ratio | SR | SR = R900/R680 | [85]
21 | Red-Edge NDVI | RNDVI | RNDVI = (R750 − R705)/(R750 + R705) | [86]
22 | Modified Triangular Vegetation Index | MTVI | MTVI = 1.2 × (1.2 × (R800 − R550) − 2.5 × (R670 − R550)) | [77]
23 | Triangular Vegetation Index | TVI | TVI = 0.5 × (120 × (R761 − R581) − 200 × (R651 − R581)) | [87]
24 | Fluorescence Ratio Index 1 | FRI1 | FRI1 = R690/R630 | [88]
25 | Fluorescence Ratio Index 2 | FRI2 | FRI2 = R750/R800 | [89]
26 | Fluorescence Ratio Index 3 | FRI3 | FRI3 = R690/R600 | [90]
27 | Fluorescence Ratio Index 4 | FRI4 | FRI4 = R740/R800 | [90]
28 | Fluorescence Curvature Index | FCI | FCI = R683² /(R675 × R691) | [88]
29 | Modified Red-Edge Simple Ratio Index | mRESR | mRESR = (R750 − R445)/(R705 + R445) | [91]
30 | Normalized Phaeophytinization Index | NPQI | NPQI = (R415 − R435)/(R415 + R435) | [92]
31 | Red-Edge Vegetation Stress Index 1 | RVS1 | RVS1 = ((R651 + R750)/2) − R733 | [93]
32 | Red-Edge Vegetation Stress Index 2 | RVS2 | RVS2 = ((R651 + R750)/2) − R751 | [93]
33 | Water Index | WI | WI = R900/R970 | [94]
34 | Water Stress and Canopy Temperature | WSCT | WSCT = (R970 − R850)/(R970 + R850) | [95]
LiDAR-derived canopy height metrics | ||||
1 | Maximum of canopy height | Hmax | ||
2 | Minimum of canopy height | Hmin | ||
3 | Mean of canopy height | Hmean | ||
4 | Mode of canopy height | Hmode | ||
5 | Standard deviation of canopy height | Hsd | ||
6 | Coefficient of variation of canopy height | Hcv | ||
7 | Median absolute deviation of canopy height | Hmad | Hmad = 1.4826 × median(|height − Hmedian|) |
8 | Average absolute deviation of canopy height | Haad | Haad = mean(|height − Hmean|) |
9–20 | Percentile of canopy height | Hper | H10, H20, H30, H40, H50, H60, H70, H80, H90, H95, H98, H99 | |
21 | The Interquartile Range (iqr) of canopy height | Hiqr | Hiqr = H75 − H25 | |
22 | Skewness of canopy height | Hskn | ||
23 | Kurtosis of canopy height | Hkurt | ||
24–28 | Canopy return density of height | Hcrd | The proportion of points above the height quantiles (10th, 30th, 50th, 70th, and 90th) to the total number of points: Hd10, Hd30, Hd50, Hd70, Hd90 | |
29 | Canopy relief ratio of height | Hcrr | Hcrr = (Hmean − Hmin)/(Hmax − Hmin) |
30 | Canopy-to-ground return ratio of height | Hcg | The ratio of canopy returns to ground returns of height |
LiDAR-derived canopy intensity metrics | ||||
1 | Maximum of canopy intensity | Imax | ||
2 | Minimum of canopy intensity | Imin | ||
3 | Mean of canopy intensity | Imean | ||
4 | Mode of canopy intensity | Imode | ||
5 | Standard deviation of canopy intensity | Isd | ||
6 | Coefficient of variation of canopy intensity | Icv | ||
7 | Median absolute deviation of canopy intensity | Imad | Imad = 1.4826 × median(|intensity − Imedian|) |
8 | Average absolute deviation of canopy intensity | Iaad | Iaad = mean(|intensity − Imean|) |
9–20 | Percentile of canopy intensity | Iper | I10, I20, I30, I40, I50, I60, I70, I80, I90, I95, I98, I99 | |
21 | The interquartile range (IQR) of canopy intensity | Iiqr | Iiqr = I75 − I25 |
22 | Skewness of canopy intensity | Iskn | ||
23 | Kurtosis of canopy intensity | Ikurt | ||
24–28 | Canopy return density of intensity | Icrd | The proportion of points above the intensity quantiles (10th, 30th, 50th, 70th, and 90th) to the total number of points: Id10, Id30, Id50, Id70, Id90 | |
29 | Canopy relief ratio of intensity | Icrr | Icrr = (Imean − Imin)/(Imax − Imin) |
30 | Canopy-to-ground return ratio of intensity | Icg | The ratio of canopy returns to ground returns of intensity |
Thermal-derived metric | ||||
1 | Normalized relative canopy temperature index | Tir | Tir = (Ti − Tmin)/(Tmax − Tmin) | [96]
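As a usage illustration, the spectral indices above reduce to reflectance lookups at fixed wavelengths, and the LiDAR metrics to order statistics over a plot's points. The sketch below runs on synthetic stand-in data and is not code from the study:

```python
import numpy as np

def R(spectrum, wavelengths, nm):
    """Reflectance at the band nearest the requested wavelength (nm)."""
    return spectrum[np.argmin(np.abs(wavelengths - nm))]

def ndsi(spectrum, wavelengths, a_nm, b_nm):
    """Generic normalized difference spectral index (Ra - Rb)/(Ra + Rb)."""
    ra, rb = R(spectrum, wavelengths, a_nm), R(spectrum, wavelengths, b_nm)
    return (ra - rb) / (ra + rb)

wl = np.linspace(400, 1000, 270)          # 270 VNIR bands, as flown
refl = np.random.rand(270)                # stand-in for one plot's mean spectrum

ndvi = ndsi(refl, wl, 860, 670)           # index no. 17 in the table
pri = ndsi(refl, wl, 531, 570)            # index no. 9
wi = R(refl, wl, 900) / R(refl, wl, 970)  # index no. 33

heights = np.random.gamma(2.0, 1.0, 5000) # stand-in canopy height points
h_perc = np.percentile(heights, [10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 98, 99])
h_crr = (heights.mean() - heights.min()) / (heights.max() - heights.min())
```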