Influence of Spatial Resolution for Vegetation Indices’ Extraction Using Visible Bands from Unmanned Aerial Vehicles’ Orthomosaics Datasets
Abstract
1. Introduction
Related Works
2. Materials and Methods
2.1. Acquired Datasets
2.2. Photogrammetric Processing
2.3. Empirical Line Method
2.4. Vegetation Indices
- Normalized green–red difference index (NGRDI) [45]
- Green leaf index (GLI) [46]
- Visible atmospherically resistant index (VARI) [47]
- Triangular greenness index (TGI) [48]
- Red–green ratio index (IRG) [49]
- Red–green–blue vegetation index (RGBVI) [50]
- Red–green ratio index (RGRI) [51]
- Modified green–red vegetation index (MGRVI) [50]
- Excess green index (ExG) [52]
- Colour index of vegetation extraction (CIVE) [53]
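All ten indices above can be computed directly from the red, green, and blue bands of an orthomosaic. The sketch below is illustrative, not the authors' implementation: the function name, the ε division guard, and the band variable names are my additions, and the formulas follow the commonly cited definitions, which should be checked against the original references [45–53] before use.

```python
import numpy as np

def rgb_indices(r, g, b):
    """Visible-band vegetation indices from reflectance arrays r, g, b.

    Formulas follow widely cited definitions; e.g. TGI uses band centres
    of roughly 670/550/480 nm, and ExG is sometimes computed on
    chromatic coordinates (r+g+b normalised to 1) instead of raw bands.
    """
    eps = 1e-12  # guard against division by zero on dark pixels
    return {
        "NGRDI": (g - r) / (g + r + eps),
        "GLI":   (2*g - r - b) / (2*g + r + b + eps),
        "VARI":  (g - r) / (g + r - b + eps),
        "TGI":   -0.5 * (190*(r - g) - 120*(r - b)),
        "IRG":   r - g,            # red-green difference form
        "RGBVI": (g**2 - r*b) / (g**2 + r*b + eps),
        "RGRI":  r / (g + eps),    # red-green ratio form
        "MGRVI": (g**2 - r**2) / (g**2 + r**2 + eps),
        "ExG":   2*g - r - b,
        "CIVE":  0.441*r - 0.811*g + 0.385*b + 18.78745,
    }

# Example: a single bright-green pixel
r = np.array([0.25]); g = np.array([0.50]); b = np.array([0.25])
idx = rgb_indices(r, g, b)
print({k: float(v[0]) for k, v in idx.items()})
```

Applied per pixel over a whole orthomosaic, each expression yields a single-band index raster of the same shape as the input bands.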
2.5. Classification Algorithm Feedback
3. Results
3.1. Radiometric Calibration of the Raw Orthophotos
3.2. Vegetation Indices
3.3. Statistics
3.4. Supervised Classification Responses
4. Discussion
5. Conclusions
- The performance of each index varied across case studies, as already observed in other works. Estimating the general performance of the indices therefore requires a broad case history covering as many contexts as possible.
- The TGI index returns highly significant and functional values in terms of separability between vegetated and non-vegetated areas, but it outperforms the reference NGRDI index only in regular contexts without ambiguous areas. The IRG index, by contrast, performs consistently across all scenarios, though with moderate performance.
- The highest orthomosaic resolution was rarely optimal for vegetation indices across the case studies: reducing the resolution smooths out per-pixel noise and improves the radiometric information. Indeed, the classification algorithms gave optimal results at the {3} resolution level, demonstrating that very high-resolution datasets do not always guarantee more precise results.
- Masking strongly ambiguous areas, such as those containing water, improves their interpretability by the indices and increases index performance.
- In areas with dense vegetation, the reduced ability of SfM-MVS techniques to establish and triangulate unambiguous tie points produces artefacts or evident distortions that compromise the VIs' ability to extract correct information.
- Judging by the average performance of the RF classification algorithms in each case analysed, RGB orthomosaics can be considered a valid source for generic vegetation extraction.
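The third conclusion, that coarser resolutions smooth per-pixel noise, can be illustrated with a block-averaging resample. This is a minimal sketch under my own assumptions (synthetic constant-reflectance band with Gaussian noise; the function name and window logic are not from the paper, which exported coarser orthomosaics photogrammetrically):

```python
import numpy as np

def block_resample(band, factor):
    """Downsample a 2-D band by averaging factor x factor windows,
    mimicking a coarser-resolution export of the same scene."""
    h, w = band.shape
    h2, w2 = h - h % factor, w - w % factor  # crop to a multiple of factor
    blocks = band[:h2, :w2].reshape(h2 // factor, factor, w2 // factor, factor)
    return blocks.mean(axis=(1, 3))

rng = np.random.default_rng(0)
# Synthetic band: constant reflectance 0.4 plus sensor noise
noisy = 0.4 + 0.05 * rng.standard_normal((300, 300))
coarse = block_resample(noisy, 3)

# Averaging shrinks per-pixel noise roughly by 1/factor
print(noisy.std(), coarse.std())
```

On a real orthomosaic the signal is not constant, so coarsening also discards spatial detail; the paper's finding is that the {3} level balanced this trade-off best.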
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
1. Carrera-Hernández, J.; Levresse, G.; Lacan, P. Is UAV-SfM surveying ready to replace traditional surveying techniques? Int. J. Remote Sens. 2020, 41, 4820–4837.
2. Eltner, A.; Sofia, G. Chapter 1—Structure from motion photogrammetric technique. In Developments in Earth Surface Processes; Tarolli, P., Mudd, S.M., Eds.; Elsevier: Amsterdam, The Netherlands, 2020; Volume 23, pp. 1–24.
3. Cummings, A.R.; McKee, A.; Kulkarni, K.; Markandey, N. The rise of UAVs. Photogramm. Eng. Remote Sens. 2017, 83, 317–325.
4. Wang, C.; Myint, S.W. A Simplified Empirical Line Method of Radiometric Calibration for Small Unmanned Aircraft Systems-Based Remote Sensing. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2015, 8, 1876–1885.
5. Tmušić, G.; Manfreda, S.; Aasen, H.; James, M.R.; Gonçalves, G.; Ben-Dor, E.; Brook, A.; Polinova, M.; Arranz, J.J.; Mészáros, J.; et al. Current Practices in UAS-based Environmental Monitoring. Remote Sens. 2020, 12, 1001.
6. Nhamo, L.; Magidi, J.; Nyamugama, A.; Clulow, A.D.; Sibanda, M.; Chimonyo, V.G.P.; Mabhaudhi, T. Prospects of Improving Agricultural and Water Productivity through Unmanned Aerial Vehicles. Agriculture 2020, 10, 256.
7. Rahman, M.F.F.; Fan, S.; Zhang, Y.; Chen, L. A Comparative Study on Application of Unmanned Aerial Vehicle Systems in Agriculture. Agriculture 2021, 11, 22.
8. Nettis, A.; Saponaro, M.; Nanna, M. RPAS-Based Framework for Simplified Seismic Risk Assessment of Italian RC-Bridges. Buildings 2020, 10, 150.
9. Saponaro, M.; Capolupo, A.; Caporusso, G.; Borgogno Mondino, E.; Tarantino, E. Predicting the Accuracy of Photogrammetric 3d Reconstruction from Camera Calibration Parameters Through a Multivariate Statistical Approach. In Proceedings of the XXIV ISPRS Congress, Nice, France, 4–10 July 2020; pp. 479–486.
10. Zhou, Y.; Rupnik, E.; Meynard, C.; Thom, C.; Pierrot-Deseilligny, M. Simulation and Analysis of Photogrammetric UAV Image Blocks—Influence of Camera Calibration Error. Remote Sens. 2020, 12, 22.
11. Yu, J.J.; Kim, D.W.; Lee, E.J.; Son, S.W. Determining the Optimal Number of Ground Control Points for Varying Study Sites through Accuracy Evaluation of Unmanned Aerial System-Based 3D Point Clouds and Digital Surface Models. Drones 2020, 4, 49.
12. Padró, J.-C.; Muñoz, F.-J.; Planas, J.; Pons, X. Comparison of four UAV georeferencing methods for environmental monitoring purposes focusing on the combined use with airborne and satellite remote sensing platforms. Int. J. Appl. Earth Obs. Geoinf. 2019, 75, 130–140.
13. Ludwig, M.; Runge, C.M.; Friess, N.; Koch, T.L.; Richter, S.; Seyfried, S.; Wraase, L.; Lobo, A.; Sebastià, M.-T.; Reudenbach, C.; et al. Quality Assessment of Photogrammetric Methods—A Workflow for Reproducible UAS Orthomosaics. Remote Sens. 2020, 12, 3831.
14. Saponaro, M.; Turso, A.; Tarantino, E. Parallel Development of Comparable Photogrammetric Workflows Based on UAV Data Inside SW Platforms. In International Conference on Computational Science and Its Applications; Springer: Cham, Switzerland, 2020; pp. 693–708.
15. Lima-Cueto, F.J.; Blanco-Sepúlveda, R.; Gómez-Moreno, M.L.; Galacho-Jiménez, F.B. Using Vegetation Indices and a UAV Imaging Platform to Quantify the Density of Vegetation Ground Cover in Olive Groves (Olea Europaea L.) in Southern Spain. Remote Sens. 2019, 11, 2564.
16. Mesas-Carrascosa, F.-J.; de Castro, A.I.; Torres-Sánchez, J.; Triviño-Tarradas, P.; Jiménez-Brenes, F.M.; García-Ferrer, A.; López-Granados, F. Classification of 3D Point Clouds Using Color Vegetation Indices for Precision Viticulture and Digitizing Applications. Remote Sens. 2020, 12, 317.
17. Ocampo, A.L.P.D.; Bandala, A.A.; Dadios, E.P. Estimation of Triangular Greenness Index for Unknown Peak Wavelength Sensitivity of CMOS-acquired Crop Images. In Proceedings of the 2019 IEEE 11th International Conference on Humanoid, Nanotechnology, Information Technology, Communication and Control, Environment, and Management (HNICEM), Laoag, Philippines, 29 November–1 December 2019; pp. 1–5.
18. Jiang, J.; Cai, W.; Zheng, H.; Cheng, T.; Tian, Y.; Zhu, Y.; Ehsani, R.; Hu, Y.; Niu, Q.; Gui, L.; et al. Using Digital Cameras on an Unmanned Aerial Vehicle to Derive Optimum Color Vegetation Indices for Leaf Nitrogen Concentration Monitoring in Winter Wheat. Remote Sens. 2019, 11, 2667.
19. Hunt, E.R.; Doraiswamy, P.C.; McMurtrey, J.E.; Daughtry, C.S.T.; Perry, E.M.; Akhmedov, B. A visible band index for remote sensing leaf chlorophyll content at the canopy scale. Int. J. Appl. Earth Obs. Geoinf. 2013, 21, 103–112.
20. Agapiou, A. Optimal Spatial Resolution for the Detection and Discrimination of Archaeological Proxies in Areas with Spectral Heterogeneity. Remote Sens. 2020, 12, 136.
21. Niederheiser, R.; Winkler, M.; Di Cecco, V.; Erschbamer, B.; Fernández, R.; Geitner, C.; Hofbauer, H.; Kalaitzidis, C.; Klingraber, B.; Lamprecht, A.; et al. Using automated vegetation cover estimation from close-range photogrammetric point clouds to compare vegetation location properties in mountain terrain. GIScience Remote Sens. 2021, 58, 120–137.
22. Räsänen, A.; Virtanen, T. Data and resolution requirements in mapping vegetation in spatially heterogeneous landscapes. Remote Sens. Environ. 2019, 230, 111207.
23. Kwak, G.-H.; Park, N.-W. Impact of Texture Information on Crop Classification with Machine Learning and UAV Images. Appl. Sci. 2019, 9, 643.
24. Candiago, S.; Remondino, F.; De Giglio, M.; Dubbini, M.; Gattelli, M. Evaluating Multispectral Images and Vegetation Indices for Precision Farming Applications from UAV Images. Remote Sens. 2015, 7, 4026–4047.
25. Kwan, C.; Gribben, D.; Ayhan, B.; Li, J.; Bernabe, S.; Plaza, A. An Accurate Vegetation and Non-Vegetation Differentiation Approach Based on Land Cover Classification. Remote Sens. 2020, 12, 3880.
26. Dash, J.P.; Pearse, G.D.; Watt, M.S. UAV Multispectral Imagery Can Complement Satellite Data for Monitoring Forest Health. Remote Sens. 2018, 10, 1216.
27. Pamart, A.; Guillon, O.; Faraci, S.; Gattet, E.; Genevois, M.; Vallet, J.M.; De Luca, L. Multispectral Photogrammetric Data Acquisition and Processing for Wall Paintings Studies. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, XLII-2/W3, 559–566.
28. Wan, L.; Li, Y.; Cen, H.; Zhu, J.; Yin, W.; Wu, W.; Zhu, H.; Sun, D.; Zhou, W.; He, Y. Combining UAV-Based Vegetation Indices and Image Classification to Estimate Flower Number in Oilseed Rape. Remote Sens. 2018, 10, 1484.
29. Agapiou, A. Vegetation Extraction Using Visible-Bands from Openly Licensed Unmanned Aerial Vehicle Imagery. Drones 2020, 4, 27.
30. Costa, L.; Nunes, L.; Ampatzidis, Y. A new visible band index (vNDVI) for estimating NDVI values on RGB images utilizing genetic algorithms. Comput. Electron. Agric. 2020, 172, 105334.
31. Zhang, X.; Zhang, F.; Qi, Y.; Deng, L.; Wang, X.; Yang, S. New research methods for vegetation information extraction based on visible light remote sensing images from an unmanned aerial vehicle (UAV). Int. J. Appl. Earth Obs. Geoinf. 2019, 78, 215–226.
32. Fuentes-Peailillo, F.; Ortega-Farias, S.; Rivera, M.; Bardeen, M.; Moreno, M. Comparison of vegetation indices acquired from RGB and Multispectral sensors placed on UAV. In Proceedings of the 2018 IEEE International Conference on Automation/XXIII Congress of the Chilean Association of Automatic Control (ICA-ACCA), Concepción, Chile, 17–19 October 2018; pp. 1–6.
33. Haghighattalab, A.; González Pérez, L.; Mondal, S.; Singh, D.; Schinstock, D.; Rutkoski, J.; Ortiz-Monasterio, I.; Singh, R.P.; Goodin, D.; Poland, J. Application of unmanned aerial systems for high throughput phenotyping of large wheat breeding nurseries. Plant Methods 2016, 12, 35.
34. Pompilio, L.; Marinangeli, L.; Amitrano, L.; Pacci, G.; D’andrea, S.; Iacullo, S.; Monaco, E. Application of the empirical line method (ELM) to calibrate the airborne Daedalus-CZCS scanner. Eur. J. Remote Sens. 2018, 51, 33–46.
35. Logie, G.S.; Coburn, C.A. An investigation of the spectral and radiometric characteristics of low-cost digital cameras for use in UAV remote sensing. Int. J. Remote Sens. 2018, 39, 4891–4909.
36. Dainelli, R.; Toscano, P.; Di Gennaro, S.F.; Matese, A. Recent Advances in Unmanned Aerial Vehicle Forest Remote Sensing—A Systematic Review. Part I: A General Framework. Forests 2021, 12, 327.
37. Olsson, P.-O.; Vivekar, A.; Adler, K.; Garcia Millan, V.E.; Koc, A.; Alamrani, M.; Eklundh, L. Radiometric Correction of Multispectral UAS Images: Evaluating the Accuracy of the Parrot Sequoia Camera and Sunshine Sensor. Remote Sens. 2021, 13, 577.
38. Mafanya, M.; Tsele, P.; Botai, J.O.; Manyama, P.; Chirima, G.J.; Monate, T. Radiometric calibration framework for ultra-high-resolution UAV-derived orthomosaics for large-scale mapping of invasive alien plants in semi-arid woodlands: Harrisia pomanensis as a case study. Int. J. Remote Sens. 2018, 39, 5119–5140.
39. Smith, G.M.; Milton, E.J. The use of the empirical line method to calibrate remotely sensed data to reflectance. Int. J. Remote Sens. 1999, 20, 2653–2662.
40. Capolupo, A.; Saponaro, M.; Borgogno Mondino, E.; Tarantino, E. Combining Interior Orientation Variables to Predict the Accuracy of RPAS–SfM 3D Models. Remote Sens. 2020, 12, 2674.
41. James, M.R.; Chandler, J.H.; Eltner, A.; Fraser, C.; Miller, P.E.; Mills, J.P.; Noble, T.; Robson, S.; Lane, S.N. Guidelines on the use of structure-from-motion photogrammetry in geomorphic research. Earth Surf. Process. Landf. 2019, 44, 2081–2084.
42. Saponaro, M.; Capolupo, A.; Tarantino, E.; Fratino, U. Comparative Analysis of Different UAV-Based Photogrammetric Processes to Improve Product Accuracies. In International Conference on Computational Science and Its Applications; Springer: Cham, Switzerland, 2019; pp. 225–238.
43. Smith, M.W.; Carrivick, J.; Quincey, D. Structure from motion photogrammetry in physical geography. Prog. Phys. Geogr. 2016, 40, 247–275.
44. QGIS Development Team. QGIS Geographic Information System. Available online: https://www.qgis.org (accessed on 28 March 2021).
45. Tucker, C.J. Red and photographic infrared linear combinations for monitoring vegetation. Remote Sens. Environ. 1979, 8, 127–150.
46. Louhaichi, M.; Borman, M.M.; Johnson, D.E. Spatially Located Platform and Aerial Photography for Documentation of Grazing Impacts on Wheat. Geocarto Int. 2001, 16, 65–70.
47. Gitelson, A.A.; Kaufman, Y.J.; Stark, R.; Rundquist, D. Novel algorithms for remote estimation of vegetation fraction. Remote Sens. Environ. 2002, 80, 76–87.
48. Hunt, E.; Daughtry, C.; Eitel, J.; Long, D. Remote Sensing Leaf Chlorophyll Content Using a Visible Band Index. Agron. J. 2011, 103, 1090.
49. Gamon, J.A.; Surfus, J.S. Assessing leaf pigment content and activity with a reflectometer. New Phytol. 1999, 143, 105–117.
50. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87.
51. Verrelst, J.; Schaepman, M.E.; Koetz, B.; Kneubühler, M. Angular sensitivity analysis of vegetation indices derived from CHRIS/PROBA data. Remote Sens. Environ. 2008, 112, 2341–2353.
52. Woebbecke, D.M.; Meyer, G.E.; Von Bargen, K.; Mortensen, D.A. Color Indices for Weed Identification Under Various Soil, Residue, and Lighting Conditions. Trans. ASAE 1995, 38, 259–269.
53. Kataoka, T.; Kaneko, T.; Okamoto, H.; Hata, S. Crop growth estimation system using machine vision. In 2003 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM 2003); International Conference Center: Kobe, Japan, 2003; Volume 2, pp. b1079–b1083.
54. Kpienbaareh, D.; Kansanga, M.; Luginaah, I. Examining the potential of open source remote sensing for building effective decision support systems for precision agriculture in resource-poor settings. GeoJournal 2019, 84, 1481–1497.
55. Puliti, S.; Saarela, S.; Gobakken, T.; Ståhl, G.; Næsset, E. Combining UAV and Sentinel-2 auxiliary data for forest growing stock volume estimation through hierarchical model-based inference. Remote Sens. Environ. 2018, 204, 485–497.
56. Zuhlke, M.; Fomferra, N.; Brockmann, C.; Peters, M.; Veci, L.; Malik, J.; Regner, P. SNAP (sentinel application platform) and the ESA sentinel 3 toolbox. In Proceedings of the Sentinel-3 for Science Workshop, Venice, Italy, 2–5 June 2015; p. 21.
57. Breiman, L. Random forests. Mach. Learn. 2001, 45, 5–32.
58. Feng, Q.; Liu, J.; Gong, J. UAV Remote Sensing for Urban Vegetation Mapping Using Random Forest and Texture Analysis. Remote Sens. 2015, 7, 1074–1094.
59. De Castro, A.I.; Torres-Sánchez, J.; Peña, J.M.; Jiménez-Brenes, F.M.; Csillik, O.; López-Granados, F. An Automatic Random Forest-OBIA Algorithm for Early Weed Mapping between and within Crop Rows Using UAV Imagery. Remote Sens. 2018, 10, 285.
60. Zeybek, M. Classification of UAV point clouds by random forest machine learning algorithm. Turk. J. Eng. 2021, 5, 51–61.
61. Fernandez-Carrillo, A.; Franco-Nieto, A.; Pinto-Bañuls, E.; Basarte-Mena, M.; Revilla-Romero, B. Designing a Validation Protocol for Remote Sensing Based Operational Forest Masks Applications. Comparison of Products Across Europe. Remote Sens. 2020, 12, 3159.
62. Song, Q.; Jiang, H.; Liu, J. Feature selection based on FDA and F-score for multi-class classification. Expert Syst. Appl. 2017, 81, 22–27.
| | Case Study (a) | Case Study (b) | Case Study (c) |
|---|---|---|---|
| Location | Fasoula (EL), Cyprus | Grottole (MT), Italy | Bari (BA), Italy |
| Equipment | DJI Phantom 4 Pro RTK, RGB, f-8.8, Model FC6310S | DJI Mavic 2 Zoom, RGB, f-4.386, Model FC2204 | DJI Inspire 1 v.2 ZenMuse X3, RGB, f-3.61, Model FC350 |
| Images | 174 images (5472 × 3648 pix) | 287 images (4000 × 3000 pix) | 87 images (4000 × 3000 pix) |
| AGL/GSD | 50 m / 9.7 mm/pix | 30 m / 1.3 cm/pix | 90 m / 3.9 cm/pix |
| Georeferencing Strategy | DG with RTK on-board | IG with 11 GCPs in nRTK | DG with low-cost GNSS receiver |
| REFERENCE SETTINGS | |
|---|---|
| Coordinate System | (a) WGS84 (EPSG:4326); (b) WGS84 (EPSG:4326); (c) WGS 84/UTM zone 33N (EPSG:32633) |
| Camera Positioning Accuracy | (a) 0.1 m; (b) 2 m; (c) 3 m |
| Camera Accuracy, Attitude | 10 deg |
| Marker Accuracy (Object Space) | 0.02 m |
| Marker Accuracy (Image Space) | 0.5 pixel |
| PROCESSES PLANNED | |
| Estimate Image Quality | (a) [max, min]: 0.911702, 0.826775; (b) [max, min]: 0.911062, 0.802699; (c) [max, min]: 0.871442, 0.808870 |
| Align Cameras | Accuracy: High; Generic Preselection: Yes; Reference Preselection: Yes; Key Point Limit: 0; Tie Point Limit: 0; Adaptive Camera Model Fitting: No |
| Gradual Selection | Reconstruction Uncertainty: 10; Projection Accuracy: 3; Reprojection Error: 0.4 |
| Optimise Cameras | K3, K4, P3, P4: No |
| Build Dense Cloud | Quality: High; Depth Filtering: Aggressive |
| Build DEM | Source Data: Dense Cloud; Interpolation: Enabled |
| Build Orthomosaic | Blending Mode: Mosaic; Surface: DEM; Enable Hole Filling: Yes |
| Orthomosaic [m/pix] | Case Study (a) | Case Study (b) | Case Study (c) |
|---|---|---|---|
| {1} min.res. | 0.010 | 0.017 | 0.036 |
| {2} min.res. ×2 | 0.019 | 0.035 | 0.071 |
| {3} min.res. ×3 | 0.029 | 0.052 | 0.107 |
| {4} min.res. ×4 | 0.039 | 0.070 | 0.142 |
| [%] | | CIVE | ExG | MGRVI | RGRI | RGBVI | IRG | TGI | VARI | GLI | NGRDI |
|---|---|---|---|---|---|---|---|---|---|---|---|
| {1} | (a) | NA | 21.7 | 12.1 | −6.0 | 19.7 | −42.4 | −564.1 | NA | 22.5 | NA |
| | (b) | −0.3 | 40.0 | NA | NA | NA | −63.1 | −675.3 | NA | NA | NA |
| | (c) | −0.1 | 16.3 | NA | NA | NA | NA | 304.7 | NA | NA | NA |
| | (cmask) | −0.5 | 47.3 | 21.3 | −18.5 | 36.8 | −207.2 | 304.9 | NA | 32.6 | NA |
| {2} | (a) | −0.1 | 26.7 | 14.2 | −6.2 | 26.6 | −39.4 | −482.1 | NA | 24.7 | 13.7 |
| | (b) | −0.3 | 46.3 | NA | NA | NA | NA | 3123.2 | NA | NA | NA |
| | (c) | −0.1 | 15.6 | NA | NA | NA | NA | 137.1 | NA | NA | −12.3 |
| | (cmask) | −0.5 | 44.0 | 23.3 | −19.6 | 41.5 | −316.1 | 282.0 | NA | 38.3 | NA |
| {3} | (a) | −0.1 | 27.9 | 14.7 | −5.4 | 29.8 | −40.1 | −350.0 | NA | 28.3 | 14.1 |
| | (b) | −0.3 | 43.7 | NA | NA | 26.3 | −101.5 | 2201.4 | NA | 25.9 | NA |
| | (c) | −0.1 | 14.4 | NA | NA | NA | NA | 6337.3 | NA | NA | −13.8 |
| | (cmask) | −0.6 | 45.5 | 21.8 | −19.6 | 34.5 | −310.4 | 326.7 | NA | 27.9 | NA |
| {4} | (a) | −0.1 | 31.4 | 11.8 | −7.1 | 25.2 | −46.7 | −654.8 | NA | 23.9 | NA |
| | (b) | −0.3 | 48.9 | 17.4 | NA | 32.7 | −100.7 | 531.1 | 5.7 | NA | NA |
| | (c) | −0.1 | 14.1 | NA | NA | NA | NA | 398.5 | NA | NA | NA |
| | (cmask) | −0.6 | 47.2 | 26.1 | −17.9 | 48.4 | −190.4 | 595.3 | NA | 47.9 | NA |
| | | RGB Bands | | | +TGI Band | | | +IRG Band | | |
|---|---|---|---|---|---|---|---|---|---|---|
| | | V | A | B | V | A | B | V | A | B |
| (a) | {1} | 0.77 | 0.67 | 0.60 | 0.87 | 0.91 | 0.74 | 0.70 | 0.54 | 0.45 |
| | mean | 0.68 | | | 0.84 | | | 0.56 | | |
| | {2} | 0.72 | 0.72 | 0.72 | 0.85 | 0.95 | 0.76 | 0.68 | 0.64 | 0.50 |
| | mean | 0.72 | | | 0.85 | | | 0.61 | | |
| | {3} | 0.70 | 0.65 | 0.54 | 0.78 | 0.81 | 0.57 | 0.70 | 0.65 | 0.54 |
| | mean | 0.63 | | | 0.72 | | | 0.63 | | |
| | {4} | 0.73 | 0.82 | 0.50 | 0.87 | 0.92 | 0.73 | 0.67 | 0.78 | 0.40 |
| | mean | 0.68 | | | 0.84 | | | 0.62 | | |
| (b) | {1} | 0.85 | 0.78 | 0.75 | 0.95 | 0.88 | 0.82 | 0.90 | 0.79 | 0.81 |
| | mean | 0.79 | | | 0.88 | | | 0.83 | | |
| | {2} | 0.94 | 0.70 | 0.62 | 0.98 | 0.81 | 0.85 | 0.87 | 0.73 | 0.82 |
| | mean | 0.75 | | | 0.88 | | | 0.81 | | |
| | {3} | 0.95 | 0.94 | 0.87 | 0.85 | 0.96 | 0.75 | 0.98 | 0.79 | 0.79 |
| | mean | 0.92 | | | 0.85 | | | 0.85 | | |
| | {4} | 0.94 | 0.86 | 0.85 | 0.98 | 0.92 | 0.95 | 0.98 | 0.97 | 0.95 |
| | mean | 0.88 | | | 0.95 | | | 0.97 | | |
| (c) | {1} | 0.79 | 0.42 | 0.65 | 0.82 | 0.51 | 0.68 | 0.77 | 0.33 | 0.73 |
| | mean | 0.62 | | | 0.67 | | | 0.61 | | |
| | {2} | 0.80 | 0.86 | 0.79 | 0.87 | 0.84 | 0.79 | 0.79 | 0.67 | 0.81 |
| | mean | 0.82 | | | 0.83 | | | 0.76 | | |
| | {3} | 0.87 | 0.84 | 0.87 | 0.93 | 0.70 | 0.76 | 0.84 | 0.60 | 0.81 |
| | mean | 0.86 | | | 0.80 | | | 0.75 | | |
| | {4} | 0.79 | 0.83 | 0.81 | 0.77 | 0.72 | 0.83 | 0.82 | 0.75 | 0.95 |
| | mean | 0.81 | | | 0.77 | | | 0.84 | | |
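The table above reports class-wise scores for RF classifiers fed with the RGB bands alone or with TGI or IRG stacked as an extra band. A minimal sketch of that band-stacking approach is given below; it is my own illustration, not the paper's pipeline: the synthetic pixel samples, class means, and scikit-learn usage are all assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

# Synthetic "pixels": vegetation greener, bare surfaces redder (illustrative only)
rng = np.random.default_rng(1)
n = 2000
veg = np.column_stack([rng.normal(0.20, 0.05, n),   # R
                       rng.normal(0.45, 0.05, n),   # G
                       rng.normal(0.15, 0.05, n)])  # B
soil = np.column_stack([rng.normal(0.45, 0.05, n),
                        rng.normal(0.35, 0.05, n),
                        rng.normal(0.25, 0.05, n)])
X = np.vstack([veg, soil])
y = np.array([1] * n + [0] * n)  # 1 = vegetation, 0 = non-vegetation

# Stack TGI as an extra feature band, as in the "+TGI Band" runs
r, g, b = X[:, 0], X[:, 1], X[:, 2]
tgi = -0.5 * (190 * (r - g) - 120 * (r - b))
X_tgi = np.column_stack([X, tgi])

Xtr, Xte, ytr, yte = train_test_split(X_tgi, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
print("vegetation F-score:", f1_score(yte, clf.predict(Xte)))
```

On a real orthomosaic the same stacking is done per pixel (reshaping the H × W × bands raster to an N × bands matrix), with training samples drawn from labelled polygons.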
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Saponaro, M.; Agapiou, A.; Hadjimitsis, D.G.; Tarantino, E. Influence of Spatial Resolution for Vegetation Indices’ Extraction Using Visible Bands from Unmanned Aerial Vehicles’ Orthomosaics Datasets. Remote Sens. 2021, 13, 3238. https://doi.org/10.3390/rs13163238