Article

Positional Precision Analysis of Orthomosaics Derived from Drone Captured Aerial Imagery

Arthur Temple College of Forestry and Agriculture, Stephen F. Austin State University, Nacogdoches, TX 75962, USA
* Author to whom correspondence should be addressed.
Drones 2019, 3(2), 46; https://doi.org/10.3390/drones3020046
Submission received: 29 March 2019 / Revised: 23 May 2019 / Accepted: 29 May 2019 / Published: 31 May 2019

Abstract

The advancement of drones has revolutionized the production of aerial imagery. Using a drone with its associated flight control and image processing applications, a high-resolution orthorectified mosaic can be produced from multiple individual aerial images within just a few hours. However, the positional precision and accuracy of any orthomosaic produced should not be overlooked. In this project, we flew a DJI Phantom drone over Oak Grove Cemetery in Nacogdoches, Texas, USA once a month over a seven-month period, resulting in seven orthomosaics of the same location. We identified 30 ground control points (GCPs) based on permanent features in the cemetery and recorded the geographic coordinates of each GCP on each of the seven orthomosaics. Analyzing each GCP's cluster of seven coincident positions depicts the positional precision of the orthomosaics. Our analysis is an attempt to answer the fundamental question, "Are we obtaining the same geographic coordinates for the same feature found on every aerial image mosaic captured by a drone over time?" The results showed that the positional precision was higher at the center of the orthomosaic than along the edges. In addition, the positional precision was lower parallel to the direction of the drone flight.

1. Introduction

Aerial photography began to be used on a routine basis in the 1920s with the advent of modern film and aircraft [1]. The current use of unmanned aerial systems (UAS), commonly referred to as drones, has combined aerial photography and remote sensing, expanding into successful businesses that incorporate both individual users and governmental agencies to map the Earth's surface on a repetitive basis [2]. Until digital cameras became available, creating a mosaic of aerial photographs required considerable work, time, energy, and expense, even after computers became widely available. In analog aerial photography, after each film roll was exposed, it had to be processed, printed, scanned, and georeferenced. The stitching process then applied algorithms to minimize image distortion in order to produce an orthorectified aerial photomosaic [3].
The United States Geological Survey (USGS) produces a series of digital aerial photographs of the country known as DOQs (digital orthophoto quadrangles), with each covering an area of 7.5 minutes of longitude by 7.5 minutes of latitude. DOQ has since become the general term for an orthorectified aerial photograph in digital format. When a DOQ is split into four quadrants for file management purposes, it is known as a DOQQ (digital orthophoto quarter quadrangle), a term widely used in the geographic information systems (GIS) community.
While processing traditional aerial photographs was time consuming, capturing those photos from an airplane was equally complicated. It involved developing a flight plan, calculating the optimum flying altitude, determining the optimum overlap and endlap along and between flight paths, maintaining an airplane with a pilot and a photographer, fielding a ground crew to set ground control points, and more.
This workflow was simplified when UAS became commonplace within the remote sensing community. Coupled with flight control software, a drone can be flown to acquire aerial imagery over a programmed location with a set height, flight paths, overlaps, and camera settings [2]. With the drone's onboard global positioning system (GPS), the positional accuracy of the imagery equals that of the GPS receiver. When the drone image files are loaded into a digital image processing program, they are stitched together to produce an orthomosaic similar to a traditional DOQQ. This process can be completed on a desktop computer in a timely and cost-effective manner [4]. The completed orthomosaic images depict detailed ground surfaces at high spatial resolution. These methods decrease cost, reduce risk, and offer high accuracy potential [2,5].
While traditional aerial photography from a manned airplane continues to serve projects covering large areas, especially when acquiring aerial photography at a small scale, drone-acquired imagery is suitable for smaller and inaccessible areas, with the drone user controlling the spatial, spectral, radiometric, and temporal resolution of the product [6,7,8,9]. With limited capital investment, the process from setting up a drone flight to the output of a rectified orthomosaic image for a specific area can be completed within hours [5,10], providing timely, cost-effective monitoring of the forest environment [11,12]. Advantages of a UAS for mapping small areas include high spatial resolution, the ability to collect time series imagery, the potential to produce both orthomosaics and 3D models, and integration into community-based forest monitoring (CBFM) or habitat monitoring [13].
Nogueira et al. [5] utilized Pix4D Mapper Pro to process images from a DJI Phantom 4 Pro drone for mapping accuracy of a 15-hectare site with ground control points (GCPs). Pix4D Mapper is a software application that allows the user to process images captured by an aerial vehicle and generate georeferenced models for GIS output. Their findings indicated that low features were well defined but treetops were not. The central region of the image provided better results than the edges. Digital surface models and planimetric error vectors for the orthomosaic, with and without ground control points, were displayed. GCPs with known coordinates also increased the accuracy of UAS flights [14,15]. Lima et al. [16] demonstrated the accuracy of the DJI Phantom 4 Pro UAS for three-dimensional coordinates using Pix4D Mapper Pro software with 15 ground control points and 47 checkpoints, concluding that the coordinates and attitude angles of the UAS camera can be measured. Tomastik et al. [17] found that for forest sites of one hectare or less, GCP quantity was not as important as configuration, but additional GCPs increased accuracy. For best accuracy and quality control with UAS, GCPs are needed [5,14,18,19]. With all the advantages of using a drone for aerial photography, a question still needs to be asked: what is the precision and accuracy of a drone-derived aerial orthomosaic? To explore the quality of drone-acquired imagery in producing orthomosaics, we conducted a series of drone flights over the same area and produced a digital aerial orthomosaic for each flight. The goal was to assess geographic precision by ascertaining whether each orthomosaic resulted in the same geographic coordinates for a ground feature identified on the digital orthomosaic.

2. Materials and Methods

The study area chosen to assess the positional precision of orthomosaics derived from drone-acquired imagery was the historic Oak Grove Cemetery located next to downtown Nacogdoches in east Texas, USA. Oak Grove Cemetery, consisting of 5.2 ha (12.7 ac) of cemetery plots with intermingled pine and hardwood trees, was chosen due to its permanent features that are readily identifiable within digital imagery and suitable for a precision assessment of positions identified on repeated aerial imagery. A DJI Phantom 3 or 4 drone was flown over the study area monthly from September 2017 to March 2018, depending on which DJI model was available for each scheduled flight. A comparison of the specifications of the two drones is shown in Table 1. Both had a similar physical size and camera lens, while the Phantom 4 had a larger image sensor. However, they produced orthomosaics with spatial resolutions in a comparable range (Table 2). In addition, both drone models had the same hover accuracy, which was deemed appropriate for a comparative positional precision analysis. Each drone flight was conducted as a grid mission using the Pix4DCapture app installed on an iPad, with the flight height set at 67 m (220 ft), front overlap set at 80%, side overlap set at 60%, and the angle of the camera set at 80 degrees. Each drone mission took an average flight time of 17 minutes, resulting in an average of 306 photos covering a total area of 11.7 ha (28.9 ac). Each drone flight was planned to fly outside the boundaries of the cemetery to ensure any resulting orthomosaic produced from each drone flight would completely contain the physical boundaries of the entire cemetery (Figure 1). Weather conditions at the time of each flight were observed from the weather station located at the East Texas Regional Airport (Table 3) and were considered fair for flying a drone for aerial photography.
Upon the completion of each drone mission, the images acquired by the drone were imported into Drone2Map to create an orthomosaic. Default settings were used in Drone2Map for creating each month's 2D orthomosaic, with the output image resolution set automatically, resulting in cell sizes ranging from 2.78–2.95 cm for the Phantom 3 and 2.57–2.73 cm for the Phantom 4, respectively. Each output mosaicked image was referenced to WGS 1984 UTM Zone 15N. A total of seven orthomosaics of the Oak Grove Cemetery were produced.
In order to assess the precision of positions, 30 ground control points were selected throughout the cemetery based on the 7 September 2017 orthomosaic. Each GCP was selected on a distinguishable corner of a family plot in the cemetery. Family plot corners were chosen because they were easily distinguishable visually in the orthomosaics and easily identifiable on the ground. The UTM coordinates of each GCP were acquired by creating a point feature through on-screen digitization with an orthomosaic as the reference image. This process was repeated on each of the seven orthomosaics produced. The outcome was a group of seven spatial point features for each GCP, all seven representing the same feature on the ground (Figure 2).
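The bookkeeping for this step is simple: each digitized point carries a GCP identifier and the date of the orthomosaic it came from, and points are then grouped by identifier into clusters. A minimal Python sketch with entirely hypothetical records (a real workflow would read them from a shapefile or geodatabase attribute table) might look like this:

```python
from collections import defaultdict

# Hypothetical digitized records: (gcp_id, mosaic_date, easting_m, northing_m).
# These values are illustrative only, not the study's data.
records = [
    (1, "2017-09-07", 343565.2, 3497810.1),
    (1, "2017-10-17", 343566.1, 3497811.5),
    (2, "2017-09-07", 343676.0, 3497644.3),
    (2, "2017-10-17", 343676.8, 3497645.1),
]

# Group the digitized positions into one cluster per GCP.
clusters = defaultdict(list)
for gcp_id, date, x, y in records:
    clusters[gcp_id].append((x, y))

for gcp_id, pts in sorted(clusters.items()):
    print(f"GCP {gcp_id}: {len(pts)} positions")
```

With the full data set, each cluster would hold seven positions, one per monthly orthomosaic.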
To analyze the geographic distribution of the seven positions of each GCP, the mean center of each cluster (Equation (1)) was first calculated. The mean center is plotted from the mean x coordinate and mean y coordinate of all seven positions and represents the geographic center of each cluster. Then, the standard distance of the summary circle (Equation (2)) was calculated to depict the dispersion of the positions around the same feature. A lower standard distance translates to higher precision, with the seven positions closer to each other. Finally, to consider the directional effect of each cluster, the standard deviational ellipse (Equation (3)) was calculated. It shows the direction in which precision is most diluted. All geographic distribution data were plotted on an orthomosaic and analyzed visually to identify spatial patterns in each GCP's spatial distribution over the seven orthomosaics produced (see Supplementary Materials).
$$\bar{S} = (\mu_x, \mu_y) = \left( \frac{\sum_{i=1}^{n} x_i}{n}, \frac{\sum_{i=1}^{n} y_i}{n} \right) \tag{1}$$

$$d = \sqrt{\frac{\sum_{i=1}^{n} \left[ (x_i - \mu_x)^2 + (y_i - \mu_y)^2 \right]}{n}} \tag{2}$$

$$d_x = \sqrt{\frac{\sum_{i=1}^{n} (x_i - \mu_x)^2}{n}}, \qquad d_y = \sqrt{\frac{\sum_{i=1}^{n} (y_i - \mu_y)^2}{n}} \tag{3}$$
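As a concrete illustration, Equations (1)–(3) can be computed for a cluster of seven positions with a few lines of Python. The coordinates below are hypothetical, chosen only to resemble the UTM Zone 15N values in this study; they are not the actual GCP data.

```python
import math

# Seven digitized positions (easting, northing in metres) for one GCP.
# Hypothetical values for illustration -- not taken from the study data.
points = [
    (343565.2, 3497810.1), (343566.1, 3497811.5), (343565.9, 3497809.8),
    (343564.8, 3497811.0), (343566.4, 3497810.7), (343565.0, 3497812.0),
    (343566.3, 3497811.6),
]

n = len(points)
xs = [x for x, y in points]
ys = [y for x, y in points]

# Equation (1): mean center of the cluster.
mu_x = sum(xs) / n
mu_y = sum(ys) / n

# Equation (2): standard distance of the summary circle.
d = math.sqrt(sum((x - mu_x) ** 2 + (y - mu_y) ** 2 for x, y in points) / n)

# Equation (3): standard deviation along each coordinate axis.
d_x = math.sqrt(sum((x - mu_x) ** 2 for x in xs) / n)
d_y = math.sqrt(sum((y - mu_y) ** 2 for y in ys) / n)

print(f"mean center: ({mu_x:.2f}, {mu_y:.2f})")
print(f"standard distance: {d:.3f} m (d_x = {d_x:.3f}, d_y = {d_y:.3f})")
```

Note that d² = d_x² + d_y², so Equation (2) aggregates the two per-axis dispersions of Equation (3) into a single precision measure.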

3. Results

For each GCP, the geographic coordinates were obtained seven times, once from each of the seven monthly orthomosaics. Figure 3 depicts the seven positions of the same GCP plotted on the September 2017 orthomosaic, where GCPs no. 1 to 3 are presented and a standard deviational ellipse indicates the dilution of precision for each GCP's position cluster. Based on UTM Zone 15N, WGS 84, the x and y coordinates of the mean center, standard distance of the summary circle, azimuth of deviational ellipse rotation, major and minor semi-axes of the ellipse, and area of the ellipse for the position cluster of each GCP are summarized in Appendix A, Table A1. The standard distances ranged from 1.25 m to 1.61 m, with lower values indicating higher precision for an individual GCP's cluster of seven positions.
In Figure 4, the rotation of the summary ellipses showed a clockwise pattern with azimuths ranging from 6 to 178 degrees. Most of the summary ellipses (80%) have their major axis oriented northeast-southwest (Figure 5). The GCP clusters oriented southeast-northwest (20%) were found along the east boundary of the study area. On average, the rotation has a mean azimuth of 60 degrees and a mean major semi-axis of 1.72 m.
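For readers who wish to reproduce the directional summary, the azimuth of a standard deviational ellipse can be derived from the variances and covariance of the deviations about the mean center. The sketch below uses the formulation common to GIS packages, with the azimuth measured clockwise from grid north; the deviation values are hypothetical, not the study's data.

```python
import math

# Deviations (metres) of seven digitized positions from their mean center.
# Hypothetical values for illustration -- not the study's GCP clusters.
dev = [(-0.9, 1.2), (0.6, -0.8), (1.1, 1.5), (-0.4, -1.1),
       (0.8, 0.9), (-1.2, -1.4), (0.0, -0.3)]

n = len(dev)
sxx = sum(dx * dx for dx, dy in dev)   # sum of squared x-deviations
syy = sum(dy * dy for dx, dy in dev)   # sum of squared y-deviations
sxy = sum(dx * dy for dx, dy in dev)   # sum of cross-products

# Azimuth of the ellipse's major axis, clockwise from north, in [0, 180).
if sxy == 0:
    theta_deg = 0.0 if syy >= sxx else 90.0
else:
    num = (sxx - syy) + math.sqrt((sxx - syy) ** 2 + 4 * sxy ** 2)
    theta_deg = math.degrees(math.atan(num / (2 * sxy)))
    if theta_deg < 0:
        theta_deg += 180.0

# One-standard-deviation dispersion along the major and minor axes
# (some packages scale these by sqrt(2) when drawing the ellipse).
t = math.radians(theta_deg)
sig_major = math.sqrt(sum((dx * math.sin(t) + dy * math.cos(t)) ** 2
                          for dx, dy in dev) / n)
sig_minor = math.sqrt(sum((dx * math.cos(t) - dy * math.sin(t)) ** 2
                          for dx, dy in dev) / n)

print(f"azimuth: {theta_deg:.1f} deg, "
      f"major: {sig_major:.2f} m, minor: {sig_minor:.2f} m")
```

Because the projection onto the rotated axes preserves total dispersion, sig_major² + sig_minor² equals (sxx + syy)/n, which offers a quick sanity check on the computed rotation.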

4. Discussion

The seven flights programmed with Pix4DCapture each resulted in images that were processed with Drone2Map, creating a series of georectified images that could be compared month to month based on set GCPs. The resulting images indicated a dilution of positional precision in the data over time, primarily in a northeast direction. When presented in a map context (Figure 4), the positions of a GCP are more clustered at the center of the study area, while those away from the center of the orthomosaic are more dispersed. This translates to higher positional precision within the central area of the orthomosaic, which might be attributed to more aerial images with more overlapping area being available during the mosaicking process there, compared with areas along the cemetery boundary having less overlap.
This is similar to the findings of Nogueira et al. [5] for distances toward the edge of a mosaic of images with GCPs using Pix4D software. This problem is commonly known as the edge effect, and it is the reason most drone flights are designed to extend beyond a project area. This pattern may also be related to the photo sequence set in Pix4DCapture, where the first photo was captured at the northeast corner of the grid and the flight then moved south, ending at the southwest corner. This northeast-southwest flight path direction is in agreement with the majority of the summary ellipse rotations, meaning the positional precision is lower along the northeast-southwest flight direction (Figure 4).

5. Conclusions

In an orthomosaic derived from drone-captured aerial imagery, the center of the orthomosaic had higher positional precision than areas along the edges. With higher precision, higher positional accuracy can be expected. When relying on the orthomosaic for any feature measurement in real-world units, caution should be taken to identify not only where in the mosaicked aerial photo measurement will be most accurate, but also the direction of the measurement. Other factors that could affect the positional precision of a derived orthomosaic include drone GPS accuracy, camera resolution, the image processing application used, and wind speed and wind direction during the drone flight. If the actual coordinates of the GCPs used for assessment can be attained, such as by surveying them with an RTK GPS, not only the positional precision but also the positional accuracy of any orthomosaic produced can be analyzed. As presented in our study, it is recommended to fly a drone mission covering an area beyond the physical boundary of the study area in question to produce an orthomosaic with the highest precision and accuracy attainable per flight. This is of great importance when conducting change detection over time based on drone aerial imagery.

Supplementary Materials

The following are available online at https://www.mdpi.com/2504-446X/3/2/46/s1, S1: S1_Geodatabase.zip.

Author Contributions

Conceptualization: I.-K.H. and D.U.; methodology: I.-K.H. and D.U.; data collection: I.-K.H., D.U., D.K., and Y.Z.; formal analysis: I.-K.H.; data curation: I.-K.H.; writing—original draft preparation: I.-K.H. and D.K.; writing—review and editing: D.U. and Y.Z.; visualization: I.-K.H.; supervision: I.-K.H.; project administration: I.-K.H.; funding acquisition: D.K., D.U., I.-K.H., and Y.Z.

Funding

This work is supported by McIntire Stennis Capacity grant no. NI18MSCFRXXXG012/project accession nos. 1004705, 1004707, 1011115, 1011116 from the USDA National Institute of Food and Agriculture.

Acknowledgments

The authors thank the GIS Lab in the Arthur Temple College of Forestry and Agriculture at Stephen F. Austin State University for their technical support.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Position cluster statistics for each ground control point (GCP) based on seven orthomosaics.
| GCP ID | X Coordinate of Mean Center (m) | Y Coordinate of Mean Center (m) | Standard Distance of Summary Circle (m) | Azimuth of Ellipse Rotation (degrees) | Major Semi-axis of Ellipse (m) | Minor Semi-axis of Ellipse (m) | Area of Deviational Ellipse (sq m) |
|---|---|---|---|---|---|---|---|
| 1 | 343,565.68 | 3,497,810.96 | 1.61 | 55.87 | 2.12 | 0.83 | 5.50 |
| 2 | 343,676.42 | 3,497,644.76 | 1.34 | 175.20 | 1.57 | 1.04 | 5.16 |
| 3 | 343,580.11 | 3,497,790.65 | 1.51 | 53.83 | 1.97 | 0.81 | 5.03 |
| 4 | 343,609.61 | 3,497,774.07 | 1.40 | 55.51 | 1.76 | 0.92 | 5.07 |
| 5 | 343,664.44 | 3,497,760.95 | 1.42 | 75.93 | 1.68 | 1.11 | 5.84 |
| 6 | 343,684.57 | 3,497,755.69 | 1.43 | 89.75 | 1.65 | 1.17 | 6.08 |
| 7 | 343,709.08 | 3,497,739.19 | 1.44 | 106.33 | 1.63 | 1.22 | 6.24 |
| 8 | 343,543.03 | 3,497,754.76 | 1.49 | 41.41 | 1.98 | 0.74 | 4.57 |
| 9 | 343,578.99 | 3,497,748.25 | 1.40 | 44.47 | 1.82 | 0.78 | 4.48 |
| 10 | 343,622.92 | 3,497,742.58 | 1.33 | 52.03 | 1.60 | 0.99 | 4.99 |
| 11 | 343,650.46 | 3,497,715.59 | 1.31 | 51.15 | 1.47 | 1.13 | 5.25 |
| 12 | 343,703.78 | 3,497,714.32 | 1.40 | 117.54 | 1.53 | 1.26 | 6.05 |
| 13 | 343,691.44 | 3,497,679.65 | 1.33 | 160.45 | 1.43 | 1.21 | 5.45 |
| 14 | 343,612.96 | 3,497,706.88 | 1.29 | 37.25 | 1.57 | 0.93 | 4.57 |
| 15 | 343,642.22 | 3,497,641.95 | 1.29 | 8.35 | 1.55 | 0.95 | 4.63 |
| 16 | 343,601.89 | 3,497,605.10 | 1.37 | 11.42 | 1.75 | 0.83 | 4.60 |
| 17 | 343,537.89 | 3,497,739.31 | 1.48 | 39.11 | 1.96 | 0.71 | 4.35 |
| 18 | 343,593.37 | 3,497,712.18 | 1.31 | 36.76 | 1.65 | 0.82 | 4.27 |
| 19 | 343,617.63 | 3,497,687.93 | 1.25 | 26.93 | 1.54 | 0.89 | 4.29 |
| 20 | 343,634.19 | 3,497,689.96 | 1.28 | 30.23 | 1.48 | 1.04 | 4.82 |
| 21 | 343,641.02 | 3,497,672.01 | 1.27 | 20.54 | 1.49 | 1.00 | 4.69 |
| 22 | 343,691.46 | 3,497,631.79 | 1.41 | 170.69 | 1.65 | 1.12 | 5.79 |
| 23 | 343,663.07 | 3,497,611.75 | 1.38 | 178.48 | 1.69 | 0.98 | 5.23 |
| 24 | 343,614.02 | 3,497,619.44 | 1.34 | 12.66 | 1.70 | 0.86 | 4.56 |
| 25 | 343,597.85 | 3,497,649.51 | 1.31 | 19.77 | 1.67 | 0.78 | 4.10 |
| 26 | 343,535.58 | 3,497,671.47 | 1.43 | 27.69 | 1.88 | 0.75 | 4.43 |
| 27 | 343,507.68 | 3,497,659.14 | 1.54 | 25.95 | 2.03 | 0.81 | 5.15 |
| 28 | 343,522.87 | 3,497,635.29 | 1.51 | 23.01 | 1.99 | 0.80 | 4.98 |
| 29 | 343,598.13 | 3,497,574.59 | 1.48 | 9.18 | 1.89 | 0.90 | 5.34 |
| 30 | 343,633.66 | 3,497,586.16 | 1.42 | 5.81 | 1.79 | 0.90 | 5.05 |

References

  1. Campbell, J.B.; Wynne, R.H. Introduction to Remote Sensing, 5th ed.; Guilford Press: New York, NY, USA, 2011; ISBN 978-1-60918-176-5. [Google Scholar]
  2. Nex, F.; Remondino, F. UAV for 3D mapping applications: A review. Appl. Geomat. 2014, 6, 1–15. [Google Scholar] [CrossRef]
  3. Neitzel, F.; Klonowski, J. Mobile 3D mapping with a low-cost UAV system. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2012, XXXVIII-1-C22, 39–44. [Google Scholar] [CrossRef]
  4. Viegut, R.; Kulhavy, D.L.; Unger, D.R.; Hung, I.-K.; Humphreys, B. Integrating unmanned aircraft systems to measure linear and areal features into undergraduate forestry education. Int. J. High. Educ. 2018, 7, 63–75. [Google Scholar] [CrossRef]
  5. Nogueira, F.; da, C.; Roberto, L.; Körting, T.S.; Shiguemori, E.H. Accuracy analysis of orthomosaic and DSM produced from sensor aboard UAV. In Proceedings of the Anais do XVIII Simpósio Brasileiro de Sensoriamento Remoto, Santos, Brazil, 28–31 May 2017; pp. 5515–5520. [Google Scholar]
  6. Kulhavy, D.L.; Endsley, G.; Unger, D.; Grisham, R.; Gannon, M.; Coble, D. Service learning for the Port Jefferson History and Nature Center: Senior capstone forestry course. J. Community Engagem. High. Educ. 2017, 9, 41–53. [Google Scholar]
  7. Unger, D.; Hung, I.-K.; Zhang, Y.; Kulhavy, D. Integrating drone technology with GPS data collection to enhance forestry students interactive hands-on field experiences. High. Educ. Stud. 2018, 8, 49–62. [Google Scholar] [CrossRef]
  8. Fardusi, M.; Chianucci, F.; Barbati, A. Concept to practice of geospatial-information tools to assist forest management and planning under precision forestry framework: A review. Ann. Silvic. Res. 2017, 41, 3–14. [Google Scholar]
  9. Toth, C.; Jóźków, G. Remote sensing platforms and sensors: A survey. ISPRS J. Photogramm. Remote Sens. 2016, 115, 22–36. [Google Scholar] [CrossRef]
  10. Unger, D.; Kulhavy, D.; Busch-Petersen, K.; Hung, I.-K. Integrating faculty led service learning training to quantify height of natural resources from a spatial science perspective. Int. J. High. Educ. 2016, 5, 104–116. [Google Scholar] [CrossRef]
  11. Zhang, J.; Hu, J.; Lian, J.; Fan, Z.; Ouyang, X.; Ye, W. Seeing the forest from drones: Testing the potential of lightweight drones as a tool for long-term forest monitoring. Biol. Conserv. 2016, 198, 60–69. [Google Scholar] [CrossRef]
  12. Anderson, K.; Gaston, K. Lightweight unmanned aerial vehicles will revolutionize spatial ecology. Front. Ecol. Environ. 2013, 11, 138–146. [Google Scholar] [CrossRef] [Green Version]
  13. Paneque-Gálvez, J.; McCall, M.K.; Napoletano, B.M.; Wich, S.A.; Koh, L.P. Small drones for community-based forest monitoring: an assessment of their feasibility and potential in tropical areas. Forests 2014, 5, 1481–1507. [Google Scholar] [CrossRef]
  14. Liba, N.; Berg-Jürgens, J. Accuracy of orthomosaic generated by different methods in example of UAV platform MUST Q. IOP Conf. Ser. Mater. Sci. Eng. 2015, 96, 1–8. [Google Scholar] [CrossRef]
  15. Nagendran, S.K.; Tung, W.Y.; Ismail, M.A.M. Accuracy assessment on low altitude UAV-borne photogrammetry outputs influenced by ground control point at different altitude. IOP Conf. Ser. Earth Environ. Sci. 2018, 169, 1–9. [Google Scholar] [CrossRef]
  16. Lima, S.; Kux, H.; Shiguemori, E. Accuracy of autonomy navigation of unmanned aircraft systems through imagery. Int. J. Mech. Mechatron. Eng. 2018, 12, 466–470. [Google Scholar]
  17. Tomaštík, J.; Mokroš, M.; Saloň, Š.; Chudý, F.; Tunák, D. Accuracy of photogrammetric UAV-based point clouds under conditions of partially-open forest canopy. Forests 2017, 8, 151. [Google Scholar] [CrossRef]
  18. Júnior, L.; Ferreira, M.; Côrtes, J.; Jorge, L. High accuracy mapping with cartographic assessment for a fixed-wing remotely piloted aircraft system. J. Appl. Remote Sens. 2018, 12, 014003. [Google Scholar]
  19. Prajwal, M.; Rishab, J.; Vaibhav, S.; Karthik, K.S. Optimal number of ground control points for a UAV based corridor mapping. Int. J. Innov. Res. Sci. Eng. Technol. 2016, 5, 28–32. [Google Scholar]
Figure 1. Map identifying the center location of each drone image acquired on 7 September 2017 and the corresponding drone flight path.
Figure 2. Orthomosaic of the 7 September 2017 imagery where each of the 30 ground control points (GCPs) is represented as a red cross.
Figure 3. Positions of a GCP based on seven orthomosaics with the standard deviational ellipse of each positional cluster of a GCP.
Figure 4. Map of mean center locations with the magnitude of precision presented as the standard distance of the summary circle and the major axis and rotation of the standard deviational ellipse for each GCP position cluster.
Figure 5. Directional diagram showing the rotation of each standard deviational ellipse and its major semi-axis.
Table 1. Comparison of the two drone models used.
| Specification | DJI Phantom 3 Advanced | DJI Phantom 4 Professional |
|---|---|---|
| Weight | 1280 g | 1388 g |
| Diagonal size | 350 mm | 350 mm |
| Max speed | 57.6 km/h | 72.0 km/h |
| Max service ceiling MSL | 6000 m | 6000 m |
| Max flight time | 23 min | 30 min |
| GNSS | GPS/GLONASS | GPS/GLONASS |
| Camera lens | FOV 94°, 20 mm, f/2.8 | FOV 84°, 24 mm, f/2.8–f/11 |
| Image sensor | 1/2.3" CMOS, 12.4M pixels | 1" CMOS, 20.0M pixels |
| Hover accuracy (vertical) | ±0.1 m (Vision Positioning); ±0.5 m (GPS Positioning) | ±0.1 m (Vision Positioning); ±0.5 m (GPS Positioning) |
| Hover accuracy (horizontal) | ±0.3 m (Vision Positioning); ±1.5 m (GPS Positioning) | ±0.3 m (Vision Positioning); ±1.5 m (GPS Positioning) |
Table 2. Summary of drone flight missions and orthomosaics produced.
| Date of Flight | Number of Photos Used for the Mosaic | Spatial Resolution of the Mosaic (cm) | Drone Model Used |
|---|---|---|---|
| 9/7/2017 | 378 | 2.73 | Phantom 4 |
| 10/17/2017 | 288 | 2.95 | Phantom 3 |
| 11/13/2017 | 288 | 2.95 | Phantom 3 |
| 12/14/2017 | 286 | 2.89 | Phantom 3 |
| 1/15/2018 | 230 | 2.78 | Phantom 3 |
| 2/15/2018 | 306 | 2.89 | Phantom 3 |
| 3/6/2018 | 168 | 2.57 | Phantom 4 |
Table 3. Weather conditions at the time of each flight.
| Date | 9/7/2017 | 10/17/2017 | 11/13/2017 | 12/14/2017 | 1/15/2018 | 2/15/2018 | 3/6/2018 |
|---|---|---|---|---|---|---|---|
| Time | 10:00 | 10:30 | 10:30 | 12:00 | 14:15 | 12:30 | 9:30 |
| Temperature (°C) | 26.1 | 24.4 | 17.8 | 13.3 | 6.1 | 22.8 | 20.0 |
| Dew Point (°C) | 11.1 | 3.9 | 15.0 | 1.7 | 5.6 | 18.3 | −6.1 |
| Humidity (%) | 39 | 26 | 83 | 45 | 97 | 76 | 17 |
| Wind Direction | Variable | ENE | NE | NNW | N | S | NNW |
| Wind Speed (km/h) | 8.1 | 11.3 | 9.7 | 4.8 | 16.1 | 11.3 | 22.5 |
| Wind Gust (km/h) | 0 | 0 | 0 | 0 | 0 | 0 | 35.4 |
| Pressure (mm Hg) | 757 | 757 | 759 | 752 | 762 | 752 | 752 |
| Weather Condition | Fair | Fair | Cloudy | Fair | Cloudy | Mostly Cloudy | Fair |
