An Augmented Reality Geo-Registration Method for Ground Target Localization from a Low-Cost UAV Platform
Abstract
1. Introduction
2. Related Research
3. Methodology
3.1. The Coordinate Conversion in Augmented Reality Geo-Registration
- The world coordinate system W: The geographical coordinate system. The GPS data and the ground vector map are expressed in this coordinate system. In this research, W denotes the WGS-84 coordinate system.
- The camera coordinate system C: The origin of this coordinate system is the optical center of the lens. It re-expresses objects from the world coordinate system from the perspective of the observing camera.
- The projection surface coordinate system P: A two-dimensional coordinate system onto which object points are projected.
- The screen coordinate system S: The origin of this coordinate system is the upper-left corner of the screen. A minimal sketch of the resulting W→C→P→S transformation chain follows this list.
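As an illustration only, the following Python sketch chains the four frames together under a pinhole-camera assumption. The function names, the `focal_px` parameter, and the use of a local metric world frame are our assumptions, not the paper's implementation; a real pipeline must also convert WGS-84 coordinates into that metric frame first (e.g., via a Gauss-Krüger projection).

```python
import numpy as np

def world_to_camera(p_world, cam_pos, R_wc):
    """W -> C: express a world-frame point in the camera frame.
    R_wc is the 3x3 rotation from W to C built from the UAV/gimbal attitude."""
    return R_wc @ (np.asarray(p_world, float) - np.asarray(cam_pos, float))

def camera_to_projection(p_cam, focal_px):
    """C -> P: pinhole projection onto the two-dimensional projection plane."""
    x, y, z = p_cam
    if z <= 0:
        return None  # point is behind the camera, not drawable
    return np.array([focal_px * x / z, focal_px * y / z])

def projection_to_screen(p_proj, width, height):
    """P -> S: move the origin to the upper-left corner of the screen."""
    return np.array([p_proj[0] + width / 2.0, height / 2.0 - p_proj[1]])
```

Chaining the three functions maps a geo-referenced vector vertex to a pixel of the live video frame, which is the core operation of the augmented reality geo-registration overlay.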
3.2. The Acquisition of High Precision GPS/IMU Data
3.2.1. High Accuracy RTK GPS
3.2.2. Filter Process of the Data
3.3. The Target Localization Algorithm
3.4. The Workflow of the Proposed Method
4. Dataset
5. Experimental Results
5.1. The Results of the EKF Algorithm Process
5.2. The Result of Augmented Reality Based Geo-Registration
6. Assessments and Discussion
6.1. Error Analysis
6.2. Accuracy Assessments
- The mounting position error. The mounting position error consists of two parts: the offset between the GPS antenna and the center of the drone, and the offset between the center of the drone and the camera optical center. The flight controller of the M600 has a built-in compensation mechanism for the offset between the GPS antenna and the center of the drone. Because of the gimbal, the camera has a position and angle offset from the center of the drone, and the measured camera position may be inaccurate. Furthermore, to isolate the high-frequency vibration of the motors, several vibration absorbers sit between the gimbal and the drone body. This soft connection between the camera and the drone allows the mounting angle between the UAV body and the camera to vary during flight, so the yaw angle of the drone can differ from the yaw angle of the camera.
- The video lag. During the test, the telemetry data and the video data are transferred through different data links. The video link delays the stream, which can align the telemetry data with the wrong video frame and introduce errors into the geo-registration and target localization process. The onboard video link has a latency of 50 ms [34], and possibly more depending on environmental conditions. The base latency of 50 ms was compensated in the experiments, and a test zone with low radio interference was selected to minimize the effect of video lag. During the flight, the attitude and position data may not arrive at the same time as a video frame; this difference is compensated by linear extrapolation between contiguous attitude and position samples (a sketch of this time-alignment step follows this list).
- The screen selection error. Limited by the screen resolution and human operation, the position selected on the screen may not be the exact point of the target; this error can amount to several meters on the ground.
- The rough terrain model. During the flight test, the terrain of the test zone is represented by a flat surface. As the actual terrain varies and the orientation of the UAV changes, the magnitude and direction of this error vary irregularly. When DEM data of the target zone exist, using the actual DEM can improve the localization accuracy (a flat-plane ray-intersection sketch also follows this list).
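The telemetry-to-frame time alignment mentioned in the video-lag item can be illustrated with a short sketch. This is not the authors' code; the function name and the note about angle wrap-around are our additions.

```python
def extrapolate_telemetry(t0, v0, t1, v1, t_frame):
    """Linearly extrapolate a telemetry value (position component or
    attitude angle) from two contiguous samples (t0, v0) and (t1, v1)
    to the timestamp of a video frame, after the fixed video-link
    latency has already been removed from t_frame."""
    if t1 == t0:
        return v1                        # degenerate case: no rate information
    rate = (v1 - v0) / (t1 - t0)         # rate of change between the samples
    return v1 + rate * (t_frame - t1)    # project forward to the frame time

# Note: angles such as yaw must be unwrapped to a continuous representation
# before this linear step, otherwise a crossing of +/-180 degrees corrupts the rate.
```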
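For the flat-terrain assumption discussed in the last item, the sketch below intersects the viewing ray of a selected pixel with a horizontal plane; replacing the plane with a DEM query (marching the ray until it first drops below the terrain height) is what removes this error source. The frame conventions, function name, and local metric world frame are our assumptions, not the paper's exact implementation.

```python
import numpy as np

def localize_on_flat_ground(u, v, width, height, focal_px,
                            cam_pos, R_cam_to_world, ground_z=0.0):
    """Intersect the ray through screen pixel (u, v) with the plane z = ground_z.

    cam_pos        : camera optical centre in a local metric world frame.
    R_cam_to_world : 3x3 rotation taking camera-frame vectors to the world frame.
    Returns the ground point, or None if the ray never reaches the plane.
    """
    # Ray direction in the camera frame (pinhole model, optical axis = +z).
    d_cam = np.array([u - width / 2.0, height / 2.0 - v, focal_px])
    d_world = R_cam_to_world @ d_cam
    if abs(d_world[2]) < 1e-9:
        return None                              # ray parallel to the ground
    s = (ground_z - cam_pos[2]) / d_world[2]
    if s <= 0:
        return None                              # intersection behind the camera
    return np.asarray(cam_pos, float) + s * d_world
```

Because the intersection distance s grows quickly at shallow viewing angles, attitude and terrain errors are amplified for off-nadir observations, which is consistent with the error behaviour described above.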
6.3. The Use of DJI M600 Platform
- Although the DJI M600 is slightly more expensive among multi-rotor UAV platforms, the difficulty and cost of using it are still much lower than those of oil-powered helicopter UAVs, which are capable of carrying navigation-grade IMU sensors. A single person can operate an M600 in extreme situations, which is nearly impossible for oil-powered helicopter UAVs.
- The D-RTK kit onboard the M600 is an off-the-shelf product provided by DJI and is highly integrated with the M600 flight controller. This experiment platform saves considerable time and effort compared with building a test environment from scratch. Although the M600 is not cheap compared to low-cost quadcopters, its attitude sensors and onboard camera gimbal are not essentially different from those of cheap drones, so the conclusions of this research also apply to cheap quadcopters. The proposed method uses none of the extra capabilities that only an expensive platform like the M600 offers; there is no theoretical barrier to applying it to a genuinely low-cost UAV platform.
7. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
- Puri, A. A Survey of Unmanned Aerial Vehicles (UAV) for Traffic Surveillance; Department of Computer Science and Engineering, University of South Florida: Tampa, FL, USA, 2005; pp. 1–29. [Google Scholar]
- Kanistras, K.; Martins, G.; Rutherford, M.J.; Valavanis, K.P. A survey of unmanned aerial vehicles (UAVs) for traffic monitoring. In Proceedings of the 2013 International Conference on Unmanned Aircraft Systems (ICUAS), Atlanta, GA, USA, 28–31 May 2013; pp. 221–234. [Google Scholar]
- Ou, G.; Gao, Y.; Liu, Y. Real-Time Vehicular Traffic Violation Detection in Traffic Monitoring Stream. In Proceedings of the 2012 IEEE/WIC/ACM International Conference on Web Intelligence and Intelligent Agent Technology Workshops (Wi-Iat Workshops 2012), Macau, China, 4–7 December 2012; Volume 3, pp. 15–19. [Google Scholar]
- Yuan, C.; Liu, Z.; Zhang, Y. UAV-based Forest Fire Detection and Tracking Using Image Processing Techniques. In Proceedings of the International Conference on Unmanned Aircraft Systems (ICUAS), Denver, CO, USA, 9–12 June 2015; pp. 639–643. [Google Scholar]
- Yuan, C.; Zhang, Y.; Liu, Z. A survey on technologies for automatic forest fire monitoring, detection, and fighting using unmanned aerial vehicles and remote sensing techniques. Can. J. For. Res. 2015, 45, 783–792. [Google Scholar] [CrossRef]
- Sozen, V.; Craparo, E.M. Optimal Unmanned Aerial Vehicle Allocation for Border Surveillance. Mil. Oper. Res. 2016, 21, 19–35. [Google Scholar]
- Hodgson, J.C.; Baylis, S.M.; Mott, R.; Herrod, A.; Clarke, R.H. Precision wildlife monitoring using unmanned aerial vehicles. Sci. Rep. 2016, 6. [Google Scholar] [CrossRef] [PubMed]
- Giordan, D.; Hayakawa, Y.; Nex, F.; Remondino, F.; Tarolli, P. Review article: the use of remotely piloted aircraft systems (RPASs) for natural hazards monitoring and management. Nat. Hazards Earth Syst. Sci. 2018, 18, 1079–1096. [Google Scholar] [CrossRef] [Green Version]
- Mullerova, J.; Bruna, J.; Bartalos, T.; Dvorak, P.; Vitkova, M.; Pysek, P. Timing Is Important: Unmanned Aircraft vs. Satellite Imagery in Plant Invasion Monitoring. Front. Plant Sci. 2017, 8, 887. [Google Scholar] [CrossRef] [PubMed]
- Watts, A.C.; Ambrosia, V.G.; Hinkley, E.A. Unmanned Aircraft Systems in Remote Sensing and Scientific Research: Classification and Considerations of Use. Remote Sens. 2012, 4, 1671–1692. [Google Scholar] [CrossRef] [Green Version]
- Gawron, V.J. Human Factors Issues in the Development, Evaluation, and Operation of Uninhabited Aerial Vehicles. In Proceedings of the AUVSI’98: Association for Unmanned Vehicle Systems International, Huntsville, AL, USA, 1998; pp. 431–438. [Google Scholar]
- Williams, K.W. A Summary of Unmanned Aircraft Accident/incident Data: Human Factors Implications; Federal Aviation Administration Civil Aeromedical Institute: Oklahoma City, OK, USA, 2004. [Google Scholar]
- Calhoun, G.L.; Draper, M.H.; Abernathy, M.F.; Patzek, M.; Delgado, F. Synthetic vision system for improving unmanned aerial vehicle operator situation awareness. In Proceedings of the Enhanced and Synthetic Vision 2005 Conference, Orlando, FL, USA, 28 March 2005; pp. 219–230. [Google Scholar]
- Crowley, D.E.; Murphy, R.R.; McNamara, A.; McLaughlin, T.D.; Duncan, B.A. AR browser for points of interest in disaster response in UAV imagery. In Proceedings of the CHI’14 Extended Abstracts on Human Factors in Computing Systems, Toronto, ON, Canada, 26 April–1 May 2014; pp. 2173–2178. [Google Scholar]
- Drury, J.L.; Richer, J.; Rackliffe, N.; Goodrich, M.A. Comparing Situation Awareness for Two Unmanned Aerial Vehicle Human Interface Approaches; Mitre Corp: Bedford, MA, USA, 2006. [Google Scholar]
- Drury, J.L.; Richer, J.; Rackliffe, N.; Goodrich, M.A. Comparing Situation Awareness for Two Unmanned Aerial Vehicle Human Interface Approaches; Defense Technical Information Center: Fort Belvoir, VA, USA, 2006. [Google Scholar]
- Foyle, D.C.; Andre, A.D.; Hooey, B.L. Situation Awareness in an Augmented Reality Cockpit: Design, Viewpoints and Cognitive Glue. In Proceedings of the 11th International Conference on Human Computer Interaction, Las Vegas, NV, USA, 22–27 July 2005; pp. 3–9. [Google Scholar]
- Eugster, H.; Nebiker, S. Geo-registration of Video Sequences Captured from Mini UAVs: Approaches and Accuracy Assessment. In Proceedings of the 5th International Symposium on Mobile Mapping Technology, Padua, Italy, 29–31 May 2007; pp. 1–8. [Google Scholar]
- Eugster, H.; Nebiker, S. UAV-based augmented monitoring-real-time georeferencing and integration of video imagery with virtual globes. In The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences; ISPRS: Heipke, Germany, 2008; Volume XXXVI, p. 37. [Google Scholar]
- Eugster, H.; Nebiker, S. Real-time georegistration of video streams from mini or micro UAS using digital 3D city models. In Proceedings of the 6th International Symposium on Mobile Mapping Technology, Presidente Prudente, Sao Paulo, Brazil, 21–24 July 2009. [Google Scholar]
- Ruano, S.; Cuevas, C.; Gallego, G.; Garcia, N. Augmented Reality Tool for the Situational Awareness Improvement of UAV Operators. Sensors 2017, 17, 297. [Google Scholar] [CrossRef] [PubMed]
- Uwe, S.; Jakub, K.; Ludwig, H. Texture mapping of 3D building models with oblique direct geo-referenced airborne IR image sequences. In Proceedings of the ISPRS Workshop: High-Resolution Earth Imaging for Geospatial Information, Hannover, Germany, 2–5 June 2009; Volume 1. [Google Scholar]
- Liu, X.; Tao, X.; Duan, Y.; Ge, N. Visual information assisted UAV positioning using priori remote-sensing information. Multimed. Tools Appl. 2018, 77, 14461–14480. [Google Scholar] [CrossRef]
- Dogancay, K. On the bias of linear least squares algorithms for passive target localization. Signal Proc. 2004, 84, 475–486. [Google Scholar] [CrossRef]
- Barber, D.B.; Redding, J.D.; McLain, T.W.; Beard, R.W.; Taylor, C.N. Vision-based target geo-location using a fixed-wing miniature air vehicle. J. Intell. Robot. Syst. 2006, 47, 361–382. [Google Scholar] [CrossRef]
- Gibbins, D.; Roberts, P.; Swierkowski, L. A video geo-location and image enhancement tool for small unmanned air vehicles (UAVs). In Proceedings of the 2004 Intelligent Sensors, Sensor Networks & Information Processing Conference, Melbourne, Australia, 14–17 December 2004; pp. 469–473. [Google Scholar]
- Ponda, S.; Kolacinski, R.; Frazzoli, E. Trajectory optimization for target localization using small unmanned aerial vehicles. In Proceedings of the AIAA Guidance, Navigation, and Control Conference, Chicago, IL, USA, 10–13 August 2009; p. 6015. [Google Scholar]
- Huang, W.; Sun, M.; Li, S. A 3D GIS-based interactive registration mechanism for outdoor augmented reality system. Expert Syst. Appl. 2016, 55, 48–58. [Google Scholar] [CrossRef]
- Hooijberg, M. Gauss-Krüger Projection. In Practical Geodesy; Springer: Berlin, Germany, 1997; pp. 81–132. [Google Scholar]
- DJI. D-RTK GNSS—Specs—DJI. Available online: https://www.dji.com/d-rtk/info#specs (accessed on 10 September 2018).
- Gonzalez, R.; Giribet, J.I.; Patino, H.D. An approach to benchmarking of loosely coupled low-cost navigation systems. Math. Comput. Model. Dyn. Syst. 2015, 21, 272–287. [Google Scholar] [CrossRef]
- Gonzalez, R.; Giribet, J.I.; Patino, H.D. NaveGo: A simulation framework for low-cost integrated navigation systems. Control Eng. Appl. Inform. 2015, 17, 110–120. [Google Scholar]
- DJI. DJI Developer. Available online: https://developer.dji.com/mobile-sdk/ (accessed on 17 October 2018).
- DJI. DJI Lightbridge 2—Professional Quality Live Streaming From the Sky. Available online: https://www.dji.com/lightbridge-2 (accessed on 17 October 2018).
Direction | Angle Random Walks | Velocity Random Walks | Gyroscope Correlation Times | Accelerometer Correlation Times | ||
---|---|---|---|---|---|---|
X | 5.481 | 3.6 | 6.35 | 7.610 | 50 | 30 |
Y | 5.907 | 9.3 | 12.78 | 3.055 | 20 | 60 |
Z | 5.717 | 9.3 | 1.12 | 4.012 | 60 | 40 |
Result Set | Error 1 | Error 2 | Error 3 | Error 4 | Error 5 | Error 6 | Error 7 | Error 8 | Error 9 | Error 10 | RMS Error
---|---|---|---|---|---|---|---|---|---|---|---
Using Original Attitude Data | 4.94 | 5.28 | 4.15 | 13.67 | 12.33 | 10.89 | 12.19 | 18.48 | 22.11 | 8.74 | 12.58
EKF Filtered Attitude Data | 3.86 | 5.33 | 3.56 | 5.50 | 4.62 | 5.55 | 5.63 | 5.57 | 7.32 | 4.03 | 5.21

All errors are given in meters.
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).