Augmented Reality Tool for the Situational Awareness Improvement of UAV Operators
Abstract
1. Introduction
- Enhancement of the video stream with the UAV flight route.
- Identification of targets (as illustrated in Figure 1c) and their classification as visible or occluded from the current viewpoint, based on the available geographical information (see the projection sketch after this list).
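Both contributions hinge on projecting georeferenced data (route waypoints, target positions) into the live camera image. As an illustrative sketch only, and not the paper's exact pipeline, the core of such a projection is a WGS84-to-ECEF conversion followed by a standard pinhole camera model; the waypoint coordinates used below are purely hypothetical.

```python
import numpy as np

# WGS84 ellipsoid constants
A = 6378137.0            # semi-major axis [m]
E2 = 6.69437999014e-3    # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h_m):
    """Convert WGS84 geodetic coordinates (degrees, metres) to ECEF (metres)."""
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    n = A / np.sqrt(1.0 - E2 * np.sin(lat) ** 2)   # prime vertical radius of curvature
    return np.array([
        (n + h_m) * np.cos(lat) * np.cos(lon),
        (n + h_m) * np.cos(lat) * np.sin(lon),
        (n * (1.0 - E2) + h_m) * np.sin(lat),
    ])

def project_to_pixel(p_ecef, K, R, C):
    """Pinhole projection of a world point, given intrinsics K and camera pose (R, C)."""
    p_cam = R @ (p_ecef - C)   # express the point in the camera frame
    u, v, w = K @ p_cam        # apply the intrinsic calibration matrix
    return u / w, v / w        # perspective division -> pixel coordinates

# Example: ECEF position of a hypothetical route waypoint (illustrative values only)
wp = geodetic_to_ecef(40.4168, -3.7038, 700.0)
```

In practice, K would follow from the sensor field of view and image size, and the pose (R, C) from the platform position and the platform/sensor orientation angles carried in the telemetry; a route overlay then amounts to projecting every waypoint and drawing the resulting polyline, discarding points that fall behind the camera or outside the frame.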
2. Functionalities of the Augmented Reality (AR) Tool
2.1. Route Orientation
2.2. Target Identification
3. Structure of the AR Tool
3.1. Input Data Processing
3.1.1. Mission Planning Data
3.1.2. Mission Execution Data
- information that remains constant throughout a mission: mission identification (tag 3), the type of image sensor (tag 11) and the coordinate system (tag 12).
- information that changes over time: the UNIX Time Stamp (tag 2), the camera sensor parameters, and the platform position and orientation (see the parsing sketch after this list).
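This execution metadata reaches the tool as KLV items, in which each field is a Tag-Length-Value triplet. The following is a minimal parsing sketch, assuming single-byte tags and short-form (single-byte) lengths as in the example items listed in the metadata table at the end of this article; it is not the authors' implementation.

```python
def parse_local_set(payload: bytes) -> dict:
    """Split a MISB 0601-style Local Set payload into {tag: value-bytes}.

    Assumes single-byte tags and short-form lengths, which is sufficient
    for the example items shown in the metadata table below.
    """
    items, i = {}, 0
    while i < len(payload):
        tag, length = payload[i], payload[i + 1]
        items[tag] = payload[i + 2 : i + 2 + length]
        i += 2 + length
    return items

# Example: the Mission ID item (tag 3) from the metadata table
print(parse_local_set(bytes.fromhex("030A4D697373696F6E203132")))  # {3: b'Mission 12'}
```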
3.2. AR Solution Module
3.2.1. Real World
3.2.2. Projection Model: Conversion between Coordinate Systems
3.2.3. Virtual World
3.2.4. Flying Route
3.2.5. Localization and Visualization of Targets
3.3. Augmented Video
4. Results
4.1. Results on Route Orientation
4.2. Results on Target Identification
5. Conclusions
Acknowledgments
Author Contributions
Conflicts of Interest
Abbreviations
Abbreviation | Definition |
---|---|
AR | Augmented Reality |
CRD | Common Route Definition |
DTED | Digital Terrain Elevation Data |
ECEF | Earth-Centered Earth-Fixed Coordinate system |
FOV | Field of View |
GCS | Ground Control Station |
GPS | Global Positioning System |
GSD | Ground Sampling Distance |
IMU | Inertial Measurement Unit |
ISTAR | Intelligence, Surveillance, Target Acquisition and Reconnaissance |
KLV | Key-Length-Value |
MALE | Medium-Altitude Long-Endurance |
MAV | Micro Aerial Vehicles |
MI | Motion Imagery |
NATO | North Atlantic Treaty Organization |
NED | North-East-Down Coordinate System |
OBJ | Wavefront Object |
SMPTE | Society of Motion Picture and Television Engineers |
TLV | Tag-Length-Value |
UAV | Unmanned Aerial Vehicle |
UL | Universal Label |
WGS84 | World Geodetic System 1984 |
XML | Extensible Markup Language |
Tag | Name | Value | Interpretation | TLV Hex Bytes |
---|---|---|---|---|
2 | UNIX Time Stamp | 1,231,798,102,000,000 µs | Mon Jan 12 2009 22:08:22 (UTC) | 02 08 00 04 60 50 58 4E 01 80 |
3 | Mission ID | Mission 12 | Mission 12 | 03 0A 4D 69 73 73 69 6F 6E 20 31 32 |
5 | Platform Heading Angle | 0x71C2 | 159.9744 Degrees | 05 02 71 C2 |
6 | Platform Pitch Angle | 0xFD3D | −0.4315251 Degrees | 06 02 FD 3D |
7 | Platform Roll Angle | 0x08B8 | 3.405814 Degrees | 07 02 08 B8 |
11 | Image Source Sensor | EO Nose | EO Nose | 0B 07 45 4F 20 4E 6F 73 65 |
12 | Image Coordinate System | Geodetic WGS84 | Geodetic WGS84 | 0C 0E 47 65 6F 64 65 74 69 63 20 57 47 53 38 34 |
13 | Sensor Latitude | 0x5595B66D | 60.17682296 Degrees | 0D 04 55 95 B6 6D |
14 | Sensor Longitude | 0x5B5360C4 | 128.42675904 Degrees | 0E 04 5B 53 60 C4 |
15 | Sensor True Altitude | 0xC221 | 14190.72 Meters | 0F 02 C2 21 |
16 | Sensor Horizontal FoV | 0xCD9C | 144.5713 Degrees | 10 02 CD 9C |
17 | Sensor Vertical FoV | 0xD917 | 152.6436 Degrees | 11 02 D9 17 |
18 | Sensor Rel. Azimuth Angle | 0x724A0A20 | 160.71921147 Degrees | 12 04 72 4A 0A 20 |
19 | Sensor Rel. Elevation Angle | 0x87F84B86 | −168.79232483 Degrees | 13 04 87 F8 4B 86 |
20 | Sensor Rel. Roll Angle | 0x00000000 | 0.0 Degrees | 14 04 00 00 00 00 |
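The Interpretation column follows from the linear raw-to-engineering-unit mappings that MISB ST 0601 defines for each tag. A hedged decoding sketch for three of the tags above (again, not the authors' code); minor rounding differences with respect to the Interpretation column are possible.

```python
import struct

def heading_deg(raw: bytes) -> float:
    # Tag 5, Platform Heading Angle: uint16 mapped linearly onto [0, 360] degrees
    return struct.unpack(">H", raw)[0] * 360.0 / 0xFFFF

def roll_deg(raw: bytes) -> float:
    # Tag 7, Platform Roll Angle: int16 mapped linearly onto [-50, +50] degrees
    return struct.unpack(">h", raw)[0] * 50.0 / 0x7FFF

def latitude_deg(raw: bytes) -> float:
    # Tag 13, Sensor Latitude: int32 mapped linearly onto [-90, +90] degrees
    return struct.unpack(">i", raw)[0] * 90.0 / 0x7FFFFFFF

print(heading_deg(bytes.fromhex("71C2")))       # ~159.97 degrees (table: 159.9744)
print(roll_deg(bytes.fromhex("08B8")))          # ~3.41 degrees (table: 3.405814)
print(latitude_deg(bytes.fromhex("5595B66D")))  # ~60.1768 degrees (table: 60.17682296)
```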
© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).