Simultaneous Localization and Mapping (SLAM) and Data Fusion in Unmanned Aerial Vehicles: Recent Advances and Challenges
Abstract
1. Introduction
- What are the fundamental operational requirements for fully functional UAVs?
- What developments have been achieved in UAV localization in the last 10 years and what are some promising research directions for the next decade?
- How does SLAM achieve perception in UAVs? Is it feasible to attain human-level cognition and perception in UAVs using SLAM?
- What are the most recent SLAM techniques applied to UAVs and promising directions for further research?
- Why is data fusion a promising technique for solving object detection and scene perception in UAVs?
- Which sensors are used for object detection and scene perception in UAVs and how is multi-sensor data fusion and 3D point cloud analysis realized?
2. UAV Applications
- In agriculture and farming, UAVs enable field surveillance that allows farmers to remotely monitor crops, identify vegetation that is ready for harvest or damaged by pest infestation, and check frost levels in the fields [30]. UAVs can also be used for spraying pesticides and are considered safer and more precise than manual spraying. Fruit farms and orchard management with UAV image processing yield better outcomes [31].
- UAVs are used to survey forests, wildlife, and natural resources, and to measure air pollution.
- UAVs also find widespread applications in structural inspection, architectural surveying and mapping, aerial photography, content-based remote-sensing, image retrieval, and image localization [32,33,34]. Inspection of highway infrastructure and scheduling of repairs have also been accomplished with UAVs [35].
- As UAVs can carry cameras to any difficult-to-reach region, UAVs find widespread applications in search and rescue operations [36].
- UAVs can play a vital role in military warfare and border surveillance [39]. Some military-grade UAVs can remain airborne for weeks and cover approximately a million kilometers before their batteries need recharging [39]. Such UAVs can also assist firefighters by locating hot spots that indicate fires and transmitting live video via Wi-Fi or cellular networks.
- When embedded with miniaturized antennas and RF transceivers, UAVs can enhance wireless network coverage in areas of poor coverage where installing communication towers is not feasible [40]. This is especially beneficial in remote areas or areas affected by floods, earthquakes, or other disasters, as UAV-based radio access networks can be rapidly deployed in an ad-hoc manner [3].
- Know where the UAV is at a given time t, which is defined as localization [26]. The localization problem can be resolved to some extent through short-term maps and a trajectory computed using VO [16]. For instance, a radio localization device installed on the UAV can be paired with a GNSS receiver to provide a short-term trajectory and map.
- Estimate the surrounding environment in terms of coordinates and images, which is defined as map building. In SLAM, map building supports perception, planning, and control [45], briefly described as follows:
- Perception is the ability of a UAV to discern meaningful information from its sensors to understand the environment [46]. Both localization and map building enhance a UAV’s perception. As an autonomous entity, a UAV needs to understand its own state, location, the external environment, and the map. Perception leads to safe UAV path planning [47].
- Planning involves making decisions to achieve the trajectory objectives [48].
- Control refers to the ability of a UAV to execute the planned actions. Due to the accumulation of errors, the accuracy of a map degrades with time. Visual odometry is used to optimize the trajectory and map over longer timesteps for accuracy and performance. In SLAM, the localization and mapping tasks are interdependent [47].
Relevant Aspect | Currently | Expected with SLAM |
---|---|---|
GNSS vs. SLAM requirement | GNSS signal strength impacts localization. | Automatic localization in dynamic scenarios [26]. |
 | Low capabilities for path planning and UAV trajectory estimation. | Enhanced path planning and UAV trajectory estimation across dynamic terrains [44,47]. |
 | Costly to provide wireless communications and networking services. | SLAM and data fusion can facilitate UAV network services in an ad hoc manner [2,3]. |
 | Application scenarios lack robustness. | Robust UAV path planning in disaster relief missions [28,29]. |
 | GNSS coverage interrupted in rural zones. | SLAM and data fusion make use of computer vision [40,49]. |
 | Computing resources needed for ground target localization. | Dynamic localization and resource provisioning in response to sensor data [50,51]. |
UAV Applications | GNSS may not be available in natural disasters and response scenarios. | With SLAM, only targeted locations are considered for the UAV trajectory [8]. |
 | GNSS is suitable for remote proximity monitoring between fixed target locations. | Data fusion and SLAM support proximity monitoring between fixed and mobile targets using sensor-mounted UAVs [34]. |
 | Target landmark scaling is limited. | SLAM offers incremental scaling of target locations. |
 | Visual tracking of target landmarks is dependent on signal strength. | Diverse tracking enabled by visual odometry, photogrammetry, and sensor data sequences [32,35]. |
Sensing and Analysis Approaches | Data from all landmarks is collected. | Data from only the targeted landmarks is collected [31]. |
 | Massive bandwidth required for robust cooperative positioning of UAVs. | Efficient use of bandwidth through robust cooperative UAV positioning [52]. |
 | End-to-end scene perception. | Feasible in cluttered GPS-denied environments [53,54,55]. |
 | Depends on spatial orientation of target. | Low-cost sensors useful in GPS-denied environments [23,24]. |
Multi-UAV Trajectory Planning Services | Complex task allocation and data fusion. | Simplified collaborative SLAM [56,57,58]. |
 | Limited coordination in varying spatial configurations. | Seamless and cognitive neighbour-aware decision making [59,60]. |
 | Throughput maximization needs wireless power. | SLAM and data fusion optimize edge computing through synergy between vision and communications [61,62]. |
 | Filter-based UAV pose tracking. | Refined and non-linear pose tracking [63]. |
 | Precise positioning is difficult. | Precise, error-aware positioning under uncertainty [64]. |
 | Low-accuracy photogrammetry. | Dynamic and enhanced visual SLAM [65]. |
 | Data from UAVs hard to integrate. | SLAM facilitates simplified sensor data integration [66]. |
3D Characterization | Poor in low-texture environments. | Better due to visual-inertial SLAM and sensor fusion [67,68]. |
 | Camera-based target tracking. | Spatio-temporal observations from multiple sensors [20,69]. |
UAV Deployment & Management | UAV swarms hard to track. | Smart fusion of multi-sensor data for UAV localization [17]. |
 | UAV placement a priori in large paths. | Dynamic placement of UAVs for any trajectory [19,21]. |
 | Integrated sensor fusion with GNSS. | SLAM and data fusion geo-reference target landmarks and conserve computing resources using 3D point clouds [4,5]. |
Other Issues | Energy saving strategies are limited. | Flexible energy saving strategies [70]. |
 | Complex and static control techniques. | Flexible trajectory control techniques with enhanced FoV [71]. |
3. Multimodal Sensor Fusion
- The nature and format of data collected by each sensor [83]
- Sensor’s field of view [71]
- Synchronization times of various sensors [72] (a resampling sketch follows this list)
- Data capture frequency [27]
- Sensor resolution and data packet size [27]
- Data association and calibration to correlate data from different sources [25]
- Most appropriate fusion approach [72]
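In practice, the synchronization and capture-frequency factors above often reduce to resampling sensor streams onto a common clock before fusion. The following minimal sketch, with assumed sensor names, rates, and values chosen purely for illustration, aligns a low-rate range sensor to a high-rate IMU by linear interpolation over timestamps:

```python
import numpy as np

# Assumed streams: a 10 Hz range sensor and a 100 Hz IMU (illustrative values).
range_t = np.arange(0.0, 1.0, 0.1)       # range-sensor timestamps (s)
range_z = 5.0 + 0.1 * np.sin(range_t)    # range readings (m)
imu_t = np.arange(0.0, 1.0, 0.01)        # IMU timestamps (s)

# Resample the slow stream onto the fast stream's clock so that each fused
# sample pairs measurements referring to (approximately) the same instant.
range_on_imu_clock = np.interp(imu_t, range_t, range_z)

print(range_on_imu_clock.shape)          # (100,): one range value per IMU sample
```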
3.1. Challenges in Multimodal Sensor Fusion
- (i) Impact of uncertainty and imprecise data: Each sensor operation is based on a particular data acquisition framework, which might differ based on the target phenomenon of interest. Therefore, the weight given to each sensor outcome should be different. Furthermore, when merging data from different sensors, the fusion framework must reduce the impact of uncertainty and imprecise data to obtain a broader representation of the environment [78] (a minimal weighting sketch follows this list).
- (ii) Different data formats: Raw sensor data, arriving in a multitude of formats, are input to a fusion algorithm to obtain meaningful and relevant information. Therefore, the datasets need to be converted to a universal format that can be read by the fusion algorithm [90].
- (iii) Edge/cloud processing: While cloud computing requires relaying the data to a cloud server, edge computing enables the sensor data to be processed at the source of information. The choice between edge and cloud processing in UAV environments depends on latency, the amount of data to be processed, available bandwidth, overall cost, and the feasibility of deploying edge servers in remote aerial locations. Using edge computing to locally process the captured sensor data while using the cloud for overall SLAM analytics is a preferred approach [60].
- (iv) Collaboration and coordination among multiple UAVs: Signal unavailability or interference in an outdoor environment may corrupt the sensor data and induce errors [60]. Therefore, it may be required to coordinate sensor data from multiple UAVs for optimum target coverage with low power consumption. This approach reduces the possibility of missing target landmarks and prevents redundant coverage. Here, target coverage areas can be assigned to specific UAVs [61].
- (v) Noise and Bayesian inference: Inference methods such as Bayesian probabilistic fusion, evidential belief reasoning, and fuzzy logic rely on probability distributions to counter the effect of uncertainty in the sensor data. Sensor performance is affected by environmental factors, such as changes in the magnetic field or temperature, that add uncertainty to the measured data [76,91].
- (vi) Detecting overlap: Detecting overlap between individual sensor data can enhance complementary or cooperative sensor data fusion and reduce redundancy. Data from two or more sensors about the same target can be fused to enhance confidence in the data. For example, data pertaining to overlapping landmarks seen by visual sensors are considered redundant, while data on the same landmark captured by two sensors with different fields of view are considered cooperative [92].
- (vii) Reliability issues: Often, the sensor data are not just uncertain but also unreliable. One sensor may offset the (dis)advantage of another. As one unreliable sensor may cause incorrect fusion results, reliability evaluation is indispensable in UAV SLAM applications [93].
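As a concrete illustration of points (i) and (v), the sketch below fuses two noisy measurements of the same quantity by inverse-variance weighting, the simplest Bayesian fusion rule for independent Gaussian noise. The sensor types and variance values are assumptions for illustration, not taken from a specific UAV platform:

```python
import numpy as np

def fuse_gaussian(z1, var1, z2, var2):
    """Fuse two independent Gaussian measurements of the same quantity.
    The less noisy sensor automatically receives the larger weight."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)          # fused estimate is always less uncertain
    return fused, fused_var

# Assumed example: a noisy barometer and a precise LiDAR altimeter,
# both estimating altitude in metres.
alt, alt_var = fuse_gaussian(z1=52.3, var1=4.0, z2=50.8, var2=0.25)
print(f"fused altitude = {alt:.2f} m, variance = {alt_var:.3f}")
```

The fused result lands close to the LiDAR reading because its variance is sixteen times smaller, which is exactly the weighting behaviour point (i) calls for.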
3.2. UAV Data Fusion Requirements and Different Sensors Used in Practice
- (a) Monocular camera: The monocular camera is simple, cost-effective, and easy to operate. Moving with 6 degrees of freedom (DoF), it projects the 3D environment onto a 2D image plane [77,89,108]. However, the accuracy of the obtained map depends on the uncertainty associated with 6-DoF rigid body transformations [109]. Depth can be estimated from a single image, but translational movement introduces scale ambiguity. The collection of spatial points is calibrated with the camera's intrinsic and extrinsic parameters. In a given set of points, point A and point B may be connected, while point B and point C may not be [67]. Once the distances are known, the 3D scene structure can be retrieved from a single frame, eliminating the scale ambiguity. Monocular cameras estimate camera position, illumination changes, and scene structure using per-pixel depth estimates [110]. The depth estimate is obtained by comparing the latest frame with past frames or by comparing the input image with the updated map [111].
- (b) Stereo cameras: Stereo and RGB-D cameras measure the distance between objects and the camera to overcome the shortcomings of monocular cameras. A stereo camera comprises multiple synchronized monocular cameras [112]. Each pixel's 3D position is calculated from its disparity and the known physical baseline between the cameras [68,113]. Stereo cameras require substantial computational power for stereo matching and pixel-disparity calculation to generate a real-time depth map. Depth estimation with stereo cameras compares images from the left and right cameras. Stereo cameras are used both indoors and outdoors, but are limited by baseline length, camera resolution, and calibration accuracy [102] (a minimal depth-from-disparity sketch follows this list).
- (c) RGB-D cameras: These are superior in that they can measure distance and build a point cloud from a single image frame. The depth information provided by RGB-D cameras can be enhanced by combining VO and LiDAR sensors [69,102]. RGB-D cameras suffer from:
- Narrow measurement range [102].
- Susceptibility to noisy data [69].
- Small field of view [71].
- Susceptibility to interference [114].
- Inability to detect transparent material.
- Low accuracy in 3D reconstruction and scene understanding in dynamic, unstructured, complex, uncertain and large-scale environments [115].
- (d) Fish-eye camera: This is a wide-angle perspective camera used to create a fish-eye view. These cameras cover a very wide horizontal field of view, up to 180° or more. However, radial lens distortions cause nonlinear pixel mapping, making image-processing algorithms more complex [116].
- (e) Rolling shutter cameras: These are dynamic vision sensors that can produce up to one million frames per second [117]. They integrate camera video, motion sensors (GPS/IMU), and a 3D semantic map that depends on the environment. Edges, planes, and surface features of a target landmark can be captured by these cameras by exploiting enhanced feature dependencies and tracking joint edges. In large-scale scenarios, such as smart cities, rolling shutter cameras that capture geometric features such as points, lines, and planes are used to infer environmental structures [11].
- (f) LiDAR cameras: In SLAM, LiDAR or range-finding sensors generate mapping data based on visual feature matching and multi-view geometry to find overlapping points amidst dense data. To recognize already visited places among landmarks, compact point cloud descriptors are compared between two matching points [118]. LiDAR cameras are used for submap matching, as LiDAR scans and point clouds can be clustered into submaps [119]. LiDAR provides sparse but high-precision depth data, whereas cameras provide dense but low-precision depth data [105]. However, LiDAR does not capture images and is not well suited to sensing see-through surfaces or small underwater objects [120].
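To make the stereo geometry of item (b) concrete, the following minimal sketch converts a pixel disparity map into metric depth using the standard rectified-pinhole relation depth = focal length × baseline / disparity. The focal length, baseline, and disparity values are assumed for illustration:

```python
import numpy as np

# Assumed rectified stereo rig parameters (illustrative only).
FOCAL_PX = 700.0     # focal length in pixels
BASELINE_M = 0.12    # distance between the two cameras in metres

def disparity_to_depth(disparity_px):
    """Convert a disparity map (pixels) to a depth map (metres).
    Larger disparity means the point is closer to the cameras."""
    disparity_px = np.asarray(disparity_px, dtype=float)
    depth = np.full_like(disparity_px, np.inf)
    valid = disparity_px > 0         # zero disparity: unmatched or infinitely far
    depth[valid] = FOCAL_PX * BASELINE_M / disparity_px[valid]
    return depth

disp = np.array([[42.0, 0.0], [10.5, 84.0]])
print(disparity_to_depth(disp))      # 42 px -> 2.0 m, 10.5 px -> 8.0 m, 84 px -> 1.0 m
```

The relation also explains the baseline limitation noted above: with a short baseline, distant points produce sub-pixel disparities that the camera resolution cannot distinguish.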
4. Simultaneous Localization and Mapping (SLAM)
4.1. SLAM in UAV: Motivation and Requirements
- The UAV trajectory x = {x_0, x_1, …, x_n} describes how the UAV pose changes from time step 0 to n [124]. Each pose consists of the translation along the three axes and the rotation around the three axes, giving six degrees of freedom, i.e., a vector in R^6. The vision sensor variables are likewise represented as a six-dimensional vector, and the m landmarks as a vector whose dimension grows with m. A photograph of the landmarks in space is projected using the camera's projection model, utilizing state estimation and distortion estimation [74].
- The sensor observations detect specific landmarks at each trajectory position [94]. A group of point clouds can fall into the same category if the angular resolution of the sensor is smaller than θ, the angle between a landmark point and the UAV [102]. Considering all the sensor readings over a full rotation range, the depth values can be stored in a vector in which each element represents one possible category. Two such 1D vectors from sequential sensor scans can be concatenated to provide input for the fusion algorithm. For instance, an image of a fixed pixel size represents one acquisition of the sensor that detects and extracts the features in the surrounding environment [53].
- SLAM, as a state estimation problem, estimates the internal, hidden state variables from the sensor data. The UAV motion and the sensor observations may be linear or nonlinear, and the noise in the sensor data may be Gaussian or non-Gaussian [73].
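Written out, the state-estimation view above is conventionally expressed by a motion model and an observation model. The notation below (x for poses, u for control input, y for landmarks, z for observations, w and v for noise) is a reconstruction in standard SLAM form, not necessarily the paper's exact symbols:

```latex
\begin{aligned}
x_k &= f(x_{k-1}, u_k) + w_k &&\text{(motion model: new pose from previous pose and input)}\\
z_{k,j} &= h(x_k, y_j) + v_{k,j} &&\text{(observation model: measuring landmark } y_j \text{ from pose } x_k\text{)}
\end{aligned}
```

When f and h are linear and the noise terms are Gaussian, the Kalman filter of Section 4.2 applies directly; otherwise, the extended Kalman filter of Section 4.3 linearizes f and h around the current estimate.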
4.1.1. Single-UAV SLAM
- filtering: updating the current state at each time step, given the new observation; only the information available up to the current time is used, so the estimate is produced incrementally [95].
- smoothing: optimizing the whole trajectory based on the accumulated observations; using both past and future information to refine earlier states is called batch smoothing [95].
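In probabilistic terms, and keeping the reconstructed notation from Section 4.1, the two formulations target different posterior distributions:

```latex
\underbrace{p(x_n \mid z_{0:n})}_{\text{filtering: current state only}}
\qquad \text{versus} \qquad
\underbrace{p(x_{0:n}, y_{1:m} \mid z_{0:n})}_{\text{smoothing: full trajectory and map}}
```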
4.1.2. Collaborative or Multiple UAV SLAM
- Running multiple independent instances of single-UAV SLAM leads to challenges related to the spatial configuration of the fleet over time and its dependency on the number of UAVs [134].
- C-SLAM is robust to the loss of individual units [2].
- C-SLAM becomes more efficient as the number of UAVs increases, since coordination, communication range, and spatial distribution all improve [132].
- The bandwidth available in 5G networks allows information to be continuously and seamlessly transmitted by the UAVs as they move and their trajectory changes [61].
- Distribution of tasks reduces the overall computational cost of C-SLAM [58].
4.1.3. Limitations of C-SLAM
4.2. Search Space Reduction in Linear Systems Using Kalman Filter
4.3. Search Space Reduction in Nonlinear Systems Using Extended Kalman Filters
- The UAV position and the position covariance are predicted first [140].
- Handling a large point cloud depends on the non-linearity of the observations, and the linear approximation is valid only over a small range [70,80]. With multiple submaps [119], the EKF is re-linearized at every step, which introduces linearization error into the estimates [147]. Estimation of spatial uncertainty in the UAV trajectory uses the EKF to estimate the mean and covariance of the states [74].
- The EKF stores the state variables' mean and covariance in memory. If the number of landmarks is significantly larger than the number of UAV trajectory positions [148], the storage grows quadratically with the number of states. As the covariance matrix must also be stored and updated from previous observations, EKF SLAM is not suitable for large-scale scenarios. The complexity increases further when the state vector of the UAV map must be updated simultaneously based on the covariance of each of the m landmarks [146]. A minimal predict/update sketch follows this list.
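The following sketch illustrates one EKF predict/update cycle for a 2D UAV position with a range measurement to a single known landmark. The motion model, noise levels, landmark location, and measurement value are assumptions for illustration, and the state omits the landmark estimates and cross-covariances that a full EKF-SLAM state would carry:

```python
import numpy as np

# Assumed setup: the state is the 2D UAV position; one landmark at a known location.
landmark = np.array([10.0, 5.0])
x = np.array([0.0, 0.0])          # state estimate
P = np.eye(2) * 1.0               # state covariance
Q = np.eye(2) * 0.1               # motion noise covariance
R = np.array([[0.05]])            # range measurement noise variance

# --- Predict: random-walk motion driven by a commanded displacement u ---
u = np.array([1.0, 0.5])
x = x + u                         # f(x, u) = x + u, so the Jacobian F = I
P = P + Q

# --- Update: nonlinear range measurement z = ||landmark - x|| + noise ---
delta = landmark - x
pred_range = np.linalg.norm(delta)
H = (-delta / pred_range).reshape(1, 2)   # Jacobian of h(x) w.r.t. the position
z = np.array([10.2])                      # simulated noisy range reading
S = H @ P @ H.T + R                       # innovation covariance
K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
x = x + (K @ (z - pred_range)).ravel()
P = (np.eye(2) - K @ H) @ P

print("updated position:", x)
print("updated covariance:\n", P)
```

Because h is nonlinear, H must be recomputed at every step; this per-step linearization is exactly the source of the EKF error discussed in the second bullet above.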
5. Visual SLAM and Image Fusion
- Sensor data fusion: Acquisition, processing, and synchronization of sensor data and images [84].
- Visual odometry: Estimates the sensor movement between adjacent frames to generate a UAV trajectory map [99] (a minimal two-frame sketch follows this list).
- Place recognition and loop closing: Place recognition determines if a UAV has reached a previously visited position and loop closure reduces the accumulated drift [135].
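As an illustration of the visual odometry step, the sketch below estimates the relative camera rotation and (scale-ambiguous) translation between two consecutive frames using ORB features and the essential matrix. The file names and camera intrinsics are placeholders, and a real pipeline would add outlier handling, keyframe selection, and the loop-closing stage above:

```python
import cv2
import numpy as np

# Placeholder inputs: two consecutive grayscale frames and assumed intrinsics.
img1 = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)
K = np.array([[700.0, 0.0, 320.0],
              [0.0, 700.0, 240.0],
              [0.0, 0.0, 1.0]])

# Detect and match ORB features between the two frames.
orb = cv2.ORB_create(2000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)
matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)

pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

# Essential matrix with RANSAC, then recover the relative pose.
E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
_, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

print("rotation:\n", R)
print("unit translation (scale unknown in monocular VO):\n", t.ravel())
```

The recovered translation is a unit vector, which is the monocular scale ambiguity discussed in Section 3.2.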
5.1. Visual Odometry and Photogrammetry in UAV
5.2. Impact of Sensor Parameters on Accuracy of Visual 3D Reconstruction
5.3. Adding New Timeframes and Landmarks in the UAV Trajectory and Removing Older Ones
- The landmark is observed only in the newest timeframe. In this scenario:
- Add the new timeframe into the window, as well as its corresponding landmarks.
- Delete the oldest timeframe in the window, which may also delete the landmarks that only it observes.
- The landmark is seen in earlier timeframes but may not be seen in the future if the UAV avoids loop closure. To track the missing feature points, this landmark needs a priori information from the future pose estimation.
- The landmark is seen in earlier timeframes and may be seen again in the future. This landmark will be estimated later [75]. The observation of this landmark by a timeframe being removed can be discarded if the remaining timeframes did not see it. A bookkeeping sketch of this window management follows the list.
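The sliding-window bookkeeping described above can be sketched as follows; the window size and the data structures are illustrative assumptions rather than a specific SLAM implementation:

```python
from collections import deque

WINDOW_SIZE = 10                 # assumed number of timeframes kept in the window

window = deque()                 # timeframe ids, oldest first
observations = {}                # timeframe id -> set of observed landmark ids

def add_timeframe(frame_id, landmark_ids):
    """Add a new timeframe; once the window is full, drop the oldest one.
    Landmarks observed by no remaining timeframe are reported for removal."""
    window.append(frame_id)
    observations[frame_id] = set(landmark_ids)
    if len(window) > WINDOW_SIZE:
        old = window.popleft()
        old_lms = observations.pop(old)
        still_seen = set().union(*observations.values()) if observations else set()
        orphaned = old_lms - still_seen      # landmarks only the dropped frame saw
        return orphaned                      # caller marginalizes or deletes these
    return set()

for k in range(15):
    dropped = add_timeframe(k, {k, k + 1, k + 2})   # toy landmark track ids
    if dropped:
        print(f"frame {k}: removing orphaned landmarks {sorted(dropped)}")
```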
6. Open Issues and Future Research Directions in UAV-SLAM
6.1. Open Issues
- The number of landmark points to be scanned while a UAV navigates a trajectory runs into the thousands. The level of detail to be captured and the required point-cloud density impose constraints on the resulting data when targeting landmarks distributed over a large area. For example, to capture building facades and utility lines, the usable range of a UAV sensor varies with distance and may not yield a dense and detailed data representation [26].
- Loop closure impacts both localization and map building, enabling a UAV to identify scenes it has visited before. Sensors set additional constraints on the application environment when detecting similarities between images. The accumulated error can be reduced by calculating image similarities, and reliable loop detection eliminates cumulative errors, yielding globally consistent trajectories and maps [80] (a minimal similarity check is sketched after this list).
- Sensor fusion and navigation problems for UAVs are usually of large dimension because all landmark variables observed by the sensors between consecutive positions are considered. In visual SLAM, a single image contains hundreds of feature points, which greatly increases the feature-set dimension. Processing the resulting feature matrix is computationally very expensive. UAV-SLAM therefore requires mechanisms that limit the problem scale to maintain real-time calculation as the UAV navigates a trajectory. As UAV computing power is limited, calculating SLAM estimates at every moment bounds the calculation time per landmark, since the iterations cannot exceed a certain upper bound. In real-time UAV-SLAM, the computation time must not exceed a few milliseconds [134].
- The complexity increases when, instead of still images, the landmark data are extracted from continuous video at regular timeframes. With limited computation, some timeframes may be used only for localization and not contribute to mapping, or vice versa. The number of timeframes increases as the scale of the map grows, limiting SLAM accuracy in real-time computing. If there are N timeframes in a window and their positions are known in the vector space, then the previous timeframe estimates must remain unchanged during optimization, discarding the variables outside the window [74].
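For the loop-closure issue above, a common first step is a cheap appearance-similarity test between the current image and earlier keyframes before any expensive geometric verification. The sketch below scores similarity as the fraction of mutually matched ORB descriptors; the threshold is an assumed tuning value, and production systems typically use bag-of-words or learned descriptors instead:

```python
import cv2

def orb_similarity(img_a, img_b, n_features=1000):
    """Score appearance similarity of two grayscale images in [0, 1]
    as the fraction of ORB descriptors that cross-check match."""
    orb = cv2.ORB_create(n_features)
    _, des_a = orb.detectAndCompute(img_a, None)
    _, des_b = orb.detectAndCompute(img_b, None)
    if des_a is None or des_b is None:
        return 0.0
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_a, des_b)
    return len(matches) / float(min(len(des_a), len(des_b)))

# Assumed usage: keyframe pairs scoring above a tuned threshold become
# loop-closure candidates that are then verified geometrically.
LOOP_THRESHOLD = 0.35
# if orb_similarity(current_frame, old_keyframe) > LOOP_THRESHOLD: verify geometry
```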
6.2. Future Research Trends
6.2.1. UAV Data Fusion for Static and Dynamic SLAM
6.2.2. Deep Learning Based Data Fusion and SLAM
6.2.3. UAV Imagery Impacted by Altitude and Illumination Conditions
6.2.4. Opportunities for Improving the Statistical Dependence between Sensor Data Metrics and SLAM
6.2.5. Accurate and Precise Geo-Referencing of Landmark Data Using Google Maps
6.2.6. Reduction in Feature Space for Faster SLAM
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
1D | One Dimensional |
2D | Two Dimensional |
3D | Three Dimensional |
5G | Fifth Generation (Communication networks) |
6-DoF | Six Degrees of Freedom |
C-SLAM | Collaborative Simultaneous Localization and Mapping |
DARPA | Defense Advanced Research Projects Agency |
DGPS | Differential GPS |
EKF | Extended Kalman Filter |
FoD | Frame of Discernment |
GNSS | Global Navigation Satellite System |
GPS | Global Positioning System |
ICP | Iterative Closest Point |
IMU | Inertial Measurement Unit |
IR | Infrared |
KF | Kalman Filter |
LED | Light Emitting Diode |
LiDAR | Light Detection and Ranging |
MAP | Maximum a Posteriori |
MLE | Maximum Likelihood Estimation |
NLoS | Non-Line-of-Sight |
RBPF | Rao Blackwellized Particle Filter |
RGB-D | Red Green Blue-Depth |
RPAS | Remotely Piloted Aircraft System |
RSSI | Received Signal Strength Indicator |
RTK-GPS | Real-Time Kinematics based Global Positioning System |
SLAM | Simultaneous Localization and Mapping |
SLR | Single-Lens Reflex |
ToA | Time of Arrival |
UAV | Unmanned Aerial Vehicle |
VLC | Visible Light Communication |
VO | Visual Odometry |
Wi-Fi | Wireless Fidelity (generic term for IEEE 802.11 communication standard) |
References
- Xiao, Z.; Zeng, Y. An overview on integrated localization and communication towards 6G. Sci. China Inf. Sci. 2022, 65, 131301. [Google Scholar] [CrossRef]
- Information Resources Management Association. Unmanned Aerial Vehicles: Breakthroughs in Research and Practice; IGI Global: Hershey, PA, USA, 2019. [Google Scholar]
- Saad, W.; Bennis, M.; Mozaffari, M.; Lin, X. Wireless Communications and Networking for Unmanned Aerial Vehicles; Cambridge University Press: Cambridge, UK, 2020. [Google Scholar]
- Štroner, M.; Urban, R.; Seidl, J.; Reindl, T.; Brouček, J. Photogrammetry Using UAV-Mounted GNSS RTK: Georeferencing Strategies without GCPs. Remote Sens. 2021, 13, 1336. [Google Scholar] [CrossRef]
- Hariz, F.; Souifi, H.; Leblanc, R.; Bouslimani, Y.; Ghribi, M.; Langin, E.; Mccarthy, D. Direct Georeferencing 3D Points Cloud Map Based on SLAM and Robot Operating System. In Proceedings of the 2021 IEEE International Symposium on Robotic and Sensors Environments (ROSE), Virtual, 28–29 October 2021; pp. 1–6. [Google Scholar]
- Mendoza-Mendoza, J.; Gonzalez-Villela, V.; Aguilar-Ibañez, C.; Fonseca-Ruiz, L. Drones to Go: A Crash Course for Scientists and Makers, 1st ed.; Apress: Berkeley, CA, USA, 2021. [Google Scholar]
- Clark, D.G.; Ford, J.D.; Tabish, T. What role can unmanned aerial vehicles play in emergency response in the Arctic: A case study from Canada. PLoS ONE 2018, 13, e0205299. [Google Scholar] [CrossRef]
- Estrada, M.A.R.; Ndoma, A. The uses of unmanned aerial vehicles –UAV’s- (or drones) in social logistic: Natural disasters response and humanitarian relief aid. Procedia Comput. Sci. 2019, 149, 375–383. [Google Scholar] [CrossRef]
- Shirazi, M.S.; Patooghy, A.; Shisheie, R.; Haque, M.M. Application of Unmanned Aerial Vehicles in Smart Cities using Computer Vision Techniques. In Proceedings of the IEEE International Smart Cities Conference (ISC2), Piscataway, NJ, USA, 28 September–1 October 2020; pp. 1–7. [Google Scholar]
- Li, X.; Savkin, A.V. Networked Unmanned Aerial Vehicles for Surveillance and Monitoring: A Survey. Future Internet 2021, 13, 174. [Google Scholar] [CrossRef]
- Al-Turjman, F. Drones in Smart-Cities: Security and Performance; Elsevier: Amsterdam, The Netherlands, 2020. [Google Scholar]
- Mohamed, N.; Al-Jaroodi, J.; Jawhar, I.; Idries, A.; Mohammed, F. Unmanned aerial vehicles applications in future smart cities. Technol. Forecast. Soc. Chang. 2020, 153, 119293. [Google Scholar] [CrossRef]
- Eltokhey, M.W.; Khalighi, M.A.; Ghassemlooy, Z. UAV Location Optimization in MISO ZF Pre-coded VLC Networks. IEEE Wirel. Commun. Lett. 2021, 11, 28–32. [Google Scholar] [CrossRef]
- Guan, W.; Huang, L.; Wen, S.; Yan, Z.; Liang, W.; Yang, C.; Liu, Z. Robot Localization and Navigation Using Visible Light Positioning and SLAM Fusion. J. Lightwave Technol. 2021, 39, 7040–7051. [Google Scholar] [CrossRef]
- Chen, C.; Chen, S.; Hu, G.; Chen, B.; Chen, P.; Su, K. An auto-landing strategy based on pan-tilt based visual servoing for unmanned aerial vehicle in GNSS-denied environments. Aerosp. Sci. Technol. 2021, 116, 106891. [Google Scholar] [CrossRef]
- Naus, K.; Szymak, P.; Piskur, P.; Niedziela, M.; Nowak, A. Methodology for the Correction of the Spatial Orientation Angles of the Unmanned Aerial Vehicle Using Real Time GNSS, a Shoreline Image and an Electronic Navigational Chart. Energies 2021, 14, 2810. [Google Scholar] [CrossRef]
- Jiao, J.; Deng, Z.; Arain, Q.A.; Li, F. Smart Fusion of Multi-sensor Ubiquitous Signals of Mobile Device for Localization in GNSS-Denied Scenarios. Wirel. Pers. Commun. 2021, 116, 1507–1523. [Google Scholar] [CrossRef]
- Paredes, J.A.; Álvarez, F.J.; Hansard, M.; Rajab, K.Z. A Gaussian Process model for UAV localization using millimetre wave radar. Expert Syst. Appl. 2021, 185, 115563. [Google Scholar] [CrossRef]
- Famiglietti, N.A.; Cecere, G.; Grasso, C.; Memmolo, A.; Vicari, A. A Test on the Potential of a Low Cost Unmanned Aerial Vehicle RTK/PPK Solution for Precision Positioning. Sensors 2021, 21, 3882. [Google Scholar] [CrossRef] [PubMed]
- Shao, Z.; Cheng, G.; Li, D.; Huang, X.; Lu, Z.; Liu, J. Spatio-temporal-spectral-angular observation model that integrates observations from UAV and mobile mapping vehicle for better urban mapping. Geo-Spat. Inf. Sci. 2021, 1–15, ahead-of-print. [Google Scholar]
- Abdelfatah, R.; Moawad, A.; Alshaer, N.; Ismail, T. UAV Tracking System Using Integrated Sensor Fusion with RTK-GPS. In Proceedings of the 2021 International Mobile, Intelligent, and Ubiquitous Computing Conference (MIUCC), Cairo, Egypt, 26–27 May 2021; pp. 352–356. [Google Scholar]
- Valenti, F.; Giaquinto, D.; Musto, L.; Zinelli, A.; Bertozzi, M.; Broggi, A. Enabling Computer Vision-Based Autonomous Navigation for Unmanned Aerial Vehicles in Cluttered GPS-Denied Environments. In Proceedings of the IEEE 21st International Conference on Intelligent Transportation Systems (ITSC), Maui, HI, USA, 4–7 November 2018; pp. 3886–3891. [Google Scholar]
- Youn, W.; Ko, H.; Choi, H.; Choi, I.; Baek, J.H.; Myung, H. Collision-free Autonomous Navigation of A Small UAV Using Low-cost Sensors in GPS-denied Environments. Int. J. Control Autom. Syst. 2021, 19, 953–968. [Google Scholar] [CrossRef]
- Selje, R.; Al-Radaideh, A.; Sun, L. Efficient Three Dimensional Formation Control for Unmanned Aerial Vehicles in GPS-Denied Environments; American Automatic Control Council: New Orleans, LA, USA, 2021; pp. 3761–3766. [Google Scholar]
- Nguyen, T.M.; Cao, M.; Yuan, S.; Lyu, Y.; Nguyen, T.H.; Xie, L. VIRAL-Fusion: A Visual-Inertial-Ranging-Lidar Sensor Fusion Approach. IEEE Trans. Robot. 2021, 1–20. [Google Scholar] [CrossRef]
- Liu, T.; Zhao, H.; Yang, H.; Zheng, K.; Chatzimisios, P. Design and implementation of a novel real-time unmanned aerial vehicle localization scheme based on received signal strength. Trans. Emerg. Telecommun. Technol. 2021, 32, e4350. [Google Scholar] [CrossRef]
- Bultmann, S.; Quenzel, J.; Behnke, S. Real-Time Multi-Modal Semantic Fusion on Unmanned Aerial Vehicles. In Proceedings of the 2021 European Conference on Mobile Robots (ECMR), Bonn, Germany, 31 August–3 September 2021; pp. 1–8. [Google Scholar]
- Radzki, G.; Golinska-Dawson, P.; Bocewicz, G.; Banaszak, Z. Modelling Robust Delivery Scenarios for a Fleet of Unmanned Aerial Vehicles in Disaster Relief Missions. J. Intell. Robot. Syst. 2021, 103, 63. [Google Scholar] [CrossRef]
- Laksham, K. Unmanned aerial vehicle (drones) in public health: A SWOT analysis. J. Fam. Med. Prim. Care 2019, 8, 342–346. [Google Scholar] [CrossRef]
- Luo, S.; Liu, W.; Zhang, Y.; Wang, C.; Xi, X.; Nie, S.; Ma, D.; Lin, Y.; Zhou, G. Maize and soybean heights estimation from unmanned aerial vehicle (UAV) LiDAR data. Comput. Electron. Agric. 2021, 182, 106005. [Google Scholar] [CrossRef]
- Zhang, C.; Valente, J.; Kooistra, L.; Guo, L.; Wang, W. Orchard management with small unmanned aerial vehicles: A survey of sensing and analysis approaches. Precis. Agric. 2021, 22, 2007–2052. [Google Scholar] [CrossRef]
- Liu, F.; Yang, A. Application of gcForest to visual tracking using UAV image sequences. Multimed. Tools Appl. 2019, 78, 27933–27956. [Google Scholar] [CrossRef]
- Zheng, J.; Fu, H.; Li, W.; Wu, W.; Yu, L.; Yuan, S.; Tao, W.Y.W.; Pang, T.K.; Kanniah, K.D. Growing status observation for oil palm trees using Unmanned Aerial Vehicle (UAV) images. ISPRS J. Photogramm. Remote Sens. 2021, 173, 95–121. [Google Scholar] [CrossRef]
- Kim, D.; Liu, M.; Lee, S.; Kamat, V.R. Remote proximity monitoring between mobile construction resources using camera-mounted UAVs. Autom. Constr. 2019, 99, 168–182. [Google Scholar] [CrossRef]
- Śledź, S.; Ewertowski, M.W.; Piekarczyk, J. Applications of unmanned aerial vehicle (UAV) surveys and Structure from Motion photogrammetry in glacial and periglacial geomorphology. Geomorphology 2021, 378, 107620. [Google Scholar] [CrossRef]
- Gaffey, C.; Bhardwaj, A. Applications of Unmanned Aerial Vehicles in Cryosphere: Latest Advances and Prospects. Remote Sens. 2020, 12, 948. [Google Scholar] [CrossRef] [Green Version]
- Pakrooh, R.; Bohlooli, A. A Survey on Unmanned Aerial Vehicles-Assisted Internet of Things: A Service-Oriented Classification. Wirel. Pers. Commun. 2021, 119, 1541–1575. [Google Scholar] [CrossRef]
- Labib, N.S.; Brust, M.R.; Danoy, G.; Bouvry, P. The Rise of Drones in Internet of Things: A Survey on the Evolution, Prospects and Challenges of Unmanned Aerial Vehicles. IEEE Access 2021, 9, 115466–115487. [Google Scholar] [CrossRef]
- Gargalakos, M. The role of unmanned aerial vehicles in military communications: Application scenarios, current trends, and beyond. J. Def. Model. Simul. 2021, 154851292110316. [Google Scholar] [CrossRef]
- Cabrera-Castellanos, D.F.; Aragón-Zavala, A.; Castañón-Ávila, G. Closing Connectivity Gap: An Overview of Mobile Coverage Solutions for Not-Spots in Rural Zones. Sensors 2021, 21, 8037. [Google Scholar] [CrossRef]
- Niwa, H. Detection of organic tea farms based on the density of spider webs using aerial photography with an unmanned aerial vehicle (UAV). Landsc. Ecol. Eng. 2021, 17, 541–546. [Google Scholar] [CrossRef]
- Israr, A.; Abro, G.E.M.; Sadiq Ali Khan, M.; Farhan, M.; Bin Mohd Zulkifli, S.u.A. Internet of Things (IoT)-Enabled Unmanned Aerial Vehicles for the Inspection of Construction Sites: A Vision and Future Directions. Math. Probl. Eng. 2021, 2021, 9931112. [Google Scholar] [CrossRef]
- Wei, T.; Shangguan, D.; Yi, S.; Ding, Y. Characteristics and controls of vegetation and diversity changes monitored with an unmanned aerial vehicle (UAV) in the foreland of the Urumqi Glacier No. 1, Tianshan, China. Sci. Total Environ. 2021, 771, 145433. [Google Scholar] [CrossRef] [PubMed]
- Baca, T.; Stepan, P.; Spurny, V.; Hert, D.; Penicka, R.; Saska, M.; Thomas, J.; Loianno, G.; Kumar, V. Autonomous landing on a moving vehicle with an unmanned aerial vehicle. J. Field Robot. 2019, 36, 874–891. [Google Scholar] [CrossRef]
- Al Said, N.; Gorbachev, Y.; Avdeenko, A. An unmanned aerial vehicles navigation system on the basis of pattern recognition applications—Review of implementation options and prospects for development. Softw. Pract. Exp. 2021, 51, 1509–1517. [Google Scholar] [CrossRef]
- Liu, C.; Zhao, J.; Sun, N.; Yang, Q.; Wang, L. IT-SVO: Improved Semi-Direct Monocular Visual Odometry Combined with JS Divergence in Restricted Mobile Devices. Sensors 2021, 21, 2025. [Google Scholar] [CrossRef]
- Muñoz, J.; López, B.; Quevedo, F.; Monje, C.A.; Garrido, S.; Moreno, L.E. Multi UAV Coverage Path Planning in Urban Environments. Sensors 2021, 21, 7365. [Google Scholar] [CrossRef]
- Couturier, A.; Akhloufi, M.A. A review on absolute visual localization for UAV. Robot. Auton. Syst. 2021, 135, 103666. [Google Scholar] [CrossRef]
- Belmonte, L.M.; Morales, R.; Fernández-Caballero, A. Computer Vision in Autonomous Unmanned Aerial Vehicles—A Systematic Mapping Study. Appl. Sci. 2019, 9, 3196. [Google Scholar] [CrossRef] [Green Version]
- Le, N.P.; Tran, L.C.; Huang, X.; Dutkiewicz, E.; Ritz, C.; Phung, S.L.; Bouzerdoum, A.; Franklin, D.; Hanzo, L. Energy-Harvesting Aided Unmanned Aerial Vehicles for Reliable Ground User Localization and Communications Under Lognormal-Nakagami- m Fading Channels. IEEE Trans. Veh. Technol. 2021, 70, 1632–1647. [Google Scholar] [CrossRef]
- He, Z.; Yao, L. Research on an Obstacle Avoidance Method for UAV. Math. Probl. Eng. 2021, 2021, 3798990. [Google Scholar] [CrossRef]
- Wang, D.; Lian, B.; Tang, C. UGV-UAV robust cooperative positioning algorithm with object detection. IET Intell. Transp. Syst. 2021, 15, 851–862. [Google Scholar] [CrossRef]
- Stampa, M.; Sutorma, A.; Jahn, U.; Thiem, J.; Wolff, C.; Röhrig, C. Maturity Levels of Public Safety Applications using Unmanned Aerial Systems: A Review. J. Intell. Robot. Syst. 2021, 103, 16. [Google Scholar] [CrossRef] [PubMed]
- Ahmad, S.; Sunberg, Z.N.; Humbert, J.S. End-to-End Probabilistic Depth Perception and 3D Obstacle Avoidance using POMDP. J. Intell. Robot. Syst. 2021, 103, 33. [Google Scholar] [CrossRef]
- Martinez-Carranza, J.; Rascon, C. A Review on Auditory Perception for Unmanned Aerial Vehicles. Sensors 2020, 20, 7276. [Google Scholar] [CrossRef]
- Bestaoui Sebbane, Y. Multi-UAV Planning and Task Allocation; CRC Press: Boca Raton, FL, USA, 2020. [Google Scholar]
- Morales, J.J.; Khalife, J.J.; Kassas, Z.M. Information Fusion Strategies for Collaborative Inertial Radio SLAM. IEEE Trans. Intell. Transp. Syst. 2021, 1–18. [Google Scholar] [CrossRef]
- Jang, Y.; Oh, C.; Lee, Y.; Kim, H.J. Multirobot Collaborative Monocular SLAM Utilizing Rendezvous. IEEE Trans. Robot. 2021, 37, 1469–1486. [Google Scholar] [CrossRef]
- Li, R.; Li, X.; Xu, J.; Jiang, F.; Jia, Z.; Shao, D.; Pan, L.; Liu, X. Energy-aware decision-making for dynamic task migration in MEC-based unmanned aerial vehicle delivery system. Concurr. Comput. 2021, 33, e6092. [Google Scholar] [CrossRef]
- Ul Hasan, N.; Ejaz, W.; Zghaibeh, M.; Ejaz, N.; Alzahrani, B. On seamless and high-bandwidth connectivity for cognitive multi-unmanned aerial vehicle-assisted networks. Trans. Emerg. Telecommun. Technol. 2021, 32, e3979. [Google Scholar] [CrossRef]
- Liu, Z.; Zhan, C.; Cui, Y.; Wu, C.; Hu, H. Robust Edge Computing in UAV Systems via Scalable Computing and Cooperative Computing. IEEE Wirel. Commun. 2021, 28, 36–42. [Google Scholar] [CrossRef]
- Chen, Q.; Zhu, H.; Yang, L.; Chen, X.; Pollin, S.; Vinogradov, E. Edge Computing Assisted Autonomous Flight for UAV: Synergies between Vision and Communications. IEEE Commun. Mag. 2021, 59, 28–33. [Google Scholar] [CrossRef]
- Wang, J.; Wu, N.; Lu, X.; Zhao, W.X.; Feng, K. Deep Trajectory Recovery with Fine-Grained Calibration using Kalman Filter. IEEE Trans. Knowl. Data Eng. 2019, 33, 921–934. [Google Scholar] [CrossRef]
- Xu, Z.; Rong, Z.; Wu, Y. A survey: Which features are required for dynamic visual simultaneous localization and mapping? Vis. Comput. Ind. Biomed. Art 2021, 4, 20. [Google Scholar] [CrossRef] [PubMed]
- Hu, X.; Lang, J. DOE-SLAM: Dynamic Object Enhanced Visual SLAM. Sensors 2021, 21, 3091. [Google Scholar] [CrossRef] [PubMed]
- Nwadiugwu, W.P.; Kim, S.H.; Kim, D.S. Precise-Point-Positioning Estimations for Recreational Drones Using Optimized Cubature-Extended Kalman Filtering. IEEE Access 2021, 9, 134369–134383. [Google Scholar] [CrossRef]
- Sahoo, B.; Biglarbegian, M.; Melek, W. Monocular Visual Inertial Direct SLAM with Robust Scale Estimation for Ground Robots/Vehicles. Robotics 2021, 10, 23. [Google Scholar] [CrossRef]
- Slowak, P.; Kaniewski, P. Stratified Particle Filter Monocular SLAM. Remote Sens. 2021, 13, 3233. [Google Scholar] [CrossRef]
- Naudet-Collette, S.; Melbouci, K.; Gay-Bellile, V.; Ait-Aider, O.; Dhome, M. Constrained RGBD-SLAM. Robotica 2020, 39, 277–290. [Google Scholar] [CrossRef]
- López, A.; Jurado, J.M.; Ogayar, C.J.; Feito, F.R. An optimized approach for generating dense thermal point clouds from UAV-imagery. ISPRS J. Photogramm. Remote Sens. 2021, 182, 78–95. [Google Scholar] [CrossRef]
- Li, R.; Yang, Q.; Zhao, W.; Fang, H. Collision-Free Trajectory Generation for Multiple UAVs with Sensing Constraints; Technical Committee on Control Theory, Chinese Association of Automation: Beijing, China, 2021; pp. 5592–5597. [Google Scholar]
- Yang, J.C.; Lin, C.J.; You, B.Y.; Yan, Y.L.; Cheng, T.H. RTLIO: Real-Time LiDAR-Inertial Odometry and Mapping for UAVs. Sensors 2021, 21, 3955. [Google Scholar] [CrossRef]
- Pessanha Santos, N.; Lobo, V.; Bernardino, A. Unscented Particle Filters with Refinement Steps for UAV Pose Tracking. J. Intell. Robot. Syst. 2021, 102, 52. [Google Scholar] [CrossRef]
- Terblanche, J.; Claassens, S.; Fourie, D. Multimodal Navigation-Affordance Matching for SLAM. IEEE Robot. Autom. Lett. 2021, 6, 7728–7735. [Google Scholar] [CrossRef]
- Kokunko, Y.G.; Krasnov, D.V.; Utkin, A.V. Two Methods for Synthesis of State and Disturbance Observers for an Unmanned Aerial Vehicle. Autom. Remote Control 2021, 82, 1426–1441. [Google Scholar] [CrossRef]
- Deng, Z.; Wang, J. A new evidential similarity measurement based on Tanimoto measure and its application in multi-sensor data fusion. Eng. Appl. Artif. Intell. 2021, 104, 104380. [Google Scholar] [CrossRef]
- Ali, W.; Liu, P.; Ying, R.; Gong, Z. 6-DOF Feature based LIDAR SLAM using ORB Features from Rasterized Images of 3D LIDAR Point Cloud. arXiv 2021, arXiv:2103.10678. [Google Scholar]
- Papaioannou, S.; Kolios, P.; Theocharides, T.; Panayiotou, C.G.; Polycarpou, M.M. Towards Automated 3D Search Planning for Emergency Response Missions. J. Intell. Robot. Syst. 2021, 103, 1–19. [Google Scholar] [CrossRef]
- Guo, K.; Ye, Z.; Liu, D.; Peng, X. UAV flight control sensing enhancement with a data-driven adaptive fusion model. Reliab. Eng. Syst. Saf. 2021, 213, 107654. [Google Scholar] [CrossRef]
- Sobczak, Ł.; Filus, K.; Domański, A.; Domańska, J. LiDAR Point Cloud Generation for SLAM Algorithm Evaluation. Sensors 2021, 21, 3313. [Google Scholar] [CrossRef]
- Ali, W.; Liu, P.; Ying, R.; Gong, Z. A Feature Based Laser SLAM Using Rasterized Images of 3D Point Cloud. IEEE Sens. J. 2021, 21, 24422–24430. [Google Scholar] [CrossRef]
- Chang, A.; Jung, J.; Yeom, J.; Landivar, J. 3D Characterization of Sorghum Panicles Using a 3D Point Cloud Derived from UAV Imagery. Remote Sens. 2021, 13, 282. [Google Scholar] [CrossRef]
- Lin, R.; Xu, J.; Zhang, J. GLO-SLAM: A slam system optimally combining GPS and LiDAR odometry. Ind. Robot 2021, 48, 726–736. [Google Scholar] [CrossRef]
- Huang, B.; Feng, P.; Zhang, J.; Yu, D.; Wu, Z. A Novel Positioning Module and Fusion Algorithm for Unmanned Aerial Vehicle Monitoring. IEEE Sens. J. 2021, 21, 23006–23023. [Google Scholar] [CrossRef]
- Milijas, R.; Markovic, L.; Ivanovic, A.; Petric, F.; Bogdan, S. A Comparison of LiDAR-based SLAM Systems for Control of Unmanned Aerial Vehicles. In Proceedings of the IEEE 2021 International Conference on Unmanned Aircraft Systems (ICUAS), Athens, Greece, 15–18 June 2021; pp. 1148–1154. [Google Scholar]
- Kim, J.; Guivant, J.; Sollie, M.L.; Bryne, T.H.; Johansen, T.A. Compressed pseudo-SLAM: Pseudorange-integrated compressed simultaneous localisation and mapping for unmanned aerial vehicle navigation. J. Navig. 2021, 74, 1091–1103. [Google Scholar] [CrossRef]
- Xie, Z.; Liu, J.; Sheng, M.; Zhao, N.; Li, J. Exploiting Aerial Computing for Air-to-Ground Coverage Enhancement. IEEE Wirel. Commun. 2021, 28, 50–58. [Google Scholar] [CrossRef]
- Casado, R.; Bermúdez, A. A Simulation Framework for Developing Autonomous Drone Navigation Systems. Electronics 2021, 10, 7. [Google Scholar] [CrossRef]
- Karam, S.; Lehtola, V.; Vosselman, G. Simple loop closing for continuous 6DOF LIDAR & IMU graph SLAM with planar features for indoor environments. ISPRS J. Photogramm. Remote Sens. 2021, 181, 413–426. [Google Scholar]
- Luo, Y.; Li, Y.; Li, Z.; Shuang, F. MS-SLAM: Motion State Decision of Keyframes for UAV-Based Vision Localization. IEEE Access 2021, 9, 67667–67679. [Google Scholar] [CrossRef]
- Wondosen, A.; Jeong, J.S.; Kim, S.K.; Debele, Y.; Kang, B.S. Improved Attitude and Heading Accuracy with Double Quaternion Parameters Estimation and Magnetic Disturbance Rejection. Sensors 2021, 21, 5475. [Google Scholar] [CrossRef]
- Chen, S.; Chen, H.; Chang, C.W.; Wen, C.Y. Multilayer Mapping Kit for Autonomous UAV Navigation. IEEE Access 2021, 9, 31493–31503. [Google Scholar] [CrossRef]
- Li, Z.; Giorgetti, A.; Kandeepan, S. Multiple Radio Transmitter Localization via UAV-Based Mapping. IEEE Trans. Veh. Technol. 2021, 70, 8811–8822. [Google Scholar] [CrossRef]
- Shao, P.; Mo, F.; Chen, Y.; Ding, N.; Huang, R. Monocular Object SLAM using Quadrics and Landmark Reference Map for Outdoor UAV Applications. In Proceedings of the 2021 IEEE International Conference on Real-time Computing and Robotics (RCAR), Xining, China, 15–19 July 2021; pp. 1195–1201. [Google Scholar]
- Bonyan Khamseh, H.; Ghorbani, S.; Janabi-Sharifi, F. Unscented Kalman filter state estimation for manipulating unmanned aerial vehicles. Aerosp. Sci. Technol. 2019, 92, 446–463. [Google Scholar] [CrossRef]
- Cadena, C.; Carlone, L.; Carrillo, H.; Latif, Y.; Scaramuzza, D.; Neira, J.; Reid, I.; Leonard, J.J. Past, Present, and Future of Simultaneous Localization and Mapping: Toward the Robust-Perception Age. IEEE Trans. Robot. 2016, 32, 1309–1332. [Google Scholar] [CrossRef] [Green Version]
- Lee, J.; Park, S.Y. PLF-VINS: Real-Time Monocular Visual-Inertial SLAM With Point-Line Fusion and Parallel-Line Fusion. IEEE Robot. Autom. Lett. 2021, 6, 7033–7040. [Google Scholar] [CrossRef]
- Pisciotta, A.; Vitale, G.; Scudero, S.; Martorana, R.; Capizzi, P.; D’Alessandro, A. A Lightweight Prototype of a Magnetometric System for Unmanned Aerial Vehicles. Sensors 2021, 21, 4691. [Google Scholar] [CrossRef] [PubMed]
- Yu, H.; Li, G.; Zhang, W.; Huang, Q.; Du, D.; Tian, Q.; Sebe, N. The Unmanned Aerial Vehicle Benchmark: Object Detection, Tracking and Baseline. Int. J. Comput. Vis. 2020, 128, 1141–1159. [Google Scholar] [CrossRef]
- Banfi, F.; Mandelli, A. Computer Vision Meets Image Processing and UAS PhotoGrammetric Data Integration: From HBIM to the eXtended Reality Project of Arco della Pace in Milan and Its Decorative Complexity. J. Imaging 2021, 7, 118. [Google Scholar] [CrossRef]
- Akbari, Y.; Almaadeed, N.; Al-maadeed, S.; Elharrouss, O. Applications, databases and open computer vision research from drone videos and images: A survey. Artif. Intell. Rev. 2021, 54, 3887–3938. [Google Scholar] [CrossRef]
- Wei, H.; Zhang, T.; Zhang, L. GMSK-SLAM: A new RGB-D SLAM method with dynamic areas detection towards dynamic environments. Multimed. Tools Appl. 2021, 80, 31729–31751. [Google Scholar] [CrossRef]
- Debeunne, C.; Vivet, D. A Review of Visual-LiDAR Fusion based Simultaneous Localization and Mapping. Sensors 2020, 20, 2068. [Google Scholar] [CrossRef] [Green Version]
- Riabukha, V.P. Radar Surveillance of Unmanned Aerial Vehicles (Review). Radioelectron. Commun. Syst. 2020, 63, 561–573. [Google Scholar] [CrossRef]
- Helgesen, H.H.; Bryne, T.H.; Wilthil, E.F.; Johansen, T.A. Camera-Based Tracking of Floating Objects using Fixed-wing UAVs. J. Intell. Robot. Syst. 2021, 102, 80. [Google Scholar] [CrossRef]
- Xie, X.; Yang, T.; Ning, Y.; Zhang, F.; Zhang, Y. A Monocular Visual Odometry Method Based on Virtual-Real Hybrid Map in Low-Texture Outdoor Environment. Sensors 2021, 21, 3394. [Google Scholar] [CrossRef] [PubMed]
- Khuc, T.; Nguyen, T.A.; Dao, H.; Catbas, F.N. Swaying displacement measurement for structural monitoring using computer vision and an unmanned aerial vehicle. Meas. J. Int. Meas. Confed. 2020, 159, 107769. [Google Scholar] [CrossRef]
- Medeiros, R.A.; Pimentel, G.A.; Garibotti, R. An Embedded Quaternion-Based Extended Kalman Filter Pose Estimation for Six Degrees of Freedom Systems. J. Intell. Robot. Syst. 2021, 102, 18. [Google Scholar] [CrossRef]
- Lo, L.Y.; Yiu, C.H.; Tang, Y.; Yang, A.S.; Li, B.; Wen, C.Y. Dynamic Object Tracking on Autonomous UAV System for Surveillance Applications. Sensors 2021, 21, 7888. [Google Scholar] [CrossRef] [PubMed]
- Hai, J.; Hao, Y.; Zou, F.; Lin, F.; Han, S. A Visual Navigation System for UAV under Diverse Illumination Conditions. Appl. Artif. Intell. 2021, 1–21. [Google Scholar] [CrossRef]
- Liu, W.; Mohta, K.; Loianno, G.; Daniilidis, K.; Kumar, V. Semi-dense visual-inertial odometry and mapping for computationally constrained platforms. Auton. Robot. 2021, 45, 773–787. [Google Scholar] [CrossRef]
- Huang, X.; Dong, X.; Ma, J.; Liu, K.; Ahmed, S.; Lin, J.; Qiu, B. The Improved A* Obstacle Avoidance Algorithm for the Plant Protection UAV with Millimeter Wave Radar and Monocular Camera Data Fusion. Remote Sens. 2021, 13, 3364. [Google Scholar] [CrossRef]
- Leong, W.L.; Wang, P.; Huang, S.; Ma, Z.; Yang, H.; Sun, J.; Zhou, Y.; Abdul Hamid, M.R.; Srigrarom, S.; Teo, R. Vision-Based Sense and Avoid with Monocular Vision and Real-Time Object Detection for UAVs. In Proceedings of the 2021 International Conference on Unmanned Aircraft Systems (ICUAS), Athens, Greece, 15–18 June 2021; pp. 1345–1354. [Google Scholar]
- Zhou, Y.; Gallego, G.; Shen, S. Event-Based Stereo Visual Odometry. IEEE Trans. Robot. 2021, 37, 1433–1450. [Google Scholar] [CrossRef]
- Cazzato, D.; Cimarelli, C.; Sanchez-Lopez, J.L.; Voos, H.; Leo, M. A Survey of Computer Vision Methods for 2D Object Detection from Unmanned Aerial Vehicles. J. Imaging 2020, 6, 78. [Google Scholar] [CrossRef]
- Seki, H.; Kawai, K.; Hikizu, M. Localization System for Indoor Mobile Robot Using Large Square-Shaped Reflective Marker. Int. J. Autom. Technol. 2021, 15, 182–190. [Google Scholar] [CrossRef]
- Lao, Y.; Ait-Aider, O.; Bartoli, A. Solving Rolling Shutter 3D Vision Problems using Analogies with Non-rigidity. Int. J. Comput. Vis. 2021, 129, 100–122. [Google Scholar] [CrossRef]
- Zhou, H.; Yao, Z.; Lu, M. Lidar/UWB Fusion Based SLAM With Anti-Degeneration Capability. IEEE Trans. Veh. Technol. 2021, 70, 820–830. [Google Scholar] [CrossRef]
- Zhan, Z.; Jian, W.; Li, Y.; Yue, Y. A SLAM Map Restoration Algorithm Based on Submaps and an Undirected Connected Graph. IEEE Access 2021, 9, 12657–12674. [Google Scholar] [CrossRef]
- Qian, J.; Chen, K.; Chen, Q.; Yang, Y.; Zhang, J.; Chen, S. Robust Visual-Lidar Simultaneous Localization and Mapping System for UAV. IEEE Geosci. Remote Sens. Lett. 2021, 19, 6502105. [Google Scholar] [CrossRef]
- Park, C.; Moghadam, P.; Williams, J.; Kim, S.; Sridharan, S.; Fookes, C. Elasticity Meets Continuous-Time: Map-Centric Dense 3D LiDAR SLAM. IEEE Trans. Robot. 2021, 1–20. [Google Scholar] [CrossRef]
- James, M.R.; Chandler, J.H.; Eltner, A.; Fraser, C.; Miller, P.E.; Mills, J.P.; Noble, T.; Robson, S.; Lane, S.N. Guidelines on the use of structure-from-motion photogrammetry in geomorphic research. Earth Surf. Process. Landf. 2019, 44, 2081–2084. [Google Scholar] [CrossRef]
- Adjidjonu, D.; Burgett, J. Assessing the Accuracy of Unmanned Aerial Vehicles Photogrammetric Survey. Int. J. Constr. Educ. Res. 2021, 17, 85–96. [Google Scholar] [CrossRef]
- Costanzo, A.; Pisciotta, A.; Pannaccione Apa, M.I.; Bongiovanni, S.; Capizzi, P.; D’Alessandro, A.; Falcone, S.; La Piana, C.; Martorana, R. Integrated use of unmanned aerial vehicle photogrammetry and terrestrial laser scanning to support archaeological analysis: The Acropolis of Selinunte case (Sicily, Italy). Archaeol. Prospect. 2021, 28, 153–165. [Google Scholar] [CrossRef]
- Chen, C.; Wu, X.; Bo, Y.; Chen, Y.; Liu, Y.; Alsaadi, F.E. SARSA in extended Kalman Filter for complex urban environments positioning. Int. J. Syst. Sci. 2021, 52, 3044–3059. [Google Scholar] [CrossRef]
- Burdziakowski, P.; Bobkowska, K. UAV Photogrammetry under Poor Lighting Conditions—Accuracy Considerations. Sensors 2021, 21, 3531. [Google Scholar] [CrossRef]
- Lin, S.; Jin, L.; Chen, Z. Real-Time Monocular Vision System for UAV Autonomous Landing in Outdoor Low-Illumination Environments. Sensors 2021, 21, 6226. [Google Scholar] [CrossRef] [PubMed]
- Walter, C.A.; Braun, A.; Fotopoulos, G. Impact of three-dimensional attitude variations of an unmanned aerial vehicle magnetometry system on magnetic data quality. Geophys. Prospect. 2019, 67, 465–479. [Google Scholar] [CrossRef]
- Xu, C.; Yin, C.; Huang, D.; Han, W.; Wang, D. 3D target localization based on multi–unmanned aerial vehicle cooperation. Meas. Control 2021, 54, 895–907. [Google Scholar] [CrossRef]
- Bresson, G.; Alsayed, Z.; Yu, L.; Glaser, S. Simultaneous Localization and Mapping: A Survey of Current Trends in Autonomous Driving. IEEE Trans. Intell. Veh. 2017, 2, 194–220. [Google Scholar] [CrossRef] [Green Version]
- Saeedi, S.; Trentini, M.; Seto, M.; Li, H. Multiple-Robot Simultaneous Localization and Mapping: A Review. J. Field Robot. 2016, 33, 3–46. [Google Scholar] [CrossRef]
- Lei, M.; Zhang, X.; Yu, B.; Fowler, S.; Yu, B. Throughput maximization for UAV-assisted wireless powered D2D communication networks with a hybrid time division duplex/frequency division duplex scheme. Wirel. Netw. 2021, 27, 2147–2157. [Google Scholar] [CrossRef]
- Dang, T.; Tranzatto, M.; Khattak, S.; Mascarich, F.; Alexis, K.; Hutter, M. Graph-based subterranean exploration path planning using aerial and legged robots. J. Field Robot. 2020, 37, 1363–1388. [Google Scholar] [CrossRef]
- Hu, J.; Zhang, H.; Li, Z.; Zhao, C.; Xu, Z.; Pan, Q. Object traversing by monocular UAV in outdoor environment. Asian J. Control 2021, 23, 2766–2775. [Google Scholar] [CrossRef]
- Karakostas, I.; Mygdalis, V.; Tefas, A.; Pitas, I. Occlusion detection and drift-avoidance framework for 2D visual object tracking. Signal Process. Image Commun. 2021, 90. [Google Scholar] [CrossRef]
- Cattaneo, D.; Vaghi, M.; Valada, A. LCDNet: Deep Loop Closure Detection and Point Cloud Registration for LiDAR SLAM. IEEE Trans. Robot. 2021, 1–20. [Google Scholar] [CrossRef]
- Bi, S.; Ma, L.; Shen, T.; Xu, Y.; Li, F. Neural network assisted Kalman filter for INS/UWB integrated seamless quadrotor localization. PeerJ Comput. Sci. 2021, 7, e630. [Google Scholar] [CrossRef] [PubMed]
- Chang, Y.; Wang, Y.; Shen, Y.; Ji, C. A new fuzzy strong tracking cubature Kalman filter for INS/GNSS. GPS Solut. 2021, 25, 120. [Google Scholar] [CrossRef]
- Gośliński, J.; Giernacki, W.; Królikowski, A. A Nonlinear Filter for Efficient Attitude Estimation of Unmanned Aerial Vehicle (UAV). J. Intell. Robot. Syst. 2019, 95, 1079–1095. [Google Scholar] [CrossRef] [Green Version]
- Sadeghzadeh-Nokhodberiz, N.; Can, A.; Stolkin, R.; Montazeri, A. Dynamics-Based Modified Fast Simultaneous Localization and Mapping for Unmanned Aerial Vehicles With Joint Inertial Sensor Bias and Drift Estimation. IEEE Access 2021, 9, 120247–120260. [Google Scholar] [CrossRef]
| Symbol | Definition |
|---|---|
|  | UAV trajectory at different time steps |
|  | UAV position along the three axes, together with the heading angle |
|  | Approximate trajectory while capturing landmark data |
|  | Data for UAV 1 from time t = 0 to t = n |
|  | Gaussian-distributed landmark data (state variables) at time n |
|  | Set containing information about the UAV trajectory, position, and landmarks |
| m | Number of landmarks in a scene |
|  | Sensor data |
|  | UAV trajectory; describes how the state changes between time steps up to n |
|  | Position in the vector space; an n-dimensional vector used for feature detection |
|  | Noise in the sensor data |
|  | Function that describes the SLAM process |
|  | Data from time 0 to n; used to estimate the current state distribution at time n |
|  | Estimated likelihood of the sensor data given a UAV trajectory (combined with the prior in the posterior equation following this table) |
|  | Prior estimated probabilities of UAV localization |
|  | Sequences of the UAV trajectory and odometry dataset |
|  | Timeframes at which landmark data are collected or discarded |
|  | Pixel data of a given size captured by RGB cameras |
|  | Distance between a landmark point and the UAV |
| s | Distance traversed between two consecutive coordinates |
|  | Orientation difference between two consecutive sensor scans |
|  | Error in the estimated movement between two frames |
|  | Angle between a landmark point and the UAV |
|  | Transformation that learns the changes in the traversed distance s and the orientation difference |
|  | Sensor data |
|  | Five landmarks in a scene |
|  | Set of sensor data that captures landmark data from the observation space |
|  | Relative errors from the sensor data |
|  | Position covariance |
|  | Sparsity matrix |
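The notation above follows the standard probabilistic formulation of SLAM. To make the relationship between the likelihood, prior, and posterior rows explicit, the full SLAM posterior can be written as below; the symbols $x_{0:n}$ (trajectory), $m$ (landmark map), $z_{0:n}$ (sensor data), and $u_{0:n}$ (odometry) are conventional choices standing in for the symbols in the table, not notation defined by this survey.

```latex
% Full SLAM posterior over the trajectory x_{0:n} and landmark map m,
% conditioned on sensor data z_{0:n} and odometry u_{0:n}.
% A sketch of the standard factorization in conventional notation,
% not an equation reproduced from this survey.
p(x_{0:n}, m \mid z_{0:n}, u_{0:n})
  \propto
  \underbrace{p(z_{0:n} \mid x_{0:n}, m)}_{\text{sensor-data likelihood}}
  \;
  \underbrace{p(x_{0:n}, m \mid u_{0:n})}_{\text{prior from odometry}}
```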
| Single-UAV SLAM | Multiple-UAV SLAM or C-SLAM |
|---|---|
| Usually one UAV is in action in a given scenario, making it suitable for customized applications. | The number of active UAVs in an application may change dynamically with resource demands, as an ad-hoc topology can be created and deployed. |
| The data captured by all the sensors mounted on the single UAV may be critical for fusion. | Targeted data from selected sensors on specific UAVs may be sufficient for fusion. |
| As shown in Figure 2, only one UAV is assigned to all the landmarks in a scene. | A varying number of UAVs can be assigned to each landmark. |
| The available bandwidth is usually fixed and limited, which becomes a critical limitation in single-UAV SLAM applications. | As shown in Figure 4, efficient allocation and use of the available bandwidth is possible. |
| One possible advantage in some applications is that the uncertainty induced by coordinating multiple UAVs is avoided; however, uncertainty in the sensor data still persists. | C-SLAM can yield more precise sensor data from multiple UAVs, but coordinating multiple UAVs may itself introduce uncertainty. |
| As depicted in Figure 3, the amount of data gathered by a sensor is limited by the trajectory of the single UAV. | C-SLAM allows diverse data collection through multiple UAVs for map construction over the target landmarks. |
| Single-UAV SLAM does not need load balancing for resource provisioning among UAVs and requires less computational capability for data offloading across edge/cloud platforms. | C-SLAM requires automatic load balancing and dynamic topology reconfiguration (a minimal allocation sketch follows this table). As shown in Figure 4, C-SLAM also requires additional networking functionality for collaboration among multiple UAVs and the cloud infrastructure. |
| Due to the limited FoV, the sensors may intermittently fail to capture data along one or more dimensions unless the orientation of the single UAV provides sufficient FoV; consequently, a target landmark may be viewed from only one angle at a time. | Data missed by one UAV may be compensated over a series of timesteps by other UAVs for a large number of landmarks; furthermore, a target landmark can be viewed from multiple angles. |
| Due to limited battery power, single-UAV SLAM may not be feasible over large trajectories. | C-SLAM may benefit from dynamic UAV placement based on the least-congested or shortest trajectory for each UAV, thus optimally utilizing the available battery power. |
| Operational costs may be lower because less equipment is required. | Operational costs are higher due to infrastructure requirements. |
| Manual control and restoration may be needed in case of UAV failure. | Automated, flexible restoration techniques are feasible when one or more UAVs fail. |
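Several of the C-SLAM properties above (dynamic UAV placement and load balancing across landmarks) reduce to an assignment problem. The Python sketch below illustrates one simple greedy strategy, assigning each landmark to the nearest UAV that still has spare capacity; the function and variable names are hypothetical, and the strategy is illustrative rather than an algorithm from the surveyed systems.

```python
import math

def assign_landmarks(uav_positions, landmarks, max_per_uav):
    """Greedy C-SLAM task-allocation sketch: each landmark goes to the
    nearest UAV that still has capacity, approximating the load-balanced
    placement discussed above. Purely illustrative."""
    assignments = {i: [] for i in range(len(uav_positions))}
    for lm in landmarks:
        # Rank UAVs by Euclidean distance to this landmark.
        order = sorted(range(len(uav_positions)),
                       key=lambda i: math.dist(uav_positions[i], lm))
        for i in order:
            if len(assignments[i]) < max_per_uav:  # respect per-UAV load limit
                assignments[i].append(lm)
                break
    return assignments

# Example: three UAVs, five landmarks, at most two landmarks per UAV.
uavs = [(0.0, 0.0), (10.0, 0.0), (5.0, 8.0)]
lms = [(1.0, 1.0), (9.0, 1.0), (5.0, 7.0), (2.0, 2.0), (8.0, 2.0)]
print(assign_landmarks(uavs, lms, max_per_uav=2))
```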
| Kalman Filters (KF) | Extended Kalman Filters (EKF) | Unscented Kalman Filters (UKF) | Rao-Blackwellized Particle Filters (RBPF) |
|---|---|---|---|
| In simple scenarios, if the UAV trajectory is linear, the UAV state estimate is relatively easy to obtain analytically (a minimal predict-update sketch follows this table). | As depicted in Figure 5, the EKF is applicable to non-linear UAV trajectories and state estimation. | The UKF simplifies non-linear UAV trajectories using statistical linearization and may require fewer computations than the EKF. | These filters address non-linear UAV trajectories recursively through Monte Carlo statistical state estimation. |
| Kalman filters are computationally cheap to implement in MATLAB or Python using matrix algebra. | The EKF may be implemented as a Kalman filter that applies piece-wise linearization to non-linear sensor data. | The UKF can further reduce the required computational power and can be effective for C-SLAM. | RBPFs can be used in UAV SLAM for non-Gaussian tracking of the UAV trajectory and landmark data. |
| For linear models, generally provides the optimal solution when the noise is Gaussian distributed, even if the data are non-Gaussian. | Requires the sensor data and noise to be Gaussian distributed. | The UKF does not require the noise and sensor data to be Gaussian distributed. | Suitable for multimodal sensor fusion where the data are non-linear and non-Gaussian and the posteriors are arbitrarily distributed. |
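To make the first column concrete, the sketch below implements one linear Kalman filter predict-update cycle in Python/NumPy for a toy constant-velocity UAV state (position and velocity along one axis). All model matrices and noise covariances are illustrative assumptions, not parameters from any surveyed system; an EKF or UKF would replace the fixed matrices F and H with linearized Jacobians or sigma-point propagation, respectively.

```python
import numpy as np

# Illustrative constant-velocity model: state = [position, velocity].
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])   # state-transition matrix
H = np.array([[1.0, 0.0]])              # we observe position only
Q = 0.01 * np.eye(2)                    # process-noise covariance (assumed)
R = np.array([[0.5]])                   # measurement-noise covariance (assumed)

x = np.array([[0.0], [1.0]])            # initial state estimate
P = np.eye(2)                           # initial state covariance

def kf_step(x, P, z):
    """One linear Kalman predict-update cycle."""
    # Predict: propagate state and covariance through the linear model.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: correct the prediction with the position measurement z.
    y = z - H @ x_pred                    # innovation
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new

for z_meas in [0.11, 0.22, 0.28, 0.41]:   # noisy position readings
    x, P = kf_step(x, P, np.array([[z_meas]]))
print(x.ravel())  # fused position/velocity estimate
```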
| SLAM Architecture Based on All Types of Sensor Data | Visual SLAM and Aerial Photogrammetry |
|---|---|
| The UAV trajectory and the landmark map are captured through various sensors. | The UAV trajectory, localization, and landmark map-building rely primarily on visual sensing. |
| Different sensors capture data of varying sizes to cover a landscape that may change drastically between two timeframes. | Visual SLAM primarily generates pixels or point clouds to depict randomly varying landmark geometry. |
| The range up to which the sensors can sense landmarks may be limited, and the measurements may be impacted by noise. | LiDAR offers increased range, sensing a large number of landmarks that vary in size and in distance from the UAV. |
| The relationships between the UAV trajectory and landmark positions can be stored in 1D or 2D. | Visual SLAM and aerial photogrammetry can capture occluded landmarks through point-cloud matching (see the alignment sketch following this table). |
| The resulting maps usually contain coordinate information about landmark locations. | The resulting maps contain visual as well as coordinate information about landmark locations. |
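The point-cloud matching mentioned above ultimately reduces to estimating a rigid transform between two scans. The NumPy sketch below shows the classical SVD-based (Kabsch) best-fit rotation and translation for point sets with known correspondences; a full registration pipeline such as ICP would estimate those correspondences iteratively, so this is only the core alignment step, not a method taken from the surveyed papers.

```python
import numpy as np

def best_fit_transform(src, dst):
    """Kabsch/SVD rigid alignment: find R, t minimizing ||R @ src + t - dst||
    for two (N, 3) arrays of corresponding points. Sketch of the core step
    inside point-cloud registration; correspondences are assumed given."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    # Cross-covariance of the centered point sets.
    Hm = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(Hm)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

# Example: recover a known rotation about the z-axis and a translation.
rng = np.random.default_rng(0)
src = rng.normal(size=(50, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
dst = src @ R_true.T + np.array([1.0, -2.0, 0.5])
R, t = best_fit_transform(src, dst)
print(np.allclose(R, R_true, atol=1e-6), t)
```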
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).