Review

Towards UAVs in Construction: Advancements, Challenges, and Future Directions for Monitoring and Inspection

1 Department of Civil Engineering, Kyungpook National University, Daegu 41566, Republic of Korea
2 Daegu Urban Development Corporation, Daegu 41594, Republic of Korea
* Author to whom correspondence should be addressed.
Drones 2023, 7(3), 202; https://doi.org/10.3390/drones7030202
Submission received: 7 March 2023 / Revised: 13 March 2023 / Accepted: 14 March 2023 / Published: 15 March 2023

Abstract:
The use of UAVs for monitoring and inspection in the construction industry has garnered considerable attention in recent years due to their potential to enhance safety, efficiency, and accuracy. The development and application of various types of drones and sensors in the construction industry have opened up new possibilities for data collection and analysis. This paper provides a thorough examination of the latest developments in the use of UAVs for monitoring and inspection in the construction industry, including a review of the current state of UAVs, an exploration of the types of drones and sensors applied and their applications, and a summary of the technological advancements in this field. However, as with any new technology, there are challenges and limitations that need to be addressed, such as regulatory and legal concerns, technical limitations, data processing challenges, training and expertise, and safety. Finally, we offer insights into potential solutions to these challenges and identify promising areas for future investigation, including innovative sensors and imaging technologies, integration with other construction technologies, and the use of machine learning and AI for data analysis, and highlight the prospects for drone-based construction inspection.

1. Introduction

The construction industry plays a vital role in the global economy, with over USD 10 trillion spent on construction-related tasks annually, and is projected to be worth USD 15 trillion by 2030 [1]. As the industry grows, construction sites and tasks become increasingly complex and diverse, necessitating the introduction of automation and intelligent technologies to enhance operational efficiency, reduce project costs, and ensure the safety of construction workers and infrastructure. Unmanned aerial vehicles (UAVs), also known as drones, are one of the most promising and widely adopted technologies for improving construction and infrastructure sustainability [2].
UAVs are aircraft that can be operated remotely without a human pilot onboard and can be equipped with various sensors and cameras to capture high-resolution images and videos from different angles, providing valuable insights into construction sites. UAVs possess many inherent advantages, including accessibility, high efficiency, and cost-effectiveness, which make them ideal tools for construction site monitoring and inspection. They can access hard-to-reach areas and provide close-up inspections that are difficult or impossible to obtain with traditional inspection methods. UAVs can cover large areas quickly and accurately, allowing for real-time monitoring and data collection, making it easier for construction managers to make informed decisions and adjust plans accordingly.
Despite the potential benefits of using UAVs in construction, there are still some challenges and limitations that need to be addressed, including regulatory and legal issues, technical limitations, data processing challenges, training and expertise, and safety concerns. To overcome these issues, a collaborative effort between industry stakeholders, regulatory agencies, and academic researchers is required. We have noted that while there are several reviews on the use of UAVs in the construction industry, most of them focus on specific aspects of the technology or particular applications. There is a need for a comprehensive review that covers the latest developments and technological advancements in the use of UAVs in construction site monitoring and inspection. This paper aims to fill this gap in the literature by providing a thorough examination of the current state of UAVs in the construction industry and identifying the key challenges and limitations that need to be addressed.
This paper is structured as follows. Section 2 reviews in detail the types of UAVs and sensors used in the construction industry, including a comparative analysis of the various classes. Section 3 presents a review of the technologies related to UAVs in the construction industry. Section 4 discusses the limitations and challenges of using UAVs in construction. Section 5 presents potential areas and future directions for construction inspection with drone technology. Finally, Section 6 summarizes the main contributions of this paper and highlights potential avenues for future research. By following this structure, this paper aims to comprehensively analyze the latest developments in the UAV-based construction industry and highlight the associated technological advancements. The paper identifies the key challenges and limitations that must be carefully considered to maximize the benefits of this technology, explores potential areas for future investigation, and provides valuable insights and recommendations for industry stakeholders, regulatory agencies, and academic researchers in the field of construction inspection using UAV technology, with the conceptual framework shown in Figure 1. By addressing these challenges and capitalizing on the opportunities, the construction industry can benefit from the advantages of UAV technology in improving safety, efficiency, and accuracy, creating a safer, more sustainable, and more efficient construction industry.

2. UAV Planning in the Construction Industry

The planning of UAV missions for construction site supervision is crucial for the success and effectiveness of UAV-based technology. Determining the appropriate type and number of UAVs, as well as the type and number of sensors to be employed, is necessary to meet the project’s data collection goals and requirements. Additionally, the flight altitude and orientation should be carefully considered to optimize data collection and minimize potential collisions with objects or individuals on the construction site. The location and timing of flights should also be strategically planned to accurately represent the construction site’s current state and meet the project’s needs.

2.1. Types of UAVs Used in the Construction Industry

The use of UAVs in the construction industry has been a growing trend in recent years. In construction, the use of UAVs has increased by nearly 240%, higher than in any other commercial sector [3]. UAVs offer aerial capabilities that make them effective for data collection and task execution, providing valuable assistance across construction activities. There are several types of UAVs that are commonly used in construction, including fixed-wing UAVs, rotary-wing UAVs, and hybrid UAVs [4].

2.1.1. Fixed-Wing UAVs

Fixed-wing UAVs, shown in Figure 2, are designed to fly like an airplane, with wings that provide lift and a tail section for stability and control. These UAVs are typically larger and more complex than their rotary-wing counterparts, and they require a runway or other smooth, flat surfaces for takeoff and landing [5]. One of the main advantages of fixed-wing UAVs is their long range and endurance. These UAVs can fly for extended periods of time, making them well suited for large-scale mapping and data collection tasks. They are also typically faster than rotary-wing UAVs, which can benefit certain applications.
However, fixed-wing UAVs also have some disadvantages regarding construction applications. They are not as agile or versatile as rotary-wing UAVs, making them more difficult to use in tight or confined spaces. They also require more specialized training and equipment for operation, which can be a barrier for some users [6]. Overall, fixed-wing UAVs can be a valuable tool for construction site supervision and data collection, particularly for large-scale projects. However, their specific advantages and disadvantages should be carefully considered in relation to the project’s needs and the UAV system’s capabilities.

2.1.2. Multi-Rotor UAVs

Multi-rotor UAVs, also known as quadrotors or quadcopters, are rotary-wing UAVs. These UAVs are characterized by their use of multiple rotors, typically four, to lift and propel the aircraft, as shown in Figure 3a. Multi-rotor UAVs can be classified into different categories based on the number of rotors they use, such as hexacopters (six rotors) as in Figure 3b or octocopters (eight rotors) as in Figure 3c. They can also be classified based on their size and payload capacity, with larger and more powerful multi-rotor UAVs capable of carrying heavier payloads, such as high-resolution cameras or specialized sensors [7]. Regarding range and speed, multi-rotor UAVs are typically limited compared to fixed-wing UAVs. Their range is typically limited to a few kilometers, and their top speed is usually around 60 km/h. However, they are highly agile and can hover in place, making them well suited for tasks requiring precise positioning or close inspection [8].
One of the main advantages of multi-rotor UAVs in the construction industry is their ability to operate in confined or urban environments where larger aircraft may not be able to fly. They can also be easily deployed and operated by a single person, making them a cost-effective solution for construction site supervision [9]. However, they are typically less efficient and have shorter flight times than fixed-wing UAVs, and their payload capacity is usually limited.

2.1.3. Hybrid UAVs

Hybrid UAVs, also known as hybrid aircraft, are UAVs that combine features of fixed-wing and rotary-wing aircraft. These UAVs combine the long-range and high-speed capabilities of fixed-wing aircraft with the vertical takeoff and landing (VTOL) capabilities of rotary-wing aircraft [10]. Two types of hybrid UAV exist: the tilt-rotor aircraft [11] and the tilt-wing aircraft [12]. Tilt-rotor aircraft, also known as transition aircraft [13], as in Figure 4a, have rotors that can tilt between a vertical and horizontal position, allowing them to take off and land vertically like a helicopter or fly horizontally like an airplane. Tilt-wing aircraft have entire wings, rotors included, that can tilt between a vertical and horizontal position, as in Figure 4b, allowing them to transition between hovering and forward flight. This unique feature enables hybrid UAVs to perform a wide range of missions requiring helicopter-like hovering and fixed-wing aircraft-like high-speed cruising.
One of the main advantages of hybrid UAVs is their versatility. They can operate in various environments and perform a wide range of tasks, including surveillance, mapping, inspection, and delivery. They can also cover long distances quickly, making them ideal for use in remote or inaccessible areas. Additionally, hybrid UAVs can often carry a larger payload than traditional rotary-wing or fixed-wing UAVs, allowing them to carry a variety of sensors and other equipment. There are also a few disadvantages to consider when using hybrid UAVs in construction. One of the main drawbacks is their cost, as these UAVs tend to be more expensive than traditional fixed-wing or rotary-wing UAVs. Additionally, hybrid UAVs may require more maintenance and are generally more complex to operate, requiring specialized training and expertise. Finally, hybrid UAVs may be more vulnerable to certain types of weather conditions, such as strong winds or heavy rain, which can impact their performance [14]. Overall, hybrid UAVs are a promising technology for the construction industry, offering a balance of the capabilities of fixed-wing and rotary-wing UAVs. They are particularly useful for tasks requiring vertical takeoff, landing, and efficient horizontal flight.
In the construction industry, the choice of UAV depends on the mission’s specific requirements; the advantages and disadvantages summarized according to each type are shown in Table 1. Fixed-wing UAVs, which are characterized by their extended range and endurance, are often preferred for long-range missions, such as surveying large construction sites or inspecting infrastructure. Rotary-wing UAVs, which are characterized by their ability to hover and perform precise vertical movements, are better suited for missions requiring precise positioning or close-range observation. Hybrid UAVs, which combine the capabilities of both fixed-wing and rotary-wing UAVs, are the most suitable for missions requiring long-range flight and precise maneuverability. It is important to thoroughly assess the specific needs of the mission and select the most appropriate type of UAV to ensure the success of the operation.

2.2. UAVs Equipped with Sensors Used in the Construction Industry

UAVs used in the construction industry are equipped with various sensors to enable data collection and task execution [15,16,17,18]. These sensors include visible light sensors as in Figure 5a, which capture images in the visible spectrum and are commonly used for mapping and visual inspection tasks. Light detection and ranging (LiDAR) sensors as in Figure 5b use lasers to measure the distance between the UAV and the ground, generating a high-resolution 3D map of the site. Thermal imaging (TI) sensors as in Figure 5c detect heat signatures and can be used to identify energy efficiency issues or locate hidden electrical faults. Global positioning system (GPS) and real-time kinematic (RTK) sensors as in Figure 5d provide precise positioning data, enabling the UAV to accurately map the site and navigate through tight or confined spaces.

2.2.1. Visible Light Sensors

Visible light sensors, also known as red-green-blue (RGB) sensors, are sensors that capture images using visible light wavelengths in the electromagnetic spectrum. These sensors are commonly used in UAVs for construction applications, such as mapping, inspection, and monitoring, as they provide high-resolution images that are useful for visualizing and analyzing the construction site [19]. One of the main advantages of RGB sensors is that they can capture high-resolution images with a high level of detail. This is particularly useful for tasks such as mapping, where the accuracy and precision of the data are critical. RGB sensors can also be used to identify and classify different features on the construction site, such as buildings, roads, and vegetation [20,21,22,23,24,25,26,27,28].
Another advantage of RGB sensors is their ease of use. These sensors are widely available and inexpensive, and they do not require specialized training or equipment. They are also able to capture images in a variety of lighting conditions, making them suitable for use in different environments [29].
RGB sensors can also capture images in real time, which can be beneficial for construction site supervision [30]. With the ability to continuously monitor the site, construction professionals can quickly identify any issues or problems that may arise and take appropriate action. This can help to reduce delays and improve the overall efficiency of the construction process.
However, there are several limitations to consider when using RGB sensors in construction. One limitation is that RGB sensors are only able to capture visible light, which means they are unable to detect objects or features that are outside the visible spectrum. This can be a problem in certain conditions, such as when there is poor lighting or when the site is covered in shadows. In these cases, the images captured by the sensor may be of lower quality or may not show certain features.
Another limitation is that RGB sensors are sensitive to changes in lighting conditions. If the lighting changes significantly between different flights or during a single flight, it can affect the quality and accuracy of the images captured. This can be a problem when trying to create accurate maps or models of the site, as the differences in lighting can cause variations in the appearance of the images.
Finally, RGB sensors are sensitive to reflections and glare, which can affect the accuracy of the images. This can be a problem when trying to capture images of shiny or reflective surfaces, such as glass or metal. In these cases, the sensor may produce distorted or blurry images, which can reduce the usefulness of the data collected.

2.2.2. LiDAR Sensors

LiDAR sensors use lasers to measure distance and create high-resolution 3D models of the surrounding environment. These sensors have several advantages when used on UAVs in the construction industry and have become increasingly popular in recent years for tasks such as site surveying and mapping.
One major benefit of LiDAR sensors for UAVs in construction is their high accuracy. These sensors can generate highly precise 3D models of construction sites, with an accuracy of up to a few centimeters [31,32]. This can be particularly useful for tasks such as topographic surveys, where precise measurements are critical.
Another benefit of LiDAR sensors for UAVs in construction is their efficiency. These sensors can quickly capture large amounts of data and generate 3D models of construction sites in a relatively short time. This can be particularly useful for tasks such as site inspection, where the ability to quickly generate accurate 3D models can save time and reduce costs [33,34].
A third benefit of LiDAR sensors for UAVs in construction is that they can improve safety on construction sites [35]. These sensors can be used to generate 3D models of hazardous areas, such as steep slopes or unstable structures, which can help to identify potential hazards and reduce the risk of accidents [36].
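As a sketch of how raw LiDAR returns become a usable site model, the point cloud can be rasterised into an elevation grid. The grid-binning scheme below is an illustrative minimal approach, not the method of any cited study:

```python
import numpy as np

def points_to_elevation_grid(points, cell_m=1.0):
    """Rasterise an N x 3 point cloud (x, y, z in metres) into a digital
    elevation model, keeping the highest return in each grid cell."""
    xy = points[:, :2]
    origin = xy.min(axis=0)                      # shift so indices start at 0
    idx = np.floor((xy - origin) / cell_m).astype(int)
    shape = tuple(idx.max(axis=0) + 1)
    dem = np.full(shape, np.nan)                 # NaN marks empty cells
    for (i, j), z in zip(idx, points[:, 2]):
        if np.isnan(dem[i, j]) or z > dem[i, j]:
            dem[i, j] = z
    return dem
```

Keeping the maximum z per cell yields a surface model including structures and vegetation; keeping the minimum instead approximates the bare ground.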
However, there are also several limitations to the use of LiDAR sensors on UAVs in construction that should be considered.
One limitation of LiDAR sensors for UAVs in construction is their cost. These sensors can be expensive to purchase and maintain, making them less accessible for some construction companies. In addition, the cost of operating UAVs equipped with LiDAR sensors can be high, as these systems require specialized training and expertise to operate safely and effectively. This can be a barrier to adoption for some construction companies, particularly those with limited budgets or resources.
Another limitation of LiDAR sensors for UAVs in construction is their limited range. These sensors typically have a maximum range of around 100 m, which can be limiting in certain situations. For example, if a construction site is located in an area with tall buildings or other structures that block the line of sight of the LiDAR sensor, it may be difficult to generate accurate 3D models of the site. This can be a problem for construction companies working on large or complex projects, as it may be necessary to fly multiple UAVs to cover the entire site.

2.2.3. TI Sensors

TI sensors capture the infrared energy emitted by objects and convert it into a visual image [37,38]. One major benefit of UAV-equipped thermal imaging sensors in construction is their ability to detect heat-related issues [39]. These sensors can identify temperature anomalies and detect problems, such as insulation issues [40,41,42], air leakage [43,44,45], and moisture intrusion [46,47]. This can be particularly useful for tasks such as building envelope inspection, where early identification of heat-related issues can save time and reduce costs by avoiding costly repairs and energy consumption.
Another benefit of UAV-equipped thermal imaging sensors in construction is their ability to detect structural issues. These sensors can detect thermal anomalies that may indicate problems such as structural damage or cracks in walls, floors, or roofing [48]. This can be useful for tasks such as building inspection [49], where early identification of structural issues can save time and reduce costs by avoiding costly repairs.
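The anomaly detection described above can be sketched as a simple statistical threshold on a radiometric frame. Real inspection pipelines are considerably more elaborate (emissivity correction, reference imagery), so this is only an illustrative first pass:

```python
import numpy as np

def thermal_anomalies(frame_c, k=3.0):
    """Flag pixels deviating from the frame mean by more than k standard
    deviations -- a coarse screen for hot or cold spots in a thermal image
    (frame_c: 2-D array of temperatures in degrees Celsius)."""
    mu, sigma = frame_c.mean(), frame_c.std()
    return np.abs(frame_c - mu) > k * sigma
```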
However, like any technology, there are also limitations to the use of UAV-equipped thermal imaging sensors in construction, such as their sensitivity to motion. These sensors are sensitive to vibrations and movements caused by the UAV, which can result in blurriness and reduced image quality [50]. This can be particularly challenging for tasks such as structural analysis, where stable images are critical for accurate analysis.
Another limitation of UAV-equipped thermal imaging sensors in construction is their limited field of view (FOV) [51]. These sensors typically have a narrower FOV compared to other remote sensing technologies such as visual cameras [52], which can limit their effectiveness in large-scale inspections and monitoring projects. This can be particularly challenging for tasks such as site inspection, where a wide FOV is necessary to capture detailed images of the entire site.
Finally, UAV-equipped thermal imaging sensors in construction may have difficulty identifying the source of heat emission; thermal imaging sensors capture infrared radiation emitted from objects, but it can be difficult to pinpoint the exact source of the heat, especially when multiple sources are present. This can be particularly challenging for tasks such as building inspection, where identifying the exact location of the heat loss or insulation problem is critical.

2.2.4. GPS and RTK Sensors

GPS and RTK sensors are commonly used in the construction industry to provide accurate positioning and navigation data for UAVs. One major benefit of GPS and RTK sensors for UAVs in construction is their high accuracy and precision [53]. These sensors use signals from a network of GPS satellites to accurately determine the position of the UAV, with an accuracy of up to a few centimeters [54].
Another advantage of GPS and RTK sensors is their real-time capability [55]. These sensors provide real-time positioning and navigation data, which allow the UAV to quickly and accurately navigate the construction site. This can be useful for tasks such as site inspection, where the ability to quickly generate accurate 3D models can save time and reduce costs.
A third advantage of GPS and RTK sensors for UAVs in construction is their high availability. These sensors use signals from a network of GPS satellites, which are widely available and have a high level of availability. This means that GPS and RTK sensors can be used in various environments and conditions. In addition, GPS and RTK sensors can be easily integrated with other sensors on the UAV, such as cameras and LiDAR sensors [56,57]. This can provide more comprehensive data and a better representation of the construction site.
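In practice, GPS/RTK fixes in latitude and longitude are typically converted into a local site coordinate frame before being fused with camera or LiDAR data. A minimal sketch, assuming a flat-earth approximation that is adequate over the few-kilometre extent of a construction site (the site origin is hypothetical):

```python
import math

def geodetic_to_local_enu(lat, lon, alt, lat0, lon0, alt0):
    """Convert a GPS fix (degrees, metres) into East-North-Up metres
    relative to a site origin, using a flat-earth approximation."""
    R = 6378137.0  # WGS-84 equatorial radius in metres
    east = math.radians(lon - lon0) * R * math.cos(math.radians(lat0))
    north = math.radians(lat - lat0) * R
    return east, north, alt - alt0
```

For survey-grade work, a rigorous transformation (e.g. via a geodetic library) should replace this approximation.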
However, these sensors are also limited when used on UAVs in the construction industry. One limitation of GPS and RTK sensors is their susceptibility to signal interference. These sensors rely on signals from GPS satellites, which can be affected by various factors, such as atmospheric conditions, tall buildings or trees, and other sources of interference [58]. This can result in lower accuracy and reliability of the positioning and navigation data.
Another limitation of GPS and RTK sensors is their dependence on supporting infrastructure, such as the availability of reference stations and the quality of the communication link with the base station [59]. This can limit their usage in remote or rural areas and delay data processing or increase its cost. The advantages and disadvantages of each sensor are shown in Table 2.

2.3. Other Factors of UAVs Used in the Construction Industry

When planning the use of UAVs in the construction industry, it is important to consider various factors such as flight altitude, flight direction, flight path, and number of UAVs [60]. The selection of these factors will depend on the specific task and the desired output and should be carefully considered to ensure the safe and efficient operation of the UAVs, as well as the accuracy and quality of the data collected.
One important factor to consider when planning the use of UAVs in the construction industry is flight altitude [61]. The altitude of the UAV affects the field of view of the sensors, as well as the resolution of the images and data collected. In general, a higher altitude results in a wider field of view but a lower resolution, while a lower altitude results in a narrower field of view but a higher resolution. The desired resolution and field of view depend on the task at hand, such as site surveying or inspection, and the altitude should be selected accordingly.
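The altitude-resolution trade-off can be made concrete with the standard ground sampling distance (GSD) formula; the camera parameters used below (13.2 mm sensor, 8.8 mm lens, 5472 px image width) are illustrative assumptions, not values from any particular study:

```python
def ground_sampling_distance(altitude_m, focal_mm, sensor_width_mm, image_width_px):
    """Metres of ground covered by one pixel at nadir: a higher altitude
    or a shorter focal length enlarges the footprint (coarser resolution)."""
    return (altitude_m * sensor_width_mm) / (focal_mm * image_width_px)

# Illustrative survey camera at 100 m altitude:
gsd = ground_sampling_distance(100.0, 8.8, 13.2, 5472)  # ~0.027 m/px
```

Halving the altitude halves the GSD (finer detail) but also halves the ground swath per image, so more flight lines are needed for the same coverage.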
Another factor to consider is flight direction [62]. UAVs can fly in various directions, such as parallel or perpendicular to a feature or in a spiral pattern around a feature. The direction of flight affects the coverage and resolution of the data collected and should be selected based on the task and the desired output.
A third factor to consider is the flight path [63]. The path of the UAV can be pre-planned or generated in real time and can include a variety of waypoints and obstacles. The flight path should be selected to ensure the safe and efficient operation of the UAV and the coverage and resolution of the data collected.
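As a simple illustration of pre-planned coverage, a back-and-forth ("lawnmower") pattern over a rectangular site can be generated from the camera swath and the desired side overlap. This is a minimal sketch with assumed site dimensions, not a production flight planner (it ignores obstacles and turn dynamics):

```python
def lawnmower_waypoints(width_m, height_m, swath_m, overlap=0.3):
    """Back-and-forth (boustrophedon) coverage waypoints for a rectangular
    site; pass spacing is reduced by the side-overlap fraction so that
    adjacent image strips overlap for reconstruction."""
    spacing = swath_m * (1.0 - overlap)
    waypoints, x, heading_up = [], 0.0, True
    while x <= width_m:
        start, end = (0.0, height_m) if heading_up else (height_m, 0.0)
        waypoints.append((x, start))
        waypoints.append((x, end))
        heading_up = not heading_up  # alternate direction each pass
        x += spacing
    return waypoints
```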
Finally, the number of UAVs used depends on the size of the site and the task at hand. In some cases, a single UAV may be sufficient, while in others, multiple UAVs may be required to cover a larger area or to collect data from multiple sensors simultaneously [64]. The number of UAVs used should be selected based on the site and task, and the number should be kept to a minimum to reduce operational costs and increase safety.

3. UAV-Based Related Technologies in the Construction Industry

UAV-based technologies have been increasingly utilized in the construction industry for their ability to provide accurate, high-resolution data quickly and safely. UAV-based 3D modeling enables construction teams to create detailed models of construction sites, buildings, and structures, yielding improved planning and project management. UAV-based non-destructive testing (NDT) can provide valuable information on the integrity of structures, identifying issues such as cracks, corrosion, and other defects without causing damage to the structure. UAV-based object detection technology can be used in the construction industry for various purposes, including enhancing worker safety and inspecting construction materials and areas. UAVs equipped with sensors can quickly and accurately detect and identify objects such as workers, equipment, and materials on construction sites. This can help ensure that safety protocols are being followed, such as the use of appropriate safety gear, including helmets, reflective vests, and safety belts. UAVs can also inspect construction materials and flag areas that may need attention.

3.1. Related Technologies for UAV-Based 3D Modeling in the Construction Industry

Photogrammetry is one of the earliest techniques used for 3D modeling using UAVs in the construction industry. Photogrammetry is the science of making measurements from photographs, and it enables the creation of accurate 3D models of construction sites [65]. Initially, photogrammetry was mainly used to create 2D maps and orthophotos; however, with advancements in technology and the increased availability of high-resolution cameras, it has become possible to use photogrammetry to create detailed 3D models.
Figure 6 demonstrates the utilization of coordinate information captured by a UAV during aerial photography to establish a three-dimensional spatial coordinate system. This involves determining the spatial geometric relationship between the captured image and its corresponding target and calculating a sparse point cloud of the camera position and target at the time of imaging through the correspondence between image points and captured objects, as shown in Figure 6a. Subsequently, the result of this photography-based 3D modeling approach is obtained, as depicted in Figure 6b. However, traditional photogrammetry-based methods have some limitations. For example, they require a high level of expertise to operate and interpret the results. They also require a significant amount of manual labor to process the data and are not capable of dealing with large-scale datasets [66].
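For a single point seen in two images, the image-point/object-point correspondence described above reduces to linear triangulation. The numpy sketch below uses the standard direct linear transform (DLT), with synthetic projection matrices standing in for real camera calibration:

```python
import numpy as np

def triangulate_point(P1, P2, x1, x2):
    """Recover a 3-D point from its pixel coordinates x1, x2 in two images
    with known 3x4 projection matrices P1, P2 (DLT): each observation
    contributes two linear constraints, and SVD solves the stacked system."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]              # null-space vector = homogeneous 3-D point
    return X[:3] / X[3]     # de-homogenise
```

Photogrammetric software repeats this for thousands of matched features across many images, then refines the camera poses and points jointly by bundle adjustment.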
The development of the structure from motion (SfM) algorithm was a major breakthrough in the field of 3D modeling using UAVs [67]. SfM is an algorithm that uses multiple images of the same scene captured from different viewpoints to reconstruct a 3D model of the scene. SfM is particularly useful for UAV-based applications, as it allows highly detailed and accurate 3D models to be created, even from images captured with low-cost cameras. However, SfM algorithms also have limitations; for example, they can struggle with scenes containing repetitive patterns, and they require high computational power to process data [68,69,70].
Another important development in the field of 3D modeling using UAVs in the construction industry is the integration of photogrammetry with other data, such as LiDAR data [71]. The combination of photogrammetry and LiDAR data enables precise measurements of the construction site, even in challenging environments where direct measurements are difficult to acquire.
Some researchers have proposed methods that integrate 3D laser scanning and photogrammetry for the progress measurement of construction projects. Using both technologies, they can capture more comprehensive and reliable data from different perspectives and reduce errors caused by occlusions or noise. Moreover, they can also improve the efficiency and accuracy of data processing by applying advanced algorithms for point cloud registration and segmentation [72]. These examples show that integrating photogrammetry with LiDAR data can provide significant benefits for 3D modeling in the construction industry. However, some challenges still need to be addressed, such as how to optimize data acquisition strategies, deal with large-scale datasets, ensure data quality and consistency, etc. Therefore, further research is needed to explore more possibilities and solutions for this emerging field.
In recent years, the field of 3D modeling using UAVs in the construction industry has seen significant advances in integrating deep learning techniques. Convolutional neural networks (CNNs) [73], generative adversarial networks (GANs) [74], and recurrent neural networks (RNNs) [75] have been used to improve the accuracy and efficiency of 3D modeling by automating feature extraction and semantic segmentation tasks [76,77,78]. However, deep-learning-based methods also have some limitations. For example, they require a large amount of labeled data to train the models, and they can be computationally expensive to run [79]. Additionally, the results generated by deep learning models can be difficult to interpret, requiring a high level of expertise to design and train the models.
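The feature extraction that CNNs automate is built from convolution. The numpy sketch below applies one hand-designed (Sobel) kernel, purely to illustrate the primitive that a trained network replaces with many learned filters:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2-D correlation (the 'convolution' used in CNN layers)."""
    kh, kw = kernel.shape
    h, w = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A Sobel kernel responds to vertical intensity edges (e.g. crack borders);
# a CNN stacks many such filters, learned from labeled data rather than designed.
sobel_x = np.array([[-1.0, 0.0, 1.0], [-2.0, 0.0, 2.0], [-1.0, 0.0, 1.0]])
```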
Despite these limitations, developing technologies related to 3D modeling using UAVs in the construction industry has seen significant advancements in recent years. For example, researchers have started to explore integrating multiple data sources, such as photogrammetry, LiDAR, and deep learning, to create more accurate and efficient 3D modeling methods [80]. Additionally, the advancement of computer vision and machine learning techniques has enabled more accurate and automated ways to analyze images and generate 3D models, which also helps to reduce the reliance on manual labor.

3.2. Related Technologies for UAV-Based Non-Destructive Testing (NDT) in the Construction Industry

Traditionally, NDT in the construction industry has been performed using manual inspections, which are time-consuming and can be dangerous, especially when working in hard-to-reach areas. Using UAVs equipped with cameras and sensors has become a more efficient and safer alternative to traditional manual inspections. Early techniques for NDT using UAVs relied on visual inspections, where images and videos captured by UAVs were analyzed by experts to identify potential defects and hazards [81].
In recent years, the development of more advanced sensors, such as thermal imaging cameras and ultrasonic sensors, and their integration with UAVs have enabled more advanced NDT applications in the construction industry. For example, one study proposed using thermal and visible point clouds to generate a higher-resolution thermal point cloud for roof inspection [82]. The combination of visible and thermal point clouds provided high spatial resolution with thermal information, enabling accurate detection of thermal problems. Another study utilized point-cloud-based inspection derived from UAV images to automatically detect damage in bridge decks [83]. A robust and efficient method was employed to extract a point cloud of the bridge deck, which was classified into cracked and undamaged areas using a deep learning approach. Infrared thermography is another technique that has gained popularity in NDT. A recent study developed a novel cloud-to-model tool that converts the emissivity scalar fields extracted from the point cloud into an analysis layer, yielding an intuitive interpretation of the collected data. The accuracy of the proposed infrared-based approach was compared with that of a point cloud generated using high-resolution digital images [84]. Crack assessment of bridge structures is critical for maintaining safe transportation infrastructure. One study proposed a crack detection method based on geometric correction and calibration algorithms, which used four parallel laser emitters installed on the UAV camera for crack image acquisition. The proposed method showed greater precision for crack width identification, indicating its potential for actual crack detection on bridges [85].
Additionally, the integration of deep learning techniques with the sensor data collected by UAVs has been explored as a way to improve the accuracy and efficiency of NDT. Deep learning algorithms can be trained to automatically detect and classify potential defects and hazards in images and videos, reducing the need for manual labor and improving the accuracy of the results. These algorithms are more accurate and efficient than traditional visual inspections in identifying defects and hazards. For example, one study combined UAV-carried passive infrared thermography with transfer learning to achieve efficient detection and automatic identification of embankment leakage, framing the detection task as an image classification problem. The researchers established an open-air simulation platform to obtain sufficient images for model training. Using these images and an AlexNet-based transfer learning method, an image classification model with excellent performance was trained [86]. Another study proposed a novel convolutional neural network to automatically identify dam-surface seepage from thermograms collected by an unmanned aerial vehicle carrying a thermal imaging camera. The researchers added an auxiliary input branch with two specially designed modules to a U-Net frame to reduce the false-alarm rate caused by “seepage-like” background interference on the dams and to accurately identify seepage profiles with clear boundaries from low-resolution thermograms [87]. A recent study presented a method for managing the inspection results of building external walls by mapping defect data from UAV images to building information modeling (BIM) and modeling defects as BIM objects. The researchers developed a deep-learning-based instance segmentation model to detect defects in the captured images and extract their features [88].
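The statistical intuition behind thermographic anomaly detection can be sketched without any learning at all: flag cells of a temperature grid that deviate strongly from the scene mean. This is a crude stand-in for the cited deep-learning methods, not their implementation; the threshold `k` and the function name are illustrative assumptions.

```python
def thermal_anomalies(grid, k=2.0):
    """Return (row, col) cells deviating more than k standard deviations
    from the grid mean -- a naive proxy for thermographic defect detection."""
    vals = [v for row in grid for v in row]
    mean = sum(vals) / len(vals)
    std = (sum((v - mean) ** 2 for v in vals) / len(vals)) ** 0.5
    return [(r, c) for r, row in enumerate(grid)
            for c, v in enumerate(row)
            if std and abs(v - mean) > k * std]
```

Learned models earn their keep precisely where this global-statistics heuristic fails: non-uniform sun loading, reflective surfaces, and “seepage-like” background patterns that shift the local temperature distribution.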

3.3. Related Technologies for UAV-Based Object Detection in the Construction Industry

Among the many applications of UAVs in the construction industry, object detection has been one of the most widely researched and implemented [89]. The first generation of traditional computer algorithms for object detection using UAVs in the construction industry was based on image processing techniques. These techniques involve using different image processing methods, such as edge detection, thresholding, and feature extraction, to analyze the images captured by UAVs to detect objects. For example, edge detection can be used to detect the edges of objects [90], thresholding can be used to segment an image into different regions [91], and feature extraction can be used to extract relevant information from the image, such as shape, color, and texture [92,93,94,95]. These techniques are simple and computationally efficient, but their detection accuracy is limited.
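Thresholding, for instance, can be made adaptive rather than fixed: Otsu's classic method picks the gray level that maximizes the between-class variance of the resulting foreground/background split. A minimal pure-Python sketch over a flat list of 8-bit pixel values:

```python
def otsu_threshold(pixels, levels=256):
    """Return the gray level maximizing between-class variance (Otsu's method)."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    sum_b = 0          # weighted sum of the background class
    w_b = 0            # background pixel count
    best_t, best_var = 0, -1.0
    for t in range(levels):
        w_b += hist[t]
        if w_b == 0:
            continue
        w_f = total - w_b
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b
        m_f = (sum_all - sum_b) / w_f
        var = w_b * w_f * (m_b - m_f) ** 2   # between-class variance
        if var > best_var:
            best_var, best_t = var, t
    return best_t
```

On a cleanly bimodal site image (e.g., bright safety markings against dark pavement) this works well; its failure under uneven lighting is one reason the field moved toward learned detectors.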
In the next generation of traditional computer algorithms, machine learning was applied to object detection using UAVs in the construction industry. Machine learning algorithms, such as support vector machines (SVMs) and decision trees, have been applied to detect objects in images captured by UAVs. However, large amounts of image data are required to train a detector for each class of construction entity. To address this, a three-dimensional reconstruction method has been proposed to generate the image data required for training object detectors. The generated synthetic images are used as training data, and a histogram of oriented gradients (HOG) descriptor of the target object is obtained from these images. The descriptor is refined by a support vector machine to increase sensitivity to the target object in test images [96]. Another study proposed a hybrid vehicle detection scheme that integrates the Viola–Jones (V–J) method and a linear SVM classifier with HOG features (HOG + SVM) for vehicle detection from low-altitude UAV images. The proposed scheme adopts a roadway orientation adjustment method to align the roads with the horizontal direction, so that the original V–J or HOG + SVM method can be directly applied to achieve fast detection and high accuracy. An adaptive switching strategy was also developed to improve detection efficiency, combining the V–J and HOG + SVM methods based on their different descending trends in detection speed. The proposed vehicle detection method can be performed on videos captured from moving UAV platforms without needing image registration or an additional road database [97]. Finally, a vehicle detection method from UAVs has been proposed that integrates the scale invariant feature transform (SIFT) and an implicit shape model (ISM). First, a set of key points is detected in the test image using SIFT. Second, feature descriptors around the key points are generated using the ISM, with SVMs applied during key point selection. The method was evaluated on a video shot by a UAV, and the results demonstrate its performance and effectiveness [98].
However, these techniques have limitations, such as limited robustness to changes in lighting conditions and occlusions, and limited scalability. These limitations have led to the development of more advanced techniques, such as deep learning algorithms, which are more robust and accurate in identifying objects.
With the advent of deep learning, a new generation of object detection algorithms has been proposed and applied to UAV-based object detection. These deep learning algorithms, such as CNN, R-CNN [99], and YOLO series [100,101,102,103,104,105,106], have been trained to automatically detect objects in images and videos captured by UAVs, reducing the need for manual labor and improving the accuracy of the results.
CNNs are the most basic form of deep learning algorithms for object detection and can be used to extract features from images and videos. They have been widely used in image classification and object detection tasks. The R-CNN family improves on plain CNNs by classifying region proposals; in Faster R-CNN, a region proposal network (RPN) generates these proposals, which are then classified using CNNs. This approach improves the accuracy of object detection. Several studies have utilized R-CNN-based algorithms with UAV images to detect various objects. One study improved the Faster R-CNN algorithm using deformable convolution to adapt to arbitrarily shaped collapsed buildings. In addition, a new method was proposed to estimate the intersected proportion of objects (IPO) to describe the degree of intersection of bounding boxes, leading to better precision and recall for detecting collapsed buildings [107]. Another study extended the authors' previously developed techniques to identify and quantify bridge damage based on UAV images. The scope of the research included image acquisition, a deep-learning-based crack classification system, and detection and quantification algorithms using improved image processing techniques [108]. A third study proposed a method for detecting and measuring cracks in unreachable parts of large crane structures using the Faster R-CNN algorithm with UAV images. Crack length, width, area, and aspect ratio were identified by various methods, including maximum entropy threshold segmentation, Canny edge detection, projection feature extraction, and skeleton extraction [109]. Finally, an edge-computed and controlled outdoor autonomous UAV system was proposed to monitor safety helmet wearing by workers on construction sites. The main focus of this work was the detection and counting of workers with safety helmets of specified colors and those without safety helmets using the R-CNN algorithm [110].
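Overlap measures such as the IPO described above are variations on the standard intersection-over-union (IoU) that these detectors use to compare bounding boxes. IoU itself reduces to a few lines, with boxes given as (x1, y1, x2, y2) corner tuples:

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])   # intersection top-left
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])   # intersection bottom-right
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union else 0.0
```

The same quantity drives both training-time proposal matching and evaluation-time true-positive decisions (a detection typically counts as correct when its IoU with a ground-truth box exceeds a threshold such as 0.5).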
Other recent techniques similar to the above-proposed helmet detection system have been presented by Liang and Seo [111], who proposed an automated approach to detect helmeted workers on construction sites using UAV low-altitude remote sensing. The proposed system utilizes a deep learning model based on the Swin Transformer to perform periodic and efficient helmet-wearing inspections on construction sites. The single-stage end-to-end helmet detection network is designed to accurately classify helmet usage and color type in real construction sites.
Experimental results show that the proposed method achieves a mean average precision (mAP) of 92.87% on the GDUT-Hardhat Wearing Detection (GDUT-HWD) dataset and improves the average precision (AP) for small-sized targets up to 88.7%. Figure 7 provides a visualization of the network detection process. Despite the challenges posed by occlusion and complex environments, the proposed approach and similar techniques demonstrate the potential of UAVs and deep learning in automated site supervision.
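For reference, the AP behind such mAP figures is computed from score-ranked detections. A sketch using all-point interpolation follows; benchmarks differ in interpolation details (11-point vs. all-point, per-class averaging), so treat this as illustrative rather than the evaluation protocol of the cited work.

```python
def average_precision(detections, num_gt):
    """AP from (score, is_true_positive) pairs via all-point interpolation.

    `num_gt` is the number of ground-truth objects for the class.
    """
    dets = sorted(detections, key=lambda d: -d[0])   # highest score first
    tp = fp = 0
    recalls, precisions = [], []
    for _, is_tp in dets:
        if is_tp:
            tp += 1
        else:
            fp += 1
        recalls.append(tp / num_gt)
        precisions.append(tp / (tp + fp))
    # make the precision envelope non-increasing from the right
    for i in range(len(precisions) - 2, -1, -1):
        precisions[i] = max(precisions[i], precisions[i + 1])
    ap, prev_r = 0.0, 0.0
    for r, p in zip(recalls, precisions):
        ap += (r - prev_r) * p   # area under the precision-recall envelope
        prev_r = r
    return ap
```

mAP is then simply the mean of this quantity over all object classes.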
The YOLO algorithm is a real-time object detection algorithm that uses a single convolutional neural network to predict the class and location of objects in an image or video. This algorithm has the advantage of being fast and accurate, but it is limited in the number of nearby objects it can detect. However, several recent studies have proposed modifications and enhancements to the YOLO algorithm to improve its performance in UAV-based construction applications. For instance, a YOLO-GNS algorithm has been proposed for special vehicle detection from the UAV perspective, which introduces the single stage headless (SSH) context structure to improve feature extraction and reduce computational cost. This algorithm has shown a 4.4% increase in average detection accuracy and a 1.6 increase in detection frame rate compared to other derivatives [112]. Another study proposed an intelligent object recognition model based on YOLO and GAN to improve the resolution of identified images. This study adjusted the structure and parameters of the recognition model and the image resolution enhancement model through simulation experiments to improve the accuracy and robustness of object recognition [113]. In the construction industry, a case study has been presented on developing an image dataset specifically for construction machines, named the Alberta Construction Image Dataset (ACID). To validate the feasibility of the ACID, four existing deep learning object detection algorithms, including YOLO-v3, Inception-SSD, R-FCN-ResNet101, and Faster-RCNN-ResNet101, were trained using this dataset, achieving a mean average precision (mAP) of up to 89.2% [114]. YOLO has also been used in a crack detection and location method for steel structures and concrete buildings.
The method involves pre-segmenting UAV images, establishing different crack segmentation datasets, training YOLO V3 and DeepLab V3+ models, and combining images with UAV flight records for panoramic crack location and presentation. These methods have effectively monitored and detected cracks in various structures [115].
Integrating deep learning techniques with sensor data collected by UAVs has improved the accuracy and efficiency of object detection, addressing the limitations of traditional algorithms. Deep learning algorithms are more robust, can handle variations in lighting conditions and occlusions, and have been shown to achieve higher accuracy and scale to larger datasets than traditional computer algorithms. In conclusion, the development of deep learning algorithms for object detection using UAVs in the construction industry has been a significant advancement in the field, providing efficient, accurate, and non-destructive ways of monitoring and analyzing construction sites. However, like any technology, deep learning algorithms for object detection using UAVs also have limitations that must be considered.
One of the main limitations of deep learning algorithms is the requirement of large amounts of high-quality labeled data. Training deep learning algorithms requires large amounts of labeled data, which can be time-consuming and expensive to collect. Collecting labeled data can be challenging in the construction industry due to the dynamic nature of construction sites and the lack of publicly available datasets.
Another limitation is the computational requirements of deep learning algorithms. These algorithms require powerful hardware and can take significant time to train, which can be a bottleneck for small and medium-sized companies with limited resources. Furthermore, deep learning algorithms can be sensitive to the quality of the data, which means that even small errors in data can lead to significant errors in the predictions.

4. Challenges and Limitations

Despite the potential benefits of using UAVs for construction inspection, there are several challenges and limitations that must be considered when implementing this technology. These include regulatory and legal issues, technical limitations, data processing challenges, training and expertise, and safety concerns.

4.1. Regulatory and Legal Issues

Using UAVs for construction inspection is subject to a complex set of regulations and laws, which can vary depending on country or region. Regulatory bodies typically impose requirements related to pilot certification, UAV registration, flight restrictions, and data privacy. These regulations can pose significant challenges for construction companies and inspection firms that are looking to use UAVs for construction inspection.
For example, in the United States, the Federal Aviation Administration (FAA) regulates the use of drones through Part 107 rules, which set out the requirements for obtaining a remote pilot certificate and registering drones with the FAA [116]. In addition, the FAA sets out flight restrictions, such as a maximum altitude of 400 feet and a requirement to maintain a visual line of sight with the drone at all times. Violating these rules can result in significant fines or even criminal charges.
Similarly, in the European Union, drones are regulated by the European Aviation Safety Agency (EASA), which sets out requirements for drone registration, pilot training, and operational procedures [117]. The EASA also imposes flight restrictions, such as a requirement to maintain a safe distance from people and property, and restrictions on flying over certain areas, such as airports or prisons [118].
Regulatory and legal issues can also impact the ability to obtain necessary permits and approvals to operate drones for construction inspection. For example, in some countries, obtaining a permit to fly drones over urban areas or populated areas can be challenging due to concerns about privacy and safety [119,120,121]. In addition, construction companies and inspection firms may need to obtain additional permits or approval from local authorities, depending on the location and nature of the construction project.
Regulatory and legal issues pose significant challenges to the use of UAVs in construction inspection. Understanding and complying with relevant regulations and laws is essential to ensure the safe and effective operation of UAVs. Construction companies and inspection firms should carefully consider the regulatory and legal landscape in their region and work closely with regulatory bodies and local authorities to obtain the necessary permits and approvals.

4.2. Technical Limitations

One of the primary technical limitations of UAVs is their limited flight time. Most commercial UAVs have a flight time of around 20–30 min, which can limit the amount of data that can be collected during a single flight [122]. This can be particularly challenging for large construction projects that require extensive inspection and monitoring.
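The practical impact of a 20–30 min endurance can be seen with a back-of-envelope sortie estimate for mapping flights. All parameter names and default values below are illustrative assumptions for a lawnmower-pattern survey, not vendor specifications:

```python
import math

def flights_needed(area_m2, speed_mps, flight_min, swath_m,
                   side_overlap=0.7, reserve=0.2):
    """Rough number of sorties needed to map an area with a lawnmower pattern."""
    usable_s = flight_min * 60 * (1 - reserve)    # keep a battery reserve margin
    new_swath = swath_m * (1 - side_overlap)      # only the non-overlapping strip adds coverage
    covered_per_flight = usable_s * speed_mps * new_swath
    return math.ceil(area_m2 / covered_per_flight)
```

Under these toy assumptions, a 100 ha site flown at 10 m/s with a 50 m swath and 25 min batteries needs about six sorties, which is why endurance, battery logistics, and charging turnaround dominate survey planning on large projects.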
Another technical limitation of UAVs is their limited range. UAVs are typically limited in their ability to fly long distances or to maintain a strong signal connection with the controller or base station [123]. This can make it difficult to cover large construction sites or to fly UAVs in areas with poor signal coverage.
Weather conditions can also impact the effectiveness of UAV inspections. Rain, high winds, and other adverse weather conditions can make it difficult or unsafe to fly UAVs, which can impact the ability to obtain timely and accurate data [124]. In addition, UAVs are limited in their ability to access certain areas of construction sites. For example, UAVs may not be able to access tight spaces, such as tunnels or narrow corridors, or to fly indoors in areas with limited visibility or signal interference [125].
Finally, the quality of data obtained through UAV inspections can be impacted by technical limitations, such as camera resolution, sensor accuracy, and data processing capabilities. Poor data quality can lead to inaccurate or incomplete assessments of construction site conditions, which can impact decision making and project outcomes.
Technical limitations can impact the effectiveness and safety of using UAVs for construction inspection. Construction companies and inspection firms should carefully consider the technical capabilities of UAVs, as well as potential weather conditions and other environmental factors, when planning UAV inspections. They should also ensure that UAVs are equipped with high-quality cameras and sensors and that data processing capabilities are sufficient to analyze and interpret data effectively.

4.3. Data Processing Challenges

While UAVs can provide valuable data for construction inspection, processing and analyzing that data can be a complex and time-consuming process. Data processing challenges can impact the accuracy and usefulness of the data obtained, as well as the overall efficiency of the inspection process.
One of the primary data processing challenges associated with UAV inspections is managing the large amounts of data that can be generated [126]. UAVs can capture high-resolution images, video, and other data at a rapid pace, which can quickly result in large datasets that need to be processed and analyzed. This can require significant storage and computing resources, as well as specialized software tools for data management and analysis.
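The storage burden is easy to underestimate; a back-of-envelope estimate of raw image volume per flight makes it concrete (the capture interval and per-image size below are illustrative assumptions):

```python
def survey_data_gb(flight_min, interval_s, mb_per_image):
    """Approximate raw image volume (GB) for one timed-capture flight."""
    images = (flight_min * 60) // interval_s   # photos taken at a fixed interval
    return images * mb_per_image / 1024.0
```

A single 25 min flight shooting 25 MB stills every 2 s already yields roughly 18 GB of raw imagery, before any derived products such as point clouds or orthomosaics, and a multi-week monitoring campaign multiplies that accordingly.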
Another data processing challenge is data accuracy and consistency. UAVs can capture data from multiple perspectives and at different times, which can lead to inconsistencies in data quality and accuracy. In addition, data may need to be corrected for factors such as camera distortion or sensor errors, which can further impact data accuracy [127].
Data interpretation is another challenge associated with UAV inspections. The data obtained may need to be processed and analyzed by experts in order to be interpreted accurately. This can require specialized knowledge of construction processes and materials, as well as expertise in data analysis and interpretation.
Data processing challenges can impact the accuracy, usefulness, and efficiency of using UAVs for construction inspection. Construction companies and inspection firms need to carefully consider the storage and computing resources required for data management, as well as the expertise needed for data analysis and interpretation. They should also ensure that data privacy and security regulations are followed to protect the privacy and confidentiality of individuals and businesses.

4.4. Training and Expertise

Using UAVs for construction inspection requires specialized training and expertise in order to ensure safety and accuracy. Without proper training and expertise, there is a risk of accidents or errors, as well as a risk to the safety of the UAV and surrounding environment.
One of the primary challenges associated with using UAVs for construction inspection is the need for specialized training. UAV operators need to be trained in the safe operation of UAVs, as well as in the specific techniques and procedures needed for construction inspection [128]. This may include training in data management and analysis, as well as in the interpretation of data obtained through UAV inspections.
Another challenge is the need for specialized expertise in construction processes and materials. UAV operators need to be familiar with the specific construction processes and materials being used in order to effectively interpret data obtained through UAV inspections [129]. This may require collaboration with experts in construction engineering, materials science, and other related fields.
In addition, ongoing training and certification are important considerations for UAV inspections. UAV technology is constantly evolving, and operators and inspection firms need to stay up to date on the latest advances in order to ensure safety and accuracy [130]. This may include ongoing training and certification programs, as well as continuing education in construction engineering and other related fields.
In summary, training and expertise are critical considerations for using UAVs for construction inspection. Construction companies and inspection firms need to invest in specialized training and equipment, as well as collaborate with experts in construction engineering and materials science, to ensure safety and accuracy in UAV inspections. They should also prioritize ongoing training and certification to stay up to date on the latest advances in UAV technology and construction processes.

4.5. Safety

UAV-based construction inspections can present safety concerns, both for personnel and equipment. These safety concerns can impact the overall effectiveness and efficiency of using UAVs for construction inspection, as well as the safety of the construction site and surrounding areas.
One of the primary safety challenges associated with using UAVs for construction inspection is the risk of accidents. The sound emitted by UAVs can distract construction workers, and the aircraft themselves can collide with other objects or people, or can malfunction and crash, potentially causing injury or damage [131,132,133].
Weather conditions can also present safety challenges for UAV inspections. High winds, rain, and other weather conditions can impact the stability and control of UAVs, potentially leading to accidents or equipment damage [134].
In addition to safety concerns, security is also an important aspect to consider when using UAVs for construction inspection. As Krichen et al. and Ko et al. have pointed out, there are potential risks of cybersecurity breaches and malicious use that can compromise communication between the UAV and the control station [135,136]. This can lead to unauthorized access, data leakage, or even hijacking of the UAV.
While the use of UAVs for construction inspection offers many potential benefits, there are several challenges and limitations that must be considered when implementing this technology. Regulatory and legal issues, technical limitations, data processing challenges, training and expertise, and safety concerns are all factors that can impact the use of UAVs for construction inspection. Addressing these challenges and limitations through careful planning and implementation can help ensure the successful use of UAVs for construction monitoring and inspection in the future.

5. Future Research Directions

The use of UAVs for construction inspection is a rapidly evolving field, with new technologies and applications emerging all the time. As such, there are a number of potential areas for future research in this area that can help to improve the effectiveness and efficiency of using UAVs for construction inspection.
One potential area for future research is the development of more advanced sensors and imaging technologies for UAVs. This could include sensors that can detect temperature changes or identify different types of materials more accurately, as well as imaging technologies that can provide more detailed and accurate images of construction sites. The development of UAVs dedicated to construction use, with purpose-built sensors and integrated on-board technology systems, could also reduce the learning costs for operators.
Another area for future research is the integration of UAVs with other construction technologies, such as BIM software or virtual and augmented reality tools. This could help to streamline the construction inspection process, making it easier for inspectors to identify potential issues and collaborate with other stakeholders.
Machine learning and artificial intelligence (AI) also offer potential avenues for future research in this area. By analyzing large amounts of data collected by UAVs, machine learning algorithms and AI tools could help to identify patterns and trends that might be difficult for human inspectors to detect. More specifically, this includes developing more datasets captured from the UAV perspective and improving the accuracy and efficiency of the inspection process.
Another potential area for future research is the development of more robust and reliable communication and data management systems for UAVs. This could include systems that can operate in remote or challenging environments and tools for securely and efficiently transmitting data from UAVs to inspectors and other stakeholders.
Finally, future research could explore the potential for UAVs to be used in new and innovative ways in the construction industry, such as for site safety monitoring or for environmental monitoring and assessment. By expanding the scope of UAV applications in the construction industry, researchers could help to unlock new opportunities for improving safety, efficiency, and sustainability.

6. Conclusions

The application of UAV-based construction inspection has the potential to revolutionize the construction industry by enhancing safety, efficiency, and accuracy. This review paper has comprehensively analyzed the latest developments in UAV-based technological advancements. Nonetheless, it is apparent that implementing UAVs in construction inspections is not without its challenges and limitations. This paper has identified key issues, such as regulatory and legal concerns, technical limitations, data processing challenges, training and expertise, and safety, which must be carefully considered to maximize the benefits of this technology. Despite these challenges, there are numerous opportunities for further research and development in this area. Innovative sensors and imaging technologies, integration with other construction technologies, and the use of machine learning and AI for data analysis are some of the potential areas for future investigation. By addressing these challenges and maximizing opportunities, the construction industry can benefit from the advantages of UAV technology in improving safety, efficiency, and accuracy. With concerted efforts and a collaborative approach, we can create a safer, more sustainable, and more efficient construction industry.

Author Contributions

Conceptualization, H.L. and S.S.; methodology, H.L., W.B. and J.K.; software, H.L.; writing—original draft preparation, H.L.; writing—review and editing, H.L., S.-C.L., W.B., J.K. and S.S.; visualization, H.L.; supervision, S.S. and S.-C.L.; project administration, S.S. and S.-C.L.; funding acquisition, S.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (2016R1D1A1B02011625).

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Nomenclature

UAVs	Unmanned Aerial Vehicles
VTOL	Vertical Takeoff and Landing
LiDAR	Light Detection and Ranging
GPS	Global Positioning System
RTK	Real-Time Kinematic
RGB	Red-Green-Blue
FOV	Field of View
NDT	Non-Destructive Testing
SfM	Structure from Motion
CNNs	Convolutional Neural Networks
GANs	Generative Adversarial Networks
RNNs	Recurrent Neural Networks
BIM	Building Information Modeling
SVMs	Support Vector Machines
HOG	Histogram of Oriented Gradients
V–J	Viola–Jones
SIFT	Scale Invariant Feature Transform
ISM	Implicit Shape Model
RPN	Region Proposal Network
IPO	Intersected Proportion of Objects
AP	Average Precision
mAP	Mean Average Precision
SSH	Single Stage Headless
EASA	European Aviation Safety Agency
AI	Artificial Intelligence

References

  1. Nikmehr, B.; Hosseini, M.R.; Martek, I.; Zavadskas, E.K.; Antucheviciene, J. Digitalization as a strategic means of achieving sustainable efficiencies in construction management: A critical review. Sustainability 2021, 13, 5040. [Google Scholar] [CrossRef]
  2. Outay, F.; Mengash, H.A.; Adnan, M. Applications of unmanned aerial vehicle (UAV) in road safety, traffic and highway infrastructure management: Recent advances and challenges. Transp. Res. Part A Policy Pract. 2020, 141, 116–129. [Google Scholar] [CrossRef] [PubMed]
  3. Greenwood, W.W.; Lynch, J.P.; Zekkos, D. Applications of UAVs in civil infrastructure. J. Infrastruct. Syst. 2019, 25, 04019002. [Google Scholar] [CrossRef]
  4. Hassanalian, M.; Abdelkefi, A. Classifications, applications, and design challenges of drones: A review. Prog. Aerosp. Sci. 2017, 91, 99–131. [Google Scholar] [CrossRef]
  5. Cai, G.; Lum, K.-Y.; Chen, B.M.; Lee, T.H. A brief overview on miniature fixed-wing unmanned aerial vehicles. In Proceedings of the IEEE ICCA 2010, Xiamen, China, 9–11 June 2010; IEEE: Piscataway, NJ, USA, 2010; pp. 285–290. [Google Scholar] [CrossRef]
  6. Kim, H.J.; Kim, M.; Lim, H.; Park, C.; Yoon, S.; Lee, D.; Choi, H.; Oh, G.; Park, J.; Kim, Y. Fully autonomous vision-based net-recovery landing system for a fixed-wing UAV. IEEE/ASME Trans. Mechatron. 2013, 18, 1320–1333. [Google Scholar] [CrossRef]
  7. Yang, H.; Lee, Y.; Jeon, S.Y.; Lee, D. Multi-rotor drone tutorial: Systems, mechanics, control and state estimation. Intell. Serv. Robot. 2017, 10, 79–93. [Google Scholar] [CrossRef]
  8. Wen, M.C.; Kang, S.C. Augmented reality and unmanned aerial vehicle assist in construction management. In Proceedings of the Computing in Civil and Building Engineering, Orlando, FL, USA, 23–25 June 2014; pp. 1570–1577. [Google Scholar]
  9. Chen, S.; Laefer, D.F.; Mangina, E. State of technology review of civilian UAVs. Recent Pat. Eng. 2016, 10, 160–174. [Google Scholar] [CrossRef] [Green Version]
  10. Czyba, R.; Lemanowicz, M.; Gorol, Z.; Kudala, T. Construction prototyping, flight dynamics modeling, and aerodynamic analysis of hybrid VTOL unmanned aircraft. J. Adv. Transp. 2018, 2018, 7040531. [Google Scholar] [CrossRef]
  11. Hegde, N.T.; George, V.I.; Nayak, C.G.; Kumar, K. Design, dynamic modelling and control of tilt-rotor UAVs: A review. Int. J. Intell. Unmanned Syst. 2019, 8, 143–161. [Google Scholar] [CrossRef]
  12. Rothhaar, P.M.; Murphy, P.C.; Bacon, B.J.; Gregory, I.M.; Grauer, J.A.; Busan, R.C.; Croom, M.A. NASA Langley distributed propulsion VTOL tiltwing aircraft testing, modeling, simulation, control, and flight test development. In Proceedings of the 14th AIAA Aviation Technology, Integration, and Operations Conference, Atlanta, GA, USA, 16–20 June 2014; p. 2999. [Google Scholar]
  13. Kamal, A.; Ramirez-Serrano, A. Conceptual design of a highly-maneuverable transitional VTOL UAV with new maneuver and control capabilities. In Proceedings of the AIAA Scitech 2020 Forum, Orlando, FL, USA, 6–10 January 2020; p. 1733. [Google Scholar]
  14. Saeed, A.S.; Younes, A.B.; Cai, C.; Cai, G. A survey of hybrid unmanned aerial vehicles. Prog. Aerosp. Sci. 2018, 98, 91–105. [Google Scholar] [CrossRef]
  15. Wilson, A.N.; Kumar, A.; Jha, A.; Cenkeramaddi, L.R. Embedded Sensors, Communication Technologies, Computing Platforms and Machine Learning for UAVs: A Review. IEEE Sens. J. 2021, 22, 1807–1826. [Google Scholar] [CrossRef]
  16. Pajares, G. Overview and current status of remote sensing applications based on unmanned aerial vehicles (UAVs). Photogramm. Eng. Remote Sens. 2015, 81, 281–330. [Google Scholar] [CrossRef] [Green Version]
  17. Omar, T.; Nehdi, M.L. Remote sensing of concrete bridge decks using unmanned aerial vehicle infrared thermography. Autom. Constr. 2017, 83, 360–371. [Google Scholar] [CrossRef]
  18. Motawa, I.; Kardakou, A. Unmanned aerial vehicles (UAVs) for inspection in construction and building industry. In Proceedings of the 16th International Operation and Maintenance Conference, Cairo, Egypt, 18–20 November 2018. [Google Scholar]
  19. Ham, Y.; Han, K.K.; Lin, J.J.; Golparvar-Fard, M. Visual monitoring of civil infrastructure systems via camera-equipped Unmanned Aerial Vehicles (UAVs): A review of related works. Vis. Eng. 2016, 4, 1. [Google Scholar] [CrossRef] [Green Version]
  20. Boonpook, W.; Tan, Y.; Xu, B. Deep learning-based multi-feature semantic segmentation in building extraction from images of UAV photogrammetry. Int. J. Remote Sens. 2021, 42, 1–19. [Google Scholar] [CrossRef]
  21. Wang, Y.; Li, S.; Teng, F.; Lin, Y.; Wang, M.; Cai, H. Improved mask R-CNN for rural building roof type recognition from UAV high-resolution images: A case study in Hunan Province, China. Remote Sens. 2022, 14, 265. [Google Scholar] [CrossRef]
  22. Kestur, R.; Farooq, S.; Abdal, R.; Mehraj, E.; Narasipura, O.S.; Mudigere, M. UFCN: A fully convolutional neural network for road extraction in RGB imagery acquired by remote sensing from an unmanned aerial vehicle. J. Appl. Remote Sens. 2018, 12, 016020. [Google Scholar] [CrossRef]
  23. Varia, N.; Dokania, A.; Senthilnath, J. DeepExt: A convolution neural network for road extraction using RGB images captured by UAV. In Proceedings of the 2018 IEEE Symposium Series on Computational Intelligence (SSCI), Bangalore, India, 18–21 November 2018; pp. 1890–1895. [Google Scholar]
  24. Senthilnath, J.; Varia, N.; Dokania, A.; Anand, G.; Benediktsson, J.A. Deep TEC: Deep transfer learning with ensemble classifier for road extraction from UAV imagery. Remote Sens. 2020, 12, 245. [Google Scholar] [CrossRef] [Green Version]
  25. Feng, Q.; Liu, J.; Gong, J. UAV remote sensing for urban vegetation mapping using random forest and texture analysis. Remote Sens. 2015, 7, 1074–1094. [Google Scholar] [CrossRef] [Green Version]
  26. Senthilnath, J.; Kandukuri, M.; Dokania, A.; Ramesh, K.N. Application of UAV imaging platform for vegetation analysis based on spectral-spatial methods. Comput. Electron. Agric. 2017, 140, 8–24. [Google Scholar] [CrossRef]
  27. Pádua, L.; Marques, P.; Hruška, J.; Adão, T.; Peres, E.; Morais, R.; Sousa, J.J. Multi-temporal vineyard monitoring through UAV-based RGB imagery. Remote Sens. 2018, 10, 1907. [Google Scholar] [CrossRef] [Green Version]
  28. Ocer, N.E.; Kaplan, G.; Erdem, F.; Kucuk Matci, D.; Avdan, U. Tree extraction from multi-scale UAV images using Mask R-CNN with FPN. Remote Sens. Lett. 2020, 11, 847–856. [Google Scholar] [CrossRef]
  29. Adade, R.; Aibinu, A.M.; Ekumah, B.; Asaana, J. Unmanned Aerial Vehicle (UAV) applications in coastal zone management—A review. Environ. Monit. Assess. 2021, 193, 154. [Google Scholar] [CrossRef] [PubMed]
  30. Lee, S.; Song, Y.; Kil, S.H. Feasibility analyses of real-time detection of wildlife using UAV-derived thermal and rgb images. Remote Sens. 2021, 13, 2169. [Google Scholar] [CrossRef]
  31. Moon, D.; Chung, S.; Kwon, S.; Seo, J.; Shin, J. Comparison and utilization of point cloud generated from photogrammetry and laser scanning: 3D world model for smart heavy equipment planning. Autom. Constr. 2019, 98, 322–331. [Google Scholar] [CrossRef]
  32. Room, M.H.M.; Anuar, A. Integration of Lidar system, mobile laser scanning (MLS) and unmanned aerial vehicle system for generation of 3d building model application: A review. In Proceedings of the IOP Conference Series: Earth and Environmental Science, 11th IGRSM International Conference and Exhibition on Geospatial & Remote Sensing, Kuala Lumpur, Malaysia, 8–9 March 2022; Volume 1064, p. 012042. [Google Scholar]
  33. Kwon, S.; Park, J.W.; Moon, D.; Jung, S.; Park, H. Smart merging method for hybrid point cloud data using UAV and LIDAR in earthwork construction. Procedia Eng. 2017, 196, 21–28. [Google Scholar] [CrossRef]
  34. Chen, Z.; Zhang, W.; Huang, R.; Dong, Z.; Chen, C.; Jiang, L.; Wang, H. 3D model-based terrestrial laser scanning (TLS) observation network planning for large-scale building facades. Autom. Constr. 2022, 144, 104594. [Google Scholar] [CrossRef]
  35. Park, J.K.; Lee, K.W. Efficiency Analysis of Construction Automation Using 3D Geospatial Information. Sens. Mater 2022, 34, 415–425. [Google Scholar] [CrossRef]
  36. Room, M.H.M.; Ahmad, A. Fusion of UAV-Based LiDAR and Mobile Laser Scanning Data for Construction of 3D Building Model. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2023, 48, 297–302. [Google Scholar] [CrossRef]
  37. Miethig, B.; Liu, A.; Habibi, S.; Mohrenschildt, M.V. Leveraging thermal imaging for autonomous driving. In Proceedings of the 2019 IEEE Transportation Electrification Conference and Expo (ITEC), Detroit, MI, USA, 19–21 June 2019; pp. 1–5. [Google Scholar]
  38. Costello, J.; Stewart, I.B.; Donnelly, A.E.; Selfe, J.; Karki, A.I. Use of thermal imaging in sports medicine research: A short report: Short article. Int. Sport. J. 2013, 14, 94–98. [Google Scholar]
  39. Ottaviani, M.; Giammichele, L.; Fioretti, F.; Ricci, R. Thermal and visual remote sensing of residential buildings by UAV. In Proceedings of the IOP Conference Series: Earth and Environmental Science, 7th AIGE/IIETA International Conference and 16th AIGE Conference on: Energy Conversion, Management, Recovery, Saving, Storage and Renewable Systems (AIGE 2022), Parma, Italy, 8–9 June 2022; Volume 1106, p. 012019. [Google Scholar]
  40. Borrmann, D.; Elseberg, J.; Nüchter, A. Thermal 3D mapping of building façades. In Intelligent Autonomous Systems 12: Volume 1, Proceedings of the 12th International Conference IAS-12, Jeju Island, Republic of Korea, 26–29 June 2012; Springer: Berlin/Heidelberg, Germany, 2013; pp. 173–182. [Google Scholar]
  41. Amon, F.; Pearson, C. Thermal imaging in firefighting and thermography applications. Radiom. Temp. Meas. II. Appl. 2009, 43, 279–331. [Google Scholar]
  42. Kylili, A.; Fokaides, P.A.; Christou, P.; Kalogirou, S.A. Infrared thermography (IRT) applications for building diagnostics: A review. Appl. Energy 2014, 134, 531–549. [Google Scholar] [CrossRef]
  43. Mahmoodzadeh, M.; Gretka, V.; Wong, S.; Froese, T.; Mukhopadhyaya, P. Evaluating patterns of building envelope air leakage with infrared thermography. Energies 2020, 13, 3545. [Google Scholar] [CrossRef]
  44. Taylor, T.; Counsell, J.; Gill, S. Energy efficiency is more than skin deep: Improving construction quality control in new-build housing using thermography. Energy Build. 2013, 66, 222–231. [Google Scholar] [CrossRef]
  45. Lucchi, E. Applications of the infrared thermography in the energy audit of buildings: A review. Renew. Sustain. Energy Rev. 2018, 82, 3077–3090. [Google Scholar] [CrossRef]
  46. Lerma, J.L.; Cabrelles, M.; Portales, C. Multitemporal thermal analysis to detect moisture on a building façade. Constr. Build. Mater. 2011, 25, 2190–2197. [Google Scholar] [CrossRef]
  47. Abdel-Qader, I.; Yohali, S.; Abudayyeh, O.; Yehia, S. Segmentation of thermal images for non-destructive evaluation of bridge decks. Ndt E Int. 2008, 41, 395–405. [Google Scholar] [CrossRef]
  48. Warsi, Z.H.; Irshad, S.M.; Khan, F.; Shahbaz, M.A.; Junaid, M.; Amin, S.U. Sensors for structural health monitoring: A review. In Proceedings of the 2019 Second International Conference on Latest Trends in Electrical Engineering and Computing Technologies (INTELLECT), Karachi, Pakistan, 13–14 November 2019; pp. 1–6. [Google Scholar]
  49. Zhang, R.; Li, H.; Duan, K.; You, S.; Liu, K.; Wang, F.; Hu, Y. Automatic detection of earthquake-damaged buildings by integrating UAV oblique photography and infrared thermal imaging. Remote Sens. 2020, 12, 2621. [Google Scholar] [CrossRef]
  50. Alam, M.S.; Bognar, J.G.; Hardie, R.C.; Yasuda, B.J. Infrared image registration and high-resolution reconstruction using multiple translationally shifted aliased video frames. IEEE Trans. Instrum. Meas. 2000, 49, 915–923. [Google Scholar] [CrossRef] [Green Version]
  51. Goss, T.M.; Barnard, P.W.; Fildis, H.; Erbudak, M.; Senger, T.; Alpman, M.E. Field of view selection for optimal airborne imaging sensor performance. In Proceedings of the Infrared Imaging Systems: Design, Analysis, Modeling, and Testing XXV, Baltimore, MD, USA, 6–8 May 2014; Volume 9071, pp. 9–18. [Google Scholar]
  52. O’Shaughnessy, S.A.; Hebel, M.A.; Evett, S.R.; Colaizzi, P.D. Evaluation of a wireless infrared thermometer with a narrow field of view. Comput. Electron. Agric. 2011, 76, 59–68. [Google Scholar] [CrossRef] [Green Version]
  53. Videras Rodríguez, M.; Melgar, S.G.; Cordero, A.S.; Márquez, J.M.A. A critical review of unmanned aerial vehicles (UAVs) use in architecture and urbanism: Scientometric and bibliometric analysis. Appl. Sci. 2021, 11, 9966. [Google Scholar] [CrossRef]
  54. Abdelfatah, R.; Moawad, A.; Alshaer, N.; Ismail, T. UAV tracking system using integrated sensor fusion with RTK-GPS. In Proceedings of the 2021 International Mobile, Intelligent, and Ubiquitous Computing Conference (MIUCC), Cairo, Egypt, 26–27 May 2021; pp. 352–356. [Google Scholar]
  55. Eling, C.; Klingbeil, L.; Kuhlmann, H. Real-time single-frequency GPS/MEMS-IMU attitude determination of lightweight UAVs. Sensors 2015, 15, 26212–26235. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  56. Yang, J.C.; Lin, C.J.; You, B.Y.; Yan, Y.L.; Cheng, T.H. RTLIO: Real-time LiDAR-inertial odometry and mapping for UAVs. Sensors 2021, 21, 3955. [Google Scholar] [CrossRef] [PubMed]
  57. Zimmermann, F.; Eling, C.; Klingbeil, L.; Kuhlmann, H. Precise Positioning of UAVs: Dealing with Challenging RTK-GPS Measurement Conditions during Automated UAV Flights. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 4, 95–102. [Google Scholar] [CrossRef] [Green Version]
  58. Jacob-Loyola, N.; Muñoz-La Rivera, F.; Herrera, R.F.; Atencio, E. Unmanned aerial vehicles (UAVs) for physical progress monitoring of construction. Sensors 2021, 21, 4227. [Google Scholar] [CrossRef]
  59. Czyża, S.; Szuniewicz, K.; Kowalczyk, K.; Dumalski, A.; Ogrodniczak, M.; Zieleniewicz, Ł. Assessment of Accuracy in Unmanned Aerial Vehicle (UAV) Pose Estimation with the Real-Time Kinematic (RTK) Method on the Example of DJI Matrice 300 RTK. Sensors 2023, 23, 2092. [Google Scholar] [CrossRef]
  60. Song, C.; Chen, Z.; Wang, K.; Luo, H.; Cheng, J.C. BIM-supported scan and flight planning for fully autonomous LiDAR-carrying UAVs. Autom. Constr. 2022, 142, 104533. [Google Scholar] [CrossRef]
  61. Zhai, W.; Han, B.; Li, D.; Duan, J.; Cheng, C. A low-altitude public air route network for UAV management constructed by global subdivision grids. PLoS ONE 2021, 16, e0249680. [Google Scholar] [CrossRef]
  62. Bouras, A.; Bouzid, Y.; Guiatni, M. Multi-uavs coverage path planning. In Proceedings of the 4th International Conference on Electrical Engineering and Control Applications: ICEECA 2019, Constantine, Algeria, 17–19 December 2019; pp. 23–36. [Google Scholar]
  63. Aggarwal, S.; Kumar, N. Path planning techniques for unmanned aerial vehicles: A review, solutions, and challenges. Comput. Commun. 2020, 149, 270–299. [Google Scholar] [CrossRef]
  64. Shi, J.; Tan, L.; Lian, X.; Xu, T.; Zhang, H.; Zhang, Y. A multi-unmanned aerial vehicle dynamic task assignment method based on bionic algorithms. Comput. Electr. Eng. 2022, 99, 107820. [Google Scholar] [CrossRef]
  65. Barba, S.; Barbarella, M.; Di Benedetto, A.; Fiani, M.; Gujski, L.; Limongiello, M. Accuracy assessment of 3D photogrammetric models from an unmanned aerial vehicle. Drones 2019, 3, 79. [Google Scholar] [CrossRef] [Green Version]
  66. Deliry, S.I.; Avdan, U. Accuracy of unmanned aerial systems photogrammetry and structure from motion in surveying and mapping: A review. J. Indian Soc. Remote Sens. 2021, 49, 1997–2017. [Google Scholar] [CrossRef]
  67. Nooralishahi, P.; Ibarra-Castanedo, C.; Deane, S.; López, F.; Pant, S.; Genest, M.; Avdelidis, N.P.; Maldague, X.P. Drone-based non-destructive inspection of industrial sites: A review and case studies. Drones 2021, 5, 106. [Google Scholar] [CrossRef]
  68. Oliensis, J. A critique of structure-from-motion algorithms. Comput. Vis. Image Underst. 2000, 80, 172–214. [Google Scholar] [CrossRef] [Green Version]
  69. Bianco, S.; Ciocca, G.; Marelli, D. Evaluating the performance of structure from motion pipelines. J. Imaging 2018, 4, 98. [Google Scholar] [CrossRef] [Green Version]
  70. Shalaby, A.; Elmogy, M.; El-Fetouh, A.A. Algorithms and applications of structure from motion (SFM): A survey. Algorithms 2017, 6. [Google Scholar] [CrossRef]
  71. El-Omari, S.; Moselhi, O. Integrating 3D laser scanning and photogrammetry for progress measurement of construction work. Autom. Constr. 2008, 18, 1–9. [Google Scholar] [CrossRef]
  72. Lo, Y.; Zhang, C.; Ye, Z.; Cui, C. Monitoring road base course construction progress by photogrammetry-based 3D reconstruction. Int. J. Constr. Manag. 2022, 1–15. [Google Scholar] [CrossRef]
  73. Krizhevsky, A.; Sutskever, I.; Hinton, G.E. Imagenet classification with deep convolutional neural networks. Commun. ACM 2017, 60, 84–90. [Google Scholar] [CrossRef] [Green Version]
  74. Goodfellow, I.; Pouget-Abadie, J.; Mirza, M.; Xu, B.; Warde-Farley, D.; Ozair, S.; Courville, A.; Bengio, Y. Generative adversarial networks. Commun. ACM 2020, 63, 139–144. [Google Scholar] [CrossRef]
  75. Zaremba, W.; Sutskever, I.; Vinyals, O. Recurrent neural network regularization. arXiv 2014, arXiv:1409.2329. [Google Scholar]
  76. Parusheva, S. Digitalization and Digital Transformation in Construction-Benefits and Challenges. Inf. Commun. Technol. Bus. Educ. 2019, 126–134. [Google Scholar]
  77. Jiang, Y.; Han, S.; Bai, Y. Building and infrastructure defect detection and visualization using drone and deep learning technologies. J. Perform. Constr. Facil. 2021, 35, 04021092. [Google Scholar] [CrossRef]
  78. Kalfarisi, R.; Wu, Z.Y.; Soh, K. Crack detection and segmentation using deep learning with 3D reality mesh model for quantitative assessment and integrated visualization. J. Comput. Civ. Eng. 2020, 34, 04020010. [Google Scholar] [CrossRef]
  79. Kung, R.Y.; Pan, N.H.; Wang, C.C.; Lee, P.C. Application of deep learning and unmanned aerial vehicle on building maintenance. Adv. Civ. Eng. 2021, 2021, 5598690. [Google Scholar] [CrossRef]
  80. Roca, D.; Armesto, J.; Lagüela, S.; Díaz-Vilariño, L. Lidar-equipped uav for building information modelling. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2014, 40, 523–527. [Google Scholar] [CrossRef] [Green Version]
  81. Mahajan, G. Applications of drone technology in construction industry: A study 2012–2021. Int. J. Eng. Adv. Technol. 2021, 11, 224–239. [Google Scholar] [CrossRef]
  82. Dahaghin, M.; Samadzadegan, F.; Dadrass Javan, F. Precise 3D extraction of building roofs by fusion of UAV-based thermal and visible images. Int. J. Remote Sens. 2021, 42, 7002–7030. [Google Scholar] [CrossRef]
  83. Truong-Hong, L.; Chen, S.; Cao, V.L.; Laefer, D.F. Automatic bridge deck damage detection using low-cost UAV-based images. In Proceedings of the TU1406 Quality Specifications for Roadway Bridges Standardization at a European Level, Barcelona, Spain, 27–28 September 2018. [Google Scholar]
  84. Angelosanti, M.; Kulkarni, N.N.; Sabato, A. Combination of Building Information Modeling and Infrared Point Cloud for Nondestructive Evaluation. In Proceedings of the 2022 IEEE International Workshop on Metrology for Living Environment (MetroLivEn), Cosenza, Italy, 25–27 May 2022; pp. 269–273. [Google Scholar]
  85. Li, J.; Li, X.; Liu, K.; Yao, Z. Crack Identification for Bridge Structures Using an Unmanned Aerial Vehicle (UAV) Incorporating Image Geometric Correction. Buildings 2022, 12, 1869. [Google Scholar] [CrossRef]
  86. Zhou, R.; Wen, Z.; Su, H. Automatic recognition of earth rock embankment leakage based on UAV passive infrared thermography and deep learning. ISPRS J. Photogramm. Remote Sens. 2022, 191, 85–104. [Google Scholar] [CrossRef]
  87. Wang, Z.F.; Yu, Y.F.; Wang, J.; Zhang, J.Q.; Zhu, H.L.; Li, P.; Xu, L.; Jiang, H.-N.; Sui, Q.-M.; Jia, L.; et al. Convolutional neural-network-based automatic dam-surface seepage defect identification from thermograms collected from UAV-mounted thermal imaging camera. Constr. Build. Mater. 2022, 323, 126416. [Google Scholar] [CrossRef]
  88. Tan, Y.; Li, G.; Cai, R.; Ma, J.; Wang, M. Mapping and modelling defect data from UAV captured images to BIM for building external wall inspection. Autom. Constr. 2022, 139, 104284. [Google Scholar] [CrossRef]
  89. Akinosho, T.D.; Oyedele, L.O.; Bilal, M.; Ajayi, A.O.; Delgado, M.D.; Akinade, O.O.; Ahmed, A.A. Deep learning in the construction industry: A review of present status and future innovations. J. Build. Eng. 2020, 32, 101827. [Google Scholar] [CrossRef]
  90. Mostafa, K.; Hegazy, T. Review of image-based analysis and applications in construction. Autom. Constr. 2021, 122, 103516. [Google Scholar] [CrossRef]
  91. Yeh, C.H.; Lin, C.Y.; Muchtar, K.; Lai, H.E.; Sun, M.T. Three-pronged compensation and hysteresis thresholding for moving object detection in real-time video surveillance. IEEE Trans. Ind. Electron. 2017, 64, 4945–4955. [Google Scholar] [CrossRef]
  92. Ben-Musa, A.S.; Singh, S.K.; Agrawal, P. Object detection and recognition in cluttered scene using Harris Corner Detection. In Proceedings of the 2014 International Conference on Control, Instrumentation, Communication and Computational Technologies (ICCICCT), Kanyakumari, India, 10–11 July 2014; pp. 181–184. [Google Scholar]
  93. Ercan, M.F.; Wang, R.B. Deep learning for accurate corner detection in computer vision-based inspection. In Proceedings of the Computational Science and Its Applications–ICCSA 2021: 21st International Conference, Cagliari, Italy, 13–16 September 2021; Part II 21. pp. 45–54. [Google Scholar]
  94. Serre, T.; Wolf, L.; Poggio, T. Object recognition with features inspired by visual cortex. In Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR’05), San Diego, CA, USA, 20–25 June 2005; Volume 2, pp. 994–1000. [Google Scholar]
  95. Zhu, Z.; Brilakis, I. Concrete column recognition in images and videos. J. Comput. Civ. Eng. 2010, 24, 478–487. [Google Scholar] [CrossRef]
  96. Kim, H.; Kim, H. 3D reconstruction of a concrete mixer truck for training object detectors. Autom. Constr. 2018, 88, 23–30. [Google Scholar] [CrossRef]
  97. Xu, Y.; Yu, G.; Wang, Y.; Wu, X.; Ma, Y. A hybrid vehicle detection method based on viola-jones and HOG + SVM from UAV images. Sensors 2016, 16, 1325. [Google Scholar] [CrossRef] [Green Version]
  98. Chen, X.; Meng, Q. Vehicle detection from UAVs by using SIFT with implicit shape model. In Proceedings of the 2013 IEEE International Conference on Systems, Man, and Cybernetics, Manchester, UK, 13–16 October 2013; pp. 3139–3144. [Google Scholar]
  99. Girshick, R.; Donahue, J.; Darrell, T.; Malik, J. Rich feature hierarchies for accurate object detection and semantic segmentation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Columbus, OH, USA, 23–28 June 2014; pp. 580–587. [Google Scholar]
  100. Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You only look once: Unified, real-time object detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 779–788. [Google Scholar]
  101. Redmon, J.; Farhadi, A. YOLO9000: Better, faster, stronger. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 7263–7271. [Google Scholar]
  102. Redmon, J.; Farhadi, A. Yolov3: An incremental improvement. arXiv 2018, arXiv:1804.02767. [Google Scholar]
  103. Bochkovskiy, A.; Wang, C.Y.; Liao, H.Y.M. Yolov4: Optimal speed and accuracy of object detection. arXiv 2020, arXiv:2004.10934. [Google Scholar]
  104. Zhu, X.; Lyu, S.; Wang, X.; Zhao, Q. TPH-YOLOv5: Improved YOLOv5 based on transformer prediction head for object detection on drone-captured scenarios. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Montreal, BC, Canada, 11–17 October 2021; pp. 2778–2788. [Google Scholar]
  105. Wang, C.Y.; Bochkovskiy, A.; Liao, H.Y.M. YOLOv7: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors. arXiv 2022, arXiv:2207.02696. [Google Scholar]
  106. Ge, Z.; Liu, S.; Wang, F.; Li, Z.; Sun, J. Yolox: Exceeding yolo series in 2021. arXiv 2021, arXiv:2107.08430. [Google Scholar]
  107. Ding, J.; Zhang, J.; Zhan, Z.; Tang, X.; Wang, X. A Precision Efficient Method for Collapsed Building Detection in Post-Earthquake UAV Images Based on the Improved NMS Algorithm and Faster R-CNN. Remote Sens. 2022, 14, 663. [Google Scholar] [CrossRef]
  108. Lee, J.H.; Yoon, S.S.; Kim, I.H.; Jung, H.J. Diagnosis of crack damage on structures based on image processing techniques and R-CNN using unmanned aerial vehicle (UAV). In Proceedings of the Sensors and Smart Structures Technologies for Civil, Mechanical, and Aerospace Systems 2018, Denver, CO, USA, 5–8 March 2018; Volume 10598, pp. 265–272. [Google Scholar]
  109. Zhou, Q.; Ding, S.; Qing, G.; Hu, J. UAV vision detection method for crane surface cracks based on Faster R-CNN and image segmentation. J. Civ. Struct. Health Monit. 2022, 12, 845–855. [Google Scholar] [CrossRef]
  110. Sharma, S.; Susmitha, A.V.V.; Van, L.D.; Tseng, Y.C. An edge-controlled outdoor autonomous UAV for colorwise safety helmet detection and counting of workers in construction sites. In Proceedings of the 2021 IEEE 94th Vehicular Technology Conference (VTC2021-Fall), Nanjing, China, 27–30 September 2021; pp. 1–5. [Google Scholar]
  111. Liang, H.; Seo, S. UAV Low-Altitude Remote Sensing Inspection System Using a Small Target Detection Network for Helmet Wear Detection. Remote Sens. 2023, 15, 196. [Google Scholar] [CrossRef]
  112. Qiu, Z.; Bai, H.; Chen, T. Special Vehicle Detection from UAV Perspective via YOLO-GNS Based Deep Learning Network. Drones 2023, 7, 117. [Google Scholar] [CrossRef]
  113. Li, B.; Gan, Z.; Neretin, E.S.; Yang, Z. Object Recognition Through UAV Observations Based on Yolo and Generative Adversarial Network. In Proceedings of the IoT as a Service: 6th EAI International Conference, IoTaaS 2020, Xi’an, China, 19–20 November 2020; Proceedings 6. pp. 439–449. [Google Scholar]
  114. Xiao, B.; Kang, S.C. Development of an image data set of construction machines for deep learning object detection. J. Comput. Civ. Eng. 2021, 35, 05020005. [Google Scholar] [CrossRef]
  115. Han, Q.; Liu, X.; Xu, J. Detection and location of steel structure surface cracks based on unmanned aerial vehicle images. J. Build. Eng. 2022, 50, 104098. [Google Scholar] [CrossRef]
  116. Lightfoot, T.R. Bring on the Drones: Legal and Regulatory Issues in Using Unmanned Aircraft Systems. Nat. Resour. Environ. 2018, 32, 41–45. [Google Scholar]
  117. Bassi, E. From here to 2023: Civil drones operations and the setting of new legal rules for the European single sky. J. Intell. Robot. Syst. 2020, 100, 493–503. [Google Scholar] [CrossRef]
  118. Huttunen, M. Civil unmanned aircraft systems and security: The European approach. J. Transp. Secur. 2019, 12, 83–101. [Google Scholar] [CrossRef] [Green Version]
  119. Watkins, S.; Burry, J.; Mohamed, A.; Marino, M.; Prudden, S.; Fisher, A.; Kloet, N.; Jakobi, T.; Clothier, R. Ten questions concerning the use of drones in urban environments. Build. Environ. 2020, 167, 106458. [Google Scholar] [CrossRef]
  120. Vattapparamban, E.; Güvenç, I.; Yurekli, A.I.; Akkaya, K.; Uluağaç, S. Drones for smart cities: Issues in cybersecurity, privacy, and public safety. In Proceedings of the 2016 International Wireless Communications and Mobile Computing Conference (IWCMC), Paphos, Cyprus, 5–9 September 2016; pp. 216–221. [Google Scholar]
  121. Finn, R.L.; Wright, D. Unmanned aircraft systems: Surveillance, ethics and privacy in civil applications. Comput. Law Secur. Rev. 2012, 28, 184–194. [Google Scholar] [CrossRef]
  122. Biczyski, M.; Sehab, R.; Whidborne, J.F.; Krebs, G.; Luk, P. Multirotor sizing methodology with flight time estimation. J. Adv. Transp. 2020, 2020, 9689604. [Google Scholar] [CrossRef] [Green Version]
  123. Zeng, Y.; Zhang, R.; Lim, T.J. Wireless communications with unmanned aerial vehicles: Opportunities and challenges. IEEE Commun. Mag. 2016, 54, 36–42. [Google Scholar] [CrossRef] [Green Version]
  124. York, D.D.; Al-Bayati, A.J.; Al-Shabbani, Z.Y. Potential Applications of UAV within the Construction Industry and the Challenges Limiting Implementation. In Proceedings of the Construction Research Congress 2020: Project Management and Controls, Materials, and Contracts, Tempe, AZ, USA, 8–10 March 2020; American Society of Civil Engineers: Reston, VA, USA, 2021; pp. 31–39. [Google Scholar]
  125. Norton, A.; Ahmadzadeh, R.; Jerath, K.; Robinette, P.; Weitzen, J.; Wickramarathne, T.; Yanco, H.; Choi, M.; Donald, R.; Donoghue, B.; et al. DECISIVE Test Methods Handbook: Test Methods for Evaluating sUAS in Subterranean and Constrained Indoor Environments, Version 1.1. arXiv 2022, arXiv:2211.01801. [Google Scholar]
  126. Martinez, J.G.; Gheisari, M.; Alarcón, L.F. UAV integration in current construction safety planning and monitoring processes: Case study of a high-rise building construction project in Chile. J. Manag. Eng. 2020, 36, 05020005. [Google Scholar] [CrossRef]
  127. Congalton, R.G. Remote sensing and geographic information system data integration: Error sources and research issues. Photogramm. Eng. Remote Sens. 1991, 57, 677–687. [Google Scholar]
  128. Jeelani, I.; Gheisari, M. Safety challenges of UAV integration in construction: Conceptual analysis and future research roadmap. Saf. Sci. 2021, 144, 105473. [Google Scholar] [CrossRef]
  129. Rachmawati, T.S.N.; Kim, S. Unmanned Aerial Vehicles (UAV) Integration with Digital Technologies toward Construction 4.0: A Systematic Literature Review. Sustainability 2022, 14, 5708. [Google Scholar] [CrossRef]
  130. Subramanya, K.; Kermanshachi, S.; Patel, R.K. The Future of Highway and Bridge Construction: Digital Project Delivery Using Integrated Advanced Technologies. In Proceedings of the International Conference on Transportation and Development 2022, Seattle, WA, USA, 31 May–3 June 2022; pp. 14–25. [Google Scholar]
  131. McCabe, B.Y.; Hamledari, H.; Shahi, A.; Zangeneh, P.; Azar, E.R. Roles, benefits, and challenges of using UAVs for indoor smart construction applications. In Proceedings of the Computing in Civil Engineering 2017, Seattle, WA, USA, 25–27 June 2017; pp. 349–357. [Google Scholar]
  132. Izadi Moud, H.; Flood, I.; Zhang, X.; Abbasnejad, B.; Rahgozar, P.; McIntyre, M. Quantitative assessment of proximity risks associated with unmanned aerial vehicles in construction. J. Manag. Eng. 2021, 37, 04020095. [Google Scholar] [CrossRef]
  133. Irizarry, J.; Gheisari, M.; Walker, B.N. Usability assessment of drone technology as safety inspection tools. J. Inf. Technol. Constr. (ITcon) 2012, 17, 194–212. [Google Scholar]
  134. Mosly, I. Applications and issues of unmanned aerial systems in the construction industry. Safety 2017, 21, 31. [Google Scholar] [CrossRef] [Green Version]
  135. Krichen, M.; Adoni, W.Y.H.; Mihoub, A.; Alzahrani, M.Y.; Nahhal, T. Security Challenges for Drone Communications: Possible Threats, Attacks and Countermeasures. In Proceedings of the 2022 2nd International Conference of Smart Systems and Emerging Technologies (SMARTTECH), Riyadh, Saudi Arabia, 9–11 May 2022; pp. 184–189. [Google Scholar]
  136. Ko, Y.; Kim, J.; Duguma, D.G.; Astillo, P.V.; You, I.; Pau, G. Drone secure communication protocol for future sensitive applications in military zone. Sensors 2021, 21, 2057. [Google Scholar] [CrossRef]
Figure 1. Conceptual framework of advancements, challenges, and future directions for UAVs in construction.
Figure 2. Some examples of fixed-wing UAVs.
Figure 3. Examples of multi-rotor UAVs: (a) quadcopter; (b) hexacopter; and (c) octocopter.
Figure 4. Examples of hybrid UAVs: (a) tilt-rotor aircraft and (b) tilt-wing aircraft.
Figure 5. Examples of UAVs equipped with sensors in the construction industry: (a) visible light sensors; (b) LiDAR; (c) TI; and (d) GPS and RTK.
Figure 6. Photography-based 3D modeling process using a UAV: (a) sparse point clouds from aerial triangulation and (b) 3D modeling results.
Figure 7. Visualization results of the automated helmet-wearing detection method for construction-site workers proposed by Liang et al. [111], applied to UAV imagery.
Table 1. Comparison of advantages and disadvantages of fixed-wing, multi-rotor, and hybrid UAVs for construction industry applications.

| Type of UAV | Advantages | Disadvantages | References |
| --- | --- | --- | --- |
| Fixed-wing UAVs | Long range and endurance; fast speed | Not as agile or versatile as rotary-wing UAVs; require specialized training and equipment | Cai et al. [5]; Kim et al. [6] |
| Multi-rotor UAVs | Can operate in confined or urban environments; highly agile and can hover in place; cost-effective | Limited range and speed compared to fixed-wing UAVs; less efficient, with shorter flight times; payload capacity is usually limited | Yang et al. [7]; Wen et al. [8]; Chen et al. [9] |
| Hybrid UAVs | Versatile and can perform a wide range of tasks; can cover long distances quickly; can carry a larger payload | More expensive than traditional fixed-wing or rotary-wing UAVs; more complex to operate, requiring specialized training and expertise; more vulnerable to certain weather conditions | Czyba et al. [10]; Saeed et al. [14] |
Table 2. Comparison of the advantages and disadvantages of each sensor in construction industry applications.

Visible light sensors
  Advantages: can capture high-resolution images with a high level of detail; easy to use and widely available; can capture images in real time; suitable for use in different environments.
  Disadvantages: only able to capture visible light; sensitive to changes in lighting conditions; sensitive to reflections and glare.
  References: Ham et al. [19]; Adade et al. [29]; Lee et al. [30].

LiDAR sensors
  Advantages: high accuracy; efficient at quickly capturing large amounts of data; can generate highly precise 3D models; ability to identify potential hazards.
  Disadvantages: expensive to purchase and maintain; require specialized training and expertise; limited range of around 100 m.
  References: Moon et al. [31]; Park et al. [35]; Room et al. [36].

TI sensors
  Advantages: ability to detect heat-related and structural issues; can identify temperature anomalies.
  Disadvantages: limited to detecting heat-related issues and temperature anomalies.
  References: Warsi et al. [48]; Zhang et al. [49]; Goss et al. [51].

GPS and RTK sensors
  Advantages: ability to determine location and elevation; can be used to create accurate 3D models of construction sites; useful for monitoring site progress and tracking equipment; can increase efficiency and productivity on construction sites.
  Disadvantages: limited accuracy in urban canyons and under dense canopy cover; susceptible to signal jamming and interference; RTK sensors require a base station and may have limited range.
  References: Abdelfatah et al. [54]; Eling et al. [55]; Yang et al. [56]; Czyża et al. [59].
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Citation: Liang, H.; Lee, S.-C.; Bae, W.; Kim, J.; Seo, S. Towards UAVs in Construction: Advancements, Challenges, and Future Directions for Monitoring and Inspection. Drones 2023, 7, 202. https://doi.org/10.3390/drones7030202
