Review

Robots in Inspection and Monitoring of Buildings and Infrastructure: A Systematic Review

Myers-Lawson School of Construction, Virginia Tech, Blacksburg, VA 24061, USA
*
Author to whom correspondence should be addressed.
Appl. Sci. 2023, 13(4), 2304; https://doi.org/10.3390/app13042304
Submission received: 14 January 2023 / Revised: 8 February 2023 / Accepted: 8 February 2023 / Published: 10 February 2023
(This article belongs to the Special Issue Recent Advances in Mechatronic and Robotic Systems)

Abstract

Regular inspection and monitoring of buildings and infrastructure, collectively called the built environment in this paper, is critical. The built environment includes commercial and residential buildings, roads, bridges, tunnels, and pipelines. Automation and robotics can aid in reducing errors and increasing the efficiency of inspection tasks. As a result, robotic inspection and monitoring of the built environment has become a significant research topic in recent years. This review paper presents an in-depth qualitative content analysis of 269 papers on the use of robots for the inspection and monitoring of buildings and infrastructure. The review found nine different types of robotic systems, with unmanned aerial vehicles (UAVs) being the most common, followed by unmanned ground vehicles (UGVs). The study also found five different applications of robots in inspection and monitoring, namely, maintenance inspection, construction quality inspection, construction progress monitoring, as-built modeling, and safety inspection. Common research areas investigated by researchers include autonomous navigation, knowledge extraction, motion control systems, sensing, multi-robot collaboration, safety implications, and data transmission. The findings of this study provide insight into recent research and developments in the field of robotic inspection and monitoring of the built environment and will benefit researchers as well as construction and facility managers in developing and implementing new robotic solutions.

1. Introduction

The built environment consists of human-made buildings and infrastructure such as commercial and residential buildings, bridges, roads, tunnels, storage tanks, and pipelines. These structures must be routinely inspected and monitored both during and after construction. The structure is monitored and assessed through regular inspections performed by different stakeholders, including owners, project managers, architects, engineers, contractors, sub-contractors, end users, and facility managers [1]. Irrespective of who performs it, manual inspection is a time-consuming process and adds to the project cost [2]. It is also characterized by a high degree of variability in assessment quality and by subjectivity [3,4].
Some inspection tasks are difficult for humans to perform due to inaccessibility, e.g., confined spaces such as the insides of air-conditioning ducts [5,6], water-filled pipelines and tunnels [7], offshore structures [8], and small spaces in walls [9]. Some situations can be hazardous for humans, such as inspection at heights [10], of structures subjected to natural disasters [11], or of bridge decks [12]. Inspection and monitoring activities differ in their requirements and face different challenges; therefore, not all types of inspection can be performed by the same robot. This review paper identifies robot types and various application subdomains within the domain of inspection and monitoring of the built environment and discusses their challenges.
Robots have been used in building and infrastructure projects in many ways—e.g., for concrete production, automated brickwork, steel welding, concrete distribution, steel reinforcement positioning, concrete finishing, tile placement, fireproof coating, painting, earthmoving, material handling, and road maintenance [13]. Robots with many different locomotion types and sensors have been used for the inspection of the built environment [3]. Some examples are unmanned aerial vehicles (UAV), unmanned ground vehicles (UGV), marine vehicles, wall-climbing robots, and cable-crawling robots, among others. Robotic inspection provides a safer alternative to manual inspection [14,15]. Automated robotic inspection improves the frequency of inspections and reduces subjectivity in detecting errors [16,17,18].
As described in this review, the challenges associated with developing and implementing robots for inspection and monitoring of the built environment are substantial. One of the biggest challenges is that buildings and infrastructure differ widely in design and intended use; programming and designing a robot to operate across such a wide range of environments is a formidable task [3]. Furthermore, various inspection subtasks need completely different robot functionalities, such as defect detection [19], progress estimation [20], and resource tracking [21], among many others. During this review, it was found that previous research can be categorized into distinct research areas based on the challenges it addressed. All these research areas are identified, and the findings are presented in this paper.
Rakha and Gorodetsky [22] presented a comprehensive review of the use of drones for building inspection. The current review, however, includes all types of robots, not just drones. Lattanzi and Miller [3] reviewed the literature on the robotic inspection of infrastructure. They studied different types of robots based on their mobility, as well as methods developed by past researchers for improving the autonomy and damage perception of these robots. However, there have been numerous recent advances in the field of robotics and in its supporting technologies, such as computer vision and deep learning. In fact, a bibliometric analysis of the papers reviewed in this study revealed that more than 60% of the papers analyzed were published after 2017, when [3] published their review. The results of the bibliometric analysis are presented in Section 3. Furthermore, Bock and Linner published a handbook on construction robots [23] that surveyed various robots in use in construction. While they briefly discussed inspection robots, they did not focus on the various research areas concerning the implementation of robots in the inspection and monitoring of buildings and infrastructure, nor did they discuss specific application subdomains of inspection and monitoring. They also did not perform a comprehensive review of all research conducted on this topic. This article synthesizes the findings of previous studies and offers a comprehensive review of current research on robotic inspection of the built environment. The goal of this research is to consolidate information from the most recent body of literature using the qualitative content analysis methodology and answer the following research questions:
  • What types of robots have been studied in the literature on robotic inspection of buildings and infrastructure based on their locomotion?
  • What are the prevalent application domains for the robotic inspection of buildings and infrastructure?
  • What are the prevalent research areas in the robotic inspection of buildings and infrastructure?
  • What research gaps currently exist in the robotic inspection of buildings and infrastructure?
This paper is structured as follows. First, the methodology used for this review is explained in Section 2. After that, the findings of the bibliometric analysis performed on the reviewed papers are presented in Section 3. Then, a critical review of different types of robots found in the literature is presented in Section 4. Next, various application domains are identified from the literature and are presented in Section 5. Then, common research areas are identified that have been addressed by different researchers in Section 6. These are the different challenges faced in the implementation of robotics for the inspection and monitoring of the built environment. After that, based on the research gaps in the literature identified during the review, future directions for research are provided in Section 7. Finally, the paper is concluded in Section 8 with a brief discussion of the implications and limitations of this study.

2. Research Methodology

This review is based on bibliometric analysis and content analysis of the body of literature on the use of robots for inspection and monitoring. Previous literature review studies [3,24] have used the Web of Science (WoS) and Scopus databases to search for relevant papers on a topic. Both of these databases cover a large range of high-quality academic publications in the engineering and management domains [24]. To further ensure that all relevant papers were included in the review, the authors also used Google Scholar to find relevant papers not indexed in WoS and Scopus. The keywords (robot* OR uav OR drone OR “unmanned aerial vehicle”) AND (“construction inspection” OR “infrastructure inspection” OR “construction monitoring” OR “progress monitoring” OR “building inspection” OR “inspection of building” OR “monitoring of building”) were used to search in all the fields of the publications. The asterisk (*) was used as a wildcard to include words such as “robots”, “robotic”, and “robotics”. Compiling the papers from all sources produced 1538 results in total (308 from WoS, 553 from Scopus, and 677 from Google Scholar). After removing duplicates, 1264 sources remained; the conditional formatting function in MS Excel was used to identify and remove the duplicate entries.
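The de-duplication step can equally be scripted. The sketch below is a hypothetical equivalent of the Excel step, assuming the three result sets were exported as CSV files with a “title” column (file names and column name are illustrative assumptions):

```python
# Minimal de-duplication sketch, assuming hypothetical CSV exports from
# the three databases, each with a "title" column.
import pandas as pd

frames = [pd.read_csv(f) for f in ("wos.csv", "scopus.csv", "scholar.csv")]
merged = pd.concat(frames, ignore_index=True)  # 1538 records in total

# Normalize titles so case and spacing differences do not hide duplicates.
merged["title_key"] = (
    merged["title"].str.lower().str.replace(r"\s+", " ", regex=True).str.strip()
)
deduped = merged.drop_duplicates(subset="title_key").drop(columns="title_key")
deduped.to_csv("combined_sources.csv", index=False)  # 1264 unique sources
```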
Preliminary shortlisting was performed by reading the titles and abstracts of the papers. Only publications that utilized robots in some way for building and infrastructure inspection and monitoring were kept. Papers on the use of robots for labor or construction jobs were not included, and research on general-purpose robots not specific to built-environment applications was also removed. For the purpose of this review, a “robot” is any mobile system that can operate autonomously or manually and has sensors to navigate and collect data. Some examples of robots in the context of this study are unmanned aerial vehicles (UAVs), unmanned ground vehicles (UGVs), marine vehicles, microbots, wall-climbing robots, cable-suspended robots, and legged robots. Apart from actual applications of robots, research on enabling technologies (such as computer vision) that make the use of robots more efficient and safer, while being independent of the type of robot, has also been included in this review.
After preliminary filtering, 575 sources were shortlisted for further review. This was followed by a secondary filtering, in which the shortlisted papers were analyzed by reading the major sections to ascertain the overall theme. The criteria discussed above were used to include or exclude the papers. Finally, 269 papers remained for in-depth analysis. The remaining papers were reviewed using the qualitative content analysis methodology used by past researchers [24,25] to identify the application domains and research areas commonly studied in the literature. In this step, papers were tagged with custom keywords for classification. The resulting classes and categories are discussed in more detail in the following sections. The methodology for this review is illustrated graphically in Figure 1.

3. Bibliometric Analysis

Bibliometry, or bibliometric analysis, is the application of mathematics and statistics to uncover structures and patterns of research in a certain area [26]. For this purpose, Biblioshiny, an open-source application running on R, was used. Biblioshiny is the web version of the bibliometrix tool developed by Aria and Cuccurullo [27]. It provides an intuitive web interface for performing bibliometric analysis [28]. Table 1 presents the descriptive statistics of the reviewed sources. The review included 269 documents from 185 different sources. Among these, 100 (37.17%) were journal articles, 163 (60.59%) were conference papers, and 6 (2.23%) were book chapters. The documents were written by 663 different authors. For this review, only documents written in English were considered.
Figure 2 shows the publication output in the area of inspection and monitoring of buildings and infrastructure by year of publication. The earliest source included in this review was published in 1991 and titled “A mobile robot for inspection of power transmission lines” [29]. Research in this area gained popularity only after 2012, as can be seen in Figure 2. The median publication year was 2017, which means more than half of the papers were published in or after 2017. This again emphasizes the significance of this review compared to past reviews. From 1991 to 2021, an average year-on-year growth of 17% was noted in the publication output on this topic, showing that more researchers are being attracted to this area of research. The consistently high publication output in recent years shows that the topic of robotic inspection of the built environment is still relevant, which supports the need for this review.

4. Types of Robots

This section critically reviews the various types of robots used in research. Figure 3 shows the types of robots identified; the numbers in the figure denote the number of papers reporting the use of each type. Most researchers used UAVs, which, because of their ability to reach places that humans cannot, are useful tools for many inspection applications [30,31]. They provide valuable support to quickly and safely access the exterior facades of high-rise buildings and hard-to-reach parts of bridge decks [32]. The second-most-common type of robot is the UGV. Unlike UAVs, which offer small payload capacities, ground-based robots such as wheeled robots offer much higher payload capacities and are useful tools for longer inspections of buildings [3].
Some researchers used multiple types of robots in collaboration, e.g., a UAV with a ground robot [33,34]. Other researchers developed custom hybrid robots with more than one locomotion system [20,35,36,37]. These hybrid robots provide better reach and are more versatile than simple robots. For example, a wheeled robot with an attached rotor developed by Lee et al. [20] can jump over obstacles and inspect both the interior and exterior of a building.

4.1. Unmanned Aerial Vehicle (UAV)

UAVs are by far the most commonly used type of robot for inspection and monitoring of the built environment, as found in the literature. According to the Association for Unmanned Vehicle Systems International (AUVSI), the UAV market in the U.S. alone is estimated at $11.3 billion and is projected to grow to $140 billion over the next 10 years [38]. Examples of commercial UAVs used in research are the DJI Phantom 3 [39], DJI Phantom 4 [40], Parrot AR.Drone 2.0 [41], DJI M600 Pro [42], DJI Matrice 100 [43], FlyTop FlyNovex [44], DJI Mavic Mini [45], DJI Mavic Pro [46], and Tarot FY680 [47]. The use of UAVs, also known as drones, started in military operations; however, they are now increasingly being used in the construction and maintenance of civil infrastructure for inspection and monitoring [30,48,49]. Their versatility and low operating and maintenance costs make them appealing to a wide range of sectors, including construction [50,51]. UAVs are a preferred tool for data collection because of their maneuverability and higher angles of measurement [52,53]. They are also very lightweight and take little time to set up, and they can reach places that are difficult for humans to access [51,54,55]. UAVs can also undertake at-height inspections, isolating humans from fall hazards [56,57], and they are faster than human inspectors [58]. High-rise towers with glass facades often get damaged after extreme weather events, and these facades need to be inspected before re-occupancy of the building is allowed [59]. Due to their speed, UAVs can provide accurate information much faster and more frequently than humans [58,60]. UAVs can thereby reduce the cost and risk of inspecting the built environment [61].
UAVs are classified into two categories: fixed-wing and rotary-wing. While fixed-wing UAVs are faster, they cannot hover or take off vertically. Rotary-wing UAVs can take off vertically from any location, eliminating the need for a runway on which to gain speed [62]. The most popular rotary-wing UAV is the quadrotor, with four rotors [63]. Quadrotors are agile and can hover in one place; they are, however, slower and have a shorter range [62]. As a result, rotary-wing UAVs are better suited to building applications, whereas fixed-wing UAVs can be useful on long, linear infrastructure projects such as highways and railways. A relatively rare UAV type is the lighter-than-air platform, examples of which are balloons and kites [64,65]. These systems are substantially slower and tolerate very little wind.
Typically, UAVs carry digital cameras as payloads to capture visual data from the site [66,67]. They can also carry other equipment, such as thermal cameras [68,69]. However, images or videos captured from UAVs can be noisy and require post-processing [66]. The design and payload of the UAV may vary based on the desired application [70]. The common use of UAVs in building inspections is to collect visual data by utilizing their onboard cameras [15,71]. UAVs are also commonly used for bridge and powerline inspections, because of the high risk of fall accidents in the manual inspection of bridges [72,73]. Lyu et al. [74] identified the following requirements for a UAV system for building inspection:
  • UAVs can hover anywhere to take photos.
  • UAVs can zoom in and focus on a small region of interest.
  • UAV operation should be simple, and the operator can get started without professional training.
  • UAVs should have a long enough flight time to improve operational efficiency.
  • UAVs should be capable of autonomous flight.
  • UAVs should be small enough for transportation and maintenance.
A UAV can be operated either remotely or autonomously and can fly over difficult-to-access areas [75]. Researchers [72] have found that on large linear infrastructure projects with manual control of the UAV, frequent turning and climbing maneuvers can be exhausting for the UAV pilot. The quality of the images captured by UAVs also depends on the pilot’s performance [76]. This motivated the development of autonomous trajectory planning for UAVs [77,78]. Active collision avoidance is required to prevent accidents on sites caused by UAV use, and it is more complicated than collision avoidance for autonomous ground vehicles [79]. Shared autonomy has also been developed, in which humans provide higher-level goals to the UAV while lower-level control, such as keeping a safe distance from the building and avoiding objects, is performed autonomously [80]. Shared autonomy reduces the cognitive load on the human operator while still utilizing their experience and expertise [80].
Although UAVs are efficient and versatile in an outdoor environment, they are of limited use in indoor situations because of visual obstructions [81]. Another challenge for UAVs is that GPS signals used by outdoor autonomous UAVs may become unreliable in indoor settings [82,83]. Many UAVs use other sensors to aid in navigation, such as gyroscopes, magnetometers, barometers, SONAR, and inertial navigation systems [65]. Vision-based localization and navigation have also been developed and tested [84]. These autonomous navigation and path-planning techniques for UAVs and other robots are discussed in detail in Section 6.1.

4.2. Unmanned Ground Vehicles (UGV)

UGVs, also known as rovers, have the simplest design of all the robots discussed in this review. These robots work well on flat surfaces but perform poorly on cluttered surfaces [85]. UGVs can be wheel-driven, using pneumatic wheels to move [86], or crawler-mounted (tracked) systems [87]. Crawlers, which are also used in military tanks and heavy equipment, have better traction on slick or wet terrain. If the surface conditions are suitable, wheeled UGVs are the most efficient in terms of power consumption, cost, control, robustness, and speed [85]. Owing to their lower power consumption and longer runtime, UGVs can be used as human assistants on long inspection rounds [88]. Some examples of commercially available UGVs used in research are the Clearpath Jackal [89] and Clearpath Husky [90].
Due to their low center of gravity, UGVs are the most stable and can carry large payloads. Some common payloads attached to UGVs in the built environment for inspection and monitoring purposes are (a) regular fixed cameras and stereo cameras for visual data collection [90,91]; (b) LiDAR and laser scanners for 3D data capture [89,91]; (c) mechanical arms for obstacle removal [90]; (d) ground-penetrating RADAR (GPR), ultrasonic sensors, and infrared sensors for behind-wall and under-ground sensing [92,93,94,95]; (e) a graphics processing unit (GPU) for processing [89]; and (f) IMU, GPS, and UWB sensors for localizing and navigation [89,96].
UGVs have been used in a variety of situations other than buildings due to their ease of design and operation. They have been used for the inspection of bridge decks [89,97], storage tanks [98], HVAC ducts [5], and even sewer and water pipelines [99]. Owing to their adaptability, they can perform floor cleaning, wall construction, and wall painting in addition to inspection [90]. They make an excellent test bed for algorithms and sensor design due to their ease of operation.
The low height of UGVs is also a disadvantage because it limits their reach in large halls or spaces with high ceilings. As a result, UGVs are also used in close collaboration with other robots, such as UAVs. Such multi-robot collaborations are explained in more detail in Section 4.9.

4.3. Wall-Climbing Robots

Wall-climbing robots are used for the inspection of building facades, windows, and external pipes. Manual inspection of exterior utilities on a high-rise building is performed by human inspectors supported on a temporary frame suspended from the roof of the structure, which poses a huge safety risk to the inspector [100]. A UAV could be the tool of choice for such applications because of its easy setup and flexibility; however, as discussed above, UAVs are susceptible to high winds and legal restrictions. Liu et al. [100] developed a cable-suspended wall-climbing robot for the inspection of exterior pipelines. Such robots are much safer and can carry larger sensor payloads. They can also be used for the external inspection of large above-ground storage tanks [101]. When supported by a single cable, as in the robot developed by [100], the robot can only move vertically. However, horizontal motion can be added by supporting the robot with two cables suspended from two horizontally spaced points at the top of the structure [101].
Other methods of wall-climbing include grip-climbing using mechanical actuators and springs to create gripping action [102]. These robots do not require any pre-installed infrastructure to climb the vertical structure. However, they need surface protrusions to grip while climbing and are not suitable for smooth surfaces.
Magnetic climbing robots use electromagnets controlled by circuits to climb steel structures [103]. These robots need very little power to operate their magnets compared to UAVs, and they can hold a position indefinitely [103]. A similar robot with magnetic tracks was developed by [104] to inspect the interior of large steel storage tanks. Magnetic climbers are susceptible to losing traction due to dust or surface unevenness, which can be dangerous [105]. Therefore, both grip-climbing and magnetic-climbing robots are limited to special types of surfaces.
Because the wall-climbing robots described above require either dedicated infrastructure or a special type of surface, researchers have also developed suction mounts that can stick to a wider range of wall surfaces. The Alicia robot, developed by [106], consists of three circular suction mounts linked together by two arms. The mounts are sealed with special sealants to maintain negative pressure and develop strong adhesion to the wall surface. A similar design appears in [107]: their robot, Mantis, was likewise designed as three suction modules joined by two links and was used for the inspection of window frames. When moving from one frame to another, one of the modules detaches and crosses the frame, followed by the other two modules one by one. Kouzehgar et al. [107] acknowledged that cracked glass windows may create a dangerous situation for their robot; therefore, they also developed a crack-detection algorithm to avoid attaching to cracked surfaces. In contrast, the ROMERIN robot developed by [108] used six inter-linked suction cups and turbines, instead of pumps, to create stronger airflow and negative air pressure. The use of many suction cups and high-airflow turbines made the robot resilient to cracked surfaces [108].

4.4. Cable-Crawling Robots

Cable-crawling robots have a niche area of application in the inspection of stay cables in cable-stayed and suspension bridges [109]. These robots are different from the cable-suspended wall-climbing robots explained in Section 4.3. They do not need additional infrastructure to navigate but instead crawl along the existing steel cables of bridges using drive rollers [110]. Although drones can also be used to safely inspect these bridge cables at high altitudes, drones must maintain a minimum safe distance to avoid collision [109], and images captured from a distance do not provide enough clarity to detect micro-cracks on the cable surface [109]. Additionally, cable-crawling robots equipped with multiple cameras, as in [109], can inspect the whole surface of the steel cables instead of only one side at a time.
A similar cable-crawling robot was developed by [110] in Japan for the inspection of the steel cables of suspension bridges. Their design resembled that of [109] but differed in its configuration: the robot consisted of a base platform that crawled along the main cables of the bridge connecting the support towers and a tethered camera module that was lowered from the base platform to inspect the hanging ropes. The actual inspection was carried out by the suspended camera module, while the base platform provided horizontal locomotion along the bridge.

4.5. Marine Robots

Marine robots are used for the inspection of marine structures such as bridge piers, dam embankments, underwater pipes, and offshore structures. Two types of marine robots are found in the literature: (a) submersible robots, also known as unmanned underwater vehicles (UUVs) [111] or autonomous underwater vehicles (AUVs) [112]; and (b) unmanned surface vehicles (USVs), sometimes called surface water platforms (SWPs) [113]. USVs and UUVs are sometimes grouped together under the single term unmanned marine vehicles [114].
Submersible robots, as the name suggests, can be fully submerged and can perform inspection under water. They are also sometimes referred to as underwater drones [115]. Underwater inspection is conventionally performed by human divers, which is costly, labor-intensive, and unsafe [116,117]. With UUVs, positioning and navigating the robot is a huge challenge because most sensing and positioning techniques used on the ground (GPS, fiducials, UWB, visual odometry, etc.) are not viable underwater [111]. Dead-reckoning techniques using inertial navigation, as explained later in Section 6.1.1, can be used for underwater navigation, but their accuracy deteriorates with time due to drift [111]. Optic and acoustic systems, which measure distance from reflected light or sound waves, are also used [8]. However, inertial systems do not rely on information from outside and are therefore unaffected by underwater rocks or marine life, as optic and acoustic systems are [8].
Unmanned surface vehicles, on the other hand, operate on the surface of a water body. They are low-cost devices that can facilitate safer inspection of marine structures. USVs are often designed as large vessels so that they are stable in rough waters, though small USVs have also been developed and tested. The use of waterproof cameras and other equipment is a crucial design consideration for USVs [118]. Although USVs have primarily been used in military applications, the construction industry can also benefit greatly from them. Bridge inspection is one of the largest areas where USVs can be utilized, because 85% of the 575,000 bridges in the US span waterways [119].

4.6. Hinged Microbots

Hinged microbots are made up of several tiny body sections hinged together. These robots imitate the movements of snakes or worms. Instead of wheels, they move using multiple motored joints along their bodies: they can contract and stretch their bodies like worms or sidewind like snakes [120]. Various propulsion techniques for these robots have been investigated, such as piezoelectric, hydraulic, pneumatic, and electrical micro-actuators [9]. Electrical micro-motors have been found to provide the best speed and power.
The main advantage of hinged microbots is that they are lighter and smaller than other types of robots [9]; they are only a few cubic centimeters in size. As a result, they are appropriate for inspecting sewer, gas, water, and other pipes [9,120]. However, due to size limitations, they can only carry a limited amount of data-collection equipment and are most often equipped with cameras on their heads [120]. Pipeline inspection is also complicated by limited and unreliable communication networks. Owing to unreliable communication within pipelines, Paap et al. [120] developed autonomous path-finding for their robot so that it can navigate sewer pipes even when there is no communication with the operator. With the advancement of machine learning techniques, Lakshmanan et al. [121] created a path-planning method for their hinged microbot based on reinforcement learning that quickly finds the best path with the least energy requirement.

4.7. Legged Robots

Legged robots are newer to the inspection and monitoring of the built environment than the other types of robots discussed here. They move using mechanical limbs, each controlled by multiple motors. Legged robots may have two legs (bipedal) [122], four legs (quadrupeds) [123], or even six legs (hexapods) [124]. They have the adaptability and mobility to traverse various types of terrain, making them well-suited to construction sites [123]. Numerous legged robots have been developed by NASA, MIT, IIT, ETH, Boston Dynamics, Ghost Robotics, ANYbotics, and Unitree [125]. However, due to their high technical complexity, very few have been used outside of a laboratory setting [125]. Because few commercial legged robots are available on the market, the literature on their use in the construction domain is also sparse. A major advantage of legged robots over UGVs is that the former can traverse stairs, which facilitates multi-story inspections [126].

4.8. Hybrid Robots

A hybrid class of robots uses more than one locomotion mode to navigate. These are special-purpose robots developed for specific problems and are not commercially available for general use. A simple hybrid robot is a wheeled robot with additional rotors on top, as developed by [20]. In such a robot, the wheels provide stability on flat indoor surfaces, whereas the rotors assist in inspection at height, obstacle avoidance, and changing floors.
Another hybrid robot, developed by [127], adds rotors to a wall-climbing robot. As discussed above, wall-climbing robots using suction or magnetic mounts can be useful on flat exterior surfaces; however, protrusions on the façade, such as columns and mullions, limit their maneuverability. Rotors allow the robot to skip over these protrusions easily without any additional infrastructure, such as cables, thereby extending the reach of wall-climbing robots. A similar robot developed by [128] used wheels with an adhesive coating, where the rotors provided not only vertical thrust but also horizontal thrust to maintain contact with the structure. The wall-sticking mechanism can also be provided by electromagnets, as done by [37]. Electromagnets can be more stable against strong winds; however, they only work on steel structures, not on concrete or glass surfaces.
Another example of a hybrid robot, presented by [35], consists of legs with magnetic pads. The magnetic legs can not only climb steel elements for at-height inspections but also walk over wall protrusions. The strong electromagnets also provide some fall protection against high winds. However, this type of robot may not work on glass and ceramic surfaces. A similar robot with six magnetic legs was developed by [129]; it resembles a spider in appearance as it climbs walls for at-height inspection. Such complex robots require complex control systems and circuitry to coordinate the legs and their magnetism with the walking motion, which is a separate research area.

4.9. Multi-Robot Systems

Researchers have also used teams of multiple robots of the same or different types for inspection and monitoring of the built environment. The robotic system developed by [130] comprised two quadruped robots working in a so-called master-slave relationship. The master robot was responsible for mission planning, task allocation, and comprehensive report generation and carried a high-performance computing unit for these purposes. The slave, or secondary, robot carried other payloads, such as a thermal camera, a robotic arm, and a long-range LiDAR, along with a smaller computing unit for running local control algorithms. Such systems can carry larger payloads without the need for an oversized robotic platform. The primary focus of research with multiple robots is the development of multi-robot collaboration strategies.
Another type of multi-robot system used by researchers is a team of unmanned aerial and unmanned ground vehicles. In the multi-robot system developed by [33], the wheeled robot on the ground carried the sensors for mapping the environment, while the UAV provided a wider view of the area from a higher vantage point for better path planning and navigation. A similar approach was undertaken by [34,131,132], in which the UAV provided an initial scan of the area to identify obstacles and occlusions. Based on this initial scan, the optimum scan locations were selected for the wheeled robot to obtain higher-quality, longer scans.
Multi-UAV swarms have been used by Khaloo et al. [133] and Mansouri et al. [134] independently. Large infrastructure projects, such as gravity dams and linear transportation infrastructure, can be too large for a single UAV to inspect in a single mission without mid-mission recharging. Utilizing multiple UAVs together may reduce the data-collection time while covering a large area. In studies with multiple UAVs, where each robot inspects a part of the target structure, the combined path planning and the fusion of data from multiple sources become the primary research focus [135]. The same applies to other multi-robot systems.
Finally, teams of underwater robots and USVs have been independently developed by Ueda et al. [136], Yang et al. [137], and Shimono et al. [113] for the underwater inspection of dams and bridge piers. In these studies, the USV provided horizontal navigation on the surface of the water body and lowered the submersible robot, suspended by cables or a winch, for closer inspection under the water.

5. Application Domain

After conducting a qualitative content analysis of the shortlisted papers, many different application domains and research areas were identified from the literature. It was found that, within the context of building and infrastructure inspection and monitoring, robots have been used for five different applications, namely, (a) maintenance inspection, (b) construction quality inspection, (c) as-built/as-is modeling, (d) progress monitoring, and (e) safety inspection. These application domains are explained in detail in the following subsections. The frequency of work in each of these application domains is presented in Figure 4.

5.1. Maintenance Inspection

Operation and maintenance of a building account for about 50–70% of the total life-cycle cost of a project [138]. Structures deteriorate due to aging and extreme weather [139]. Regular maintenance increases the useful life of the building [138], and it is crucial to identify defects before they worsen and cause building failure [140]. Periodic monitoring of a structure after it has been constructed, for maintenance purposes, is also known as structural health monitoring (SHM) [141]. SHM is critical to ensure the safety and integrity of structures [142]. Visual inspection has long been used to uncover structural flaws in order to guide building rehabilitation [143]. Current practice involves regular, manual inspection of the structure for structural defects, which is slow and costly [138,140]. Robots can regularly monitor structural elements for signs of damage and alert humans if further analysis is required [144]. Maintenance inspection also involves measuring the rate of deterioration for predictive analysis [145]. Images captured by robots are more consistent in their perspective than manually captured images, which makes them more useful for observing the rate of change [145].
Robotic inspection also provides a safer alternative to manual inspection. High-rise towers with glass facades, which are a common sight in most cities, require regular maintenance [107]. The classical approach to the maintenance of high-rise buildings poses a high risk to human workers due to high winds [107]. UAVs and wall-climbing robots provide a safer alternative for performing maintenance inspections of high-rise buildings [108,146]. Inspection of tall cylindrical structures such as storage tanks and silos poses a similar risk to human inspectors [147], as do roof inspections, since manually inspecting roofs can be difficult due to access constraints [56].
Old structures may also become unsafe for humans to inspect, especially after sustaining damage from a natural disaster [11,148]. Many historical buildings are abandoned because of numerous structural defects that make them hazardous for manual inspection [149]. Robots equipped with special sensors can be used to diagnose structural integrity before a human can perform a detailed inspection [150]. Ghosh Mondal et al. [151] trained a convolutional neural network (CNN) model with images captured from UAVs for assessing the extent of damage in disaster-affected structures. Sometimes, a single type of sensor is not adequate to identify all types of defects; therefore, multi-robot systems with different types of sensors are also deployed to monitor the structure [152]. One such sensor used is an infrared thermal camera to collect thermal images of the building [153]. Thermal images may reveal special defects such as material degradation and air leakage that might not be detectable using the naked eye or simple RGB images [153].
The monitoring of bridges is often undertaken as a manual inspection task. However, it involves a human inspector being exposed to high-speed traffic and fall hazards from going under the bridge deck [154]. It also involves the bridge being closed and the use of lifting equipment to reach the hard-to-reach places under the bridge [152]. Due to these challenges, bridges may suffer from poor maintenance, which has been the reason for many collapses [154]. UAVs can be used for regular structural health monitoring of bridge decks where it is riskier for humans to reach frequently [32]. Schober [155] used a climbing robot for the inspection and testing of high bridge pillars. Lee et al. [156] studied the use of a teleoperated robot for the inspection of hard-to-reach parts of a bridge.
Robots are also used for measuring building performance. Thermal leakage in a building may cause up to 40% energy loss, thereby decreasing building performance [22]. UAVs fitted with infrared sensors have been used for monitoring the thermal efficiency of buildings [157,158,159]. They have also been used with thermal cameras to create heat maps of a building envelope and detect thermal anomalies [22]. This process is called thermography [160]. Pini et al. [161] argued that using a robotic system for inspection improves the repeatability of measurements and reduces human involvement.

5.2. Construction Quality Inspection

Construction quality inspection involves checking building elements while they are being constructed to ensure they are within tolerable limits and meet industry standards [130]. The use of low-quality materials and changes in temperature may cause cracks in structures [162]. Cracks in critical structural components must be inspected and documented in detail [163]. In traditional visual inspection, field workers manually inspect the status and condition of the building of interest [130,164]. Manual inspection requires a significant amount of manpower, which in turn increases the chance of human error [129]. Undetected defects in the structure may affect its safety in the long run [129]. Using robots in collaboration with humans for construction quality inspection reduces labor and improves productivity and accuracy [165].
Prieto et al. [130] developed a multi-robot system called AutoCIS for performing quality inspection of construction work. Quality inspection of different types of elements needs different automation approaches [130]. Some examples of quality defects found regularly in construction projects are cracks, wall cavities, surface finish defects, alignment errors, unevenness, and inclination defects [130]. Detecting each of these defects needs a different automation approach. Defect-detection methods fall into two categories: destructive and non-destructive [130]. For robotic inspection, non-destructive methods are preferred, using non-destructive sensors such as cameras, 3D scanners, RGB-D cameras, thermal cameras, or other sensing probes [130].
Cracks and other surface defects can be detected by non-destructive methods. Two of the most extensively used approaches for crack detection are edge detector-based and deep learning-based approaches [163]. While edge detection is an image processing technique that does not require large training datasets, deep learning approaches such as CNNs do [166]. Kouzehgar et al. [107] used a suction-mounted wall-climbing robot and a CNN to detect cracks in the glass façades of high-rise towers, a high-risk and labor-intensive inspection task. More discussion of knowledge extraction methods using image processing and deep learning is presented in Section 6.2.
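As a rough illustration of the edge detector-based approach, the sketch below uses OpenCV’s Canny detector to flag long, thin contours as crack candidates. The thresholds and minimum length are illustrative assumptions, not parameters from any of the cited studies:

```python
# Minimal edge-based crack screening sketch (illustrative thresholds).
import cv2
import numpy as np

def detect_crack_candidates(image_path, min_length_px=50):
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)   # suppress surface texture
    edges = cv2.Canny(blurred, 50, 150)           # edge map of the surface
    # Close small gaps so a crack forms a single connected contour.
    closed = cv2.morphologyEx(edges, cv2.MORPH_CLOSE,
                              np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(closed, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # Keep long contours as crack candidates; short ones are likely noise.
    return [c for c in contours if cv2.arcLength(c, False) > min_length_px]
```

A deep-learning detector would replace these hand-tuned thresholds with a trained model, as discussed in Section 6.2.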
A UGV with LiDAR was used to detect defects in infrastructure such as bridge decks, where manual inspection is more dangerous [89]. Lorenc et al. [167] developed an encapsulated robotic system for the inspection of drilled shafts, which can be used in high-water-table situations that would normally require a costly and time-consuming dewatering process for human inspection. Barry et al. [101] used a cable-suspended robot for the inspection of vertical structures, such as storage tanks, ship hulls, and high-rise towers.
Various methods for inspecting the quality of building interior components have also been developed. For example, Shutin et al. [168] used robots to inspect the quality of blockwork. Many defects arise during bricklaying or blockwork activities, and inspecting and correcting them takes significant time and effort [168]. Quality inspection of indoor drywall partitions has been performed using a UAV by Hamledari et al. [169]. Hussung et al. [170] developed a ground-based robot called BetoScan for the non-destructive strength-testing of concrete and reinforced concrete elements. An integrated framework combining these techniques from different researchers should be developed to produce a single robotic system capable of independently checking the quality of various elements.

5.3. Progress Monitoring

Construction projects can be delayed by various causes, such as weather, supply chain disruptions, or human error [20]. Projects must be monitored regularly, and corrective actions must be taken to reduce the impact of delays [20]. Traditionally, the project manager (PM) visits the jobsite to monitor construction operations and compares the as-built state of the construction with the as-planned state [171]. This allows project managers to identify schedule discrepancies in a timely manner [172]. When the PM cannot visit the site, time-series images that show progress over time are also used for progress monitoring [173]. Typically, this process is manual and strenuous [123,174,175]. Automated progress monitoring aims to make this process more efficient and precise through the use of technologies such as BIM, robotics, and image processing [171].
Lee et al. [20] developed a robotic system equipped with LiDAR, wheels, and rotors, connected to a remote server for remote progress monitoring. The advantage of such a system is that it saves the project manager time: the robotic system moves faster than a human and can therefore cover a larger area in less time [176,177]. The PM can also monitor multiple projects from one remote location [178]. Bang et al. [179] developed an image-stitching algorithm that converts a video captured by a UAV into a sitewide panorama, allowing the PM to view the entire jobsite at a glance and monitor progress.
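For illustration, the general idea of combining overlapping site images into one panorama can be sketched with OpenCV’s high-level stitching API; this is not the video-based algorithm of Bang et al. [179], and the file names are assumptions:

```python
# Minimal panorama-stitching sketch using OpenCV's built-in Stitcher.
import cv2

paths = ["site_01.jpg", "site_02.jpg", "site_03.jpg"]  # overlapping views
frames = [cv2.imread(p) for p in paths]

stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
status, panorama = stitcher.stitch(frames)
if status == cv2.Stitcher_OK:
    cv2.imwrite("sitewide_panorama.jpg", panorama)
else:
    print(f"stitching failed with status {status}")  # e.g., too little overlap
```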
An extension of the above approach is to automatically detect progress from the visual data collected by robots. Computer vision has been used to automatically identify building components (e.g., drywall) in 2D images and update the progress information in an Industry Foundation Classes (IFC)-based 4D Building Information Model (BIM) [169]. Such algorithms, used in conjunction with autonomously navigating robotic systems [180,181] driven by BIMs, automate a significant part of progress monitoring, thereby reducing human effort.
Point clouds from laser scans are also used to detect constructed elements and measure progress [174]. Since laser scanning is a time-consuming task, using autonomous robots for laser scanning can reduce the human resource requirement. A major challenge in automating progress monitoring through visual scans is identifying the optimum scan points that provide the best view of all constructed elements [174].
Visualization of construction progress based on information collected by robots also needs to be considered. Human inspectors receive a wealth of contextual information during manual walkthroughs, which helps them understand the construction work; looking at 2D images is not intuitive for that task [126]. Halder et al. [126] created a methodology for visualizing 360° images collected on-site by a quadruped robot within a 3D virtual environment created from BIM. The 360° images were geolocated inside the building using BIM, which provides enough contextual information to understand each image’s relationship to the building. Immersive virtual reality has also been tested to help a remote inspector visualize site imagery easily and intuitively [178].

5.4. As-Built/As-Is Modeling

During construction, deviations may arise between the as-built structure and the as-planned model of the project for various reasons. To avoid disputes, the changes must be communicated to all stakeholders in a timely manner [182]. Three-dimensional reconstruction of the as-built state, i.e., creating a 3D model of the building from visual data, is useful for measuring those deviations [46,183]. Capturing visual data from the site by manual means is laborious and time-consuming [184]. Robots can be used to automate the data-capturing process for 3D reconstruction or as-built modeling [185,186].
Point clouds are a popular data structure for storing as-built information in the form of physical scans of structures [187]. Point clouds are generated either using laser scanners or through image-based photogrammetry [187]. Photogrammetry techniques rely only on RGB cameras capturing 2D images for as-built modeling [188] and also allow volumetric measurements to be taken from images [189]. Image processing techniques such as structure-from-motion are used to build a 3D model from the 2D images of a moving camera [190]; however, this requires heavy post-processing of the images. Laser scanners are more accurate, but they require specialized equipment and skilled operators, which is substantially more expensive than conventional RGB image capture. Photogrammetry using regular RGB cameras is more suitable for UAVs than bulky laser scanners. However, the accuracy of measurements generated from photogrammetry is subject to various factors, such as the choice of lens, sensor resolution, sensor sensitivity, and lighting conditions [191]. LiDAR is also used instead of laser scanners; being lighter, LiDAR mounted on UAVs has also been used for as-built modeling [192]. Another approach is to use Kinect sensors, which measure the distance of objects using structured-light projection [48]. Kinect sensors are lighter than laser scanners and have therefore also been used with UAVs [48].
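The core geometry behind structure-from-motion can be sketched for two views: matched features yield the relative camera pose, from which sparse 3D points are triangulated. A full SfM pipeline adds many views and bundle adjustment; the image files and the intrinsic matrix K below are assumptions for illustration:

```python
# Two-view structure-from-motion sketch with OpenCV (assumed inputs).
import cv2
import numpy as np

img1 = cv2.imread("view1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("view2.jpg", cv2.IMREAD_GRAYSCALE)
K = np.array([[1000.0, 0, 640], [0, 1000.0, 360], [0, 0, 1]])  # assumed intrinsics

# Detect and match features between the two views.
orb = cv2.ORB_create(4000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)
matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

# Recover the relative camera pose from the essential matrix.
E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
_, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

# Triangulate the matches into sparse 3D points (up to scale).
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
points3d = (pts4d[:3] / pts4d[3]).T  # one 3D point per match
```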
Visual obstructions at construction sites, such as scaffolding, stacks of building materials, and people moving around, create the need for multiple scans within a single space [193]. This makes manual laser scanning time-consuming and labor-intensive. By automating this procedure, the rate of generating as-built information may be raised from weekly inspections to daily data collection [187].
Both UAVs and UGVs can be used for as-built data collection [34,91,194]. Robots equipped with laser scanners have been used for as-built/as-is modeling of construction sites [193], and drones have been used with photogrammetry to automatically create point clouds [187]. Hamledari et al. [52] used a UAV to collect as-built information and update the BIM model by modifying the IFC data pre-loaded on its system. Patel et al. [30] used Pix4D to create automatic flight plans, fully automating the capture of images of a building by a UAV and the creation of a point cloud for inspection.

5.5. Safety Inspection

The Bureau of Labor Statistics reported that one-fifth of worksite fatalities in 2017 occurred on construction worksites [195]. Safety managers perform safety inspections of construction sites on a day-to-day basis to control hazards. Safety inspection involves safety managers performing walkthroughs of the site and visually observing construction operations to enable the early detection of problems [196]. In response, project managers implement precautionary actions to prevent accidents [196]. However, this prevents the project managers from engaging in other productive tasks. Additionally, safety is an important issue that can benefit from more frequent monitoring. One of the leading causes of accidents on construction sites is falls [197]. Implementing site-wide fall protection practices has been found inadequate to prevent falls; instead, continuous monitoring is important for preventing fall accidents [197].
UAVs have been used for the remote surveillance of construction sites to improve safety and prevent accidents [21,196,197]. UAVs move faster than humans, can access hard-to-reach places [196], and can perform continuous monitoring. Gheisari et al. [197] developed computer vision algorithms to detect unguarded openings in video streams transmitted by drones. Kim et al. [21] trained computer vision models for object detection and object tracking for proximity monitoring; they used a drone to survey a construction site and track construction resources (workers, equipment, and material) to prevent struck-by accidents.
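The proximity-monitoring step itself reduces to a distance check once resources have been detected and localized. The sketch below assumes detections have already been mapped to site coordinates in meters; the 5 m threshold is an illustrative assumption, not a value from [21]:

```python
# Minimal proximity-alert sketch over detected worker/equipment positions.
import numpy as np

def proximity_alerts(worker_xy, equipment_xy, threshold_m=5.0):
    workers = np.asarray(worker_xy, dtype=float)       # shape (W, 2)
    equipment = np.asarray(equipment_xy, dtype=float)  # shape (E, 2)
    # Pairwise distances between every worker and every machine.
    d = np.linalg.norm(workers[:, None, :] - equipment[None, :, :], axis=2)
    return np.argwhere(d < threshold_m)  # (worker_idx, equipment_idx) pairs

alerts = proximity_alerts([(0, 0), (12, 3)], [(2, 1), (40, 40)])
print(alerts)  # [[0 0]]: worker 0 is within 5 m of machine 0
```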

6. Research Areas

To study the challenges of using robots for inspection and monitoring of the built environment, the papers were reviewed and categorized based on the research areas studied. Eight main research areas were identified in the literature: (a) autonomous navigation, (b) knowledge extraction, (c) motion control systems, (d) safety implications, (e) sensing, (f) human factors, (g) multi-robot collaboration, and (h) data transmission. The following subsections discuss each of these research areas in detail.

6.1. Autonomous Navigation

Atyabi et al. [112] define intelligence as “the ability to discover knowledge and use it” and autonomy as the “ability to describe its own purpose without instructions”. Navigating a robot autonomously on a construction site for inspection has been a prime focus of research for many researchers [89,198,199,200]. Technologies have been developed to make the robot aware of its surroundings. Autonomous navigation of robots can be divided into three stages—localization, planning, and navigation.

6.1.1. Localization

In the localization stage, the robot estimates its current position in the target space. Myung et al. [201] used a local navigation system (LNS) for the localization of a robot inside a building. The LNS uses preinstalled anchor nodes to calculate the location of the robot, similar to how the Global Positioning System (GPS) works, since GPS itself is ineffective in indoor settings. Traditional LNS requires a direct line of sight with the anchor nodes and produces errors of up to 11 m in a non-line-of-sight setting, which is not very useful. However, Myung et al. [201] developed an algorithm to compensate for non-line-of-sight conditions and achieved an accuracy of 2 m. This is a significant improvement but may not be sufficient for reliable inspection in small spaces.
In GPS-denied environments, such as inside large buildings or areas surrounded by high-rise structures, additional technologies such as Real-Time Kinematic (RTK), Differential GPS (DGPS), and Ultra-Wide Band (UWB) technologies are used to augment the capabilities of GPS and improve the positioning accuracy to within the centimeter range [200]. However, these technologies require additional infrastructure to set up and operate [53].
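The positioning principle shared by LNS, UWB, and GPS, computing a position from ranges to known reference points, can be sketched as a small least-squares problem. The anchor layout and range values below are illustrative:

```python
# Minimal trilateration sketch: solve for position from anchor ranges.
import numpy as np

def trilaterate(anchors, ranges):
    anchors = np.asarray(anchors, dtype=float)  # (N, 2) known positions, N >= 3
    ranges = np.asarray(ranges, dtype=float)    # (N,) measured distances
    # Subtracting the first range equation from the rest eliminates the
    # quadratic terms, leaving a linear least-squares system.
    A = 2 * (anchors[1:] - anchors[0])
    b = (ranges[0] ** 2 - ranges[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

anchors = [(0, 0), (10, 0), (0, 10)]            # preinstalled anchor nodes
print(trilaterate(anchors, [5.0, 8.06, 6.71]))  # approx. (3, 4)
```

In practice, non-line-of-sight conditions bias the measured ranges, which is exactly the error source that Myung et al. [201] compensate for.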
The Simultaneous Localization and Mapping (SLAM) algorithm is used to create a digital map of the surroundings using a camera, LiDAR, or other sensors. Phillips and Narasimhan [89] combined measurements from an IMU sensor, wheel odometry, and GPS (whenever available) using the extended Kalman filter (EKF) to estimate the position of a Clearpath Jackal robot, a medium-sized UGV. This technique is called dead reckoning; it enables localization without any external infrastructure but is subject to drift over time and must be re-calibrated repeatedly [53]. The parameters required for localizing a robot fall into four categories: interior orientation parameters (e.g., camera focal length, the principal point, and lens distortion), exterior orientation parameters (e.g., orientation from the IMU), ground control points (e.g., any known natural or artificial targets), and tie/pass points (e.g., unique feature points in the scene that can be tracked across multiple images) [202]. The scale-invariant feature transform (SIFT) technique has been used to extract features from the surrounding environment and compare them with a CAD model to estimate the position of the robot in the frame of reference of the building [203,204,205].
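The dead-reckoning-plus-correction idea can be shown in one dimension: odometry is integrated to predict position, its uncertainty grows, and a Kalman-style update shrinks the error whenever an absolute fix arrives. All noise values below are illustrative assumptions, not parameters from [89]:

```python
# Minimal 1D dead-reckoning sketch with sparse absolute corrections.
import numpy as np

x, P = 0.0, 1.0   # position estimate and its variance
q, r = 0.05, 2.0  # process (odometry) and measurement (GPS) noise variances
dt = 1.0          # time step in seconds; true speed is 1 m/s

for step in range(60):
    v = 1.0 + np.random.randn() * 0.1  # noisy odometry velocity
    x, P = x + v * dt, P + q           # predict: drift accumulates
    if (step + 1) % 10 == 0:           # sparse absolute fix every 10 s
        z = (step + 1) * dt + np.random.randn() * np.sqrt(r)  # GPS-like reading
        K = P / (P + r)                # Kalman gain
        x, P = x + K * (z - x), (1 - K) * P  # correct the drifted estimate

print(f"final estimate: {x:.1f} m (true position: 60 m)")
```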
In underwater inspection, such as inside water tanks, visual sensors become ineffective. An underwater robot needs ultrasonic acoustic and inertial navigation sensors [8]. GNSS and underwater acoustic position systems have been used together to provide reliable localization underwater [206]. The other approach is to use a USV to carry a submersible vehicle. The USV being above water can receive GNSS signal, and the submersible vehicle’s absolute position can be calculated from its position relative to the USV [137].

6.1.2. Path Planning

Path planning involves finding the optimum path or a series of control points (called waypoints) from the robot’s current location to its target location while avoiding collision with the different elements of the structure [207]. Besides the positions of the control points, the path information may also include the velocity and yaw of the robot [135]. BIM has been used as a planning tool for robot path planning [207,208]. Most research on path planning focuses on single-building inspection; however, a multi-building path planning approach has also been demonstrated with UAVs by Lin et al. [209]. Another technique used for UAV path planning is discrete particle swarm optimization [31]. Dynamically changing site conditions in construction projects also require the robot to re-adjust its path, which means any pre-programmed path may not work in the long run or must be revised each time the robot is expected to complete its mission. Path planning should therefore consider the current state of the structure, which is why 4D BIM is used: it captures the state of the structure over time during construction [81]. However, the success of the path planning will depend on how accurately the 4D BIM represents the actual temporal state of the structure. BIM also provides information about the openings (e.g., doors) that need to be considered in indoor path planning [126,210]. When BIM is not available, a 3D point cloud can be used in its place for path planning [211].
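As a concrete illustration, the sketch below plans waypoints with a standard A* search over a 2D occupancy grid, used here as a generic stand-in for the waypoint planners discussed above; in practice such a grid could be rasterized from a BIM floor plan or a point cloud, and the grid, opening, and costs below are illustrative assumptions.

```python
import heapq

def astar(grid, start, goal):
    """A* over an occupancy grid (1 = obstacle); returns a waypoint list."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    frontier = [(h(start), 0, start, None)]
    parent, best_g = {}, {start: 0}
    while frontier:
        _, g, cur, prev = heapq.heappop(frontier)
        if cur in parent:
            continue                      # already expanded via a shorter path
        parent[cur] = prev
        if cur == goal:                   # walk back to reconstruct the path
            path = []
            while cur is not None:
                path.append(cur)
                cur = parent[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if (0 <= nxt[0] < len(grid) and 0 <= nxt[1] < len(grid[0])
                    and grid[nxt[0]][nxt[1]] == 0
                    and g + 1 < best_g.get(nxt, float("inf"))):
                best_g[nxt] = g + 1
                heapq.heappush(frontier, (g + 1 + h(nxt), g + 1, nxt, cur))
    return None                           # no collision-free path exists

grid = [[0, 0, 0, 0],
        [1, 1, 1, 0],   # a wall whose opening might come from BIM door data
        [0, 0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))
```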
Krishna Lakshmanan et al. [121] used reinforcement learning (RL) to train a robot to find the optimum path. RL is one of the three main machine learning paradigms, alongside supervised and unsupervised learning, and finds solutions through a system of positive and negative rewards. Asadi et al. [180] used an embedded Nvidia GPU on the robot body for mapping and navigation using deep-learning-based semantic segmentation of images captured by the robot’s camera. This eliminates the need for continuous connectivity and improves the autonomy of the robot.
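The reward-driven idea can be illustrated with tabular Q-learning, one of the simplest RL algorithms, on a toy grid; the grid, rewards, and learning parameters are illustrative assumptions, not the method of [121].

```python
import random

# Toy 4x4 grid: +10 for reaching the goal, -1 per move, -5 for hitting
# an obstacle or the boundary. Q maps (state, action) to expected reward.
ACTIONS = [(1, 0), (-1, 0), (0, 1), (0, -1)]
OBSTACLES, GOAL, N = {(1, 1), (2, 2)}, (3, 3), 4
Q = {}

def step(state, action):
    nxt = (state[0] + action[0], state[1] + action[1])
    if not (0 <= nxt[0] < N and 0 <= nxt[1] < N) or nxt in OBSTACLES:
        return state, -5.0               # penalized and stays in place
    return nxt, (10.0 if nxt == GOAL else -1.0)

for _ in range(5000):                    # training episodes
    s = (0, 0)
    while s != GOAL:
        a = (random.choice(ACTIONS) if random.random() < 0.1      # explore
             else max(ACTIONS, key=lambda a: Q.get((s, a), 0.0))) # exploit
        s2, r = step(s, a)
        best_next = max(Q.get((s2, b), 0.0) for b in ACTIONS)
        old = Q.get((s, a), 0.0)
        Q[(s, a)] = old + 0.1 * (r + 0.9 * best_next - old)       # Q-update
        s = s2

s, path = (0, 0), [(0, 0)]               # greedy rollout of the learned path
while s != GOAL and len(path) < 20:
    s, _ = step(s, max(ACTIONS, key=lambda a: Q.get((s, a), 0.0)))
    path.append(s)
print(path)
```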

6.1.3. Navigation

Construction sites are highly cluttered environments. During navigation, the robot must be aware of its surroundings and make minor adjustments to its path [212]. Warszawski et al. [213] compared laser, ultrasonic, and infrared sensors for mapping a floor for robot navigation. Zhang et al. [214] and Asadi et al. [212] used deep learning for vision-based obstacle avoidance. Asadi et al. [90] developed an autonomous arm for a UGV for detecting and removing obstacles from the robot’s path, using a CNN-based segmentation model for object detection. This is useful for removing small obstacles up to the payload capacity of the attached arm.
Wang and Luo [165] developed a wall-following algorithm using LiDAR sensors and a ground-based wheeled robot to run along walls and find defects. This simple algorithm needs no pre-planning of the path, but the robot cannot move away from the walls; in large halls or auditoriums, the wall-following behavior becomes a limitation. Al-Kaff et al. [215] used visual servoing with a UAV to maintain a safe distance from the wall. Visual servoing uses a feedback loop between the camera image and robot control to dynamically adjust the robot to changing conditions [215].
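A minimal proportional wall-following controller is sketched below, using two side-facing LiDAR ranges to hold a set distance and stay parallel to the wall; the setpoint, gains, and sign conventions are illustrative assumptions, not the algorithm of [165].

```python
def wall_follow_cmd(d_front, d_rear, setpoint=0.5,
                    k_dist=1.5, k_align=2.0, v=0.3):
    """Return (linear, angular) velocity for following a wall on the right.

    d_front and d_rear are two side-facing LiDAR ranges to the wall (m),
    measured ahead of and behind the wheel axle: their mean approximates
    the wall distance, their difference the robot's misalignment.
    """
    dist_err = (d_front + d_rear) / 2.0 - setpoint   # >0: too far from wall
    align_err = d_front - d_rear                     # >0: veering away
    # With the wall on the right, a negative angular velocity turns toward it.
    angular = -(k_dist * dist_err + k_align * align_err)
    return v, angular

# One control tick: 0.62 m ahead, 0.58 m behind -> drifting away, steer right
print(wall_follow_cmd(0.62, 0.58))
```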

6.2. Knowledge Extraction

Computer vision is used extensively to create a rich set of information from site images and videos for inspection purposes [216]. Computer vision and deep learning have been used to detect defects in a building structure, such as exposed aggregates on a concrete surface [217] or cracks on the walls of storage tanks [218]. It is also used for post-disaster reconnaissance of damaged buildings [148,151]. CNNs have performed remarkably well in detecting defects from images [219]. Li et al. [217] trained a U-net, a CNN-based deep-learning model popular for pixel-wise image segmentation, to identify exposed aggregate on a concrete surface. With a training dataset of 408 images and a validation dataset of 52 images, they achieved an accuracy of 91.43%; these promising results could be further improved with a larger dataset. Alipour and Harris [220] used a publicly available dataset of forty thousand images to train a deep-learning model to identify cracks on concrete and asphalt surfaces, achieving an F1-score of up to 99.1%. The F1-score, the harmonic mean of precision and recall, measures how well a classifier performs. The major challenge with crack detection using deep learning is the wide range of complex backgrounds due to different types of facade materials [221].
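For reference, the F1-score is computed from the counts of true positives (TP), false positives (FP), and false negatives (FN); the counts below are illustrative, not taken from [220].

```python
def f1_score(tp, fp, fn):
    """F1 is the harmonic mean of precision and recall."""
    precision = tp / (tp + fp)  # fraction of predicted cracks that are real
    recall = tp / (tp + fn)     # fraction of real cracks that were found
    return 2 * precision * recall / (precision + recall)

# e.g., 981 true positives, 9 false positives, 9 false negatives -> 0.991
print(round(f1_score(981, 9, 9), 3))
```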
Computer vision has also been used for the automatic creation of as-built models from site images. Hamledari et al. [169] used computer vision to identify components of indoor partitions, such as electrical outlets and doors, in pictures captured from a UAV. Hamledari et al. [52] further developed the technique to compare as-built elements with the as-designed elements in an IFC database. If a discrepancy is found, the element type, property, or geometry is updated to build an as-built model automatically.
Researchers have also developed non-ML-based image-processing approaches that do not require big training datasets. For example, mathematical transformations for stitching multiple images across a single site into a panoramic view of the project have been developed for inspection [179,222,223]. Photogrammetry, as explained in Section 5.4, is an image-processing technique [224]. Pix4D, a commercial photogrammetry platform that supports taking building measurements from a 3D model created from 2D images, has been used for knowledge extraction (taking measurements) with robots [30]. Edge detection techniques such as the Laplacian of Gaussian algorithm, which find intensity discontinuities in an image, are used to detect cracks on walls [225,226] and require no training data.
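A Laplacian of Gaussian crack-candidate filter takes only a few lines with OpenCV; the file names, kernel sizes, and threshold below are illustrative assumptions.

```python
import cv2
import numpy as np

# Laplacian of Gaussian: smooth first (suppressing surface texture noise),
# then apply the Laplacian; strong responses mark intensity discontinuities
# such as crack boundaries. No training data is required.
img = cv2.imread("wall.jpg", cv2.IMREAD_GRAYSCALE)
blurred = cv2.GaussianBlur(img, (5, 5), 1.4)
log = cv2.Laplacian(blurred, cv2.CV_64F, ksize=3)

# Threshold the response magnitude to obtain a binary crack-candidate mask
mask = (np.abs(log) > 0.05 * np.abs(log).max()).astype(np.uint8) * 255
cv2.imwrite("crack_candidates.png", mask)
```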
Another important consideration in fielding robots is that, because of their efficiency and speed in capturing data, the high volume of data can quickly create data overload that slows learning from the data [227]. It is therefore important to extract only useful data from the complete pool collected by the robot. In this direction, Ham and Kamari [227] developed a method to rank and select images captured by a UAV based on the number of construction-related objects visible in the image, using semantic segmentation.
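The ranking idea can be sketched as follows; here, segment is a placeholder for any semantic-segmentation model and the class ids are hypothetical, so this is an illustration of the concept rather than the pipeline of [227].

```python
import numpy as np

RELEVANT_IDS = {3, 5, 7}  # hypothetical ids for, e.g., rebar, formwork, column

def rank_images(image_paths, segment, top_k=50):
    """Keep only the images richest in construction-related pixels.

    segment(path) is assumed to return a 2D array of per-pixel class ids.
    Downstream analysis then processes only the top-k images instead of
    the full pool, mitigating data overload.
    """
    def relevance(path):
        labels = segment(path)
        return int(np.isin(labels, list(RELEVANT_IDS)).sum())

    return sorted(image_paths, key=relevance, reverse=True)[:top_k]
```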

6.3. Motion Control System

Inspections of building facades, ducts, pipes, and shafts require different types of locomotion and navigation. Barry et al. [101] studied the use of cable-suspended robots for the inspection of high-rise towers or storage tanks from the outside. Heffron [228] developed a remotely operated submersible vehicle for underwater inspection. Brunete et al. [9] developed a worm-like microbot for the inspection of small-diameter pipes that uses infrared sensors for navigation and defect detection. ALICIA3 [229] is a wall-climbing robot that stays affixed to walls with pressure-controlled suction cups and is used for the inspection of building facades. Researchers have installed multi-joint arms and grippers, telescopic supports, and rotating heads [29,35] to enable robots to access hard-to-reach places. These works improved the overall reach of the robots.
Apart from basic wheel- or rotor-based locomotion, hybrid locomotion systems further improve the maneuverability of the robot in cluttered environments such as construction sites. A drone equipped with wheels and a propeller can inspect a building’s exterior, move in small spaces, and climb stairs [20]. A legged robot with magnetic pads under its feet can walk on the ground and climb steel columns and pipes for closer inspection [35]. A climbing-flying robot can climb walls and poles for better-quality inspection and can fly to pass obstacles and avoid falling [36,37].
Aerial robots have many degrees of freedom and are extremely prone to environmental disturbances. Images captured from an aerial robot or UAV are often characterized as “noisy” [230]. To perform an effective inspection, the robot (or drone) needs to keep adjusting to the external environment. González-deSantos et al. [231] equipped a UAV with LiDAR to measure distance and orientation for maintaining a stable altitude and position with respect to the inspection target, which was important for the ultrasonic gauge sensor used to measure the thickness of metallic sheets. Kuo et al. [232] developed a fuzzy-logic-based control system to control the camera and the thrust of the UAV to capture more stable images.
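Such a stand-off distance hold can be sketched as a simple PD loop on a LiDAR range measurement; the setpoint and gains are illustrative assumptions, not the controller of [231] or the fuzzy logic of [232].

```python
class StandoffController:
    """PD loop holding a fixed stand-off distance from the inspection target.

    A forward-facing LiDAR range feeds the loop; the output is a velocity
    command along the sensor axis (positive = move toward the target).
    """

    def __init__(self, setpoint=1.5, kp=0.8, kd=0.3):
        self.setpoint, self.kp, self.kd = setpoint, kp, kd
        self.prev_err = None

    def command(self, lidar_range, dt):
        err = lidar_range - self.setpoint       # >0: too far, close the gap
        d_err = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.kd * d_err  # derivative term damps sway

ctrl = StandoffController()
print(ctrl.command(1.8, dt=0.02))  # drifted 0.3 m too far -> move closer
```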
In wall-climbing robots, the pressure in the suction cups is controlled using pressure sensors installed inside the robotic systems. This adjusts the suction strength for different surface conditions [233]. Barry et al. [101] developed a rope-climbing robot for climbing and inspecting walls of storage tanks and high-rise buildings. They designed the control system to counteract the oscillations of the rope caused by the climbing motion of the robot.

6.4. Sensing

Robots can be equipped with various types of sensors to perform different types of inspections. They can carry high-precision servo-accelerometers to measure story drift in earthquake-affected structures without the need for a human inspector [234]. Many other sensors have been used with robots for non-destructive testing and structural health monitoring, such as eddy current, RADAR, infrared (IR), and ultrasound sensors [95,170]. Liu et al. [100] proposed using gas detectors with a pipe-climbing robot for detecting gas leakage in high-rise towers. Elbanhawi and Simic [235] developed a robotic arm to automatically collect water samples from a solar pond; the arm was equipped with multiple sensors for in situ measurement of density, salinity, pH, and turbidity. Huston et al. [236] embedded sensors, such as strain gauges, in the building structure and used a UGV to power the sensors and collect data wirelessly through electromagnetic induction, avoiding the need for embedded power cables. Davidson and Chase [92] installed RADAR on a robotic cart to create a digital reconstruction of the interior of a bridge deck through diffraction tomography. Kriengkomol et al. [129] used audio sensors to perform a hammering test on steel structures using a hexapod robot. The hammering test detects and analyzes the vibration of a structural element after striking it with a hammer; anomalies in the resulting vibrations indicate loose bolts or other structural defects [129]. An emissometer connected to robotic arms has been used for autonomous mapping of the thermal emissivity of the building envelope for energy studies [161]. Robots equipped with IR sensors are used to monitor the building envelope for thermal anomalies [158,237,238,239]. When using IR, the tilt angle of the sensors affects the accuracy of measurement [240]; therefore, UAVs can be advantageous in maintaining the right angle while collecting data for high-rise buildings. Experts may choose to combine multiple sensors into a single robotic platform for faster and more detailed inspection. These sensors can be grouped into seven categories—thermal/IR sensors, ground-penetrating radars (GPR), accelerometers, laser-based sensors (LiDAR or laser scanners), strain gauges, microphones, and electrical resistivity sensors. Figure 5 presents the number of reviewed papers reporting the use of each type of sensor.
When using multiple sensors on a single robotic platform, fusing the sensor streams into more meaningful data is important and was the focus of research in [86,94]. Massaro et al. [95] combined data from a ground-penetrating RADAR, thermal IR camera, LiDAR, GPS, and accelerometer into a long short-term memory neural network for detecting road defects. Combining and processing data from different types of sensors, such as positioning sensors (GPS, IMU), visual sensors (camera), and non-visual sensors (RADAR, thermal camera), can provide a more in-depth inspection in a shorter time [86,94]. However, each additional sensor increases the payload the robot must carry and, with it, the size of the platform and its power requirements [241].

6.5. Multi-Robot Collaboration

Researchers have used multiple robots, such as UAV-UAV or UAV-UGV pairs, for better navigation. In the UAV-UGV pair, the UGV performs the actual inspection while the UAV provides navigational support by serving as its eyes from a higher vantage point [33,34,131]. Construction sites are highly cluttered, and limited visibility makes it difficult for a ground-based robot to navigate the site [33]. To address this challenge, a collaborative system of a UAV and a UGV can be used for localization and path planning [33,34]. The UAV provides an eagle-eyed view of the construction site and creates a point cloud from 2D images via photogrammetry, which the UGV then uses for navigation [34,131]. However, in indoor spaces, the large number of obstacles limits the effectiveness of this approach unless the UAV has a suitable anti-collision system [242].
Another approach is to use the UAV for the main inspection and the UGV as a docking station for recharging the battery of the UAV [243]. This is useful in long-duration missions since the UAV needs to be lightweight and can be equipped with a smaller battery with a limited runtime. The UGV with a larger battery accompanies the UAV and provides periodic power boosts [243].
The UAV-UAV approach is used for navigation and localization in Global Navigation Satellite System (GNSS)-challenged environments [244]. This is achieved by coordinating the movements of two UAVs in such a way that one of them is in a location where GNSS is available, while the other calculates its absolute position based on its relative position from the first UAV [244].
Mantha et al. [208] developed an algorithm for the efficient allocation of tasks and path-finding for multiple robots, with path obstructions and the robots’ single-charge run capacity as constraints. Genetic and k-means algorithms, in conjunction with Travelling Salesman Problem (TSP) formulations, also allow finding near-optimal solutions to the multi-robot path planning and task allocation problem [245,246]. Effective multi-robot coordination is important for using these tools efficiently while avoiding clashes.
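A minimal sketch of this decomposition is given below: k-means partitions the inspection points among the robots, and a greedy nearest-neighbour pass approximates each robot's TSP tour. The point counts and coordinates are illustrative assumptions, not drawn from [208,245,246].

```python
import math
import random

def allocate_and_order(points, n_robots, iters=50):
    """Split inspection points among robots, then order each robot's visits."""
    centers = random.sample(points, n_robots)
    for _ in range(iters):                              # k-means clustering
        clusters = [[] for _ in range(n_robots)]
        for p in points:
            i = min(range(n_robots), key=lambda i: math.dist(p, centers[i]))
            clusters[i].append(p)
        centers = [tuple(sum(c) / len(cl) for c in zip(*cl)) if cl
                   else centers[i] for i, cl in enumerate(clusters)]
    tours = []
    for cl in clusters:                                 # greedy TSP per robot
        if not cl:
            tours.append([])
            continue
        tour, rest = [cl[0]], cl[1:]
        while rest:
            nxt = min(rest, key=lambda p: math.dist(tour[-1], p))
            tour.append(nxt)
            rest.remove(nxt)
        tours.append(tour)
    return tours

pts = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(30)]
for i, tour in enumerate(allocate_and_order(pts, n_robots=3)):
    print(f"robot {i}: {len(tour)} inspection points")
```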

6.6. Safety Implications

Requirements for the safe use of robots on construction sites are not well explored, as there are very few notable works on the safety and usability assessment of robots for inspection. More research is also needed on regulating the use of robots on construction sites. To the best of the authors’ knowledge, there are no statutory regulations on using ground robots on construction sites that provide guidelines for ensuring the safety of people from robots, especially autonomous robots. The Federal Aviation Administration (FAA) provides regulations on the use of UAVs [247], which generally apply to their use anywhere, including construction. The FAA previously disallowed the use of UAVs out of sight of the pilot and over populated areas [247]. This rule necessitated the involvement of a human with the UAV at all times during its operation, which defeated the purpose of fully autonomous inspection. However, recent amendments to the FAA rules, effective 21 April 2021, allow the use of UAVs in populated areas and at night-time [248]. The FAA also provides a waiver from part 107.31, which prohibits the use of UAVs out of sight of the pilot, for small aircraft [248]. Building on the FAA rules, the Occupational Safety and Health Administration (OSHA) [249], through its memorandum to regional administrators, outlined guidelines and recommended best practices for using drones specifically for inspection in a workplace setting. These recommendations include procedures for “pre-deployment, pre-flight operations, postflight field procedures, postflight office procedures, flight reporting, program monitoring and evaluation, training, recordkeeping, and accident reporting” [250]. OSHA likewise prohibits the use of drones over people, out of sight of the operator, and after sunset.
Various models and simulations have been developed and tested for qualitative and quantitative analysis of risks associated with the use of aerial robots on construction sites [247,250,251]. Flight simulators are also used to train UAV pilots in the safe usage of UAVs on construction sites and to reduce the risk arising from their use [252].

6.7. Data Transmission

In this category, researchers have developed data compression and transmission techniques for faster and more robust data transfer from wireless robots over the Internet or a local area network [253,254,255]. Li and Zhang [256] combined the Internet of Things (IoT) with robotics for wirelessly collecting data from multiple sensors inside the building for quality inspection. Wireless communication with robots usually occurs via radio frequency (RF) signals using protocols such as WiFi and Bluetooth [256,257]. Other RF-based communication protocols, such as LoRa and ZigBee, have also been used [256,257]. However, in urban settings with many RF sources, signal interference and/or loss of signal can severely affect the safe performance of the robot. Mahama et al. [257] therefore studied the effect of RF noise in urban settings on the operation of UAVs and recommended using redundant channels and faster frequency switching for seamless operations.
A robotic inspection may generate a vast amount of data in the form of images or point clouds [255]. Depending on the fidelity required, either lossless or lossy data compression may be used. Lossless compression produces an exact reconstruction, whereas lossy compression achieves a higher compression ratio but loses some detail during reconstruction [255].
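The trade-off can be demonstrated in a few lines of Python: zlib (lossless) round-trips the original bytes exactly, while JPEG (lossy) compresses further at the cost of reconstruction error. The synthetic gradient image is an illustrative stand-in for inspection imagery.

```python
import io
import zlib
import numpy as np
from PIL import Image

# Synthetic stand-in for a captured inspection image (noisy gradient)
x = np.tile(np.linspace(0, 255, 640), (480, 1))
scan = (x + np.random.normal(0, 8, x.shape)).clip(0, 255).astype(np.uint8)

# Lossless: decompression reproduces the original bytes exactly
packed = zlib.compress(scan.tobytes(), level=9)
assert zlib.decompress(packed) == scan.tobytes()

# Lossy: JPEG achieves a higher compression ratio, but the reconstruction
# differs from the original
buf = io.BytesIO()
Image.fromarray(scan).save(buf, format="JPEG", quality=60)
restored = np.asarray(Image.open(io.BytesIO(buf.getvalue())))
print(len(packed), buf.getbuffer().nbytes,
      int(np.abs(scan.astype(int) - restored.astype(int)).max()))
```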

6.8. Human Factors

Humans and robots collect and process information differently. While humans can easily understand abstract and unstructured data, robots need structured instructions; on the other hand, robots can process and execute instructions more accurately than humans and without tiring. Human–robot interaction deals with understanding the intent of the human operator and breaking it down into simple instructions that the robot can execute, and, conversely, with presenting the raw data collected by robots in a form that makes sense to humans. Human–robot collaboration (HRC) can utilize the strengths of both agents [258].
Augmented, virtual, and mixed reality are excellent tools for presenting site information in a human-friendly format. Van Dam et al. [259] found that annotating the visual data captured by a UAV helps human operators perform inspection tasks by reducing their workload and decreasing the likelihood of missing an error in the work. Liu et al. [164] augmented the live visual feed from a UAV with the BIM. Jacob-Loyola et al. [175], on the other hand, augmented the as-planned BIM model with a reconstructed 3D model of the building created from UAV images in an augmented reality environment. Both methods converted information captured by robots into a more intuitive format for human evaluation. However, using augmented reality with BIM is hampered by the alignment problem: consistently keeping the BIM registered to reality from a moving UAV is difficult [164].
Even though many studies aim to make robots increasingly autonomous, full or partial human intervention is still required for the safe and efficient use of robots [260]. Robotics, being a fairly new technology, requires proper training for operators [115,260]. Virtual reality has been found to be an excellent test-bed for prototyping new human–robot interactions [261,262], providing a safe environment to train human operators without risk of harm to the site, the robot, or any human [260].

7. Future Research Directions

The comprehensive review of the literature revealed a few research gaps in each of the identified research areas. These research gaps are presented as follows.

7.1. Autonomous Navigation

Currently, robots that can successfully perform regular autonomous navigation through the unstructured and dynamically changing environments of construction sites do not exist. Therefore, future research should develop control and autonomy algorithms to this end. Additionally, robots are efficient data collection agents because of their speed and consistency, ensuring that the data available to project stakeholders are up-to-date and consistent [145]. Studies on autonomous navigation, including [30], have considered predefined scan locations from which the robot performs scans for as-built modeling. A good-quality laser scan may take several minutes at each point to complete [263]. Therefore, it is important to minimize the number of scan locations so the robot can cover the whole structure on a single charge. Human reasoning is still required for selecting the optimum locations from which the robot collects data through autonomous navigation. The next step towards improving autonomy should be enabling the robot to select scan locations itself. The selection of target scan locations requires experience and domain knowledge in addition to knowledge of the structural geometry. Future studies can investigate the process followed by humans, and the visual cues they use, in selecting the right locations for capturing site images or performing laser scans. This knowledge can help robots independently identify navigation goals that meet human requirements.

7.2. Knowledge Extraction

As discussed in Section 5.2, different elements of a structure require a unique approach to quality inspection. Previous research has developed independent techniques for inspecting the quality of a single type of element, such as indoor partitions [169], reinforced concrete [217], storage tanks [218], glass facades [107], and so on. Although vision-based defect detection techniques can detect many surface defects, other sensors, such as hammering devices, are employed to detect internal defects [264]. Future research should focus on developing an integration framework that will combine these techniques into a single robotic system for a comprehensive inspection and monitoring of the built environment.
Nevertheless, vision-based defect detection is an important research area [163]. The CNN-based deep-learning approach is commonly explored by many researchers [71,151]. However, CNNs and other deep-learning models require large datasets to train. Various researchers created separate databases of labeled images to train CNN models, but these databases contain only a single type of component. SDNET2018 by the Structural Defects Network (SDNET) is an open-source dataset of 56,000 annotated images of concrete defects developed at Utah State University [265]. Similar open-source libraries of images of different building components and construction defects could significantly advance the development of more intelligent inspection robots.

7.3. Motion Control System

Most of the reviewed literature has focused on UAVs and UGVs. Future research should study robots with other types of locomotion for the inspection and monitoring of the built environment. Among the promising categories are legged robots, which can navigate stairs and avoid small obstacles, making them well suited to construction environments. These robots have not been thoroughly studied, owing to the scarcity of commercial options on the market, and the quadruped robots currently available, e.g., Spot by Boston Dynamics, are costly. A cost-benefit and return-on-investment analysis of quadruped robots is required to build the case for industry adoption. Although legged robots are more adaptable than wheeled UGVs, their adaptability to unstable and unfinished terrain must also be assessed. A comparative analysis of different types of robots undertaking inspection and monitoring of the built environment in multiple scenarios can uncover the potential and limitations of each robot type.
More research on hybrid robots is also needed. As discussed in Section 4.8, hybrid robots combine multiple motion control systems, such as wheels, legs, suction mounts, and rotors. Built-environment structures vary significantly in shape and size [3]. Requirements for a bridge inspection robot can be substantially different from those for a building inspection robot; similarly, indoor and outdoor inspections require significantly different motion capabilities. A versatile robotic platform suitable for a comprehensive inspection of a wide range of structures should possess multiple motion capabilities, such as wheeled legs with rotors. Future research can develop more types of hybrid robots and study their usability in the construction and operation of buildings and infrastructure.

7.4. Sensing

Since vision-based defect detection requires a large training dataset and may still not be fully accurate in all conditions, a multi-sensor approach that takes input from various sensors, such as RADAR, thermal cameras, and accelerometers, has been explored to detect surface defects [86,94,95]. More research is needed in this area to identify other types of defects by fusing data from multiple sensors. For example, data from regular and thermal cameras can be combined to detect wet patches on ceilings and walls, and data from an accelerometer and LiDAR can be combined to measure the verticality of walls and the horizontality of floors.
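As an illustration of the suggested accelerometer-LiDAR fusion, the sketch below derives the platform tilt from the gravity vector read by the accelerometer and uses it to correct the slope of a LiDAR wall profile into an out-of-plumb estimate; the geometry, sensor values, and sign conventions are illustrative assumptions.

```python
import math
import numpy as np

def tilt_from_accel(ax, ay, az):
    """Angle between the sensor's z-axis and gravity (radians).

    At rest, the accelerometer reads the gravity vector, so the tilt of
    the platform follows directly from its components.
    """
    return math.atan2(math.hypot(ax, ay), az)

def wall_out_of_plumb(profile, platform_tilt):
    """Fit a line to a LiDAR (height, lateral offset) profile of a wall and
    correct its slope by the platform tilt; returns degrees of lean."""
    z, x = profile[:, 0], profile[:, 1]
    slope = np.polyfit(z, x, 1)[0]          # lateral drift per unit height
    return math.degrees(math.atan(slope) - platform_tilt)

# Nearly level platform; wall drifts 1 cm per meter of height
tilt = tilt_from_accel(0.02, 0.00, 9.81)
profile = np.array([[h, 0.01 * h] for h in np.linspace(0.0, 3.0, 50)])
print(round(wall_out_of_plumb(profile, tilt), 2))  # ~0.46 deg after correction
```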

7.5. Multi-Robot Collaboration

Future research should investigate the use of multiple heterogeneous robots in performing inspection and monitoring of the built environment. UGV-UAV teams have been studied and found to provide better capabilities than a single type of robot [33,34,131,244]. Due to their complementary capabilities, UGVs and UAVs create synergistic relationships that can outperform either of them individually. However, due to their different motion capabilities and degrees of freedom, they require different types of input; for example, in addition to velocity and angle, UAVs also require a target altitude. Moreover, different manufacturers provide their own software development kits (SDKs), which might not be compatible with one another, requiring a separate integration layer in the software architecture. Additionally, if one of the robots is replaced with a robot from a different manufacturer, the software backend needs to be re-developed. This creates the need for exchangeable data formats similar to IFC. Future research may develop manufacturer-independent standard formats and protocols for multi-robot collaboration.

7.6. Safety Implications

Robots can be a safer alternative to manual inspection methods, but they can also introduce new safety hazards. There is a significant gap in the research on assessing the safety implications of introducing autonomous robots to construction sites. As discussed in Section 6.6, there have been some studies on the risk assessment of UAVs in construction [266], but other types of robots pose different risks that need to be evaluated as well. While the FAA regulates the use of UAVs [247], an overarching legal framework for all types of robots needs to be developed. Future research may assess the perceived risks and safety of other types of robots in order to provide regulatory guidance. Such guidance would not only provide a framework to mitigate risks from the use of robots but also a framework to evaluate risk for insurance and liability purposes.

7.7. Data Transmission

The Internet of Things (IoT) is identified as one of the enablers of Industry 4.0 [267]. IoT is defined as the “interconnection of sensing and actuating devices providing the ability to share information across platforms through a unified framework, developing a common operating picture for enabling innovative applications” [25]. As more smart devices, such as robots, are deployed in the built environment, their interconnection may be used to improve their performance. Since Wi-Fi may not always be available on live construction projects, a direct device-to-device protocol could be developed for robots and other smart devices to communicate with each other. Live construction sites are dynamic and cluttered environments, and many changes in the environment may render the robot lost and cause it to fail its mission completely [125]. However, with the ability to communicate with other devices at the site, such as fixed cameras, construction equipment, or other robots, the robot may become more robust in path finding and other operations. Future research can identify which devices already exist in buildings and infrastructure and how they can communicate with robots for collaborative inspection activities.

7.8. Human Factors

The unstructured environment of a live construction site is a major challenge for robots [268]. Robots can perform well in structured and straightforward tasks but need human reasoning in complex tasks that occur in an unstructured environment [268]. Human–robot collaboration (HRC) combines the strengths of humans and robots [269]. Shared autonomy has been studied [80], which involves humans carrying out higher-level reasoning, such as deciding where to take images [270], while the robot manages low-level tasks such as avoiding obstacles and maintaining a safe distance from the wall. The development of shared autonomy workflows for different types of robots in different application domains for inspection and monitoring of the built environment can contribute to the mutual adaptation of human–robot teams.
Other domains have investigated various modes of human–robot interactions (HRI) [271]. Construction noise, lighting conditions, and the diverse technical literacy of construction workers necessitate a closer examination of HRI in the context of construction. Future studies can investigate the effect of different HRI modes on the performance of human–robot teams during the inspection and monitoring of the built environment.
Human–robot collaboration can also improve the autonomy of the robot in the long run by enabling the robot to learn gradually from the human. Reinforcement learning (RL) can be used to develop human-in-the-loop methodologies in which robots learn to identify defect indicators from human inspectors through observation. Future work should develop an HRI model to understand human behavior in response to external stimuli during human–robot collaboration tasks and train the robot to learn from it, increasing its autonomy in similar future situations.
Recent growth in the use of automation and robotics in construction has necessitated reskilling the construction workforce to adopt the technology safely and efficiently [272]. Studies have been conducted on training the construction workforce to use robots through VR [272], which provides a safe and cost-effective training medium. Future research can identify training needs and develop continuing education programs, through VR or other mediums, to assist the construction industry in adopting robotic technologies for the inspection and monitoring of the built environment. Incorporating robotics in construction education should also be explored to prepare future experts with the required competencies. Future research should survey robotics experts to identify the knowledge, skills, and abilities that future construction workers and professionals will need to use robotics efficiently for inspection and monitoring of the built environment across the architecture, engineering, construction, and operation (AECO) industry.

8. Conclusions

Inspection and monitoring of buildings and infrastructure are important for the success of a project and the longevity of the structure. Multiple people are involved in inspection and monitoring throughout the lifecycle of the structure. Traditional methods involve visual observation through in-person site visits, which is time-consuming and adds to the overall lifecycle cost. This article presents a comprehensive review of the literature on robotic inspection of the built environment.
This research contributes to understanding the potential of robots for the regular inspection of the built environment. The knowledge synthesized from the body of literature is used to highlight the different types of robots that have been applied to automated inspection, the applications these robots serve within the inspection and monitoring of the built environment, and the prevalent research areas.
The study identified five application domains for robots in inspection of the built environment, namely, post-construction maintenance inspection, construction quality inspection, progress monitoring, as-built modeling, and safety inspection. A variety of robots were used across these domains: drones were found to be the most prevalent, followed by wheeled robots, and other custom robots with varying locomotion and sizes were developed for a variety of purposes. The study also found eight research areas: autonomous navigation, knowledge extraction, motion control systems, sensing, safety implications, multi-robot collaboration, human factors, and data transmission.
One limitation of this review is that it only includes materials written in English; the authors acknowledge that many important studies published in other languages are not included. Another limitation is that this review only covers publications published through April 2022; owing to the rapid growth of this topic, more articles may have been published by the time this review reaches readers. Finally, the review includes only academic publications. Many robotic applications may have been developed by corporations and startups that do not publish their research in academic venues and therefore might not be represented in this review.
With the rapid growth in the research on robotics and supporting technologies such as deep learning and computer vision, robots will be utilized more often than they are now in the AECO industry. Future adoption of robotics will require an understanding of the technology by both managers and workers. This systematic literature review will help both researchers and AECO practitioners to develop and implement new robotic solutions. It will also assist construction and facility managers in understanding how robotics fits into their existing processes and how to modify their existing policies in order to adopt and benefit from it.

Funding

This research and the APC were funded by the National Science Foundation (NSF) Future of Work at the Human-Technology Frontier Big Idea under Grant No. 2128948.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available as they are confidential.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Vo-Tran, H.; Kanjanabootra, S. Information Sharing Problems among Stakeholders in the Construction Industry at the Inspection Stage: A Case Study. In Proceedings of the CIB World Building Congres: Construction and Society, Brisbane, Australia, 5–9 May 2013; pp. 50–63. [Google Scholar]
  2. Hallermann, N.; Morgenthal, G.; Rodehorst, V. Unmanned Aerial Systems (UAS)—Case Studies of Vision Based Monitoring of Ageing Structures. In Proceedings of the International Symposium Non-Destructive Testing in Civil Engineering (NDT-CE), Berlin, Germany, 15–17 September 2015; pp. 15–17. [Google Scholar]
  3. Lattanzi, D.; Miller, G. Review of Robotic Infrastructure Inspection Systems. J. Infrastruct. Syst. 2017, 23, 04017004. [Google Scholar] [CrossRef]
  4. Pinto, L.; Bianchini, F.; Nova, V.; Passoni, D. Low-Cost Uas Photogrammetry for Road Infrastructures’ Inspection. ISPRS—Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2020, 43B2, 1145–1150. [Google Scholar] [CrossRef]
  5. Bulgakov, A.; Sayfeddine, D. Air Conditioning Ducts Inspection and Cleaning Using Telerobotics. Procedia Eng. 2016, 164, 121–126. [Google Scholar] [CrossRef]
  6. Wang, Y.; Su, J. Automated Defect and Contaminant Inspection of HVAC Duct. Autom. Constr. 2014, 41, 15–24. [Google Scholar] [CrossRef]
  7. Inzartsev, A.; Eliseenko, G.; Panin, M.; Pavin, A.; Bobkov, V.; Morozov, M. Underwater Pipeline Inspection Method for AUV Based on Laser Line Recognition: Simulation Results. In Proceedings of the 2019 IEEE Underwater Technology (UT), Kaohsiung, Taiwan, 16–19 April 2019. [Google Scholar]
  8. Kreuzer, E.; Pinto, F.C. Sensing the Position of a Remotely Operated Underwater Vehicle. In Proceedings of the Theory and Practice of Robots and Manipulators; Morecki, A., Bianchi, G., Jaworek, K., Eds.; Springer: Vienna, Austria, 1995; pp. 323–328. [Google Scholar]
  9. Brunete, A.; Hernando, M.; Torres, J.E.; Gambao, E. Heterogeneous Multi-Configurable Chained Microrobot for the Exploration of Small Cavities. Autom. Constr. 2012, 21, 184–198. [Google Scholar] [CrossRef]
  10. Tan, Y.; Li, S.; Liu, H.; Chen, P.; Zhou, Z. Automatic Inspection Data Collection of Building Surface Based on BIM and UAV. Autom. Constr. 2021, 131, 103881. [Google Scholar] [CrossRef]
  11. Torok Matthew, M.; Golparvar-Fard, M.; Kochersberger Kevin, B. Image-Based Automated 3D Crack Detection for Post-Disaster Building Assessment. J. Comput. Civ. Eng. 2014, 28, A4014004. [Google Scholar] [CrossRef]
  12. Khan, F.; Ellenberg, A.; Mazzotti, M.; Kontsos, A.; Moon, F.; Pradhan, A.; Bartoli, I. Investigation on Bridge Assessment Using Unmanned Aerial Systems. In Proceedings of the Structures Congress 2015, Portland, OR, USA, 23–25 April 2015; pp. 404–413. [Google Scholar] [CrossRef]
  13. Siciliano, B.; Khatib, O. Springer Handbook of Robotics; Springer: Berlin, Germany, 2016; ISBN 978-3-319-32552-1. [Google Scholar]
  14. Birk, A.; Pfingsthorn, M.; Bulow, H. Advances in Underwater Mapping and Their Application Potential for Safety, Security, and Rescue Robotics (SSRR). In Proceedings of the 2012 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), College Station, TX, USA, 5–8 November 2012. [Google Scholar]
  15. Ruiz, R.D.B.; Lordsleem Jr, A.C.; Rocha, J.H.A.; Irizarry, J. Unmanned Aerial Vehicles (UAV) as a Tool for Visual Inspection of Building Facades in AEC+FM Industry. Constr. Innov. 2021, 22, 1155–1170. [Google Scholar] [CrossRef]
  16. Agnisarman, S.; Lopes, S.; Chalil Madathil, K.; Piratla, K.; Gramopadhye, A. A Survey of Automation-Enabled Human-in-the-Loop Systems for Infrastructure Visual Inspection. Autom. Constr. 2019, 97, 52–76. [Google Scholar] [CrossRef]
  17. Halder, S.; Afsari, K.; Chiou, E.; Patrick, R.; Hamed, K.A. Construction Inspection & Monitoring with Quadruped Robots in Future Human-Robot Teaming. J. Build. Eng. 2023, 65, 105814. [Google Scholar]
  18. Lim, R.S.; La, H.M.; Shan, Z.; Sheng, W. Developing a Crack Inspection Robot for Bridge Maintenance. In Proceedings of the 2011 IEEE International Conference on Robotics and Automation, Shanghai, China, 9–13 May 2011; IEEE: Piscataway, NJ, USA, 2011; pp. 6288–6293. [Google Scholar]
  19. Kim, I.-H.; Jeon, H.; Baek, S.-C.; Hong, W.-H.; Jung, H.-J. Application of Crack Identification Techniques for an Aging Concrete Bridge Inspection Using an Unmanned Aerial Vehicle. Sensors 2018, 18, 1881. [Google Scholar] [CrossRef] [PubMed]
  20. Lee, J.H.; Park, J.; Jang, B. Design of Robot Based Work Progress Monitoring System for the Building Construction Site. In Proceedings of the 2018 International Conference on Information and Communication Technology Convergence (ICTC), Jeju Island, Republic of Korea, 17–19 October 2018; pp. 1420–1422. [Google Scholar]
  21. Kim, D.; Liu, M.; Lee, S.; Kamat, V.R. Remote Proximity Monitoring between Mobile Construction Resources Using Camera-Mounted UAVs. Autom. Constr. 2019, 99, 168–182. [Google Scholar] [CrossRef]
  22. Rakha, T.; Gorodetsky, A. Review of Unmanned Aerial System (UAS) Applications in the Built Environment: Towards Automated Building Inspection Procedures Using Drones. Autom. Constr. 2018, 93, 252–264. [Google Scholar] [CrossRef]
  23. Bock, T.; Linner, T. Construction Robots: Elementary Technologies and Single-Task Construction Robots; Cambridge University Press: Cambridge, UK, 2016; Volume 3, ISBN 978-1-107-07599-3. [Google Scholar]
  24. Pal, A.; Hsieh, S.-H. Deep-Learning-Based Visual Data Analytics for Smart Construction Management. Autom. Constr. 2021, 131, 103892. [Google Scholar] [CrossRef]
  25. Tang, S.; Shelden, D.R.; Eastman, C.M.; Pishdad-Bozorgi, P.; Gao, X. A Review of Building Information Modeling (BIM) and the Internet of Things (IoT) Devices Integration: Present Status and Future Trends. Autom. Constr. 2019, 101, 127–139. [Google Scholar] [CrossRef]
  26. Huang, Y.; Yan, J.; Chen, L. A Bird’s Eye of Prefabricated Construction Based on Bibliometric Analysis from 2010 to 2019. In Proceedings of the IOP Conf. Series: Materials Science and Engineering, Ulaanbaatar, Mongolia, 10–13 September 2020; p. 052011. [Google Scholar]
  27. Aria, M.; Cuccurullo, C. Bibliometrix: An R-Tool for Comprehensive Science Mapping Analysis. J. Informetr. 2017, 11, 959–975. [Google Scholar] [CrossRef]
  28. Li, P.; Lu, Y.; Yan, D.; Xiao, J.; Wu, H. Scientometric Mapping of Smart Building Research: Towards a Framework of Human-Cyber-Physical System (HCPS). Autom. Constr. 2021, 129, 103776. [Google Scholar] [CrossRef]
  29. Sawada, J.; Kusumoto, K.; Maikawa, Y.; Munakata, T.; Ishikawa, Y. A Mobile Robot for Inspection of Power Transmission Lines. IEEE Trans. Power Deliv. 1991, 6, 309–315. [Google Scholar] [CrossRef]
  30. Patel, T.; Suthar, V.; Bhatt, N. Application of Remotely Piloted Unmanned Aerial Vehicle in Construction Management BT-Recent Trends in Civil Engineering; Pathak, K.K., Bandara, J.M.S.J., Agrawal, R., Eds.; Springer: Singapore, 2021; pp. 319–329. [Google Scholar]
  31. Phung, M.D.; Quach, C.H.; Dinh, T.H.; Ha, Q. Enhanced Discrete Particle Swarm Optimization Path Planning for UAV Vision-Based Surface Inspection. Autom. Constr. 2017, 81, 25–33. [Google Scholar] [CrossRef]
  32. Whang, S.-H.; Kim, D.-H.; Kang, M.-S.; Cho, K.; Park, S.; Son, W.-H. Development of a Flying Robot System for Visual Inspection of Bridges; International Society for Structural Health Monitoring of Intelligent Infrastructure (ISHMII): Winnipeg, MB, Canada, 2007. [Google Scholar]
  33. Asadi, K.; Kalkunte Suresh, A.; Ender, A.; Gotad, S.; Maniyar, S.; Anand, S.; Noghabaei, M.; Han, K.; Lobaton, E.; Wu, T. An Integrated UGV-UAV System for Construction Site Data Collection. Autom. Constr. 2020, 112, 103068. [Google Scholar] [CrossRef]
  34. Kim, P.; Park, J.; Cho, Y.K.; Kang, J. UAV-Assisted Autonomous Mobile Robot Navigation for as-Is 3D Data Collection and Registration in Cluttered Environments. Autom. Constr. 2019, 106, 102918. [Google Scholar] [CrossRef]
  35. Kamagaluh, B.; Kumar, J.S.; Virk, G.S. Design of Multi-Terrain Climbing Robot for Petrochemical Applications. In Proceedings of the Adaptive Mobile Robotics, Baltimore, MD, USA, 23–26 July 2012; pp. 639–646. [Google Scholar]
  36. Katrasnik, J.; Pernus, F.; Likar, B. A Climbing-Flying Robot for Power Line Inspection. In Climbing and Walking Robots; Miripour, B., Ed.; IntechOpen: Rijeka, Croatia, 2010; pp. 95–110. [Google Scholar]
  37. Mattar, R.A.; Kalai, R. Development of a Wall-Sticking Drone for Non-Destructive Ultrasonic and Corrosion Testing. Drones 2018, 2, 8. [Google Scholar] [CrossRef]
  38. Irizarry, J.; Costa, D. Exploratory Study of Potential Applications of Unmanned Aerial Systems for Construction Management Tasks. J. Manag. Eng. 2016, 32, 05016001. [Google Scholar] [CrossRef]
  39. Razali, S.N.M.; Kaamin, M.; Razak, S.N.A.; Hamid, N.B.; Ahmad, N.F.A.; Mokhtar, M.; Ngadiman, N.; Sahat, S. Application of UAV and Csp1 Matrix for Building Inspection at Muzium Negeri, Seremban. Int. J. Innov. Technol. Explor. Eng. 2019, 8, 1366–1372. [Google Scholar] [CrossRef]
  40. Moore, J.; Tadinada, H.; Kirsche, K.; Perry, J.; Remen, F.; Tse, Z.T.H. Facility Inspection Using UAVs: A Case Study in the University of Georgia Campus. Int. J. Remote Sens. 2018, 39, 7189–7200. [Google Scholar] [CrossRef]
  41. Biswas, S.; Sharma, R. Goal-Aware Navigation of Quadrotor UAV for Infrastructure Inspection. In Proceedings of the AIAA Scitech 2019 Forum, American Institute of Aeronautics and Astronautics, San Diego, CA, USA, 7–11 January 2019. [Google Scholar]
  42. Mao, Z.; Yan, Y.; Wu, J.; Hajjar, J.F.; Padlr, T. Towards Automated Post-Disaster Damage Assessment of Critical Infrastructure with Small Unmanned Aircraft Systems. In Proceedings of the 2018 IEEE International Symposium on Technologies for Homeland Security (HST), Woburn, MA, USA, 23–24 October 2018. [Google Scholar]
  43. Walczyński, M.; Bożejko, W.; Skorupka, D. Parallel Optimization Algorithm for Drone Inspection in the Building Industry. AIP Conf. Proc. 2017, 1863, 230014. [Google Scholar] [CrossRef]
  44. Vacca, G.; Furfaro, G.; Dessì, A. The Use of the Uav Images for the Building 3D Model Generation. In Proceedings of the International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences—ISPRS Archives, Dar es Salaam, Tanzania, 29 August 2018; Volume 42, pp. 217–223. [Google Scholar]
  45. Taj, G.; Anand, S.; Haneefi, A.; Kanishka, R.P.; Mythra, D.H.A. Monitoring of Historical Structures Using Drones; IOP Publishing Ltd.: Bristol, UK, 2020; Volume 955. [Google Scholar]
  46. Saifizi, M.; Syahirah, N.; Mustafa, W.A.; Rahim, H.A.; Nasrudin, M.W. Using Unmanned Aerial Vehicle in 3D Modelling of UniCITI Campus to Estimate Building Size; IOP Publishing Ltd.: Bristol, UK, 2021; Volume 1962. [Google Scholar]
  47. Hament, B.; Oh, P. Unmanned Aerial and Ground Vehicle (UAV-UGV) System Prototype for Civil Infrastructure Missions. In Proceedings of the 2018 IEEE International Conference on Consumer Electronics (ICCE), Las Vegas, NV, USA, 12–14 January 2018; Mohanty, S., Corcoran, P., Li, H., Sengupta, A., Lee, J., Eds.; IEEE: Piscataway, NJ, USA, 2018. [Google Scholar]
  48. Roca, D.; Lagüela, S.; Díaz-Vilariño, L.; Armesto, J.; Arias, P. Low-Cost Aerial Unit for Outdoor Inspection of Building Façades. Autom. Constr. 2013, 36, 128–135. [Google Scholar] [CrossRef]
  49. Lin, J.J.; Han, K.K.; Golparvar-Fard, M. A Framework for Model-Driven Acquisition and Analytics of Visual Data Using UAVs for Automated Construction Progress Monitoring. In Proceedings of the Computing in Civil Engineering, Austin, TX, USA, 21–23 June 2015; pp. 156–164. [Google Scholar]
  50. Keyvanfar, A.; Shafaghat, A.; Awanghamat, A. Optimization and Trajectory Analysis of Drone’s Flying and Environmental Variables for 3D Modelling the Construction Progress Monitoring. Int. J. Civ. Eng. 2021, 20, 363–388. [Google Scholar] [CrossRef]
  51. Freimuth, H.; König, M. Planning and Executing Construction Inspections with Unmanned Aerial Vehicles. Autom. Constr. 2018, 96, 540–553. [Google Scholar] [CrossRef]
  52. Hamledari, H.; Davari, S.; Azar, E.R.; McCabe, B.; Flager, F.; Fischer, M. UAV-Enabled Site-to-BIM Automation: Aerial Robotic- and Computer Vision-Based Development of As-Built/As-Is BIMs and Quality Control. In Proceedings of the Construction Research Congress 2018: Construction Information Technology—Selected Papers from the Construction Research Congress 2018, New Orleans, LA, USA, 2–4 April 2018; American Society of Civil Engineers (ASCE): Reston, VA, USA, 2018; pp. 336–346. [Google Scholar]
  53. Kayhani, N.; McCabe, B.; Abdelaal, A.; Heins, A.; Schoellig, A.P. Tag-Based Indoor Localization of UAVs in Construction Environments: Opportunities and Challenges in Practice. In Proceedings of the Construction Research Congress 2020, Tempe, AZ, USA, 8–10 March 2020; pp. 226–235. [Google Scholar]
  54. Previtali, M.; Barazzetti, L.; Brumana, R.; Roncoroni, F. Thermographic Analysis from Uav Platforms for Energy Efficiency Retrofit Applications. J. Mob. Multimed. 2013, 9, 66–82. [Google Scholar]
  55. Teixeira, J.M.; Ferreira, R.; Santos, M.; Teichrieb, V. Teleoperation Using Google Glass and Ar, Drone for Structural Inspection. In Proceedings of the 2014 XVI Symposium on Virtual and Augmented Reality, Piata Salvador, Brazil, 12–15 May 2014; pp. 28–36. [Google Scholar]
  56. Esfahan, N.R.; Attalla, M.; Khurshid, K.; Saeed, N.; Huang, T.; Razavi, S. Employing Unmanned Aerial Vehicles (UAV) to Enhance Building Roof Inspection Practices: A Case Study. In Proceedings of the Canadian Society for Civil Engineering Annual Conference, London, ON, Canada, 1–4 June 2016; Volume 1, pp. 296–305. [Google Scholar]
  57. Chen, K.; Reichard, G.; Xu, X. Opportunities for Applying Camera-Equipped Drones towards Performance Inspections of Building Facades. In Proceedings of the ASCE International Conference on Computing in Civil Engineering 2019, Atlanta, GA, USA, 17–19 June 2019; pp. 113–120. [Google Scholar]
  58. Oudjehane, A.; Moeini, S.; Baker, T. Construction Project Control and Monitoring with the Integration of Unmanned Aerial Systems with Virtual Design and Construction Models; Canadian Society for Civil Engineering: Vancouver, BC, Canada, 2017; Volume 1, pp. 381–406. [Google Scholar]
  59. Choi, J.; Yeum, C.M.; Dyke, S.J.; Jahanshahi, M.; Pena, F.; Park, G.W. Machine-Aided Rapid Visual Evaluation of Building Façades. In Proceedings of the 9th European Workshop on Structural Health Monitoring, Manchester, UK, 10–13 July 2018. [Google Scholar]
  60. Mavroulis, S.; Andreadakis, E.; Spyrou, N.-I.; Antoniou, V.; Skourtsos, E.; Papadimitriou, P.; Kasssaras, I.; Kaviris, G.; Tselentis, G.-A.; Voulgaris, N.; et al. UAV and GIS Based Rapid Earthquake-Induced Building Damage Assessment and Methodology for EMS-98 Isoseismal Map Drawing: The June 12, 2017 Mw 6.3 Lesvos (Northeastern Aegean, Greece) Earthquake. Int. J. Disaster Risk Reduct. 2019, 37, 101169. [Google Scholar] [CrossRef]
  61. Daniel Otero, L.; Gagliardo, N.; Dalli, D.; Otero, C.E. Preliminary SUAV Component Evaluation for Inspecting Transportation Infrastructure Systems. In Proceedings of the 2016 Annual IEEE Systems Conference (SysCon), Orlando, FL, USA, 18–21 April 2016. [Google Scholar]
  62. Pereira, F.C.; Pereira, C.E. Embedded Image Processing Systems for Automatic Recognition of Cracks Using UAVs. IFAC-PapersOnLine 2015, 28, 16–21. [Google Scholar] [CrossRef]
  63. Watanabe, K.; Moritoki, N.; Nagai, I. Attitude Control of a Camera Mounted-Type Tethered Quadrotor for Infrastructure Inspection. In Proceedings of the IECON 2017-43rd Annual Conference of the IEEE Industrial Electronics Society, Beijing, China, 29 October–1 November 2017; pp. 6252–6257. [Google Scholar]
  64. Murtiyoso, A.; Koehl, M.; Grussenmeyer, P.; Freville, T. Acquisition and Processing Protocols for Uav Images: 3d Modeling of Historical Buildings Using Photogrammetry. In Proceedings of the 26th International CIPA Symposium 2017, Ottawa, ON, Canada, 28 August–1 September 2017; Volume 4, pp. 163–170. [Google Scholar]
  65. Saifizi, M.; Azani Mustafa, W.; Syahirah Mohammad Radzi, N.; Aminudin Jamlos, M.; Zulkarnain Syed Idrus, S. UAV Based Image Acquisition Data for 3D Model Application. IOP Conf. Ser. Mater. Sci. Eng. 2020, 917, 012074. [Google Scholar] [CrossRef]
  66. Khaloo, A.; Lattanzi, D. Integrating 3D Computer Vision and Robotic Infrastructure Inspection. In Proceedings of the 11th International Workshop on Structural Health Monitoring (IWSHM), Stanford, CA, USA, 12–14 September 2017; Volume 2, pp. 3202–3209. [Google Scholar]
  67. Kersten, J.; Rodehorst, V.; Hallermann, N.; Debus, P.; Morgenthal, G. Potentials of Autonomous UAS and Automated Image Analysis for Structural Health Monitoring. In Proceedings of the IABSE Symposium 2018, Nantes, France, 19–21 September 2018; pp. S24–S119. [Google Scholar]
  68. Kakillioglu, B.; Velipasalar, S.; Rakha, T. Autonomous Heat Leakage Detection from Unmanned Aerial Vehicle-Mounted Thermal Cameras. In Proceedings of the 12th International Conference, Jeju, Republic of Korea, 25–28 October 2018. [Google Scholar]
  69. Gomez, J.; Tascon, A. A Protocol for Using Unmanned Aerial Vehicles to Inspect Agro-Industrial Buildings. Inf. Constr. 2021, 73, e421. [Google Scholar] [CrossRef]
  70. Dergachov, K.; Kulik, A. Impact-Resistant Flying Platform for Use in the Urban Construction Monitoring. In Methods and Applications of Geospatial Technology in Sustainable Urbanism; IGI Global: Pennsylvania, PA, USA, 2021; pp. 520–551. [Google Scholar]
  71. Kucuksubasi, F.; Sorguc, A.G. Transfer Learning-Based Crack Detection by Autonomous UAVs. In Proceedings of the 35th International Symposium on Automation and Robotics in Construction (ISARC), Berlin, Germany, 20–25 July 2018; Teizer, J., Ed.; International Association for Automation and Robotics in Construction (IAARC): Taipei, Taiwan, 2018; pp. 593–600. [Google Scholar]
  72. Bruggemann, T. Automated Feature-Driven Flight Planning for Airborne Inspection of Large Linear Infrastructure Assets. IEEE Trans. Autom. Sci. Eng. 2022, 19, 804–817. [Google Scholar] [CrossRef]
  73. Nakata, K.; Umemoto, K.; Kaneko, K.; Ryusuke, F. Development and Operation of Wire Movement Type Bridge Inspection Robot System ARANEUS. In Proceedings of the International Symposium on Applied Science, Bali, Indonesia, 24–25 October 2019; Kalpa Publications in Engineering: Manchester, UK, 2020; pp. 168–174. [Google Scholar]
  74. Lyu, J.; Zhao, T.; Xu, G. Research on UAV’s Fixed-Point Cruise Method Aiming at the Appearance Defects of Buildings. In Proceedings of the ACM International Conference Proceeding Series; Association for Computing Machinery: New York, NY, USA, 2021; pp. 783–787. [Google Scholar]
  75. Vazquez-Nicolas, J.M.; Zamora, E.; Gonzalez-Hernandez, I.; Lozano, R.; Sossa, H. Towards Automatic Inspection: Crack Recognition Based on Quadrotor UAV-Taken Images. In Proceedings of the 2018 International Conference on Unmanned Aircraft Systems (ICUAS), Dallas, TX, USA, 12–15 June 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 654–659. [Google Scholar]
  76. Choi, Y.; Payan, A.P.; Briceno, S.I.; Mavris, D.N. A Framework for Unmanned Aerial Systems Selection and Trajectory Generation for Imaging Service Missions. In Proceedings of the 2018 Aviation Technology, Integration, and Operations Conference, Atlanta, GA, USA, 25–29 June 2018. [Google Scholar]
  77. Zhenga, Z.J.; Pana, M.; Pan, W. Virtual Prototyping-Based Path Planning of Unmanned Aerial Vehicles for Building Exterior Inspection. In Proceedings of the 2020 37th ISARC, Kitakyushu, Japan, 27–28 October 2020; pp. 16–23. [Google Scholar]
  78. Debus, P.; Rodehorst, V. Multi-Scale Flight Path Planning for UAS Building Inspection; Lecture Notes in Civil Engineering; Springer: Berlin, Germany, 2021; Volume 98, p. 1085. [Google Scholar]
  79. Freimuth, H.; Müller, J.; König, M. Simulating and Executing UAV-Assisted Inspections on Construction Sites. In Proceedings of the 2017 34rd ISARC, Taipei, Taiwan, 28 June–1 July 2017; pp. 647–654. [Google Scholar]
  80. Sa, I.; Corke, P. Vertical Infrastructure Inspection Using a Quadcopter and Shared Autonomy Control; Springer Tracts in Advanced Robotics; Springer: Berlin, Germany, 2014; Volume 92, p. 232. [Google Scholar]
  81. Hamledari, H.; Davari, S.; Sajedi, S.O.; Zangeneh, P.; McCabe, B.; Fischer, M. UAV Mission Planning Using Swarm Intelligence and 4D BIMs in Support of Vision-Based Construction Progress Monitoring and as-Built Modeling. In Proceedings of the Construction Research Congress 2018: Construction Information Technology-Selected Papers from the Construction Research Congress 2018, New Orleans, LA, USA, 2–4 April 2018; American Society of Civil Engineers (ASCE): Reston, VA, USA, 2018; pp. 43–53. [Google Scholar]
  82. Kayhani, N.; Zhao, W.; McCabe, B.; Schoellig, A.P. Tag-Based Visual-Inertial Localization of Unmanned Aerial Vehicles in Indoor Construction Environments Using an on-Manifold Extended Kalman Filter. Autom. Constr. 2022, 135, 104112. [Google Scholar] [CrossRef]
  83. Vanegas, F.; Gaston, K.; Roberts, J.; Gonzalez, F. A Framework for UAV Navigation and Exploration in GPS-Denied Environments. In Proceedings of the 2019 IEEE Aerospace Conference, Big Sky, MT, USA, 2–9 March 2019. [Google Scholar]
  84. Usenko, V.; von Stumberg, L.; Stückler, J.; Cremers, D. TUM Flyers: Vision-Based MAV Navigation for Systematic Inspection of Structures; Springer Tracts in Advanced Robotics; Springer: Berlin, Germany, 2020; Volume 136, p. 209. [Google Scholar]
  85. Rea, P.; Ottaviano, E.; Castillo-Garcia, F.; Gonzalez-Rodriguez, A. Inspection Robotic System: Design and Simulation for Indoor and Outdoor Surveys; Machado, J., Soares, F., Trojanowska, J., Yildirim, S., Eds.; Springer: Berlin, Germany, 2022; pp. 313–321. [Google Scholar]
  86. Gibb, S.; La, H.M.; Le, T.; Nguyen, L.; Schmid, R.; Pham, H. Nondestructive Evaluation Sensor Fusion with Autonomous Robotic System for Civil Infrastructure Inspection. J. Field Robot. 2018, 35, 988–1004. [Google Scholar] [CrossRef]
  87. Nitta, Y.; Nishitani, A.; Iwasaki, A.; Watakabe, M.; Inai, S.; Ohdomari, I. Damage Assessment Methodology for Nonstructural Components with Inspection Robot. Key Eng. Mater. 2013, 558, 297–304. [Google Scholar] [CrossRef]
  88. Watanabe, A.; Even, J.; Morales, L.Y.; Ishi, C. Robot-Assisted Acoustic Inspection of Infrastructures-Cooperative Hammer Sounding Inspection. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–2 October 2015; pp. 5942–5947. [Google Scholar]
  89. Phillips, S.; Narasimhan, S. Automating Data Collection for Robotic Bridge Inspections. J. Bridge Eng. 2019, 24, 04019075. [Google Scholar] [CrossRef]
  90. Asadi, K.; Jain, R.; Qin, Z.; Sun, M.; Noghabaei, M.; Cole, J.; Han, K.; Lobaton, E. Vision-Based Obstacle Removal System for Autonomous Ground Vehicles Using a Robotic Arm. In Proceedings of the ASCE International Conference on Computing in Civil Engineering 2019, Atlanta, GA, USA, 17–19 June 2019; pp. 328–335. [Google Scholar]
  91. Kim, P.; Chen, J.; Kim, J.; Cho, Y.K. SLAM-Driven Intelligent Autonomous Mobile Robot Navigation for Construction Applications. In Proceedings of the Advanced Computing Strategies for Engineering, Lausanne, Switzerland, 10–13 June 2018; Smith, I.F.C., Domer, B., Eds.; Springer International Publishing: Cham, Switzerland, 2018; pp. 254–269. [Google Scholar]
  92. Davidson, N.C.; Chase, S.B. Initial Testing of Advanced Ground-Penetrating Radar Technology for the Inspection of Bridge Decks: The HERMES and PERES Bridge Inspectors. In Proceedings of the SPIE Proceedings, Newport Beach, CA, USA, 1 February 1999; Volume 3587, pp. 180–185. [Google Scholar] [CrossRef]
  93. Dobmann, G.; Kurz, J.H.; Taffe, A.; Streicher, D. Development of Automated Non-Destructive Evaluation (NDE) Systems for Reinforced Concrete Structures and Other Applications. In Non-Destructive Evaluation of Reinforced Concrete Structures; Maierhofer, C., Reinhardt, H.-W., Dobmann, G., Eds.; Woodhead Publishing: Sawston, UK, 2010; Volume 2, pp. 30–62. ISBN 978-1-84569-950-5. [Google Scholar]
  94. Gibb, S.; Le, T.; La, H.M.; Schmid, R.; Berendsen, T. A Multi-Functional Inspection Robot for Civil Infrastructure Evaluation and Maintenance. In Proceedings of the 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada, 24–28 September 2017; pp. 2672–2677. [Google Scholar]
  95. Massaro, A.; Savino, N.; Selicato, S.; Panarese, A.; Galiano, A.; Dipierro, G. Thermal IR and GPR UAV and Vehicle Embedded Sensor Non-Invasive Systems for Road and Bridge Inspections. In Proceedings of the 2021 IEEE International Workshop on Metrology for Industry 4.0 & IoT (MetroInd4.0&IoT), Rome, Italy, 7–9 June 2021; pp. 248–253. [Google Scholar]
  96. Kanellakis, C.; Fresk, E.; Mansouri, S.; Kominiak, D.; Nikolakopoulos, G. Towards Visual Inspection of Wind Turbines: A Case of Visual Data Acquisition Using Autonomous Aerial Robots. IEEE Access 2020, 8, 181650–181661. [Google Scholar] [CrossRef]
  97. McCrea, A.; Chamberlain, D.A. Towards the Development of a Bridge Inspecting Automated Device. In Proceedings of the Automation and Robotics in Construction X, Houston, TX, USA, 26 May 1993; Watson, G.H., Tucker, R.L., Walters, J.K., Eds.; International Association for Automation and Robotics in Construction (IAARC): Chennai, India, 1993; pp. 269–276. [Google Scholar]
  98. Mu, H.; Li, Y.; Chen, D.; Li, J.; Wang, M. Design of Tank Inspection Robot Navigation System Based on Virtual Reality. In Proceedings of the 2021 IEEE International Conference on Robotics and Biomimetics (ROBIO), Sanya, China, 27–31 December 2021; pp. 1773–1778. [Google Scholar]
  99. Moselhi, O.; Shehab-Eldeen, T. Automated Detection of Surface Defects in Water and Sewer Pipes. Autom. Constr. 1999, 8, 581–588. [Google Scholar] [CrossRef]
  100. Liu, K.P.; Luk, B.L.; Tong, F.; Chan, Y.T. Application of Service Robots for Building NDT Inspection Tasks. Ind. Robot 2011, 38, 58–65. [Google Scholar] [CrossRef]
  101. Barry, N.; Fisher, E.; Vaughan, J. Modeling and Control of a Cable-Suspended Robot for Inspection of Vertical Structures. J. Phys. Conf. Ser. 2016, 744, 012071. [Google Scholar]
  102. Aracil, R.; Saltarén, R.J.; Almonacid, M.; Azorín, J.M.; Sabater, J.M. Climbing Parallel Robots Morphologies. IFAC Proc. Vol. 2000, 33, 471–476. [Google Scholar] [CrossRef]
  103. Esser, B.; Huston, D.R. Versatile Robotic Platform for Structural Health Monitoring and Surveillance. Smart Struct. Syst. 2005, 1, 325. [Google Scholar] [CrossRef]
  104. Schempf, H. Neptune: Above-Ground Storage Tank Inspection Robot System. In Proceedings of the 1994 IEEE International Conference on Robotics and Automation, San Diego, CA, USA, 8–13 May 1994; IEEE: Piscataway, NJ, USA, 1994; Volume 2, pp. 1403–1408. [Google Scholar]
  105. Caccia, M.; Robino, R.; Bateman, W.; Eich, M.; Ortiz, A.; Drikos, L.; Todorova, A.; Gaviotis, I.; Spadoni, F.; Apostolopoulou, V. MINOAS a Marine INspection RObotic Assistant: System Requirements and Design. IFAC Proc. Vol. 2010, 43, 479–484. [Google Scholar] [CrossRef]
  106. Longo, D.; Muscato, G.; Sessa, S. Simulator for Locomotion Control of the Alicia3 Climbing Robot. In Climbing and Walking Robots; Tokhi, M.O., Virk, G.S., Hossain, M.A., Eds.; Springer: Berlin/Heidelberg, Germany, 2006; pp. 843–850. [Google Scholar]
  107. Kouzehgar, M.; Krishnasamy Tamilselvam, Y.; Vega Heredia, M.; Rajesh Elara, M. Self-Reconfigurable Façade-Cleaning Robot Equipped with Deep-Learning-Based Crack Detection Based on Convolutional Neural Networks. Autom. Constr. 2019, 108, 102959. [Google Scholar] [CrossRef]
  108. Hernando, M.; Brunete, A.; Gambao, E. ROMERIN: A Modular Climber Robot for Infrastructure Inspection. IFAC-PapersOnLine 2019, 52, 424–429. [Google Scholar] [CrossRef]
  109. Hou, S.; Dong, B.; Wang, H.; Wu, G. Inspection of Surface Defects on Stay Cables Using a Robot and Transfer Learning. Autom. Constr. 2020, 119, 103382. [Google Scholar] [CrossRef]
  110. Kajiwara, H.; Hanajima, N.; Kurashige, K.; Fujihira, Y. Development of Hanger-Rope Inspection Robot for Suspension Bridges. J. Robot. Mechatron. 2019, 31, 855–862. [Google Scholar] [CrossRef]
  111. Boreyko, A.A.; Moun, S.A.; Scherbatyuk, A.P. Precise UUV Positioning Based on Images Processing for Underwater Construction Inspection. In Proceedings of the Eighth ISOPE Pacific/Asia Offshore Mechanics Symposium, Bangkok, Thailand, 10–14 November 2008; pp. 14–20. [Google Scholar]
  112. Atyabi, A.; MahmoudZadeh, S.; Nefti-Meziani, S. Current Advancements on Autonomous Mission Planning and Management Systems: An AUV and UAV Perspective. Annu. Rev. Control 2018, 46, 196–215. [Google Scholar] [CrossRef]
  113. Shimono, S.; Toyama, S.; Nishizawa, U. Development of Underwater Inspection System for Dam Inspection: Results of Field Tests. In Proceedings of the OCEANS 2016 MTS/IEEE Monterey, Monterey, CA, USA, 19–23 September 2016; pp. 1–4. [Google Scholar]
  114. Shojaei, A.; Moud, H.I.; Flood, I. Proof of Concept for the Use of Small Unmanned Surface Vehicle in Built Environment Management. In Proceedings of the Construction Research Congress, New Orleans, LA, USA, 2–4 April 2018; pp. 116–126. [Google Scholar]
  115. Masago, H. Approaches towards Standardization of Performance Evaluation of Underwater Infrastructure Inspection Robots: Establishment of Standard Procedures and Training Programs. In Proceedings of the 2021 IEEE International Conference on Intelligence and Safety for Robotics (ISR), Tokoname, Japan, 4–6 March 2021; pp. 244–247. [Google Scholar]
  116. Cymbal, M.; Tao, H.; Tao, J. Underwater Inspection with Remotely Controlled Robot and Image Based 3D Structure Reconstruction Techniques. In Proceedings of the Transportation Research Board 95th Annual Meeting, Washington, DC, USA, 10–14 January 2016; pp. 1–10. [Google Scholar]
  117. Ma, Y.; Ye, R.; Zheng, R.; Geng, L.; Yang, Y. A Highly Mobile Ducted Underwater Robot for Subsea Infrastructure Inspection. In Proceedings of the 2016 IEEE International Conference on Cyber Technology in Automation, Control, and Intelligent Systems (CYBER), Chengdu, China, 19–22 June 2016; pp. 397–400. [Google Scholar]
  118. Shojaei, A.; Moud, H.I.; Flood, I. Proof of Concept for the Use of Small Unmanned Surface Vehicle in Built Environment Management. In Proceedings of the Construction Research Congress 2018: Construction Information Technology-Selected Papers from the Construction Research Congress 2018, New Orleans, LA, USA, 2–4 April 2018; American Society of Civil Engineers (ASCE): Reston, VA, USA, 2018; pp. 116–126. [Google Scholar]
  119. Shojaei, A.; Izadi Moud, H.; Razkenari, M.; Flood, I.; Hakim, H. Feasibility Study of Small Unmanned Surface Vehicle Use in Built Environment Assessment. In Proceedings of the Institute of Industrial and Systems Engineers (IISE) Annual Conference, Orlando, FL, USA, 19 May 2018. [Google Scholar]
  120. Paap, K.L.; Christaller, T.; Kirchner, F. A Robot Snake to Inspect Broken Buildings. In Proceedings of the 2000 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2000) (Cat. No.00CH37113), Takamatsu, Japan, 31 October–5 November 2000; Volume 3, pp. 2079–2082. [Google Scholar]
  121. Lakshmanan, A.; Elara, M.; Ramalingam, B.; Le, A.; Veerajagadeshwar, P.; Tiwari, K.; Ilyas, M. Complete Coverage Path Planning Using Reinforcement Learning for Tetromino Based Cleaning and Maintenance Robot. Autom. Constr. 2020, 112, 103078. [Google Scholar] [CrossRef]
  122. Lin, J.-L.; Hwang, K.-S.; Jiang, W.-C.; Chen, Y.-J. Gait Balance and Acceleration of a Biped Robot Based on Q-Learning. IEEE Access 2016, 4, 2439–2449. [Google Scholar] [CrossRef]
  123. Afsari, K.; Halder, S.; King, R.; Thabet, W.; Serdakowski, J.; Devito, S.; Mahnaz, E.; Lopez, J. Identification of Indicators for Effectiveness Evaluation of Four-Legged Robots in Automated Construction Progress Monitoring. In Proceedings of the Construction Research Congress, Arlington, VA, USA, 9–12 March 2022; American Society of Civil Engineers: Reston, VA, USA, 2022; pp. 610–620. [Google Scholar]
  124. Faigl, J.; Čížek, P. Adaptive Locomotion Control of Hexapod Walking Robot for Traversing Rough Terrains with Position Feedback Only. Robot. Auton. Syst. 2019, 116, 136–147. [Google Scholar] [CrossRef]
  125. Afsari, K.; Halder, S.; Ensafi, M.; DeVito, S.; Serdakowski, J. Fundamentals and Prospects of Four-Legged Robot Application in Construction Progress Monitoring. In Proceedings of the ASC International Proceedings of the Annual Conference, Virtual, 5–8 April 2021; pp. 271–278. [Google Scholar]
  126. Halder, S.; Afsari, K.; Serdakowski, J.; DeVito, S. A Methodology for BIM-Enabled Automated Reality Capture in Construction Inspection with Quadruped Robots. In Proceedings of the International Symposium on Automation and Robotics in Construction, Bogotá, Colombia, 13–15 July 2021; pp. 17–24. [Google Scholar]
  127. Shin, J.-U.; Kim, D.; Jung, S.; Myung, H. Dynamics Analysis and Controller Design of a Quadrotor-Based Wall-Climbing Robot for Structural Health Monitoring. In Structural Health Monitoring 2015; Stanford University: Stanford, CA, USA, 2015. [Google Scholar]
  128. Shin, J.-U.; Kim, D.; Kim, J.-H.; Jeon, H.; Myung, H. Quadrotor-Based Wall-Climbing Robot for Structural Health Monitoring. Struct. Health Monit. 2013, 2, 1889–1894. [Google Scholar]
  129. Kriengkomol, P.; Kamiyama, K.; Kojima, M.; Horade, M.; Mae, Y.; Arai, T. Hammering Sound Analysis for Infrastructure Inspection by Leg Robot. In Proceedings of the 2015 IEEE International Conference on Robotics and Biomimetics (ROBIO), Zhuhai, China, 6–9 December 2015; pp. 887–892. [Google Scholar]
  130. Prieto, S.A.; Giakoumidis, N.; García de Soto, B. AutoCIS: An Automated Construction Inspection System for Quality Inspection of Buildings. In Proceedings of the 2021 38th ISARC, Dubai, United Arab Emirates, 2–4 November 2021. [Google Scholar]
  131. Kim, P.; Price, L.C.; Park, J.; Cho, Y.K. UAV-UGV Cooperative 3D Environmental Mapping. In Proceedings of the ASCE International Conference on Computing in Civil Engineering 2019, Atlanta, GA, USA, 17–19 June 2019; ASCE: Atlanta, GA, USA, 2019; pp. 384–392. [Google Scholar]
  132. Kim, P.; Park, J.; Cho, Y. As-Is Geometric Data Collection and 3D Visualization through the Collaboration between UAV and UGV. In Proceedings of the 36th International Symposium on Automation and Robotics in Construction (ISARC), Banff, AB, Canada, 21–24 May 2019; Al-Hussein, M., Ed.; International Association for Automation and Robotics in Construction (IAARC): Banff, AB, Canada, 2019; pp. 544–551. [Google Scholar]
  133. Khaloo, A.; Lattanzi, D.; Jachimowicz, A.; Devaney, C. Utilizing UAV and 3D Computer Vision for Visual Inspection of a Large Gravity Dam. Front. Built Environ. 2018, 4, 31. [Google Scholar] [CrossRef]
  134. Mansouri, S.S.; Kanellakis, C.; Fresk, E.; Kominiak, D.; Nikolakopoulos, G. Cooperative UAVs as a Tool for Aerial Inspection of the Aging Infrastructure. In Proceedings of the Field and Service Robotics; Hutter, M., Siegwart, R., Eds.; Springer International Publishing: Cham, Switzerland, 2018; pp. 177–189. [Google Scholar]
  135. Mansouri, S.S.; Kanellakis, C.; Fresk, E.; Kominiak, D.; Nikolakopoulos, G. Cooperative Coverage Path Planning for Visual Inspection. Control Eng. Pract. 2018, 74, 118–131. [Google Scholar] [CrossRef]
  136. Ueda, T.; Hirai, H.; Fuchigami, K.; Yuki, R.; Jonghyun, A.; Yasukawa, S.; Nishida, Y.; Ishii, K.; Sonoda, T.; Higashi, K.; et al. Inspection System for Underwater Structure of Bridge Pier. Proc. Int. Conf. Artif. Life Robot. 2019, 24, 521–524. [Google Scholar] [CrossRef]
  137. Yang, Y.; Hirose, S.; Debenest, P.; Guarnieri, M.; Izumi, N.; Suzumori, K. Development of a Stable Localized Visual Inspection System for Underwater Structures. Adv. Robot. 2016, 30, 1415–1429. [Google Scholar] [CrossRef]
  138. Sulaiman, M.; AlQahtani, A.; Liu, H.; Binalhaj, M. Utilizing Wi-Fi Access Points for Unmanned Aerial Vehicle Localization for Building Indoor Inspection. In Proceedings of the 2021 IEEE 12th Annual Ubiquitous Computing, Electronics & Mobile Communication Conference (UEMCON), New York, NY, USA, 1–4 December 2021; Paul, R., Ed.; IEEE: Piscataway, NJ, USA, 2021; pp. 775–779. [Google Scholar]
  139. Kuo, C.H.; Hsiung, S.H.; Peng, K.C.; Yang, T.H.; Hsieh, Y.C.; Tsai, Y.D.; Shen, C.C.; Kuang, C.Y.; Hsieh, P.S.; et al. Unmanned System and Vehicle for Infrastructure Inspection–Image Correction, Quantification, Reliabilities and Formation of Façade Map. In Proceedings of the 9th European Workshop on Structural Health Monitoring (EWSHM 2018), Manchester, UK, 10–13 July 2018. [Google Scholar]
  140. Jo, J.; Jadidi, Z.; Stantic, B. A Drone-Based Building Inspection System Using Software-Agents; Ivanovic, M., Badica, C., Dix, J., Jovanovic, Z., Malgeri, M., Savic, M., Eds.; Springer: Berlin, Germany, 2018; Volume 737, pp. 115–121. [Google Scholar]
  141. Wang, R.; Xiong, Z.; Chen, Y.L.; Manjunatha, P.; Masri, S.F. A Two-Stage Local Positioning Method with Misalignment Calibration for Robotic Structural Monitoring of Buildings. J. Dyn. Syst. Meas. Control Trans. ASME 2019, 141, 061014. [Google Scholar] [CrossRef]
  142. Falorca, J.F.; Lanzinha, J.C.G. Facade Inspections with Drones–Theoretical Analysis and Exploratory Tests. Int. J. Build. Pathol. Adapt. 2021, 39, 235–258. [Google Scholar] [CrossRef]
  143. Grosso, R.; Mecca, U.; Moglia, G.; Prizzon, F.; Rebaudengo, M. Collecting Built Environment Information Using UAVs: Time and Applicability in Building Inspection Activities. Sustainability 2020, 12, 4731. [Google Scholar] [CrossRef]
  144. Kuo, C.; Leber, A.; Kuo, C.; Boller, C.; Eschmann, C.; Kurz, J. Unmanned Robot System for Structure Health Monitoring and Non-Destructive Building Inspection, Current Technologies Overview and Future Improvements. In Proceedings of the 9th International Workshop on Structural Health Monitoring, Stanford, CA, USA, 10–13 September 2013; Chang, F., Ed.; pp. 1–8. [Google Scholar]
  145. Perry, B.J.; Guo, Y.; Atadero, R.; van de Lindt, J.W. Tracking Bridge Condition over Time Using Recurrent UAV-Based Inspection. In Bridge Maintenance, Safety, Management, Life-Cycle Sustainability and Innovations; CRC Press: Boca Raton, FL, USA, 2021; pp. 286–291. [Google Scholar]
  146. Liu, Y.; Lin, Y.; Yeoh, J.K.W.; Chua, D.K.H.; Wong, L.W.C.; Ang, M.H., Jr.; Lee, W.L.; Chew, M.Y.L. Framework for Automated UAV-Based Inspection of External Building Façades; Advances in 21st Century Human Settlements; Springer: Berlin, Germany, 2021; p. 194. [Google Scholar]
  147. Fujihira, Y.; Buriya, K.; Kaneko, S.; Tsukida, T.; Hanajima, N.; Mizukami, M. Evaluation of Whole Circumference Image Acquisition System for a Long Cylindrical Structures Inspection Robot. In Proceedings of the 2017 56th Annual Conference of the Society of Instrument and Control Engineers of Japan (SICE), Kanazawa, Japan, 19–22 September 2017; pp. 1108–1110. [Google Scholar]
  148. Levine, N.M.; Spencer, B.F., Jr. Post-Earthquake Building Evaluation Using UAVs: A BIM-Based Digital Twin Framework. Sensors 2022, 22, 873. [Google Scholar] [CrossRef] [PubMed]
  149. Yusof, H.; Ahmad, M.; Abdullah, A. Historical Building Inspection Using the Unmanned Aerial Vehicle (UAV). Int. J. Sustain. Constr. Eng. Technol. 2020, 11, 12–20. [Google Scholar] [CrossRef]
  150. Mita, A.; Shinagawa, Y. Response Estimation of a Building Subject to a Large Earthquake Using Acceleration Data of a Single Floor Recorded by a Sensor Agent Robot. In Proceedings of the Sensors and Smart Structures Technologies for Civil, Mechanical, and Aerospace Systems 2014, San Diego, CA, USA, 10–13 March 2014; International Society for Optics and Photonics: Bellingham, WA, USA, 2014; Volume 9061, p. 90611G. [Google Scholar]
  151. Ghosh Mondal, T.; Jahanshahi, M.R.; Wu, R.; Wu, Z.Y. Deep Learning-based Multi-class Damage Detection for Autonomous Post-disaster Reconnaissance. Struct. Control Health Monit. 2020, 27, e2507. [Google Scholar] [CrossRef]
  152. Mader, D.; Blaskow, R.; Westfeld, P.; Weller, C. Potential of Uav-Based Laser Scanner and Multispectral Camera Data in Building Inspection. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, 41, 1135–1142. [Google Scholar] [CrossRef]
  153. Mirzabeigi, S.; Razkenari, M. Automated Vision-Based Building Inspection Using Drone Thermography. In Proceedings of the Construction Research Congress 2022, Arlington, VA, USA, 9–12 March 2022; Jazizadeh, F., Shealy, T., Garvin, M., Eds.; pp. 737–746. [Google Scholar]
  154. Jung, H.-J.; Lee, J.-H.; Yoon, S.; Kim, I.-H. Bridge Inspection and Condition Assessment Using Unmanned Aerial Vehicles (UAVs): Major Challenges and Solutions from a Practical Perspective. Smart Struct. Syst. 2019, 24, 669–681. [CrossRef]
  155. Schober, T. CLIBOT: A Rope Climbing Robot for Building Surface Inspection. Bautechnik 2010, 87, 81–85. [Google Scholar] [CrossRef]
  156. Lee, J.; Hwang, I.; Lee, H. Development of Advanced Robot System for Bridge Inspection and Monitoring. IABSE Symp. Rep. 2007, 93, 9–16. [Google Scholar] [CrossRef]
  157. Adán, A.; Prieto, S.A.; Quintana, B.; Prado, T.; García, J. An Autonomous Thermal Scanning System with Which to Obtain 3D Thermal Models of Buildings. In Proceedings of the Advances in Informatics and Computing in Civil and Construction Engineering, Chicago, IL, USA, 9 October 2018; Mutis, I., Hartmann, T., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 489–496. [Google Scholar]
  158. Dios, J.R.M.; Ollero, A. Automatic Detection of Windows Thermal Heat Losses in Buildings Using UAVs. In Proceedings of the 2006 World Automation Congress, Budapest, Hungary, 24–26 July 2006; pp. 1–6. [Google Scholar]
  159. Wang, C.; Cho, Y.K.; Gai, M. As-Is 3D Thermal Modeling for Existing Building Envelopes Using a Hybrid LIDAR System. J. Comput. Civ. Eng. 2013, 27, 645–656. [Google Scholar] [CrossRef]
  160. Peng, X.; Zhong, X.; Chen, A.; Zhao, C.; Liu, C.; Chen, Y. Debonding Defect Quantification Method of Building Decoration Layers via UAV-Thermography and Deep Learning. Smart Struct. Syst. 2021, 28, 55–67. [Google Scholar] [CrossRef]
  161. Pini, F.; Ferrari, C.; Libbra, A.; Leali, F.; Muscio, A. Robotic Implementation of the Slide Method for Measurement of the Thermal Emissivity of Building Elements. Energy Build. 2016, 114, 241–246. [Google Scholar] [CrossRef] [Green Version]
  162. Bohari, S.N.; Amran, A.U.; Zaki, N.A.M.; Suhaimi, M.S.; Rasam, A.R.A. Accuracy Assessment of Detecting Cracks on Concrete Wall at Different Distances Using Unmanned Autonomous Vehicle (UAV) Images; IOP Publishing Ltd.: Bristol, UK, 2021; Volume 620. [Google Scholar]
  163. Yan, Y.; Mao, Z.; Wu, J.; Padir, T.; Hajjar, J.F. Towards Automated Detection and Quantification of Concrete Cracks Using Integrated Images and Lidar Data from Unmanned Aerial Vehicles. Struct. Control Health Monit. 2021, 28, e2757. [Google Scholar] [CrossRef]
  164. Liu, D.; Xia, X.; Chen, J.; Li, S. Integrating Building Information Model and Augmented Reality for Drone-Based Building Inspection. J. Comput. Civ. Eng. 2021, 35, 04020073. [Google Scholar] [CrossRef]
  165. Wang, J.; Luo, C. Automatic Wall Defect Detection Using an Autonomous Robot: A Focus on Data Collection. In Proceedings of the ASCE International Conference on Computing in Civil Engineering 2019, Atlanta, GA, USA, 17–19 June 2019; pp. 312–319. [Google Scholar]
  166. Vazquez-Nicolas, J.M.; Zamora, E.; González-Hernández, I.; Lozano, R.; Sossa, H. PD+SMC Quadrotor Control for Altitude and Crack Recognition Using Deep Learning. Int. J. Control Autom. Syst. 2020, 18, 834–844. [Google Scholar] [CrossRef]
  167. Lorenc, S.J.; Yeager, M.; Bernold, L.E. Field Test Results of the Remotely Controlled Drilled Shaft Inspection System (DSIS). In Proceedings of the Robotics 2000, Albuquerque, NM, USA, 27 February–2 March 2000; pp. 120–125. [Google Scholar]
  168. Shutin, D.; Malakhov, A.; Marfin, K. Applying Automated and Robotic Means in Construction as a Factor for Providing Constructive Safety of Buildings and Structures. MATEC Web Conf. 2018, 251, 3005. [Google Scholar] [CrossRef]
  169. Hamledari, H.; McCabe, B.; Davari, S. Automated Computer Vision-Based Detection of Components of under-Construction Indoor Partitions. Autom. Constr. 2017, 74, 78–94. [Google Scholar] [CrossRef]
  170. Hussung, D.; Kurz, J.; Stoppel, M. Automated Non-Destructive Testing Technology for Large-Area Reinforced Concrete Structures. Concr. Reinf. Concr. Constr. 2012, 107, 794–804. [Google Scholar] [CrossRef]
  171. Qu, T.; Zang, W.; Peng, Z.; Liu, J.; Li, W.; Zhu, Y.; Zhang, B.; Wang, Y. Construction Site Monitoring Using UAV Oblique Photogrammetry and BIM Technologies. In Proceedings of the 22nd International Conference of the Association for Computer-Aided Architectural Design Research in Asia (CAADRIA) 2017, Yunlin, Taiwan, 5–8 April 2017; The Association for Computer-Aided Architectural Design Research in Asia (CAADRIA): Hong Kong, China, 2017; pp. 655–663. [Google Scholar]
  172. Samsami, R.; Mukherjee, A.; Brooks, C.N. Application of Unmanned Aerial System (UAS) in Highway Construction Progress Monitoring Automation; American Society of Civil Engineers: Arlington, VA, USA, 2022; pp. 698–707. [Google Scholar]
  173. Han, D.; Lee, S.B.; Song, M.; Cho, J.S. Change Detection in Unmanned Aerial Vehicle Images for Progress Monitoring of Road Construction. Buildings 2021, 11, 150. [Google Scholar] [CrossRef]
  174. Shen, H.; Li, X.; Jiang, X.; Liu, Y. Automatic Scan Planning and Construction Progress Monitoring in Unknown Building Scene. In Proceedings of the 2021 IEEE International Conference on Robotics and Biomimetics (ROBIO), Sanya, China, 27–31 December 2021; pp. 1617–1622. [Google Scholar]
  175. Jacob-Loyola, N.; Muñoz-La Rivera, F.; Herrera, R.F.; Atencio, E. Unmanned Aerial Vehicles (UAVs) for Physical Progress Monitoring of Construction. Sensors 2021, 21, 4227. [Google Scholar] [CrossRef]
  176. Perez, M.A.; Zech, W.C.; Donald, W.N. Using Unmanned Aerial Vehicles to Conduct Site Inspections of Erosion and Sediment Control Practices and Track Project Progression. Transp. Res. Rec. 2015, 2528, 38–48. [Google Scholar] [CrossRef]
  177. Serrat, C.; Banaszek, S.; Cellmer, A.; Gilbert, V.; Banaszek, A. UAV, Digital Processing and Vectorization Techniques Applied to Building Condition Assessment and Follow-Up. Teh. Glas.-Tech. J. 2020, 14, 507–513. [Google Scholar] [CrossRef]
  178. Halder, S.; Afsari, K. Real-Time Construction Inspection in an Immersive Environment with an Inspector Assistant Robot. In Proceedings of the ASC2022: 58th Annual Associated Schools of Construction International Conference, Atlanta, GA, USA, 20–23 April 2022; Leathem, T., Collins, W., Perrenoud, A., Eds.; EasyChair: Atlanta, GA, USA, 2022; Volume 3, pp. 389–397. [Google Scholar]
  179. Bang, S.; Kim, H.; Kim, H. UAV-Based Automatic Generation of High-Resolution Panorama at a Construction Site with a Focus on Preprocessing for Image Stitching. Autom. Constr. 2017, 84, 70–80. [Google Scholar] [CrossRef]
  180. Asadi, K.; Ramshankar, H.; Pullagurla, H.; Bhandare, A.; Shanbhag, S.; Mehta, P.; Kundu, S.; Han, K.; Lobaton, E.; Wu, T. Vision-Based Integrated Mobile Robotic System for Real-Time Applications in Construction. Autom. Constr. 2018, 96, 470–482. [Google Scholar] [CrossRef]
  181. Ibrahim, A.; Sabet, A.; Golparvar-Fard, M. BIM-Driven Mission Planning and Navigation for Automatic Indoor Construction Progress Detection Using Robotic Ground Platform. Proc. 2019 Eur. Conf. Comput. Constr. 2019, 1, 182–189. [Google Scholar] [CrossRef]
  182. Shirina, N.V.; Kalachuk, T.G.; Shin, E.R.; Parfenyukova, E.A. Use of UAV to Resolve Construction Disputes; Lecture Notes in Civil Engineering; Springer: Berlin, Germany, 2021; Volume 151, LNCE; p. 246. [Google Scholar]
  183. Shang, Z.; Shen, Z. Real-Time 3D Reconstruction on Construction Site Using Visual SLAM and UAV. Constr. Res. Congr. 2018, 44, 305–315. [Google Scholar]
  184. Freimuth, H.; König, M. A Toolchain for Automated Acquisition and Processing of As-Built Data with Autonomous UAVs; University College Dublin: Dublin, Ireland, 2019. [Google Scholar]
  185. Hamledari, H. IFC-Enabled Site-to-BIM Automation: An Interoperable Approach Toward the Integration of Unmanned Aerial Vehicle (UAV)-Captured Reality into BIM. In Proceedings of the BuildingSMART Int. Student Project Award 2017, London, UK, 15 August 2017. [Google Scholar]
  186. Tian, J.; Luo, S.; Wang, X.; Hu, J.; Yin, J. Crane Lifting Optimization and Construction Monitoring in Steel Bridge Construction Project Based on BIM and UAV. Adv. Civ. Eng. 2021, 2021, 5512229. [Google Scholar] [CrossRef]
  187. Freimuth, H.; König, M. A Framework for Automated Acquisition and Processing of As-Built Data with Autonomous Unmanned Aerial Vehicles. Sensors 2019, 19, 4513. [Google Scholar] [CrossRef]
  188. Wang, J.; Huang, S.; Zhao, L.; Ge, J.; He, S.; Zhang, C.; Wang, X. High Quality 3D Reconstruction of Indoor Environments Using RGB-D Sensors. In Proceedings of the 2017 12th IEEE Conference on Industrial Electronics and Applications (ICIEA), Siem Reap, Cambodia, 18–20 June 2017; pp. 1739–1744. [Google Scholar]
  189. Coetzee, G.L. Smart Construction Monitoring of Dams with UAVs: Neckartal Dam Water Project Phase 1; ICE Publishing: London, UK, 2018; pp. 445–456. [Google Scholar]
  190. Golparvar-Fard, M.; Peña-Mora, F.; Savarese, S. D4AR: A 4-Dimensional Augmented Reality Model for Automating Construction Progress Monitoring Data Collection, Processing and Communication. J. Inf. Technol. Constr. 2009, 14, 129–153. [Google Scholar]
  191. Lattanzi, D.; Miller, G.R. 3D Scene Reconstruction for Robotic Bridge Inspection. J. Infrastruct. Syst. 2015, 21, 4014041. [Google Scholar] [CrossRef]
  192. De Winter, H.; Bassier, M.; Vergauwen, M. Digitisation in Road Construction: Automation of As-Built Models; Copernicus GmbH: Göttingen, Germany, 2022; Volume 46, pp. 69–76. [Google Scholar]
  193. Kim, P.; Chen, J.; Cho, Y.K. SLAM-Driven Robotic Mapping and Registration of 3D Point Clouds. Autom. Constr. 2018, 89, 38–48. [Google Scholar] [CrossRef]
  194. Bang, S.; Kim, H.; Kim, H. Vision-Based 2D Map Generation for Monitoring Construction Sites Using UAV Videos. In Proceedings of the 2017 34th ISARC, Taipei, Taiwan, 28 June–1 July 2017; Chiu, K.C., Ed.; IAARC: Taipei, Taiwan, 2017; pp. 830–833. [Google Scholar]
  195. Kim, S.; Irizarry, J.; Costa, D.B. Field Test-Based UAS Operational Procedures and Considerations for Construction Safety Management: A Qualitative Exploratory Study. Int. J. Civ. Eng. 2020, 18, 919–933. [Google Scholar] [CrossRef]
  196. Gheisari, M.; Esmaeili, B. Applications and Requirements of Unmanned Aerial Systems (UASs) for Construction Safety. Saf. Sci. 2019, 118, 230–240. [Google Scholar] [CrossRef]
  197. Gheisari, M.; Rashidi, A.; Esmaeili, B. Using Unmanned Aerial Systems for Automated Fall Hazard Monitoring. In Proceedings of the Construction Research Congress 2018, New Orleans, LA, USA, 2–4 April 2018; pp. 62–72. [Google Scholar]
  198. Asadi, K.; Chen, P.; Han, K.; Wu, T.; Lobaton, E. Real-Time Scene Segmentation Using a Light Deep Neural Network Architecture for Autonomous Robot Navigation on Construction Sites. In Proceedings of the Computing in Civil Engineering 2019: Data, Sensing, and Analytics, Reston, VA, USA, 17–19 June 2019; American Society of Civil Engineers: Reston, VA, USA, 2019; pp. 320–327. [Google Scholar]
  199. Mantha, B.R.; de Soto, B. Designing a Reliable Fiducial Marker Network for Autonomous Indoor Robot Navigation. In Proceedings of the 36th International Symposium on Automation and Robotics in Construction (ISARC), Banff, AB, Canada, 21–24 May 2019; pp. 21–24. [Google Scholar]
  200. Raja, A.K.; Pang, Z. High Accuracy Indoor Localization for Robot-Based Fine-Grain Inspection of Smart Buildings. In Proceedings of the 2016 IEEE International Conference on Industrial Technology (ICIT), Taipei, Taiwan, 14–17 March 2016; pp. 2010–2015. [Google Scholar]
  201. Myung, H.; Jung, J.; Jeon, H. Robotic SHM and Model-Based Positioning System for Monitoring and Construction Automation. Adv. Struct. Eng. 2012, 15, 943–954. [Google Scholar] [CrossRef]
  202. Chen, K.; Reichard, G.; Akanmu, A.; Xu, X. Geo-Registering UAV-Captured Close-Range Images to GIS-Based Spatial Model for Building Façade Inspections. Autom. Constr. 2021, 122, 103503. [Google Scholar] [CrossRef]
  203. Cesetti, A.; Frontoni, E.; Mancini, A.; Ascani, A.; Zingaretti, P.; Longhi, S. A Visual Global Positioning System for Unmanned Aerial Vehicles Used in Photogrammetric Applications. J. Intell. Robot. Syst. 2011, 61, 157–168. [Google Scholar] [CrossRef]
  204. Paterson, A.M.; Dowling, G.R.; Chamberlain, D.A. Building Inspection: Can Computer Vision Help? Autom. Constr. 1997, 7, 13–20. [Google Scholar] [CrossRef]
  205. Shang, Z.; Shen, Z. Vision Model-Based Real-Time Localization of Unmanned Aerial Vehicle for Autonomous Structure Inspection under GPS-Denied Environment. In Proceedings of the Computing in Civil Engineering 2019, Atlanta, GA, USA, 17–19 June 2019; pp. 292–298. [Google Scholar]
  206. Sugimoto, H.; Moriya, Y.; Ogasawara, T. Underwater Survey System of Dam Embankment by Remotely Operated Vehicle. In Proceedings of the 2017 IEEE Underwater Technology (UT), Busan, Republic of Korea, 21–24 February 2017; pp. 1–6. [Google Scholar]
  207. Ibrahim, A.; Roberts, D.; Golparvar-Fard, M.; Bretl, T. An Interactive Model-Driven Path Planning and Data Capture System for Camera-Equipped Aerial Robots on Construction Sites. In Proceedings of the ASCE International Workshop on Computing in Civil Engineering 2017, Seattle, WA, USA, 25–27 June 2017; American Society of Civil Engineers (ASCE): Seattle, WA, USA, 2017; pp. 117–124. [Google Scholar]
  208. Mantha, B. Navigation, Path Planning, and Task Allocation Framework for Mobile Co-Robotic Service Applications in Indoor Building Environments. Ph.D. Thesis, University of Michigan, Ann Arbor, MI, USA, 2018. [Google Scholar]
  209. Lin, S.; Kong, X.; Wang, J.; Liu, A.; Fang, G.; Han, Y. Development of a UAV Path Planning Approach for Multi-Building Inspection with Minimal Cost; Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Springer: Berlin, Germany, 2021; Volume 12606, LNCS; p. 93. [Google Scholar]
  210. González-deSantos, L.M.; Frías, E.; Martínez-Sánchez, J.; González-Jorge, H. Indoor Path-Planning Algorithm for UAV-Based Contact Inspection. Sensors 2021, 21, 642. [Google Scholar] [CrossRef]
  211. Shi, L.; Mehrooz, G.; Jacobsen, R.H. Inspection Path Planning for Aerial Vehicles via Sampling-Based Sequential Optimization. In Proceedings of the 2021 International Conference on Unmanned Aircraft Systems (ICUAS), Athens, Greece, 15–18 June 2021; pp. 679–687. [Google Scholar]
  212. Asadi, K.; Ramshankar, H.; Pullagurla, H.; Bhandare, A.; Shanbhag, S.; Mehta, P.; Kundu, S.; Han, K.; Lobaton, E.; Wu, T. Building an Integrated Mobile Robotic System for Real-Time Applications in Construction. In Proceedings of the 35th International Symposium on Automation and Robotics in Construction (ISARC), Berlin, Germany, 20–25 July 2018; Teizer, J., Ed.; International Association for Automation and Robotics in Construction (IAARC): Taipei, Taiwan, 2018; pp. 453–461. [Google Scholar]
  213. Warszawski, A.; Rosenfeld, Y.; Shohet, I. Autonomous Mapping System for an Interior Finishing Robot. J. Comput. Civ. Eng. 1996, 10, 67–77. [Google Scholar] [CrossRef]
  214. Zhang, H.; Chi, S.; Yang, J.; Nepal, M.; Moon, S. Development of a Safety Inspection Framework on Construction Sites Using Mobile Computing. J. Manag. Eng. 2017, 33, 4016048. [Google Scholar] [CrossRef]
  215. Al-Kaff, A.; Moreno, F.; Jose, L.; Garcia, F.; Martin, D.; de la Escalera, A.; Nieva, A.; Garcea, J. VBII-UAV: Vision-Based Infrastructure Inspection-UAV; Rocha, A., Correia, A., Adeli, H., Reis, L., Costanzo, S., Eds.; Springer: Berlin, Germany, 2017; Volume 570, pp. 221–231. [Google Scholar]
  216. Seo, J.; Han, S.; Lee, S.; Kim, H. Computer Vision Techniques for Construction Safety and Health Monitoring. Adv. Eng. Inform. 2015, 29, 239–251. [Google Scholar] [CrossRef]
  217. Li, Y.; Li, X.; Wang, H.; Wang, S.; Gu, S.; Zhang, H. Exposed Aggregate Detection of Stilling Basin Slabs Using Attention U-Net Network. KSCE J. Civ. Eng. 2020, 24, 1740–1749. [Google Scholar] [CrossRef]
  218. Giergiel, M.; Buratowski, T.; Małka, P.; Kurc, K.; Kohut, P.; Majkut, K. The Project of Tank Inspection Robot. In Proceedings of the Structural Health Monitoring II, Online, 12 July 2012; Trans Tech Publications Ltd.: Bäch, Switzerland, 2012; Volume 518, pp. 375–383. [Google Scholar]
  219. Munawar, H.; Ullah, F.; Heravi, A.; Thaheem, M.; Maqsoom, A. Inspecting Buildings Using Drones and Computer Vision: A Machine Learning Approach to Detect Cracks and Damages. Drones 2022, 6, 5. [Google Scholar] [CrossRef]
  220. Alipour, M.; Harris, D.K. Increasing the Robustness of Material-Specific Deep Learning Models for Crack Detection across Different Materials. Eng. Struct. 2020, 206, 110157. [Google Scholar] [CrossRef]
  221. Chen, K.; Reichard, G.; Xu, X.; Akanmu, A. Automated Crack Segmentation in Close-Range Building Facade Inspection Images Using Deep Learning Techniques. J. Build. Eng. 2021, 43, 102913. [Google Scholar] [CrossRef]
  222. Kuo, C.-M.; Kuo, C.-H.; Lin, S.-P.; Manuel, M.; Lin, P.T.; Hsieh, Y.-C.; Lu, W.-H. Infrastructure Inspection Using an Unmanned Aerial System (UAS) With Metamodeling-Based Image Correction. In Proceedings of the ASME 2016 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, Charlotte, NC, USA, 21–24 August 2016; American Society of Mechanical Engineers Digital Collection: New York, NY, USA, 2016. [Google Scholar]
  223. Peng, K.C.; Feng, L.; Hsieh, Y.C.; Yang, T.H.; Hsiung, S.H.; Tsai, Y.D.; Kuo, C. Unmanned Aerial Vehicle for Infrastructure Inspection with Image Processing for Quantification of Measurement and Formation of Facade Map. In Proceedings of the 2017 International Conference on Applied System Innovation (ICASI), Sapporo, Japan, 13–17 May 2017; pp. 1969–1972. [Google Scholar]
  224. Costa, D.B.; Mendes, A.T.C. Lessons Learned from Unmanned Aerial System-Based 3D Mapping Experiments. In Proceedings of the 52nd ASC Annual International Conference, Provo, UT, USA, 13–16 April 2016. [Google Scholar]
  225. Choi, S.; Kim, E. Image Acquisition System for Construction Inspection Based on Small Unmanned Aerial Vehicle. In Advanced Multimedia and Ubiquitous Engineering; Park, J.J.H., Chao, H.-C., Arabnia, H., Yen, N.Y., Eds.; Springer: Berlin/Heidelberg, Germany, 2015; pp. 273–280. [Google Scholar]
  226. Yan, R.-J.; Kayacan, E.; Chen, I.-M.; Tiong, L.; Wu, J. QuicaBot: Quality Inspection and Assessment Robot. IEEE Trans. Autom. Sci. Eng. 2018, 16, 506–517. [Google Scholar] [CrossRef]
  227. Ham, Y.; Kamari, M. Automated Content-Based Filtering for Enhanced Vision-Based Documentation in Construction toward Exploiting Big Visual Data from Drones. Autom. Constr. 2019, 105, 102831. [Google Scholar] [CrossRef]
  228. Heffron, R.E. The Use of Submersible Remotely Operated Vehicles for Inspection of Water-Filled Pipelines and Tunnels. In Proceedings of the Pipelines in the Constructed Environment, San Diego, CA, USA, 23–27 August 1998; pp. 397–404. [Google Scholar]
  229. Longo, D.; Muscato, G. Adhesion Control for the Alicia3 Climbing Robot. In Proceedings of the Climbing and Walking Robots, Madrid, Spain, 22–24 September 2004; Springer: Berlin/Heidelberg, Germany, 2005; pp. 1005–1015. [Google Scholar]
  230. Kuo, C.-H.; Kanlanjan, S.; Pagès, L.; Menzel, H.; Power, S.; Kuo, C.-M.; Boller, C.; Grondel, S. Effects of Enhanced Image Quality in Infrastructure Monitoring through Micro Aerial Vehicle Stabilization. In Proceedings of the European Workshop on Structural Health Monitoring, Nantes, France, 8–11 July 2014; pp. 710–717. [Google Scholar]
  231. González-deSantos, L.M.; Martínez-Sánchez, J.; González-Jorge, H.; Ribeiro, M.; de Sousa, J.B.; Arias, P. Payload for Contact Inspection Tasks with UAV Systems. Sensors 2019, 19, 3752. [Google Scholar] [CrossRef]
  232. Kuo, C.; Kuo, C.; Boller, C. Adaptive Measures to Control Micro Aerial Vehicles for Enhanced Monitoring of Civil Infrastructure. In Proceedings of the World Congress on Structural Control and Monitoring, Barcelona, Spain, 15–17 July 2014. [Google Scholar]
  233. Andriani, N.; Maulida, M.D.; Aniroh, Y.; Alfafa, M.F. Pressure Control of a Wheeled Wall Climbing Robot Using Proportional Controller. In Proceedings of the 2016 International Conference on Information Communication Technology and Systems (ICTS), Surabaya, Indonesia, 12 October 2016; pp. 124–128. [Google Scholar]
  234. Lee, N.; Mita, A. Sensor Agent Robot with Servo-Accelerometer for Structural Health Monitoring. In Proceedings of the Sensors and Smart Structures Technologies for Civil, Mechanical, and Aerospace Systems 2012, San Diego, CA, USA, 12–15 March 2012; Tomizuka, M., Yun, C.-B., Lynch, J.P., Eds.; International Society for Optics and Photonics: Bellingham, WA, USA, 2012; Volume 8345, pp. 753–759. [Google Scholar]
  235. Elbanhawi, M.; Simic, M. Robotics Application in Remote Data Acquisition and Control for Solar Ponds. Appl. Mech. Mater. 2013, 253–255, 705–715. [Google Scholar] [CrossRef]
  236. Huston, D.R.; Esser, B.; Gaida, G.; Arms, S.W.; Townsend, C.P. Wireless Inspection of Structures Aided by Robots. Proc. SPIE 2001, 4337, 147–154. [Google Scholar] [CrossRef]
  237. Carrio, A.; Pestana, J.; Sanchez-Lopez, J.-L.; Suarez-Fernandez, R.; Campoy, P.; Tendero, R.; García-De-Viedma, M.; González-Rodrigo, B.; Bonatti, J.; Gregorio Rejas-Ayuga, J.; et al. UBRISTES: UAV-Based Building Rehabilitation with Visible and Thermal Infrared Remote Sensing. Adv. Intell. Syst. Comput. 2016, 417, 245–256. [Google Scholar] [CrossRef]
  238. Rakha, T.; El Masri, Y.; Chen, K.; Panagoulia, E.; De Wilde, P. Building Envelope Anomaly Characterization and Simulation Using Drone Time-Lapse Thermography. Energy Build. 2022, 259, 111754. [Google Scholar] [CrossRef]
  239. Martínez De Dios, J.R.; Ollero, A.; Ferruz, J. Infrared Inspection of Buildings Using Autonomous Helicopters. IFAC Proc. Vol. 2006, 4, 602–607. [Google Scholar] [CrossRef]
  240. Ortiz-Sanz, J.; Gil-Docampo, M.; Arza-García, M.; Cañas-Guerrero, I. IR Thermography from UAVs to Monitor Thermal Anomalies in the Envelopes of Traditional Wine Cellars: Field Test. Remote Sens. 2019, 11, 1424. [Google Scholar] [CrossRef]
  241. Eschmann, C.; Kuo, C.-M.; Kuo, C.-H.; Boller, C. High-Resolution Multisensor Infrastructure Inspection with Unmanned Aircraft Systems. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2013, XL-1/W2, 125–129. [Google Scholar] [CrossRef]
  242. Caroti, G.; Piemonte, A.; Zaragoza, I.M.-E.; Brambilla, G. Indoor Photogrammetry Using UAVs with Protective Structures: Issues and Precision Tests. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2018, 42, 137–142. [Google Scholar] [CrossRef]
  243. Cocchioni, F.; Pierfelice, V.; Benini, A.; Mancini, A.; Frontoni, E.; Zingaretti, P.; Ippoliti, G.; Longhi, S. Unmanned Ground and Aerial Vehicles in Extended Range Indoor and Outdoor Missions. In Proceedings of the 2014 International Conference on Unmanned Aircraft Systems (ICUAS), Orlando, FL, USA, 27–30 May 2014; pp. 374–382. [Google Scholar]
  244. Causa, F.; Fasano, G.; Grassi, M. Improving Autonomy in GNSS-Challenging Environments by Multi-UAV Cooperation. In Proceedings of the 2019 IEEE/AIAA 38th Digital Avionics Systems Conference (DASC), San Diego, CA, USA, 8–12 September 2019; pp. 1–10. [Google Scholar]
  245. Mantha, B.; Menassa, C.; Kamat, V. Multi-Robot Task Allocation and Route Planning for Indoor Building Environment Applications. In Proceedings of the Construction Research Congress 2018, New Orleans, LA, USA, 2–4 April 2018; American Society of Civil Engineers (ASCE): New Orleans, LA, USA, 2018; pp. 137–146. [Google Scholar]
  246. Mantha, B.R.K.; Menassa, C.C.; Kamat, V.R. Task Allocation and Route Planning for Robotic Service Networks in Indoor Building Environments. J. Comput. Civ. Eng. 2017, 31, 04017038. [Google Scholar] [CrossRef]
  247. Moud, H.I.; Zhang, X.; Flood, I.; Shojaei, A.; Zhang, Y.; Capano, C. Qualitative and Quantitative Risk Analysis of Unmanned Aerial Vehicle Flights on Construction Job Sites: A Case Study. Int. J. Adv. Intell. Syst. 2019, 12, 135–146. [Google Scholar]
  248. Federal Aviation Administration. Part 107 Waiver. Available online: https://www.faa.gov/uas/commercial_operators/part_107_waivers (accessed on 19 June 2022).
  249. OSHA (Occupational Safety and Health Administration). OSHA's Use of Unmanned Aircraft Systems in Inspections. Available online: https://www.osha.gov/dep/memos/use-of-unmanned-aircraft-systems-inspection_memo_05182018.html (accessed on 20 January 2021).
  250. Moud, H.I.; Flood, I.; Shojaei, A.; Zhang, Y.; Zhang, X.; Tadayon, M.; Hatami, M. Qualitative Assessment of Indirect Risks Associated with Unmanned Aerial Vehicle Flights over Construction Job Sites. In Proceedings of the Computing in Civil Engineering 2019, Atlanta, GA, USA, 17–19 June 2019; pp. 83–89. [Google Scholar]
  251. Paes, D.; Kim, S.; Irizarry, J. Human Factors Considerations of First Person View (FPV) Operation of Unmanned Aircraft Systems (UAS) in Infrastructure Construction and Inspection Environments. In Proceedings of the 6th CSCE-CRC International Construction Specialty Conference, Vancouver, BC, Canada, 31 May–3 June 2017. [Google Scholar]
  252. Izadi Moud, H.; Razkenari, M.A.; Flood, I.; Kibert, C. A Flight Simulator for Unmanned Aerial Vehicle Flights Over Construction Job Sites. In Advances in Informatics and Computing in Civil and Construction Engineering; Mutis, I., Hartmann, T., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 609–616. [Google Scholar]
  253. Petukhov, A.; Rachkov, M. Video Compression Method for On-Board Systems of Construction Robots. In Proceedings of the 20th International Symposium on Automation and Robotics in Construction ISARC 2003—The Future Site; Maas, G., Van Gassel, F., Eds.; International Association for Automation and Robotics in Construction (IAARC): Eindhoven, The Netherlands, 2003; pp. 443–447. [Google Scholar]
  254. Schilling, K. Telediagnosis and Teleinspection Potential of Telematic Techniques. Adv. Eng. Softw. 2000, 31, 875–879. [Google Scholar] [CrossRef]
  255. Yang, Y.; Nagarajaiah, S. Robust Data Transmission and Recovery of Images by Compressed Sensing for Structural Health Diagnosis. Struct. Control Health Monit. 2017, 24, e1856. [Google Scholar] [CrossRef]
  256. Li, Z.; Zhang, B. Research on Application of Robot and IOT Technology in Engineering Quality Intelligent Inspection. In Proceedings of the 2020 2nd International Conference on Robotics, Intelligent Control and Artificial Intelligence, Shanghai, China, 17–19 October 2020; pp. 17–22. [Google Scholar]
  257. Mahama, E.; Walpita, T.; Karimoddini, A.; Eroglu, A.; Goudarzi, N.; Cavalline, T.; Khan, M. Testing and Evaluation of Radio Frequency Immunity of Unmanned Aerial Vehicles for Bridge Inspection. In Proceedings of the 2021 IEEE Aerospace Conference (50100), Virtual, 6–20 March 2021; pp. 1–8. [Google Scholar]
  258. Xia, P.; Xu, F.; Qi, Z.; Du, J. Human Robot Comparison in Rapid Structural Inspection. In Proceedings of the Construction Research Congress, Arlington, VA, USA, 9–12 March 2022; American Society of Civil Engineers: Reston, VA, USA, 2022; pp. 570–580. [Google Scholar]
  259. Van Dam, J.; Krasner, A.; Gabbard, J. Augmented Reality for Infrastructure Inspection with Semi-Autonomous Aerial Systems: An Examination of User Performance, Workload, and System Trust. In Proceedings of the 2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Atlanta, GA, USA, 22–26 March 2020; pp. 743–744. [Google Scholar]
  260. Li, Y.; Karim, M.; Qin, R. A Virtual-Reality-Based Training and Assessment System for Bridge Inspectors with an Assistant Drone. IEEE Trans. Hum.-Mach. Syst. 2022, 52, 591–601. [Google Scholar] [CrossRef]
  261. Dam, J.V.; Krasner, A.; Gabbard, J.L. Drone-Based Augmented Reality Platform for Bridge Inspection: Effect of AR Cue Design on Visual Search Tasks. In Proceedings of the 2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Atlanta, GA, USA, 22–26 March 2020; pp. 201–204. [Google Scholar]
  262. Eiris, R.; Albeaino, G.; Gheisari, M.; Benda, W.; Faris, R. InDrone: A 2D-Based Drone Flight Behavior Visualization Platform for Indoor Building Inspection. Smart Sustain. Built Environ. 2021, 10, 438–456. [Google Scholar] [CrossRef]
  263. Cheng, W.; Shen, H.; Chen, Y.; Jiang, X.; Liu, Y. Automatical Acquisition of Point Clouds of Construction Sites and Its Application in Autonomous Interior Finishing Robot. In Proceedings of the 2019 IEEE International Conference on Robotics and Biomimetics (ROBIO), Dali, China, 6–8 December 2019; pp. 1711–1716. [Google Scholar]
  264. Salaan, C.J.; Tadakuma, K.; Okada, Y.; Ohno, K.; Tadokoro, S. UAV with Two Passive Rotating Hemispherical Shells and Horizontal Rotor for Hammering Inspection of Infrastructure. In Proceedings of the 2017 IEEE/SICE International Symposium on System Integration (SII), Taipei, Taiwan, 11–14 December 2017; pp. 769–774. [Google Scholar]
  265. Dorafshan, S.; Thomas, R.J.; Maguire, M. SDNET2018: An Annotated Image Dataset for Non-Contact Concrete Crack Detection Using Deep Convolutional Neural Networks. Data Brief 2018, 21, 1664–1668. [Google Scholar] [CrossRef] [PubMed]
  266. Moud, H.I.; Shojaei, A.; Flood, I.; Zhang, X. Monte Carlo Based Risk Analysis of Unmanned Aerial Vehicle Flights over Construction Job Sites. In Proceedings of the 8th International Conference on Simulation and Modeling Methodologies, Technologies and Applications, Porto, Portugal, 29–31 July 2018; pp. 451–458. [Google Scholar]
  267. Karnik, N.; Bora, U.; Bhadri, K.; Kadambi, P.; Dhatrak, P. A Comprehensive Study on Current and Future Trends towards the Characteristics and Enablers of Industry 4.0. J. Ind. Inf. Integr. 2022, 27, 100294. [Google Scholar] [CrossRef]
  268. Brosque, C.; Galbally, E.; Khatib, O.; Fischer, M. Human-Robot Collaboration in Construction: Opportunities and Challenges. In Proceedings of the 2020 International Congress on Human-Computer Interaction, Optimization and Robotic Applications (HORA), Ankara, Turkey, 26–27 June 2020; pp. 1–8. [Google Scholar]
  269. Salmi, T.; Ahola, J.M.; Heikkilä, T.; Kilpeläinen, P.; Malm, T. Human-Robot Collaboration and Sensor-Based Robots in Industrial Applications and Construction. In Robotic Building; Bier, H., Ed.; Springer International Publishing: Cham, Switzerland, 2018; pp. 25–52. ISBN 978-3-319-70866-9. [Google Scholar]
  270. Reardon, C.; Fink, J. Air-Ground Robot Team Surveillance of Complex 3D Environments. In Proceedings of the 2016 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), Lausanne, Switzerland, 23–27 October 2016; Melo, K., Ed.; IEEE: Piscataway, NJ, USA, 2016; pp. 320–327. [Google Scholar]
  271. Alonso-Martin, F.; Malfaz, M.; Sequeira, J.; Gorostiza, J.F.; Salichs, M.A. A Multimodal Emotion Detection System during Human-Robot Interaction. Sensors 2013, 13, 15549–15581. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  272. Adami, P.; Rodrigues, P.B.; Woods, P.J.; Becerik-Gerber, B.; Soibelman, L.; Copur-Gencturk, Y.; Lucas, G. Effectiveness of VR-Based Training on Improving Construction Workers’ Knowledge, Skills, and Safety Behavior in Robotic Teleoperation. Adv. Eng. Inform. 2021, 50, 101431. [Google Scholar] [CrossRef]
Figure 1. Methodology used for the selection of sources.
Figure 2. Historical frequency of research.
Figure 3. Types of robots used in the literature.
Figure 4. Application domains of robots used for inspection and monitoring.
Figure 5. Types of sensing technologies used with robots for inspection and monitoring.
Table 1. Descriptive statistics of the reviewed sources.

Description	Results
Timespan	1991–2022
Sources (journals, books, etc.)	185
Documents	269
Average years from publication	5.25
Average citations per document	11.13
Average citations per year per document	1.591
References	463
DOCUMENT TYPES
Article	100
Book chapter	6
Conference paper	163
DOCUMENT CONTENTS
Author's Keywords (DE)	515
AUTHORS
Authors	663
Author Appearances	823
Authors of single-authored documents	70
Authors of multi-authored documents	593
AUTHORS COLLABORATION
Single-authored documents	70
Documents per Author	0.406
Authors per Document	2.46
Co-Authors per Document	3.06
Collaboration Index	2.98