Article

Coverage Path Planning with Semantic Segmentation for UAV in PV Plants

by Andrés Pérez-González *, Nelson Benítez-Montoya, Álvaro Jaramillo-Duque and Juan Bernardo Cano-Quintero
Research Group in Efficient Energy Management (GIMEL), Electrical Engineering Department, Universidad de Antioquia, Calle 67 No. 53-108, Medellín 050010, Colombia
* Author to whom correspondence should be addressed.
Appl. Sci. 2021, 11(24), 12093; https://doi.org/10.3390/app112412093
Submission received: 16 November 2021 / Revised: 14 December 2021 / Accepted: 15 December 2021 / Published: 19 December 2021

Abstract:
Solar energy is one of the most strategic energy sources for the world’s economic development. The number of solar photovoltaic plants has therefore increased around the world; consequently, many are installed in places where access and manual inspection are arduous and risky tasks. Recently, the inspection of photovoltaic plants has been conducted with unmanned aerial vehicles (UAV). Such inspections can be flown by a drone operator, either manually or along a flight path previously generated in a ground control station (GCS); however, the path generated in the GCS is subject to many restrictions that the operator must supply. To overcome these restrictions, we present a novel way to generate the flight path automatically with coverage path planning (CPP) methods. A deep learning (DL) server was used to segment the region of interest (RoI) within each of the predefined PV plant images, and three CPP methods were considered and assessed with standard metrics. The UAV energy consumption of each CPP method was evaluated using two different UAVs. Six experiments were performed by varying the CPP width, and the consumption metrics were recorded in each experiment. According to the results, the most effective and efficient methods are the boustrophedon exact cellular decomposition and the grid-based wavefront coverage, depending on the CPP width and the area of the PV plant. Finally, a relationship was established between the size of the photovoltaic plant and the most appropriate UAV and CPP width for the inspection. This could be an important result for low-cost inspection with UAVs without high-resolution cameras on board, particularly in small plants.

1. Introduction

According to REN21, over the past two years, global photovoltaic (PV) plant capacity and annual additions have grown and expanded rapidly; for instance, global installed capacity reached 621 GW in 2019 and 760 GW in 2020 [1], despite the reduction in electricity consumption and the shift in daily demand patterns due to the COVID-19 pandemic [2]. Additionally, solar PV has become one of the most profitable options and is an energy resource whose cost has recently decreased. As a result, solar electricity generation has grown in residential, commercial, and utility-scale projects [3]. The future of PV generation will focus on optimizing hybrid systems [4] and improving the performance of each element of the system, as well as reducing their cost through large-scale production [5]. Furthermore, the sector trend has been asking for low prices, and the competitive market has encouraged investment in solar PV technologies across the entire value chain, particularly in solar cells and modules, to improve efficiencies and reduce the levelized cost of energy (LCOE) [1]. As a result, PV power plants could grow almost sixfold over the next ten years, reaching a cumulative capacity of 2840 GW globally by 2030 and rising to 8519 GW by 2050, according to [6].
Thus, it is established that the number of PV plants and the power they generate have increased around the world. This implies specific technical challenges in their operation and maintenance (O and M) [7]. Several threats affect production, increasing costs and decreasing profitability; the most common are failures in the inverters and PV modules [8]. Due to soiled equipment, environmental conditions, or manufacturing defects in the PV modules, PV plant energy generation can be curtailed by up to 31% in the worst cases [9,10,11].
One must consider that PV plants are commonly installed on roofs, rooftops, canopies, or facades in urban environments, whereas solar farms occupy rural environments such as deserts, plains, and hills [7,12,13]. Depending on the location of the PV plant, manual inspection can be exhausting and take up to 8 h/MW if the number of modules is considerable. The inspection time increases further for solar PV plants on rooftops or canopies because of how they are installed [14]. In addition, inspections to detect threats in the panels must be conducted by trained personnel; in some cases, problems occur in elevated installations, for which special training and certification are required. These jobs can put people and facilities at risk [15].
In recent decades, unmanned aerial vehicles (UAVs) have been increasingly used in inspection and patrol tasks [16,17,18]. UAV-based applications for PV plant inspection have many advantages in comparison with manual inspection methods. The main advantages are flexibility, lower cost, larger area coverage, faster detection, higher precision, and the capacity to perform superior, automatic inspections [11,17,18,19,20,21].
There are many approaches to performing PV module inspection with UAVs. One of them used UAVs with a thermal imaging camera to take photos in the infrared spectrum and evaluated the UAV parameters, such as height, speed, viewing angle, sun reflection, irradiance, and temperature, that are necessary to perform defect inspection [22]. In a second approach, UAVs were used to inspect different solar PV plants, analyzing the correlation between altitude and pixel resolution to detect PV panel defects; features of a particular defect, such as its shape, size, location, and color, were also extracted [7,23,24,25,26]. In a third approach, the authors proposed PV plant fault inspection with UAVs using image mosaics combined with orthophotography techniques to create a digital map; image mosaicking combines multiple visible and/or infrared images into a single mosaic covering a large area [14,27,28], whereas orthophotography produces vertical photographs that show objects in their true planimetric position [29]. These two techniques were integrated with previous works to obtain an advanced tool for monitoring and for taking action in the operation and maintenance (O and M) of PV plants [7]; in short, this tool has been used to detect defects and dust or dirt on PV modules [28]. Apart from this, some approaches detect defects in PV modules using artificial vision techniques, machine learning, deep learning, and combinations of the previous approaches [7,24,30,31,32,33]. In all of these approaches, the UAV flight path was configured from a ground control station (GCS) program. Irrespective of the approach, an important aspect is that the UAV should automatically follow a path that covers the important points of the PV plant. Many research efforts have addressed the waypoint planning problem for UAV inspection of solar PV plants [34,35,36,37], but none of them propose coverage path planning (CPP) as a method to complete this task.
Given a region of interest (RoI) in a 2D environment, CPP consists of calculating a path that passes through every point of the desired environment while respecting the robot’s motion constraints [38,39]. CPP is a classical NP-hard problem in the field of computational complexity. These problems were initially analyzed for indoor environments with mobile robots; with the development of GPS, CPP began to be used for UAV missions. Due to the environment in which the task is performed and the obstacles present, precise localization is an arduous task, which makes CPP a difficult problem [40]. CPP is a motion-planning subtopic in robotics with two families of approaches, heuristic and complete. In heuristic approaches, the robots follow a set of rules defining their behavior, but full coverage is not guaranteed. Complete approaches, in contrast, guarantee coverage by using cellular decomposition of the area, which discretizes the space into cells to simplify the coverage of each sub-region. Another important issue mentioned by the authors of reference [39] is the flight time needed to fully cover the area, which can be reduced by using multiple robots and by reducing the number of turning maneuvers. Finally, the available RoI information is important: several approaches assume prior knowledge of the RoI (offline), while sensor-based approaches collect this information in real time during coverage (online) [41].
In the literature, CPP approaches are needed in several application areas, such as floor cleaning [42], agriculture [43,44], wildfire tracking [45], bush trimming [46], power line inspection [47], photogrammetry [39], visual inspection [48], and many more. Additionally, many surveys on CPP present approaches and techniques for performing missions with, mostly, land vehicles [41,49,50]. The research interest in aerial robots (indoor and outdoor) has further motivated research on CPP [51], which can be implemented on many UAV platforms, such as fixed-wing, rotary-wing, and hybrid (VTOL) UAVs [39]. Rotary-wing UAVs are inexpensive and have good maneuverability, but their small payload capacity limits the weight of on-board sensors and the flight time; hence, they are more suitable for small-scale CPP missions. The increasing use of UAVs in complicated missions has made CPP a very active research area for single and multiple UAVs, especially recently [39,52,53,54]. As evidenced in a previous work [43], the classic taxonomy classifies UAV coverage paths into no decomposition, exact cellular decomposition, and approximate cellular decomposition. The first performs the coverage with a single UAV and requires no decomposition technique, because the RoI has a non-complex geometry. The second divides the free space into simple, non-overlapping regions called cells, whose union fills the free space completely; cells without obstacles are easy to cover with simple robot movements. The third is grid-based: the environment is represented as a collection of uniform square cells [55], considering rectangular, concave, and convex polygons as RoIs. In addition, CPP performance was assessed with the metrics applied in [39]. The elementary approach most used to solve the offline CPP problem is to decompose the area into non-overlapping sub-regions [56], determine the appropriate visiting sequence of the sub-regions, and cover each decomposed region with a back-and-forth movement to secure a complete coverage path. As a result, the methods for obtaining complete coverage of an RoI are the exact and approximate cellular decomposition methods [57,58,59].
On the other hand, image processing helps to obtain a map of the robot’s environment or the RoI. Robots such as UAVs need to know the RoI before commencing CPP [60]; the RoI represents where the PV plants are located and can be determined by a process called boundary extraction [61,62]. Deep learning (DL) image segmentation, also known as semantic segmentation [63], is achieved by applying deep convolutional neural networks (CNN), such as the U-Net model [64,65] or the FCN model [66], which dramatically enhance the segmentation results. Once the segmentation is done and the mask, or RoI, is obtained, the GCS calculates the CPP that guides the UAV in the automatic plant inspection, during which it captures images of the PV plant [67]. Most failures occur at the centimeter or millimeter level, which poses a challenge for the inexpensive sensors available today [68]. Tests of coverage paths can be conducted with drones in a real or in a simulated environment; the most viable option for this stage of the work is simulation, as verified by other research [69,70,71,72]. Simulation has long been recognized as an important research tool: initially an academic instrument, it has reached new levels with the advancement of computers and now supports design, planning, analysis, and decision making in many areas of research and development. Robotics, as a modern technology, is no exception [73,74].
This work is focused on implementing the best strategy of coverage path planning (CPP) over PV plants with UAVs using semantic segmentation in a deep learning server to obtain the RoI. The experimental results were obtained by simulating the CPP methods and using UAVs, such as the 3DR Iris and Typhoon aerial robots.
The key contributions of this work are as follows:
  • This work proposes CPPs as a novel strategy for conducting an inspection flight over a PV plant with a UAV, since there are no previous reports of such work.
  • This work used three CPP methods over three PV plants, which were modeled in a simulation environment to evaluate metrics and parameters. As a result, a relationship was found between the CPP width and energy consumption and, based on it, the best CPP method to implement.
  • This work proposes a hybrid CPP method that uses image processing and a DL server to find the RoI quickly and accurately, becoming a semi-automatic process.
  • A free simulation tool is provided, with an interface to simulate the inspection of PV plants with UAVs.
This paper is structured as follows. In Section 2, necessary definitions and the techniques used to obtain the results are described. In Section 3, the three CPP methods implemented are compared to show relationships among the CPP width, number of maneuvers, and energy consumption, with the aim of finding the best CPP method to implement. Finally, in Section 4, some conclusions are given.

2. Materials and Methods

In this work, three PV power plants were selected that met the image requirements of no light distortions, non-complex geometry, and grouped panels. These plants are in different parts of the world. The first PV plant has an area of 35,975 square meters, is located in Brazil (−22.119948323621525, −51.44666423682603), and is known as Usina Solar Unioeste 1 [75]. The second PV plant has an area of 25,056 square meters, is located in Iran (34.0504329771808, 49.796635265177294), and is known as the Arak power plant [37]. The third plant is in the United States (38.55989816199527, −121.42374978032115); it has an area of 1344 square meters and is part of a PV plant located on the roof of the California State University Sacramento Library (CSUSL) [76].
These plants were subjected to a series of processing stages, as shown in Figure 1. The RoIs were obtained from Google Maps satellite images with a predefined altitude and limits, from which the image (input image) of the desired PV system was obtained as the first stage, shown in Figure 1. In the second stage, the image was entered into a DL server that was developed in this work, taking into account the previous work [65]. The DL server was launched with TensorFlow [77] and Flask [78]. This stage obtains a mask (image segmentation) of the PV plant, as shown in Figure 1. In the third stage, a series of OpenCV functions (post-processing) was applied, as referenced in Algorithm A1 in Appendix A, to adjust the mask (output mask) to the PV plant area, as seen in Figure 1. The output mask, or RoI, was then introduced into the GUI interface, where the CPP method was selected and executed internally. Later, the path GPS positions (CPP computed) were sent to the UAV through MAVLink commands [79,80]. The UAV executed these commands in the Gazebo platform (CPP simulation) and, at the same time, the simulation data were fed back to the GUI interface. Simultaneously, all GPS points reached by the UAV were drawn in the GCS platform (CPP); in this stage, the trajectory was validated. Each stage is described in greater detail in the following sections.

2.1. Deep Learning (DL) Server for Segmentation

A deep learning (DL) server for segmentation was necessary to extract the RoIs from the Google Maps images. The server performs this task automatically with the process called semantic segmentation, wherein each pixel is labeled with the class of its enclosing object [65,66]. In previous work, a convolutional neural network was proposed using a public database; the data were prepared and resized for training and assessed with two network structures, and the U-Net network had the best performance, in terms of metrics, in the semantic segmentation task [65]. A DL server was then employed to perform the image segmentation and obtain the RoI [81].
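As an illustration of how such a service can be exposed, the following is a minimal sketch of a Flask endpoint wrapping a trained segmentation model in TensorFlow; the model file name, input size, and output layout are assumptions for the example and are not taken from the authors’ server.

```python
# Minimal segmentation service sketch: POST an RGB image, receive a binary PV mask.
import io
import numpy as np
import tensorflow as tf
from flask import Flask, request, send_file
from PIL import Image

app = Flask(__name__)
# Assumed file name and Keras format for the trained U-Net; not the authors' artifact.
model = tf.keras.models.load_model("unet_pv.h5", compile=False)

@app.route("/segment", methods=["POST"])
def segment():
    img = Image.open(request.files["image"].stream).convert("RGB")
    x = np.asarray(img.resize((256, 256)), dtype=np.float32) / 255.0   # assumed input size
    prob = model.predict(x[np.newaxis])[0, :, :, 0]                    # per-pixel PV probability
    mask = (prob > 0.5).astype(np.uint8) * 255                         # threshold to a binary mask
    buf = io.BytesIO()
    Image.fromarray(mask).resize(img.size).save(buf, format="PNG")     # back to the input resolution
    buf.seek(0)
    return send_file(buf, mimetype="image/png")

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```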

2.2. Post Processing

In this step, a set of OpenCV functions was applied with Python 3.7. First, morphological operators such as “Erode” and “Dilate” were applied to the images; then the “FindContours” function was used to extract their contours. A contour can be defined as a curve that joins all the continuous points along the boundary of the PV installation. The “ContourArea” function was then used to compute the area of each contour, and this area was compared against a threshold of 400 to keep the larger regions and eliminate the small areas belonging to false positives. Next, the “ApproxPolyDP” function was used to approximate the shape of each contour by another shape with fewer vertices, and the “DrawContour” function was used to draw the resulting contour [82,83]. Finally, morphological operators were applied once more to expand the known area and compensate for the limitations of the mask with regard to the CPP method and for some faulty occurrences caused by false positives of the DL server. The pseudocode of the OpenCV functions used is shown in Algorithm A1 in Appendix A.
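The following is a minimal sketch of this post-processing sequence, assuming OpenCV 4 or later and a binary uint8 mask from the DL server; the 400 area threshold follows the description above, while the kernel size and the use of a final dilation to enlarge the RoI are illustrative choices rather than the authors’ exact settings.

```python
import cv2
import numpy as np

def postprocess_mask(mask, min_area=400, kernel_size=5):
    """mask: binary uint8 image from the DL server; returns the cleaned RoI mask."""
    kernel = np.ones((kernel_size, kernel_size), np.uint8)
    # Morphological cleanup: erode then dilate (an "opening") removes speckle noise.
    clean = cv2.dilate(cv2.erode(mask, kernel), kernel)
    # External contours of the remaining blobs (OpenCV >= 4 return signature).
    contours, _ = cv2.findContours(clean, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    roi = np.zeros_like(mask)
    for c in contours:
        if cv2.contourArea(c) < min_area:        # discard small false-positive regions
            continue
        eps = 0.01 * cv2.arcLength(c, True)      # simplify the contour to fewer vertices
        poly = cv2.approxPolyDP(c, eps, True)
        cv2.drawContours(roi, [poly], -1, 255, thickness=cv2.FILLED)
    # A final dilation slightly enlarges the RoI to compensate for under-segmentation.
    return cv2.dilate(roi, kernel)
```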

2.3. 2D Coverage Path Planning Method in the GUI Interface

Many methods exist in the CPP literature for solving the coverage problem. In this work, three CPP-based methods were selected, considering criteria such as execution time and ease of implementation, and were used to cover the RoI. The methods were selected according to [38,39].
The first was the boustrophedon exact cellular decomposition (BECD), proposed by [84] and delineated in Figure 2.
The second was grid-based spanning tree coverage (GBSTC), which works with cellular decomposition, first proposed by [85], and depicted in Figure 3.
The third method selected for this project was grid-based wavefront coverage (GBWC), first proposed by [86]. The method is illustrated in Figure 4. Each of these CPP methods is explained in more detail in the following sections.
(a). The boustrophedon exact cellular decomposition (BECD) method: This method takes the robot’s free space and obstacles and splits them into cells. These cells are covered by the robot using a back-and-forth pattern from the initial point to the final point, with 90-degree maneuvers to change direction from south to north or vice versa, as shown in Figure 2. This method improves on the trapezoidal decomposition technique: it exploits the structure of the polygon to determine where an obstacle starts and ends, and is thus able to divide the free space into fewer cells that do not require redundant passes, while also permitting the coverage of curved areas [84].
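A minimal sketch of the back-and-forth sweep that underlies this method is shown below; it works directly on a binary RoI mask and omits the cell decomposition step, so it illustrates the coverage pattern rather than the full BECD implementation.

```python
import numpy as np

def boustrophedon_path(roi, spacing_px):
    """roi: 2D uint8/bool array, nonzero inside the PV plant; returns (row, col) waypoints."""
    path = []
    forward = True
    for col in range(0, roi.shape[1], spacing_px):
        rows = np.flatnonzero(roi[:, col])                  # free cells on this sweep line
        if rows.size == 0:
            continue
        ends = (rows.max(), rows.min()) if forward else (rows.min(), rows.max())
        path.extend([(ends[0], col), (ends[1], col)])       # one straight pass along the column
        forward = not forward                               # 90-degree turn at each end
    return path
```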
(b). Grid-based spanning tree coverage (GBSTC): This method is based on approximate cell decomposition and differs from the previous method in the following postulates. First, the method divides the space into grid cells of side L. Second, the robot only moves in directions perpendicular to the sides of the grid. Third, every grid cell is subdivided into four sub-cells of side L/2. Finally, GBSTC discards cells that are partly occupied by obstacles. Considering these postulates, the method consists of several stages. In the first stage, a graph structure S(N, E) is defined, where N are the nodes, defined as the central points of the grid cells, and E are the edges, defined as the line segments connecting the centers of adjoining cells, as shown in Figure 3a. In the second stage, the method builds a spanning tree for S and uses this tree to plan a coverage path as follows: starting in cell I, on the sub-grid of side L/2, the robot travels between adjoining sub-cells along a path that circumnavigates the spanning tree at a constant distance in a counterclockwise direction, finishing when the initial sub-cell, I, is reached again, which is also the final point, F [85]. An example of this method is illustrated in Figure 3b. The quality of the approximation depends on the side length, L, of the grid.
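The sketch below illustrates the first two ingredients of GBSTC under the stated postulates: building the coarse grid of fully free cells and a spanning tree over them with a depth-first search; the final circumnavigation of the tree on the L/2 sub-grid is omitted for brevity.

```python
import numpy as np

def grid_spanning_tree(roi, cell_px):
    """roi: binary mask; cell_px: grid side L in pixels; returns spanning-tree edges."""
    h, w = roi.shape
    # A cell is "free" only if it lies fully inside the RoI (partially occupied cells are discarded).
    free = {(r, c)
            for r in range(0, h - cell_px + 1, cell_px)
            for c in range(0, w - cell_px + 1, cell_px)
            if np.all(roi[r:r + cell_px, c:c + cell_px])}
    if not free:
        return []
    start = next(iter(free))
    tree_edges, visited, stack = [], {start}, [start]
    while stack:                                             # iterative DFS over 4-connected cells
        r, c = stack.pop()
        for dr, dc in ((cell_px, 0), (-cell_px, 0), (0, cell_px), (0, -cell_px)):
            nxt = (r + dr, c + dc)
            if nxt in free and nxt not in visited:
                visited.add(nxt)
                tree_edges.append(((r, c), nxt))             # edge of the graph S(N, E)
                stack.append(nxt)
    return tree_edges
```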
(c). Grid-based wavefront coverage (GBWC): The first grid-based method proposed for CPP, GBWC is an offline method that uses a grid representation and produces a complete coverage path. The method requires an initial cell, I, and a final cell, F. A distance transform that propagates a wavefront from the final to the initial point assigns a specific number to each cell of the grid: the method first assigns zero to the final cell and then one to all of its surrounding cells; all unmarked cells adjoining those marked one are then numbered two, and the process is repeated incrementally until the wavefront reaches the initial cell [86], as illustrated in Figure 4a.
Once the distance transform is determined, a coverage path can be found by starting at the initial cell, I, and repeatedly selecting the adjoining cell with the highest number that has not yet been explored. If two or more unexplored adjoining cells share the same number, one of them is selected at random, as shown in Figure 4b.
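A compact sketch of both GBWC steps, the wavefront distance transform and the greedy descent over it, is given below; ties are broken arbitrarily and the backtracking needed to reach isolated uncovered cells is omitted.

```python
from collections import deque

def wavefront_coverage(free_cells, start, goal):
    """free_cells: set of (row, col) grid cells; start/goal: cells in that set."""
    nbrs = lambda c: [(c[0] + dr, c[1] + dc)
                      for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))]
    # 1) Distance transform: propagate the wavefront outward from the goal cell.
    dist, queue = {goal: 0}, deque([goal])
    while queue:
        cell = queue.popleft()
        for n in nbrs(cell):
            if n in free_cells and n not in dist:
                dist[n] = dist[cell] + 1
                queue.append(n)
    # 2) Coverage: from the start, always step to the unvisited neighbour with the largest label.
    path, visited, cell = [start], {start}, start
    while True:
        candidates = [n for n in nbrs(cell) if n in dist and n not in visited]
        if not candidates:
            break            # a full implementation would backtrack to any uncovered cells
        cell = max(candidates, key=dist.get)
        visited.add(cell)
        path.append(cell)
    return path
```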

2.3.1. Metrics

The metrics evaluate the performance of the three CPP methods. Such assessment can be performed by considering five commonly used metrics to evaluate the effectiveness of the proposed CPP methods both theoretically and in simulation (dynamically) [7,41,43,62]. These five metrics are covered path length, flight time, energy consumption, redundancy of points traveled, and percentage of coverage of the total area. Each of these metrics is described below.
The L_ct metric (covered path length) is the length of the entire path covered by the UAV from the initial to the final point. For a trajectory in a 2D plane composed of n points, with initial point (x_1, f(x_1)) and end point (x_n, f(x_n)), L_ct can be computed as shown in Equation (1):
L_{ct} = \sum_{i=1}^{n-1} \sqrt{(x_{i+1} - x_i)^2 + (f(x_{i+1}) - f(x_i))^2}   (1)
where (x_i, f(x_i)), with i = 1, 2, ..., n, represents the n points of the UAV flight path in 2D coordinates. More details can be found in [87].
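Equation (1) translates directly into code; the following helper, written for illustration, sums the straight-line segments between consecutive waypoints.

```python
import math

def covered_path_length(points):
    """points: list of (x, y) waypoints in metres; returns L_ct in metres."""
    return sum(math.hypot(x2 - x1, y2 - y1)
               for (x1, y1), (x2, y2) in zip(points, points[1:]))
```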
The flight time metric of the PX4 SITL Gazebo model is the time required to travel the total flight path (takeoff, path travel, and landing) with a dynamic speed that accounts for inertia and for the variation in speed due to UAV turn angles. These data are collected through sensors in the Gazebo plugins [88].
The redundancy of points traveled, R_%, corresponds to the number of points visited more than once relative to the total number of points contained in the path, Equation (2):
R_{\%} = \frac{P_{vmo}}{P_{pc}}   (2)
where P_pc and P_vmo correspond, respectively, to the number of points the trajectory contains and the number of points visited more than once.
The percentage of coverage of the total area, C_%, measures the number of points effectively covered relative to the total number of points contained in the area to be covered, given by Equation (3):
C_{\%} = \frac{P_{v}}{P_{T}}   (3)
where P_v and P_T correspond, respectively, to the total number of points visited and the total number of points in the area.
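The two ratios can be computed as in the sketch below, assuming the path is snapped to grid cells so that revisits and coverage can be counted cell by cell; the exact counting used in the experiments may differ, and both values are expressed here as percentages.

```python
from collections import Counter

def redundancy_pct(path_cells):
    """Equation (2): share of path cells visited more than once, as a percentage."""
    counts = Counter(path_cells)
    revisited = sum(1 for c in counts if counts[c] > 1)   # P_vmo: cells visited more than once
    return 100.0 * revisited / len(path_cells)            # P_pc: cells contained in the path

def coverage_pct(path_cells, area_cells):
    """Equation (3): share of the area's cells actually visited, as a percentage."""
    visited = set(path_cells) & set(area_cells)           # P_v: area cells actually visited
    return 100.0 * len(visited) / len(area_cells)         # P_T: total cells in the area
```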
Moreover, the number of maneuvers metric, which is the number of turning maneuvers the UAV performs on a path, is often used as the main performance metric in coverage [89,90].
The energy consumption metric is computed from the voltage and current data from the power module of the PX4 SITL model [91]. Its value depends on parameters, such as the CPP width between lines and the speed and the height of the UAV at the time of implementing the CPP, which were configured in the interface and were simulated in Gazebo.
To validate the results of the methods described above, the BECD, GBSTC, and GBWC methods were implemented in two UAVs, simulated in Gazebo, and their performances were assessed by the metrics presented above [39]. The next section describes the results and compares the models in detail.

2.4. Simulation and Validation Platform

Based on [92], the Gazebo platform was selected to execute the simulation experiments, as it has extensive documentation on its webpage. In addition, it is the most mentioned and used simulation platform in previous work [73,74] that implemented path planning or CPP, used UAV sensors, or deployed several UAVs [92]. Gazebo also allows the modeling of commercial UAVs using the PX4 autopilot software [91], as shown in Figure 5. In addition, this figure shows one of the experiments conducted with the Typhoon UAV, flown over the CSUSL plant. The UAV sonar sensor, represented by blue lines, is also shown, as is the UAV camera, in the box at the upper right.
The integration of Gazebo [92], PX4 [91], Python, and the CPP methods was implemented using ROS (Robot Operating System) as middleware [93]. This tool allows communication among nodes; the nodes are processes, and each node has an associated task, such as sending a MAVLink command to control the UAV trajectory or reading messages from the UAV to discern its in-flight status, using a simulation mode referred to as software in the loop (SITL). This simulator provides the ability to run different vehicles, such as a plane, copter, or rover, without the need for any microcontroller or hardware [94]. In addition, two PX4 autopilot rotary-wing UAVs, the 3DR Iris and the Typhoon, were chosen because they have good maneuverability and are more suitable for small-scale CPP missions. Furthermore, these UAVs have been widely used in other research [67,70,95].
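As an illustration of the kind of MAVLink traffic involved, the following sketch uses pymavlink to stream position setpoints to a PX4 SITL instance; the UDP port, the omitted arming/offboard-mode handling, and the pacing between setpoints are assumptions for the example, and this is not the code of the GUI described above.

```python
import time
from pymavlink import mavutil

def send_waypoints(waypoints, alt_m=20.0):
    """waypoints: list of (lat, lon) in degrees, e.g. produced by the GUI from the RoI."""
    master = mavutil.mavlink_connection("udp:127.0.0.1:14540")   # assumed SITL endpoint
    master.wait_heartbeat()                                      # wait until the autopilot is seen
    for lat, lon in waypoints:
        master.mav.set_position_target_global_int_send(
            0, master.target_system, master.target_component,
            mavutil.mavlink.MAV_FRAME_GLOBAL_RELATIVE_ALT_INT,
            0b0000111111111000,                                  # type mask: use position fields only
            int(lat * 1e7), int(lon * 1e7), alt_m,
            0, 0, 0, 0, 0, 0, 0, 0)                              # velocities, accelerations, yaw unused
        time.sleep(2.0)                                          # crude pacing between setpoints
```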
The GCS software (QGroundControl) was selected to validate the CPP calculated for each UAV and PV plant because it is the tool most compatible with the PX4 autopilot and is recommended on the PX4 webpage. On the other hand, a complementary GUI was designed using Qt to modify and vary the parameters and to convert the waypoints from RoI pixels to geo-referenced points [96].

3. Results and Discussion

The procedure described above (DL server, post-processing, CPP, and metrics evaluation), represented in Figure 1, was applied to three different PV plant images. The results obtained are described in the following sections.

3.1. Results with the DL Server and OpenCV Functions

OpenCV functions and a DL server were combined to accurately extract the mask and to refine the DL server results over the PV plant area. The CPP methods then used the mask as the region of interest (RoI).
The stages to obtain the RoI for the three images are shown in Figure 6, Figure 7 and Figure 8. In the first stage, a high-resolution image of predefined height and width was obtained from Google Maps and used as the input image for the DL server, as depicted in Figure 6a. In the second stage, the output of the DL server was the mask shown in Figure 6b. In the third stage, the opening function was applied to the mask, Figure 6c. In the fourth stage, the RoI was obtained using the draw-contour method, as seen in Figure 6d. Finally, the RoI was blended with the input image to compare the results, Figure 6e. The results were satisfactory and can be adapted depending on the environment.

3.2. Results of the CPP Method

The CPP method results were obtained from six experiments that evaluated the five metrics previously described. Each experiment was conducted by selecting a PV plant and a UAV, then choosing the CPP method and the width, speed, and height of the UAV over the flight path; all of this was done in the implemented GUI interface. The six tests were conducted by varying the CPP width parameter while keeping the other parameters and experimental conditions constant.
For these experiments two UAVs were selected, the Typhoon [97] and 3DR Iris [98], as mentioned in Section 2.4. Three simulated PV plants were also chosen, as highlighted in Section 2. The metrics referenced in Section 2.3.1 were assessed for each test. Battery consumption was obtained from the GCS software with the SITL parameter activated. The rest of the data was collected from the Gazebo simulations. The experiments are explained in detail in the following sections, and some results are discussed.

3.2.1. The First Three Experiments with the Typhoon UAV

The first three CPP experiments were conducted with a Typhoon UAV, varying the CPP width between 0 and 20 m depending on the PV plant being covered. For the first and second PV plants, Unioeste 1 and Arak, the CPP widths were varied between 5 and 20 m; for the third plant, CSUSL, the width was varied between 1 and 8 m, because the methods cannot run in small areas with a large width. The resulting metrics of each experiment were recorded in the tables shown in Appendix B.
The logged metrics were used to draw clustered column-and-line charts for Experiments 1, 2 and 3, performed with the Typhoon UAV, to highlight the most important information and to determine the correlations among the metrics. In these graphs, the vertical axis is scaled logarithmically and the horizontal axis is labeled with the CPP method in use; in this way, it is possible to visualize the correlation of flight time, number of maneuvers, and covered path length with the energy consumption of each CPP method, as shown in Figure 9, Figure 10 and Figure 11. In addition, the relationship between the redundancy metric and the CPP width is identified, showing that this metric is higher for BECD but does not affect energy consumption, which is the primary metric of interest.
The UAVs’ energy consumption for each CPP width was obtained from the energy consumption logs, and Table 1 was composed of three large columns corresponding to the assessed PV plants; each PV plant column contains the three CPP methods with their respective values of consumed energy, and the rows contain the CPP widths in increasing order from top to bottom. It can be observed that some values, such as the energy consumption for a given CPP width, depend on the size of the RoI. Additionally, the greater the CPP width, the lower the energy consumption, as seen in the columns of Table 1. For example, for the BECD CPP in Unioeste 1 with a width of 5 m, the percentage of energy consumed was 98%, whereas for a width of 20 m the energy consumed was 29%. It is also observed that, in the larger PV plants, the UAV consumed all of its energy for CPPs with narrow widths, whereas in a small plant, with the same width, the UAV does not have energy consumption problems. This can be seen in the row with a width of 8 m: for the Unioeste 1 and Arak PV plants, a lot of energy was consumed, between 88% and 52%, unlike the CSUSL plant, where the energy consumed was very little, between 7% and 5%. Experiments in which the energy consumption could not be observed are annotated with (N/A); this happened when the CPP width was very large with respect to the RoI, which prevents the route from being generated, or when the CPP width was very small with respect to the RoI, producing a flight path in which the UAV consumes all of its energy. In short, the UAV used can be undersized or oversized with respect to its intended PV plant.
The graphs in Figure 12 were constructed from Table 1 by quintic polynomial interpolation, which requires six data points to form a curve that passes through all given points [99]; the abscissa is the CPP width and the ordinate is the energy consumption. For the experiment conducted at the Unioeste 1 PV plant, Figure 12a shows that the BECD method had the lowest energy consumption when the CPP width was in the range of 5 to 16 m, while GBSTC and GBWC had lower energy consumption when the CPP width was in the range of 16 to 20 m. For the experiment at the Arak PV plant, Figure 12b shows that BECD had the lowest energy consumption when the CPP width was in the range of 5 to 10 m, GBSTC and GBWC had similar energy consumption in the range of 10 to 15 m, and GBWC had the lowest energy consumption in the range of 15 to 20 m. For the tests conducted at the CSUSL PV plant, Figure 12c shows that BECD had the lowest energy consumption when the CPP width was in the range of 1 to 5.5 m, while GBSTC had the lowest energy consumption in the range of 5.5 to 8 m. All plants were recreated in the simulation environment with their real dimensions.
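The interpolation itself is straightforward: with six samples, a degree-5 polynomial fit reproduces every point exactly, as in the sketch below, whose sample values are illustrative rather than taken from Table 1.

```python
import numpy as np

widths = np.array([5.0, 8.0, 10.0, 12.0, 15.0, 20.0])    # CPP widths (m), illustrative
energy = np.array([98.0, 85.0, 60.0, 48.0, 38.0, 29.0])  # energy consumed (%), illustrative
coeffs = np.polyfit(widths, energy, deg=5)                # six points -> exact quintic interpolant
curve = np.poly1d(coeffs)
print(curve(16.0))                                        # interpolated consumption at a 16 m width
```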

3.2.2. The Last Three Experiments with the 3DR Iris UAV

The last three CPP experiments were conducted with the 3DR Iris UAV, varying the CPP width between 1 and 20 m depending on the PV plant covered. For the first and second PV plants, Unioeste 1 and Arak, the CPP widths were varied between 8 and 20 m; for the third PV plant, CSUSL, the width was varied between 1 and 8 m, because the methods cannot run in small areas with a large width. The resulting metrics of each experiment were recorded in the tables shown in Appendix C.
As in the previous section, the recorded metrics were used to create column-and-line charts for Experiments 4, 5 and 6, conducted with the 3DR Iris UAV, to highlight the most important information and to determine correlations among the metrics. In these graphs, the axes were scaled and labeled in the same manner as above, with the aim of visualizing the correlations between flight time, number of maneuvers, and covered path length with regard to the energy consumption of each CPP method, as shown in Figure 13, Figure 14 and Figure 15. Furthermore, the relationship between the redundancy metric and the CPP width shows behavior similar to that observed with the previous UAV.
The UAV energy consumption and CPP widths were obtained from the energy consumption logs, and Table 2 was established with three large columns corresponding to the PV plants; each PV plant column contains the three CPP methods with their respective values of consumed energy, and the rows contain the CPP widths in increasing order from top to bottom. As in the previous table, some values, such as the energy consumption for a given CPP width, depend on the size of the RoI. It can also be observed that the greater the CPP width, the lower the energy consumption, as seen in the columns of Table 2. For example, using BECD at Unioeste 1 with a width of 10 m, the percentage of energy consumed was 100%, while for a width of 20 m the energy consumed was 48%. It is also observed that, for the larger PV plants, the UAV consumed all of its energy in a CPP with a small width, whereas for small plants, with the same width, the UAV did not have energy problems. This can be seen in the row corresponding to a CPP width of 8 m: at the Unioeste 1 and Arak plants a lot of energy was consumed, between 100% and 88%, unlike at the CSUSL plant, where the energy consumed was very little, between 9% and 8%. As in the previous experiments, cases in which the UAV’s energy was exhausted, or in which the CPP width did not allow the route to be generated because it was too large with respect to the RoI, were scored with (N/A). To summarize, the UAV used here is suitable only for the last PV plant.
On the other hand, the 3DR Iris UAV, with its design, size, autonomy, and performance, is suitable when the PV plant to be inspected is small, approximately 5000 square meters or less, such as the CSUSL PV plant at 1344 square meters. For such plant sizes, any coverage method used in this work can cover the area with a CPP width of 1 m while minimizing the covered path length, flight time, and maneuvers, thereby obtaining lower energy consumption, as shown in Table 2. This UAV did not perform well at large or medium-sized plants, since covering them with a CPP method requires a width greater than 8 m; therefore, without a high-resolution camera on board, which the UAVs in this work did not carry, the inspection of such plants cannot be guaranteed.
The graphs in Figure 16 were constructed from Table 2 using quintic polynomial interpolation, where the abscissa is the CPP width and the ordinate is the energy consumption.
The graph in Figure 16a shows that, at the Unioeste 1 PV plant, the BECD method had the lowest energy consumption over the entire tested range of CPP widths, while the GBSTC and GBWC methods showed higher energy consumption. The graph in Figure 16b, for the Arak PV plant, shows that BECD had the lowest energy consumption when the CPP width was in the range of 8 to 10 m, that BECD and GBWC had similar energy consumption in the range of 10 to 15 m, with GBWC slightly lower, and that GBWC had the lowest energy consumption in the range of 15 to 20 m. Finally, the graph in Figure 16c, for the CSUSL PV plant, shows that BECD had the lowest energy consumption when the CPP width was in the range of 1 to 3 m, and GBWC had the lowest in the range of 3 to 8 m.
Summary tables of the metric results for each of the tests with the Typhoon UAV and the 3DR Iris UAV are shown in Appendix B and Appendix C. All files and logs for the experiments are available on GitHub at [100].

3.3. Discussion

The proposed strategy provides a semi-automatic and faster solution for achieving effective results when inspecting PV plants in geometrically simple areas, albeit with certain limitations. Some of the results obtained in this work are theoretical, such as those for large CPP widths; in practice, a 14-megapixel camera is not adequate to inspect a PV plant with a CPP width of 20 m [37,101].
The results obtained in this work indicate that the most adequate method is BECD for a specific range of CPP widths, although it also performs adequately for various CPP widths in some RoIs. GBWC showed good performance when using a CPP width greater than 7 m in some RoIs, and all methods showed a relationship with the area to be covered, as shown in Table 1 and Table 2.
An analysis of the metrics shows that redundancy, R_%, is not a significant factor when comparing these three CPP methods: although BECD had the highest redundancy, it was also the method with the lowest energy consumption. On the other hand, the other metrics, such as L_ct, flight time, and the number of maneuvers, are directly related to energy consumption, as can be seen in Appendix B and Appendix C. This also supports the conclusion that the BECD method is more suitable for widths in the range between 0 and 7 m, due to the lower number of maneuvers in this range, as other authors have also noted [102,103]. The percentage of coverage of the total area, C_%, always showed that coverage was complete.
The L_ct metric, flight time, and number of maneuvers were directly related to the energy consumption in all the experiments performed, as seen in Figure 9, Figure 10 and Figure 11 and Figure 13, Figure 14 and Figure 15, and as confirmed by other authors [89,90], further reinforcing that the most appropriate CPP for these ranges is BECD.
The BECD method was the best of the three methods tested in a specific width range, from 0 to 7 m, for all the RoIs tested; this means that, for a 10-megapixel camera with a horizontal field of view of 7 m, the CPP method can obtain images adequate for inspecting a PV plant [23].
The implemented BECD method divides the RoI into small regions and then applies the round-trip coverage pattern, called boustrophedon, over these regions [38,39]. This pattern yields lower energy consumption at small widths, owing to its low number of maneuvers compared with the other methods, as shown in Appendix B and Appendix C.
On the other hand, according to the simulation results, the GBWC method yielded lower energy consumption for widths greater than a certain value that depends on the RoI size. This method also allows approximate coverage outside the RoI, a characteristic that is very important in this type of application but perhaps not very attractive for terrestrial robotics, from which this type of method originates [86].
An RoI with many concave points is a great challenge for the performance metrics, since such points increase the travel distance and the number of maneuvers, and therefore the energy consumption; a more detailed analysis of these characteristics is made in [67,103].
In future work, the methods (BECD, GBWC) could be implemented in UAVs with characteristics similar to those used in this work, shown in Table A1 and Table A5, and an inspection could be conducted in at least one real PV plant. One can also consider implementing an expert system that selects the coverage path between the two methods (BECD, GBWC) according to the CPP width required for a camera with a given resolution and focal length.
Finally, the results obtained with the BECD and GBWC methods differ from the results obtained by other authors [34,35,36,37], who did not implement CPPs to solve these types of problems but used other kinds of solutions that have restrictions when inspecting PV plants with UAVs. In contrast, this work considers CPPs and obtained promising results for future real implementation. The proposed CPP will increase the possibility of using inexpensive UAV systems for the inspection of PV systems on the roofs of houses and commercial buildings, and also of using CPPs with small widths to complete inspections at centimeter scales of the panel, where flaws can be better seen.

4. Conclusions

In this work, a method for implementing CPP in UAVs for PV plant inspection was presented. The method consists of a series of steps, one of which was the deployment of a DL-based U-Net model as a DL service that extracts the boundaries of PV plants from an image. In summary, the method was accurate and fast regardless of the input image, with low request latency and response time.
This experiment focused on three path planning methods for PV inspection missions, in order to find the best path for covering each of the three PV power plants with the lowest energy consumption. A GUI interface was used to command the UAV’s maneuvers in the inspection of the simulated PV plants, and the results of each CPP method in simulation were compared. The best CPP method was BECD for CPP widths in the range of 0 to 7 m. These path planning algorithms can be performed by any multirotor UAV that receives MAVLink commands, can carry a camera sensor, and can transmit real-time video to the GCS.
Performance on the CPP tasks was measured using two different flying robots, a Typhoon UAV and a 3DR Iris UAV. It was shown that the Typhoon UAV (or one with similar characteristics) is better suited for large or medium-sized PV plants (such as those in deserts, plains, and hills), whereas a drone like the 3DR Iris UAV is more suitable for small PV installations (such as those on roofs, rooftops, canopies, and facades). The proposed strategy allows such comparisons to be made and enables the selection of the most suitable UAV for each type of installation.
The values of the metrics collected from each of the tests show a correlation between covered path length, flight time, and number of maneuvers with regard to energy consumption, and suggest that the CPPs implemented in UAVs to inspect photovoltaic plants should behave similarly when implemented in real plants. The results also help to predict the energy consumption of a given UAV when performing a plant inspection.

Author Contributions

Conceptualization, A.P.-G., N.B.-M., Á.J.-D. and J.B.C.-Q.; methodology, A.P.-G.; software, A.P.-G., N.B.-M.; validation, A.P.-G.; formal analysis, A.P.-G.; investigation, A.P.-G., N.B.-M.; resources, A.P.-G., Á.J.-D. and J.B.C.-Q.; data curation, A.P.-G.; writing–original draft preparation, A.P.-G.; writing–review and editing, A.P.-G., N.B.-M., Á.J.-D. and J.B.C.-Q.; visualization, A.P.-G.; supervision, Á.J.-D. and J.B.C.-Q.; project administration, A.P.-G., Á.J.-D. and J.B.C.-Q.; funding acquisition, A.P.-G., Á.J.-D. and J.B.C.-Q. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Colombia Scientific Program within the framework of the so-called Ecosistema Científico (Contract No. FP44842-218-2018).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The models used in the computational experiment are available at GitHub in [100].

Acknowledgments

The authors gratefully acknowledge the support from the Colombia Scientific Program within the framework of the call Ecosistema Científico (Contract No. FP44842-218-2018). The authors also want to acknowledge Universidad de Antioquia for its support through the project “estrategia de sostenibilidad”.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
BECD	boustrophedon exact cellular decomposition
GBSTC	grid-based spanning tree coverage
GBWC	grid-based wavefront coverage
DL	deep learning
UAV	unmanned aerial vehicle
PV	photovoltaic
O and M	operation and maintenance
CAGR	compound annual growth rate
LCOE	levelized cost of energy
List of Symbols
L_ct	covered path length
R_%	redundancy of points traveled
C_%	percentage of coverage of the total area

Appendix A. Algorithm

In this step, a series of OpenCV functions was applied, using Python 3.7 in JupyterLab.
Algorithm A1: OpenCV functions.
[Algorithm A1: OpenCV post-processing pseudocode, presented as an image in the original article.]

Appendix B. Tables of Typhoon UAV

Table A1. Yuneec Typhoon UAV technical specification.
Typhoon UAV
Dimensions: 520 × 457 × 310 mm (20.5 × 18 × 12.2 inches)
Weight: 1980 g (69.8 ounces)
Battery: 5400 mAh 4S / 14.8 V (79.9 Wh)
Camera: 12.4 megapixels, 14 mm/F2.8
Flight Time: up to 25 min
Flight Speed: 20 m/s
Payload Capacity: 10.400 g
Motor: AC YUNH520120
Table A2. Experiment 1 with three CPP methods implemented using the Typhoon UAV over the Unioeste 1 PV plant.
Width (m) | CPP | Redundancy (%) | Distance | Flight Time | Maneuvers | Energy Consumed (%)
5 | BECD | 8.23 | 7591.7 | 31.29 | 392 | 88
8 | BECD | 14.18 | 4751.4 | 19.36 | 68 | 74
10 | BECD | 7.34 | 3615.9 | 14.54 | 45 | 56
12 | BECD | 13.36 | 2987.2 | 12.22 | 39 | 47
15 | BECD | 15.33 | 2388.5 | 10.18 | 28 | 38
20 | BECD | 13.33 | 2388.5 | 10 | 22 | 29
5 | GBSTC | 1.27 | 7154.6 | 43 | 527 | 100
8 | GBSTC | 2.27 | 4444.3 | 22.47 | 178 | 85
10 | GBSTC | 1.83 | 3486.2 | 16.35 | 100 | 62
12 | GBSTC | 2.76 | 2842.1 | 14.19 | 104 | 54
15 | GBSTC | 4.38 | 2288.92 | 11.17 | 74 | 42
20 | GBSTC | 2.67 | 16,655.1 | 8.21 | 54 | 32
5 | GBWC | 3.66 | 7270.8 | 38.41 | 392 | 100
8 | GBWC | 8.51 | 4732.3 | 24.2 | 175 | 88
10 | GBWC | 7.95 | 3707.9 | 16.38 | 83 | 63
12 | GBWC | 11.98 | 3095.2 | 14.35 | 78 | 55
15 | GBWC | 12.41 | 2447.7 | 12.02 | 78 | 45
20 | GBWC | 10.67 | 1779.3 | 8.1 | 39 | 31
Table A3. Experiment 2 with three CPP methods implemented using the Typhoon UAV over the Arak PV plant.
Width (m) | CPP | Redundancy (%) | Distance | Flight Time | Maneuvers | Energy Consumed (%)
5 | BECD | 12.84 | 4612.5 | 21.08 | 125 | 80
8 | BECD | 18.84 | 3021.2 | 13.52 | 74 | 52
10 | BECD | 25.52 | 2339.8 | 7.4 | 62 | 41
12 | BECD | 28.89 | 2029.9 | 9.36 | 62 | 36
15 | BECD | 12.84 | 1547.95 | 7.16 | 35 | 28
20 | BECD | 36.59 | 1126.9 | 5.25 | 25 | 21
5 | GBSTC | 2.83 | 4308.4 | 27.28 | 365 | 94
8 | GBSTC | 3.96 | 2766.8 | 16.28 | 119 | 62
10 | GBSTC | 5.73 | 2089.9 | 12.07 | 119 | 45
12 | GBSTC | 7.41 | 1809.7 | 10.03 | 85 | 37
15 | GBSTC | 8.75 | 1373.7 | 7.4 | 66 | 29
20 | GBSTC | 9.36 | 966.6 | 5.09 | 33 | 20
5 | GBWC | 9.07 | 4497.5 | 28.34 | 390 | 94
8 | GBWC | 4.26 | 2789.6 | 15.01 | 193 | 63
10 | GBWC | 4.69 | 2078.4 | 11.2 | 98 | 43
12 | GBWC | 14.07 | 1922.4 | 9.24 | 58 | 35
15 | GBWC | 6.25 | 1360.42 | 7.02 | 46 | 26
20 | GBWC | 9.76 | 1002.8 | 5.04 | 25 | 19
Table A4. Experiment 3 with three CPP methods implemented using the Typhoon UAV over the CSUSL PV plant.
Width (m) | CPP | Redundancy (%) | Distance | Flight Time | Maneuvers | Energy Consumed (%)
1 | BECD | 8.97 | 1579.4 | 9.47 | 192 | 35
2 | BECD | 20.05 | 814.9 | 5.28 | 86 | 20
3 | BECD | 22.09 | 525.1 | 3.44 | 62 | 14
4 | BECD | 21.11 | 406.1 | 3.04 | 38 | 11
6 | BECD | 42.86 | 296.2 | 2.19 | 27 | 9
8 | BECD | 86.67 | 198.6 | 1.43 | 20 | 7
1 | GBSTC | 1.25 | 1289.9 | 13.1 | 542 | 47
2 | GBSTC | 2.64 | 680.6 | 6.23 | 154 | 23
3 | GBSTC | 4.55 | 448.5 | 4.15 | 88 | 15
4 | GBSTC | 5.56 | 370.23 | 3.23 | 58 | 12
6 | GBSTC | 5.71 | 233.2 | 2.08 | 30 | 8
8 | GBSTC | 40 | 177.3 | 1.38 | 19 | 6
1 | GBWC | 1.37 | 1420.7 | 11.56 | 462 | 43
2 | GBWC | 4.22 | 705.9 | 6.33 | 168 | 24
3 | GBWC | 11.04 | 506.1 | 4.01 | 69 | 15
4 | GBWC | 8.89 | 404.4 | 3.05 | 44 | 12
6 | GBWC | 13.89 | 278.7 | 2.1 | 24 | 8
8 | GBWC | 13.33 | 157.5 | 1.28 | 14 | 5

Appendix C. Tables of 3DR Iris UAV

Table A5. 3D Robotics 3DR Iris UAV technical specification.
3D Robotics 3DR Iris
Dimensions: 10 cm in height, 55 cm motor-to-motor
Weight: 1282 g
Battery: 5100 mAh 3S
Camera: N/A
Flight Time: 15–20 min
Flight Speed: 11 m/s
Payload Capacity: 400 g
Motor: AC 2830, 950 kV
Table A6. Experiment 4 with three CPP methods implemented using the 3DR Iris UAV over the Unioeste 1 PV plant.
Width (m) | CPP | Redundancy (%) | Distance | Flight Time | Maneuvers | Energy Consumed (%)
10 | BECD | 7.34 | 3619.1 | 14.48 | 45 | 96
12 | BECD | 13.36 | 2987.1 | 12.24 | 39 | 80
15 | BECD | 15.33 | 2391.3 | 10.04 | 35 | 64
17 | BECD | 18.52 | 2234.1 | 9.13 | 25 | 60
18 | BECD | 13.79 | 1870.3 | 7.59 | 26 | 52
20 | BECD | 13.33 | 1773.8 | 7.25 | 22 | 48
10 | GBSTC | 1.83 | 3485.5 | 16.39 | 100 | 100
12 | GBSTC | 2.76 | 2827 | 14.12 | 104 | 92
15 | GBSTC | 4.38 | 2280.3 | 11.23 | 74 | 73
17 | GBSTC | 6.38 | 2101.5 | 10.36 | 68 | 68
18 | GBSTC | 4.6 | 1790.4 | 8.52 | 55 | 58
20 | GBSTC | 2.67 | 1651.2 | 8.21 | 54 | 55
10 | GBWC | 7.95 | 3716.5 | 16.29 | 83 | 100
12 | GBWC | 11.98 | 3094.3 | 14.37 | 86 | 95
15 | GBWC | 12.41 | 2442 | 12.05 | 78 | 78
17 | GBWC | 14.81 | 2255.8 | 10.19 | 46 | 67
18 | GBWC | 11.49 | 1883.7 | 8.47 | 43 | 57
20 | GBWC | 10.67 | 1783.3 | 8.18 | 39 | 54
Table A7. Experiment 5 with three CPP methods implemented using the 3DR Iris UAV over the Arak PV plant.
Width (m) | CPP | Redundancy (%) | Distance | Flight Time | Maneuvers | Energy Consumed (%)
7 | BECD | 15.42 | 3292.7 | 15.31 | 80 | 95
8 | BECD | 18.84 | 3020.2 | 14.2 | 74 | 88
10 | BECD | 25.52 | 2339.3 | 12.52 | 62 | 70
12 | BECD | 28.89 | 2030.4 | 9.57 | 50 | 60
15 | BECD | 31.25 | 1548.5 | 7.27 | 35 | 45
20 | BECD | 36.59 | 1133.3 | 5.21 | 25 | 26
7 | GBSTC | 5.73 | 2090.4 | 43 | 527 | 100
8 | GBSTC | 3.96 | 2757.8 | 17.06 | 179 | 98
10 | GBSTC | 5.73 | 2090.4 | 12.36 | 119 | 76
12 | GBSTC | 7.41 | 1809.3 | 10.18 | 85 | 63
15 | GBSTC | 8.75 | 13,674.2 | 8 | 66 | 48
20 | GBSTC | 9.76 | 971 | 5.07 | 33 | 26
5 | GBWC | 3.66 | 7270.8 | 38.41 | 392 | 100
8 | GBWC | 4.69 | 2078.8 | 11.4 | 79 | 82
10 | GBWC | 4.69 | 2078.8 | 11.47 | 98 | 71
12 | GBWC | 14.07 | 1922.7 | 9.43 | 58 | 59
15 | GBWC | 6.25 | 1360.08 | 7.07 | 46 | 43
20 | GBWC | 6.25 | 1360.08 | 7.07 | 46 | 25
Table A8. Experiment 6 with three CPP methods implemented using the 3DR Iris UAV over the CSUSL PV plant.
Width (m) | CPP | Redundancy (%) | Distance | Flight Time | Maneuvers | Energy Consumed (%)
1 | BECD | 10.06 | 1724.8 | 10.39 | 211 | 52
2 | BECD | 15.38 | 978 | 6.12 | 95 | 30
3 | BECD | 19.3 | 597.2 | 4.18 | 62 | 21
4 | BECD | 26.67 | 441.5 | 3.33 | 47 | 17
6 | BECD | 47.22 | 316.7 | 2.3 | 30 | 12
8 | BECD | 66.67 | 197.4 | 1.47 | 20 | 9
1 | GBSTC | 1.77 | 1416.4 | 15.31 | 626 | 72
2 | GBSTC | 3.61 | 697.4 | 7.2 | 220 | 35
3 | GBSTC | 5.85 | 508.6 | 5 | 108 | 24
4 | GBSTC | 5.56 | 373.4 | 3.28 | 65 | 17
6 | GBSTC | 19.44 | 270.1 | 2.22 | 32 | 12
8 | GBSTC | 40 | 198.8 | 1.52 | 19 | 9
1 | GBWC | 3.94 | 1414.9 | 15.3 | 626 | 72
2 | GBWC | 5.53 | 817.4 | 7.36 | 218 | 38
3 | GBWC | 7.65 | 534.3 | 6 | 86 | 22
4 | GBWC | 8.89 | 404.4 | 3.05 | 44 | 15
6 | GBWC | 13.89 | 274.7 | 2.13 | 24 | 11
8 | GBWC | 20 | 184.9 | 1.41 | 16 | 8

References

  1. REN21®. GLOBAL STATUS REPORTS. 2021. Available online: https://www.ren21.net/reports/global-status-report/ (accessed on 14 August 2021).
  2. Alkhraijah, M.; Alowaifeer, M.; Alsaleh, M.; Alfaris, A.; Molzahn, D.K. The Effects of Social Distancing on Electricity Demand Considering Temperature Dependency. Energies 2021, 14, 473. [Google Scholar] [CrossRef]
  3. Mey, A. Most U.S. Utility-Scale Solar Photovoltaic Power Plants Are 5 Megawatts or Smaller-Today in Energy-U.S. Energy Information Administration (EIA). Available online: https://www.eia.gov/todayinenergy/detail.php?id=38272# (accessed on 3 October 2021).
  4. Maghrabie, H.M.; Abdelkareem, M.A.; Al-Alami, A.H.; Ramadan, M.; Mushtaha, E.; Wilberforce, T.; Olabi, A.G. State-of-the-Art Technologies for Building-Integrated Photovoltaic Systems. Buildings 2021, 11, 383. [Google Scholar] [CrossRef]
  5. Hao, P.; Zhang, Y.; Lu, H.; Lang, Z. A novel method for parameter identification and performance estimation of PV module under varying operating conditions. Energy Convers. Manag. 2021, 247, 114689. [Google Scholar] [CrossRef]
  6. IRENA. Future of Solar Photovoltaic: Deployment, Investment, Technology, Grid Integration, and Socio-Economic Aspects. 2019. Available online: https://www.irena.org (accessed on 24 October 2021).
  7. Grimaccia, F.; Leva, S.; Dolara, A.; Aghaei, M. Survey on PV modules common faults after an O and M flight extensive campaign over different plants in Italy. IEEE J. Photovoltaics 2017, 7, 810–816. [Google Scholar] [CrossRef] [Green Version]
  8. Poulek, V.; Šafránková, J.; Černá, L.; Libra, M.; Beránek, V.; Finsterle, T.; Hrzina, P. PV Panel and PV Inverter Damages Caused by Combination of Edge Delamination, Water Penetration, and High String Voltage in Moderate Climate. IEEE J. Photovoltaics 2021, 11, 561–565. [Google Scholar] [CrossRef]
  9. Jamil, W.J.; Rahman, H.A.; Shaari, S.; Salam, Z. Performance degradation of photovoltaic power system: Review on mitigation methods. Renew. Sustain. Energy Rev. 2017, 67, 876–891. [Google Scholar] [CrossRef]
  10. Kaplani, E. PV cell and module degradation, detection and diagnostics. In Renewable Energy in the Service of Mankind Vol II, Selected Topics from the World Renewable Energy Congress WREC, London, UK, 3–8 August 2014; Springer: Cham, Switzerland, 2016; pp. 393–402. [Google Scholar]
  11. Di Lorenzo, G.; Araneo, R.; Mitolo, M.; Niccolai, A.; Grimaccia, F. Review of O and M Practices in PV Plants: Failures, Solutions, Remote Control, and Monitoring Tools. IEEE J. Photovoltaics 2020, 10, 914–926. [Google Scholar] [CrossRef]
  12. Donovan, C.W. Renewable Energy Finance: Funding the Future of Energy; World Scientific Publishing Co. Pte. Ltd.: Singapore, 2020. [Google Scholar]
  13. Narvarte, L.; Fernández-Ramos, J.; Martínez-Moreno, F.; Carrasco, L.; Almeida, R.; Carrêlo, I. Solutions for adapting photovoltaics to large power irrigation systems for agriculture. Sustain. Energy Technol. Assess. 2018, 29, 119–130. [Google Scholar] [CrossRef] [Green Version]
  14. Grimaccia, F.; Leva, S.; Niccolai, A.; Cantoro, G. Assessment of PV plant monitoring system by means of unmanned aerial vehicles. In Proceedings of the 2018 IEEE International Conference on Environment and Electrical Engineering and 2018 IEEE Industrial and Commercial Power Systems Europe (EEEIC/I&CPS Europe), Palermo, Italy, 12–15 June 2018; pp. 1–6. [Google Scholar]
  15. Guerrero-Liquet, G.C.; Oviedo-Casado, S.; Sánchez-Lozano, J.; García-Cascales, M.S.; Prior, J.; Urbina, A. Determination of the Optimal Size of Photovoltaic Systems by Using Multi-Criteria Decision-Making Methods. Sustainability 2018, 10, 4594. [Google Scholar] [CrossRef] [Green Version]
  16. Grimaccia, F.; Aghaei, M.; Mussetta, M.; Leva, S.; Quater, P.B. Planning for PV plant performance monitoring by means of unmanned aerial systems (UAS). Int. J. Energy Environ. Eng. 2015, 6, 47–54. [Google Scholar] [CrossRef] [Green Version]
  17. Shen, K.; Qiu, Q.; Wu, Q.; Lin, Z.; Wu, Y. Research on the Development Status of Photovoltaic Panel Cleaning Equipment Based on Patent Analysis. In Proceedings of the 2019 3rd International Conference on Robotics and Automation Sciences (ICRAS), Wuhan, China, 1–3 June 2019; pp. 20–27. [Google Scholar]
  18. Azaiz, R. Flying Robot for Processing and Cleaning Smooth, Curved and Modular Surfaces. U.S. Patent App. 15/118,849, 2 March 2017. [Google Scholar]
  19. Libra, M.; Daneček, M.; Lešetickỳ, J.; Poulek, V.; Sedláček, J.; Beránek, V. Monitoring of Defects of a Photovoltaic Power Plant Using a Drone. Energies 2019, 12, 795. [Google Scholar] [CrossRef] [Green Version]
  20. Li, X.; Li, W.; Yang, Q.; Yan, W.; Zomaya, A.Y. An unmanned inspection system for multiple defects detection in photovoltaic plants. IEEE J. Photovoltaics 2019, 10, 568–576. [Google Scholar] [CrossRef]
  21. Alsafasfeh, M.; Abdel-Qader, I.; Bazuin, B.; Alsafasfeh, Q.; Su, W. Unsupervised fault detection and analysis for large photovoltaic systems using drones and machine vision. Energies 2018, 11, 2252. [Google Scholar] [CrossRef] [Green Version]
22. Gallardo-Saavedra, S.; Franco-Mejia, E.; Hernández-Callejo, L.; Duque-Pérez, Ó.; Loaiza-Correa, H.; Alfaro-Mejia, E. Aerial thermographic inspection of photovoltaic plants: Analysis and selection of the equipment. In Proceedings of the ISES Solar World Congress 2017, IEA SHC, Abu Dhabi, UAE, 29 October–2 November 2017. [Google Scholar]
  23. Leva, S.; Aghaei, M.; Grimaccia, F. PV power plant inspection by UAS: Correlation between altitude and detection of defects on PV modules. In Proceedings of the 2015 IEEE 15th International Conference on Environment and Electrical Engineering (EEEIC), Rome, Italy, 10–13 June 2015; pp. 1921–1926. [Google Scholar]
  24. Aghaei, M.; Dolara, A.; Leva, S.; Grimaccia, F. Image resolution and defects detection in PV inspection by unmanned technologies. In Proceedings of the 2016 IEEE Power and Energy Society General Meeting (PESGM), Boston, MA, USA, 17–21 July 2016; pp. 1–5. [Google Scholar]
  25. Quater, P.B.; Grimaccia, F.; Leva, S.; Mussetta, M.; Aghaei, M. Light unmanned aerial vehicles (UAVs) for cooperative inspection of PV plants. IEEE J. Photovoltaics 2014, 4, 1107–1113. [Google Scholar] [CrossRef] [Green Version]
26. Oliveira, A.K.V.; Aghaei, M.; Madukanya, U.E.; Rüther, R. Fault inspection by aerial infrared thermography in a PV plant after a meteorological tsunami. Rev. Bras. Energ. Sol. 2019, 10, 17–25. [Google Scholar]
  27. Tsanakas, J.A.; Chrysostomou, D.; Botsaris, P.N.; Gasteratos, A. Fault diagnosis of photovoltaic modules through image processing and Canny edge detection on field thermographic measurements. Int. J. Sustain. Energy 2015, 34, 351–372. [Google Scholar] [CrossRef]
  28. Niccolai, A.; Grimaccia, F.; Leva, S. Advanced Asset Management Tools in Photovoltaic Plant Monitoring: UAV-Based Digital Mapping. Energies 2019, 12, 4736. [Google Scholar] [CrossRef] [Green Version]
  29. Thrower, N.J.; Jensen, J.R. The orthophoto and orthophotomap: Characteristics, development and application. Am. Cartogr. 1976, 3, 39–56. [Google Scholar] [CrossRef]
  30. Yao, Y.Y.; Hu, Y.T. Recognition and location of solar panels based on machine vision. In Proceedings of the 2017 2nd Asia-Pacific Conference on Intelligent Robot Systems (ACIRS), Wuhan, China, 16–18 June 2017; pp. 7–12. [Google Scholar]
  31. Leva, S.; Aghaei, M. Power Engineering: Advances and Challenges Part B: Electrical Power. In Chapter 3: Failures and Defects in PV Systems Review and Methods of Analysis; Taylor & Francis Group, CRC Press: Boca Raton, FL, USA, 2018. [Google Scholar]
  32. Kim, D.; Youn, J.; Kim, C. Automatic photovoltaic panel area extraction from uav thermal infrared images. J. Korean Surv. Soc. 2016, 34, 559–568. [Google Scholar] [CrossRef] [Green Version]
33. Li, X.; Yang, Q.; Lou, Z.; Yan, W. Deep Learning Based Module Defect Analysis for Large-Scale Photovoltaic Farms. IEEE Trans. Energy Convers. 2019, 34, 520–529. [Google Scholar] [CrossRef]
  34. Ding, Y.; Cao, R.; Liang, S.; Qi, F.; Yang, Q.; Yan, W. Density-Based Optimal UAV Path Planning for Photovoltaic Farm Inspection in Complex Topography. In Proceedings of the 2020 Chinese Control And Decision Conference (CCDC), Hefei, China, 22–24 August 2020; pp. 3931–3936. [Google Scholar]
  35. Luo, X.; Li, X.; Yang, Q.; Wu, F.; Zhang, D.; Yan, W.; Xi, Z. Optimal path planning for UAV based inspection system of large-scale photovoltaic farm. In Proceedings of the 2017 Chinese Automation Congress (CAC), Jinan, China, 20–22 October 2017; pp. 4495–4500. [Google Scholar] [CrossRef]
  36. Salahat, E.; Asselineau, C.A.; Coventry, J.; Mahony, R. Waypoint Planning for Autonomous Aerial Inspection of Large-Scale Solar Farms. In Proceedings of the IECON 2019-45th Annual Conference of the IEEE Industrial Electronics Society, Lisbon, Portugal, 14–17 October 2019; Volume 1, pp. 763–769. [Google Scholar] [CrossRef]
  37. Sizkouhi, A.M.M.; Esmailifar, S.M.; Aghaei, M.; De Oliveira, A.K.V.; Rüther, R. Autonomous path planning by unmanned aerial vehicle (UAV) for precise monitoring of large-scale PV plants. In Proceedings of the 2019 IEEE 46th Photovoltaic Specialists Conference (PVSC), Chicago, IL, USA, 16–21 June 2019; pp. 1398–1402. [Google Scholar]
  38. Choset, H.M.; Lynch, K.M.; Hutchinson, S.; Kantor, G.; Burgard, W.; Kavraki, L.; Thrun, S.; Arkin, R.C. Principles of Robot Motion: Theory, Algorithms, and Implementation; MIT Press: Cambridge, MA, USA, 2005. [Google Scholar]
  39. Cabreira, T.M.; Brisolara, L.B.; Ferreira Jr, P.R. Survey on coverage path planning with unmanned aerial vehicles. Drones 2019, 3, 4. [Google Scholar] [CrossRef] [Green Version]
  40. Sebbane, Y.B. Intelligent Autonomy of UAVs: Advanced Missions and Future Use; CRC Press: Boca Raton, FL, USA, 2018. [Google Scholar]
  41. Choset, H. Coverage for robotics–a survey of recent results. Ann. Math. Artif. Intell. 2001, 31, 113–126. [Google Scholar] [CrossRef]
  42. Veerajagadheswar, P.; Ping-Cheng, K.; Elara, M.R.; Le, A.V.; Iwase, M. Motion planner for a Tetris-inspired reconfigurable floor cleaning robot. Int. J. Adv. Robot. Syst. 2020, 17, 1729881420914441. [Google Scholar] [CrossRef] [Green Version]
43. Coombes, M.; Fletcher, T.; Chen, W.H.; Liu, C. Optimal Polygon Decomposition for UAV Survey Coverage Path Planning in Wind. Sensors 2018, 18, 2132. [Google Scholar] [CrossRef] [PubMed] [Green Version]
44. Islam, N.; Rashid, M.M.; Pasandideh, F.; Ray, B.; Moore, S.; Kadel, R. A Review of Applications and Communication Technologies for Internet of Things (IoT) and Unmanned Aerial Vehicle (UAV) Based Sustainable Smart Farming. Sustainability 2021, 13, 1821. [Google Scholar] [CrossRef]
  45. Pham, H.X.; La, H.M.; Feil-Seifer, D.; Deans, M. A distributed control framework for a team of unmanned aerial vehicles for dynamic wildfire tracking. In Proceedings of the 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada, 24–28 September 2017; pp. 6648–6653. [Google Scholar]
  46. Kaljaca, D.; Vroegindeweij, B.; van Henten, E. Coverage trajectory planning for a bush trimming robot arm. J. Field Robot. 2020, 37, 283–308. [Google Scholar] [CrossRef] [Green Version]
  47. Chang, W.; Yang, G.; Yu, J.; Liang, Z.; Cheng, L.; Zhou, C. Development of a power line inspection robot with hybrid operation modes. In Proceedings of the 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada, 24–28 September 2017; pp. 973–978. [Google Scholar]
  48. Mansouri, S.S.; Kanellakis, C.; Fresk, E.; Kominiak, D.; Nikolakopoulos, G. Cooperative coverage path planning for visual inspection. Control Eng. Pract. 2018, 74, 118–131. [Google Scholar] [CrossRef]
  49. Galceran, E.; Carreras, M. A survey on coverage path planning for robotics. Robot. Auton. Syst. 2013, 61, 1258–1276. [Google Scholar] [CrossRef] [Green Version]
  50. Bormann, R.; Jordan, F.; Hampp, J.; Hägele, M. Indoor coverage path planning: Survey, implementation, analysis. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, QLD, Australia, 21–25 May 2018; pp. 1718–1725. [Google Scholar]
  51. Nam, L.; Huang, L.; Li, X.; Xu, J. An approach for coverage path planning for UAVs. In Proceedings of the 2016 IEEE 14th International Workshop on Advanced Motion Control, AMC 2016, Auckland, New Zealand, 22–24 April 2016; pp. 411–416. [Google Scholar] [CrossRef] [Green Version]
  52. Dai, R.; Fotedar, S.; Radmanesh, M.; Kumar, M. Quality-aware UAV coverage and path planning in geometrically complex environments. Ad Hoc Netw. 2018, 73, 95–105. [Google Scholar] [CrossRef]
  53. Yao, P.; Cai, Y.; Zhu, Q. Time-optimal trajectory generation for aerial coverage of urban building. Aerosp. Sci. Technol. 2019, 84, 387–398. [Google Scholar] [CrossRef]
  54. Majeed, A.; Lee, S. A new coverage flight path planning algorithm based on footprint sweep fitting for unmanned aerial vehicle navigation in urban environments. Appl. Sci. 2019, 9, 1470. [Google Scholar] [CrossRef] [Green Version]
  55. Elfes, A.; Campos, M.; Bergerman, M.; Bueno, S.; Podnar, G. A robotic unmanned aerial vehicle for environmental research and monitoring. In Proceedings of the First Scientific Conference on the Large Scale Biosphere-Atmosphere Experiment in Amazonia (LBA), LBA Central Office, CPTEC/INPE, Belém, Pará, Brazil, 25–30 June 2000; pp. 12630–12645. [Google Scholar]
  56. Saeed, A.; Abdelkader, A.; Khan, M.; Neishaboori, A.; Harras, K.A.; Mohamed, A. On realistic target coverage by autonomous drones. ACM Trans. Sens. Netw. (TOSN) 2019, 15, 1–33. [Google Scholar] [CrossRef]
  57. Lingelbach, F. Path planning using probabilistic cell decomposition. In Proceedings of the IEEE International Conference on Robotics and Automation, 2004. Proceedings, ICRA’04, New Orleans, LA, USA, 26 April–1 May 2004; Volume 1, pp. 467–472. [Google Scholar]
58. Khanam, Z.; Saha, S.; Ehsan, S.; Stolkin, R.; Mcdonald-Maier, K. Coverage Path Planning Techniques for Inspection of Disjoint Regions With Precedence Provision. IEEE Access 2021, 9, 5412–5427. [Google Scholar] [CrossRef]
  59. Khiati, W.; Moumen, Y.; Habchi, A.E.; Zerrouk, I.; Berrich, J.; Bouchentouf, T. Grid Based approach (GBA): A new approach based on the grid-clustering algorithm to solve a CPP type problem for air surveillance using UAVs. In Proceedings of the 2020 Fourth International Conference On Intelligent Computing in Data Sciences (ICDS), Fez, Morocco, 21–23 October 2020; pp. 1–5. [Google Scholar] [CrossRef]
  60. Juliá, M.; Gil, A.; Reinoso, O. A comparison of path planning strategies for autonomous exploration and mapping of unknown environments. Auton. Robot. 2012, 33, 427–444. [Google Scholar] [CrossRef]
  61. Rodriguez-Esparza, E.; Zanella-Calzada, L.A.; Oliva, D.; Heidari, A.A.; Zaldivar, D.; Pérez-Cisneros, M.; Foong, L.K. An efficient Harris hawks-inspired image segmentation method. Expert Syst. Appl. 2020, 155, 113428. [Google Scholar] [CrossRef]
  62. Garcia-Garcia, A.; Orts-Escolano, S.; Oprea, S.; Villena-Martinez, V.; Martinez-Gonzalez, P.; Garcia-Rodriguez, J. A survey on deep learning techniques for image and video semantic segmentation. Appl. Soft Comput. 2018, 70, 41–65. [Google Scholar] [CrossRef]
  63. Long, J.; Shelhamer, E.; Darrell, T. Fully convolutional networks for semantic segmentation. In Proceedings of the IEEE conference on computer vision and pattern recognition, Boston, MA, USA, 7–12 June 2015; pp. 3431–3440. [Google Scholar]
  64. Ronneberger, O.; Fischer, P.; Brox, T. U-net: Convolutional networks for biomedical image segmentation. In International Conference on Medical Image Computing and Computer-Assisted Intervention; Springer: Munich, Germany, 2015; pp. 234–241. [Google Scholar]
  65. Pérez-González, A.; Jaramillo-Duque, A.; Cano-Quintero, J.B. Automatic Boundary Extraction for Photovoltaic Plants Using the Deep Learning UNet Model. Appl. Sci. 2021, 11, 6524. [Google Scholar] [CrossRef]
  66. Sizkouhi, A.M.M.; Aghaei, M.; Esmailifar, S.M.; Mohammadi, M.R.; Grimaccia, F. Automatic boundary extraction of large-scale photovoltaic plants using a fully convolutional network on aerial imagery. IEEE J. Photovoltaics 2020, 10, 1061–1067. [Google Scholar] [CrossRef]
67. Di Franco, C.; Buttazzo, G. Coverage path planning for UAVs photogrammetry with energy and resolution constraints. J. Intell. Robot. Syst. 2016, 83, 445–462. [Google Scholar] [CrossRef]
  68. Zefri, Y.; ElKettani, A.; Sebari, I.; Ait Lamallam, S. Thermal infrared and visual inspection of photovoltaic installations by UAV photogrammetry application case: Morocco. Drones 2018, 2, 41. [Google Scholar] [CrossRef] [Green Version]
  69. Coopmans, C.; Podhradskỳ, M.; Hoffer, N.V. Software-and hardware-in-the-loop verification of flight dynamics model and flight control simulation of a fixed-wing unmanned aerial vehicle. In Proceedings of the 2015 Workshop on Research, Education and Development of Unmanned Aerial Systems (RED-UAS), Cancun, Mexico, 23–25 November 2015; pp. 115–122. [Google Scholar]
  70. Roggi, G.; Niccolai, A.; Grimaccia, F.; Lovera, M. A Computer Vision Line-Tracking Algorithm for Automatic UAV Photovoltaic Plants Monitoring Applications. Energies 2020, 13, 838. [Google Scholar] [CrossRef] [Green Version]
  71. Coates, E.M.; Fossen, T.I. Geometric Reduced-Attitude Control of Fixed-Wing UAVs. Appl. Sci. 2021, 11, 3147. [Google Scholar] [CrossRef]
72. Tullu, A.; Endale, B.; Wondosen, A.; Hwang, H.Y. Machine Learning Approach to Real-Time 3D Path Planning for Autonomous Navigation of Unmanned Aerial Vehicle. Appl. Sci. 2021, 11, 4706. [Google Scholar] [CrossRef]
  73. Pitonakova, L.; Giuliani, M.; Pipe, A.; Winfield, A. Feature and performance comparison of the V-REP, Gazebo and ARGoS robot simulators. In Annual Conference Towards Autonomous Robotic Systems; Springer: Bristol, UK, 25–27 July 2018; pp. 357–368. [Google Scholar]
  74. Akcakoca, M.; Atici, B.M.; Gever, B.; Oguz, S.; Demirezen, U.; Demir, M.; Saldiran, E.; Yuksek, B.; Koyuncu, E.; Yeniceri, R.; et al. A simulation-based development and verification architecture for micro uav teams and swarms. In AIAA Scitech 2019 Forum; AIAA: San Diego, CA, USA, 2019; p. 1979. [Google Scholar]
  75. Unoeste Terá Maior Usina Solar de Geração Distribuída de SP-Unoeste. Available online: http://www.unoeste.br/noticias/2019/3/unoeste-tera-maior-usina-solar-de-geracao-distribuida-de-sp (accessed on 11 October 2021).
  76. ABA Newsletter. Available online: http://www.csus.edu/aba2/newsletters/fall2012/abagreennews.html (accessed on 11 October 2021).
  77. Abadi, M.; Barham, P.; Chen, J.; Chen, Z.; Davis, A.; Dean, J.; Devin, M.; Ghemawat, S.; Irving, G.; Isard, M.; et al. TensorFlow: A System for Large-Scale Machine Learning. In 12th USENIX Symposium on Operating Systems Design and Implementation (OSDI 16); USENIX Association: Savannah, GA, USA, 2016; pp. 265–283. [Google Scholar]
  78. Flask. 2021. Available online: https://flask.palletsprojects.com/en/2.0.x/ (accessed on 31 October 2021).
  79. Mavlink/Mavlink. 2021. Available online: https://github.com/mavlink/mavlink (accessed on 24 October 2021).
  80. Li, J.; Zhou, Y.; Lamont, L. Communication architectures and protocols for networking unmanned aerial vehicles. In Proceedings of the 2013 IEEE Globecom Workshops (GC Wkshps), Atlanta, GA, USA, 9–13 December 2013; pp. 1415–1420. [Google Scholar] [CrossRef]
  81. Sarker, I.H. Machine Learning: Algorithms, Real-World Applications and Research Directions. SN Comput. Sci. 2021, 2, 160. [Google Scholar] [CrossRef] [PubMed]
  82. Bradski, G. The openCV library. Dobb’S J. Softw. Tools Prof. Program. 2000, 25, 120–123. [Google Scholar]
  83. OpenCV: OpenCV-Python Tutorials. Available online: https://docs.opencv.org/master/d6/d00/tutorial_py_root.html (accessed on 18 October 2021).
84. Choset, H.; Pignon, P. Coverage path planning: The boustrophedon cellular decomposition. In Field and Service Robotics; Springer: London, UK, 1998; pp. 203–209. [Google Scholar]
  85. Gabriely, Y.; Rimon, E. Spanning-tree based coverage of continuous areas by a mobile robot. Ann. Math. Artif. Intell. 2001, 31, 77–98. [Google Scholar] [CrossRef]
  86. Zelinsky, A.; Jarvis, R.A.; Byrne, J.; Yuta, S. Planning paths of complete coverage of an unstructured environment by a mobile robot. In Proceedings of the 1993 IEEE/Tsukuba International Workshop on Advanced Robotics, Tsukuba, Japan, 8–9 November 1993; Volume 13, pp. 533–538. [Google Scholar]
  87. Ceballos, N.M.; Valencia, J.; Giraldo, A.A. Simulation and assessment educational framework for mobile robot algorithms. J. Autom. Mob. Robot. Intell. Syst. 2014, 8, 53–59. [Google Scholar] [CrossRef]
  88. Jaeyoung, L. PX4 Gazebo Plugin Suite for MAVLink SITL and HITL. 2021. Available online: https://github.com/PX4/PX4-SITL_gazebo/blob/ffb87ef4a312564cf91791bd5a9d683aacd085a6/models/iris/iris.sdf.jinja (accessed on 7 December 2021).
89. Torres, M.; Pelta, D.A.; Verdegay, J.L.; Torres, J.C. Coverage path planning with unmanned aerial vehicles for 3D terrain reconstruction. Expert Syst. Appl. 2016, 55, 441–451. [Google Scholar] [CrossRef]
  90. Araújo, J.; Sujit, P.; Sousa, J. Multiple UAV area decomposition and coverage. In Proceedings of the 2013 IEEE Symposium on Computational Intelligence for Security and Defense Applications (CISDA), Singapore, 16–19 April 2013; pp. 30–37. [Google Scholar] [CrossRef]
  91. PX4 Drone Autopilot. 2021. Available online: https://github.com/PX4/PX4-Autopilot (accessed on 23 September 2021).
  92. Koenig, N.; Howard, A. Design and use paradigms for Gazebo, an open-source multi-robot simulator. In Proceedings of the 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (IEEE Cat. No.04CH37566), Sendai, Japan, 28 September–2 October 2004; Volume 3, pp. 2149–2154. [Google Scholar] [CrossRef] [Green Version]
  93. Quigley, M.; Conley, K.; Gerkey, B.; Faust, J.; Foote, T.; Leibs, J.; Wheeler, R.; Ng, A.Y. ROS: An open-source Robot Operating System. In Proceedings of the ICRA Workshop on Open Source Software, Kobe, Japan, 12–17 May 2009; p. 5. [Google Scholar]
94. Silano, G.; Iannelli, L. CrazyS: A Software-in-the-Loop Simulation Platform for the Crazyflie 2.0 Nano-Quadcopter. In Robot Operating System (ROS): The Complete Reference (Volume 4); Koubaa, A., Ed.; Springer International Publishing: Cham, Switzerland, 2020; pp. 81–115. [Google Scholar] [CrossRef]
  95. Salamh, F.E.; Karabiyik, U.; Rogers, M.K. RPAS Forensic Validation Analysis Towards a Technical Investigation Process: A Case Study of Yuneec Typhoon H. Sensors 2019, 19, 3246. [Google Scholar] [CrossRef] [Green Version]
  96. QGroundControl Ground Control Station. 2021. Available online: https://github.com/mavlink/qgroundcontrol (accessed on 31 October 2021).
  97. Yuneec Typhoon H-Yuneec Futurhobby. Available online: https://yuneec-futurhobby.com/yuneec-typhoon-h-pro-realsense (accessed on 28 October 2021).
  98. 3DR Iris-The Ready to fly UAV Quadcopter. Available online: http://www.arducopter.co.uk/iris-quadcopter-uav.html (accessed on 28 October 2021).
  99. Tan, S.H.; Md Ali, J. Quartic and quintic polynomial interpolation. In Proceedings of the 20th National Symposium on Mathematical Sciences: Research in Mathematical Sciences: A Catalyst for Creativity and Innovation, Putrajaya, Malaysia, 18–20 December 2012; Volume 1522, pp. 664–675. [Google Scholar]
  100. Peréz-González, A. Andresperez86/CPP_GUI. 2021. Available online: https://github.com/andresperez86/CPP_GUI (accessed on 29 October 2021).
  101. Jordan, S.; Moore, J.; Hovet, S.; Box, J.; Perry, J.; Kirsche, K.; Lewis, D.; Tse, Z.T.H. State-of-the-art technologies for UAV inspections. IET Radar Sonar Navig. 2018, 12, 151–164. [Google Scholar] [CrossRef]
102. Ghaddar, A.; Merei, A.; Natalizio, E. PPS: Energy-Aware Grid-Based Coverage Path Planning for UAVs Using Area Partitioning in the Presence of NFZs. Sensors 2020, 20, 3742. [Google Scholar] [CrossRef] [PubMed]
  103. Öst, G. Search Path Generation with UAV Applications Using Approximate Convex Decomposition. 2012, p. 52. Available online: http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-77353 (accessed on 29 October 2021).
Figure 1. Stages for CPP with semantic segmentation for UAV in PV plants.
Figure 2. The RoI and the path generated from the initial point (I) to the final point (F) by BECD.
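For readers who want to reproduce the sweep pattern of Figure 2, a minimal Python sketch of a boustrophedon pass over a binary RoI mask is shown below; the `roi_mask` array, the one-pixel-per-metre scale, and the lane `width` parameter are illustrative assumptions, not the exact implementation used in this work.

```python
# A minimal sketch, assuming a binary RoI mask (1 = inside the plant area),
# a one-pixel-per-metre scale, and a lane width in pixels. Illustrative only.
import numpy as np

def boustrophedon_path(roi_mask: np.ndarray, width: int):
    """Return (row, col) waypoints sweeping the RoI every `width` rows."""
    waypoints = []
    left_to_right = True
    for row in range(0, roi_mask.shape[0], width):
        cols = np.flatnonzero(roi_mask[row])      # columns inside the RoI on this row
        if cols.size == 0:
            continue                              # row lies outside the region
        start, end = int(cols.min()), int(cols.max())
        if left_to_right:
            waypoints += [(row, start), (row, end)]
        else:
            waypoints += [(row, end), (row, start)]
        left_to_right = not left_to_right         # alternate sweep direction
    return waypoints

# Example: sweep a 100 x 60 rectangular RoI with a 10-pixel lane width.
mask = np.zeros((100, 60), dtype=np.uint8)
mask[10:90, 5:55] = 1
print(boustrophedon_path(mask, width=10)[:4])
```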
Figure 3. (a) Approximate cell decomposition into grids and sub-grids. (b) Coverage path generated with the GBSTC method.
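A minimal sketch of the approximate cell decomposition illustrated in Figure 3a is given below, assuming coarse cells of size 2w × 2w split into four w × w sub-cells and a 50% occupancy threshold for keeping a cell; both values are assumptions for illustration rather than parameters reported here.

```python
# A minimal sketch, assuming coarse cells of size 2w x 2w (each holding four
# w x w sub-cells) and a 50% occupancy threshold; both are illustrative values.
import numpy as np

def grid_cells(roi_mask: np.ndarray, w: int, min_occupancy: float = 0.5):
    """Return top-left corners of coarse cells that lie mostly inside the RoI."""
    cells = []
    for r in range(0, roi_mask.shape[0] - 2 * w + 1, 2 * w):
        for c in range(0, roi_mask.shape[1] - 2 * w + 1, 2 * w):
            block = roi_mask[r:r + 2 * w, c:c + 2 * w]
            if block.mean() >= min_occupancy:     # cell is mostly inside the RoI
                cells.append((r, c))
    return cells

# Example: a fully free 120 x 80 mask tiled with w = 10 gives 6 x 4 = 24 cells.
mask = np.ones((120, 80), dtype=np.uint8)
print(len(grid_cells(mask, w=10)))
```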
Figure 4. (a) Wavefront distance transform for the selection of the initial position (I) and final position (F). (b) Coverage path generated using the wavefront distance transform from the selected initial position (I).
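The wavefront labelling of Figure 4a can be sketched as a breadth-first expansion from the goal cell over an occupancy grid, as in the following illustrative Python snippet; the grid encoding (0 = free, 1 = obstacle) and the conventional starting label of 2 are assumptions, not the exact GBWC implementation.

```python
# A minimal sketch, assuming an occupancy grid with 0 = free and 1 = obstacle
# and the usual convention of labelling the goal cell with 2. Illustrative only.
from collections import deque

def wavefront_labels(grid, goal):
    """Breadth-first distance labels expanding outwards from the goal cell."""
    rows, cols = len(grid), len(grid[0])
    labels = [[None] * cols for _ in range(rows)]
    labels[goal[0]][goal[1]] = 2
    queue = deque([goal])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and labels[nr][nc] is None:
                labels[nr][nc] = labels[r][c] + 1   # propagate the wavefront
                queue.append((nr, nc))
    return labels

# Example: 3 x 3 grid with one obstacle; the coverage path then visits free
# cells in decreasing label order, starting from the highest-labelled cell.
grid = [[0, 0, 0],
        [0, 1, 0],
        [0, 0, 0]]
print(wavefront_labels(grid, goal=(2, 2)))
```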
Figure 5. UAV over a PV plant, simulated in Gazebo.
Figure 6. Steps of boundary extraction by the DL server and OpenCV functions in the Unioeste 1 PV plant.
Figure 7. Steps of boundary extraction by the DL server and OpenCV functions in the Arak PV plant.
Figure 8. Steps of boundary extraction by the DL server and OpenCV functions in the CSUSL PV plant.
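As a rough illustration of the OpenCV post-processing stage suggested by Figures 6–8, the snippet below extracts a boundary polygon from a binary mask standing in for the DL server output; the synthetic mask and the 1% perimeter tolerance used for polygon approximation are assumptions made only for this example.

```python
# A minimal sketch, using a synthetic binary mask as a stand-in for the DL
# server output; the 1% perimeter tolerance is an assumed value. OpenCV >= 4
# returns (contours, hierarchy) from findContours.
import cv2
import numpy as np

mask = np.zeros((200, 300), dtype=np.uint8)
mask[40:160, 60:240] = 255                       # stand-in for the segmented PV area

contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
largest = max(contours, key=cv2.contourArea)     # keep the main region of interest
epsilon = 0.01 * cv2.arcLength(largest, True)    # tolerance of ~1% of the perimeter
boundary = cv2.approxPolyDP(largest, epsilon, True)

print(f"Boundary polygon with {len(boundary)} vertices")   # 4 for this rectangle
```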
Figure 9. Performance metrics of the Typhoon UAV on the Unioeste PV plant (Experiment 1).
Figure 10. Performance metrics of the Typhoon UAV on the Arak PV plant (Experiment 2).
Figure 11. Performance metrics of the Typhoon UAV on the CSUSL PV plant (Experiment 3).
Figure 12. Performance test of the Typhoon UAV varying the CPP width.
Figure 13. Performance metrics of the 3DR Iris UAV on the Unioeste PV plant (Experiment 4).
Figure 14. Performance metrics of the 3DR Iris UAV on the Arak PV plant (Experiment 5).
Figure 15. Performance metrics of the 3DR Iris UAV on the CSUSL PV plant (Experiment 6).
Figure 16. Performance test of the 3DR Iris UAV varying the CPP width.
Table 1. Comparison of three CPP methods with regard to CPP width, for a simulated Typhoon UAV at three PV plants.

Energy Consumption
                    Unioeste 1 (35,975 m²)        Arak (25,056 m²)              CSUSL (1344 m²)
Width (m)           BECD    GBSTC   GBWC          BECD    GBSTC   GBWC          BECD    GBSTC   GBWC
1                   N/A     N/A     N/A           N/A     N/A     N/A           35%     47%     43%
2                   N/A     N/A     N/A           N/A     N/A     N/A           20%     23%     24%
3                   N/A     N/A     N/A           N/A     N/A     N/A           14%     15%     15%
4                   N/A     N/A     N/A           N/A     N/A     N/A           11%     12%     12%
5                   98%     100%    100%          80%     94%     94%           N/A     N/A     N/A
6                   N/A     N/A     N/A           N/A     N/A     N/A           9%      8%      8%
8                   74%     85%     88%           52%     62%     63%           7%      6%      5%
10                  56%     62%     63%           41.5%   45%     43%           5%      N/A     5%
12                  47%     54%     55%           36%     37%     35%           N/A     N/A     N/A
15                  38%     43%     45%           27%     29%     26%           N/A     N/A     N/A
20                  29%     32%     31%           21%     20%     19%           N/A     N/A     N/A
Table 2. Comparison of three CPP methods with regard to CPP width, for a simulated 3DR Iris UAV at three PV plants.

Energy Consumption
                    Unioeste 1 (35,975 m²)        Arak (25,056 m²)              CSUSL (1344 m²)
Width (m)           BECD    GBSTC   GBWC          BECD    GBSTC   GBWC          BECD    GBSTC   GBWC
1                   N/A     N/A     N/A           N/A     N/A     N/A           52%     72%     72%
2                   N/A     N/A     N/A           N/A     N/A     N/A           30%     35%     38%
3                   N/A     N/A     N/A           N/A     N/A     N/A           21%     24%     22%
4                   N/A     N/A     N/A           N/A     N/A     N/A           17%     17%     15%
5                   N/A     N/A     N/A           N/A     N/A     N/A           N/A     N/A     N/A
6                   N/A     N/A     N/A           N/A     N/A     N/A           12%     12%     11%
8                   100%    100%    100%          88%     98%     98%           9%      9%      8%
10                  96%     100%    100%          70%     76%     71%           7%      N/A     7%
12                  80%     92%     95%           60%     63%     59%           N/A     N/A     N/A
15                  64%     73%     78%           45%     48%     43%           N/A     N/A     N/A
17                  60%     68%     67%           N/A     N/A     N/A           N/A     N/A     N/A
20                  48%     55%     54%           26%     26%     25%           N/A     N/A     N/A
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
