Article

An Automatic Visual Inspection of Oil Tanks Exterior Surface Using Unmanned Aerial Vehicle with Image Processing and Cascading Fuzzy Logic Algorithms

1 Department of Mechanical Engineering, Faculty of Engineering, University of Malaya, Kuala Lumpur 50603, Malaysia
2 Faculty of Engineering, Taiz University, Taiz 9674, Yemen
3 Faculty of Computer Science & Information Technology, University of Malaya, Kuala Lumpur 50603, Malaysia
4 Department of Computer Science, Faculty of Information and Communication Technology, International Islamic University Malaysia, Kuala Lumpur 53100, Malaysia
5 Department of Computer Science and Software Engineering, College of Information Technology, United Arab Emirates University, Al Ain 15551, United Arab Emirates
* Author to whom correspondence should be addressed.
Drones 2023, 7(2), 133; https://doi.org/10.3390/drones7020133
Submission received: 19 December 2022 / Revised: 31 January 2023 / Accepted: 9 February 2023 / Published: 13 February 2023
(This article belongs to the Special Issue Intelligent Image Processing and Sensing for Drones)

Abstract

This paper presents an automatic visual inspection of exterior surface defects of oil tanks using unmanned aerial vehicles (UAVs) and image processing with two cascading fuzzy logic algorithms. Corrosion is one of the defects that seriously affects the safety of oil and gas tank surfaces. At present, human inspection and climbing-robot inspection are the dominant approaches for rust detection in oil and gas tanks. However, these approaches have many shortcomings, such as long inspection times, high cost, and limited coverage of the tank surface. The purpose of this research is to detect rust on oil tanks through localized visual inspection using UAVs, and to develop algorithms that distinguish between defects and noise. The study focuses on two basic aspects of oil tank inspection through the images captured by the UAV, namely, the detection of defects and the distinction between defects and noise. For the former, an image processing algorithm was developed to reduce or remove noise, adjust the brightness of the captured image, and extract features to identify defects in oil tanks. For the latter, a cascading fuzzy logic algorithm and a threshold algorithm were developed to distinguish between defects and noise levels and reduce their impact through three stages of processing: the first fuzzy logic stage distinguishes between defects and low noise generated by the appearance of objects on the surface of the tank, such as trees or stairs, and reduces their impact. The second stage distinguishes between defects and medium noise generated by shadows or the presence of small objects on the surface of the tank and reduces their impact. The third stage, a thresholding algorithm, distinguishes between defects and high noise generated by sedimentation on the surface of the tank and reduces its impact. The samples were classified, based on the output of the third-stage thresholding process, into defective or non-defective samples. The proposed algorithms were tested on 180 samples, and the results show their effectiveness in the inspection and detection of defects, with an accuracy of 83%.

1. Introduction

Oil products are among the most important sources of energy and have a vital impact on countries’ economic sectors, as demand for them increases steadily with the continuous development of other industrial and commercial sectors. This creates a need for storage tanks capable of storing as much oil as possible.
The exterior surface of oil tanks is commonly affected by corrosion and dust due to chemical reactions between the surface and both oil and air substances, necessitating continuous inspection and monitoring.
Corrosion is one of the biggest problems for companies in the oil sector due to the cost of repairing the damaged parts or replacing them with non-corroded ones. Corrosion can be defined as the destructive attack of a substance through interaction with its environment [1], driven by the tendency of these unstable metals to return to their more stable natural form. Regular inspection and monitoring of these tanks are the most important ways to reduce risk and corrosion because they help to detect damage early, prolong the life of the tanks, and prevent the closure of the oil facility or the suspension of production processes.
In addition, such inspections provide protection against legal and financial accountability for leakage caused by corrosion and its destructive impact on the environment. Analysis of major refinery accidents over the past 35 years has shown that loss of containment due to corrosion has contributed to up to 25% of these accidents [2]. It has also been noted that corrosion causes 42% of the failure mechanisms in all engineering structures [2]. Corrosion damage causes maintenance costs to increase by 3–5% of total product costs in developed countries [3]. In the oil and gas industry alone, the cost of repairing corrosion damage is $1.372 billion, which includes surface pipelines ($589 million), pipelining expenses ($463 million), and another $320 million in corrosion-related capital expenditure [4]. In addition, corrosion can reduce annual income by up to $10 billion during maintenance periods [5]; in the US, for example, the annual cost of corrosion damage is $170 billion [6].
Despite improvements in the design process and in the selection of metals for better construction of these tanks, this is not enough to ensure their safety. Modern inspection techniques, such as climbing robots and UAVs, have been evolving rapidly in an effort to solve the problem with more sophisticated methods.
UAVs are used in several fields, including inspection and monitoring, searching for missing persons, monitoring illegal immigration, monitoring vital infrastructure, detecting hidden corrosion in aluminum structures, and checking railway surfaces for defects. Several researchers are developing UAV inspection systems for oil and gas tank inspection, as such systems can play a significant role in reducing the inspection time, cost, and risk of the overall required maintenance.
The traditional inspection and maintenance process is very expensive and, due to its complex nature and dangerous environment, time consuming, but it is necessary in order to avoid the catastrophic risks that corrosion may inflict on the environment and humans if the inspection and monitoring process is neglected. Oil and gas plants need to be maintained regularly to keep their components running safely and efficiently. Regular testing and equipment inspection have a great effect on the costs of maintenance and daily operational processes [7].
Structures in the oil and gas industry are highly complex and hazardous; thus, maintenance, inspection, and repair in such places involve high risks for workers. Maintenance operators must regularly climb high-rise oil facilities, such as storage tanks, flare stacks, boilers, chimneys, and cooling towers, to inspect their surfaces. Two common methods are used to perform inspection in the oil and gas industry, namely, manual inspection and climbing-robot inspection.
Climbing-robot inspection systems are nowadays widely used to inspect oil and gas facilities [8] and have yielded considerable savings in daily operational costs. There are many kinds of climbing robot systems that can access high-rise structures; they imitate mammals, reptiles, and insects, using several movement methods to climb, such as jumping, sliding, extension, and swinging. Many problems can occur during climbing-robot operation on high-rise structures, such as limited flexibility, motor overheating, power-consumption instability, slippage on the surface being climbed, and difficulty moving between neighboring surfaces.
Current climbing robots used to inspect outdoor storage tanks have limited movement and commonly work under remote control, which reduces flexibility. For example, the MATS climbing robot with 5 DOF has excellent maneuverability but needs a place to dock. A prototype called Walloid, which can select an adhesion method to provide the robustness and flexibility needed for industrial applications, has been designed for offshore oil and gas facility inspection [9].
A robot based on bio-inspired principles, called Sticky Bot, has adhesive material on the bottom of its feet that enables it to hold onto all surface types [10]. Non-destructive testing is usually performed to inspect metallic plates for corrosion and determine the presence of defects without damaging the surface. A climbing robot for corrosion observation on cooling towers used by the oil and gas industry has been developed by combining wheel electrodes with an adhesion mechanism [11]. Such robots offer higher performance efficiency than human inspectors, but they require a special mechanical design of the climbing materials and a careful analysis of the system dynamics. Climbing-robot applications are also confined to certain types of structures, such as those with cylindrical shapes.
The alternative solution is to use UAVs to inspect surfaces with a simple and straightforward mechanism. UAV technology has been widely utilized in the oil and gas industry to inspect high-rise facilities with better efficiency and sustainability than climbing robots or manual methods. UAV inspection depends on the analysis of data from a range of sensors, which must be acquired, processed, stored, and well analyzed. Unlike climbing robots, which need a suspension system and scaffolds, UAVs can move freely and perform inspections with high efficiency and reliability.
UAV inspection is accomplished using high-definition (HD) cameras and infra-red (IR) sensors that can carry out risk-based inspection (RBI) of gas and oil equipment in accordance with the API RP-580, API 579-1, and ASME FFS-1 standards. UAVs can also inspect piping per API 570 (piping inspection code) and tanks per API 653 (tank inspection, repair, and reconstruction) [12].
One of the first drone inspection systems was introduced in 2010 to inspect an onshore oil refinery in the UK. It helped operators to gain an understanding of the condition of the equipment without any need for a shutdown or for exposing operators to risky situations [13]. Such systems allow engineers to inspect critical high-rise equipment in the oil and gas industry (such as vents, ducting, pipes, and chimney stacks), reduce maintenance time, and prioritize component maintenance without the need to shut down the facilities. The UAV flies manually under the control of a certified pilot, who guides the drone along the facilities requiring inspection using normal/thermal cameras, hydrocarbon-leak detection sensors, and similar payloads. The acquired data are then analyzed to find surface defects such as corrosion, hairline cracks, and leakages. Since drones can carry cameras with 4K video recording and optical zoom, along with various other sensors, there is no need to fly the drone too close to the inspected structures or other risk areas.
Drone-based inspection in the oil and gas industry is receiving more attention for four reasons, namely, (i) its ability to inspect areas that are potentially hazardous; (ii) it is a cost-effective as well as efficient inspection method; (iii) its ability to inspect a large area in a short time; and (iv) drone operation does not require a highly skilled inspector. Many third-party companies now offer UAV inspection solutions for the oil and gas industry.
This paper aims to develop a UAV-based visual inspection system for high-rise oil tanks. Such facilities must be continuously inspected so that hazardous surface leakage is detected as soon as it appears. The contribution of this paper combines theoretical and experimental techniques. The theoretical aspect relies on developing a classification algorithm based on the fusion of image processing with two cascading fuzzy logic (FL) stages and a thresholding process. The experimental work presents defect detection on high-rise oil tanks as a challenging subject that needs further improvement through research.

2. Automatic Visual Inspection System

The inspection of the oil tanks is performed through four main stages, as shown in Figure 1, namely, UAV scanning, image processing, AI-based classification, and thresholding.

2.1. UAV Scanning

The Mavic Pro UAV, shown in Figure 2, was used for the inspection of oil tanks in this work. It can fly for up to 27 min at a speed of 40 mph. It also records GPS information to ensure that the UAV lands in an accurate location.
The camera built into the Mavic Pro UAV is a very small 3-axis gimbal camera that records both still images and video. With a 90-degree tilt of the camera, it produces video with blurry side scenes or with black bars.
The camera supports 4K shooting at 30 frames per second, as well as 1080p full-HD video at 96 frames per second, which enables slow-motion video recording. The controller is connected to the UAV within a range of 4.3 miles, providing a live 1080p video feed from the aircraft.

2.2. Image Processing-Based Defect Detection

Images taken by UAVs are often inconsistent and lack specific behaviors and trends. An image is likely to contain many errors and distortions, making it very complicated to handle. Therefore, image processing is required to remove the noise so that defects on the oil tanks can be assessed accurately.
Image processing consists of several steps to prepare clear, contrast-adjusted images free from the blurring caused by heterogeneous lighting, the appearance of objects at the sides of the images, and similar effects.
This project focuses on inspecting all sides of the tank except for the stairs and the top and bottom.

2.2.1. Pre-Processing of Captured Image

Four main operations are applied to the captured image in the pre-processing stage, namely, cropping, resizing, RGB (red, green, blue) to grayscale conversion, and brightness adjustment, as shown in Figure 3. Cropping removes unwanted parts of an image. The areas located under the tank have no significance and are therefore removed.
Resizing controls the image dimensions (length and width) and assigns a fixed size to all input images, which reduces execution time and speeds up data processing. This step allows easy image handling and yields a square image that can be divided into four equal parts. The image after the cropping and resizing operations is shown in Figure 4b.
The RGB color image is converted to grayscale to reduce the image size, increase the processing speed, and facilitate its handling by some image processing instructions. The image after conversion from RGB to grayscale is shown in Figure 4c.
Brightness adjustment modifies the high and low pixel values that affect the homogeneity of the image. Figure 4d shows the image after the adjustment operation.
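For illustration, the following is a minimal Python/OpenCV sketch of these four pre-processing operations. The paper does not give its implementation (the function names elsewhere suggest MATLAB), so this is an approximate equivalent: the crop box and the brightness parameters are placeholders, and the 1000 × 1000 target size is chosen only so that each of the four image parts used later is 500 × 500 pixels, matching the 500-pixel input ranges described in Section 2.3.1.

```python
import cv2

def preprocess(path, crop_box=(0, 0, 2000, 2000), size=(1000, 1000),
               alpha=1.2, beta=10):
    """Crop, resize, convert to grayscale, and adjust brightness.

    crop_box, alpha (contrast gain), and beta (brightness offset) are
    illustrative placeholders, not values taken from the paper.
    """
    img = cv2.imread(path)                        # BGR image captured by the UAV
    x, y, w, h = crop_box
    img = img[y:y + h, x:x + w]                   # cropping: drop areas under the tank
    img = cv2.resize(img, size)                   # resizing: fixed, square dimensions
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)  # RGB (BGR) to grayscale conversion
    adjusted = cv2.convertScaleAbs(gray, alpha=alpha, beta=beta)  # brightness adjustment
    return adjusted
```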

2.2.2. Image Processing of the Prepared Image

The image from the pre-processing stage needs further processing using image processing tools to prepare it for defect detection. The image is divided into four parts of equal dimensions, as shown in Figure 5, to increase classification accuracy, reduce the effect of noise, and allow an appropriate threshold to be chosen when the image is inhomogeneous.
The importance of the division is outlined in the following points:
- When a threshold value is specified for the whole image, it is determined from the grayscale values of the entire image; when the image is divided into four parts, a threshold is determined for each specific area, so noise affects only one part rather than the whole image.
- When filters are applied to the four parts rather than to the entire image, an enhanced, higher-quality result is obtained.
- The performance of the edge filter improves with the four image parts, as it depends on the threshold value used in the classification process; the smaller the image, the higher the filtration efficiency. The accuracy of the fuzzy logic algorithm in the classification process is also higher with four image parts.
- Filtration of image: filtering is a technique used to eliminate noise and unwanted artifacts from images. Two main filters are used in this work to eliminate the effects of noise from images, namely, the Gaussian and Prewitt filters, as follows (a combined sketch of the division, filtration, and morphological steps is given after this list):
  • Gaussian Filter
The Gaussian filter uses a 2D convolution operator that is suitable for blurring images and eliminating noise, as shown in Figure 6.
  • Prewitt Filter
The Prewitt filter is utilized to detect two kinds of object edges, namely, vertical and horizontal, as shown in Figure 7. Edge detection is performed by calculating the pixel gradients in the image.
- Morphological image processing: morphology operations help to extract useful features of the detected object, such as its shape, convex hull, skeleton, and boundaries. They rely on probing the image with a small pattern called a structuring element, an array that defines the currently processed pixel and its neighbors. It is preferable to choose an element with the same shape as the feature sought; e.g., for finding lines, a function called “strel” can be used to create it, as shown in Figure 8.
Dilation: the dilation operation enlarges the boundaries of the foreground regions in grayscale images, increasing the size of the foreground pixels and reducing the size of the holes within those boundaries, as shown in Figure 8.
- Bwareaopen: the bwareaopen instruction is a morphological operation that removes from a binary image all connected components containing fewer pixels than a specified value, producing another binary image, as shown in Figure 9.
- Filling: the imfill function fills holes in objects so that they become part of the foreground in binary images. As shown in Figure 10, imfill changes connected background pixels (zeros) to foreground pixels (ones), stopping when the boundary of the object is reached.
- Inverting the image: the image is inverted so that defects appear as black pixels instead of white pixels, making them more visible to the viewer, as shown in Figure 11.
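The following is a minimal Python/OpenCV sketch of this second processing stage, assuming a 1000 × 1000-pixel pre-processed grayscale image so that each of the four parts is 500 × 500 pixels. The paper's implementation (the function names suggest MATLAB's Image Processing Toolbox) is not given, so this is an approximate equivalent: the kernel sizes, the edge threshold, and the minimum object area are illustrative placeholders rather than the paper's settings.

```python
import cv2
import numpy as np
from scipy.ndimage import binary_fill_holes

# Prewitt kernels for vertical and horizontal edges
PREWITT_X = np.array([[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]], dtype=np.float32)
PREWITT_Y = PREWITT_X.T

def split_into_quadrants(gray):
    """Divide the square pre-processed image into four equal parts."""
    h, w = gray.shape
    return [gray[:h // 2, :w // 2], gray[:h // 2, w // 2:],
            gray[h // 2:, :w // 2], gray[h // 2:, w // 2:]]

def process_part(part, edge_thresh=40, min_area=50):
    """Filter one image part and return a binary map with defects as black pixels."""
    blurred = cv2.GaussianBlur(part, (5, 5), 0)             # Gaussian noise suppression
    gx = cv2.filter2D(blurred, cv2.CV_32F, PREWITT_X)        # Prewitt vertical edges
    gy = cv2.filter2D(blurred, cv2.CV_32F, PREWITT_Y)        # Prewitt horizontal edges
    grad = cv2.convertScaleAbs(np.sqrt(gx ** 2 + gy ** 2))   # gradient magnitude
    _, edges = cv2.threshold(grad, edge_thresh, 255, cv2.THRESH_BINARY)

    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3))
    dilated = cv2.dilate(edges, kernel, iterations=1)        # dilation with a structuring element

    # bwareaopen equivalent: drop connected components smaller than min_area pixels
    n, labels, stats, _ = cv2.connectedComponentsWithStats(dilated, connectivity=8)
    cleaned = np.zeros_like(dilated)
    for i in range(1, n):
        if stats[i, cv2.CC_STAT_AREA] >= min_area:
            cleaned[labels == i] = 255

    filled = binary_fill_holes(cleaned > 0).astype(np.uint8) * 255   # imfill equivalent
    return cv2.bitwise_not(filled)    # inversion: defects become black pixels

# parts = split_into_quadrants(preprocessed)        # output of the pre-processing sketch
# binary_parts = [process_part(p) for p in parts]
```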

2.3. Fuzzy Logic Based Classification

The fuzzy logic algorithm is one of the best AI algorithms for resembling human behavior in thinking and decision-making [14,15]. During the inspection of oil tanks, there are three main sources of noise that appear on the surface of the tank: heterogeneity of illumination, the presence of objects, and the presence of sediments or dirt. These factors result in three different levels of noise, as follows:
The first level is low noise, which is caused by the presence of objects on the surface of the tank, such as ladders, trees, or valves at the bottom of the tank, or other objects on the tank’s surface. The second level is medium noise, which is caused by small objects or by shade from asymmetric lighting or small sediments. The third level is high noise, caused by dirt or large sediments.
In the presence of low noise, one can easily distinguish between noise and defects; however, distinguishing between defects and higher-level noise is more complicated. To overcome this problem, the fuzzy logic algorithm is implemented in two cascading stages to distinguish defects from low and medium noise, minimize their impact on the inspection process, and help detect defects, as shown in Figure 12, Figure 13, Figure 14, Figure 15, Figure 16, Figure 17 and Figure 18.

2.3.1. First Stage Fuzzy Logic Algorithm

The first-stage fuzzy logic aims to distinguish between defects and low noise and to remove such noise from the image. Low noise is located mostly at the bottom or the corners of images. This stage of fuzzy logic distinguishes between defects and low noise according to the pixel location in the image. After identifying the position of low noise affecting the inspection, the noise is eliminated.
The x and y coordinates, measured in pixels, are used as the two input sets of the first-stage fuzzy logic inference system, as shown in Figure 13. The image has been divided into four parts, and each part is divided into nine regions based on x and y locations, as shown in Figure 15. Each region has a length and width described along the x and y directions, respectively, as shown in Figure 13. The degree to which these regions are affected by noise varies from place to place: the side regions of the image are considered those most affected by noise, while regions in the middle of the image are less affected.
The first input set x uses three linguistic variables, x = {x1, x2, x3}, where x1, x2, and x3 together span a range of 500 pixels along the x coordinate, as illustrated in Figure 13a. Similarly, the y input set uses three linguistic variables, y = {y1, y2, y3}, where y1, y2, and y3 together span a range of 500 pixels along the y coordinate, as illustrated in Figure 13b. The membership functions of the input sets are shown in Figure 13. The membership shape was chosen to be triangular after conducting some trials.
The classification of the detected pixels into noise or defects is the output set of the first-stage fuzzy logic inference system, as shown in Figure 14. It is called “condition” and has four linguistic variables, Condition = {big defect, low defect, medium noise, low noise}, over a range of 100%, starting from big defect at the low end of the range and ending with low noise at the high end, as shown in Figure 14. The four parts of the image have the same output membership function, as shown in Figure 14.
In the output set of the first stage of fuzzy logic, a condition value above 70% is considered low noise and is eliminated, whereas a condition value below 30% is considered a defect.
Condition values in the range 30–70% cannot be decided at this first stage; these values are classified in the second fuzzy logic stage. The first-stage fuzzy logic rules are formed for each part of the image separately, based on how the x and y input sets are mapped to the output set, as follows. The numbers in Tables 1–4 indicate the output-set linguistic variables: (4) low noise, (3) medium noise, (2) low defect, and (1) big defect.
1- First part rules: illustrated in Table 1.
2- Second part rules: illustrated in Table 2.
3- Third part rules: illustrated in Table 3.
4- Fourth part rules: illustrated in Table 4.
The image after the implementation of the first stage fuzzy logic is shown in Figure 16.
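To make this stage concrete, the following sketch implements a small fuzzy inference for image part 1, using triangular membership functions over the 500-pixel ranges (loosely following Table 5) and the rule matrix of Table 1. The paper does not give the exact membership-function breakpoints, the defuzzification method, or the output centers, so the numeric parameters and the weighted-average defuzzification below are illustrative assumptions only.

```python
import numpy as np

def trimf(v, a, b, c):
    """Triangular membership of v for a triangle with feet a, c and peak b."""
    if v <= a or v >= c:
        return 0.0
    return (v - a) / (b - a) if v <= b else (c - v) / (c - b)

# Assumed input membership breakpoints over 0-500 pixels (peaks near the centers
# of the Table 5 ranges, with overlap between neighbors).
IN_MF = [(0, 83, 250), (83, 250, 417), (250, 417, 500)]   # x1/y1, x2/y2, x3/y3

# Table 1 rule matrix for image part 1: rows X1..X3, columns Y1..Y3.
# Output labels: 1 = big defect, 2 = low defect, 3 = medium noise, 4 = low noise.
RULES_PART1 = [[4, 4, 4],
               [2, 2, 3],
               [1, 2, 3]]

# Assumed representative "condition" values (%) for the output labels, consistent
# with defects lying below 30 % and low noise above 70 %.
OUT_CENTER = {1: 10.0, 2: 35.0, 3: 60.0, 4: 85.0}

def condition(x, y, rules=RULES_PART1):
    """First-stage condition (%) for a candidate pixel at (x, y) within one part."""
    mu_x = [trimf(x, *p) for p in IN_MF]
    mu_y = [trimf(y, *p) for p in IN_MF]
    num = den = 0.0
    for i in range(3):                 # x1..x3
        for j in range(3):             # y1..y3
            w = min(mu_x[i], mu_y[j])  # rule firing strength (AND = min)
            num += w * OUT_CENTER[rules[i][j]]
            den += w
    return num / den if den else 0.0

# Decision: >70 % -> low noise (removed); <30 % -> defect; 30-70 % -> passed to stage 2.
```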

2.3.2. Second Stage Fuzzy Logic Algorithm

In the second stage of the fuzzy logic inference system, the crisp output values of the first-stage fuzzy logic that lie in the range 30–70% are used as an input set of the second fuzzy logic stage. The regions surrounding the central region of the tank are the most difficult ones in which to distinguish between low defects and medium noise. The second fuzzy logic stage is intended to detect the intermediate noise produced by shade, which can be found at the bottom, top, left, or right of the tank. This stage is applied to the four parts separately.
The input set of the second-stage fuzzy logic consists of the outputs of the first stage that have values between 30% and 70%. The four parts of the image have the same input membership function, as shown in Figure 17. The second-stage fuzzy logic classification also depends on the pixel density in the regions that the first stage could not classify.
Table 5 shows the ranges of the linguistic variables for the input sets X and Y.
According to the concentration of black pixels in these regions, the classification decision is made as follows:
A count of black pixels between 0 and 2000 is very small and can be considered a defect. If the number of black pixels is between 1600 and 8000, it can be considered medium noise caused by a shadow, because the area covered by a shadow looks more homogeneous; when the edge detection filter is applied to such a shadow area, it shows only the edges of the shadow, which is larger and contains a higher number of black pixels than a defect. If the number of black pixels is higher than 6000, it can still be considered a defect, but an impure one. From experience, the largest number of black pixels that can be considered a defect is half of the inspected image area; e.g., if the image area is 2 MP, then the number of black (defect) pixels will not exceed 1 MP, half of the image size.
To calculate the pixel density percentage, Equation (1) is used:
Pd (%) = Ntbp / Himage    (1)
where Ntbp is the number of true black pixels in the inspected image and Himage is half the area of the inspected image.
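A minimal sketch of this calculation, assuming the binary image uses the convention from the inversion step above (defects are black, i.e., pixel value 0):

```python
import numpy as np

def pixel_density(binary_img):
    """Pd = Ntbp / Himage: black-pixel count divided by half the image area."""
    n_tbp = int(np.count_nonzero(binary_img == 0))   # true black (defect) pixels
    h_image = binary_img.size / 2.0                   # half the inspected image area
    return n_tbp / h_image                            # density in [0, 1], used as input Y2
```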
The second fuzzy logic stage has two input sets: the first is the output of the first fuzzy logic stage, expressed by the variable X2, while the second is the black pixel density, expressed by the variable Y2.
X2 is the first input set, with three linguistic variables X2 = {x12, x22, x32}, where x12, x22, and x32 are ranges of the first-stage output located between 30% and 70%, as shown in Figure 17a, with equal ranges in all image parts.
Y2 is the second input set, with three linguistic variables Y2 = {y12, y22, y32}, where y12, y22, and y32 are ranges of the black pixel density between 0 and 1, as shown in Figure 17b.
The output set of the second fuzzy logic stage classifies the detected object into medium noise_2 (non-defect), big defect_2, or low defect_2. It is called condition2 and has three linguistic variables, Condition2 = {big defect_2, low defect_2, medium noise_2}, over a range of 100%, starting with big defect at the low end of the range and ending with medium noise at the maximum, as shown in Figure 17c.
As the four parts of the image have the same input and output membership functions in the second FL stage, only one part is represented in Figure 17. Table 6 shows the relationship between the input sets and the output used to build the rules of the classification process.
The rules of the second fuzzy logic stage for the first, second, third, and fourth parts are formed as shown in Table 7.
The second stage of the fuzzy logic classification depends on two important factors: the first is the location of the pixels whose first-stage output fell within the 30–70% range; the second is the density of those pixels in the output of the first fuzzy logic stage. The second stage of fuzzy logic is implemented to distinguish medium noise from defects; regions whose output value is greater than 70% are then eliminated from the output images.
Output values of less than 70% are classified as big defects or low defects, as shown in Figure 17. Thus, there is a need for thresholding or another fuzzy logic stage. Figure 18 shows the output of the second fuzzy logic stage.
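A minimal sketch of the second stage is given below (the trimf helper is the same as in the first-stage sketch). It combines the first-stage condition (X2, 30–70%, with breakpoints following the ranges in Table 6) and the black pixel density (Y2, 0–1) through the rule matrix of Table 7. The Y2 breakpoints and the output centers are assumed values, since the paper only gives the overall ranges, and the defuzzification method is again an assumption.

```python
def trimf(v, a, b, c):
    """Triangular membership of v (same helper as in the first-stage sketch)."""
    if v <= a or v >= c:
        return 0.0
    return (v - a) / (b - a) if v <= b else (c - v) / (c - b)

# X2 breakpoints follow the Table 6 ranges; Y2 breakpoints are illustrative assumptions.
X2_MF = [(30, 40, 50), (40, 50, 60), (50, 60, 70)]             # x12, x22, x32
Y2_MF = [(0.0, 0.1, 0.3), (0.2, 0.5, 0.8), (0.7, 0.9, 1.0)]     # y12, y22, y32

# Table 7 rule matrix: rows X12..X32, columns Y12..Y32.
# Output labels: 1 = big defect_2, 2 = low defect_2, 3 = medium noise_2.
RULES_STAGE2 = [[1, 2, 1],
                [1, 3, 1],
                [2, 3, 3]]

# Assumed representative condition2 values (%): defects below 70 %, medium noise above.
OUT2_CENTER = {1: 20.0, 2: 50.0, 3: 85.0}

def condition2(x2, y2):
    """Second-stage condition (%) from first-stage output x2 and pixel density y2."""
    mu_x = [trimf(x2, *p) for p in X2_MF]
    mu_y = [trimf(y2, *p) for p in Y2_MF]
    num = den = 0.0
    for i in range(3):
        for j in range(3):
            w = min(mu_x[i], mu_y[j])
            num += w * OUT2_CENTER[RULES_STAGE2[i][j]]
            den += w
    return num / den if den else 0.0

# Regions with condition2 > 70 % are treated as medium noise and removed;
# the rest are kept as (big or low) defects for the thresholding stage.
```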

2.4. Thresholding Process

After the second stage of fuzzy logic is implemented, the four parts of the image are collated back into one image to prepare it for a new stage of processing. The collated image is the input to the third stage of processing (the thresholding process), as shown in Figure 19.
The image is then divided into 100 equidimensional cells, as shown in Table 8. The cells where the high noise is concentrated are located at the top and bottom of the image, as shown in Figure 19. The group of cells located at the top and bottom of the image is listed in Table 8.
Based on the experimental values, the threshold value in these cells was estimated at 70% of the total black pixels resulting from the four parts of the image after the second FL stage (3000–3500 pixels in this case). The flow chart of the threshold processing is illustrated in Figure 20.
This stage of the classification process depends on the pixel density within the cells. All cells in which the number of black pixels is greater than the threshold value are deleted. Figure 21 shows the final image after the thresholding process. Figure 22 shows the original image and the stages it went through during processing.
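A minimal sketch of this stage, assuming a 1000 × 1000-pixel collated binary image (defects black) divided into a 10 × 10 grid of 100 × 100-pixel cells. The per-cell thresholds of 3500 black pixels for the top cells and 3000 for the bottom cells follow the values quoted in Section 3.1; for brevity the sketch clears only the outermost top and bottom rows, whereas the full cell set processed in the paper is the one listed in Table 8.

```python
import numpy as np

def threshold_stage(binary_img, top_thresh=3500, bottom_thresh=3000, grid=10):
    """Clear top/bottom-row cells whose black-pixel count exceeds the threshold."""
    out = binary_img.copy()
    ch, cw = out.shape[0] // grid, out.shape[1] // grid       # cell height and width
    for row, thresh in ((0, top_thresh), (grid - 1, bottom_thresh)):
        for col in range(grid):
            cell = out[row * ch:(row + 1) * ch, col * cw:(col + 1) * cw]
            if np.count_nonzero(cell == 0) > thresh:          # too many black pixels
                cell[:] = 255                                 # classified as high noise: removed
    return out
```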

3. Experimental Results and Discussion

To validate the proposed inspection system, many experiments (as depicted in Figure 23) were conducted on real oil tanks with various parameters and conditions to evaluate the proposed algorithm.

3.1. Experimental Results

To check the performance of the inspection system, it must be tested on a wide range of samples with different parameters, with several tests run under different conditions. Hence, several tests of the inspection system were conducted at different hours of the day, and the system was tested on tanks of different colors, shapes, and sizes. The worst inspection cases involved those parameters that reveal the limitations of the proposed algorithms.
Although several experiments were performed before reaching a decision, only the significant and important cases are discussed here. The samples presented in this sub-section have the following features: one contains pure defects and low noise, which is one of the easiest inspection cases for the system to detect. Another contains low, medium, and high noise levels in different regions with different levels of defects, which is one of the worst inspection cases because it is difficult for the system to detect; this case therefore received more attention during the inspection.
As shown in Figure 24, the results indicate that there is no noise in the middle region of the tank in any image, while some defects may exist without any noise. As shown in Figure 24b,d–f, the lower regions are the lowest-noise regions, and in Figure 24a,c the lower areas and the tank corners are the lowest-noise areas. The three noise levels may not all appear together in one sample in real systems.
The image processing algorithm described in Section 2.2 was implemented to detect the surface defects of the six types of oil tanks in Figure 23. The image processing results for all samples are shown in Figure 24.
As mentioned above, the image processing algorithm cannot eliminate all noise from an image and thus cannot, by itself, make a definitive decision on whether the tank has defects or only noise. The noise affecting the classification process has been divided into three levels, namely, low, medium, and high. This noise cannot be eliminated completely by the image processing algorithm owing to several factors, such as heterogeneity of illumination on the surface of the tank, the presence of objects appearing on the surface of the tank, and the presence of sediments or dirt on the surface of the tank.
It is clear that the peripheral regions, lower regions, and corners of the tanks are strongly affected by noise, as shown in Figure 23. In Figure 24, the results indicate that there is no noise in the middle region of the tank in any image, while some defects may exist without any noise affecting them. As shown in Figure 24, the lower regions are the noisiest regions (low noise), while the tank corners are the least noisy regions. One cannot distinguish between defects and the three levels of noise until the cascading fuzzy logic and thresholding algorithms have been applied to the image.
The fuzzy logic inference system designed in Section 2.3 is applied to remove the noise effects caused by the appearance of the above-mentioned factors on the surface of the tank. The first fuzzy logic stage aims to remove the low noise, as shown in Figure 25. All the outputs of the first stage of fuzzy logic have values between 30 and 70%, meaning that values below 30% can be classified as pure defects, while those above 70% can be classified as low noise. All the samples shown in Figure 23 yield output values between 30 and 70% during the implementation of the first stage of fuzzy logic, as shown in Figure 25. In the first sample, shown in Figure 25a, the first part has output values confined between 40 and 80%, which indicates the presence of low noise and the absence of pure defects, whereas the second, third, and fourth parts have output values confined between 20 and 80%, indicating the presence of pure defects and low noise.
Similarly, in the second sample, shown in Figure 25b, the first part has output values confined between 40 and 65%, which indicates the absence of both low noise and pure defects, while the second, third, and fourth parts have output values confined between 20 and 80%, indicating the presence of pure defects and low noise.
In the third sample, shown in Figure 25c, the first part has output values confined between 40 and 78% (indicating the presence of low noise with no pure defects); the second part has output values confined between 20 and 60% (indicating the presence of pure defects with no low noise), while the third and fourth parts have output values ranging from 20 to 80% (indicating the presence of pure defects and low noise).
In the fourth sample, shown in Figure 25d, the first part has output values confined between 40 and 67% (indicating the absence of both low noise and pure defects), the second part has output values confined between 22 and 60% (indicating the presence of pure defects with no low noise), and the third and fourth parts have output values between 20 and 80% (indicating the presence of pure defects and low noise).
In the fifth sample, shown in Figure 25e, the first part has output values confined between 40 and 80% (indicating the presence of low noise with no pure defects), while the second, third, and fourth parts have output values confined between 20 and 80% (indicating the presence of pure defects and low noise).
In the sixth sample, shown in Figure 25f, the first part has output values confined between 40 and 78% (indicating the presence of low noise with no pure defects), and the second, third, and fourth parts have output values confined between 20 and 80% (indicating the presence of pure defects and low noise).
As shown in Figure 26, the results indicate that some medium noise exists among the first-stage fuzzy logic output values located between 30 and 70%. The regions around the central region of the tank are the most difficult ones in which to differentiate between defects and medium noise.
The second stage of the fuzzy logic inference system, designed in Section 2.3.2, was applied to remove the medium noise effects caused by small dirt, heterogeneous lighting, and the appearance of small objects on the surface of the tank. The second fuzzy logic stage aims to eliminate medium-scale noise, as illustrated in Figure 27. All the outputs of the second fuzzy logic stage have values between 0 and 70%, meaning that values less than 70% can be classified as defects, while values above 70% can be classified as medium noise.
All samples, as shown in Figure 23, have the same range of output values after the second stage of fuzzy logic is executed. In the first sample, shown in Figure 27a, only the fourth part contains output values greater than 70%, which indicates the presence of medium noise. In the second and third samples, shown in Figure 27b,c, the second, third, and fourth parts have output values greater than 70%, which indicates the presence of medium noise. In the fourth sample, shown in Figure 27d, only the fourth part contains output values greater than 70%, indicating the presence of medium noise. In the fifth and sixth samples, shown in Figure 27e,f, the second, third, and fourth parts have output values greater than 70%, indicating the presence of medium noise.
Figure 23 shows the presence of high noise in some images. The results indicate the presence of some high noise caused by large sediment or dirt on the surface of the tank, as shown in Figure 28. The third-stage threshold algorithm aims to remove the high noise that can be located above and below the tank, as described in Section 2.4. As shown in Figure 28, a number of cells show the black pixel density.
The threshold value in the cells located above the tank is estimated at 3500 black pixels, while for those located below the tank it is estimated at 3000 black pixels. Cells in which the black pixel count exceeds the threshold value are classified as high noise, while the others are classified as defects.
In the first, second, third, and fourth samples, shown in Figure 28a–d, there are no cells in which the number of black pixels exceeds the threshold value, indicating the absence of high noise. In the fifth sample, shown in Figure 28e, cell 68 is the only one in which the number of black pixels exceeds the threshold value, indicating the presence of high noise in this cell. In the sixth sample, shown in Figure 28f, there are several cells (2, 3, 12, 13, and 14) in which the number of black pixels exceeds the threshold value, indicating the presence of high noise.

3.2. Results Discussion and Evaluation

To measure the reliability and accuracy of the proposed UAV-based inspection process, the experiments for all the above cases were repeated 30 times for each tank, giving a total of 180 trials. This test shows the limits of the visual inspection system’s capability to detect defects and distinguish them from noise. The final classification was performed by visually viewing the samples and classifying them into pure defective or impure defective samples.
Pure defects are those free of any noise, while impure defects still contain some noise. The study of the samples shown in Figure 23 shows that the first, second, third, fourth, and sixth samples contain pure defects and only the fifth sample contains impure defects. Figure 29 shows the classification results after executing the three stages of processing for each tank over 30 trials, in which the proposed algorithms made the right decision in 83.33%, 86.66%, 80%, 86.66%, 76.66%, and 86.66% of the trials for tanks 1, 2, 3, 4, 5, and 6, respectively. Thus, the successful classification accuracy over all trials in the six tanks is around 83.33%.
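As a quick check of the overall figure, the mean of the six per-tank rates reproduces the reported accuracy:

```python
rates = [83.33, 86.66, 80.0, 86.66, 76.66, 86.66]   # per-tank correct-decision rates (%)
print(sum(rates) / len(rates))                       # ~83.33 % overall accuracy
```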
An average of 26.77% of the trials involved large noise that was wrongly classified, even though most of the noise was removed. The reasons behind the incorrect classification results are as follows: some of the noise from the first stage of processing remained, but was wrongly classified as less than 30% or higher than 70% and therefore did not enter the second stage.
Some failures appear as a result of the uncontrolled and random cropping process. To overcome this, the input image must be cropped carefully, making it more proportional and compatible with the specified restrictions; it is therefore necessary to crop the image precisely during this step.
The fuzzy logic inference system designed in Section 2.3 was applied to remove these noise effects, with the first fuzzy logic stage aimed at eliminating the low-scale noise, as depicted in Figure 25.

4. Conclusions

This study has contributed to research on the automatic inspection and defect detection of oil and gas tanks. The method uses a drone capable of moving in all directions to ensure safe movement between tanks. It also includes a high-resolution camera equipped with Wi-Fi technology, fixed at the front of the UAV, which can be rotated by means of a control device to make it perpendicular to the tank so that the captured image is accurate. An image processing algorithm was developed with appropriate filters to extract the features of the inspected objects, such as cracks, defects, and the edges of objects on the samples, but the results were still affected by several levels of noise.
Three levels of noise were eliminated using three stages of processing: two stages using the cascading fuzzy logic algorithm and a third stage using the thresholding algorithm. The cascading fuzzy logic algorithm was implemented in two stages to distinguish the low and medium noise from the defects. The first and second stages eliminated the low and medium noise levels, respectively, while the third stage eliminated the high noise level. Low noise was identified from the output of the first fuzzy logic stage with condition values (first-stage output set) above 70%, while output values with condition values below 30% were classified as pure defects. Medium noise was identified from the output of the second fuzzy logic stage with condition (second-stage output set) values greater than 70%. High noise was determined from the thresholding stage output, where cells in the upper part of the image were classified as high noise if their density was greater than 70% of the total black pixels, and cells in the lower part of the image were classified as high noise if their density was more than 75% black pixels. The samples were then categorized, based on the output of the third-stage thresholding process, into defective or non-defective samples.
The results illustrate that the proposed inspection system is able to detect defects on several types of oil tanks. The system was tested on 180 samples, and the results showed its effectiveness in the inspection and detection of defects, with an accuracy of about 83%.

Author Contributions

Conceptualization, M.A.H.A., N.A.-D.N.A., A.A.H., M.H.G.A., T.S.M. and A.N.A.; Data curation, M.A.H.A.; Formal analysis, M.B. and M.A.H.A.; Funding acquisition, M.A.H.A., R.A. and Y.N.; Investigation, M.A.H.A.; Methodology, M.A.H.A., N.A.-D.N.A., A.A.H., M.H.G.A., T.S.M. and A.N.A.; Project administration, M.A.H.A.; Resources, M.A.H.A.; Software, M.A.H.A., N.A.-D.N.A., A.A.H., M.H.G.A., T.S.M. and A.N.A.; Supervision, M.B. and M.A.H.A.; Validation, M.A.H.A., N.A.-D.N.A., A.A.H., M.H.G.A., T.S.M. and A.N.A.; Visualization, M.A.H.A.; Writing—original draft, M.B., M.A.H.A., N.A.-D.N.A., A.A.H., M.H.G.A., T.S.M., A.N.A., R.A. and Y.N.; Writing—review and editing, M.A.H.A., J.R. and S.T. All authors have read and agreed to the published version of the manuscript.

Funding

This work is supported by Universiti Malaya (UM) and the Ministry of Transportation (MOT) through Private-Research University grants PV045-22 and GPF020A-2023.

Data Availability Statement

Data sharing not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Roberge, P.R. Handbook of Corrosion Engineering; McGraw-Hill: New York, NY, USA, 2000.
2. Alum, M.; Eze, T. The New Faces of Corrosion and Damage Detection in Oil and Gas Facilities: A Brief of What Has Worked So Far and How It Can Work for You. In Proceedings of the SPE Nigeria Annual International Conference and Exhibition, Virtual, 11–13 August 2020.
3. Uhlig, H.H. The cost of corrosion in the United States. Chem. Eng. News 1946, 27, 2764.
4. Benjamin, A.; Cunha, D.; Campello, G.C.; Roveri; Silva, R.; Guerreiro, J.N.C. Fatigue Life Assessment of a Drilling Riser Containing Corrosion Pits. In Proceedings of the Offshore Technology Conference (OTC), NACE International Oil and Gas Production, Houston, TX, USA, 5–8 May 2008.
5. Champion Technologies. Corrosion Mitigation for Complex Environments; Champion Technologies: Houston, TX, USA, 2012.
6. Tuttle, R.N. Corrosion in oil and gas production. J. Pet. Technol. 1987, 39, 756–762.
7. Felsch, T.; Strauss, G.; Perez, C.; Rego, J.M.; Maurtua, I.; Susperregi, L.; Rodríguez, J.R. Robotized Inspection of Vertical Structures of a Solar Power Plant Using NDT Techniques. Robotics 2015, 4, 103–119.
8. Berns, K.; Hillenbrand, C.; Luksch, T. Climbing robots for commercial applications—a survey. In Proceedings of the 6th International Conference on Climbing and Walking Robots (CLAWAR), London, UK, 17 September 2003; pp. 17–19.
9. Moghaddam, A.F.; Lange, M.; Mirmotahari, O.; Hovin, M. Novel mobile climbing robot agent for offshore platforms. World Acad. Sci. Eng. Technol. 2012, 68, 1353–1359.
10. Kim, S.; Spenko, M.; Trujillo, S.; Heyneman, B.; Santos, D.; Cutkosky, M.R. Smooth vertical surface climbing with directional adhesion. IEEE Trans. Robot. 2008, 24, 65–74.
11. Cleavinger, K.W.J. Flare system inspections for OLEFINS facilities. In Proceedings of the AIChE 2012 Spring National Meeting, Houston, TX, USA, 2–5 April 2012.
12. Cohen, M. 7 Facts That Make the Oil and Gas Asset Inspections Risky and Costly. 17 May 2017. Available online: https://info.qii.ai/blog/7-facts-that-make-the-oil-and-gas-inspections-risky-and-costly (accessed on 11 February 2023).
13. Shukla, A.; Karki, H. Application of robotics in onshore oil and gas industry—A review Part I. Robot. Auton. Syst. 2016, 75, 490–507.
14. Ali, M.A.H.; Lun, A.K. A cascading fuzzy logic with image processing algorithm-based defect detection for automatic visual inspection of industrial cylindrical object’s surface. Int. J. Adv. Manuf. Technol. 2019, 102, 81–94.
15. Ali, M.A.H.; Alshameri, M.A. An intelligent adjustable spanner for automated engagement with multi-diameter bolts/nuts during tightening/loosening process using vision system and fuzzy logic. Int. J. Adv. Manuf. Technol. 2019, 101, 2795–2813.
Figure 1. Visual inspection stages for oil tanks.
Figure 2. Mavic Pro UAV.
Figure 3. Image pre-processing operations.
Figure 4. Image cropping and resizing: (a) original image of the whole tank captured by the UAV; (b) the image after implementation of the cropping and resizing operations; (c) the image after conversion from RGB to gray; (d) the image after adjustment.
Figure 5. The image after division into four equal parts: (a) pre-processed image part 1; (b) pre-processed image part 2; (c) pre-processed image part 3; (d) pre-processed image part 4.
Figure 6. Image after applying Gaussian filtration.
Figure 7. Image after applying Prewitt edge detection.
Figure 8. Image after applying dilation.
Figure 9. Image after applying bwareaopen.
Figure 10. Image after applying the filling process.
Figure 11. Image after applying the inversion process.
Figure 12. Flow chart for the cascading fuzzy logic process.
Figure 13. The input sets of the first-stage fuzzy logic: (a) input set (x); (b) input set (y).
Figure 14. Output set conditions.
Figure 15. Regions of the input sets in the first-stage fuzzy logic.
Figure 16. Image after applying FL1.
Figure 17. Input and output sets of the second FL stage: (a) first input set; (b) second input set; (c) output set (condition).
Figure 18. Image after applying FL2.
Figure 19. Input image to the third stage of processing (threshold process).
Figure 20. Flow chart for the third stage of processing (threshold process).
Figure 21. Final image after the thresholding process.
Figure 22. Results of all processing stages.
Figure 23. Samples of oil tanks used for the experiments.
Figure 24. Samples after the image processing stage.
Figure 25. Evaluation of the first stage of fuzzy logic before eliminating low noise.
Figure 26. Evaluation of the first stage of fuzzy logic after eliminating low noise.
Figure 27. Evaluation of the second stage of fuzzy logic before eliminating medium noise.
Figure 28. Pixel density in cells processed using the thresholding process.
Figure 29. The results of classifying samples of tanks 1–6 (1 indicates a right decision on defect detection and 0 indicates a false decision on defect detection).
Table 1. Fuzzy rules in image part 1 in the first FL stage.
        Y3   Y2   Y1
X3       3    2    1
X2       3    2    2
X1       4    4    4

Table 2. Fuzzy rules in image part 2 in the first FL stage.
        Y3   Y2   Y1
X3       4    3    3
X2       3    2    2
X1       4    2    1

Table 3. Fuzzy rules in image part 3 in the first FL stage.
        Y3   Y2   Y1
X3       3    3    4
X2       2    2    3
X1       1    2    3

Table 4. Fuzzy rules in image part 4 in the first FL stage.
        Y3   Y2   Y1
X3       1    2    3
X2       2    2    3
X1       4    4    4

Table 5. Range variables.
Range x1, y1: [1 166]
Range x2, y2: [166 334]
Range x3, y3: [334 500]

Table 6. Rules of the second-stage FL.
Input sets of second stage (X)   Output
                                 Less than 2000 pixels   1600–8000 pixels   More than 6000 pixels
x1 (30–50)%                      Big defect_2            Low defect_2       Big defect_2
x2 (40–60)%                      Big defect_2            Medium noise_2     Big defect_2
x3 (50–70)%                      Low defect_2            Medium noise_2     Big defect_2

Table 7. Fuzzy rules in image parts 1, 2, 3, and 4 in the second FL stage.
        Y12   Y22   Y32
X12       1     2     1
X22       1     3     1
X32       2     3     3
Where 1 is “big defect”, 2 is “low defect”, and 3 is “medium noise” in the output sets.

Table 8. The group of cells processed in the threshold process.
  1    2    3    4    5    6    7    8    9   10
 11   12   13   14   15   16   17   18   19   20
 21   22   23                       28   29   30
 61   62   63                       68   69   70
 71   72   73   74   75   76   77   78   79   80
 81   82   83   84   85   86   87   88   89   90
 91   92   93   94   95   96   97   98   99  100

