Indoor Stockpile Reconstruction Using Drone-Borne Actuated Single-Point LiDARs

by Ahmad Alsayed 1,2 and Mostafa R. A. Nabawy 1,3,*
1 Department of Mechanical, Aerospace and Civil Engineering, The University of Manchester, Manchester M13 9PL, UK
2 Department of Mechanical Engineering, Umm Al-Qura University, Makkah 5555, Saudi Arabia
3 Aerospace Engineering Department, Faculty of Engineering, Cairo University, Giza 12613, Egypt
* Author to whom correspondence should be addressed.
Drones 2022, 6(12), 386; https://doi.org/10.3390/drones6120386
Submission received: 10 November 2022 / Revised: 24 November 2022 / Accepted: 25 November 2022 / Published: 29 November 2022

Abstract
A low-cost, yet accurate approach for stockpile volume estimation within confined storage spaces is presented. The novel approach relies on actuating a single-point light detection and ranging (1D LiDAR) sensor using a micro servo motor onboard a drone. The collected LiDAR ranges are converted to a point cloud that allows reconstructing the stockpile in 3D, hence calculating the volume under the reconstructed surface. The proposed approach was assessed via simulations of a wide range of mission operating conditions while mapping two different stockpile shapes within the Webots robotic simulator. The influences of modulating the drone flight trajectory, servo motion waveform, flight speed, and yawing speed on the mapping performance were all investigated. For simple rectangular trajectories, it was found that longer trajectories that are adjacent to the storage walls provide the best reconstruction results with reasonable energy consumption. On the other hand, for short rectangular trajectories within the storage middle space, the yawing speed at corners must be decreased to ensure good reconstruction quality, although this can lead to relatively high energy consumption. Comparing the volumetric error values, the average error from the proposed 1D LiDAR system, when operating at 6°·s−1 maximum yawing speed at the corners, was 0.8 ± 1.1%, as opposed to 1.8 ± 1.7% and 0.9 ± 1.0% from the 2D and 3D LiDAR options, respectively. Moreover, compared to 2D and 3D LiDARs, the proposed system requires a lower scanning rate for data acquisition, is much lighter, and allows a substantial reduction in cost.

1. Introduction

Stockpile storage facilities are essential to the continuous operation of any manufacturing plant, yet they are among the most challenging confined spaces for robotic inspection. In fact, such spaces typically suffer from poor lighting and visibility conditions, lack of global positioning, uneven and slippery terrains, and sensor interference [1,2]. Drone missions have matured in the last few decades, enabling significant advantages over other mobile robotic platforms in tasks such as underground mine inspections [3], search and rescue missions [4], and 3D map reconstructions [5]. In particular, employing drones has become a more effective solution for missions within geometrically constrained environments, i.e., confined spaces. As such, their deployment for stockpile volume estimation missions within confined storage spaces is expected to enable more efficient and safer outcomes when compared to current field survey procedures. In fact, the "drones" solution becomes very appealing when considering the drawbacks of current field survey procedures, including inaccurate volume estimates, time consumption, and exposure of surveying crews to hazardous conditions, especially when measurements are taken over unstable stockpiles or in dark and dusty environments [6,7].
Drones or, generally, unmanned air vehicles (UAVs) vary substantially in size, control approach, and configuration. Size-wise, drones have been developed at a wide range of scales, ranging from platforms the size of the Global Hawk, through mini-UAV systems (e.g., [8,9,10]) and micro air vehicles (e.g., [11,12,13]), to, most recently, the so-called nano and pico air vehicles (e.g., [14,15,16,17]). Drones have also been ideal platforms for demonstrating novel control approaches that are hard to test on manned air vehicles, such as morphing [18,19,20], fluidic control [21,22,23,24], and even novel control allocation methods via conventional control surfaces [25,26,27]. Adding to this, the development of bioinspired flapping wing configurations has attracted tremendous attention in the last two decades [29,30,31,32]. Configuration-wise, Hassanalian and Abdelkefi [28] categorized drones in terms of their mission capability into HTOL (horizontal take-off and landing), VTOL (vertical take-off and landing), hybrid models (such as tilt-wing, tilt-rotor, tilt-body, and ducted fan configurations), helicopter, heli-wing, and unconventional types. That said, Shahmoradi et al. [33] illustrated that fixed-wing and rotary-wing drone configurations are the most common within research and commercial applications for 3D mapping missions. In fact, fixed-wing drones, classified within the HTOL category, are the preferred option for surveying large areas in short periods of time. On the other hand, rotary-wing (particularly multirotor) drones, classified within the VTOL category, offer the ability to hover as well as fly in all directions, making them ideal for indoor missions.
Aerial surveying and mapping missions have commonly used photogrammetry or LiDAR sensors [34,35,36] within a wide range of applications, such as forestry [37], agriculture [38], and geoengineering [39]. Basically, in photogrammetry, 3D models are created from large numbers of captured 2D images [40], whereas LiDAR sensors measure distances to objects using laser pulses [41,42]. Clearly, aerial missions can deliver advantages to mapping applications, and in what follows, the focus of the discussion will be on studies that have used drones to map and estimate the volume of stockpiles, starting with studies that demonstrated such an application in outdoor areas [43,44,45,46]. Ajayi and Ajulo [43] estimated a stockpile volume using both ground terrestrial laser scanning (TLS) and drone photogrammetry approaches. Their results showed that the percentage differences of the two methods relative to the actual volume were 2.94% and −2.31%, respectively. While the absolute errors of both approaches were close, drone photogrammetry was more time efficient compared to TLS. A similar study by Son et al. [44] also compared TLS and drone photogrammetry for estimating the volume of waste stockpiles. The waste volume computed using the TLS approach was 41,226 m3, whereas drone photogrammetry led to volumes ranging between 41,256 m3 and 43,042 m3, i.e., 0.1% to 4.4% more volume compared to the TLS method. That said, the TLS approach required placing ground control points (GCPs) at the waste heights, which is usually a difficult task that can be unsafe at times. Rohizan et al. [45] estimated the volume of a truncated cone stockpile using drone photogrammetry and compared it to a "conventional" method. The conventional method required manual measurements of the dimensions of the truncated cone stockpile, including the perimeters of the base and top and the length of the slant. Their results showed a 2.5% difference between the two methods. As a final example, Kaamin et al. [46] demonstrated the use of drones for capturing the volumetric change of a landfill stockpile over a two-month period. However, an assessment of the accuracy of such estimation was not possible due to the absence of information on the actual volume of the landfill. While the studies discussed above confirmed the feasibility of deploying drones for stockpile volume estimation missions, all of the flight missions were conducted outdoors in open areas. Notably, when stockpiles are placed in dark and confined spaces, photogrammetry becomes infeasible. This is besides the additional challenges introduced to the mission due to operation in confined spaces, including drone localization [47], navigation within GPS-denied environments [48], obstacle avoidance [49], and coverage path planning for unknown environments [50].
Several studies have demonstrated the use of drones for indoor missions, such as 3D mapping of gas distribution [51], sewer system inspection [52], fire and smoke detection [53], and underground mine inspection and mapping [3,54]. For a comprehensive discussion of such aerial missions within confined spaces, the reader is referred to the work by Alsayed et al. [1]. Despite the several demonstrations of drones for different applications within confined spaces, their use for indoor stockpile volume estimation missions remains very limited. In fact, in a previous study [1], the feasibility of such missions was demonstrated, and LiDAR sensors were shown to be the most adequate choice. In May 2022, a company named "Flyability", specialising in aerial confined-space inspection, announced their new generation of drones capable of mapping stockpiles within dusty and dark confined spaces. This capability mainly stems from upgrading the drone platform with a 3D LiDAR scanner [55]. However, the new platform has lower flight endurance, compared to their previous platform, due to the extra weight and the data processing requirements of the newly added 3D LiDAR. Notably, there have also been some non-aerial approaches to stockpile volume estimation within confined spaces. For example, Manish et al. [56] presented a mapping system called stockpile monitoring and reporting technology (SMART) for indoor applications. The system consisted of two 3D LiDAR sensors and one RGB camera mounted on the same extended tripod. A stockpile volume was estimated with 1% volumetric error when compared to a reference volume estimated from TLS. However, this approach required manually moving and rotating the tripod to several locations around the stockpile. Another example is the scanning system proposed by Xu et al. [57], who used two 2D LiDAR scanners and a range finder sliding on a rail mounted at the top of the storage space. The system showed less than 1% volumetric error when compared to results from a Geo-SLAM scanner. However, its drawbacks include being expensive in terms of infrastructure costs and the limitation of its deployment to a single storage space, as a new system will be needed for every new stockpile storage space.
It is evident, from the previous examples, that multi-dimensional laser scanning technology (including LiDAR technology) is the most common surveying and mapping tool used within aerial missions in confined spaces. In fact, LiDAR sensors are better at penetrating smoke, darkness, and dust than visible light sensors such as cameras [58,59,60]. 3D LiDARs are the most widely demonstrated sensors within 3D scanning missions, including indoor stockpile volume estimation [56], terrain surveying [61], plant volume prediction [62], documentation of historical monuments [63], and automatic driving [64]. However, 3D LiDARs are generally more expensive and heavier compared to their 1D and 2D counterparts. This motivated several studies to investigate using cheaper and lighter 2D LiDAR options to generate reconstructed 3D maps, e.g., [65,66,67,68,69,70,71,72]. That said, and while able to provide accurate estimates, 2D LiDARs still remain much more expensive and heavier when compared to 1D LiDARs.
The motivation behind this work is to better exploit 1D LiDARs and enhance the accuracy of their 3D reconstructed maps to levels similar to those achieved by 2D/3D LiDARs, while staying cost competitive. We propose a simple, yet effective approach to achieve our objective: actuating a single-point LiDAR (1D LiDAR) fitted on a small drone in an oscillatory motion, enabling it to capture more data points and hence improving the reconstructed 3D maps. The approach is demonstrated through stockpile volume estimation aerial drone missions within fully confined storage spaces. The proposed solution was evaluated within the Webots robot simulation toolset. The impact of changing the operating conditions on the mapping and volume reconstruction quality was thoroughly investigated, and results obtained using the proposed actuated 1D LiDAR were compared to those obtained from 2D/3D LiDAR scanners. We show that, for applications (such as stockpile reconstruction, underground mine drift mapping, and greenhouse plant growth detection) that need strict cost control and involve mapping only, rather than simultaneous localization and mapping (SLAM) [73], the proposed actuated 1D LiDAR is a convenient solution. The rest of the paper is organized as follows: In Section 2, the models needed for reconstruction are detailed, including a description of the proposed system, waveforms adopted for actuation, point cloud reconstruction, and a proposed metric to assess the reconstruction quality. Section 3 describes the simulation setup, explaining the drone platform and control system, the desired trajectories, and the simulation operational parameters. Comprehensive assessments and discussions of the simulation results are presented in Section 4, and the concluding remarks and future work are finally presented in Section 5.

2. Reconstruction Modelling

2.1. System Overview

A single-point LiDAR sensor, or laser range finder, operates on the time-of-flight principle. Briefly, a narrow 1D laser beam of an appropriate wavelength is emitted; the beam then hits an object in the scene and is reflected back to the sensor. Knowing the time the beam takes to return, combined with the speed of light in the medium, a range measurement can be obtained. Our proposed solution to enhance the reconstruction of 3D maps from a 1D LiDAR sensor is to actuate it about one axis using a micro servo motor fitted on a small drone, as illustrated in Figure 1. The servo motor shaft oscillates within ±90° in a plane perpendicular to the drone's forward motion. Of note is that, in what follows, the word "drone" will be used to refer to a "quadcopter" air vehicle.

2.2. Servo Motor Motion Profile

Assuming the servo motor oscillates at its maximum speed, its angular velocity follows a rectangular waveform and the corresponding angular position follows a triangular waveform, as shown in Figure 2. However, this arrangement may not be practical as a real motor cannot reverse its direction of rotation instantaneously when it reaches the maximum angular positions at +π/2 and −π/2. As such, to generate more realistic waveforms for the servo motor motion, the following waveform variation was employed for simulations [74,75]:
$$\gamma(t)=\frac{\gamma_{max}}{\sin^{-1}C_{\gamma}}\,\sin^{-1}\!\left(C_{\gamma}\cos\left(2\pi ft\right)\right), \tag{1}$$
$$\dot{\gamma}(t)=-\frac{\gamma_{max}}{\sin^{-1}C_{\gamma}}\cdot\frac{2\pi fC_{\gamma}\sin\left(2\pi ft\right)}{\sqrt{1-C_{\gamma}^{2}\cos^{2}\left(2\pi ft\right)}}, \tag{2}$$
where γ and γ̇ are the servo motor instantaneous angle and angular velocity, respectively; γ_max is the angle amplitude, set here to π/2; C_γ is a parameter that controls the shape of the waveform and can vary between zero (sinusoidal waveform) and unity (triangular angle waveform); f is the frequency; and t is the time. Figure 2 shows how the proposed variation provides a suitable means to control the shape of the waveform for both angular velocity and position.
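For illustration, a minimal Python sketch of Equations (1) and (2) is given below (Python is also the language used for the simulation controllers in Section 3.2). The parameter values chosen here are illustrative assumptions rather than the paper's exact settings; note that C_γ must lie strictly between zero and unity for the expressions to be defined.

```python
import numpy as np

def servo_waveform(t, f=1.0, c_gamma=0.99, gamma_max=np.pi / 2):
    """Servo angle and angular velocity from Equations (1) and (2).

    c_gamma -> 0 recovers a sinusoidal angle waveform, while
    c_gamma -> 1 approaches a triangular angle waveform; the default
    values here are illustrative assumptions only.
    """
    k = gamma_max / np.arcsin(c_gamma)
    phase = 2 * np.pi * f * t
    gamma = k * np.arcsin(c_gamma * np.cos(phase))              # Equation (1)
    gamma_dot = -k * (2 * np.pi * f * c_gamma * np.sin(phase)) \
        / np.sqrt(1.0 - c_gamma**2 * np.cos(phase)**2)          # Equation (2)
    return gamma, gamma_dot

# Example: sample one oscillation period
t = np.linspace(0.0, 1.0, 200)
angle, rate = servo_waveform(t)
```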

2.3. 3D Point Cloud Reconstruction

The ranges collected from the 1D LiDAR along the flight path are converted to the ground coordinate frame, G, as shown in Figure 3. This process starts by expressing each LiDAR range, r, in the 3D LiDAR coordinate frame, L, using the servo motor angle, γ, as follows:
$$P^{L}=\begin{bmatrix}x_{L}\\ y_{L}\\ z_{L}\end{bmatrix}=r\begin{bmatrix}\cos\gamma\\ 0\\ \sin\gamma\end{bmatrix}, \tag{3}$$
where P^L defines a 3D point relative to the coordinate frame L.
Next, each point P^L is converted to its corresponding point P^G in the coordinate frame G to build the 3D point cloud reconstruction, as follows:
$$P^{G}=R_{L}^{G}\,P^{L}+T_{L}^{G}, \tag{4}$$
where the collected points P^G represent the 3D reconstructed map of the scanned environment; and R_L^G and T_L^G are the rotation matrix and translation vector from coordinate frame L to coordinate frame G, respectively. Here, the rotation matrix is obtained as follows:
$$R_{L}^{G}=\begin{bmatrix}\cos\psi_{L} & -\sin\psi_{L} & 0\\ \sin\psi_{L} & \cos\psi_{L} & 0\\ 0 & 0 & 1\end{bmatrix}\begin{bmatrix}\cos\theta_{L} & 0 & \sin\theta_{L}\\ 0 & 1 & 0\\ -\sin\theta_{L} & 0 & \cos\theta_{L}\end{bmatrix}\begin{bmatrix}1 & 0 & 0\\ 0 & \cos\phi_{L} & -\sin\phi_{L}\\ 0 & \sin\phi_{L} & \cos\phi_{L}\end{bmatrix}, \tag{5}$$
where ϕ_L, θ_L, and ψ_L are the LiDAR orientations around its x_L, y_L, and z_L axes, respectively. On the other hand, the translation vector is written as:
$$T_{L}^{G}=\begin{bmatrix}x_{G}\\ y_{G}\\ z_{G}\end{bmatrix}, \tag{6}$$
where x_G, y_G, and z_G are the LiDAR 3D coordinates in the coordinate frame G.
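To make the transformation chain concrete, the following Python sketch maps a single LiDAR range to the ground frame using Equations (3)–(6). It assumes the common Z-Y-X (yaw-pitch-roll) rotation order, consistent with the axis definitions above; the function and variable names are ours, not from the paper's implementation.

```python
import numpy as np

def lidar_point_to_ground(r, gamma, angles, position):
    """Map one LiDAR range to the ground frame G, Equations (3)-(6).

    angles = (phi, theta, psi): LiDAR roll, pitch, yaw in radians;
    position = (x_G, y_G, z_G): LiDAR location in frame G.
    """
    phi, theta, psi = angles
    p_l = r * np.array([np.cos(gamma), 0.0, np.sin(gamma)])   # Equation (3)
    rz = np.array([[np.cos(psi), -np.sin(psi), 0.0],
                   [np.sin(psi),  np.cos(psi), 0.0],
                   [0.0, 0.0, 1.0]])
    ry = np.array([[np.cos(theta), 0.0, np.sin(theta)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(theta), 0.0, np.cos(theta)]])
    rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(phi), -np.sin(phi)],
                   [0.0, np.sin(phi),  np.cos(phi)]])
    r_lg = rz @ ry @ rx                                       # Equation (5)
    t_lg = np.asarray(position, dtype=float)                  # Equation (6)
    return r_lg @ p_l + t_lg                                  # Equation (4)
```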

2.4. Evaluating the Stockpile Volume

The volume of the scanned environment (a stockpile in this study) is estimated by calculating the volume under the generated reconstructed 3D map. First, a 3D surface is fitted over P^G, i.e., the points defined in the coordinate frame G. Because of the simplicity of the actuated 1D LiDAR concept, the reconstructed 3D map of the stockpile is expected to be of low density. Hence, the MATLAB fit function [76] was used with a cubic interpolation scheme to represent the fitted surface. Then, a uniform 2D grid of cell area G_xy was defined across the fitted surface, and the surface height above each cell was defined as H_z, completing the generation of a 3D grid. The total volume of the stockpile was then evaluated as the sum over all grid columns as follows:
$$V_{evaluated}=\sum H_{z}\,G_{xy}. \tag{7}$$
The percentage volumetric error, V_e, between the evaluated and actual volumes was then obtained as follows:
$$V_{e}=\frac{V_{evaluated}-V_{actual}}{V_{actual}}\times 100\%. \tag{8}$$
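A possible implementation of the surface fitting and volume evaluation steps is sketched below. Since the paper's surface fit uses the MATLAB fit function, SciPy's cubic griddata interpolation is used here as an assumed stand-in; grid cells outside the interpolated region are returned as NaN and are ignored in the sum.

```python
import numpy as np
from scipy.interpolate import griddata

def stockpile_volume(points, cell_area=0.1, v_actual=None):
    """Grid the point cloud, fit a surface, and sum the grid columns,
    Equations (7) and (8).

    points: (N, 3) array of reconstructed points P_G.
    cell_area: grid cell area G_xy in m^2 (0.1 m^2 as in Section 4.1).
    """
    step = np.sqrt(cell_area)  # spacing of the square grid cells
    xy, z = points[:, :2], points[:, 2]
    gx = np.arange(xy[:, 0].min(), xy[:, 0].max(), step)
    gy = np.arange(xy[:, 1].min(), xy[:, 1].max(), step)
    grid_x, grid_y = np.meshgrid(gx, gy)
    h_z = griddata(xy, z, (grid_x, grid_y), method="cubic")  # heights H_z
    v_evaluated = np.nansum(h_z) * cell_area                 # Equation (7)
    if v_actual is None:
        return v_evaluated
    v_e = (v_evaluated - v_actual) / v_actual * 100.0        # Equation (8)
    return v_evaluated, v_e
```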

2.5. Reconstruction Ratio

As a further means to evaluate the reconstruction quality of the mapping missions, a metric called the projected reconstructed area ratio, R_a, is proposed here. It is defined as the ratio of the projected area of the generated surface, a_p, to the total area to be mapped, A_t:
$$R_{a}=\frac{a_{p}}{A_{t}}. \tag{9}$$
As such, R_a provides a loose indication of whether the stockpile was partially or fully reconstructed. Of note is that, because the surface of a stockpile is generated across the collected points, the reconstruction quality of that stockpile is limited by how these collected points are distributed within the total area being mapped. For instance, the stockpile example shown in Figure 4 is only partially reconstructed; the white (non-reconstructed) areas arise because no points were collected from these regions.
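Given the gridded height map from the volume-evaluation sketch above, R_a can be computed by counting the grid cells that received an interpolated height, as in the short sketch below (treating NaN cells as non-reconstructed is our assumption, consistent with the white areas in Figure 4).

```python
import numpy as np

def reconstruction_ratio(h_z, cell_area, total_area):
    """Projected reconstructed area ratio R_a, Equation (9).

    h_z: gridded height map from the surface fit; NaN cells received
    no interpolated height and count as non-reconstructed.
    """
    a_p = np.count_nonzero(~np.isnan(h_z)) * cell_area  # projected area a_p
    return a_p / total_area                             # Equation (9)
```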

3. Simulation Setup

3.1. Simulation Environment and Stockpiles Design

The proposed actuated 1D LiDAR system was used for stockpile aerial scanning within a fully confined storage space while onboard a quadcopter. The simulation model used to replicate actual missions was developed using the open-source simulation software Webots (2021a) (Cyberbotics Ltd., Lausanne, Switzerland) [77]. The simulation environment has three main parts: (1) the robotic platform, (2) the storage environment, and (3) the stockpile. The robotic platform (drone) will be discussed in detail in Section 3.2. The storage was realized within the simulation environment using the tools provided within the Webots libraries. The stockpile was then incorporated as an STL file. Of note is that the 3D model of the stockpile was generated by setting random heights within a range between one and seven meters. The locations of the points used to allocate these random heights were distributed uniformly within a 2D plane with the same dimensions as the storage space, see Figure 5a. Then, a 3D surface was fitted over the points, as shown in Figure 5b, and the actual volume under the surface was evaluated in a similar way to that described in Section 2.4. The 3D surface was then converted into an STL 3D model file using the function provided by [78], as shown in Figure 5c. Finally, the 3D model of the stockpile was implemented in the simulation environment, as shown in Figure 5d,e.
In this work, two stockpiles were investigated. The first was developed using random heights while setting the heights of the points along the edges/walls to zero. The second stockpile had no constraints; hence, all of its points were allowed random heights. As a result, the first stockpile had less material volume near the walls, Figure 5d, whereas the second had more material volume by the walls, Figure 5e. This was decided so that the effectiveness of the proposed system could be fairly assessed. Of note is that, typically, most volume estimation errors arise in the regions close to walls, where the 3D map is harder to reconstruct as these regions can only be scanned from one side. While this can be overcome by flying very close to the walls, such flight mission scenarios are clearly unsafe for a drone. In contrast, the areas in the middle of stockpiles are easier to map as they can be scanned from several directions.

3.2. Robotic Platform

A quadcopter drone from the Webots built-in libraries was used for the mission. The drone in the simulation is programmed with an attitude stability control system using data from the onboard inertial measurement unit (IMU) and gyro. The objective of the control system is to provide flight stability by changing the speeds of the four motors using gains to achieve the required altitude, heading, pitching, and rolling references. There are four user inputs: the reference altitude, z_r; rolling gain, ϕ_G; pitching gain, θ_G; and heading gain, ψ_G. While a user can fly the drone using the provided keyboard program, in this paper, this was replaced with an autonomous control system to follow the desired trajectory and speed. This autonomous control system is deemed necessary to allow convenient controllability and, hence, repeatability in implementing the trajectories of the conducted tests. Robots within Webots can be controlled with controllers written in C/C++, Java, Python, or MATLAB. In this work, the quadcopter controller was written in Python.
The proposed actuated 1D LiDAR system was attached to the drone while employing a servo motor and a 1D LiDAR sensor from the simulation libraries. The servo motor is a rotational motor with a position sensor. The motor properties, such as its maximum velocity and minimum and maximum positions, can be adjusted to achieve the motion profiles described in Section 2.2.
The overall block diagram of the drone control system is illustrated in Figure 6. The drone positions (x, y, z), orientations (ϕ, θ, ψ), and angular speeds (ϕ̇, θ̇, ψ̇) are obtained directly from the simulation. The objective of the altitude controller was to minimize the error in the altitude, z, by satisfying the following condition:
$$\lim_{t\to\infty}e_{z}=\lim_{t\to\infty}\left(z_{r}-z\right)=0, \tag{10}$$
where e_z is the altitude tracking error and z_r is the reference altitude input. To achieve this goal, a proportional–integral–derivative (PID) controller was applied to obtain the altitude gain, z_G, as follows:
$$z_{G}=k_{P}e_{z}+k_{I}\int_{0}^{t}e_{z}\,dt+k_{D}\frac{de_{z}}{dt}+B, \tag{11}$$
where k_P, k_I, and k_D are the proportional, integral, and derivative tuning gains, respectively. Moreover, t is the time, and B is a bias component that provides a steady offset.
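A minimal Python sketch of such a PID altitude loop is shown below; the gain values are placeholders for illustration, as the paper does not report its tuned values.

```python
class AltitudePID:
    """PID controller producing the altitude gain z_G of Equation (11);
    gain values are illustrative placeholders, not the paper's tuning."""

    def __init__(self, k_p=3.0, k_i=0.05, k_d=8.0, bias=0.0):
        self.k_p, self.k_i, self.k_d, self.bias = k_p, k_i, k_d, bias
        self.integral = 0.0
        self.prev_error = None

    def update(self, z_ref, z, dt):
        e_z = z_ref - z                    # altitude error, Equation (10)
        self.integral += e_z * dt          # discrete approximation of the integral
        deriv = 0.0 if self.prev_error is None else (e_z - self.prev_error) / dt
        self.prev_error = e_z
        return (self.k_p * e_z + self.k_i * self.integral
                + self.k_d * deriv + self.bias)
```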
The objective of the heading controller was to minimize the error in the heading yaw angle, ψ, by satisfying the following condition:
$$\lim_{t\to\infty}e_{\psi}=\lim_{t\to\infty}\left(\psi_{r}-\psi\right)=0, \tag{12}$$
where e_ψ is the heading angle tracking error and ψ_r is the required angle to the reference trajectory waypoint (x_r, y_r). The angle ψ_r was estimated as follows:
$$\psi_{r}=\begin{cases}\tan^{-1}\!\left(\dfrac{y_{r}-y}{x_{r}-x}\right)+\pi, & x_{r}-x<0\\[6pt] \tan^{-1}\!\left(\dfrac{y_{r}-y}{x_{r}-x}\right)+2\pi, & x_{r}-x>0 \text{ and } y_{r}-y<0\\[6pt] \tan^{-1}\!\left(\dfrac{y_{r}-y}{x_{r}-x}\right), & \text{otherwise}\end{cases} \tag{13}$$
where the range of ψ_r is defined as 0° ≤ ψ_r < 360°. Equation (12) was achieved with a proportional controller that feeds the heading gain, ψ_G, to the drone attitude control, as follows:
$$\psi_{G}=k_{P}e_{\psi}, \qquad -1\le\psi_{G}\le 1, \tag{14}$$
where k_P is the tuning gain.
The drone speed, V, was controlled via the pitching gain, θ_G. To obtain the relationship between the input θ_G and the output V, flight tests with different θ_G values were conducted within the simulation. To ensure smooth yawing, the speed controller should consider both e_ψ and the normal distance, D, to the next reference trajectory waypoint (x_r, y_r). In other words, θ_G was cast to be directly proportional to D and inversely proportional to e_ψ, allowing the definition of the following relationship:
$$\theta_{G}=k\,\frac{D}{e_{\psi}}, \qquad 0\le\theta_{G}\le\theta_{G,max}, \tag{15}$$
where k is a tuning coefficient, and θ_G,max is the pitching gain required to reach the desired flight speed, V. The value of D was calculated as follows:
$$D=\sqrt{\left(x_{r}-x\right)^{2}+\left(y_{r}-y\right)^{2}}. \tag{16}$$
There was no requirement for drone rolling; therefore, the rolling gain, ϕ_G, was set to zero.
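The heading and speed control laws above can be condensed into a few lines of Python, as sketched below. The atan2 function folds the piecewise cases of Equation (13) into a single call, and a small epsilon guards the division by e_ψ in Equation (15); all gain values are illustrative assumptions.

```python
import numpy as np

def heading_and_pitch_gains(pos, target, psi, k_p=0.5, k=0.05,
                            theta_g_max=1.0, eps=1e-3):
    """Heading gain (Equations (13) and (14)) and pitching gain
    (Equations (15) and (16)); gain values are illustrative only.

    pos = (x, y), target = (x_r, y_r), psi = current yaw in radians.
    """
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    psi_r = np.arctan2(dy, dx) % (2 * np.pi)             # Equation (13) via atan2
    e_psi = (psi_r - psi + np.pi) % (2 * np.pi) - np.pi  # wrap error to [-pi, pi)
    psi_g = np.clip(k_p * e_psi, -1.0, 1.0)              # Equation (14)
    d = np.hypot(dx, dy)                                 # Equation (16)
    theta_g = np.clip(k * d / max(abs(e_psi), eps),
                      0.0, theta_g_max)                  # Equation (15)
    return psi_g, theta_g
```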

3.3. Desired Trajectory Waypoints

In the current study, the storage spaces were depicted as rectangular. While this was decided to simplify the investigation, the rectangular shape is indeed the most common configuration in industrial plants. Given the rectangular storage planform, the desired drone trajectory for the mapping mission was also set to follow a rectangular shape. As such, the trajectory was determined by four waypoints distributed close to the storage corners, as shown in Figure 7. The parameters L and W in Figure 7 are the storage length and width, respectively. Moreover, the parameters T_L and T_W were defined as follows:
$$T_{L}=P_{traj}\,L, \tag{17}$$
$$T_{W}=P_{traj}\,W, \tag{18}$$
where P_traj is a percentage that defines how far the trajectory path is offset from the walls. This definition was introduced to enable a preliminary analysis of the best waypoint allocations within a storage space for the mission purpose under consideration, as will be shown later in the results section. Of note is that, depending on the specifications of the LiDAR scanner used, more waypoints may be needed to adequately define the trajectory. This can be the case when the storage dimensions are greater than double the LiDAR range; in such cases, adding more trajectory waypoints prevents unscanned areas in the middle regions.
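A short sketch of the waypoint generation implied by Equations (17) and (18) is given below, assuming the storage planform is centred at the origin (a convention we adopt for illustration; Figure 7 does not fix the origin).

```python
def rectangular_waypoints(length, width, p_traj):
    """Four trajectory waypoints offset from the walls by T_L and T_W,
    Equations (17) and (18). p_traj is a fraction (e.g., 0.05 for 5%);
    the storage is assumed centred at the origin (our convention)."""
    t_l = p_traj * length   # Equation (17)
    t_w = p_traj * width    # Equation (18)
    half_l = length / 2.0 - t_l
    half_w = width / 2.0 - t_w
    return [(half_l, half_w), (-half_l, half_w),
            (-half_l, -half_w), (half_l, -half_w)]
```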

3.4. Simulation Parameters

Table 1 shows the considered simulation parameters. The desired flight speed, V, was set to a low value, following the recommendation in [79] for risky and confined space environments. Of note is that the flight speed was achieved via the pitching gain, θ_G,max, as described in Section 3.2. Figure 8 shows the recorded flight speed, V, versus θ_G,max over the range 0 to 5. This relationship was obtained from preliminary flight tests within the simulation, designed to correlate speed with gain values.
The servo motor waveform frequency, f, was obtained from Equation (2) after setting the desired maximum angular speed to 5 rad·s−1. This servo motor speed was chosen for being a common operating speed, equivalent to around 0.2 s/60° [80]. A generic servo motor was selected from the simulator library, with its maximum absolute angular speed set according to the desired parameters. Similarly, the implemented 1D LiDAR was from the simulator library and fitted to the servo motor. The LiDAR range and rate were based on a common 1D LiDAR sensor, the LIDAR-Lite v3 [81]. Finally, the dimensions of the storage space considered in the mapping missions were based on available information on typical storage sizes within industrial plants [82].
It is worth mentioning that the proposed actuated 1D LiDAR system suffers from a relatively slow oscillation speed due to the limitations of using a low-cost servo motor. This makes the range data collected by the LiDAR dependent on its initial angular position at the start of the mission, which can influence the achieved reconstruction quality. To compensate for this effect, each test was repeated with 20 different initial angular positions for the servo motor, as demonstrated in Figure 9. Then, the mean volumetric error, V̄_e, and standard deviation over the 20 tests were evaluated. Of note is that the 20 initial angular positions are split into 10 positions rotating in the clockwise direction and the same 10 positions rotating in the anti-clockwise direction. This is to account for the fact that the LiDAR ranges can also differ depending on the rotation direction.
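For illustration, the 20 repeated test cases could be enumerated as below; the exact angular spacing of the 10 initial positions is not stated in the paper, so the even spacing here is an assumption.

```python
import numpy as np

# Ten evenly spaced initial servo angles across the +/-90 degree stroke,
# each paired with both rotation directions, giving the 20 repeats per
# test described above (the even spacing is our assumption).
initial_angles = np.linspace(-np.pi / 2, np.pi / 2, 10)
test_cases = [(angle, direction)
              for angle in initial_angles
              for direction in ("clockwise", "anticlockwise")]
assert len(test_cases) == 20
```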

4. Results and Discussion

4.1. Baseline Testcase

The simulations considered as a baseline testcase in this work are based on the values documented in Table 1 and the semi-triangular waveform servo motion profile described in Section 2.2. Here, the system performance was measured based on the volumetric error, V_e, between the evaluated stockpile volume from a mapping mission and the actual volume, Equation (8). Moreover, the two different stockpiles, one with less volume by the walls and the second with more volume by the walls, shown in Figure 5d,e, respectively, were considered for investigation. The 2D grid size, G_xy, in Equation (7) was set to 0.1 m2. This was decided after carrying out a sensitivity analysis showing that using smaller grids has no effect on the accuracy of the obtained results. Figure 10a shows an example of the point cloud, P^G, obtained using the proposed actuated 1D LiDAR scanning for stockpile 1 at P_traj = 20%, whereas Figure 10b shows the fitted surface across the point cloud in Figure 10a. Values of the mean stockpile volumetric error and standard deviation (due to servo motor initial positions) versus the trajectory waypoint allocation percentages are shown in Figure 11. It is worth mentioning that all simulations conducted did not consider the effect of noise from the sensors, which can lead to additional uncertainty in the estimated volumetric error results. This is based on an assessment of the effect of noise, presented in Appendix A, which showed that this effect can be argued to be minor.
Generally, V̄_e increases with P_traj, mainly because as P_traj increases, the total flight distance decreases, as shown in Figure 12. This reduction in flight distance can lead to blind spots, i.e., spots where the LiDAR cannot capture sufficient data. Figure 12 shows the projected scans on the XY-plane, where it can be seen from Figure 12c,d that blind spots (white regions in the figure) are most significant at the maximum P_traj and occur mostly at the corners (recall that the LiDAR scanning plane is perpendicular to the flight line). The reduction in the scanned area (or presence of more blind spots) can be assessed using the proposed projected reconstructed area ratio, R_a, defined in Section 2.5. As P_traj increases from 5% to 30%, the mean value of R_a reduces from ≈100% to 89% for stockpile 1 and from ≈100% to 95% for stockpile 2.
The mean volumetric error values from scanning both stockpiles are almost zero when using the trajectory closest to the walls, i.e., P_traj = 5%. The V̄_e values from scanning stockpile 1 (with less material near the walls) then increase, but at a lower rate compared to those of stockpile 2, until P_traj reaches 20%. Beyond P_traj = 20%, the value of V̄_e for stockpile 1 starts to increase at a rate higher than that of stockpile 2, reaching a value of 6.1% as P_traj reaches 30%. This steeper change is because the material by the walls of stockpile 1 has low height and, hence, becomes more prone to falling within a shadow region as P_traj increases significantly. On the other hand, the rate of increase in V̄_e for stockpile 2 (with more material near the walls) remains consistent across all P_traj values, with V̄_e reaching 4.3% as P_traj reaches 30%.
Of note is that the influence of the initial position of the servo motor becomes more noticeable at higher P_traj values, as can be seen from the larger standard deviation values and, hence, the larger shaded areas for the higher P_traj cases in Figure 11. This is mainly because fewer data points are collected from the storage corners for such P_traj cases (as shown in Figure 12c,d); hence, changing the initial position of the servo motor has a more noticeable influence on the amount of data collected from these regions. As a result, the mean volumetric error was estimated with almost zero uncertainty for P_traj = 5%, but with ±0.5% and ±0.4% uncertainties for P_traj = 30%, for stockpiles 1 and 2, respectively.

4.2. Influence of Varying Operational Parameters

In this section, we assess the influence of varying the operational parameters stated in Table 1. The main aim is to identify whether a considerable improvement in mapping performance can be achieved at other operational conditions. We start by investigating the effect of the servo motor oscillation waveform. The motion waveform in the previous demonstrations was the semi-triangular waveform shown in Figure 2. Here, we further test the system performance when using a sinusoidal waveform, i.e., when C_γ in Equation (1) approaches zero. The frequency, f, of the sinusoidal wave was adjusted to 0.5 Hz to achieve the same desired maximum angular speed as the semi-triangular waveform. The simulation results of scanning both stockpiles, compared to the baseline testcase of Section 4.1, are shown in Figure 13. It is evident that when P_traj ranges from 5% to 20%, both waveforms offer similar performance. For the higher P_traj cases of 25% and 30%, the values of V̄_e from the semi-triangular waveform are slightly lower, by an average of 0.3%. This is mainly because of the higher frequency of the semi-triangular waveform compared to the sinusoidal waveform.
In the baseline testcase simulations, the desired flight speed, V, was always set to 1.0 m·s−1, as shown in Table 1. Now, the influence of changing the flight speed is investigated to assess how changing the time allowed for scanning along a given trajectory influences the reconstruction quality. Figure 14 shows the results of this investigation, where it is evident that flying at different speeds has no clear impact on the volume error for either stockpile. However, it must be stressed that flying indoors at relatively high speeds, such as the 2 m·s−1 in this demonstration, is not recommended from a flight safety perspective. On the other hand, when taking energy budget considerations into account, flying at a lower speed, such as the 0.5 m·s−1 in this demonstration, leads to more energy consumption, as will be shown later in Section 4.3. Hence, on balance, the baseline testcase flight speed of 1 m·s−1 seems the most convenient choice.
A possible way to enhance the scanning coverage and reduce blind spot areas, particularly at corners, is to modify the way the drone turns to a "two-step" turning fashion. This two-step turning, as illustrated in Figure 15, is proposed so that instead of turning to the next waypoint in one 90° step, the drone turns in two steps of 45° each. Figure 16 shows the volumetric error for the baseline testcase and for the two-step turning approach. Generally, the trajectory with two-step turning at the corners improves the volumetric error values but with a different impact on each stockpile. For stockpile 1, the value of V̄_e was only marginally enhanced at the maximum P_traj case, from 6.1% to 5.7%. However, for stockpile 2, the value of V̄_e improved for almost all P_traj cases, as the average value of V̄_e reduced from 1.6% to 0.9%, and the maximum value of V̄_e reduced from 4.3% to 2.3% at the maximum P_traj case.
The previous demonstration of the two-step turning method showed that tuning the flight trajectory/parameters near the corners can have a considerable influence on the reconstruction quality. As such, to enhance reconstruction quality by reducing the blind spot areas seen in Figure 12c,d, we also tested the effect of lowering the drone's maximum yawing speed, ψ̇_max. This was mainly decided to allow more scanning time for the LiDAR at the corners. The maximum yawing speed from the recorded flight test data of the previous tests was 30°·s−1, based on default settings within the Webots simulator. Here, the simulation tests were repeated with maximum yawing speeds of 17, 6, and 1°·s−1, representing a good range of possible options and achieved by consistently reducing the limit value of ψ_G and the gain value of k_P in Equation (14). Figure 17 shows the effect on the evaluated mean volumetric error. It is evident that a significant improvement in both the mean and standard deviation values is achieved as the yawing speed is lowered. In particular, when comparing the results from the 1°·s−1 maximum yawing speed case to those of the baseline yawing speed for stockpile 1, a clear enhancement is evident for the P_traj = 25% and 30% cases, as V̄_e decreased from 2.6% and 6.1% to 1.1% and 2.2%, respectively. Results for stockpile 2 show an even better enhancement, as the influence of lowering the drone's maximum yawing speed to 1°·s−1 is evident across all P_traj cases, with V̄_e showing an almost flat trend and the maximum V̄_e value reduced from 4.3% to 0.6%. These improved results are mainly due to the ability to capture more data, particularly in the corner regions, allowing for better reconstruction and fewer blind spots, as illustrated in Figure 18. That said, and despite the huge improvement, this was achieved at the expense of a significantly higher energy consumption, as will be demonstrated in the next section.
It should be noted that, for stockpile 1, the effects of any enhancement are only evident at high P_traj values. This is because at lower P_traj values, the error values are already excellent; hence, there is little room for improvement. This is not the case for stockpile 2, where enhancements enable improvements over a wider range of P_traj values.

4.3. Energy Consumption

The drone's total energy consumption during a scanning mission was also recorded for each test conducted so far. Here, the total energy consumption is used as an indication of the cost of the mission. Of note is that the energy considered here is for operating the drone and does not include the energy for actuating the servo; however, the servo power is expected to be much lower anyway. Robots with motors running within the Webots simulator have a battery feature that, when enabled, can provide an estimate of the consumed energy. This consumed energy sums the energy consumption from all motors and is obtained by integrating the power consumption over time.
To enable an easy visual comparison, the change in total energy consumption, ΔE_total, is presented with reference to the baseline testcase at a P_traj value of 5% as follows:
$$\Delta E_{total}=\frac{E_{testcase}-E_{baseline,\,P_{traj}=5\%}}{E_{baseline,\,P_{traj}=5\%}}\times 100, \tag{19}$$
where E_testcase is the energy consumption for a specific testcase. Results for this demonstration are plotted versus the trajectory waypoint allocation percentages, P_traj, in Figure 19.
Several interesting insights can be extracted from Figure 19. First, and as expected, the total energy consumption decreases with increasing P_traj, as the flight path gets shorter for the same set of flight parameters. Second, the slope of the variation is mainly influenced by the flight speed, with a steeper decrease for lower flight speeds and a shallower decrease for higher flight speeds. Third, it is evident that reducing the flight speed from V = 1 m·s−1 to V = 0.5 m·s−1 has a substantial effect on ΔE_total, almost doubling the energy consumption, as the drone took longer to complete the same tasks. However, the highest energy consumption among the considered cases is for the 1°·s−1 maximum yawing speed case. Remarkably, a small sacrifice in reconstruction quality by adopting the 6°·s−1 maximum yawing speed case makes a huge difference in energy consumption, leading to a modest average increase in ΔE_total of 22% with respect to the baseline case. For completeness, Figure 19 also shows the energy consumption for the other cases of flight speed, two-step turning, and other maximum yawing speeds; however, these cases are not recommended, either for safety considerations or due to their limited enhancement of the reconstruction quality, as discussed in the previous sections.

4.4. Recommended Operational Parameters

In this section, we summarise the main outcomes from the previous assessments in terms of identifying operational flight parameters that lead to the most convenient scanning results. For the rectangular trajectories considered in this study, it is evident that flight trajectories closest to the walls lead to the best reconstruction quality; however, they also lead to higher energy consumption compared to trajectories close to the middle of the storage space. While these middle trajectories, being shorter, lead to lower energy consumption, they can have a considerable negative impact on the reconstruction quality. On balance, a trajectory with P_traj = 15–20% provides a good compromise, where the volumetric error values remain very similar to those obtained from trajectories with P_traj = 5% but with around 16% reduction in energy consumption.
If geometric constraints exist within the storage space and only trajectories close to the middle space are allowed, then the most influential parameter to modulate is the maximum yawing speed at the corners. It was shown that lowering the maximum yawing speed to 1°·s−1 for the short trajectories leads to a significant reduction in the volumetric error values; however, this was achieved at the expense of energy consumption. If energy consumption is a main constraint, then relaxing the maximum yawing speed to 6°·s−1 has been shown to lead to average volumetric errors of 1% and 0.5% for stockpiles 1 and 2, respectively, while adding an extra 22% in ΔE_total, on average, compared to the baseline case.
To further demonstrate the reconstruction quality of the different cases of interest, Figure 20 shows the mean projected reconstructed area ratio, R̄_a. It can be seen that adopting a maximum yawing speed of 6°·s−1 enhanced the R̄_a values at the maximum P_traj from 88.6% to 92.1% and from 94.7% to 96.7% for stockpiles 1 and 2, respectively. A further decrease in the maximum yawing speed to 1°·s−1 led to R̄_a values at the maximum P_traj of 94.7% and 97.7% for stockpiles 1 and 2, respectively.
It may be worth mentioning here that there are other ways to enhance the estimated results without increasing the flight time and/or energy consumption, such as flying at a higher altitude and/or increasing the servo motor speed. However, the maximum allowable altitude can be limited by the geometric constraints of the confined space and/or the allowable sensor range. Moreover, using a high-speed, precise servo motor for actuation can lead to costs similar to a 2D LiDAR, making the whole approach cost ineffective.

4.5. Comparison with 2D and 3D LiDAR Scanners

It is instructive to compare the performance of the proposed scanning approach to that of more complex LiDAR sensors. As such, the volumes of the two stockpiles considered in this study were also estimated using a 2D LiDAR, the Hokuyo UTM-30LX [83], as well as a 3D LiDAR, the Velodyne Puck LITE [84], representing common choices within the field of robotics and scanning [85,86,87,88]. In terms of scanning speed, the proposed actuated 1D LiDAR can read 100 points/s, whereas the 2D and 3D LiDARs can read 43,200 points/s and ~300,000 points/s, respectively. This massive increase in scanning speed, especially for the 3D LiDAR, inevitably leads to significantly greater computational requirements. In terms of cost, the proposed scanning solution (servo motor and 1D LiDAR) costs around $130, whereas the 2D and 3D LiDARs cost around $3500 and $6000, respectively (cost figures based on 2022 prices). As for system weight, the proposed scanning solution (servo motor and 1D LiDAR) weighs around 40 g, whereas the 2D and 3D LiDARs weigh around 370 g and 830 g, respectively. Clearly, as the weight of the payload increases, the drone size must increase and/or the maximum flight time reduces, introducing more challenges to indoor inspection missions. All in all, the proposed system clearly outperforms 2D and 3D LiDAR scanners as far as computational requirements, cost, and weight are concerned. It thus remains to compare favourably in terms of scanning accuracy to demonstrate the full effectiveness of the proposed solution.
To compare the scanning accuracy of the different scanners, the two stockpiles were scanned using the same trajectory paths and baseline flight parameters. The point clouds collected by the 2D and 3D LiDAR scanners were transformed into the ground coordinate frame, G, as shown in Figure 21. Clearly, the 2D and 3D LiDAR scanners provide denser point clouds compared to the proposed actuated 1D LiDAR (refer to the actuated 1D LiDAR point clouds shown in Figure 12a,c for the same stockpile and flight trajectory paths).
The estimated volumetric error values of the actuated 1D LiDAR (including the 1 and 6°·s−1 maximum yawing speed cases) compared to those from the 2D and 3D LiDAR scanners are shown in Figure 22. Remarkably, the estimated volumetric errors, V_e, using the 3D LiDAR at lower P_traj values are greater than those from the actuated 1D LiDAR and the 2D LiDAR cases. This is because the enormous amount of data collected by the 3D LiDAR led to an evaluated volume larger than the actual volume. That said, this increase in collected data was beneficial at the higher P_traj values, where the 3D LiDAR scanner was able to achieve low volumetric error values. However, for stockpile 1 at the maximum P_traj value, the volumetric error from the 3D LiDAR was still as high as 4.0% due to the unscanned areas (the white blind spot areas in Figure 21d). In other words, using the most capable 3D sensor did not offer a clear advantage for scanning cases with short trajectories and less material by the walls.
The volumetric error mean and standard deviation values over the two stockpiles and the different P_traj cases from the actuated 1D LiDAR (1 and 6°·s−1 maximum yawing speed cases), 2D LiDAR, and 3D LiDAR are 0.4 ± 0.6%, 0.8 ± 1.1%, 1.8 ± 1.7%, and 0.9 ± 1.0%, respectively. Regarding energy consumption, tests using the 2D and 3D LiDARs led to 31% and 64% average increases in ΔE_total compared to the actuated 1D LiDAR baseline testcase, respectively, as illustrated in Figure 23. This is mainly due to the increased payload weight of the 2D and 3D LiDAR scanners.
Table 2 provides a summary comparison of the performance of the different systems investigated in this section. Considering the reconstruction quality (judged via the volumetric error and the projected reconstructed area ratio), it is evident that the proposed 1D LiDAR system with the 1 and 6°·s−1 maximum yawing speed settings is generally capable of achieving better results than those obtained from 2D/3D LiDARs. In fact, the 2D LiDAR results are the worst in this respect; however, it provides better energy consumption figures compared to the 3D LiDAR case. That said, adopting the 1°·s−1 maximum yawing speed in the proposed system leads to much higher energy consumption compared to all other cases. As such, operating the actuated 1D LiDAR at a maximum yawing speed setting around 6°·s−1 yields the best compromise between reconstruction quality and energy consumption.

5. Concluding Remarks and Future Work

This work presented a novel 3D mapping approach employing a drone-borne actuated 1D LiDAR, demonstrated via the stockpile volume estimation application. A single-point LiDAR was fitted on a micro servo motor and attached to a drone, with the servo motor oscillating the LiDAR in a plane perpendicular to the drone's forward motion. The LiDAR ranges were then transformed to the ground coordinate frame to obtain a reconstructed 3D map of the stockpile, whose volume was calculated by evaluating the volume under the 3D surface. The proposed actuated 1D LiDAR was tested by scanning different stockpiles within the Webots robot simulation environment. The system performance was assessed via the estimated volumetric error, the total energy consumption, and the projected reconstructed area ratio. These metrics were analysed for different flight trajectories by allocating the waypoints at different distances from the storage walls.
In terms of volumetric errors, for the more uneven stockpile with little material volume by the walls, it was shown that allocating the flight trajectory waypoints at distances from the wall of up to 20% of the storage space dimensions can achieve volumetric error levels below 0.3%. However, locating the waypoints further into the storage space can lead to larger unscanned areas due to blind spots, which in turn lead to volumetric error levels of up to ~6%. As for the more even stockpile, as the distance between the waypoints and the walls increased, the mean volumetric error showed a less steep but more consistent increase from almost zero to ~4%. When employing a sinusoidal waveform for the servo motion as opposed to a semi-triangular waveform, no significant differences in scanning performance were evident. Changing the flight speed was also shown to produce minor differences in volumetric error, yet with a significant influence on the energy consumption. Because most of the unscanned areas are close to the corners, scanning at lower turning speeds was shown to be an effective approach to improve the volumetric error levels for missions with waypoints allocated far from the walls; however, the corresponding increase in total energy consumption can be significant, depending on the maximum yawing speed adopted.
The estimated volumetric errors were compared with those obtained from 2D and 3D LiDAR scanners while fixing all other mission parameters. The 2D LiDAR showed the worst performance among all the cases considered. When adopting a maximum yawing speed of around 6°·s−1, the reconstruction quality of the proposed 1D LiDAR system was similar to that of the 3D LiDAR scanner. However, this was achieved with a far lower scanning rate for data acquisition, significantly less energy consumption, and much lower system cost and weight.
In the future, we aim to conduct flight experiments to validate the proposed stockpile volume estimation method. Moreover, it is evident that the problem at hand is multi-dimensional, with a huge parameter space for investigation, particularly with respect to trajectory shapes. As such, further optimization of the flight trajectory path is needed to seek possible reductions in blind spots and, hence, enhance outcomes, while simultaneously considering power budget constraints. As the proposed system is low-cost and lightweight, several units can be attached to a single drone at different orientations, or the system can be deployed on multiple drone agents to increase redundancy during scanning, which, in turn, would also increase the density of the generated point cloud. The current work was concerned with providing a detailed demonstration of the feasibility of the proposed actuated 1D LiDAR system for reconstruction missions; however, all the previously mentioned flight trajectory/system configuration possibilities open many directions for future enhancements to the system under different constraints.

Author Contributions

Conceptualization, A.A. and M.R.A.N.; methodology, A.A. and M.R.A.N.; software, A.A.; formal analysis, A.A. and M.R.A.N.; investigation, A.A. and M.R.A.N.; writing—original draft preparation, A.A.; writing—review and editing, M.R.A.N.; supervision, M.R.A.N. All authors have read and agreed to the published version of the manuscript.

Funding

This study forms part of Ahmad Alsayed’s PhD research which is funded by Umm Al-Qura University (Makkah, Saudi Arabia).

Data Availability Statement

The data presented in this study are available within the paper.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Noise Effect

All simulations conducted in this work did not consider the effect of sensor noise, which can add uncertainty to the estimated volumetric error results. As such, to better understand this effect, noise was added to the LiDAR ranges (r) in Equation (3), the servo motor angles (γ) in Equation (1), the orientations (ψ, θ, ϕ) in Equation (5), and the 3D coordinates (x, y, z) in Equation (6). The LiDAR ranges were assigned an uncertainty of ±2.5 cm based on results in [81], and the servo motor angles an uncertainty of ±1.0° based on the study in [89]. With no available information on the distribution shape of these uncertainties, a normal distribution was assumed. The uncertainty in the orientations comes from the IMU and, according to the study in [90], orientations can be estimated with a standard deviation of 0.143°. Lastly, the global 3D coordinates were assumed measurable with a standard deviation of 0.113 m, as described in [91].
Noise was produced by a random number generator; hence, the simulation results depend on the generated numbers. Therefore, 20 tests were conducted for each stockpile, initial servo motor position, and trajectory pathway to illustrate the uncertainty in the results. The solid lines in Figure A1 show the mean volumetric error, and the shaded regions represent the standard deviation; each point is the mean of 400 tests (20 initial servo positions, each tested 20 times for noise effect). As expected, sensor noise increased the uncertainty of the estimated volume and, hence, may change the obtained stockpile volume estimate. This is evident from the larger shaded regions for the simulations that account for the noise effect. Additionally, these shaded areas grow, indicating higher uncertainty, as P_traj increases. However, the mean volumetric errors show minimal change compared to simulations that discount the noise effect. Therefore, the noise effect can be argued to be minor.
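A sketch of how such noise injection could be implemented per measurement is given below; the ±2.5 cm and ±1.0° figures are treated as one standard deviation each, which is our assumption given that only normal distributions are stated.

```python
import numpy as np

rng = np.random.default_rng(seed=0)  # seeded here only for reproducibility

def add_sensor_noise(r, gamma, angles, position):
    """Perturb one measurement set with the Appendix A noise levels.
    The +/-2.5 cm and +/-1.0 deg figures are treated as one standard
    deviation each (our assumption); normal distributions throughout."""
    r_n = r + rng.normal(0.0, 0.025)                    # LiDAR range (m)
    gamma_n = gamma + rng.normal(0.0, np.radians(1.0))  # servo angle (rad)
    angles_n = np.asarray(angles) + rng.normal(0.0, np.radians(0.143), 3)
    position_n = np.asarray(position) + rng.normal(0.0, 0.113, 3)
    return r_n, gamma_n, angles_n, position_n
```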
Figure A1. Mean stockpile volumetric error, V̄_e, versus the trajectory waypoint allocation percentage, P_traj, with sensor noise effect (blue line) for (a) stockpile 1 and (b) stockpile 2. The solid lines represent the mean volumetric error, and the shaded regions represent the standard deviation.

References

1. Alsayed, A.; Yunusa-Kaltungo, A.; Quinn, M.K.; Arvin, F.; Nabawy, M.R.A. Drone-Assisted Confined Space Inspection and Stockpile Volume Estimation. Remote Sens. 2021, 13, 3356.
2. Alsayed, A.; Nabawy, M.R.; Yunusa-Kaltungo, A.; Arvin, F.; Quinn, M.K. Towards Developing an Aerial Mapping System for Stockpile Volume Estimation in Cement Plants. In Proceedings of the AIAA Scitech 2021 Forum, Virtual Event, 11–15 & 19–21 January 2021; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 2021.
3. Turner, R.M.; MacLaughlin, M.M.; Iverson, S.R. Identifying and Mapping Potentially Adverse Discontinuities in Underground Excavations Using Thermal and Multispectral UAV Imagery. Eng. Geol. 2020, 266, 105470.
4. Surmann, H.; Kaiser, T.; Leinweber, A.; Senkowski, G.; Slomma, D.; Thurow, M. Small Commercial UAVs for Indoor Search and Rescue Missions. In Proceedings of the 2021 7th International Conference on Automation, Robotics and Applications (ICARA), Prague, Czech Republic, 4–6 February 2021; pp. 106–113.
5. Gao, X.; Zhu, L.; Cui, H.; Hu, Z.; Liu, H.; Shen, S. Complete and Accurate Indoor Scene Capturing and Reconstruction Using a Drone and a Robot. IEEE Sens. J. 2021, 21, 11858–11869.
6. He, H.; Chen, T.; Zeng, H.; Huang, S. Ground Control Point-Free Unmanned Aerial Vehicle-Based Photogrammetry for Volume Estimation of Stockpiles Carried on Barges. Sensors 2019, 19, 3534.
7. Yilmaz, H.M. Close Range Photogrammetry in Volume Computing. Exp. Tech. 2010, 34, 48–54.
8. PS, R.; Jeyan, M.L. Mini Unmanned Aerial Systems (UAV)—A Review of the Parameters for Classification of a Mini UAV. Int. J. Aviat. Aeronaut. Aerosp. 2020, 7, 5.
9. Abdelrahman, M.M.; Elnomrossy, M.M.; Ahmed, M.R. Development of Mini Unmanned Air Vehicles. In Proceedings of the International Conference on Aerospace Sciences & Aviation Technology, Cairo, Egypt, 26–28 May 2009.
10. Goraj, Z.; Cisowski, J.; Frydrychewicz, A.; Grendysa, W.; Jonas, M. Mini UAV Design and Optimization for Long Endurance Mission. In Proceedings of the ICAS Secretariat—26th Congress of International Council of the Aeronautical Sciences 2008, ICAS 2008, Anchorage, AK, USA, 14–19 September 2008; Volume 3, pp. 2636–2647.
11. Shyy, W.; Berg, M.; Ljungqvist, D. Flapping and Flexible Wings for Biological and Micro Air Vehicles. Prog. Aerosp. Sci. 1999, 35, 455–505.
12. Ifju, P.; Jenkins, D.; Ettinger, S.; Lian, Y.; Shyy, W.; Waszak, M. Flexible-Wing-Based Micro Air Vehicles. In Proceedings of the 40th AIAA Aerospace Sciences Meeting & Exhibit, Reno, NV, USA, 14–17 January 2002; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 2002.
13. Nabawy, M.R.A.; ElNomrossy, M.M.; Abdelrahman, M.M.; ElBayoumi, G.M. Aerodynamic Shape Optimisation, Wind Tunnel Measurements and CFD Analysis of a MAV Wing. Aeronaut. J. 2012, 116, 685–708.
14. Wood, R.; Finio, B.; Karpelson, M.; Ma, K.; Pérez-Arancibia, N.; Sreetharan, P.; Tanaka, H.; Whitney, J. Progress on 'Pico' Air Vehicles. Int. J. Rob. Res. 2012, 31, 1292–1302.
15. Petricca, L.; Ohlckers, P.; Grinde, C. Micro- and Nano-Air Vehicles: State of the Art. Int. J. Aerosp. Eng. 2011, 2011, 214549.
16. Floreano, D.; Wood, R.J. Science, Technology and the Future of Small Autonomous Drones. Nature 2015, 521, 460–466.
17. Chiang, E.T.K.; Urakubo, T.; Mashimo, T. Lift Generation by a Miniature Piezoelectric Ultrasonic Motor-Driven Rotary-Wing for Pico Air Vehicles. IEEE Access 2022, 10, 13210–13218.
18. Gomez, J.C.; Garcia, E. Morphing Unmanned Aerial Vehicles. Smart Mater. Struct. 2011, 20, 103001.
19. Ahmed, M.R.; Abdelrahman, M.M.; ElBayoumi, G.M.; ElNomrossy, M.M. Optimal Wing Twist Distribution for Roll Control of MAVs. Aeronaut. J. 2011, 115, 641–649.
20. Harvey, C.; Gamble, L.L.; Bolander, C.R.; Hunsaker, D.F.; Joo, J.J.; Inman, D.J. A Review of Avian-Inspired Morphing for UAV Flight Control. Prog. Aerosp. Sci. 2022, 132, 100825.
21. Greenblatt, D.; Williams, D.R. Flow Control for Unmanned Air Vehicles. Annu. Rev. Fluid Mech. 2022, 54, 383–412.
22. Shearwood, T.R.; Nabawy, M.R.A.; Crowther, W.J.; Warsop, C. Directional Control of Finless Flying Wing Vehicles—An Assessment of Opportunities for Fluidic Actuation. In Proceedings of the AIAA Aviation 2019 Forum, Dallas, TX, USA, 17–21 June 2019; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 2019.
  22. Shearwood, T.R.; Nabawy, M.R.A.; Crowther, W.J.; Warsop, C. Directional Control of Finless Flying Wing Vehicles—An Assessment of Opportunities for Fluidic Actuation. In Proceedings of the AIAA Aviation 2019 Forum, Dallas, TX, USA, 17–21 June 2019; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 2019. [Google Scholar] [CrossRef]
  23. Warsop, C.; Crowther, W.J. Fluidic Flow Control Effectors for Flight Control. AIAA J. 2018, 56, 3808–3824. [Google Scholar] [CrossRef]
  24. Warsop, C.; Crowther, W. NATO AVT-239 Task Group: Flight Demonstration of Fluidic Flight Controls on the MAGMA Subscale Demonstrator Aircraft. In Proceedings of the AIAA Scitech 2019 Forum, San Diego, CA, USA, 7–11 January 2019; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 2019. [Google Scholar]
  25. Shearwood, T.R.; Nabawy, M.R.A.; Crowther, W.J.; Warsop, C. A Novel Control Allocation Method for Yaw Control of Tailless Aircraft. Aerospace 2020, 7, 150. [Google Scholar] [CrossRef]
  26. Shearwood, T.R.; Nabawy, M.R.A.; Crowther, W.J.; Warsop, C. Yaw Control of Maneuvering Tailless Aircraft Using Induced Drag—A Control Allocation Method Based on Aerodynamic Mode Shapes. In Proceedings of the AIAA AVIATION 2020 FORUM, Virtual Event, 15–19 June 2020; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 2020. [Google Scholar] [CrossRef]
  27. Shearwood, T.R.; Nabawy, M.R.A.; Crowther, W.; Warsop, C. A Control Allocation Method to Reduce Roll-Yaw Coupling on Tailless Aircraft. In Proceedings of the AIAA Scitech 2021 Forum, Virtual Event, 11–15 & 19–21 January 2021; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 2021. [Google Scholar] [CrossRef]
  28. Hassanalian, M.; Abdelkefi, A. Classifications, Applications, and Design Challenges of Drones: A Review. Prog. Aerosp. Sci. 2017, 91, 99–131. [Google Scholar] [CrossRef]
  29. Ma, K.Y.; Chirarattananon, P.; Fuller, S.B.; Wood, R.J. Controlled Flight of a Biologically Inspired, Insect-Scale Robot. Science 2013, 340, 1231806. [Google Scholar] [CrossRef] [Green Version]
  30. Jafferis, N.T.; Helbling, E.F.; Karpelson, M.; Wood, R.J. Untethered Flight of an Insect-Sized Flapping-Wing Microscale Aerial Vehicle. Nature 2019, 570, 491–495. [Google Scholar] [CrossRef] [PubMed]
  31. Chen, Y.; Zhao, H.; Mao, J.; Chirarattananon, P.; Helbling, E.F.; Hyun, N.; Seung, P.; Clarke, D.R.; Wood, R.J. Controlled Flight of a Microrobot Powered by Soft Artificial Muscles. Nature 2019, 575, 324–329. [Google Scholar] [CrossRef] [PubMed]
  32. Nabawy, M.R.A.; Marcinkeviciute, R. Scalability of Resonant Motor-Driven Flapping Wing Propulsion Systems. R. Soc. Open Sci. 2021, 8, 210452. [Google Scholar] [CrossRef] [PubMed]
  33. Shahmoradi, J.; Talebi, E.; Roghanchi, P.; Hassanalian, M. A Comprehensive Review of Applications of Drone Technology in the Mining Industry. Drones 2020, 4, 34. [Google Scholar] [CrossRef]
  34. Vilbig, J.M.; Sagan, V.; Bodine, C. Archaeological Surveying with Airborne LiDAR and UAV Photogrammetry: A Comparative Analysis at Cahokia Mounds. J. Archaeol. Sci. Rep. 2020, 33, 102509. [Google Scholar] [CrossRef]
  35. Burdziakowski, P.; Bobkowska, K. Uav Photogrammetry under Poor Lighting Conditions—Accuracy Considerations. Sensors 2021, 21, 3531. [Google Scholar] [CrossRef]
  36. Lewicka, O.; Specht, M.; Stateczny, A.; Specht, C.; Brčić, D.; Jugović, A.; Widźgowski, S.; Wiśniewska, M. Analysis of Gnss, Hydroacoustic and Optoelectronic Data Integration Methods Used in Hydrography. Sensors 2021, 21, 7831. [Google Scholar] [CrossRef]
  37. Johansen, K.; Duan, Q.; Tu, Y.H.; Searle, C.; Wu, D.; Phinn, S.; Robson, A.; McCabe, M.F. Mapping the Condition of Macadamia Tree Crops Using Multi-Spectral UAV and WorldView-3 Imagery. ISPRS J. Photogramm. Remote Sens. 2020, 165, 28–40. [Google Scholar] [CrossRef]
  38. Hobart, M.; Pflanz, M.; Weltzien, C.; Schirrmann, M. Growth Height Determination of Tree Walls for Precise Monitoring in Apple Fruit Production Using UAV Photogrammetry. Remote Sens. 2020, 12, 1656. [Google Scholar] [CrossRef]
  39. Šašak, J.; Gallay, M.; Kaňuk, J.; Hofierka, J.; Minár, J. Combined Use of Terrestrial Laser Scanning and UAV Photogrammetry in Mapping Alpine Terrain. Remote Sens. 2019, 11, 2154. [Google Scholar] [CrossRef] [Green Version]
  40. Iglhaut, J.; Cabo, C.; Puliti, S.; Piermattei, L.; O’connor, J.; Rosette, J.; Uk, A.A.; Gutiérrez Quirós, G. Structure from Motion Photogrammetry in Forestry: A Review. Curr. For. Rep. 2019, 5, 155–168. [Google Scholar] [CrossRef] [Green Version]
  41. Li, X.; Shi, F.; Tang, Y. A Recursive Hull and Signal-Based Building Footprint Generation from Airborne LiDAR Data. Remote Sens. 2022, 14, 5892. [Google Scholar] [CrossRef]
  42. Marchel, Ł.; Specht, C.; Specht, M. Testing the Accuracy of the Modified ICP Algorithm with Multimodal Weighting Factors. Energies 2020, 13, 5939. [Google Scholar] [CrossRef]
  43. Ajayi, O.G.; Ajulo, J. Investigating the Applicability of Unmanned Aerial Vehicles (UAV) Photogrammetry for the Estimation of the Volume of Stockpiles. Quaest. Geogr. 2021, 40, 25–38. [Google Scholar] [CrossRef]
  44. Son, S.W.; Kim, D.W.; Sung, W.G.; Yu, J.J. Integrating UAV and TLS Approaches for Environmental Management: A Case Study of a Waste Stockpile Area. Remote Sens. 2020, 12, 1615. [Google Scholar] [CrossRef]
  45. Rohizan, M.H.; Ibrahim, A.H.; Abidin, C.Z.C.; Ridwan, F.M.; Ishak, R. Application of Photogrammetry Technique for Quarry Stockpile Estimation. IOP Conf. Ser. Earth Environ. Sci. 2021, 920, 012040. [Google Scholar] [CrossRef]
  46. Kaamin, M.; Asrul, N.; Daud, M.E.; Suwandi, A.K.; Sahat, S.; Mokhtar, M.; Ngadiman, N. Volumetric Change Calculation for a Landfill Stockpile Using UAV Photogrammetry. Int. J. Integr. Eng. 2019, 11, 53–62. [Google Scholar]
  47. Antonopoulos, A.; Lagoudakis, M.G.; Partsinevelos, P. A ROS Multi-Tier UAV Localization Module Based on GNSS, Inertial and Visual-Depth Data. Drones 2022, 6, 135. [Google Scholar] [CrossRef]
  48. Bassolillo, S.R.; D’Amato, E.; Notaro, I.; Ariante, G.; Del Core, G.; Mattei, M. Enhanced Attitude and Altitude Estimation for Indoor Autonomous UAVs. Drones 2022, 6, 18. [Google Scholar] [CrossRef]
  49. Aldao, E.; González-deSantos, L.M.; Michinel, H.; González-Jorge, H. UAV Obstacle Avoidance Algorithm to Navigate in Dynamic Building Environments. Drones 2022, 6, 16. [Google Scholar] [CrossRef]
  50. Cabreira, T.; Brisolara, L.; Ferreira, P.R., Jr. Survey on Coverage Path Planning with Unmanned Aerial Vehicles. Drones 2019, 3, 4. [Google Scholar] [CrossRef] [Green Version]
  51. Burgués, J.; Hernández, V.; Lilienthal, A.; Marco, S. Smelling Nano Aerial Vehicle for Gas Source Localization and Mapping. Sensors 2019, 19, 478. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  52. Castaño, A.R.; Romero, H.; Capitán, J.; Andrade, J.L.; Ollero, A. Development of a Semi-Autonomous Aerial Vehicle for Sewerage Inspection. In Robot 2019: Fourth Iberian Robotics Conference; Silva, M.F., Luís Lima, J., Reis, L.P., Sanfeliu, A., Tardioli, D., Eds.; Springer International Publishing: Cham, Switzerland, 2020; pp. 75–86. ISBN 978-3-030-35990-4. [Google Scholar]
  53. Esfahlani, S.S. Mixed Reality and Remote Sensing Application of Unmanned Aerial Vehicle in Fire and Smoke Detection. J. Ind. Inf. Integr. 2019, 15, 42–49. [Google Scholar] [CrossRef]
  54. Hennage, D.H.; Nopola, J.R.; Haugen, B.D. Fully Autonomous Drone for Underground Use. In Proceedings of the 53rd U.S. Rock Mechanics/Geomechanics Symposium, New York City, NY, USA, 23–26 June 2019; American Rock Mechanics Association: Alexandria, VA, USA, 2019. [Google Scholar]
  55. Flyability Elios 3—Digitizing the Inaccessible. Available online: https://www.flyability.com/elios-3 (accessed on 22 August 2022).
  56. Manish, R.; Hasheminasab, S.M.; Liu, J.; Koshan, Y.; Mahlberg, J.A.; Lin, Y.-C.; Ravi, R.; Zhou, T.; McGuffey, J.; Wells, T.; et al. Image-Aided LiDAR Mapping Platform and Data Processing Strategy for Stockpile Volume Estimation. Remote Sens. 2022, 14, 231. [Google Scholar] [CrossRef]
  57. Xu, Z.; Lu, X.; Xu, E.; Xia, L. A Sliding System Based on Single-Pulse Scanner and Rangefinder for Pile Inventory. IEEE Geosci. Remote Sens. Lett. 2022, 19, 7003605. [Google Scholar] [CrossRef]
  58. Magnusson, M. The Three-Dimensional Normal-Distributions Transform: An Efficient Representation for Registration, Surface Analysis, and Loop Detection; Örebro Universitet: Örebro, Sweden, 2009. [Google Scholar]
  59. Phillips, T.G.; Guenther, N.; McAree, P.R. When the Dust Settles: The Four Behaviors of LiDAR in the Presence of Fine Airborne Particulates. J. Field Robot. 2017, 34, 985–1009. [Google Scholar] [CrossRef]
  60. Ryde, J.; Hillier, N. Performance of Laser and Radar Ranging Devices in Adverse Environmental Conditions. J. Field Robot. 2009, 26, 712–727. [Google Scholar] [CrossRef]
  61. Yilmaz, V. Automated Ground Filtering of LiDAR and UAS Point Clouds with Metaheuristics. Opt. Laser. Technol. 2021, 138, 106890. [Google Scholar] [CrossRef]
  62. Zhou, H.; Zhang, J.; Ge, L.; Yu, X.; Wang, Y.; Zhang, C. Research on Volume Prediction of Single Tree Canopy Based on Three-Dimensional (3D) LiDAR and Clustering Segmentation. Int. J. Remote Sens. 2021, 42, 738–755. [Google Scholar] [CrossRef]
  63. Krátký, V.; Petráček, P.; Nascimento, T.; Čadilová, M.; Škobrtal, M.; Stoudek, P.; Saska, M. Safe Documentation of Historical Monuments by an Autonomous Unmanned Aerial Vehicle. ISPRS Int. J. Geoinf. 2021, 10, 738. [Google Scholar] [CrossRef]
  64. Long, T.; Xie, A.; Ren, X.; Wang, X. Tampering Detection of LiDAR Data for Autonomous Vehicles. In Proceedings of the 2021 40th Chinese Control Conference (CCC), Shanghai, China, 26–28 July 2021; pp. 4732–4737. [Google Scholar]
  65. Bi, S.; Yuan, C.; Liu, C.; Cheng, J.; Wang, W.; Cai, Y. A Survey of Low-Cost 3D Laser Scanning Technology. Appl. Sci. 2021, 11, 3938. [Google Scholar] [CrossRef]
  66. Kang, X.; Yin, S.; Fen, Y. 3D Reconstruction & Assessment Framework Based on Affordable 2D Lidar. In Proceedings of the 2018 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), Auckland, New Zealand, 9–12 July 2018; pp. 292–297. [Google Scholar]
  67. Palacín, J.; Martínez, D.; Rubies, E.; Clotet, E. Mobile Robot Self-Localization with 2D Push-Broom LIDAR in a 2D Map. Sensors 2020, 20, 2500. [Google Scholar] [CrossRef] [PubMed]
  68. Morales, J.; Martínez, J.; Mandow, A.; Reina, A.; Pequeño-Boter, A.; García-Cerezo, A. Boresight Calibration of Construction Misalignments for 3D Scanners Built with a 2D Laser Rangefinder Rotating on Its Optical Center. Sensors 2014, 14, 20025–20040. [Google Scholar] [CrossRef] [Green Version]
  69. Morales, J.; Plaza-Leiva, V.; Mandow, A.; Gomez-Ruiz, J.; Serón, J.; García-Cerezo, A. Analysis of 3D Scan Measurement Distribution with Application to a Multi-Beam Lidar on a Rotating Platform. Sensors 2018, 18, 395. [Google Scholar] [CrossRef] [Green Version]
  70. Alismail, H.; Browning, B. Automatic Calibration of Spinning Actuated Lidar Internal Parameters. J. Field Robot. 2015, 32, 723–747. [Google Scholar] [CrossRef]
  71. Morales, J.; Martinez, J.L.; Mandow, A.; Pequeno-Boter, A.; Garcia-Cerezo, A. Design and Development of a Fast and Precise Low-Cost 3D Laser Rangefinder. In Proceedings of the 2011 IEEE International Conference on Mechatronics, Istanbul, Turkey, 13–15 April 2011; pp. 621–626. [Google Scholar]
  72. Schubert, S.; Neubert, P.; Protzel, P. How to Build and Customize a High-Resolution 3D Laserscanner Using Off-the-Shelf Components. In Lecture Notes in Computer Science (including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Springer: Berlin/Heidelberg, Germany, 2016; Volume 9716, pp. 314–326. ISBN 9783319403786. [Google Scholar]
  73. Alsayed, A.; Nabawy, M.R.A.; Yunusa-Kaltungo, A.; Quinn, M.K.; Arvin, F. An Autonomous Mapping Approach for Confined Spaces Using Flying Robots. In Towards Autonomous Robotic Systems; Fox, C., Gao, J., Ghalamzan Esfahani, A., Saaj, M., Hanheide, M., Parsons, S., Eds.; Springer International Publishing: Cham, Switzerland, 2021; pp. 326–336. [Google Scholar]
  74. Nabawy, M.R.A.; Crowther, W.J. Aero-Optimum Hovering Kinematics. Bioinspir Biomim 2015, 10, 44002. [Google Scholar] [CrossRef] [Green Version]
  75. Berman, G.J.; Wang, Z.J. Energy-Minimizing Kinematics in Hovering Insect Flight. J. Fluid. Mech. 2007, 582, 153–168. [Google Scholar] [CrossRef]
  76. Mathworks Fit Curve or Surface to Data—MATLAB Fit. Available online: https://uk.mathworks.com/help/curvefit/fit.html (accessed on 12 January 2022).
  77. Michel, O. Webots: Professional Mobile Robot Simulation. Int. J. Adv. Robot. Syst. 2004, 1, 39–42. [Google Scholar]
  78. McDonald, B. Surf2stl. Available online: https://www.mathworks.com/matlabcentral/fileexchange/4512-surf2stl (accessed on 12 January 2022).
  79. How Much Flight Time Is Needed for an Inspection. Available online: https://www.flyability.com/knowledge-base/how-much-flight-time-is-needed-for-an-inspection (accessed on 18 July 2022).
  80. Servo Motors|Types, Properties, Control. Available online: http://www.robotiksistem.com/servo_motor_types_properties.html (accessed on 18 July 2022).
  81. Garmin Ltd. Garmin LIDAR-Lite v3|GPS Sensors. Available online: https://www.garmin.com/en-US/p/557294 (accessed on 28 January 2022).
  82. Buy 40m × 60m Warehouse Storage Industrial Unit Tent. Available online: https://www.galatent.co.uk/product/12636 (accessed on 10 August 2022).
  83. UTM-30LX: Hokuyo. Available online: https://hokuyo-usa.com/products/lidar-obstacle-detection/utm-30lx (accessed on 19 January 2022).
  84. Puck LITE Lightweight Surround Lidar Sensor|Velodyne Lidar. Available online: https://velodynelidar.com/products/puck-lite/ (accessed on 19 January 2022).
  85. Jaakkola, A.; Hyyppä, J.; Yu, X.; Kukko, A.; Kaartinen, H.; Liang, X.; Hyyppä, H.; Wang, Y. Autonomous Collection of Forest Field Reference—The Outlook and a First Step with UAV Laser Scanning. Remote Sens. 2017, 9, 785. [Google Scholar] [CrossRef] [Green Version]
  86. Toth, C.; Jozkow, G.; Grejner-Brzezinska, D. Mapping with Small UAS: A Point Cloud Accuracy Assessment. J. Appl. Geod. 2015, 9, 213–226. [Google Scholar] [CrossRef]
  87. Lee, S.; Har, D.; Kum, D. Drone-Assisted Disaster Management: Finding Victims via Infrared Camera and Lidar Sensor Fusion. In Proceedings of the 2016 3rd Asia-Pacific World Congress on Computer Science and Engineering (APWC on CSE), Nadi, Fiji, 5–6 December 2016; pp. 84–89. [Google Scholar]
  88. Chatziparaschis, D.; Lagoudakis, M.G.; Partsinevelos, P. Aerial and Ground Robot Collaboration for Autonomous Mapping in Search and Rescue Missions. Drones 2020, 4, 79. [Google Scholar] [CrossRef]
  89. Earl, B. Brushed DC Motors|Adafruit Motor Selection Guide|Adafruit Learning System. Available online: https://learn.adafruit.com/adafruit-motor-selection-guide/rc-servos (accessed on 20 January 2022).
  90. Nirmal, K.; Sreejith, A.G.; Mathew, J.; Sarpotdar, M.; Suresh, A.; Prakash, A.; Safonova, M.; Murthy, J. Noise Modeling and Analysis of an IMU-Based Attitude Sensor: Improvement of Performance by Filtering and Sensor Fusion. In Advances in Optical and Mechanical Technologies for Telescopes and Instrumentation II; Navarro, R., Burge, J.H., Eds.; SPIE: Bellingham, WA, USA, 2016; Volume 9912, p. 99126W. [Google Scholar]
  91. Jin, R.; Jiang, J.; Qi, Y.; Lin, D.; Song, T. Drone Detection and Pose Estimation Using Relational Graph Networks. Sensors 2019, 19, 1479. [Google Scholar] [CrossRef] [PubMed]
Figure 1. A schematic illustration of the proposed actuated 1D LiDAR approach for irregular stockpile scanning and volume estimation.
Figure 2. Angular velocity and motion profiles of the servo motor using rectangular, semi-rectangular, and sinusoidal speed inputs. Of note is that the semi-rectangular angular velocity variation and its corresponding semi-triangular motion waveform of the servo motor are generated using a control parameter value of C_γ = 0.95.
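One plausible way to generate this family of waveforms is the shape-parameterised profile used in flapping-wing kinematics studies [74,75], assuming C_γ plays the role of the waveform shape parameter; the oscillation amplitude below is an arbitrary placeholder, not a value from this work:

```python
import numpy as np

def servo_angle(t, f=0.67, amp=np.deg2rad(60.0), c_gamma=0.95):
    """Servo angle waveform [rad].

    c_gamma -> 0 recovers a sinusoidal motion; c_gamma -> 1 approaches a
    triangular motion, whose angular velocity is near-rectangular.
    """
    return (amp / np.arcsin(c_gamma)) * np.arcsin(c_gamma * np.sin(2.0 * np.pi * f * t))

t = np.linspace(0.0, 2.0 / 0.67, 500)   # two periods at f = 0.67 Hz (Table 1)
gamma = servo_angle(t)
gamma_dot = np.gradient(gamma, t)       # angular-velocity profile to compare shapes
```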
Figure 3. The principle of defining an object point in the coordinate frames L and G.
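The figure summarises the chain used in Equations (1)–(6) to express a beam return in the global frame G: the range and servo angle fix the point in the LiDAR frame L, the drone attitude rotates it, and the drone position translates it. A minimal sketch of that chain follows; the beam direction in L and the ZYX (yaw-pitch-roll) Euler convention are assumptions of this sketch:

```python
import numpy as np

def lidar_point_global(r, gamma, att, pos):
    """Map one range reading into the global frame G.

    r: LiDAR range [m]; gamma: servo angle [rad];
    att: (phi, theta, psi) = (roll, pitch, yaw) [rad]; pos: (x, y, z) [m].
    """
    phi, theta, psi = att
    # Assumed beam geometry: the servo tilts the beam in the body x-z plane.
    p_L = r * np.array([np.cos(gamma), 0.0, -np.sin(gamma)])

    cps, sps = np.cos(psi), np.sin(psi)
    cth, sth = np.cos(theta), np.sin(theta)
    cph, sph = np.cos(phi), np.sin(phi)
    Rz = np.array([[cps, -sps, 0.0], [sps, cps, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cth, 0.0, sth], [0.0, 1.0, 0.0], [-sth, 0.0, cth]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cph, -sph], [0.0, sph, cph]])

    # Rotate the body-frame point into G, then shift by the drone position.
    return Rz @ Ry @ Rx @ p_L + np.asarray(pos, float)
```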
Figure 4. An example of calculating the projected reconstructed area ratio, R_a. The coloured surface represents the projected area of the generated surface (a_p = 2284.8 m2), whereas the total area is A_t = 60 × 40 = 2400 m2. Therefore, R_a = 2284.8/2400 = 0.952 = 95.2%. The white regions represent the unreconstructed area (4.8%).
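In code, R_a reduces to counting covered grid cells. A short sketch, assuming the fitted surface is sampled on a uniform grid with NaNs marking unreconstructed cells:

```python
import numpy as np

def projected_area_ratio(surface_z, cell_area, total_area):
    """R_a: reconstructed fraction of the storage planform.

    surface_z: 2D array of fitted surface heights sampled on a uniform
    grid, with NaN marking cells the scan never covered.
    """
    a_p = np.count_nonzero(~np.isnan(surface_z)) * cell_area
    return a_p / total_area

# Worked check against the caption: a_p = 2284.8 m^2 and
# A_t = 60 * 40 = 2400 m^2 give R_a = 2284.8 / 2400 = 0.952 (95.2%).
```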
Figure 5. (a) Uniformly distributed points within the storage space used for stockpile modelling. (b) The 3D surface generated by fitting a surface over the predefined points. (c) The STL 3D model file developed from (b). (d) The simulation environment showing the storage and the “first” stockpile generated from the STL file in (c); the front wall is removed to show the inside of the storage. (e) The simulation environment with the “second” stockpile, which has more material near the walls.
Figure 6. Overall block diagram of the quadcopter control system.
Figure 7. Trajectory waypoints allocation with respect to the storage planform.
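As a concrete illustration, the rectangular trajectories studied here can be generated directly from P_traj. The inset-from-each-wall interpretation below is this sketch's assumption, with the storage planform and flight altitude taken from Table 1:

```python
import numpy as np

def rectangular_waypoints(p_traj, length=60.0, width=40.0, altitude=12.0):
    """Corner waypoints of a rectangular trajectory, inset from each wall
    by p_traj of the corresponding storage dimension."""
    dx, dy = p_traj * length, p_traj * width
    corners = [(dx, dy), (length - dx, dy),
               (length - dx, width - dy), (dx, width - dy), (dx, dy)]
    return np.array([(x, y, altitude) for x, y in corners])

wps = rectangular_waypoints(p_traj=0.20)   # e.g., the P_traj = 20% case
```

Small P_traj values thus give long loops adjacent to the walls, while large values give short loops within the storage middle space.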
Figure 8. Relationship between drone flight speed, V, and maximum pitching gain, θ_G,max. From the fitted line, the desired drone speed (V = 1 m·s−1) can be reached when θ_G,max = 2.52.
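The calibration behind this figure amounts to a first-order fit and an inversion. The sampled pairs below are hypothetical stand-ins for the simulation runs, chosen only so the sketch reproduces a gain near the quoted 2.52:

```python
import numpy as np

# Hypothetical (gain, achieved speed) samples standing in for Webots runs.
theta_gain = np.array([1.0, 2.0, 3.0, 4.0])
speed = np.array([0.40, 0.79, 1.19, 1.59])      # [m/s]

slope, intercept = np.polyfit(theta_gain, speed, 1)  # linear fit V(theta_G)
gain_for_1ms = (1.0 - intercept) / slope             # invert for V = 1 m/s
```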
Figure 9. Illustration of the different initial angular positions for the 1D LiDAR fitted on top of the oscillating servo motor.
Figure 10. An example baseline simulation result for scanning stockpile 1 at P_traj = 20%. (a) The point cloud (P_G) generated from the proposed scanning system. (b) The fitted surface across P_G, used to evaluate the stockpile volume. The detected walls were removed by reducing the scanned region along the x and y axes until the walls disappeared. The colours encode height from the ground.
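A minimal sketch of this surface-fit-and-integrate step is given below, using linear grid interpolation in place of the surface fit actually used and a fixed crop margin as a stand-in for the wall-removal procedure described in the caption:

```python
import numpy as np
from scipy.interpolate import griddata

def stockpile_volume(points, grid_step=0.5, margin=1.0):
    """Volume under a surface fitted over the point cloud P_G.

    points: (N, 3) array of global coordinates. Wall returns are discarded
    by shrinking the evaluated region by `margin` metres on each side.
    """
    xyz = np.asarray(points, float)
    x0, x1 = xyz[:, 0].min() + margin, xyz[:, 0].max() - margin
    y0, y1 = xyz[:, 1].min() + margin, xyz[:, 1].max() - margin
    gx, gy = np.meshgrid(np.arange(x0, x1, grid_step),
                         np.arange(y0, y1, grid_step))
    gz = griddata(xyz[:, :2], xyz[:, 2], (gx, gy), method="linear")
    return np.nansum(gz) * grid_step ** 2   # sum of grid-cell columns
```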
Figure 11. Stockpile mean volumetric error, V̄_e, of both stockpiles versus the trajectory waypoints allocation percentage, P_traj, under baseline simulation conditions. The solid lines represent the mean volumetric error, and the shaded regions represent the standard deviation due to testing at different initial positions of the servo motor.
Figure 12. The projected scan on the XY-plane for stockpiles 1 and 2. (a,b) Results for the minimum tested P_traj value of 5% for stockpiles 1 and 2, respectively, and (c,d) results for the maximum tested P_traj value of 30%. The scattered data are the point cloud, P_G, and the colours encode height from the ground. The black line is the recorded flight trajectory.
Figure 13. Stockpile mean volumetric error, V̄_e, against the waypoints allocation percentage, P_traj, for different oscillation motion waveforms of the servo motor: (a) stockpile 1 and (b) stockpile 2.
Figure 14. Stockpile mean volumetric error, V̄_e, against the waypoints allocation percentage, P_traj, for different flight speeds: (a) stockpile 1 and (b) stockpile 2.
Figure 15. Illustration of flight trajectories with a single 90° turning angle and two 45° turning angles. Of note is that, in this demonstration, the two-step turning started at a distance that is 10% of W before the original waypoints.
Figure 16. Stockpile mean volumetric error, V̄_e, against the waypoints allocation percentage, P_traj, for different turning approaches at the corners: (a) stockpile 1 and (b) stockpile 2.
Figure 17. Stockpile mean volumetric error, V̄_e, against the waypoints allocation percentage, P_traj, for different yawing speeds: (a) stockpile 1 and (b) stockpile 2.
Figure 18. The projected scan on the XY-plane for stockpiles 1 and 2. (a,b) Results for the baseline testcase at a P_traj value of 30% for stockpiles 1 and 2, respectively, and (c,d) results for the maximum yawing speed case of 1°·s−1 at a P_traj value of 30%. The scattered data are the point cloud, P_G, and the colours encode height from the ground. The black line is the recorded flight trajectory.
Figure 19. The change in the drone’s total energy consumption, ΔE_total, referenced to the baseline testcase at a P_traj value of 5%, versus the trajectory waypoints allocation percentage, P_traj. In this demonstration, one parameter is changed at a time.
Figure 20. The mean projected reconstructed area ratio, R̄_a, against the waypoints allocation percentage, P_traj, for (a) stockpile 1 and (b) stockpile 2.
Figure 21. Projected scan on the XY-plane for stockpile 1 at the minimum and maximum tested P_traj values of 5% and 30% using (a,c) 2D LiDAR and (b,d) 3D LiDAR. The scattered data are the point cloud, P_G, and the colours encode height from the ground. The black line is the recorded flight trajectory.
Figure 22. The volumetric error values from the actuated 1D LiDAR system compared to those from using 2D/3D LiDAR scanners for (a) stockpile 1 and (b) stockpile 2.
Figure 23. Change in the total energy consumption, ΔE_total, with respect to the baseline case for the actuated 1D LiDAR cases as well as the 2D and 3D LiDAR cases.
Table 1. Simulation parameters used in this work.

Parameter | Symbol | Value
Actual stockpile volumes for stockpiles 1 and 2 | V_actual | 7193.5, 8763.6 m3
Storage length | L | 60 m
Storage width | W | 40 m
Desired drone speed | V | 1 m·s−1
Desired flight altitude | z_r | 12 m
Trajectory waypoints allocation percentages | P_traj | 5, 10, 15, 20, 25, 30%
Servo motor wave frequency (for C_γ = 0.95) | f | 0.67 Hz
Desired maximum angular servo motor speed | γ̇_max | 5 rad·s−1
1D LiDAR range | r | 40 m
1D LiDAR rate | – | 500 Hz
Table 2. Performance comparison of the proposed actuated 1D LiDAR system against 2D and 3D LiDAR scanners. Values presented are averaged from the results of the six P_traj cases considered. ΔĒ|_BL represents the average difference in ΔE_total from the corresponding baseline case.

LiDAR Configuration | Stockpile 1: V̿_e [%] | R̿_a [%] | ΔĒ|_BL [%] | Stockpile 2: V̿_e [%] | R̿_a [%] | ΔĒ|_BL [%]
Actuated 1D LiDAR (Baseline) | 1.6 | 96.6 | 0 | 1.6 | 97.6 | 0
Actuated 1D LiDAR (ψ̇_max = 6°·s−1) | 1.0 | 97.7 | 22 | 0.5 | 98.2 | 22
Actuated 1D LiDAR (ψ̇_max = 1°·s−1) | 0.7 | 98.3 | 195 | 0.2 | 98.6 | 195
2D LiDAR | 1.5 | 96.0 | 31 | 2.0 | 97.1 | 31
3D LiDAR | 1.1 | 98.4 | 64 | 0.6 | 98.9 | 64
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
