Article

Assessment of LiDAR-Based Sensing Technologies in Bird–Drone Collision Scenarios

by Paula Seoane, Enrique Aldao, Fernando Veiga-López and Higinio González-Jorge *

Aerolab, Instituto de Física e Ciencias Aeroespaciais (IFCAE), Universidade de Vigo, Campus de As Lagoas, E-32004 Ourense, Spain
* Author to whom correspondence should be addressed.
Drones 2025, 9(1), 13; https://doi.org/10.3390/drones9010013
Submission received: 21 November 2024 / Revised: 22 December 2024 / Accepted: 25 December 2024 / Published: 27 December 2024

Abstract: The deployment of Advanced Air Mobility requires the continued development of technologies to ensure operational safety. One of the key aspects to consider is the availability of robust solutions to avoid tactical conflicts between drones and other flying elements, such as other drones or birds. Bird detection remains relatively underexplored; however, given the large number of birds, the airspace they share with drones, and their non-cooperative nature within an air traffic management system, improving their detection and avoiding collisions with them is of clear interest. This work demonstrates how a LiDAR sensor mounted on a drone can detect birds of various sizes. A LiDAR simulator, previously developed by the Aerolab research group, is employed in this study. Six different collision trajectories and three different bird sizes (pigeon, falcon, and seagull) are tested. The results show that the LiDAR can detect any of these birds at about 30 m; detection improves as the bird approaches and with increasing bird size. The positioning error is below 1 m in most of the cases under study, and the errors grow with increasing drone–bird relative speed.

1. Introduction

The number of Unmanned Aerial Vehicle (UAV) operations has experienced a surge in recent years, both in the civilian [1] and military sectors. These platforms can be employed for multiple purposes, such as infrastructure inspections [2], surveying [3], agriculture and forestry monitoring [4,5], search and rescue operations, surveillance systems [6], or loitering munitions [7].
In the coming years, the civil sector is expected to continue driving the deployment of autonomous logistics with drones and the transportation of people with air taxis [8,9]. These operations would be conducted within the U-Space concept [10,11], a set of services and procedures that would enable safe, efficient, and affordable access to airspace for numerous or complex UAV operations. This scenario would involve a high degree of automation, where manned and unmanned traffic would share the same airspace, referred to as U-Space airspace. In this context, new traffic management technologies and operational procedures are required to guarantee the safety of operations.
U-Space encompasses conflict management at both strategic and tactical levels. On the one hand, Strategic Conflict Management [12] includes (i) pre-flight planning, which involves the coordination and planning of trajectories prior to operations, preventing conflicts by assigning routes that avoid high-traffic or restricted areas; (ii) air corridor allocation, which determines and assigns specific routes to minimize collision risks; and (iii) flight reservations and approvals, where drone operators must reserve airspace and obtain approval for their flight plans, facilitating the organization and control of air traffic. On the other hand, Tactical Conflict Management [13] encompasses (i) real-time detection and avoidance, where sensors and communication technologies detect obstacles in real time and allow drones to perform avoidance maneuvers; (ii) real-time communication, where drones and their operators receive updates and alerts about potential conflicts, allowing immediate adjustments to flight routes; and (iii) automatic intervention, where automated systems can adjust a drone’s route to avoid collisions.
Focusing our attention on tactical deconfliction and real-time detection and avoidance, there are two main threats to consider: malicious drones [14,15] and birds [16,17], both non-stationary objects. Malicious drones are those that, either intentionally or unintentionally, do not share their position with the U-Space system and can appear in the path of an aircraft, potentially causing a collision. Birds are always external elements to the U-Space system and must be detected to avoid possible collisions. Lyons et al. (2018) studied bird interaction with drones, from individuals to large colonies [18]. They performed 97 flight hours across various environments in Australia and provided preliminary guidance for safe drone operations, highlighting the need for further research on bird–drone interactions. Key findings of their study include minimal issues with large breeding colonies, some aggressive behavior from solitary breeders, and a raptor attack on a fixed-wing drone. Detailed procedures for flight planning and safety are also provided.
The most common methods for drone and bird detection use imagery sensors and computer vision algorithms. Pedro et al. (2021) presented a novel approach using deep learning techniques and off-the-shelf commercial vision sensors to address real-time collision avoidance with dynamic objects [19]. Experiments in simulations and real-world scenarios, including dodging a thrown ball, confirmed its viability. A video dataset was created, and transfer learning was tested with positive results. Panchal et al. (2022) addressed urban air traffic management for collision avoidance with non-cooperative airspace users [20]. Their work proposed an Urban Air Mobility Collision Avoidance System to reduce the risk of collisions between air taxis and non-cooperating airspace users by adding an extra safety layer. It introduced a conflict detection and resolution method specific to Urban Air Mobility, using a three-dimensional safety envelope based on current aircraft configurations. The procedures for avoiding conflicts before and during the flights are also outlined, with a shuttle service between an airport and a railway station used as a running example. The results highlight the importance of considering individual aircraft configurations in conflict avoidance to prevent collisions. Xiao and Feroskhan (2024) proposed an Asynchronous Multi-Stage Deep Reinforcement Learning (AMS-DRL) approach for drone navigation under adversarial attacks from multiple pursuers [21]. By evolving adversarial agents and ensuring Nash equilibrium, the method enables drones to evade attacks and reach targets, outperforming baselines in simulations and real-time tests. Onifade et al. (2023) employed improved datasets and object detectors such as YOLO and Faster R-CNN to increase detection accuracy and speed [22]. Their main objective was to improve UAV performance by addressing obstacles, such as birds and other aircraft, ensuring improved accuracy and response time.
Although techniques based on image sensors and computer vision algorithms yield acceptable results, they have limitations, such as difficulty operating at night and susceptibility to glare from sunlight. These shortcomings compromise the robustness of such solutions, potentially leading to errors in threat detection and positioning. Moreover, they require substantial onboard computational power, which is not always available on small aircraft. The development of increasingly affordable solid-state LiDAR sensors, which are lighter and consume less energy, offers new possibilities in this field. As an active sensor, LiDAR is significantly more resistant to variations in external lighting than passive image sensors. In addition, LiDAR provides a direct range measurement [23]. This work presents an initial investigation into the capabilities of LiDAR to detect and position flying objects in Urban Air Mobility (UAM) contexts. The study focuses mainly on bird encounters, as these are the most common incidents that could compromise the safety of UAM operations.
The study is framed within the topic of urban artificial intelligence and its interactions with biological elements. It is aligned with previous works by Aoyama and Alvarez Leon (2021) on urban governance and autonomous vehicles [24], Cugurullo et al. (2023) on urbanistic perspectives of artificial intelligence and the city [25], Cugurullo (2024) on the artificial intelligence–city nexus beyond Frankenstein Urbanism [26], and Jackman (2024) on artificial intelligence urbanism [27].
The manuscript is structured as follows: Section 2 covers materials and methods, Section 3 presents results and discussion, and Section 4 provides conclusions.

2. Materials and Methods

2.1. Operational Scenarios and Trajectory Simulation

Six different operational scenarios are defined considering the following collision trajectories between the drone and the bird (Figure 1): (a) frontal collision trajectories (both with horizontal flight)—trajectory 1; (b) frontal collision trajectories (drone with horizontal flight and bird with 3° descending slope)—trajectory 2; (c) lateral collision trajectories (drone with horizontal flight and bird with 3° descending slope)—trajectory 3; (d) lateral collision trajectories (both with horizontal flight, drone with straight trajectory, and bird with curved trajectory)—trajectory 4; (e) lateral collision trajectories (both with horizontal flight, drone with curved trajectory)—trajectory 5; and (f) lateral collision trajectories (drone with horizontal flight and bird with a curved trajectory in a different z-plane)—trajectory 6. These scenarios were simulated in MATLAB [28], considering a UAV flight speed of 5 m/s, a typical value achievable by most commercial rotary-wing models [29,30]. For the birds, the flight speeds were adjusted based on species-specific characteristics, as detailed in Section 2.2. Additionally, to guarantee that all trajectories are flyable and representative of real bird motion, the curvature radius was constrained to ensure a maximum acceleration modulus of 5 m/s²; as introduced in Section 2.2, these values are adjusted based on the animal characteristics, and the constraint is illustrated in the sketch below.
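The original MATLAB implementation is not published; as a minimal illustration of the flyability constraint, the following Python sketch derives the minimum curvature radius from the requirement that the centripetal acceleration v²/R not exceed the limit, and samples a constant-speed circular arc at the simulator cycle time. Function names and the example values are illustrative assumptions, not the authors' code.

```python
import numpy as np

def min_turn_radius(speed, a_max):
    """Minimum curvature radius so that the centripetal acceleration
    v^2 / R does not exceed the prescribed limit a_max."""
    return speed**2 / a_max

def arc_trajectory(speed, radius, duration, dt=0.01):
    """Constant-speed circular-arc trajectory in the horizontal plane,
    sampled at the 0.01 s simulator cycle time."""
    t = np.arange(0.0, duration, dt)
    omega = speed / radius                  # angular rate (rad/s)
    x = radius * np.sin(omega * t)
    y = radius * (1.0 - np.cos(omega * t))
    z = np.zeros_like(t)
    return np.column_stack((x, y, z))

# Example: a bird at 17 m/s (seagull maximum, Table 1) limited to
# 5 m/s^2 must keep a curvature radius of at least 17^2 / 5 = 57.8 m.
R = min_turn_radius(17.0, 5.0)
traj = arc_trajectory(17.0, R, duration=4.0)
```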
Additionally, to statistically assess the sensor’s behavior, 300 random collision scenarios were generated for each type of bird. For all these cases, motion with constant acceleration is assumed. Thus, the bird’s relative position is given by $\mathbf{r}(t) = \mathbf{r}_0 + \mathbf{v}_0 t + \tfrac{1}{2}\mathbf{a}_0 t^2$, where $\mathbf{r}_0$ represents the initial position, $\mathbf{v}_0$ the initial velocity, and $\mathbf{a}_0$ the acceleration vector. These values were adjusted by random sampling from a uniform distribution. To obtain motion profiles representative of a bird, the acceleration modulus is constrained to a maximum of 2 m/s² and the speed modulus to the interval $[5, v_{max}]$ m/s, where $v_{max}$ is the maximum flight speed of each bird. Furthermore, a maximum flight path angle of 15° was imposed to avoid abrupt descents that may not represent typical bird trajectories. The initial positions $\mathbf{r}_0$ were also randomized and adjusted to ensure a separation between bird and UAV of 5 to 50 m; in all the scenarios, this parameter was set to ensure the birds always remain within the field of view during the trajectory. Figure 2 illustrates two randomly generated scenarios, showing the trajectories of the bird and the UAV. In both cases, the UAV follows a straight path, while the bird’s trajectory curves slightly due to the random acceleration $\mathbf{a}_0$. The curvature radius is large owing to the low $\mathbf{a}_0$ values used here, but the study cases included values up to 2 m/s² to increase the variability in the bird’s speed. Nevertheless, all the simulated scenarios represent relatively smooth parabolic trajectories, representative of bird motion and similar to those presented in Figure 1.
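The text specifies the constraints but not the sampling routine itself. A minimal Python sketch of one plausible implementation follows, using rejection sampling for the flight path angle limit; all function names and the seed are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_scenario(v_max, a_max=2.0, gamma_max_deg=15.0,
                    r_min=5.0, r_max=50.0):
    """Draw one random collision scenario under the constraints stated
    in the text: |a0| <= a_max, |v0| in [5, v_max], flight path angle
    below gamma_max, and an initial separation in [r_min, r_max]."""
    # Initial position: random direction, separation in [r_min, r_max].
    direction = rng.normal(size=3)
    direction /= np.linalg.norm(direction)
    r0 = direction * rng.uniform(r_min, r_max)

    # Initial velocity: rejection-sample until the flight path angle
    # (angle between the velocity and the horizontal plane) is admissible.
    while True:
        v0 = rng.uniform(-1.0, 1.0, size=3)
        v0 = v0 / np.linalg.norm(v0) * rng.uniform(5.0, v_max)
        gamma = np.degrees(np.arcsin(abs(v0[2]) / np.linalg.norm(v0)))
        if gamma <= gamma_max_deg:
            break

    # Acceleration: random direction, modulus uniform in [0, a_max].
    a0 = rng.uniform(-1.0, 1.0, size=3)
    a0 = a0 / np.linalg.norm(a0) * rng.uniform(0.0, a_max)
    return r0, v0, a0

def bird_position(r0, v0, a0, t):
    """Constant-acceleration model r(t) = r0 + v0 t + a0 t^2 / 2."""
    return r0 + v0 * t + 0.5 * a0 * t**2

scenarios = [sample_scenario(v_max=17.0) for _ in range(300)]
```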

2.2. Three-Dimensional Models

It is estimated that there are approximately 18,000 bird species in the world. Given the complexity of analyzing all of them, three species were selected for this study: pigeon, falcon, and seagull (Figure 3). They are common species in the researchers’ region and throughout Europe, and could therefore pose a risk during drone flights. They also have different wingspans, providing insight into the system’s behavior for targets of various sizes. The geometries in STL format were downloaded from the CGTrader website, Game Ready Bird Collection [31]. The file included various birds, but only the appropriate ones were selected and exported as individual files. They were then rescaled according to the actual wingspan of each species, as sketched below. Table 1 shows the main physical characteristics of the birds.
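The rescaling step can be reproduced with standard tooling. The short sketch below assumes the numpy-stl package and that the model’s wing axis is aligned with the y axis; both assumptions must be checked per model, and the file names and the 0.67 m target (mid-range of the pigeon interval in Table 1) are illustrative.

```python
import numpy as np
from stl import mesh  # numpy-stl package

def rescale_to_wingspan(stl_path, target_wingspan, out_path):
    """Uniformly rescale an STL model so that its extent along the
    assumed wing axis (y) matches the real wingspan in meters."""
    bird = mesh.Mesh.from_file(stl_path)
    span = bird.y.max() - bird.y.min()   # current model wingspan
    factor = target_wingspan / span
    bird.vectors *= factor               # uniform scaling of all facets
    bird.save(out_path)
    return factor

# Example (hypothetical file names):
# rescale_to_wingspan("pigeon.stl", 0.67, "pigeon_scaled.stl")
```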
In the present study, a specific drone model was not employed, as the main aim of this work is to assess the detection capabilities of LiDAR technology rather than to provide a real-time solution for conflict resolution. As for the sensor, a solid-state Livox Avia is considered, as it is a lightweight model that can be embedded in several commercial UAV models [36].

2.3. LiDAR Simulator

The LiDAR simulator used in this work was previously developed by researchers from the Aerolab Group, part of the Instituto de Física e Ciencias Aeroespaciais [34,35], and is based on ray tracing techniques. For the simulations, the specifications of a Livox Avia LiDAR are employed [36]. The main parameters of this system are detailed in Table 2. It is worth noting that the simulator is adaptable to any LiDAR system available on the market.
The input data for the simulator include an STL file containing the bird geometry, its geospatial coordinates and orientation, as well as the geospatial coordinates and orientation of the drone (Figure 4). The simulation runs with a 0.01 s cycle, during which the positions and orientations are recalculated based on the predefined trajectories of the bird and the drone. Based on this information and using the ray tracing algorithm, the new LiDAR echoes are generated (Figure 5). The centroid of the LiDAR echoes is then calculated to provide range data for the simulated detection system.
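A skeleton of this per-cycle loop is sketched below in Python. Here `trace_rays` is a hypothetical placeholder for the ray tracing routine of [34,35], which is not reproduced here; only the cycle structure and the centroid step follow the description above.

```python
import numpy as np

DT = 0.01  # simulator cycle time (s), as stated in the text

def centroid(echoes):
    """Range estimate for one cycle: centroid of the LiDAR returns."""
    return echoes.mean(axis=0) if len(echoes) else None

def run_simulation(bird_traj, drone_traj, trace_rays):
    """Per-cycle loop: recompute poses from the predefined trajectories,
    generate echoes via ray tracing, and extract the echo centroid.
    `trace_rays(drone_pose, bird_pose)` must return an (N, 3) array."""
    estimates = []
    for k in range(len(bird_traj)):
        echoes = np.asarray(trace_rays(drone_traj[k], bird_traj[k]))
        c = centroid(echoes)
        if c is not None:                 # bird detected this cycle
            estimates.append((k * DT, c))
    return estimates
```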

2.4. Performance Metrics

To evaluate sensor performance, the Euclidean distance between the sensor estimations and the true bird state was computed as follows:
$e_p^i = \left\| \mathbf{r}_{real}^i - \mathbf{r}_{est}^i \right\|_2 \; ; \quad e_v^i = \left\| \mathbf{v}_{real}^i - \mathbf{v}_{est}^i \right\|_2$,
where:
  • $\mathbf{r}_{real}^i$ is the bird geometric center at discretization point i; for the calculations, the STL geometric center is employed.
  • $\mathbf{r}_{est}^i$ is the estimated position at discretization point i.
  • $\mathbf{v}_{real}^i$ and $\mathbf{v}_{est}^i$ represent the real and estimated bird velocity, respectively.
  • $e_p^i$ and $e_v^i$ are the position and velocity estimation errors, respectively.
Following this approach, all the trajectories introduced in Section 2.1 were evaluated. Additionally, to analyze the sensor behavior under different operating conditions, a sliding window approach was employed, and the results from the trajectory dataset were analyzed in terms of sensing distance $r$ and bird speed $v$. For a given detection distance $r$, the results where $\left| \left\| \mathbf{r}_{real}^i \right\| - r \right| < \delta_r = 2$ m were extracted, and their average error $\bar{e}_p(r)$ was computed. The same procedure was followed to calculate the average error as a function of bird speed, $\bar{e}_p(v)$, but considering a window size $\delta_v = 1$ m/s.
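A compact Python sketch of these metrics follows; the variable names are illustrative, and the window condition mirrors the $|\,\cdot\,| < \delta$ form above.

```python
import numpy as np

def euclidean_errors(r_real, r_est, v_real, v_est):
    """Per-sample position and velocity errors (Euclidean norm),
    for (N, 3) arrays of real/estimated positions and velocities."""
    e_p = np.linalg.norm(r_real - r_est, axis=1)
    e_v = np.linalg.norm(v_real - v_est, axis=1)
    return e_p, e_v

def windowed_mean(x, errors, centers, delta):
    """Average error inside a sliding window |x - center| < delta,
    mirroring delta_r = 2 m and delta_v = 1 m/s in the text."""
    means = []
    for c in centers:
        mask = np.abs(x - c) < delta
        means.append(errors[mask].mean() if mask.any() else np.nan)
    return np.array(means)

# Usage sketch: average position error vs. sensing distance.
# dist = np.linalg.norm(r_real, axis=1)       # bird range per sample
# centers = np.arange(5.0, 50.0, 1.0)
# e_p_bar = windowed_mean(dist, e_p, centers, delta=2.0)
```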

3. Results and Discussion

Figure 6 shows the point clouds simulated from the LiDAR echoes for each bird and trajectory. Overall, the trajectories of the birds are correctly recovered. However, errors related to point cloud dispersion are present. Point cloud density increases as the bird approaches the drone, yielding lower positioning errors. Moreover, as expected, larger birds produce more returns at the same distance than smaller birds, thus reducing measurement errors.
Figure 7 shows the detection errors for each bird and trajectory. As noted above, the positioning error decreases as the bird approaches the drone; the bird’s size also influences this behavior. For example, in the case of the pigeon (trajectory 1; (a)), the bird remains undetected until 0.7 s into the simulation. Between that instant and t = 1.2 s, the centroid position is highly inaccurate due to the low number of echoes, which results in a low-resolution point cloud; positioning errors of up to 1.6 m are reported under these conditions. For t > 1.2 s, detection improves until the end of the simulation, with the error falling to around 0.1 m. This trend holds for all analyzed cases, with the smallest bird (pigeon) being the most pronounced example. However, all birds present relatively high errors for non-linear, direct line-of-sight trajectories. For instance, the largest bird (seagull) shows similar trends for trajectories 4, 5, and 6. This implies that detection remains challenging until a specific range is reached, which depends on both the trajectory and the bird.
In any case, LiDAR sensing appears to be a promising tool for tactical detection. For the analyzed cases, objects (birds) with characteristic lengths of around 1 m moving at relatively high speeds can be identified at 30 m with relatively high precision, with an average error of around 1 m.
Figure 8, Figure 9 and Figure 10 present the position and velocity estimation errors for the batches of 300 randomized trajectories for each bird species. The graphs show how the average errors in these estimations vary with the distance and relative speed between the UAV and each bird. As shown, for all three species, the estimation errors increase with relative speed. The lowest errors are found for the pigeon, with an average position error below 0.4 m and a velocity error under 0.65 m/s at worst. This is likely due to the pigeon’s slower flight speed, which makes tracking its path through point clouds easier. Conversely, the falcon posed the greatest tracking challenges, with position errors of up to 1.7 m and the largest velocity errors, owing to its high flight speed. It is noteworthy that these metrics include the trajectory points where the bird route estimates are initialized, leading to higher errors compared to those in the earlier figures, where the regressions have stabilized. Nevertheless, the results remain consistent and demonstrate the sensor’s robust performance across all simulated case studies.
Regarding the impact of sensing distance, no clear trend is observed across the three species, suggesting that the density of received echoes is sufficient for tracking targets throughout the detection range. However, smaller birds, such as sparrows, might be challenging to detect due to their small effective cross-section. Generally, larger birds are easier to track as they produce more echoes, thus reducing the regression errors. Although one might expect the seagull, due to its large size, to have the lowest errors, its wide wingspan causes LiDAR reflections to originate further from its geometric center, increasing the point cloud dispersion and hampering regression.

4. Conclusions

This work demonstrates how a LiDAR simulator can be applied to evaluate the capability of this type of sensor to detect birds on collision trajectories with drones, thereby addressing tactical conflicts in the context of Advanced Air Mobility. The simulator is applied based on the specifications of a Livox Avia LiDAR, three different bird sizes (pigeon, falcon, and seagull), and multiple collision scenarios. The results demonstrate the validity of the LiDAR system for this type of application at ranges up to 50 m, with bounded errors below 0.8 m for position and 1.6 m/s for speed estimation. Generally, larger birds are easier to track as they produce more echoes, thus reducing errors. Errors increase with increasing drone–bird relative speed.
In future work, the presented methodology will be expanded to include additional commercial LiDAR models for assessment and benchmarking. Furthermore, the ray tracing simulator will be enhanced by incorporating factors such as multi-path ray propagation and material reflectivity.

Author Contributions

Conceptualization, P.S., F.V.-L. and H.G.-J.; methodology, P.S. and E.A.; software, P.S. and E.A.; validation, P.S.; formal analysis, F.V.-L.; investigation, P.S. and E.A.; data curation, P.S.; writing—original draft preparation, H.G.-J. and P.S.; writing—review and editing, F.V.-L.; supervision, F.V.-L. and H.G.-J.; project administration, H.G.-J.; funding acquisition, H.G.-J. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the University of Vigo, CISUG, Xunta de Galicia, Gobierno de España, Agencia Estatal de Investigación, and European Union—Next GenerationEU through the following grants: FPU21/01176, PID2021-125060OB-100, TED2021-129756B-C31, Complementary R&D Plan, Galician Marine Sciences Program, Funding for open access charge (University of Vigo/CISUG).

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors on request.

Acknowledgments

The authors thank the administrative staff of the Institute of Physics and Aerospace Science for the administrative and technical support provided for the completion of this work.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. González-Jorge, H.; Martínez-Sánchez, J.; Bueno, M.; Arias, P. Unmanned aerial systems for civil applications: A review. Drones 2017, 1, 2. [Google Scholar] [CrossRef]
  2. Rodríguez, D.A.; Lozano Tafur, C.; Melo Daza, P.F.; Villalba Vidales, K.A.; Daza Rincón, J.C. Inspection of aircrafts and airports using UAS: A review. Results Eng. 2024, 22, 102330. [Google Scholar] [CrossRef]
  3. Turner, I.L.; Harley, M.D.; Drummond, C.D. UAVs for coastal surveying. Coast. Eng. 2016, 114, 19–24. [Google Scholar] [CrossRef]
  4. Mogili, U.R.; Deepak, B.B.V.L. Review on application of drone system in precision agriculture. Procedia Comput. Sci. 2018, 133, 502–509. [Google Scholar] [CrossRef]
  5. Tang, L.; Shao, G. Drone remote sensing for forestry research practices. J. For. Res. 2015, 26, 791–797. [Google Scholar] [CrossRef]
  6. Koslowski, R.; Schulzke, M. Drones along borders: Border security UAV in the United States and the European Union. Int. Stud. Perspect. 2018, 19, 305–324. [Google Scholar] [CrossRef]
  7. Kunertova, D. The war in Ukraine shows the game-changing effects of drones depends on the game. Bull. At. Sci. 2023, 79, 95–102. [Google Scholar] [CrossRef]
  8. Garrow, L.A.; German, B.J.; Leonard, C.E. Urban air mobility: A comprehensive review and comparative analysis with autonomous and electric ground transportation for informing future research. Transp. Res. Part C Emerg. Technol. 2021, 132, 103377. [Google Scholar] [CrossRef]
  9. Rajendran, S.; Srinivas, S. Air taxi for urban mobility: A critical review of recent developments, future challenges, and opportunities. Transp. Res. Part E Logist. Transp. Rev. 2020, 143, 102090. [Google Scholar] [CrossRef]
  10. Barrado, C.; Boyero, M.; Brucculeri, L.; Ferrara, G.; Hately, A.; Hullah, P.; Martín-Marrero, D.; Pastor, E.; Rushton, A.P.; Volert, A. U-space concept of operations: A key enabler for opening airspace to emerging low-altitude operations. Aerospace 2020, 7, 24. [Google Scholar] [CrossRef]
  11. Doole, M.; Ellerbroek, J.; Hoekstra, J. Estimation of traffic density from drone-based delivery in very low level urban airspace. J. Air Transp. Manag. 2020, 88, 101862. [Google Scholar] [CrossRef]
  12. Alharbi, A.; Poujade, A.; Malandrakis, K.; Petrunin, I.; Panagiotakopoulos, D.; Tsourdos, A. Rule-based conflict management for unmanned traffic management scenarios. In Proceedings of the 2020 AIAA/IEEE 39th Digital Avionics Systems Conference (DASC), San Antonio, TX, USA, 11–15 October 2020. [Google Scholar]
  13. Jover, J.; Bermúdez, A.; Casado, R. A tactical conflict resolution proposal for U-space Zu airspace volumes. Sensors 2021, 21, 5649. [Google Scholar] [CrossRef]
  14. Jamil, S.; Abbas, M.S.; Roy, A.M. Distinguishing malicious drones using vision transformer. AI 2022, 3, 260–273. [Google Scholar] [CrossRef]
  15. Suneetha, G.; Arun, R.K.P. A collaborative lightweight malicious drone detection system in sensitive areas. In Proceedings of the 2023 IEEE International Conference on Advanced Networks and Telecommunications Systems, Jaipur, India, 17–20 December 2023. [Google Scholar]
  16. Ritchie, M.; Fioranelli, F.; Griffiths, H.; Torvik, B. Monostatic and bistatic radar measurements of birds and micro-drone. In Proceedings of the 2016 IEEE Radar Conference, Philadelphia, PA, USA, 2–6 May 2016. [Google Scholar]
  17. Rahman, S.; Robertson, D.A. Classification of drones and birds using convolutional neural networks applied to radar micro-Doppler spectrogram images. IET Radar Sonar Navig. 2020, 14, 653–661. [Google Scholar] [CrossRef]
  18. Lyons, M.; Brandis, K.; Callaghan, C.; McCann, J.; Mills, C.; Ryall, S.; Kingsford, R. Bird interactions with drones, from individuals to large colonies. Aust. Field Ornithol. 2018, 35, 51–56. [Google Scholar] [CrossRef]
  19. Pedro, D.; Matos-Carvalho, J.P.; Fonseca, J.M.; Mora, A. Collision avoidance on unmanned aerial vehicles using neural network pipelines and flow clustering techniques. Remote Sens. 2021, 13, 2643. [Google Scholar] [CrossRef]
  20. Panchal, I.; Metz, I.C.; Ribeiro, M.; Armanini, S.F. Urban air traffic management for collision avoidance with non-cooperative airspace users. In Proceedings of the 33rd Congress of the International Council of the Aeronautical Sciences, Stockholm, Sweden, 4–9 September 2022; Available online: https://www.icas.org/ICAS_ARCHIVE/ICAS2022/data/papers/ICAS2022_0567_paper.pdf (accessed on 9 September 2024).
  21. Xiao, J.; Feroskhan, M. Learning multi-pursuit evasion for safe targeted navigation of drones. IEEE Trans. Artif. Intell. 2024, 5, 6210–6224. [Google Scholar] [CrossRef]
  22. Onifade, T.; Eldash, O.; Bayoumi, M. An optimized object detection algorithm for unmanned aerial vehicles (UAVs). In Proceedings of the IEEE World Forum on Internet of Things, Aveiro, Portugal, 12–27 October 2023; pp. 1–7. [Google Scholar]
  23. Behroozpour, B.; Sandborn, P.A.M.; Wu, M.C.; Boser, B.E. LiDAR system architectures and circuits. IEEE Commun. Mag. 2017, 55, 135–142. [Google Scholar] [CrossRef]
  24. Aoyama, Y.; Alvarez Leon, J.F. Urban governance and autonomous vehicles. Cities 2021, 119, 103410. [Google Scholar] [CrossRef]
  25. Cugurullo, F.; Caprotti, F.; Cook, M.; Karvonen, A.; McGuirk, P.; Marvin, S. Artificial Intelligence and the city. In Urbanistic Perspectives on AI; Routledge: London, UK, 2023. [Google Scholar]
  26. Cugurullo, F. New stories of urban AI: Exploring the artificial intelligence-city nexus beyond Frankenstein Urbanism. Urban Geogr. 2024, 45, 1300–1307. [Google Scholar] [CrossRef]
  27. Jackman, A. AI urbanism and feminist geopolitics: Making space for diverse practices, actors and agencies. Urban Geogr. 2024, 45, 1292–1296. [Google Scholar] [CrossRef]
  28. MatLAB Software. Available online: https://www.mathworks.com/products/matlab.html (accessed on 9 September 2024).
  29. DJI Matrice 350. Available online: https://enterprise.dji.com/es/matrice-350-rtk (accessed on 9 September 2024).
  30. DJI Phantom. Available online: https://www.dji.com/es/products/phantom (accessed on 9 September 2024).
  31. CGTrader Website, Game Ready Bird Collection. Available online: https://www.cgtrader.com/free-3d-models/animals/bird/game-ready-bird-collection (accessed on 9 September 2024).
  32. Bird ID, Easy Bird Identification. Available online: https://www.birdid.co.uk/IdentifyBird.aspx (accessed on 9 September 2024).
  33. Warrick, D.R.; Dial, K.P. Kinematic, Aerodynamic and Anatomical Mechanisms in the Slow, Maneuvering Flight of Pigeons. J. Exp. Biol. 1998, 201, 655–672. [Google Scholar] [CrossRef] [PubMed]
  34. Aldao, E.; González-deSantos, L.M.; González-Jorge, H. LiDAR based detect and avoid system for UAV navigation in UAM corridors. Drones 2022, 6, 185. [Google Scholar] [CrossRef]
  35. Aldao, E.; Veiga-López, F.; González-deSantos, L.M.; Gonzalez-Jorge, H. Enhancing UAV classification with synthetic data: GMM LiDAR simulator for aerial surveillance applications. IEEE Sens. J. 2024, 24, 26960–26970. [Google Scholar] [CrossRef]
  36. Livox Avia. Available online: https://www.livoxtech.com/avia/specs (accessed on 9 September 2024).
Figure 1. Operational scenarios. (a) trajectory 1, (b) trajectory 2, (c) trajectory 3, (d) trajectory 4, (e) trajectory 5, and (f) trajectory 6.
Figure 2. Randomized operational scenarios. (a) $\mathbf{r}_0 = (50, 17, 0)$ m; $\mathbf{v}_0 = (10, 10, 1)$ m/s; $\mathbf{a}_0 = (0.6, 0.8, 0.2)$ m/s²; (b) $\mathbf{r}_0 = (20, 17, 0)$ m; $\mathbf{v}_0 = (11, 8, 1.2)$ m/s; $\mathbf{a}_0 = (0, 1.2, 0.3)$ m/s².
Figure 3. Bird 3D models. (a) pigeon, (b) falcon, and (c) seagull.
Figure 4. LiDAR detection algorithm.
Figure 5. LiDAR echoes. (a) pigeon 3D model (left) and point cloud (right), (b) falcon 3D model (left) and point cloud (right), and (c) seagull 3D model (left) and point cloud (right).
Figure 6. LiDAR echoes simulated for each bird and trajectory.
Figure 7. Position detection error depending on the operational scenario: (a) trajectory 1, (b) trajectory 2, (c) trajectory 3, (d) trajectory 4, (e) trajectory 5, and (f) trajectory 6.
Figure 8. Error statistical assessment for the 300 simulated trajectories of pigeon encounters: (a) average error in target position as a function of flight speed, (b) average error in target position as a function of sensing distance, (c) average error in speed estimation as a function of flight speed, and (d) average error in speed estimation as a function of sensing distance.
Figure 9. Error statistical assessment for the 300 simulated trajectories of seagull encounters: (a) average error in target position as a function of flight speed, (b) average error in target position as a function of sensing distance, (c) average error in speed estimation as a function of flight speed, and (d) average error in speed estimation as a function of sensing distance.
Figure 10. Error statistical assessment for the 300 simulated trajectories of falcon encounters: (a) average error in target position as a function of flight speed, (b) average error in target position as a function of sensing distance, (c) average error in speed estimation as a function of flight speed, and (d) average error in speed estimation as a function of sensing distance.
Table 1. Typical bird dimensions and flight speeds according to [32,33].

                        Pigeon       Falcon       Seagull
Wingspan (m)            0.62–0.72    0.89–1.13    1.3–1.6
Maximum speed (m/s)     14           27           17
Table 2. Livox Avia specifications.

Model                        Livox Avia
Laser wavelength             905 nm
Laser safety                 Class 1
Detection range              190 m @ 10% reflectivity; 230 m @ 20% reflectivity; 320 m @ 80% reflectivity
Field of view (FOV)          70.4° H × 77.2° V (non-repetitive scanning); 70.4° H × 4.5° V (repetitive line scanning)
Range precision              2 cm
Angular precision            <0.05°
Beam divergence              0.28° V × 0.03° H
Point rate                   240,000 points/s (first or strongest return); 480,000 points/s (dual return); 720,000 points/s (triple return)
Power supply voltage range   10–15 V DC (with Converter 2.0: 9–30 V DC)
Dimensions                   91 × 61.2 × 64.8 mm
Weight                       498 g (without cables)