High-Fidelity Drone Simulation with Depth Camera Noise and Improved Air Drag Force Models
Abstract
1. Introduction
- Development of an improved depth camera model by embedding an experimentally verified Gaussian noise model into it. This modification makes the depth camera model more accurate and realistic across various distances and materials.
- Development of an improved rotor air drag force model. By including thrust values in addition to drone velocity and rotation, we create a force model that more accurately represents real-world air drag forces.
- Provision of an improved simulation environment that reproduces the real-world behavior of autonomous drones more reliably than the existing Airsim simulator, especially in high-speed flight experiments.
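The first contribution above — corrupting an ideal simulated depth image with distance-dependent Gaussian noise — can be sketched as follows. The linear noise model `sigma(z) = sigma0 + k*z` and all parameter values here are illustrative assumptions, not the coefficients identified in this paper:

```python
import numpy as np

def add_depth_noise(depth_m, sigma0=0.002, k=0.004, rng=None):
    """Corrupt an ideal depth image (meters) with zero-mean Gaussian noise
    whose standard deviation grows with distance.

    sigma(z) = sigma0 + k * z is an assumed linear model for illustration;
    the paper fits its own per-material noise model from experiments.
    """
    rng = np.random.default_rng() if rng is None else rng
    sigma = sigma0 + k * depth_m                 # per-pixel noise std (m)
    noisy = depth_m + rng.normal(0.0, 1.0, depth_m.shape) * sigma
    return np.clip(noisy, 0.0, None)             # depth cannot be negative

# Example: a flat wall 5 m away, seen as a 480x640 depth image
ideal = np.full((480, 640), 5.0)
noisy = add_depth_noise(ideal, rng=np.random.default_rng(0))
```

In a simulator plug-in, this function would sit between the renderer's ground-truth depth buffer and the published camera topic, so that downstream perception code sees realistic measurements.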
2. Depth Camera Modeling and Implementation
2.1. Stereo IR Depth Camera
2.2. Experimental Setup
2.3. Depth Noise Modeling
2.4. Simulation Implementation and Results
3. Rotor Drag Force Model Implementation
3.1. Rotor Drag Force Model
3.2. Experiments
3.2.1. Thrust Experiments
3.2.2. Rotor Drag Force Experiments
4. Conclusions and Discussion
4.1. Conclusions
4.2. Comparison with Other Drone Simulators
4.3. Discussion
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Fennelly, L.J.; Perry, M.A. Unmanned aerial vehicle (drone) usage in the 21st century. In The Professional Protection Officer; Elsevier: Amsterdam, The Netherlands, 2020; pp. 183–189. [Google Scholar]
- Restas, A. Drone applications for supporting disaster management. World J. Eng. Technol. 2015, 3, 316. [Google Scholar] [CrossRef]
- Daud, S.M.S.M.; Yusof, M.Y.P.M.; Heo, C.C.; Khoo, L.S.; Singh, M.K.C.; Mahmood, M.S.; Nawawi, H. Applications of drone in disaster management: A scoping review. Sci. Justice 2022, 62, 30–42. [Google Scholar] [CrossRef] [PubMed]
- Pobkrut, T.; Eamsa-Ard, T.; Kerdcharoen, T. Sensor drone for aerial odor mapping for agriculture and security services. In Proceedings of the 2016 13th International Conference on Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology (ECTI-CON), Chiang Mai, Thailand, 28 June–1 July 2016; IEEE: Hoboken, NJ, USA, 2016; pp. 1–5. [Google Scholar]
- Cvitanić, D. Drone applications in transportation. In Proceedings of the 2020 5th International Conference on Smart and Sustainable Technologies (SpliTech), Bol, Croatia, 2–4 July 2020; IEEE: Hoboken, NJ, USA, 2020; pp. 1–4. [Google Scholar]
- Ahirwar, S.; Swarnkar, R.; Bhukya, S.; Namwade, G. Application of drone in agriculture. Int. J. Curr. Microbiol. Appl. Sci. 2019, 8, 2500–2505. [Google Scholar] [CrossRef]
- Li, Y.; Liu, C. Applications of multirotor drone technologies in construction management. Int. J. Constr. Manag. 2019, 19, 401–412. [Google Scholar] [CrossRef]
- Goessens, S.; Mueller, C.; Latteur, P. Feasibility study for drone-based masonry construction of real-scale structures. Autom. Constr. 2018, 94, 458–480. [Google Scholar] [CrossRef]
- Fraundorfer, F.; Heng, L.; Honegger, D.; Lee, G.H.; Meier, L.; Tanskanen, P.; Pollefeys, M. Vision-based autonomous mapping and exploration using a quadrotor MAV. In Proceedings of the 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, Algarve, Portugal, 7–12 October 2012; IEEE: Hoboken, NJ, USA, 2012; pp. 4557–4564. [Google Scholar]
- Kabiri, K. Mapping coastal ecosystems and features using a low-cost standard drone: Case study, Nayband Bay, Persian gulf, Iran. J. Coast. Conserv. 2020, 24, 62. [Google Scholar] [CrossRef]
- Chashmi, S.Y.N.; Asadi, D.; Dastgerdi, K.A. Safe land system architecture design of multi-rotors considering engine failure. Int. J. Aeronaut. Astronaut. 2022, 3, 7–19. [Google Scholar] [CrossRef]
- Tutsoy, O.; Asadi, D.; Ahmadi, K.; Nabavi-Chasmi, S.Y. Robust reduced order thau observer with the adaptive fault estimator for the unmanned air vehicles. IEEE Trans. Veh. Technol. 2023, 72, 1601–1610. [Google Scholar] [CrossRef]
- Hentati, A.I.; Krichen, L.; Fourati, M.; Fourati, L.C. Simulation tools, environments and frameworks for UAV systems performance analysis. In Proceedings of the 2018 14th International Wireless Communications & Mobile Computing Conference (IWCMC), Limassol, Cyprus, 25–29 June 2018; IEEE: Hoboken, NJ, USA, 2018; pp. 1495–1500. [Google Scholar]
- Maciel, A.; Halic, T.; Lu, Z.; Nedel, L.P.; De, S. Using the PhysX engine for physics-based virtual surgery with force feedback. Int. J. Med. Robot. Comput. Assist. Surg. 2009, 5, 341–353. [Google Scholar] [CrossRef]
- Wang, S.; Chen, J.; Zhang, Z.; Wang, G.; Tan, Y.; Zheng, Y. Construction of a virtual reality platform for UAV deep learning. In Proceedings of the 2017 Chinese Automation Congress (CAC), Jinan, China, 20–22 October 2017; IEEE: Hoboken, NJ, USA, 2017; pp. 3912–3916. [Google Scholar]
- Silano, G.; Iannelli, L. CrazyS: A software-in-the-loop simulation platform for the crazyflie 2.0 nano-Quadcopter. In Robot Operating System (ROS); Springer: Berlin/Heidelberg, Germany, 2020; pp. 81–115. [Google Scholar]
- Medrano, J.; Moon, H. Rotor drag model considering the influence of thrust and moving speed for multicopter control. In Proceedings of the 2020 15th Korea Robotics Society Annual Conference, Seoul, Republic of Korea, 3–5 November 2020; KROS: Seoul, Republic of Korea, 2020. [Google Scholar]
- Oguz, S.; Heinrich, M.; Allwright, M.; Zhu, W.; Wahby, M.; Garone, E.; Dorigo, M. S-Drone: An Open-Source Quadrotor for Experimentation in Swarm Robotics; IRIDIA: Bruxelles, Belgium, 2022. [Google Scholar]
- Nabavi-Chashmi, S.Y.; Asadi, D.; Ahmadi, K.; Demir, E. Image-Based UAV Vertical Distance and Velocity Estimation Algorithm during the Vertical Landing Phase Using Low-Resolution Images. Int. J. Aerosp. Mech. Eng. 2023, 17, 28–35. [Google Scholar]
- Song, Y.; Naji, S.; Kaufmann, E.; Loquercio, A.; Scaramuzza, D. Flightmare: A flexible quadrotor simulator. In Proceedings of the Conference on Robot Learning, London, UK, 8–11 November 2021; PMLR: Baltimore, MD, USA, 2021; pp. 1147–1157. [Google Scholar]
- Medrano, J.; Yumbla, F.; Jeong, S.; Choi, I.; Park, Y.; Auh, E.; Moon, H. Jerk estimation for quadrotor based on differential flatness. In Proceedings of the 2020 17th International Conference on Ubiquitous Robots (UR), Kyoto, Japan, 22–26 June 2020; IEEE: Hoboken, NJ, USA, 2020; pp. 99–104. [Google Scholar]
- Nemra, A.; Aouf, N. Robust INS/GPS sensor fusion for UAV localization using SDRE nonlinear filtering. IEEE Sens. J. 2010, 10, 789–798. [Google Scholar] [CrossRef]
- Aguilar, W.G.; Rodríguez, G.A.; Álvarez, L.; Sandoval, S.; Quisaguano, F.; Limaico, A. Visual SLAM with a RGB-D camera on a quadrotor UAV using on-board processing. In Proceedings of the International Work-Conference on Artificial Neural Networks, Cadiz, Spain, 14–16 June 2017; Springer: Berlin/Heidelberg, Germany, 2017; pp. 596–606. [Google Scholar]
- Haider, A.; Hel-Or, H. What can we learn from depth camera sensor noise? Sensors 2022, 22, 5448. [Google Scholar] [CrossRef] [PubMed]
- Halmetschlager-Funek, G.; Suchi, M.; Kampel, M.; Vincze, M. An empirical evaluation of ten depth cameras: Bias, precision, lateral noise, different lighting conditions and materials, and multiple sensor setups in indoor environments. IEEE Robot. Autom. Mag. 2018, 26, 67–77. [Google Scholar] [CrossRef]
- Shah, S.; Dey, D.; Lovett, C.; Kapoor, A. Airsim: High-fidelity visual and physical simulation for autonomous vehicles. In Proceedings of the 11th Conference on Field and Service Robotics, Zurich, Switzerland, 12–15 September 2017; Springer: Berlin/Heidelberg, Germany, 2018; pp. 621–635. [Google Scholar]
- Li, X.; Chen, L.; Li, S.; Zhou, X. Depth segmentation in real-world scenes based on U–V disparity analysis. J. Vis. Commun. Image Represent. 2020, 73, 102920. [Google Scholar] [CrossRef]
- Carfagni, M.; Furferi, R.; Governi, L.; Santarelli, C.; Servi, M.; Uccheddu, F.; Volpe, Y. Metrological and critical characterization of the Intel D415 stereo depth camera. Sensors 2019, 19, 489. [Google Scholar] [CrossRef]
- Giancola, S.; Valenti, M.; Sala, R. Metrological qualification of the Intel D400™ active stereoscopy cameras. In A Survey on 3D Cameras: Metrological Comparison of Time-of-Flight, Structured-Light and Active Stereoscopy Technologies; Springer: Berlin/Heidelberg, Germany, 2018; pp. 71–85. [Google Scholar]
- Ahn, M.S.; Chae, H.; Noh, D.; Nam, H.; Hong, D. Analysis and noise modeling of the intel realsense d435 for mobile robots. In Proceedings of the 2019 16th International Conference on Ubiquitous Robots (UR), Jeju, Republic of Korea, 24–27 June 2019; IEEE: Hoboken, NJ, USA, 2019; pp. 707–711. [Google Scholar]
- Niu, H.; Ji, Z.; Zhu, Z.; Yin, H.; Carrasco, J. 3d vision-guided pick-and-place using kuka lbr iiwa robot. In Proceedings of the 2021 IEEE/SICE International Symposium on System Integration (SII), Fukushima, Japan, 11–14 January 2021; IEEE: Hoboken, NJ, USA, 2021; pp. 592–593. [Google Scholar]
- Grunnet-Jepsen, A.; Sweetser, J.N.; Winer, P.; Takagi, A.; Woodfill, J. Projectors for Intel® Realsense™ Depth Cameras d4xx; Intel Support, Intel Corporation: Santa Clara, CA, USA, 2018. [Google Scholar]
- Ma, C.; Zhou, Y.; Li, Z. A New Simulation Environment Based on Airsim, ROS, and PX4 for Quadcopter Aircrafts. In Proceedings of the 2020 6th International Conference on Control, Automation and Robotics (ICCAR), Singapore, 20–23 April 2020; IEEE: Hoboken, NJ, USA, 2020; pp. 486–490. [Google Scholar]
- Meier, L.; Honegger, D.; Pollefeys, M. PX4: A node-based multithreaded open source robotics framework for deeply embedded platforms. In Proceedings of the 2015 IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 26–30 May 2015; IEEE: Hoboken, NJ, USA, 2015; pp. 6235–6240. [Google Scholar]
- Atoev, S.; Kwon, K.R.; Lee, S.H.; Moon, K.S. Data analysis of the MAVLink communication protocol. In Proceedings of the 2017 International Conference on Information Science and Communications Technologies (ICISCT), Tashkent, Uzbekistan, 2–4 November 2017; IEEE: Hoboken, NJ, USA, 2017; pp. 1–3. [Google Scholar]
- Intel Realsense D435 Specification. Available online: https://www.intelrealsense.com/depth-camera-d435/ (accessed on 21 March 2023).
- Fields, A.J.; Linnville, S.E.; Hoyt, R.E. Correlation of objectively measured light exposure and serum vitamin D in men aged over 60 years. Health Psychol. Open 2016, 3, 2055102916648679. [Google Scholar] [CrossRef]
- Nguyen, C.V.; Izadi, S.; Lovell, D. Modeling kinect sensor noise for improved 3d reconstruction and tracking. In Proceedings of the 2012 Second International Conference on 3D Imaging, Modeling, Processing, Visualization & Transmission, Zurich, Switzerland, 13–15 October 2012; IEEE: Hoboken, NJ, USA, 2012; pp. 524–530. [Google Scholar]
- Fäulhammer, T.; Ambruş, R.; Burbridge, C.; Zillich, M.; Folkesson, J.; Hawes, N.; Jensfelt, P.; Vincze, M. Autonomous learning of object models on a mobile robot. IEEE Robot. Autom. Lett. 2016, 2, 26–33. [Google Scholar] [CrossRef]
- Tateno, K.; Tombari, F.; Navab, N. Real-time and scalable incremental segmentation on dense slam. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–2 October 2015; IEEE: Hoboken, NJ, USA, 2015; pp. 4465–4472. [Google Scholar]
- Nießner, M.; Zollhöfer, M.; Izadi, S.; Stamminger, M. Real-time 3D reconstruction at scale using voxel hashing. ACM Trans. Graph. (ToG) 2013, 32, 169. [Google Scholar] [CrossRef]
- Zhou, Q.Y.; Koltun, V. Depth camera tracking with contour cues. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015; pp. 632–638. [Google Scholar]
- Faessler, M.; Franchi, A.; Scaramuzza, D. Differential flatness of quadrotor dynamics subject to rotor drag for accurate tracking of high-speed trajectories. IEEE Robot. Autom. Lett. 2017, 3, 620–626. [Google Scholar] [CrossRef]
- Kai, J.M.; Allibert, G.; Hua, M.D.; Hamel, T. Nonlinear feedback control of quadrotors exploiting first-order drag effects. IFAC-PapersOnLine 2017, 50, 8189–8195. [Google Scholar] [CrossRef]
- Mellinger, D.; Kumar, V. Minimum snap trajectory generation and control for quadrotors. In Proceedings of the 2011 IEEE International Conference on Robotics and Automation, Shanghai, China, 9–13 May 2011; IEEE: Hoboken, NJ, USA, 2011; pp. 2520–2525. [Google Scholar]
- Lee, D. A linear acceleration control for precise trajectory tracking flights of a quadrotor UAV under high-wind environments. Int. J. Aeronaut. Space Sci. 2021, 22, 898–910. [Google Scholar] [CrossRef]
- Sciortino, C.; Fagiolini, A. ROS/Gazebo-based simulation of quadcopter aircrafts. In Proceedings of the 2018 IEEE 4th International Forum on Research and Technology for Society and Industry (RTSI), Palermo, Italy, 10–13 September 2018; IEEE: Hoboken, NJ, USA, 2018; pp. 1–6. [Google Scholar]
- Sagitov, A.; Gerasimov, Y. Towards DJI phantom 4 realistic simulation with gimbal and RC controller in ROS/Gazebo environment. In Proceedings of the 2017 10th International Conference on Developments in eSystems Engineering (DeSE), Paris, France, 14–16 June 2017; IEEE: Hoboken, NJ, USA, 2017; pp. 262–266. [Google Scholar]
- Furrer, F.; Burri, M.; Achtelik, M.; Siegwart, R. RotorS—A modular gazebo MAV simulator framework. In Robot Operating System (ROS); Springer: Berlin/Heidelberg, Germany, 2016; pp. 595–625. [Google Scholar]
- Guerra, W.; Tal, E.; Murali, V.; Ryou, G.; Karaman, S. Flightgoggles: Photorealistic sensor simulation for perception-driven robotics using photogrammetry and virtual reality. In Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 3–8 November 2019; IEEE: Hoboken, NJ, USA, 2019; pp. 6941–6948. [Google Scholar]
- Moon, H.; Sun, Y.; Baltes, J.; Kim, S.J. The IROS 2016 competitions [competitions]. IEEE Robot. Autom. Mag. 2017, 24, 20–29. [Google Scholar] [CrossRef]
- Moon, H.; Martinez-Carranza, J.; Cieslewski, T.; Faessler, M.; Falanga, D.; Simovic, A.; Scaramuzza, D.; Li, S.; Ozo, M.; De Wagter, C.; et al. Challenges and implemented technologies used in autonomous drone racing. Intell. Serv. Robot. 2019, 12, 137–148. [Google Scholar] [CrossRef]
- Foehn, P.; Brescianini, D.; Kaufmann, E.; Cieslewski, T.; Gehrig, M.; Muglikar, M.; Scaramuzza, D. Alphapilot: Autonomous drone racing. Auton. Robot. 2022, 46, 307–320. [Google Scholar] [CrossRef] [PubMed]
| | Plastic | Cement | Formboard |
|---|---|---|---|
| m (1/m) | 0.85 | 0.50 | 0.55 |
| Value | 0.26049 | 0.7126 | 0.5438 |
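Per-material coefficients like those tabulated above could be consumed in a simulator roughly as follows. The exponential form `sigma(z) = a * exp(m * z)` is an assumption chosen only to illustrate a distance- and material-dependent noise lookup; the paper's Section 2.3 defines the actual fitted model:

```python
import math

# Per-material coefficients taken from the table above.
# The exponential form sigma(z) = a * exp(m * z) is an illustrative
# assumption, not the model fitted in the paper.
NOISE_PARAMS = {
    "plastic":   {"m": 0.85, "a": 0.26049},  # m in 1/m
    "cement":    {"m": 0.50, "a": 0.7126},
    "formboard": {"m": 0.55, "a": 0.5438},
}

def noise_std(material: str, z: float) -> float:
    """Assumed distance-dependent noise magnitude at depth z (meters)."""
    p = NOISE_PARAMS[material]
    return p["a"] * math.exp(p["m"] * z)
```

Keeping the coefficients in a lookup table lets the simulated camera switch noise behavior per surface material at render time.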
| Speed | Airsim Drag Force Model | Rotor Drag Force Model |
|---|---|---|
| 5 m/s | 0.57 m/s | 0.64 m/s |
| 10 m/s | 1.48 m/s | 0.98 m/s |
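The rotor drag force model compared above depends on drone velocity, attitude, and rotor thrust. A minimal sketch of such a force, assuming the common rotor-plane drag form `f_drag = -R D(T) Rᵀ v` with a thrust-scaled coefficient `d = k_d + alpha * T` (the functional form and the parameters `k_d`, `alpha` are illustrative assumptions, not the paper's fitted model):

```python
import numpy as np

def rotor_drag_force(v_world, R_wb, thrust, k_d=0.1, alpha=0.05):
    """Illustrative drag force depending on velocity, attitude, and thrust.

    Assumed model: f_drag = -R D(T) R^T v, with D(T) = diag(d, d, 0) and
    d = k_d + alpha * T, i.e. drag acts in the rotor plane and its
    coefficient grows with the total thrust T.
    """
    d = k_d + alpha * thrust              # thrust-dependent drag coefficient
    D = np.diag([d, d, 0.0])              # drag only in the body x-y plane
    return -R_wb @ D @ R_wb.T @ v_world   # world-frame drag force

# Hover attitude (identity rotation), flying at 5 m/s along world x:
f = rotor_drag_force(np.array([5.0, 0.0, 0.0]), np.eye(3), thrust=20.0)
```

Because the coefficient scales with thrust, aggressive high-speed maneuvers (where thrust is large) experience proportionally more drag, which is the effect the improved model aims to capture.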
| Simulation | Dynamics | Visual Quality | Sensor Type | Sensor Noise Model | VR Headset | Vehicles | Aerodynamic Drag Force Model |
|---|---|---|---|---|---|---|---|
| Gazebo [47] | Gazebo-based | Low (OpenGL) | IMU, GPS, RGBD, Lidar | X | X | Single | Velocity (drone), rotation-dependent coefficients (drone) |
| Hector [20,48] | Gazebo-based | Low (OpenGL) | IMU, GPS, RGBD, Lidar | GPS, IMU | X | Single | Velocity (drone) |
| RotorS [49] | Gazebo-based | Low (OpenGL) | IMU, GPS, RGBD, Lidar | IMU, GPS, RGB | X | Single | Velocity (drone), angular velocities (rotors) |
| FlightGoggles [50] | Flexible | High (Unity) | IMU, GPS, RGB | IMU, RGB | O | Single | Velocity (drone) |
| Airsim [26] | PhysX | High (Unreal Engine) | IMU, GPS, RGBD, Lidar | IMU, GPS, RGB | O | Multiple | Velocity (drone), rotation (drone) |
| Airsim with our model | PhysX | High (Unreal Engine) | IMU, GPS, RGBD, Lidar | IMU, GPS, RGB, Depth Camera | O | Multiple | Velocity (drone), rotation (drone), thrust values (rotors) |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Kim, W.; Luong, T.; Ha, Y.; Doh, M.; Yax, J.F.M.; Moon, H. High-Fidelity Drone Simulation with Depth Camera Noise and Improved Air Drag Force Models. Appl. Sci. 2023, 13, 10631. https://doi.org/10.3390/app131910631