Advanced 3D Navigation System for AGV in Complex Smart Factory Environments
Abstract
1. Introduction
1.1. Related Work
1.2. Contributions and Structure
2. Complex Scene Navigation System (CSNS)
2.1. System Framework
2.2. Mapping and Positioning
2.2.1. Establishment of Point Cloud Map and Grid Map
2.2.2. Positioning in Complex Environments
2.3. Path Planning Fusion Algorithm
2.3.1. Perceptual Data Preprocessing
2.3.2. Fusion Path Planning
3. Experiment and Analysis
3.1. Experimental Setup
3.2. Simulation Experiment Verification
3.2.1. Map Building and Localization Performance Analysis
3.2.2. Path Planning Performance Test
3.3. Real-World Deployment Experiment Verification and Analysis
3.3.1. Map Building and Localization Performance Analysis
3.3.2. Real Vehicle Navigation Test
4. Conclusions
Supplementary Materials
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
Algorithm | Planning Time with Obstacles | Planning Time without Obstacles
---|---|---
A* algorithm | 2.3 s | 1.8 s
Dijkstra algorithm | 3.0 s | 2.5 s
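The timings above correspond to the path planning performance test (Section 3.2.2), where A* outperforms Dijkstra on the same grid map both with and without obstacles. As an illustrative sketch only, and not the paper's planner or its grid map, the following Python example runs A* and Dijkstra on a small hand-made occupancy grid and measures each search's planning time; the grid, start and goal cells, 4-connectivity, and Manhattan heuristic are all assumptions made for this demonstration. Dijkstra is obtained simply by setting the heuristic to zero.

```python
# Illustrative sketch (not the authors' implementation): compare A* and
# Dijkstra planning time on a toy occupancy grid (0 = free, 1 = obstacle).
import heapq
import time


def plan(grid, start, goal, use_heuristic=True):
    """Return (path, elapsed_seconds) on a 4-connected occupancy grid."""
    rows, cols = len(grid), len(grid[0])

    def h(cell):
        # Manhattan-distance heuristic; a zero heuristic turns A* into Dijkstra.
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1]) if use_heuristic else 0

    t0 = time.perf_counter()
    open_set = [(h(start), 0, start)]            # (f = g + h, g, cell)
    g_cost = {start: 0}
    parent = {start: None}
    while open_set:
        _, g, cur = heapq.heappop(open_set)
        if cur == goal:
            break
        if g > g_cost.get(cur, float("inf")):
            continue                             # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if not (0 <= nxt[0] < rows and 0 <= nxt[1] < cols):
                continue
            if grid[nxt[0]][nxt[1]] == 1:        # skip occupied cells
                continue
            ng = g + 1
            if ng < g_cost.get(nxt, float("inf")):
                g_cost[nxt] = ng
                parent[nxt] = cur
                heapq.heappush(open_set, (ng + h(nxt), ng, nxt))
    elapsed = time.perf_counter() - t0

    if goal not in parent:
        return None, elapsed                     # goal unreachable
    path, node = [], goal
    while node is not None:
        path.append(node)
        node = parent[node]
    return path[::-1], elapsed


if __name__ == "__main__":
    # Toy 6x6 grid with obstacle walls; a real test would use the AGV's grid map.
    grid = [[0, 0, 0, 0, 0, 0],
            [0, 1, 1, 1, 1, 0],
            [0, 0, 0, 0, 1, 0],
            [1, 1, 1, 0, 1, 0],
            [0, 0, 0, 0, 0, 0],
            [0, 0, 0, 0, 0, 0]]
    for name, use_h in (("A*", True), ("Dijkstra", False)):
        path, dt = plan(grid, (0, 0), (5, 5), use_h)
        print(f"{name}: path length {len(path)}, planned in {dt * 1e3:.3f} ms")
```

On grids of this size both searches finish in well under a millisecond; the relative gap between A* and Dijkstra only becomes meaningful on maps comparable to the paper's test environments, where the heuristic prunes a large share of the expansions.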
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).