The Use of Terrestrial and Maritime Autonomous Vehicles in Nonintrusive Object Inspection
Abstract
1. Introduction
2. Typical Structure of Self-Moving Vehicles
3. Sensors Used for Self-Driving Vehicles
3.1. Obstacle Detection with the Video Camera
- Algorithms based on known-object recognition. These recognize previously learned objects and estimate the distance to them from their known dimensions. They are easy to implement and recognize familiar objects effectively, but are useless under uncertainty, e.g., when an obstacle was not previously learned;
- Motion-based algorithms. These analyze a sequence of images and compute the offset of each pixel. Combined with the vehicle's motion data, they can detect obstacles that appear in the vehicle's path without prior information on their type or shape. In most cases, an optical flow algorithm is used to implement this method;
- The stereo comparison method, based on searching for common patterns in two images, computing a binocular disparity map, and estimating the distance to the obstacle from the horizontal displacement (see the sketch after this list);
- The method of homographic transformation, which warps the image from one camera into the viewpoint of another camera and treats any significant difference between the warped and the reference image as an obstacle.
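As a minimal illustration of the stereo comparison method above, the following Python sketch uses OpenCV block matching to compute a disparity map from a rectified stereo pair and converts it to metric depth via Z = fB/d. The image file names, focal length, baseline, and the 5 m obstacle threshold are illustrative assumptions, not values taken from the systems reviewed here.

```python
# Minimal sketch of stereo-disparity-based distance estimation (OpenCV).
# File names, focal length and baseline below are illustrative placeholders.
import cv2
import numpy as np

left  = cv2.imread("left.png",  cv2.IMREAD_GRAYSCALE)   # rectified left image (assumed)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)   # rectified right image (assumed)

# Block-matching stereo: searches for common patterns along epipolar lines
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0  # StereoBM returns fixed-point values

# Distance from disparity: Z = f * B / d
# f_px - focal length in pixels, baseline_m - camera baseline in metres (both hypothetical)
f_px, baseline_m = 700.0, 0.12
valid = disparity > 0
depth_m = np.zeros_like(disparity)
depth_m[valid] = f_px * baseline_m / disparity[valid]

# Flag pixels closer than a safety threshold as potential obstacles
obstacle_mask = valid & (depth_m < 5.0)
print("obstacle pixels:", int(obstacle_mask.sum()))
```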
3.2. Interference Detection Using Active Sensors
- LIDAR uses laser radiation to calculate the distance to the target;
- RADAR uses radio waves to calculate the angle, type, distance, and speed of the obstacle;
- Ultrasonic sensors emit high-frequency sound pulses and measure the time until the reflected echo is detected to determine the distance to the object (see the sketch after this list).
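To make the ultrasonic time-of-flight principle concrete, the following Python sketch converts a measured round-trip echo time into a one-way distance using a temperature-corrected speed of sound; the echo time and the sensor driver are hypothetical placeholders rather than details of any specific device discussed in this review.

```python
# Minimal sketch of ultrasonic time-of-flight ranging.
# The echo time below stands in for a reading from a real sensor driver.

def speed_of_sound(temp_c: float) -> float:
    """Approximate speed of sound in air (m/s) as a function of temperature."""
    return 331.3 + 0.606 * temp_c

def echo_to_distance(echo_time_s: float, temp_c: float = 20.0) -> float:
    """Convert a round-trip echo time to a one-way distance in metres."""
    return speed_of_sound(temp_c) * echo_time_s / 2.0  # divide by 2: the pulse travels out and back

if __name__ == "__main__":
    # e.g., a 5.8 ms round trip at 20 degrees C corresponds to roughly 1 m
    print(f"{echo_to_distance(0.0058):.2f} m")
```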
3.3. Rationale for the Obstacle Sensor Choice
3.4. Obstacle Detection Algorithms
4. Route Planning Algorithms for Self-Driving Vehicles
- Bug algorithms;
- Naïve, or simplest, algorithms;
- Algorithms using distance sensors;
- Potential field-based algorithms;
- Graph-based algorithms;
- Formal algorithms (Dijkstra's algorithm, Floyd–Warshall algorithm);
- Heuristic algorithms (e.g., breadth-first search; see the grid sketch after this list);
- Hybrid search algorithms (for example, A*, D*).
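As a small, self-contained example of the graph-search family listed above, the following Python sketch runs breadth-first search on an occupancy grid and reconstructs the shortest unweighted path; the grid, start, and goal are illustrative placeholders, not data from the reviewed systems.

```python
# Minimal sketch of breadth-first search on an occupancy grid (1 marks an obstacle cell).
from collections import deque

def bfs_path(grid, start, goal):
    """Return a list of (row, col) cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            break
        r, c = cell
        for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):  # 4-connected moves
            nr, nc = nb
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 and nb not in came_from:
                came_from[nb] = cell
                frontier.append(nb)
    if goal not in came_from:
        return None
    path, cell = [], goal
    while cell is not None:            # walk back from goal to start
        path.append(cell)
        cell = came_from[cell]
    return path[::-1]

grid = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0]]
print(bfs_path(grid, (0, 0), (2, 0)))
```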
4.1. Bug Algorithms—Bug-1 Algorithm
4.2. Bug Algorithms with a Distance Sensor
4.3. Potential Field Algorithms
4.4. Algorithms on Graphs
4.5. Hybrid Search Algorithms
- (1) Dijkstra's algorithm;
- (2) Breadth-first search algorithm;
- (3) A* algorithm (see the sketch after this list).
- D* algorithms: D* and Focused D*;
- LPA* algorithms: LPA* and D* Lite algorithms.
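The following Python sketch illustrates the A* algorithm named above on a 4-connected occupancy grid with a Manhattan-distance heuristic; the grid, start, and goal are illustrative assumptions, and the sketch omits the incremental replanning that distinguishes D*, LPA*, and D* Lite.

```python
# Minimal sketch of A* on a 4-connected occupancy grid (1 marks an obstacle cell).
import heapq

def astar(grid, start, goal):
    """Return the path cost and cell sequence from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    h = lambda cell: abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])  # admissible heuristic
    open_set = [(h(start), 0, start)]              # entries are (f = g + h, g, cell)
    came_from, best_g = {start: None}, {start: 0}
    while open_set:
        f, g, cell = heapq.heappop(open_set)
        if cell == goal:
            path = []
            while cell is not None:                # reconstruct the path backwards
                path.append(cell)
                cell = came_from[cell]
            return g, path[::-1]
        r, c = cell
        for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nb
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1                         # unit cost per move
                if ng < best_g.get(nb, float("inf")):
                    best_g[nb] = ng
                    came_from[nb] = cell
                    heapq.heappush(open_set, (ng + h(nb), ng, nb))
    return None

grid = [[0, 0, 0],
        [0, 1, 0],
        [0, 1, 0]]
print(astar(grid, (2, 0), (2, 2)))
```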
4.6. Rationale for the Choice of Algorithm
- They require information on the existence of all obstacles, even those beyond the range of the sensors;
- They have high computational complexity, as they require a complete recalculation of the potential field after each new measurement (a minimal sketch of such a field computation follows this list);
- They do not guarantee that the goal will be reached for particular obstacle configurations or with incomplete initial data;
- They do not take the kinematics of the self-driving vehicle into account.
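For reference, the following Python sketch shows the classical attractive/repulsive artificial potential field computation whose limitations are listed above; the gains, influence range, and positions are illustrative assumptions, not parameters from any reviewed system.

```python
# Minimal sketch of an artificial potential field (attractive goal + repulsive obstacles).
# All gains, ranges and positions are illustrative placeholders.
import numpy as np

def potential_gradient(pos, goal, obstacles, k_att=1.0, k_rep=100.0, d0=2.0):
    """Negative gradient of the total potential at pos, i.e., the desired motion direction."""
    pos, goal = np.asarray(pos, float), np.asarray(goal, float)
    force = -k_att * (pos - goal)                      # attractive term pulls toward the goal
    for obs in obstacles:
        diff = pos - np.asarray(obs, float)
        d = np.linalg.norm(diff)
        if 0.0 < d < d0:                               # repulsion acts only within influence range d0
            force += k_rep * (1.0 / d - 1.0 / d0) * diff / d**3
    return force

# One gradient-descent step from (0, 0) toward (10, 0) with an obstacle at (5, 0.5)
step = potential_gradient((0.0, 0.0), (10.0, 0.0), [(5.0, 0.5)])
print(step)
```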
5. Self-Driving Vehicles Positioning Principles and Navigation Control
5.1. General Principles of Global Positioning
- Calculating the object's location, speed, and movement direction from the GPS satellites' signals (a parsing sketch follows this list);
- Connecting external sensors to the analog or digital inputs of a tracker;
- Reading data from the vehicle's onboard equipment via either a serial port or the CAN interface;
- Storing a certain amount of data in internal memory during communication breaks;
- Transferring the received data to a central server for further processing.
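To illustrate the first item, the following Python sketch decodes a position fix from a standard NMEA GGA sentence of the kind a GPS tracker reads over its serial port; the sample sentence is a common textbook example, not data from this article, and real trackers typically also parse RMC sentences for speed and heading.

```python
# Minimal sketch of decoding a position from a GPS receiver's NMEA GGA sentence.

def ddmm_to_decimal(value: str, hemisphere: str) -> float:
    """Convert NMEA ddmm.mmmm (or dddmm.mmmm) to signed decimal degrees."""
    raw = float(value)
    degrees = int(raw // 100)
    minutes = raw - degrees * 100
    decimal = degrees + minutes / 60.0
    return -decimal if hemisphere in ("S", "W") else decimal

def parse_gga(sentence: str):
    """Return (latitude, longitude, altitude_m) from a $..GGA sentence, or None if no fix."""
    fields = sentence.split("*")[0].split(",")   # drop the checksum, split the fields
    if not fields[0].endswith("GGA") or fields[6] == "0":
        return None                              # wrong sentence type or no fix
    lat = ddmm_to_decimal(fields[2], fields[3])
    lon = ddmm_to_decimal(fields[4], fields[5])
    return lat, lon, float(fields[9])

print(parse_gga("$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"))
# -> approximately (48.1173, 11.5167, 545.4)
```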
5.2. Software for Autonomous Moving Control
6. Maritime-Specific Sensors and Nonintrusive Object Inspection
7. Nonintrusive Control with Self-Driving Vehicles
8. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
| Features | LIDAR | RADAR | Ultrasonic | Video Camera |
|---|---|---|---|---|
| Detection of close objects | low | high | very high | low |
| Viewing angle | 360° | 360° | 30° | ~90° |
| Effective distance | ~100 m | ~0.15–250 m | 0.03–10 m | ~250 m |
| Operation in darkness | + | + | + | – |
| Speed detection | + | + | – | – |
| Device price | USD 70,000 | ~USD 200 | ~USD 1 per unit | ~USD 100 |
| Processing unit price | USD 100+ | USD 100+ | USD 10 | USD 100+ |
| Algorithm | Computational Complexity | Guarantees Goal Achievement | Takes Kinematics into Account | Algorithm Speed | Works in a Dynamically Changing Environment |
|---|---|---|---|---|---|
| Bug-1 | Simple | Yes | No | Slow | Yes |
| Bug algorithms with a distance sensor | Medium | Yes | Yes/No | Medium | Yes |
| Tangent-bug | Medium | Yes | No | Medium | Yes |
| InsertBug | Medium | Yes | Yes | Medium | Yes |
| Potential field algorithms | Complex | No | No | Fast | No |
| Algorithms on graphs | Complex | No | No | Fast | No |
| Hybrid search algorithms | Very complex | Yes | Yes | Fast | Yes |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).