The Perception System of Intelligent Ground Vehicles in All Weather Conditions: A Systematic Literature Review
Abstract
1. Introduction
- Additionally, a 3D visualization of state-of-the-art sensing technologies is presented as a spider chart, clarifying emerging trends and gaps in development. Based on this spider chart, sensor reliability challenges and gaps can be tackled for the efficient implementation of automated driving in all weather conditions.
- A sensor fusion perspective is proposed, involving several strategies and tasks that facilitate active sensor toggling (switching). The active toggling strategy selects sensors according to the environmental awareness context. Moreover, potential sensor combinations are proposed for selected driving safety applications in all weather conditions.
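The active toggling idea can be sketched as a small context-to-sensor lookup. The weather contexts, sensor names, and trust sets below are illustrative assumptions for the sketch, not values taken from the review.

```python
# Hypothetical sketch of "active sensor toggling": a context-aware supervisor
# keeps only the sensors considered trustworthy for the detected weather
# context. The profile table is an invented example; a real system would
# derive it from the sensor characterization surveyed in Section 4.

WEATHER_PROFILES = {
    # context -> sensors assumed reliable in that context (illustrative)
    "clear": {"camera", "lidar", "radar", "ultrasonic"},
    "rain":  {"radar", "ultrasonic", "camera"},
    "fog":   {"radar", "far_infrared"},
    "snow":  {"radar", "far_infrared", "ultrasonic"},
}

def toggle_sensors(context: str, available: set[str]) -> set[str]:
    """Return the subset of available sensors to keep active for a context."""
    preferred = WEATHER_PROFILES.get(context, available)
    active = available & preferred
    # Fall back to all available sensors rather than going blind when the
    # context is unknown or no preferred sensor is mounted on the vehicle.
    return active if active else available

active = toggle_sensors("fog", {"camera", "lidar", "radar", "far_infrared"})
```

The fallback branch reflects the safety intent of the perspective: toggling should never disable every sensor at once.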
2. Evolution of Intelligent Vehicle Technology
2.1. Phase I (1980 to 2003)
2.2. Phase II (2003 to 2008)
2.3. Phase III (from 2008)
3. Automated Navigation Features in Difficult Weather Conditions
3.1. Forward Assistance
3.1.1. Adaptive Cruise Control (ACC)
3.1.2. Forward Collision Avoidance (FCA)
3.1.3. Road and Traffic Sign Recognition
3.1.4. Traffic Jam Assist (TJA)
3.2. Lateral Assistance
3.2.1. Lane Departure Warning (LDW) and Lane Keeping Assistance (LKA)
3.2.2. Lane Change Assistance (LCA)/Blind Spot Monitoring (BSM)
4. Sensors
4.1. Overview
4.2. Radar
4.3. Lidar
4.4. Ultrasonic Sensor
4.5. Vision-Based Systems
4.6. Far-Infrared Camera
4.7. Emerging Technology
5. Perspective on Sensor Fusion
- At the first stage, the fusion system must identify the features that distinguish the current navigation environment from normal conditions. There should therefore be a context-aware mechanism that adapts the confidence level assigned to each piece of sensor information. When snow or rain is falling, the context-aware mechanism can simply use the camera and weather data to confirm the occurrence. Furthermore, a training process can be used to classify different weather-related road contexts.
- At the second stage, fusion processing can be carried out to provide the most recent sensing information together with its corresponding confidence level. Although the fusion concept can enhance the capability of automated navigation features in all weather conditions, the overall processing demand can be considerable. The fusion hardware and software architectures should therefore be analyzed in depth before further implementation steps.
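The two stages above can be illustrated as confidence-weighted fusion of range readings: stage one assigns each sensor a context-dependent confidence weight, and stage two fuses the latest readings using those weights. All weights and readings below are invented for illustration; a real system would calibrate or learn them.

```python
# Minimal sketch of the two-stage fusion perspective described above.
# Stage 1 output: per-sensor confidence for the current weather context.
# Stage 2: fuse per-sensor range estimates as a confidence-weighted average.

def fuse(readings: dict[str, float], confidence: dict[str, float]) -> float:
    """Confidence-weighted fusion of per-sensor range estimates (metres)."""
    total = sum(confidence.get(s, 0.0) for s in readings)
    if total == 0.0:
        raise ValueError("no trusted sensor available")
    return sum(r * confidence.get(s, 0.0) for s, r in readings.items()) / total

# Stage 1 (assumed): the context-aware mechanism has down-weighted the
# optical sensors because rain was detected.
rain_confidence = {"radar": 0.9, "lidar": 0.3, "camera": 0.2}

# Stage 2: fuse the most recent readings with those confidence levels.
distance = fuse({"radar": 41.0, "lidar": 44.0, "camera": 48.0}, rain_confidence)
```

Because radar carries most of the weight in this rain context, the fused estimate stays close to the radar reading while still incorporating the optical sensors.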
- Adaptive cruise control (ACC)
- Forward collision avoidance (FCA), traffic jam assist (TJA), and blind spot monitoring (BSM) systems
- Road and traffic sign recognition (TSR)
- Lane departure warning and lane keeping assistance (LDW and LKA) safety systems
- Parking assistance (PA) systems
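A sensor-combination proposal for the applications listed above could be encoded as a simple lookup. The pairings below are assumed examples drawn from the sensor types covered in Section 4, not the combinations actually proposed in the review.

```python
# Illustrative (not prescriptive) mapping from the safety applications listed
# above to candidate sensor suites. The specific suites are assumptions for
# the sketch; the review proposes its own combinations per application.
APP_SENSORS = {
    "ACC":     ("radar", "camera"),
    "FCA":     ("radar", "lidar", "camera"),
    "TJA":     ("radar", "camera", "ultrasonic"),
    "BSM":     ("radar", "ultrasonic"),
    "TSR":     ("camera", "lidar"),
    "LDW/LKA": ("camera", "lidar"),
    "PA":      ("ultrasonic", "camera"),
}

def sensors_for(app: str) -> tuple[str, ...]:
    """Return the candidate sensor suite for an application, or () if unknown."""
    return APP_SENSORS.get(app, ())
```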
6. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- U.S. Department of Transportation, FHWA. Weather Impact on Safety. Available online: https://ops.fhwa.dot.gov/weather/q1_roadimpact.htm (accessed on 1 October 2019).
- NHTSA. Traffic Safety Facts. Available online: https://crashstats.nhtsa.dot.gov/Api/Public/ViewPublication/812806 (accessed on 1 November 2019).
- El Faouzi, N.E.; Heilmann, B.; Aron, M.; Do, M.T.; Hautiere, N.; Monteil, J. Real Time Monitoring Surveillance and Control of Road Networks under Adverse Weather Condition; HAL: Houston, TX, USA, 2010. [Google Scholar]
- SAE International. SAE J3016: Levels of Driving Automation. 2019. Available online: https://www.sae.org/news/2019/01/sae-updates-j3016-automated-driving-graphic (accessed on 1 October 2019).
- Ando, Y.M.R.; Nishihori, Y.; Yang, J. Effects of Advanced Driver Assistance System for Elderly’s Safe Transportation. In Proceedings of the SMART ACCESSIBILITY 2018: The Third International Conference on Universal Accessibility in the Internet of Things and Smart Environments, Rome, Italy, 25–29 March 2018. [Google Scholar]
- Kang, Y.; Yin, H.; Berger, C. Test Your Self-Driving Algorithm: An Overview of Publicly Available Driving Datasets and Virtual Testing Environments. IEEE Trans. Intell. Veh. 2018. [Google Scholar] [CrossRef]
- Yurtsever, E.; Lambert, J.; Carballo, A.; Takeda, K. A Survey of Autonomous Driving: Common Practices and Emerging Technologies. IEEE Access 2020, 8, 58443–58469. [Google Scholar] [CrossRef]
- Marti, E.; de Miguel, M.A.; Garcia, F.; Perez, J. A Review of Sensor Technologies for Perception in Automated Driving. IEEE Intell. Transp. Syst. Mag. 2019, 11, 94–108. [Google Scholar] [CrossRef] [Green Version]
- Rosique, F.; Navarro, P.J.; Fernandez, C.; Padilla, A. A Systematic Review of Perception System and Simulators for Autonomous Vehicles Research. Sensors 2019, 19, 648. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Bengler, K.; Dietmayer, K.; Farber, B.; Maurer, M.; Stiller, C.; Winner, H. Three Decades of Driver Assistance Systems: Review and Future Perspectives. IEEE Intell. Transp. Syst. Mag. 2014, 6, 6–22. [Google Scholar] [CrossRef]
- Diermeier, D. Mercedes and the Moose Test (B). Kellogg Sch. Manag. Cases 2017. [Google Scholar] [CrossRef]
- Billington, J. The Prometheus Project: The Story Behind One of AV’s Greatest Developments. Available online: https://www.autonomousvehicleinternational.com/features/the-prometheus-project.html (accessed on 1 October 2019).
- Carnegie Mellon University. No Hands Across America Journal. Available online: https://www.cs.cmu.edu/~tjochem/nhaa/Journal.html (accessed on 1 October 2019).
- Bertozzi, M.; Broggi, A.; Conte, G.; Fascioli, R. The Experience of the ARGO Autonomous Vehicle. In Proceedings of the Enhanced and Synthetic Vision, Orlando, FL, USA, 13–17 April 1998. [Google Scholar] [CrossRef]
- Wikipedia. Parking Sensor. Available online: https://en.wikipedia.org/wiki/Parking_sensor (accessed on 1 October 2019).
- Meinel, H.H. Evolving automotive radar—From the very beginnings into the future. In Proceedings of the 8th European Conference on Antennas and Propagation (EuCAP 2014), The Hague, The Netherlands, 6–11 April 2014; pp. 3107–3114. [Google Scholar] [CrossRef]
- Bertozzi, M.; Broggi, A.; Fascioli, A. Vision-based Intelligent Vehicles: State of the Art and Perspectives. Robot. Auton. Syst. 2000, 32, 1–16. [Google Scholar] [CrossRef] [Green Version]
- Broggi, A.; Bertozzi, M.; Fascioli, A. Architectural Issues on Vision-Based Automatic Vehicle Guidance: The Experience of the ARGO Project. Real-Time Imaging 2000, 6, 313–324. [Google Scholar] [CrossRef] [Green Version]
- Thorpe, C.; Jochem, T.; Pomerleau, D. The 1997 automated highway free agent demonstration. In Proceedings of the Conference on Intelligent Transportation Systems, Boston, MA, USA, 12 October 1997; pp. 496–501. [Google Scholar] [CrossRef]
- Urmson, C.; Duggins, D.; Jochem, T.; Pomerleau, D.; Thorpe, C. From Automated Highways to Urban Challenges. In Proceedings of the 2008 IEEE International Conference on Vehicular Electronics and Safety, Columbus, OH, USA, 22–24 September 2008; pp. 6–10. [Google Scholar] [CrossRef]
- Bebel, J.C.; Howard, N.; Patel, T. An autonomous system used in the DARPA Grand Challenge. In Proceedings of the 7th International IEEE Conference on Intelligent Transportation Systems (IEEE Cat. No.04TH8749), Washington, DC, USA, 3–6 October 2004; pp. 487–490. [Google Scholar] [CrossRef]
- Velodyne. HDL—64E Lidar. Available online: https://velodynelidar.com/products/hdl-64e/ (accessed on 1 October 2020).
- Dickmann, J.; Appenrodt, N.; Klappstein, J.; Bloecher, H.L.; Muntzinger, M.; Sailer, A.; Hahn, M.; Brenk, C. Making Bertha See Even More: Radar Contribution. IEEE Access 2015, 3, 1233–1247. [Google Scholar] [CrossRef]
- Hoeger, R.; Amditis, A.; Kunert, M.; Hoess, A.; Flemisch, F.; Krueger, H.P.; Bartels, A.; Beutner, A.; Pagle, K. Highly Automated Vehicles for Intelligent Transport: HAVEit Approach. 2008. Available online: https://www.researchgate.net/publication/225000799_HIGHLY_AUTOMATED_VEHICLES_FOR_INTELLIGENT_TRANSPORT_HAVEit_APPROACH (accessed on 1 October 2019).
- Vanholme, B.; Gruyer, D.; Lusetti, B.; Glaser, S.; Mammar, S. Highly Automated Driving on Highways Based on Legal Safety. IEEE Trans. Intell. Transp. Syst. 2013, 14, 333–347. [Google Scholar] [CrossRef]
- Thomaidis, G.; Kotsiourou, C.; Grubb, G.; Lytrivis, P.; Karaseitanidis, G.; Amditis, A. Multi-sensor tracking and lane estimation in highly automated vehicles. IET Intell. Transp. Syst. 2013, 7, 160–169. [Google Scholar] [CrossRef]
- Dávila, A.; Nombela, M. Sartre–Safe Road Trains for the Environment Reducing Fuel Consumption through lower Aerodynamic Drag Coefficient. SAE Tech. Pap. 2011. [Google Scholar] [CrossRef]
- Bertozzi, M.; Bombini, L.; Broggi, A.; Buzzoni, M.; Cardarelli, E.; Cattani, S.; Cerri, P.; Debattisti, S.; Fedriga, R.; Felisa, M.; et al. The VisLab Intercontinental Autonomous Challenge: 13,000 km, 3 months, no driver. In Proceedings of the 17th World Congress on ITS, Busan, Korea, 1–5 September 2010. [Google Scholar]
- Englund, C.; Chen, L.; Ploeg, J.; Semsar-Kazerooni, E.; Voronov, A.; Bengtsson, H.H.; Didoff, J. The Grand Cooperative Driving Challenge 2016: Boosting the Introduction of Cooperative Automated Vehicles. 2016. Available online: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=7553038&isnumber=7553013 (accessed on 1 October 2019).
- Alkim, T. European Truck Platooning Challenge. Available online: http://wiki.fot-net.eu/index.php/European_Truck_Platooning_Challenge (accessed on 1 October 2019).
- Tsugawa, S.; Jeschke, S.; Shladover, S.E. A Review of Truck Platooning Projects for Energy Savings. IEEE Trans. Intell. Veh. 2016, 1, 68–77. [Google Scholar] [CrossRef]
- Poczte, S.L.; Jankovic, L.M. The Google Car: Driving Toward A Better Future? J. Bus. Case Stud. First Quart. 2014, 10, 1–8. [Google Scholar]
- Nissan. Nissan Q50 2014. Available online: http://www.nissantechnicianinfo.mobi/htmlversions/2013_Q50_Special/Safety.html (accessed on 1 October 2019).
- Tesla. Available online: https://en.wikipedia.org/wiki/Tesla_Autopilot (accessed on 1 October 2019).
- Ford. Ford and Mobileye to Offer Better Camera-Based Collision Avoidance Tech. Available online: https://www.kbford.com/blog/2020/july/28/ford-and-mobileye-to-offer-better-camera-based-collision-avoidance-tech.htm (accessed on 1 October 2019).
- Xiao, L.; Gao, F. A comprehensive review of the development of adaptive cruise control systems. Veh. Syst. Dyn. 2010, 48, 1167–1192. [Google Scholar] [CrossRef]
- Ioannou, P. Guest Editorial Adaptive Cruise Control Systems Special ISSUE. 2003. Available online: https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=1255572 (accessed on 1 October 2019).
- Rajamani, R.; Zhu, C. Semi-autonomous adaptive cruise control systems. IEEE Trans. Veh. Technol. 2002, 51, 1186–1192. [Google Scholar] [CrossRef]
- Winner, H.; Schopper, M. Adaptive Cruise Control. In Handbook of Driver Assistance Systems: Basic Information, Components and Systems for Active Safety and Comfort; Winner, H., Hakuli, S., Lotz, F., Singer, C., Eds.; Springer International Publishing: Cham, Switzerland, 2016; pp. 1093–1148. [Google Scholar]
- Robinson, S.R. The Infrared & Electro-Optical Systems Handbook. Volume 8, No. 25. 1993. Available online: https://apps.dtic.mil/dtic/tr/fulltext/u2/a364018.pdf (accessed on 1 October 2019).
- FLIR. Seeing through Fog and Rain with a Thermal Imaging Camera. 2019. Available online: https://www.flirmedia.com/MMC/CVS/Tech_Notes/TN_0001_EN.pdf (accessed on 1 October 2019).
- Forward Vehicle Collision Warning Systems—Performance Requirements and Test Procedures, Intelligent Transport System ISO/TC 204, ISO. 2013. Available online: https://www.iso.org/obp/ui/#iso:std:iso:15623:ed-2:v1:en (accessed on 1 October 2019).
- Winner, H. Handbook of Driver Assistance Systems (Fundamentals of Collision Protection Systems); Springer International Publishing: Cham, Switzerland, 2016. [Google Scholar]
- Sign Detection with LIDAR. Available online: https://unmanned.tamu.edu/projects/sign-detection-with-lidar/ (accessed on 1 October 2019).
- Johansson, B. Road Sign Recognition from a Moving Vehicle. Master’s Thesis, Uppsala University, Uppsala, Sweden, 2003. [Google Scholar]
- Yang, Y.; Luo, H.; Xu, H.; Wu, F. Towards Real-Time Traffic Sign Detection and Classification. IEEE Trans. Intell. Transp. Syst. 2016, 17, 2022–2031. [Google Scholar] [CrossRef]
- Chen, E.H.; Röthig, P.; Zeisler, J.; Burschka, D. Investigating Low Level Features in CNN for Traffic Sign Detection and Recognition. In Proceedings of the 2019 IEEE Intelligent Transportation Systems Conference (ITSC), Auckland, New Zealand, 27–30 October 2019; pp. 325–332. [Google Scholar] [CrossRef]
- Lüke, S.; Fochler, O.; Schaller, T.; Regensburger, U. Traffic-Jam Assistance and Automation. In Handbook of Driver Assistance Systems: Basic Information, Components and Systems for Active Safety and Comfort; Winner, H., Hakuli, S., Lotz, F., Singer, C., Eds.; Springer International Publishing: Berlin/Heidelberg, Germany, 2014. [Google Scholar]
- Traffic Jam Assistance-Support of the Driver in the Transverse and Longitudinal Guidance. 2008. Available online: https://mediatum.ub.tum.de/doc/1145106/1145106.pdf (accessed on 1 October 2019).
- Cicchino, J.B. Effects of lane departure warning on police-reported crash rates. J. Saf. Res. 2018, 66, 61–70. [Google Scholar] [CrossRef]
- Yenikaya, S.; Yenikaya, G.; Düven, E. Keeping the Vehicle on the Road-A Survey on On-Road Lane Detection Systems. ACM Comput. Surv. 2013. [Google Scholar] [CrossRef]
- Hata, A.; Wolf, D. Road marking detection using LIDAR reflective intensity data and its application to vehicle localization. In Proceedings of the 17th International IEEE Conference on Intelligent Transportation Systems (ITSC), Qingdao, China, 20 November 2014; pp. 584–589. [Google Scholar]
- Feniche, M.; Mazri, T. Lane Detection and Tracking For Intelligent Vehicles: A Survey. In Proceedings of the 2019 International Conference of Computer Science and Renewable Energies (ICCSRE), Agadir, Morocco, 22–24 July 2019. [Google Scholar] [CrossRef]
- Li, Q.; Chen, L.; Li, M.; Shaw, S.; Nüchter, A. A Sensor-Fusion Drivable-Region and Lane-Detection System for Autonomous Vehicle Navigation in Challenging Road Scenarios. IEEE Trans. Veh. Technol. 2014, 63, 540–555. [Google Scholar] [CrossRef]
- Intelligent Transport System—Lane Change Decision Aid System, I. 17387:2008. 2008. Available online: https://www.iso.org/obp/ui/#iso:std:iso:17387:ed-1:v1:en (accessed on 1 October 2019).
- Zakuan, F.R.A.; Hamid, U.Z.A.; Limbu, D.K.; Zamzuri, H.; Zakaria, M.A. Performance Assessment of an Integrated Radar Architecture for Multi-Types Frontal Object Detection for Autonomous Vehicle. In Proceedings of the 2018 IEEE International Conference on Automatic Control and Intelligent Systems (I2CACIS), Shah Alam, Malaysia, 20 October 2018; pp. 13–18. [Google Scholar] [CrossRef]
- ISO 20473:2007(E), Optics and Photonics—Spectral Bands; International Organization for Standardization, 2007. Available online: https://www.pointsdevue.com/03-iso-204732007e-optics-and-photonics-spectral-bands-international-organization-standardization# (accessed on 1 October 2019).
- Alibaba.com. Ultrasonic Parking Sensor. Available online: https://www.alibaba.com/showroom/ultrasonic-parking-sensor.html (accessed on 1 October 2020).
- NHTSA. Preliminary Cost-Benefit Analysis of Ultrasonic and Camera Backup Systems. 2006. Available online: https://www.nhtsa.gov/sites/nhtsa.dot.gov/files/nhtsa-2006-25579-0002.pdf (accessed on 1 October 2019).
- Digitimes Research: 79GHz to Replace 24GHz for Automotive Millimeter-Wave Radar Sensors. Available online: https://www.digitimes.com/news/a20170906PD208.html#:~:text=In%202017%2C%20prices%20for%2024GHz,and%20US%24100%2D200%20respectively (accessed on 1 October 2020).
- Lambert, E.G.S. LiDAR Systems: Costs, Integration, and Major Manufacturers. Available online: https://www.mes-insights.com/lidar-systems-costs-integration-and-major-manufacturers-a-908358/ (accessed on 1 October 2020).
- Khader, S.C.M. An Introduction to Automotive Light Detection and Ranging (LIDAR) and Solutions to Serve Future Autonomous Driving Systems. Available online: https://www.ti.com/lit/wp/slyy150a/slyy150a.pdf?ts=1602909172290&ref_url=https%253A%252F%252Fwww.google.com%252F#:~:text=As%20LIDAR%20has%20gained%20in,than%20US%24200%20by%202022 (accessed on 1 October 2020).
- CSI. Mobileye Car System Installation. Available online: https://www.carsystemsinstallation.ca/product/mobileye-630/ (accessed on 1 October 2020).
- Tech, E. Mobileye Monocamera. Available online: https://www.extremetech.com/extreme/145610-mobileye-outfits-old-cars-with-new-electronic-vision (accessed on 1 October 2020).
- GroupGets. FLIR ADK—Thermal Vision Automotive Development Kit. Available online: https://store.groupgets.com/products/flir-adk-thermal-vision-automotive-development-kit#:~:text=FLIR%20ADK%20%2D%20Thermal%20Vision%20Automotive%20Development%20Kit%20%E2%80%93%20GroupGets (accessed on 1 October 2019).
- Meinel, H.H.; Bösch, W. Radar Sensors in Cars. In Automated Driving: Safer and More Efficient Future Driving; Watzenig, D., Horn, M., Eds.; Springer International Publishing: Cham, Switzerland, 2017; pp. 245–261. [Google Scholar]
- Stuff, A. Radar Technical Specifications. Available online: https://autonomoustuff.com/wp-content/uploads/2020/06/radar-comp-chart.pdf (accessed on 1 October 2020).
- BOSCH. Mid Range Radar. Available online: https://www.bosch-mobility-solutions.com/en/products-and-services/passenger-cars-and-light-commercial-vehicles/driver-assistance-systems/lane-change-assist/mid-range-radar-sensor-mrrrear/ (accessed on 1 October 2020).
- Schneider, R.; Wenger, J. High resolution radar for automobile applications. Adv. Radio Sci. 2003, 1, 105–111. [Google Scholar] [CrossRef]
- Schneider, M. Automotive radar–Status and trends. In Proceedings of the German Microwave Conference, Ulm, Germany, 5–7 April 2005. [Google Scholar]
- Fölster, F.; Rohling, H. Signal processing structure for automotive radar. Frequenz 2006, 60, 20–24. [Google Scholar] [CrossRef]
- Li, J.; Stoica, P. MIMO radar with colocated antennas. IEEE Signal Process. Mag. 2007, 24, 106–114. [Google Scholar] [CrossRef]
- Rasshofer, R.H. Functional requirements of future automotive radar systems. In Proceedings of the 2007 European Radar Conference, Munich, Germany, 10–12 October 2007; pp. 259–262. [Google Scholar] [CrossRef]
- Schoor, M.; Yang, B. High-Resolution Angle Estimation for an Automotive FMCW Radar Sensor. In Proceedings of the International Radar Symposium (IRS), Cologne, Germany, 5–7 September 2007. [Google Scholar]
- Brunnbauer, M.; Meyer, T.; Ofner, G.; Mueller, K.; Hagen, R. Embedded wafer level ball grid array (eWLB). In Proceedings of the 2008 33rd IEEE/CPMT International Electronics Manufacturing Technology Conference (IEMT), Penang, Malaysia, 4–6 November 2008. [Google Scholar]
- Knapp, H.; Treml, M.; Schinko, A.; Kolmhofer, E.; Matzinger, S.; Strasser, G.; Lachner, R.; Maurer, L.; Minichshofer, J. Three-channel 77 GHz automotive radar transmitter in plastic package. In Proceedings of the 2012 IEEE Radio Frequency Integrated Circuits Symposium, Montreal, QC, Canada, 17–19 June 2012; pp. 119–122. [Google Scholar] [CrossRef]
- Wagner, C.; Böck, J.; Wojnowski, M.; Jäger, H.; Platz, J.; Treml, M.; Dober, F.; Lachner, R.; Minichshofer, J.; Maurer, L. A 77GHz automotive radar receiver in a wafer level package. In Proceedings of the 2012 IEEE Radio Frequency Integrated Circuits Symposium, Montreal, QC, Canada, 17–19 June 2012; pp. 511–514. [Google Scholar] [CrossRef]
- Keysight Technologies. How Millimeter Wave Automotive Radar Enhances ADAS and Autonomous Driving. 2018. Available online: www.keysight.com (accessed on 1 October).
- Patole, S.M.; Torlak, M.; Wang, D.; Ali, M. Automotive radars: A review of signal processing techniques. IEEE Signal Process. Mag. 2017, 34, 22–35. [Google Scholar] [CrossRef]
- Dickmann, J.; Klappstein, J.; Hahn, M.; Appenrodt, N.; Bloecher, H.L.; Werber, K.; Sailer, A. Automotive radar the key technology for autonomous driving: From detection and ranging to environmental understanding. In Proceedings of the 2016 IEEE Radar Conference (RadarConf), Philadelphia, PA, USA, 2–6 May 2016. [Google Scholar] [CrossRef]
- Meinl, F.; Stolz, M.; Kunert, M.; Blume, H. An experimental high performance radar system for highly automated driving. In Proceedings of the 2017 IEEE MTT-S International Conference on Microwaves for Intelligent Mobility (ICMIM), Nagoya, Japan, 19–21 March 2017; pp. 71–74. [Google Scholar] [CrossRef]
- Brisken, S.; Ruf, F.; Höhne, F. Recent evolution of automotive imaging radar and its information content. IET Radar Sonar Navig. 2018, 12, 1078–1081. [Google Scholar] [CrossRef]
- Eltrass, A.; Khalil, M. Automotive radar system for multiple-vehicle detection and tracking in urban environments. IET Intell. Transp. Syst. 2018, 12, 783–792. [Google Scholar] [CrossRef]
- Huang, X.; Ding, J.; Liang, D.; Wen, L. Multi-Person Recognition Using Separated Micro-Doppler Signatures. IEEE Sens. J. 2020, 20, 6605–6611. [Google Scholar] [CrossRef]
- Lee, S.; Yoon, Y.; Lee, J.; Kim, S. Human–vehicle classification using feature-based SVM in 77-GHz automotive FMCW radar. IET Radar Sonar Navig. 2017, 11, 1589–1596. [Google Scholar] [CrossRef]
- Zang, S.; Ding, M.; Smith, D.; Tyler, P.; Rakotoarivelo, T.; Kaafar, M.A. The Impact of Adverse Weather Conditions on Autonomous Vehicles: How Rain, Snow, Fog, and Hail Affect the Performance of a Self-Driving Car. IEEE Veh. Technol. Mag. 2019, 14, 103–111. [Google Scholar] [CrossRef]
- Yang, R.K.; Li, L.; Ma, H.H. Effects of Backscattering Enhancement Considering Multiple Scattering in Rain on MMW Radar Performance; NISCAIR-CSIR: New Delhi, India, 2013. [Google Scholar]
- Huang, J.; Jiang, S.; Lu, X. Rain Backscattering Properties And Effects On The Radar Performance At Mm Wave Band. Int. J. Infrared Millim. Waves 2001, 22, 917–922. [Google Scholar] [CrossRef]
- Norouzian, F.; Marchetti, E.; Hoare, E.; Gashinova, M.; Constantinou, C.; Gardner, P.; Cherniakov, M. Experimental study on low-THz automotive radar signal attenuation during snowfall. IET Radar Sonar Navig. 2019, 13, 1421–1427. [Google Scholar] [CrossRef]
- Pozhidaev, V.N. Estimation of attenuation and backscattering of millimeter radio waves in meteorological formations. J. Commun. Technol. Electron. 2010, 55, 1223–1230. [Google Scholar] [CrossRef]
- Hasirlioglu, S.; Riener, A. Challenges in Object Detection Under Rainy Weather Conditions. In Proceedings of the Second EAI International Conference, INTSYS 2018, Guimarães, Portugal, 21–23 November 2018. [Google Scholar]
- Wang, J.-G.; Chen, S.J.; Zhou, L.-B.; Wan, K.; Yau, W.-Y. Vehicle Detection and Width Estimation in Rain by Fusing Radar and Vision. In Proceedings of the 2018 15th International Conference on Control, Automation, Robotics and Vision (ICARCV), Singapore, 18–21 November 2018; pp. 1063–1068. [Google Scholar]
- Peynot, T.; Underwood, J.; Scheding, S. Towards reliable perception for unmanned ground vehicles in challenging conditions. In Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, St. Louis, MO, USA, 10–15 October 2009; pp. 1170–1176. [Google Scholar] [CrossRef] [Green Version]
- Brooker, G.; Hennessey, R.; Lobsey, C.; Bishop, M.; Widzyk-Capehart, E. Seeing through Dust and Water Vapor: Millimeter Wave Radar Sensors for Mining Applications. J. Field Robot. 2007, 24, 527–557. [Google Scholar] [CrossRef]
- Alland, S.; Stark, W.; Ali, M.; Hegde, M. Interference in Automotive Radar Systems: Characteristics, Mitigation Techniques, and Current and Future Research. IEEE Signal Process. Mag. 2019, 36, 45–59. [Google Scholar] [CrossRef]
- Wikipedia. Lidar. 2019. Available online: https://en.wikipedia.org/wiki/Lidar (accessed on 1 October 2019).
- Hecht, J. Lidar for Self Driving. 2018, pp. 1–8. Available online: https://www.osapublishing.org/DirectPDFAccess/4577511A-CFD1-17871681F42CA46FF8BC_380434/opn-29-1-26.pdf?da=1&id=380434&seq=0&mobile=no (accessed on 1 October 2019).
- Stuff, A. Lidar Technical Comparison. 2019. Available online: https://autonomoustuff.com/lidar-chart (accessed on 1 October 2019).
- Velodyne. HDL 64. Available online: https://velodynelidar.com/products/hdl-64e/ (accessed on 1 October 2020).
- Wallace, A.M.; Halimi, A.; Buller, G.S. Full Waveform LiDAR for Adverse Weather Conditions. IEEE Trans. Veh. Technol. 2020, 69, 7064–7077. [Google Scholar] [CrossRef]
- Li, Y.; Ibanez-Guzman, J. Lidar for Autonomous Driving: The Principles, Challenges, and Trends for Automotive Lidar and Perception Systems. IEEE Signal Process. Mag. 2020, 37, 50–61. [Google Scholar] [CrossRef]
- Zermas, D.; Izzat, I.; Papanikolopoulos, N. Fast segmentation of 3D point clouds: A paradigm on LiDAR data for autonomous vehicle applications. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; pp. 5067–5073. [Google Scholar] [CrossRef]
- Bogoslavskyi, I.; Stachniss, C. Fast range image-based segmentation of sparse 3D laser scans for online operation. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea, 9–14 October 2016; pp. 163–169. [Google Scholar] [CrossRef]
- Petrovskaya, A.; Thrun, S. Model based vehicle detection and tracking for autonomous urban driving. Auton Robot 2009, 26, 123–129. [Google Scholar] [CrossRef]
- Himmelsbach, M.; Mueller, A.; Lüttel, T.; Wünsche, H.J. LIDAR-based 3D Object Perception. In Proceedings of the 1st Int. Workshop on Cognition for Technical Systems, Muenchen, Germany, 6–10 May 2008. [Google Scholar]
- Capellier, E.; Davoine, F.; Cherfaoui, V.; Li, Y. Evidential deep learning for arbitrary LIDAR object classification in the context of autonomous driving. In Proceedings of the 2019 IEEE Intelligent Vehicles Symposium (IV), Paris, France, 9–12 June 2019; pp. 1304–1311. [Google Scholar] [CrossRef] [Green Version]
- Zeng, W.D.; Posner, I.; Newman, P. What could move? Finding cars, pedestrians and bicyclists in 3D laser data. In Proceedings of the 2012 IEEE International Conference on Robotics and Automation, Saint Paul, MN, USA, 14–18 May 2012; pp. 4038–4044. [Google Scholar] [CrossRef]
- Douillard, B.; Underwood, J.; Vlaskine, V.; Quadros, A.; Singh, S. A Pipeline for the Segmentation and Classification of 3D Point Clouds; Springer: Berlin/Heidelberg, Germany, 2014. [Google Scholar] [CrossRef] [Green Version]
- Premebida, C.; Ludwig, O.; Nunes, U. Exploiting LIDAR-based features on pedestrian detection in urban scenarios. In Proceedings of the 2009 12th International IEEE Conference on Intelligent Transportation Systems, St. Louis, MO, USA, 4–7 October 2009. [Google Scholar] [CrossRef]
- Kraemer, S.; Stiller, C.; Bouzouraa, M.E. LiDAR-Based Object Tracking and Shape Estimation Using Polylines and Free-Space Information. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 4515–4522. [Google Scholar] [CrossRef]
- Zhang, X.; Xu, W.; Dong, C.; Dolan, J.M. Efficient L-shape fitting for vehicle detection using laser scanners. In Proceedings of the 2017 IEEE Intelligent Vehicles Symposium (IV), Los Angeles, CA, USA, 11–14 June 2017; pp. 54–59. [Google Scholar] [CrossRef]
- Rachman, A.S.A. 3D-LIDAR Multi Object Tracking for Autonomous Driving. 2017. Available online: https://www.semanticscholar.org/paper/3D-LIDAR-Multi-Object-Tracking-for-Autonomous-and-Rachman/bafc8fcdee9b22708491ea1293524ece9e314851 (accessed on 1 October 2019).
- Hespel, L.; Riviere, N.; Huet, T.; Tanguy, B.; Ceolato, R. Performance evaluation of laser scanners through the atmosphere with adverse condition. Proc SPIE 2011. [Google Scholar] [CrossRef]
- Heinzler, R.; Schindler, P.; Seekircher, J.; Ritter, W.; Stork, W. Weather Influence and Classification with Automotive Lidar Sensors. In Proceedings of the 2019 IEEE Intelligent Vehicles Symposium (IV), Paris, France, 9–12 June 2019; pp. 1527–1534. [Google Scholar] [CrossRef] [Green Version]
- Kutila, M.; Pyykönen, P.; Holzhüter, H.; Colomb, M.; Duthon, P. Automotive LiDAR performance verification in fog and rain. In Proceedings of the IEEE Conference on Intelligent Transportation Systems, Proceedings, ITSC, Maui, HI, USA, 4–7 November 2018; pp. 1695–1701. [Google Scholar] [CrossRef]
- Bijelic, M.; Gruber, T.; Ritter, W. A Benchmark for Lidar Sensors in Fog: Is Detection Breaking Down? In Proceedings of the 2018 IEEE Intelligent Vehicles Symposium (IV), Changshu, China, 26–30 June 2018; pp. 760–767. [Google Scholar] [CrossRef] [Green Version]
- Li, Y.; Duthon, P.; Colomb, M.; Ibanez-Guzman, J. What Happens for a ToF LiDAR in Fog? IEEE Trans. Intell. Transp. Syst. 2020. [Google Scholar] [CrossRef]
- Kutila, M.; Pyykonen, P.; Ritter, W.; Sawade, O.; Schäufele, B. Automotive LIDAR sensor development scenarios for harsh weather conditions. In Proceedings of the 2016 IEEE 19th International Conference on Intelligent Transportation Systems (ITSC), Rio de Janeiro, Brazil, 1–4 November 2016; pp. 265–270. [Google Scholar]
- Goodin, C.; Carruth, D.; Doude, M.; Hudson, C. Predicting the influence of rain on LIDAR in ADAS. Electronics 2019, 8, 89. [Google Scholar] [CrossRef] [Green Version]
- Byeon, M.; Yoon, S.W. Analysis of Automotive Lidar Sensor Model Considering Scattering Effects in Regional Rain Environments. IEEE Access 2020, 8, 102669–102679. [Google Scholar] [CrossRef]
- Michaud, S.; Lalonde, J.-F.; Giguère, P. Towards Characterizing the Behavior of LiDARs in Snowy Conditions. In Proceedings of the 7th Workshop on Planning, Perception and Navigation for Intelligent Vehicles, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–3 October 2015. [Google Scholar]
- Ryde, J.; Hillier, N. Performance of laser and radar ranging devices in adverse environmental conditions. J. Field Robot. 2009, 26, 712–727. [Google Scholar] [CrossRef]
- Wojtanowski, J.; Zygmunt, M.; Kaszczuk, M.; Mierczyk, Z.; Muzal, M. Comparison of 905 nm and 1550 nm semiconductor laser rangefinders’ performance deterioration due to adverse environmental conditions. Opto-Electron. Rev. 2014, 22, 183–190. [Google Scholar] [CrossRef]
- Tesla “Model ‘S’ Owners Manual”. 2020. Available online: https://www.tesla.com/sites/default/files/model_s_owners_manual_north_america_en_us.pdf (accessed on 1 October 2020).
- Technavio. Global Automotive Parking Sensors Market 2016–2020. 2016. Available online: https://www.technavio.com/report/global-automotive-electronicsglobal-automotive-parking-sensors-market-2016-2020 (accessed on 1 October 2020).
- Bosch. Ultrasonic Sensor Technical Performance. Available online: https://www.bosch-mobility-solutions.com/en/products-and-services/passenger-cars-and-light-commercial-vehicles/driver-assistance-systems/construction-zone-assist/ultrasonic-sensor/ (accessed on 1 October 2020).
- Murata Manufacturing Co., Ltd. Ultrasonic Sensors. Available online: http://www.symmetron.ru/suppliers/murata/files/pdf/murata/ultrasonic-sensors.pdf (accessed on 1 October 2019).
- Pepperl_Fuchs. Technology Guide for Ultrasonic Sensors. 2010. Available online: https://files.pepperl-fuchs.com/webcat/navi/productInfo/doct/tdoct3631a_eng.pdf?v=20181114123018 (accessed on 1 October 2019).
- GmbH, R.B. Bosch Automotive Handbook; Robert Bosch GmbH: Gerlingen, Germany, 2004. [Google Scholar]
- Massa, D.P. Choosing ultrasonic sensor for proximity or distance measurement. Sensors 1999, 16, 3. [Google Scholar]
- Nordevall, J. Method Development of Automotive Ultrasound Simulations. Applied Mechanics, Chalmers University of Technology, 2015. Available online: http://publications.lib.chalmers.se/records/fulltext/219224/219224.pdf (accessed on 1 October 2019).
- Hatano, H.; Yamazato, T.; Katayama, M. Automotive Ultrasonic Array Emitter for Short-range Targets Detection. In Proceedings of the 2007 4th International Symposium on Wireless Communication Systems, Trondheim, Norway, 17–19 October 2007; pp. 355–359. [Google Scholar]
- Agarwal, V.; Murali, N.V.; Chandramouli, C. A Cost-Effective Ultrasonic Sensor-Based Driver-Assistance System for Congested Traffic Conditions. IEEE Trans. Intell. Transp. Syst. 2009, 10, 486–498. [Google Scholar] [CrossRef]
- Adarsh, S.; Kaleemuddin, S.M.; Bose, D.; Ramachandran, K.I. Performance comparison of Infrared and Ultrasonic sensors for obstacles of different materials in vehicle/robot navigation applications. IOP Conf. Ser. Mater. Sci. Eng. 2016, 149, 012141. [Google Scholar] [CrossRef] [Green Version]
- Kapoor, R.; Ramasamy, S.; Gardi, A.; van Schyndel, R.; Sabatini, R. Acoustic sensors for air and surface navigation applications. Sensors 2018, 18, 499. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Li, S.E.; Li, G.; Yu, J.; Liu, C.; Cheng, B.; Wang, J.; Li, K. Kalman filter-based tracking of moving objects using linear ultrasonic sensor array for road vehicles. Mech. Syst. Signal Process. 2018, 98, 173–189. [Google Scholar] [CrossRef]
- Jiménez, F.; Naranjo, E.J.; Gómez, O.; Anaya, J.J. Vehicle Tracking for an Evasive Manoeuvres Assistant Using Low-Cost Ultrasonic Sensors. Sensors 2014, 14, 22689–22705. [Google Scholar] [CrossRef] [Green Version]
- Yu, J.; Li, S.E.; Liu, C.; Cheng, B. Dynamical tracking of surrounding objects for road vehicles using linearly-arrayed ultrasonic sensors. In Proceedings of the 2016 IEEE Intelligent Vehicles Symposium (IV), Gothenburg, Sweden, 19–22 June 2016; pp. 72–77. [Google Scholar] [CrossRef]
- Liptai, P.; Badida, M.; Lukáčová, K. Influence of Atmospheric Conditions on Sound Propagation—Mathematical Modeling. Óbuda Univ. e-Bull. 2015, 5, 127–134. [Google Scholar]
- Mahapatra, R.P.; Kumar, K.V.; Khurana, G.; Mahajan, R. Ultra Sonic Sensor Based Blind Spot Accident Prevention System. In Proceedings of the 2008 International Conference on Advanced Computer Theory and Engineering, Phuket, Thailand, 20–22 December 2008; pp. 992–995. [Google Scholar] [CrossRef]
- Alonso, L.; Oria, J.P.; Arce, J.; Fernandez, M. Urban traffic avoiding car collisions fuzzy system based on ultrasound. In Proceedings of the 2008 World Automation Congress, Hawaii, HI, USA, 28 September–2 October 2008. [Google Scholar]
- Kai-Tai, S.; Chih-Hao, C.; Chiu, H.C. Design and experimental study of an ultrasonic sensor system for lateral collision avoidance at low speeds. In Proceedings of the IEEE Intelligent Vehicles Symposium, Parma, Italy, 14–17 June 2004; pp. 647–652. [Google Scholar] [CrossRef]
- Rhee, J.H.; Seo, J. Low-Cost Curb Detection and Localization System Using Multiple Ultrasonic Sensors. Sensors 2019, 19, 1389. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Hosur, P.; Shettar, R.B.; Potdar, M. Environmental awareness around vehicle using ultrasonic sensors. In Proceedings of the 2016 International Conference on Advances in Computing, Communications and Informatics (ICACCI), Jaipur, India, 21–24 September 2016; pp. 1154–1159. [Google Scholar]
- Shin, S.; Choi, S.B. Target Speed Sensing Technique using Dilation Correlation of Ultrasonic Signal for Vehicle. In Proceedings of the 2019 IEEE Sensors Applications Symposium (SAS), Sophia Antipolis, France, 11–13 March 2019. [Google Scholar] [CrossRef]
- Kredba, J.; Holada, M. Precision ultrasonic range sensor using one piezoelectric transducer with impedance matching and digital signal processing. In Proceedings of the IEEE International Workshop of Electronics, Control, Measurement, Signals and their Application to Mechatronics (ECMSM), Donostia-San Sebastian, Spain, 24–26 May 2017. [Google Scholar] [CrossRef]
- Nauth, P.M.; Pech, A.H.; Michalik, R. Research on a new Smart Pedestrian Detection Sensor for Vehicles. In Proceedings of the IEEE Sensors Applications Symposium (SAS), Sophia Antipolis, France, 11–13 March 2019. [Google Scholar] [CrossRef]
- Tsai, W.-Y.; Chen, H.-C.; Liao, T.-L. An ultrasonic air temperature measurement system with self-correction function for humidity. Meas. Sci. Technol. 2005, 16, 548–555. [Google Scholar] [CrossRef]
- Muhoz, A.C. Chapter 15—Position and Motion Sensors. In Sensor Technology Handbook; Wilson, J.S., Ed.; Newnes: Burlington, MA, USA, 2005; pp. 321–409. [Google Scholar]
- Simon Fraser University. Sound Propagation. Available online: https://www.sfu.ca/sonic-studio-webdav/handbook/Sound_Propagation.html (accessed on 1 October 2019).
- Lim, B.S.; Keoh, S.L.; Thing, V.L. Autonomous Vehicle Ultrasonic Sensor Vulnerability and Impact Assessment. 2018. Available online: https://ieeexplore.ieee.org/document/8355132 (accessed on 1 October 2020).
- Xu, W.; Yan, C.; Jia, W.; Ji, X.; Liu, J. Analyzing and Enhancing the Security of Ultrasonic Sensors for Autonomous Vehicles. IEEE Internet Things J. 2018, 5, 5015–5029. [Google Scholar] [CrossRef]
- Mobileye. Mobileye C2-270 Technical Datasheet. Available online: https://itk-mdl.asutk.ru/upload/iblock/c82/Mobileye%20C2-270%20Technical%20Spec%20v1.2.pdf (accessed on 1 October 2020).
- Autonomoustuff. Mobileye Mono Camera. Available online: https://autonomoustuff.com/product/mobileye-camera-dev-kit/ (accessed on 1 October 2020).
- Mehta, S.; Patel, A.; Mehta, J. CCD or CMOS Image Sensor For Photography. In Proceedings of the International Conference on Communications and Signal Processing (ICCSP), Melmaruvathur, India, 2–4 April 2015. [Google Scholar] [CrossRef]
- Teledyne DALSA Inc. CCD vs. CMOS. Available online: https://www.teledynedalsa.com/imaging/knowledge-center/appnotes/ccd-vs-cmos/ (accessed on 20 February 2020).
- Bernini, N.; Bertozzi, M.; Castangia, L.; Patander, M.; Sabbatelli, M. Real-time obstacle detection using stereo vision for autonomous ground vehicles: A survey. In Proceedings of the 17th International IEEE Conference on Intelligent Transportation Systems (ITSC), Qingdao, China, 8–11 October 2014; pp. 873–878. [Google Scholar] [CrossRef]
- John, V.; Yoneda, K.; Liu, Z.; Mita, S. Saliency Map Generation by the Convolutional Neural Network for Real-Time Traffic Light Detection Using Template Matching. IEEE Trans. Comput. Imaging 2015, 1, 159–173. [Google Scholar] [CrossRef]
- Mu, G.; Xinyu, Z.; Deyi, L.; Tianlei, Z.; Lifeng, A. Traffic light detection and recognition for autonomous vehicles. J. China Univ. Posts Telecommun. 2015, 22, 50–56. [Google Scholar] [CrossRef]
- Zhang, J.; Huang, M.; Jin, X.; Li, X. A Real-Time Chinese Traffic Sign Detection Algorithm Based on Modified YOLOv2. Algorithms 2017, 10, 127. [Google Scholar] [CrossRef] [Green Version]
- Kulkarni, R.; Dhavalikar, S.; Bangar, S. Traffic Light Detection and Recognition for Self Driving Cars Using Deep Learning. In Proceedings of the 2018 4th International Conference on Computing, Communication Control and Automation, ICCUBEA, Pune, India, 16–18 August 2018. [Google Scholar] [CrossRef]
- Hasirlioglu, S.; Riener, A. Introduction to rain and fog attenuation on automotive surround sensors. In Proceedings of the IEEE 20th International Conference on Intelligent Transportation Systems (ITSC), Yokohama, Japan, 16–19 October 2017. [Google Scholar]
- Hasirlioglu, S.; Riener, A. A Model-Based Approach to Simulate Rain Effects on Automotive Surround Sensor Data. In Proceedings of the 2018 21st International Conference on Intelligent Transportation Systems (ITSC), Maui, HI, USA, 4–7 November 2018; pp. 2609–2615. [Google Scholar] [CrossRef]
- Hasirlioglu, S.; Kamann, A.; Doric, I.; Brandmeier, T. Test methodology for rain influence on automotive surround sensors. In Proceedings of the 2016 IEEE 19th International Conference on Intelligent Transportation Systems (ITSC), Rio de Janeiro, Brazil, 1–4 November 2016; pp. 2242–2247. [Google Scholar] [CrossRef]
- Xique, I.J.; Buller, W.; Fard, Z.B.; Dennis, E.; Hart, B. Evaluating Complementary Strengths and Weaknesses of ADAS Sensors. In Proceedings of the IEEE Vehicular Technology Conference, Chicago, IL, USA, 27–30 August 2018. [Google Scholar] [CrossRef]
- Garg, K.; Nayar, S.K. When does a camera see rain? In Proceedings of the Tenth IEEE International Conference on Computer Vision (ICCV’05) Volume 1, Beijing, China, 17–21 October 2005; Volume 2, pp. 1067–1074. [Google Scholar] [CrossRef]
- Gangula, L.B.; Srikanth, G.; Naveen, C.; Satpute, V.R. Vision Improvement in Automated Cars by Image Deraining. In Proceedings of the 2018 IEEE International Students’ Conference on Electrical, Electronics and Computer Science (SCEECS), Bhopal, India, 24–25 February 2018. [Google Scholar] [CrossRef]
- Lee, U.; Jung, J.; Shin, S.; Jeong, Y.; Park, K.; Shim, D.H.; Kweon, I.S. EureCar turbo: A self-driving car that can handle adverse weather conditions. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea, 9–14 October 2016; pp. 2301–2306. [Google Scholar] [CrossRef]
- Flir. Automotive Thermal Camera Specification. Available online: https://www.flir.ca/products/adk/ (accessed on 1 October 2020).
- FLIR. FLIR ADK. Available online: https://www.flir.ca/products/adk/ (accessed on 1 October 2020).
- FLIR Systems. The Ultimate Infrared Handbook For R&D Professionals; FLIR AB: Boston, MA, USA, 2010; Available online: https://www.flirmedia.com/MMC/THG/Brochures/T559243/T559243_EN.pdf (accessed on 1 October 2019).
- John, V.; Tsuchizawa, S.; Liu, Z.; Mita, S. Fusion of thermal and visible cameras for the application of pedestrian detection. Signal Image Video Process. 2016, 11, 517–524. [Google Scholar] [CrossRef]
- Chien, S.C.; Chang, F.C.; Tsai, C.C.; Chen, Y.Y. Intelligent all-day vehicle detection based on decision-level fusion using color and thermal sensors. In Proceedings of the International Conference on Advanced Robotics and Intelligent Systems, ARIS, Taipei, Taiwan, 6–8 September 2017. [Google Scholar] [CrossRef]
- Hurney, P.; Morgan, F.; Glavin, M.; Jones, E.; Waldron, P. Review of pedestrian detection techniques in automotive far-infrared video. IET Intell. Transp. Syst. 2015, 9, 824–832. [Google Scholar] [CrossRef]
- Berg, A. Detection and Tracking in Thermal Infrared Imagery. Ph.D. Thesis, Department of Electrical Engineering, Linköping University, Linköping, Sweden, 2016. [Google Scholar]
- Kim, T.; Kim, S. Pedestrian detection at night time in FIR domain: Comprehensive study about temperature and brightness and new benchmark. Pattern Recognit. 2018, 79, 44–54. [Google Scholar] [CrossRef]
- Wang, H.; Cai, Y.; Chen, X.; Chen, L. Night-Time Vehicle Sensing in Far Infrared Image with Deep Learning. J. Sens. 2016, 2016, 3403451. [Google Scholar] [CrossRef] [Green Version]
- Qi, B.; John, V.; Liu, Z.; Mita, S. Pedestrian detection from thermal images: A sparse representation based approach. Infrared Phys. Technol. 2016, 76, 157–167. [Google Scholar] [CrossRef]
- Li, X.; Guo, R.; Chen, C. Robust Pedestrian Tracking and Recognition from FLIR Video: A Unified Approach via Sparse Coding. Sensors 2014, 14, 11245–11259. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Forslund, D.; Bjärkefur, J. Night vision animal detection. In Proceedings of the 2014 IEEE Intelligent Vehicles Symposium Proceedings, Dearborn, MI, USA, 8–11 June 2014; pp. 737–742. [Google Scholar] [CrossRef]
- Fernández-Caballero, A.; López, M.T.; Serrano-Cuerda, J. Thermal-infrared pedestrian ROI extraction through thermal and motion information fusion. Sensors 2014, 14, 6666–6676. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Christiansen, P.; Steen, K.; Jørgensen, R.; Karstoft, H. Automated Detection and Recognition of Wildlife Using Thermal Cameras. Sensors 2014, 14, 13778–13793. [Google Scholar] [CrossRef]
- Jeong, M.; Ko, B.C.; Nam, J.-Y. Early Detection of Sudden Pedestrian Crossing for Safe Driving During Summer Nights. IEEE Trans. Circuits Syst. Video Technol. 2017, 27, 1368–1380. [Google Scholar] [CrossRef]
- Baek, J.; Hong, S.; Kim, J.; Kim, E. Efficient Pedestrian Detection at Nighttime Using a Thermal Camera. Sensors 2017, 17, 1850. [Google Scholar] [CrossRef] [Green Version]
- Jeon, E.S.; Kim, J.H.; Hong, H.G.; Batchuluun, G.; Park, K.R. Human Detection Based on the Generation of a Background Image and Fuzzy System by Using a Thermal Camera. Sensors 2016, 16, 453. [Google Scholar] [CrossRef] [Green Version]
- Choi, Y.; Kim, N.; Hwang, S.; Kweon, I.S. Thermal Image Enhancement using Convolutional Neural Network. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea, 9–14 October 2016; pp. 223–230. [Google Scholar] [CrossRef]
- Hwang, S.; Park, J.; Kim, N.; Choi, Y.; Kweon, I.S. Multispectral pedestrian detection: Benchmark dataset and baseline. In Proceedings of the 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Boston, MA, USA, 7–12 June 2015; pp. 1037–1045. [Google Scholar] [CrossRef]
- Iwasaki, Y. A Method of Robust Moving Vehicle Detection For Bad Weather Using An Infrared Thermography Camera. In Proceedings of the International Conference on Wavelet Analysis and Pattern Recognition, Hong Kong, China, 30–31 August 2008. [Google Scholar]
- Iwasaki, Y.; Misumi, M.; Nakamiya, T. Robust Vehicle Detection under Various Environmental Conditions Using an Infrared Thermal Camera and Its Application to Road Traffic Flow Monitoring. Sensors 2013, 13, 7756. [Google Scholar] [CrossRef]
- Pinchon, N.; Cassignol, O.; Nicolas, A.; Bernardin, F.; Leduc, P.; Tarel, J.P.; Brémond, R.; Bercier, E.; Brunet, J. All-Weather Vision for Automotive Safety: Which Spectral Band? In Advanced Microsystems for Automotive Applications 2018; Dubbert, J., Müller, B., Meyer, G., Eds.; Springer International Publishing: Berlin/Heidelberg, Germany, 2018; pp. 3–15. [Google Scholar]
- Sabry, M.; Al-Kaff, A.; Hussein, A.; Abdennadher, S. Ground Vehicle Monocular Visual Odometry. In Proceedings of the 2019 IEEE Intelligent Transportation Systems Conference (ITSC), Auckland, New Zealand, 27–30 October 2019; pp. 3587–3592. [Google Scholar] [CrossRef]
- Howard, B. MIT Spinoff WaveSense’s Ground-Penetrating Radar Looks Down for Perfect Self-Driving. Available online: https://www.extremetech.com/extreme/306205-mit-wavesense-ground-penetrating-radar-self-driving (accessed on 1 October 2020).
- Wang, Z.; Wu, Y.; Niu, Q. Multi-Sensor Fusion in Automated Driving: A Survey. IEEE Access 2020. [Google Scholar] [CrossRef]
- Göhring, D.; Wang, M.; Schnürmacher, M.; Ganjineh, T. Radar/Lidar sensor fusion for car-following on highways. In Proceedings of the 5th International Conference on Automation, Robotics and Applications, Wellington, New Zealand, 6–8 December 2011; pp. 407–412. [Google Scholar] [CrossRef] [Green Version]
- Savasturk, D.; Froehlich, B.; Schneider, N.; Enzweiler, M.; Franke, U. A Comparison Study on Vehicle Detection in Far Infrared and Regular Images. In Proceedings of the 2015 IEEE 18th International Conference on Intelligent Transportation Systems, Las Palmas, Spain, 15–18 September 2015. [Google Scholar]
- Mockel, S.; Scherer, F.; Schuster, P.F. Multi-sensor obstacle detection on railway tracks. In Proceedings of the IEEE IV2003 Intelligent Vehicles Symposium. Proceedings (Cat. No.03TH8683), Columbus, OH, USA, 9–11 June 2003; pp. 42–46. [Google Scholar] [CrossRef]
- Nabati, R.; Qi, H. Radar-Camera Sensor Fusion for Joint Object Detection and Distance Estimation. arXiv 2020, arXiv:2009.08428v1. [Google Scholar]
- Radecki, P.; Campbell, M.; Matzen, K. All Weather Perception: Joint Data Association, Tracking and Classification. arXiv 2016, arXiv:1605.02196v1. [Google Scholar]
- Diaz-Cabrera, M.; Cerri, P.; Medici, P. Robust real-time traffic light detection and distance estimation using a single camera. Expert Syst. Appl. 2015, 42, 3911–3923. [Google Scholar] [CrossRef]
- Zhou, L.; Deng, Z. LIDAR and vision-based real-time traffic sign detection and recognition algorithm for intelligent vehicle. In Proceedings of the 2014 IEEE 17th International Conference on Intelligent Transportation Systems (ITSC), Qingdao, China, 8–11 October 2014. [Google Scholar]
- Kim, T.; Song, B. Detection and Tracking of Road Barrier Based on Radar and Vision Sensor Fusion. J. Sens. 2016. [Google Scholar] [CrossRef] [Green Version]
- Im, G.; Kim, M.; Park, J. Parking Line Based SLAM Approach Using AVM/LiDAR Sensor Fusion for Rapid and Accurate Loop Closing and Parking Space Detection. Sensors 2019, 19, 4811. [Google Scholar] [CrossRef] [Green Version]
- Choi, J.; Chang, E.; Yoon, D.; Ryu, S.; Jung, H.; Suhr, J. Sensor Fusion-Based Parking Assist System; SAE Technical Paper; SAE International: Warrendale, PA, USA, 2014. [Google Scholar]
Criterion | Indexes (i)
---|---
Range, Resolution, Contrast, Weather | 0—None; 1—Very low performance; 2—Low performance; 3—High performance; 4—Very high performance
Cost | 0—None; 1—Very high cost; 2—High cost; 3—Low cost; 4—Very low cost
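As a sketch of how the 0–4 indexes above might be applied when building the spider-chart profiles, the snippet below validates a hypothetical radar profile; the criterion keys and the example scores are illustrative assumptions, not values taken from the review.

```python
# Illustrative sketch of the 0-4 index scheme defined in the table above.
# The example radar scores are hypothetical placeholders.

PERFORMANCE_SCALE = {0: "None", 1: "Very low performance", 2: "Low performance",
                     3: "High performance", 4: "Very high performance"}
COST_SCALE = {0: "None", 1: "Very high cost", 2: "High cost",
              3: "Low cost", 4: "Very low cost"}
CRITERIA = ("range", "resolution", "contrast", "weather", "cost")

def validate(scores: dict) -> dict:
    """Check that a sensor profile scores all five criteria on the 0-4 scale."""
    assert set(scores) == set(CRITERIA), "every criterion must be scored"
    assert all(s in range(5) for s in scores.values()), "indexes are 0-4"
    return scores

# Hypothetical radar profile: strong in adverse weather, weak in contrast.
radar = validate({"range": 3, "resolution": 2, "contrast": 1,
                  "weather": 4, "cost": 3})
print(PERFORMANCE_SCALE[radar["weather"]])  # -> Very high performance
print(COST_SCALE[radar["cost"]])            # -> Low cost
```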
Sensors | Actual Price | Ideal Price |
---|---|---|
Ultrasonic sensors | USD 16 to USD 40 | USD 16 to USD 40 |
Automotive radar | USD 50 to USD 220 | USD 50 to USD 220 |
Automotive lidar | USD 500 to USD 75,000 | USD 100 to USD 10,000 |
Automotive mono-camera | USD 100 to USD 1000 | USD 100 to USD 700 |
Automotive thermal camera | USD 700 to USD 3000 | USD 100 to USD 1500 |
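The actual-versus-ideal ranges above can be read as a cost gap that each sensor must close for mass deployment; a minimal sketch of that reading (the dictionary keys are our own shorthand):

```python
# Price ranges (USD) transcribed from the table above, as
# ((actual_lo, actual_hi), (ideal_lo, ideal_hi)) pairs.
PRICES = {
    "ultrasonic":     ((16, 40),      (16, 40)),
    "radar":          ((50, 220),     (50, 220)),
    "lidar":          ((500, 75_000), (100, 10_000)),
    "mono_camera":    ((100, 1_000),  (100, 700)),
    "thermal_camera": ((700, 3_000),  (100, 1_500)),
}

def upper_price_gap(sensor: str) -> int:
    """How far the current upper price exceeds the ideal upper price (USD)."""
    (_, actual_hi), (_, ideal_hi) = PRICES[sensor]
    return actual_hi - ideal_hi

gaps = {s: upper_price_gap(s) for s in PRICES}
# Lidar carries by far the largest gap to its ideal mass-market price.
print(max(gaps, key=gaps.get))  # -> lidar
print(gaps["lidar"])            # -> 65000
```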
Sensor | Advantages | Disadvantages
---|---|---
Radar | Robust in rain, fog, snow and darkness; direct measurement of relative velocity | Low angular resolution; limited object classification
Lidar | High range accuracy and angular resolution; dense 3D point clouds | Performance degrades in rain, snow, fog and dust; high cost
Ultrasonic | Very low cost; reliable short-range detection; independent of lighting | Very short detection range; propagation affected by temperature, humidity and wind
Camera | Rich color and texture information; enables lane, sign and traffic-light recognition; low cost | Sensitive to low light, glare, rain, fog and snow; computationally demanding
Far-Infrared | Passive sensing in complete darkness; detects pedestrians and animals by their heat signature | Lower image resolution; relatively high cost; no color or texture information
[Table: potential sensor combinations proposed for selected driving safety applications. Columns: short-, medium- and long-range radar; ultrasonic; short-, medium- and long-range lidar; monocular and stereo cameras; far-infrared. Rows: Adaptive Cruise Control; Forward Collision Avoidance; Road/Traffic Sign Recognition; Traffic Jam Assist; Lane Departure and Lane Keeping Assistance; Blind Spot Monitoring; Parking Assistance. The per-application checkmarks were not recoverable from the extracted layout.]
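Combined with the context-aware toggling strategy outlined in Section 5, an application-to-sensor mapping like the one tabulated above can drive a simple sensor-selection lookup. The mapping and the weather-degradation rules below are hypothetical illustrations, not the paper's exact proposal:

```python
# Hypothetical sketch of active sensor toggling: select the sensors for a
# safety application, then drop those assumed unreliable in the current
# weather context. Both tables are illustrative assumptions.

APP_SENSORS = {
    "adaptive_cruise_control": {"long_range_radar", "lidar", "camera"},
    "lane_keeping_assistance": {"camera", "lidar"},
    "blind_spot_monitoring":   {"short_range_radar", "ultrasonic", "camera"},
    "parking_assistance":      {"ultrasonic", "short_range_radar", "camera"},
}

# Sensors whose confidence is assumed degraded in each weather context.
DEGRADED = {
    "clear":      set(),
    "fog":        {"camera", "lidar"},
    "heavy_snow": {"camera", "lidar", "ultrasonic"},
}

def toggle_sensors(application: str, weather: str) -> set:
    """Return the sensor subset kept active for this application/context."""
    return APP_SENSORS[application] - DEGRADED[weather]

print(sorted(toggle_sensors("adaptive_cruise_control", "fog")))
# -> ['long_range_radar']
```

In fog, the sketch falls back to radar alone for cruise control, mirroring the review's observation that radar is the most weather-robust modality.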
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Mohammed, A.S.; Amamou, A.; Ayevide, F.K.; Kelouwani, S.; Agbossou, K.; Zioui, N. The Perception System of Intelligent Ground Vehicles in All Weather Conditions: A Systematic Literature Review. Sensors 2020, 20, 6532. https://doi.org/10.3390/s20226532