Centralised and Decentralised Sensor Fusion-Based Emergency Brake Assist
Abstract
1. Introduction
- (1) Detection: identify critical scenarios which can lead to an accident and warn the driver accordingly through audio and/or visual indications;
- (2) Action: in scenarios where impacts or accidents are inevitable, EBA can decrease the speed of the ego-vehicle by applying brakes in advance to achieve minimal impact.
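To make the two-stage behaviour concrete, the following is a minimal, hypothetical sketch of an EBA decision rule based on time-to-collision (TTC). The thresholds `TTC_WARN` and `TTC_BRAKE` and the function `eba_decision` are illustrative assumptions, not the logic or parameters used in this work.

```python
# Hypothetical two-stage EBA decision rule (illustrative only).
# Assumption: the perception stack provides range and closing speed
# to the most critical object in the ego-vehicle's path.

TTC_WARN = 2.5   # seconds; assumed warning threshold
TTC_BRAKE = 1.2  # seconds; assumed braking threshold


def eba_decision(range_m: float, closing_speed_mps: float) -> str:
    """Return 'NONE', 'WARN' (stage 1: detection) or 'BRAKE' (stage 2: action)."""
    if closing_speed_mps <= 0.0:
        return "NONE"            # object is not getting closer
    ttc = range_m / closing_speed_mps
    if ttc < TTC_BRAKE:
        return "BRAKE"           # impact practically unavoidable: reduce speed
    if ttc < TTC_WARN:
        return "WARN"            # alert the driver via audio/visual indications
    return "NONE"


if __name__ == "__main__":
    print(eba_decision(range_m=20.0, closing_speed_mps=10.0))  # WARN (TTC = 2.0 s)
    print(eba_decision(range_m=8.0, closing_speed_mps=10.0))   # BRAKE (TTC = 0.8 s)
```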
- Highlighting the advantages and challenges of using multi-sensor-fusion-driven ADAS algorithms over mono-sensor ADAS features.
- Providing analytical comparisons of the two proposed methodologies of sensor fusion—‘centralised’ and ‘decentralised’ fusion architectures.
- Implementing an EBA algorithm and critically analysing the behaviour, performance, and efficacy of the feature driven by the two proposed fusion methods.
2. Background and Previous Work
2.1. Related Work
2.2. Classification of Sensor Fusion Methods
3. Proposed Fusion Methods
3.1. Fusion Architectures—Centralised and Decentralised Fusion
- A’, B’—Raw data from sensor (pixel-level data for camera and point cloud data for LiDAR)
- A, B—Processed data from sensor object detection blocks. Pre-processing blocks indicate object detection algorithms implemented for the respective sensors.
- C—Temporally and spatially synchronised data from the two sensors.
- D—Fused data. Output of sensor fusion; these data are the output of the tracking algorithm and are therefore robust to false negatives, false positives, and other noise present in the sensor data.
- A’, B’—Raw data from sensor (pixel-level data for camera and point cloud data for LiDAR).
- A, B—Data from the sensor object detection blocks. Pre-processing blocks indicate the object detection algorithms implemented for the respective sensors.
- C—Tracking data of Sensor A. This block ensures that data are consistent despite inconsistencies at the output of the pre-processing block.
- D—Tracking data of Sensor B. This block ensures that data are consistent despite inconsistencies at the output of the pre-processing block.
- E—Output of the fusion block. Data from both sensors are spatially and temporally aligned.
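To make the contrast between the two data flows concrete, here is a minimal, runnable sketch: in the centralised pipeline the detections are aligned and fused first and a single tracker runs on the fused objects, whereas in the decentralised pipeline each sensor's detections are tracked independently and only the track outputs are fused. All functions and classes below (`align`, `associate`, `Tracker`) are trivial stand-ins for the blocks labelled above, not the actual detection, synchronisation, or UKF implementations used in this work.

```python
# Minimal, runnable sketch of the two data flows described above.
# Every function/class here is a trivial stand-in, not a real implementation.

from typing import Dict, List, Tuple

Detection = Dict[str, float]  # e.g. {"x": ..., "y": ...}


def align(a: List[Detection], b: List[Detection]) -> List[Tuple[Detection, Detection]]:
    """Stand-in for temporal/spatial synchronisation: pair detections by index."""
    return list(zip(a, b))


def associate(pairs: List[Tuple[Detection, Detection]]) -> List[Detection]:
    """Stand-in for data association: merge each camera/LiDAR pair into one object."""
    return [{**cam, **{f"lidar_{k}": v for k, v in lid.items()}} for cam, lid in pairs]


class Tracker:
    """Stand-in for the UKF-based tracker: remembers the last non-empty object list."""

    def __init__(self) -> None:
        self.state: List[Detection] = []

    def track(self, objs: List[Detection]) -> List[Detection]:
        self.state = objs or self.state   # hold the last estimate on a missed frame
        return self.state


def centralised_fusion(cam_objs, lid_objs, tracker):
    # Centralised: align -> fuse -> track the fused object list (one tracker).
    return tracker.track(associate(align(cam_objs, lid_objs)))


def decentralised_fusion(cam_objs, lid_objs, cam_tracker, lid_tracker):
    # Decentralised: track each sensor independently, then align and fuse the tracks.
    cam_tracks = cam_tracker.track(cam_objs)
    lid_tracks = lid_tracker.track(lid_objs)
    return associate(align(cam_tracks, lid_tracks))


if __name__ == "__main__":
    cam = [{"x": 10.0, "y": 5.0}]
    lid = [{"x": 10.2, "y": 5.1, "z": 0.4}]
    print(centralised_fusion(cam, lid, Tracker()))
    print(decentralised_fusion(cam, lid, Tracker(), Tracker()))
```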
3.2. Components of the Proposed Fusion Methods
3.2.1. Camera Object Detection
3.2.2. LiDAR Object Detection
- Input: N objects to be clustered and some global parameters (Eps, MinPts)
- Output: Clusters of objects
- Process:
- Select a point p arbitrarily.
- Retrieve all density-reachable points from p with respect to Eps and MinPts.
- If p is a core point, a cluster is formed.
- If p is a border point, no points are density reachable from p and DBSCAN visits the next arbitrary point in the database.
- Continue the process until all points in the database are visited.
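To make the listed steps concrete, below is a compact, generic DBSCAN implementation run on a small synthetic 2D point set; the Eps and MinPts values in the example are arbitrary assumptions and not the parameters tuned for the LiDAR point clouds in this work.

```python
import numpy as np


def dbscan(points: np.ndarray, eps: float, min_pts: int) -> np.ndarray:
    """Generic DBSCAN. Returns a cluster label per point; -1 marks noise."""
    n = len(points)
    labels = np.full(n, -1)            # -1 = not assigned to any cluster (noise)
    visited = np.zeros(n, dtype=bool)
    cluster_id = 0

    def region_query(i):
        # All points within eps of point i (its Eps-neighbourhood).
        return np.flatnonzero(np.linalg.norm(points - points[i], axis=1) <= eps)

    for i in range(n):                         # select a point p arbitrarily
        if visited[i]:
            continue
        visited[i] = True
        neighbours = region_query(i)
        if len(neighbours) < min_pts:          # p is a border/noise point:
            continue                           # visit the next unvisited point
        labels[i] = cluster_id                 # p is a core point: grow a cluster
        seeds = list(neighbours)
        while seeds:                           # expand via density-reachable points
            j = seeds.pop()
            if not visited[j]:
                visited[j] = True
                j_neighbours = region_query(j)
                if len(j_neighbours) >= min_pts:   # j is also a core point
                    seeds.extend(j_neighbours)
            if labels[j] == -1:
                labels[j] = cluster_id
        cluster_id += 1

    return labels


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    cloud = np.vstack([rng.normal([0, 0], 0.2, (30, 2)),
                       rng.normal([5, 5], 0.2, (30, 2))])
    print(dbscan(cloud, eps=0.5, min_pts=4))   # two clusters: labels 0 and 1
```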
3.2.3. Tracking
3.2.4. Data Fusion
Synchronicity of Data
Executing the Fusion Node—OCSF and ODSF
- Parameters of top left corner of the bounding box, that is, (x1, y1), and
- Width and height of the bounding box, that is, (h, w).
- (1) Parameters of front top left corner of the bounding box, that is, (x1, y1, z1).
- (2) Width, height, and depth of the bounding box, that is, (h, w, l).
- Both sensors have detected an object, and the fusion node now associates their bounding boxes.
- Some frames later, one of the two sensor detection algorithms gives a false negative and fails to detect the object; a sketch of how the association can tolerate such misses follows this list.
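The sketch below illustrates, under stated assumptions, how the corner parameterisations above expand into the full corner sets listed in the coordinate tables, and how a simple IoU-gated association can carry a fused object through an occasional false negative. The IoU gate, the miss limit, and the restriction of the gate to the camera box are illustrative choices, not the exact logic of the fusion node.

```python
# Illustrative only: corner expansion for the 2D/3D boxes described above and a
# simple IoU-gated association that tolerates occasional false negatives.
# Threshold values and the miss limit are assumptions made for this sketch.

def corners_2d(x1, y1, w, h):
    """A, B, C, D of the camera bounding box from its top left corner and size."""
    return [(x1, y1), (x1 + w, y1), (x1 + w, y1 - h), (x1, y1 - h)]


def corners_3d(x1, y1, z1, w, h, l):
    """Eight corners of the LiDAR bounding box from its front top left corner and size."""
    front = corners_2d(x1, y1, w, h)                  # A, B, C, D at depth z1
    return [(x, y, z1) for x, y in front] + \
           [(x, y, z1 + l) for x, y in front]         # F, G, H, E at depth z1 + l


def iou_2d(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, w, h), y1 = top edge."""
    ax1, ay1, aw, ah = a
    bx1, by1, bw, bh = b
    ix = max(0.0, min(ax1 + aw, bx1 + bw) - max(ax1, bx1))
    iy = max(0.0, min(ay1, by1) - max(ay1 - ah, by1 - bh))
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0


class FusedObject:
    """A fused camera/LiDAR box pair that survives a few missed detections."""

    def __init__(self, cam_box, lidar_box, max_misses=3):
        self.cam_box, self.lidar_box = cam_box, lidar_box
        self.misses, self.max_misses = 0, max_misses

    def update(self, cam_box, lidar_box, iou_gate=0.3):
        """Associate new detections; None stands for a false negative from that sensor.
        For simplicity, only the camera box is IoU-gated here."""
        if cam_box is not None and iou_2d(cam_box, self.cam_box) >= iou_gate:
            self.cam_box = cam_box
        if lidar_box is not None:
            self.lidar_box = lidar_box
        self.misses = self.misses + 1 if (cam_box is None or lidar_box is None) else 0
        return self.misses <= self.max_misses   # False => discard the fused object


if __name__ == "__main__":
    obj = FusedObject(cam_box=(100, 200, 40, 80),
                      lidar_box=(1.0, 1.5, 2.0, 1.8, 1.5, 4.0))
    print(obj.update(cam_box=(102, 198, 40, 80), lidar_box=None))  # True: one miss tolerated
    print(corners_2d(100, 200, 40, 80))
```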
3.3. Implementation of OCSF and ODSF
Creating the ROS Environment
3.4. Examples for Sensor Data Fusion
4. Emergency Brake Assist (EBA) Using OCSF and ODSF
4.1. Safe Scenario for EBA
4.2. Unsafe Scenario for EBA
4.3. OCSF-Driven EBA
- The frame rate is consistently around 31 frames/s.
- Instances of false positives or false negatives are observed at times, as expected from the OCSF-driven EBA algorithm.
- The tracker suppresses most of the false positives (FPs) and false negatives (FNs); however, not all FPs and FNs are filtered out. Without the tracker, the number of FPs and FNs would be considerably higher.
4.4. ODSF-Driven EBA
- The frame rate is consistently around 20 frames/s, which is lower than that observed for the OCSF-driven EBA.
- Compared to the OCSF-driven EBA, fewer false positives and false negatives are observed, as expected from the ODSF-driven EBA.
- In this case, the tracker algorithm suppresses the false positives and false negatives at a modular level, before the data are fused; consequently, the accuracy of this method is much higher than that of the OCSF-driven EBA.
- As in the previous method, the steering angle and vehicle speed do not affect the area of the PDP quadrilateral; a sketch of this area computation is given after this list. However, as we are demonstrating a sensor fusion-based ADAS feature, this is an acceptable compromise that can be addressed in later versions of this work.
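For reference, the area of a four-cornered region such as the PDP quadrilateral can be computed with the shoelace formula from its corner points; the corner coordinates in the example below are arbitrary placeholders and not the actual PDP geometry used by the EBA.

```python
# Shoelace formula for the area of a simple polygon (here, a quadrilateral).
# The corner values below are placeholders, not the actual PDP geometry.

def polygon_area(corners):
    """Corners given in order (clockwise or counter-clockwise) as (x, y) pairs."""
    n = len(corners)
    s = 0.0
    for i in range(n):
        x1, y1 = corners[i]
        x2, y2 = corners[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0


if __name__ == "__main__":
    pdp = [(-1.0, 0.0), (1.0, 0.0), (1.5, 20.0), (-1.5, 20.0)]  # widening path ahead
    print(polygon_area(pdp))  # 50.0 m^2
```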
4.5. Results
4.5.1. Frame Rate for the Execution of EBA
4.5.2. Accuracy and Precision of EBA
4.5.3. Computational Cost of EBA
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Herpel, T.; Lauer, C.; German, R.; Salzberger, J. Trade-off between coverage and robustness of automotive environment sensor systems. In Proceedings of the 2008 International Conference on Intelligent Sensors, Sensor Networks and Information Processing, Sydney, NSW, Australia, 15–18 December 2008; pp. 551–556. [Google Scholar]
- Fung, M.L.; Chen, M.Z.Q.; Chen, Y.H. Sensor fusion: A review of methods and applications. In Proceedings of the 2017 29th Chinese Control and Decision Conference (CCDC), Chongqing, China, 28–30 May 2017; pp. 3853–3860. [Google Scholar]
- Warren, M.E. Automotive LIDAR Technology. In Proceedings of the 2019 Symposium on VLSI Circuits, Kyoto, Japan, 9–14 June 2019; pp. 254–255. [Google Scholar]
- Steinbaeck, J.; Steger, C.; Holweg, G.; Druml, N. Next generation radar sensors in automotive sensor fusion systems. In Proceedings of the 2017 Sensor Data Fusion: Trends, Solutions, Applications (SDF), Bonn, Germany, 10–12 October 2017; pp. 1–6. [Google Scholar] [CrossRef]
- Schlegl, T.; Bretterklieber, T.; Neumayer, M.; Zangl, H. A novel sensor fusion concept for distance measurement in automotive applications. IEEE Sens. 2010, 775–778. [Google Scholar] [CrossRef]
- ADAS. International Conference on Advanced Driver Assistance Systems (IEE Conf. Publ. No.483). In Proceedings of the 2001 ADAS. International Conference on Advanced Driver Assistance Systems, (IEE Conf. Publ. No. 483), Birmingham, UK, 17–18 September 2001; p. i. [Google Scholar]
- Kessler, T.; Bernhard, J.; Buechel, M.; Esterle, K.; Hart, P.; Malovetz, D.; Le, M.T.; Diehl, F.; Brunner, T.; Knoll, A. Bridging the Gap between Open Source Software and Vehicle Hardware for Autonomous Driving. In Proceedings of the 2019 IEEE Intelligent Vehicles Symposium (IV), Paris, France, 9–12 June 2019; pp. 1612–1619. [Google Scholar]
- Castanedo, F. A Review of Data Fusion Techniques. Sci. World J. 2013, 2013, 1–19. [Google Scholar] [CrossRef] [PubMed]
- Stampfle, M.; Holz, D.; Becker, J. Performance evaluation of automotive sensor data fusion. In Proceedings of the 2005 IEEE Intelligent Transportation Systems, Vienna, Austria, 16 September 2005; pp. 50–55. [Google Scholar]
- Thakur, R. Scanning LIDAR in Advanced Driver Assistance Systems and Beyond: Building a road map for next-generation LIDAR technology. IEEE Consum. Electron. Mag. 2016, 5, 48–54. [Google Scholar] [CrossRef]
- Kim, J.; Han, D.S.; Senouci, B. Radar and Vision Sensor Fusion for Object Detection in Autonomous Vehicle Surroundings. In Proceedings of the 2018 Tenth International Conference on Ubiquitous and Future Networks (ICUFN), Prague, Czech Republic, 3–6 July 2018; pp. 76–78. [Google Scholar]
- Herpel, T.; Lauer, C.; German, R.; Salzberger, J. Multi-sensor data fusion in automotive applications. In Proceedings of the 2008 3rd International Conference on Sensing Technology, Taipei, Taiwan, 30 November–3 December 2008; pp. 206–211. [Google Scholar]
- Fayad, F.; Cherfaoui, V. Object-level fusion and confidence management in a multi-sensor pedestrian tracking system. In Proceedings of the 2008 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, Seoul, Korea, 20–22 August 2008; pp. 58–63. [Google Scholar] [CrossRef] [Green Version]
- Kang, Y.; Yin, H.; Berger, C. Test Your Self-Driving Algorithm: An Overview of Publicly Available Driving Datasets and Virtual Testing Environments. IEEE Trans. Intell. Veh. 2019, 4, 171–185. [Google Scholar] [CrossRef]
- Duraisamy, B.; Gabb, M.; Nair, A.V.; Schwarz, T.; Yuan, T. Track level fusion of extended objects from heterogeneous sensors. In Proceedings of the 19th International Conference on Information Fusion (FUSION), Heidelberg, Germany, 5–8 July 2016; pp. 876–885. [Google Scholar]
- Hamid, U.Z.A.; Zakuan, F.R.A.; Zulkepli, K.A.; Azmi, M.Z.; Zamzuri, H.; Rahman, M.A.A.; Zakaria, M.A. Autonomous emergency braking system with potential field risk assessment for frontal collision mitigation. In Proceedings of the 2017 IEEE Conference on Systems, Process and Control (ICSPC), Meleka, Malaysia, 15–17 December 2017; pp. 71–76. [Google Scholar]
- Rassõlkin, A.; Gevorkov, L.; Vaimann, T.; Kallaste, A.; Sell, R. Calculation of the traction effort of ISEAUTO self-driving vehicle. In Proceedings of the 2018 25th International Workshop on Electric Drives: Optimization in Control of Electric Drives (IWED), Moscow, Russia, 31 January–2 February 2018; pp. 1–5. [Google Scholar]
- Simic, A.; Kocic, O.; Bjelica, M.Z.; Milosevic, M. Driver monitoring algorithm for advanced driver assistance systems. In Proceedings of the 2016 24th Telecommunications Forum (TELFOR), Belgrade, Serbia, 22–23 November 2016; pp. 1–4. [Google Scholar]
- Ariyanto, M.; Haryadi, G.D.; Munadi, M.; Ismail, R.; Hendra, Z. Development of Low-Cost Autonomous Emergency Braking System (AEBS) for an Electric Car. In Proceedings of the 2018 5th International Conference on Electric Vehicular Technology (ICEVT), Surakarta, Indonesia, 30–31 October 2018; pp. 167–171. [Google Scholar]
- Gläser, C.; Michalke, T.P.; Bürkle, L.; Niewels, F. Environment perception for inner-city driver assistance and highly-automated driving. In Proceedings of the 2014 IEEE Intelligent Vehicles Symposium Proceedings, Dearborn, MI, USA, 8–11 June 2014; pp. 1270–1275. [Google Scholar]
- Wonneberger, S.; Mühlfellner, P.; Ceriotti, P.; Graf, T.; Ernst, R. Parallel feature extraction and heterogeneous object-detection for multi-camera driver assistance systems. In Proceedings of the 2015 25th International Conference on Field Programmable Logic and Applications (FPL), London, UK, 2–4 September 2015; pp. 1–4. [Google Scholar]
- Badue, C.; Guidolini, R.; Carneiro, R.V.; Azevedo, P.; Cardoso, V.B.; Forechi, A.; Jesus, L.; Berriel, R.; Paixão, T.; Mutz, F.; et al. Self-Driving Cars: A Survey. arXiv 2019, arXiv:1901.04407v2. [Google Scholar] [CrossRef]
- De Silva, V.; Roche, J.; Kondoz, A. Robust Fusion of LiDAR and Wide-Angle Camera Data for Autonomous Mobile Robots. Sensors 2018, 18, 2730. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Yang, S.; Li, H. Application of EKF and UKF in Target Tracking Problem. In Proceedings of the 2016 8th International Conference on Intelligent Human-Machine Systems and Cybernetics (IHMSC), Hang Zhou, China, 27–28 August 2016; Volume 1, pp. 116–120. [Google Scholar]
- Wan, E.A.; Van Der Merwe, R. The unscented Kalman filter for nonlinear estimation. In Proceedings of the IEEE 2000 Adaptive Systems for Signal Processing, Communications, and Control Symposium (Cat. No.00EX373), Lake Louise, AB, Canada, 4 October 2000; pp. 153–158. [Google Scholar]
- Grime, S.; Durrant-Whyte, H.F. Data Fusion in decentralised fusion networks. Control Eng. Pract. 1994, 2, 849–863. [Google Scholar] [CrossRef]
- Berardin, K.; Stiefelhagen, R. Evaluating Multiple Object Tracking Performance: The CLEAR MOT Metrics. EURASIP J. Image Video Process. 2008, 2008, 1–10. [Google Scholar] [CrossRef] [Green Version]
- Aeberhard, M.; Bertram, T. Object Classification in a High-Level Sensor Data Fusion Architecture for Advanced Driver As-sistance Systems. In Proceedings of the 2015 IEEE 18th International Conference on Intelligent Transportation Systems, Gran Canaria, Spain, 15–18 September 2015; pp. 416–422. [Google Scholar]
- Durrant-Whyte, H.; Henderson, T.C. Multisensor Data Fusion; Springer Science and Business Media LLC: Berlin/Heidelberg, Germany, 2008; pp. 585–610. Available online: https://link.springer.com/referenceworkentry/10.1007%2F978-3-540-30301-5_26 (accessed on 29 July 2021).
- Chavez-Garcia, R.O.; Aycard, O. Multiple Sensor Fusion and Classification for Moving Object Detection and Tracking. IEEE Trans. Intell. Transp. Syst. 2016, 17, 525–534. [Google Scholar] [CrossRef] [Green Version]
- Steinhage, A.; Winkel, C. Dynamical systems for sensor fusion and classification. In Proceedings of the Scanning the Present and Resolving the Future/IEEE 2001 International Geoscience and Remote Sensing Symposium, Sydney, NSW, Australia, 9–13 July 2002; Volume 2, pp. 855–857. [Google Scholar] [CrossRef]
- Dasarathy, B. Sensor fusion potential exploitation-innovative architectures and illustrative applications. Proc. IEEE 1997, 85, 24–38. [Google Scholar] [CrossRef]
- Heading, A.; Bedworth, M. Data fusion for object classification. In Proceedings of the Conference Proceedings 1991 IEEE In-ternational Conference on Systems, Man, and Cybernetics, Charlottesville, VA, USA, 13–16 October 2002; Volume 2, pp. 837–840. [Google Scholar]
- Luo, R.C.; Yih, C.-C.; Su, K.L. Multisensor fusion and integration: Approaches, applications, and future research directions. IEEE Sens. J. 2002, 2, 107–119. [Google Scholar] [CrossRef]
- Makarau, A.; Palubinskas, G.; Reinartz, P. Multi-sensor data fusion for urban area classification. In Proceedings of the 2011 Joint Urban Remote Sensing Event, Munich, Germany, 11–13 April 2011; pp. 21–24. [Google Scholar] [CrossRef] [Green Version]
- Lee, Y.; Lee, C.; Lee, H.; Kim, J. Fast Detection of Objects Using a YOLOv3 Network for a Vending Machine. In Proceedings of the 2019 IEEE International Conference on Artificial Intelligence Circuits and Systems (AICAS), Hsinchu, Taiwan, 18–20 March 2019; pp. 132–136. [Google Scholar] [CrossRef]
- Kumar, C.; Punitha, R. YOLOv3 and YOLOv4: Multiple Object Detection for Surveillance Applications. In Proceedings of the 2020 Third International Conference on Smart Systems and Inventive Technology (ICSSIT), Tirunelveli, India, 20–22 August 2020; pp. 1316–1321. [Google Scholar] [CrossRef]
- Deng, D. DBSCAN Clustering Algorithm Based on Density. In Proceedings of the 2020 7th International Forum on Electrical Engineering and Automation, HeFei, China, 25–27 September 2020; pp. 949–953. [Google Scholar]
- Raj, S.; Ghosh, D. Improved and Optimal DBSCAN for Embedded Applications Using High-Resolution Automotive Radar. In Proceedings of the 2020 21st International Radar Symposium (IRS), Warsaw, Poland, 5–8 October 2020; pp. 343–346. [Google Scholar]
- Nagaraju, S.; Kashyap, M.; Bhattacharya, M. A variant of DBSCAN algorithm to find embedded and nested adjacent clusters. In Proceedings of the 2016 3rd International Conference on Signal Processing and Integrated Networks (SPIN), Noida, India, 11–12 February 2016; pp. 486–491. [Google Scholar]
- Sander, J.; Ester, M.; Kriegel, H.-P.; Xu, X. Density-Based Clustering in Spatial Databases: The Algorithm GDBSCAN and Its Applications. Data Min. Knowl. Discov. 1998, 2, 169–194. [Google Scholar] [CrossRef]
- Geiger, A.; Lenz, P.; Stiller, C.; Urtasun, R. Vision meets robotics: The KITTI dataset. Int. J. Robot. Res. 2013, 32, 1231–1237. [Google Scholar] [CrossRef] [Green Version]
- Khairnar, D.G.; Merchant, S.; Desai, U. Nonlinear Target Identification and Tracking Using UKF. In Proceedings of the 2007 IEEE International Conference on Granular Computing (GRC 2007), Fremont, CA, USA, 2–4 November 2007; p. 761. [Google Scholar]
- Lee, D.-J. Nonlinear Estimation and Multiple Sensor Fusion Using Unscented Information Filtering. IEEE Signal Process. Lett. 2008, 15, 861–864. [Google Scholar] [CrossRef]
- Geiger, A.; Moosmann, F.; Car, Ö.; Schuster, B. Automatic camera and range sensor calibration using a single shot. In Proceedings of the 2012 IEEE International Conference on Robotics and Automation, Saint Paul, MN, USA, 14–18 May 2012; pp. 3936–3943. [Google Scholar]
- Ratasich, D.; Fromel, B.; Hoftberger, O.; Grosu, R. Generic sensor fusion package for ROS. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–2 October 2015; pp. 286–291. [Google Scholar]
- Jagannadha, P.K.D.; Yilmaz, M.; Sonawane, M.; Chadalavada, S.; Sarangi, S.; Bhaskaran, B.; Bajpai, S.; Reddy, V.A.; Pandey, J.; Jiang, S. Special Session: In-System-Test (IST) Architecture for NVIDIA Drive-AGX Platforms. In Proceedings of the 2019 IEEE 37th VLSI Test Symposium (VTS), Monterey, CA, USA, 23–25 April 2019; pp. 1–8. [Google Scholar]
- Kato, S.; Tokunaga, S.; Maruyama, Y.; Maeda, S.; Hirabayashi, M.; Kitsukawa, Y.; Monrroy, A.; Ando, T.; Fujii, Y.; Azumi, T. Autoware on Board: Enabling Autonomous Vehicles with Embedded Systems. In Proceedings of the 2018 ACM/IEEE 9th International Conference on Cyber-Physical Systems (ICCPS), Porto, Portugal, 11–13 April 2018; pp. 287–296. [Google Scholar]
- Kocić, J.; Jovičić, N.; Drndarević, V. Sensors and Sensor Fusion in Autonomous Vehicles. In Proceedings of the 2018 26th Telecommunications Forum (TELFOR), Belgrade, Serbia, 20–21 November 2018; pp. 420–425. [Google Scholar] [CrossRef]
- Ruan, K.; Li, L.; Yi, Q.; Chien, S.; Tian, R.; Chen, Y.; Sherony, R. A Novel Scoring Method for Pedestrian Automatic Emergency Braking Systems. In Proceedings of the 2019 IEEE International Conference on Service Operations and Logistics, and Informatics (SOLI), Zhengzhou, China, 6–8 November 2019; pp. 128–133. [Google Scholar]
- Han, Y.C.; Wang, J.; Lu, L. A Typical Remote Sensing Object Detection Method Based on YOLOv3. In Proceedings of the 2019 4th International Conference on Mechanical, Control and Computer Engineering (ICMCCE), Hohhot, China, 24–26 October 2019; pp. 520–5203. [Google Scholar]
- Choi, J.; Chun, D.; Kim, H.; Lee, H.-J. Gaussian YOLOv3: An Accurate and Fast Object Detector Using Localization Uncertainty for Autonomous Driving. In Proceedings of the 2019 IEEE/CVF International Conference on Computer Vision (ICCV), Seoul, Korea, 27–28 October 2019; pp. 502–511. [Google Scholar]
- Wu, F.; Jin, G.; Gao, M.; He, Z.; Yang, Y. Helmet Detection Based on Improved YOLO V3 Deep Model. In Proceedings of the 2019 IEEE 16th International Conference on Networking, Sensing and Control (ICNSC), Banff, AB, Canada, 9–11 May 2019; pp. 363–368. [Google Scholar]
- Wang, J.; Xiao, W.; Ni, T. Efficient Object Detection Method Based on Improved YOLOv3 Network for Remote Sensing Images. In Proceedings of the 2020 3rd International Conference on Artificial Intelligence and Big Data (ICAIBD), Chengdu, China, 28–11 May 2020; pp. 242–246. [Google Scholar]
- Gong, H.; Li, H.; Xu, K.; Zhang, Y. Object Detection Based on Improved YOLOv3-tiny. In Proceedings of the 2019 Chinese Automation Congress (CAC), Hangzhou, China, 22–24 November 2019; pp. 3240–3245. [Google Scholar]
- Ilic, V.; Marijan, M.; Mehmed, A.; Antlanger, M. Development of Sensor Fusion Based ADAS Modules in Virtual Environments. In Proceedings of the 2018 Zooming Innovation in Consumer Technologies Conference (ZINC), Novi Sad, Serbia, 30–31 May 2018; pp. 88–91. [Google Scholar] [CrossRef]
- Yan, H.; Huang, G.; Wang, H.; Shu, R. Application of unscented kalman filter for flying target tracking. In Proceedings of the 2013 International Conference on Information Science and Cloud Computing, Guangzhou, China, 7–8 December 2013; pp. 61–66. [Google Scholar] [CrossRef]
- Chernikova, O.S. An Adaptive Unscented Kalman Filter Approach for State Estimation of Nonlinear Continuous-Discrete System. In Proceedings of the 2018 XIV International Scientific-Technical Conference on Actual Problems of Electronics Instrument Engineering (APEIE), Novosibirsk, Russia, 2–6 October 2018; pp. 37–40. [Google Scholar]
- Raviteja, S.; Shanmughasundaram, R. Advanced Driver Assitance System (ADAS). In Proceedings of the 2018 Second International Conference on Intelligent Computing and Control Systems (ICICCS), Madurai, India, 14–15 June 2018; pp. 737–740. [Google Scholar]
- Norbye, H.G. Camera-Lidar Sensor Fusion in Real Time for Autonomous Surface Vehicles; Norwegian University of Science and Technology: Trondheim, Norway, 2019; Available online: https://folk.ntnu.no/edmundfo/msc2019-2020/norbye-lidar-camera-reduced.pdf (accessed on 29 July 2021).
- Miller, D.; Sun, A.; Ju, W. Situation awareness with different levels of automation. In Proceedings of the 2014 IEEE International Conference on Systems, Man, and Cybernetics (SMC), San Diego, CA, USA, 5–8 October 2014; pp. 688–693. [Google Scholar]
- Yeong, D.J.; Barry, J.; Walsh, J. A Review of Multi-Sensor Fusion System for Large Heavy Vehicles off Road in Industrial Environments. In Proceedings of the 2020 31st Irish Signals and Systems Conference (ISSC), Letterkenny, Ireland, 11–12 June 2020; pp. 1–6. [Google Scholar]
- De Silva, N.B.F.; Wilson, D.B.; Branco, K.R. Performance evaluation of the Extended Kalman Filter and Unscented Kalman Filter. In Proceedings of the 2015 International Conference on Unmanned Aircraft Systems (ICUAS), Denver, CO, USA, 9–12 June 2015; pp. 733–741. [Google Scholar]
Sr. No. | Criteria | Reference |
---|---|---|
1 | Classification based on the relation between the different input sources, which can be complementary, redundant, or cooperative | Durrant-Whyte et al. [29]; Chavez-Garcia et al. [30]; Steinhage et al. [31] |
2 | Classification based on the data types of the input and output data (e.g., data in-data out, data in-feature out, feature in-feature out, feature in-decision out, decision in-decision out) | Dasarathy [32]; Steinhage et al. [31]; Heading et al. [33] |
3 | Classification based on the abstraction level of the fused data, which can be low-level (signal), medium-level (feature), or high-level (decision) fusion | Luo et al. [34]; Chavez-Garcia et al. [30] |
4 | Classification based on the type of fusion architecture, which can be centralised, decentralised, distributed, or hierarchical | Castanedo [8]; Makarau et al. [35]; Heading et al. [33] |
Sr. No. | Point | Coordinates |
---|---|---|
1 | A | (x1, y1) |
2 | B | (x1 + w, y1) |
3 | C | (x1 + w, y1 - h) |
4 | D | (x1, y1 - h) |
Sr. No. | Point | Coordinates |
---|---|---|
1 | A | (x1, y1, z1) |
2 | B | (x1 + w, y1, z1) |
3 | C | (x1 + w, y1 - h, z1) |
4 | D | (x1, y1 - h, z1) |
5 | E | (x1, y1 - h, z1 + l) |
6 | F | (x1, y1, z1 + l) |
7 | G | (x1 + w, y1, z1 + l) |
8 | H | (x1 + w, y1 - h, z1 + l) |
Sr. No. | ROS Node | Description |
---|---|---|
1 | Detection_LiDAR | This node performs object detection on LiDAR data |
2 | Detection_Camera | This node performs object detection on camera data |
3 | Sensor_Sync | This node applies a transformation matrix and spatially synchronises the LiDAR and camera data |
4 | Sensor_Fusion | This node associates the synchronised LiDAR and camera data, creating an object list that includes data from both the camera and the LiDAR |
5 | Tracking | This node implements the unscented Kalman filter (UKF). The UKF is applied to the fused data in OCSF and independently to each sensor's data in ODSF. |
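As an illustration of how the temporal part of the Sensor_Sync stage can be realised in ROS, the sketch below pairs camera and LiDAR messages by timestamp using message_filters.ApproximateTimeSynchronizer; the topic names, queue size, and slop value are assumptions, and this is not the exact node implemented in this work.

```python
#!/usr/bin/env python
# Hypothetical ROS 1 node sketch: temporal synchronisation of camera and LiDAR
# messages before fusion. Topic names and parameters are illustrative assumptions.

import rospy
import message_filters
from sensor_msgs.msg import Image, PointCloud2


def synced_callback(image_msg, cloud_msg):
    # At this point the two messages are temporally aligned (within `slop` seconds);
    # spatial alignment (applying the extrinsic transformation matrix) and the
    # association/fusion steps would follow here.
    rospy.loginfo("camera stamp %s / lidar stamp %s",
                  image_msg.header.stamp, cloud_msg.header.stamp)


if __name__ == "__main__":
    rospy.init_node("sensor_sync")
    cam_sub = message_filters.Subscriber("/camera/image_raw", Image)
    lidar_sub = message_filters.Subscriber("/lidar/points", PointCloud2)
    sync = message_filters.ApproximateTimeSynchronizer(
        [cam_sub, lidar_sub], queue_size=10, slop=0.05)
    sync.registerCallback(synced_callback)
    rospy.spin()
```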
Experiment Number | Scenario | Frame Rate for EBA with OCSF | Frame Rate for EBA with ODSF | Frame Rate for EBA with Mono-Sensor |
---|---|---|---|---|
1 | Densely populated urban street | 32 fps | 18 fps | 36 fps |
2 | Densely populated urban street | 32 fps | 20 fps | 37 fps |
3 | Moderately populated urban | 33 fps | 20 fps | 37 fps |
4 | Moderately populated urban | 31 fps | 20 fps | 37 fps |
5 | Sparsely populated highway | 32 fps | 22 fps | 39 fps |
6 | Sparsely populated highway | 32 fps | 21 fps | 38 fps |
7 | Densely populated highway | 32 fps | 21 fps | 37 fps |
8 | Densely populated highway | 32 fps | 20 fps | 38 fps |
Sr. No. | Software Block—OCSF | Time Taken for Execution (ms) |
---|---|---|
1 | LiDAR object detection—DBSCAN | 4 |
2 | Camera object detection—YOLOv3 | 5 |
3 | Alignment—Temporal and spatial data synchronisation | 3 |
4 | Data Fusion—Association of target objects | 2.5 |
5 | Tracking—UKF | 16 |
6 | EBA | 2 |
 | TOTAL | 32.5
Sr. No. | Software Block—ODSF | Time Taken for Execution (ms) |
---|---|---|
1 | LiDAR object detection—DBSCAN | 4 |
2 | Camera object detection—YOLOv3 | 5 |
3 | Tracking for LiDAR detection—UKF | 16.2 |
4 | Tracking for camera detection—UKF | 16.5 |
5 | Alignment—Temporal and spatial data synchronisation | 3 |
6 | Data Fusion—Association of target objects | 1.8 |
7 | EBA | 2 |
 | TOTAL | 48.5
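As a quick consistency check, the reciprocal of the total per-frame execution time in the two tables above gives the approximate frame rate, assuming the blocks run sequentially for each frame as listed.

```python
# Approximate frame rate implied by the per-block execution times above,
# assuming the blocks run sequentially for each frame.

ocsf_ms = [4, 5, 3, 2.5, 16, 2]          # OCSF blocks (total 32.5 ms)
odsf_ms = [4, 5, 16.2, 16.5, 3, 1.8, 2]  # ODSF blocks (total 48.5 ms)

for name, blocks in (("OCSF", ocsf_ms), ("ODSF", odsf_ms)):
    total = sum(blocks)
    print(f"{name}: {total} ms/frame -> ~{1000.0 / total:.1f} fps")
# OCSF: 32.5 ms/frame -> ~30.8 fps  (consistent with the ~31-33 fps observed)
# ODSF: 48.5 ms/frame -> ~20.6 fps  (consistent with the ~18-22 fps observed)
```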
Experiment Number | Scenario | OCSF mAP (%) | ODSF mAP (%) | Mono-Sensor mAP (%) |
---|---|---|---|---|
1 | Densely populated urban street | 57.7857 | 63.9002 | 30.323 |
2 | Densely populated urban street | 54.3361 | 64.7871 | 29.8019 |
3 | Moderately populated urban | 57.0128 | 66.6676 | 31.7009 |
4 | Moderately populated urban | 58.9919 | 65.0118 | 33.2245 |
5 | Sparsely populated highway | 64.0089 | 70.5824 | 32.2637 |
6 | Sparsely populated highway | 64.3327 | 69.9066 | 33.034 |
7 | Densely populated highway | 62.7008 | 68.6842 | 30.8026 |
8 | Densely populated highway | 61.1029 | 67.7183 | 29.0807 |