Advanced Alarm Method Based on Driver’s State in Autonomous Vehicles
Abstract
1. Introduction
2. Alarm Design
2.1. Visual Alarm
- Provision method:
  - A visual alarm should be used in conjunction with an auditory alarm.
- Color:
  - Orange or yellow colors should be used to attract the driver’s attention.
  - Red should be used in dangerous scenarios where the driver needs to take immediate action.
  - Green should be used when the system is operating normally.
- Location:
  - Alarms requiring a quick response from the driver must be presented on the HUD within ±5° of the driver’s gaze.
- Size:
  - The text and icons of the provided information should be more than 0.5″ in diameter and, in the case of an emergency, more than 1″ in diameter.
- Flashing method:
  - The optimal flashing speed in an emergency is 3–5 Hz.
2.2. Auditory Alarm
- Provision method:
  - The auditory alarm should be delivered within 0.5 s of the relevant event.
  - The auditory alarm should supplement the visual alarm.
- Voice message:
  - Voice messages should use the same phrases as the visually displayed text.
  - A beeping sound should precede the voice information.
  - In an emergency, a speech rate of 150–200 words per minute should be used.
- Beeping sound:
  - In an emergency, the beeping sound may be used at a volume of up to 115 dB; in other scenarios, the volume should not exceed 90 dB.
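The timing and volume rules above can likewise be sketched as a small scheduler: the beep precedes the voice message, the sequence starts within 0.5 s of the event, and volume is capped by scenario. The function names and the beep durations are illustrative assumptions, not values from the paper.

```python
# Sketch of the auditory-alarm sequencing rules above.
# Names and beep durations are this example's assumptions.

MAX_DELAY_S = 0.5                          # alarm must start within 0.5 s of the event
VOLUME_CAP_DB = {"emergency": 115, "normal": 90}


def schedule_auditory_alarm(event_time_s, emergency):
    """Return (stage, start_time) pairs: a beep, then the voice message."""
    start = event_time_s + MAX_DELAY_S     # latest permitted onset
    beep_len = 0.5 if emergency else 0.3   # illustrative beep durations (assumption)
    return [("beep", start), ("voice", start + beep_len)]


def clamp_volume(requested_db, emergency):
    """Cap the requested volume at the scenario's permitted maximum."""
    cap = VOLUME_CAP_DB["emergency" if emergency else "normal"]
    return min(requested_db, cap)
```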
2.3. Visual/Auditory Alarm-Raising Method According to the Driver’s State
3. Experimental Design
3.1. Hypothesis
3.2. Independent Variable
3.3. Dependent Variables
- Visual recognition (survey): The degree to which the TOR was visually recognized, rated on a five-point Likert scale: (1) not recognized at all; (2) not recognized; (3) intermediate; (4) recognized; and (5) fully recognized.
- Auditory recognition (survey): The degree to which the TOR was auditorily recognized, rated on the same five-point Likert scale: (1) not recognized at all; (2) not recognized; (3) intermediate; (4) recognized; and (5) fully recognized.
- Reaction time (s): The time from the point of TOR generation to the completion of control takeover by the driver.
- Blink (count/s): The number of eye blinks per second from the point of TOR generation to the completion of control takeover by the driver.
- Gaze distance (mm): The driver’s gaze distance per second from the point of TOR generation to the completion of control takeover by the driver.
- Pupil diameter (mm): Changes in the driver’s pupil diameter per second from the point of TOR generation to the completion of control takeover by the driver.
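For illustration, the quantitative measures above could be computed from a stream of eye-tracker samples roughly as follows. The sample format, field names, and blink-edge counting are assumptions of this sketch, not the actual API of the equipment used in the study.

```python
# Hypothetical computation of the per-second eye measures from gaze samples.
from dataclasses import dataclass
import math


@dataclass
class GazeSample:
    t: float          # seconds since TOR generation
    x: float          # gaze point on screen, mm
    y: float
    pupil_mm: float
    blink: bool       # True if this sample falls inside a blink


def takeover_metrics(samples, takeover_t):
    """Summarize blink rate, gaze travel, and pupil change over the TOR window."""
    window = [s for s in samples if s.t <= takeover_t]
    duration = takeover_t  # TOR fires at t = 0, so the window length is the reaction time
    # Count rising edges (non-blink -> blink) as individual blinks.
    blinks = sum(1 for a, b in zip(window, window[1:]) if b.blink and not a.blink)
    travel = sum(math.dist((a.x, a.y), (b.x, b.y)) for a, b in zip(window, window[1:]))
    pupil_change = sum(abs(b.pupil_mm - a.pupil_mm) for a, b in zip(window, window[1:]))
    return {
        "reaction_time_s": duration,
        "blink_rate": blinks / duration,
        "gaze_distance_mm_per_s": travel / duration,
        "pupil_change_mm_per_s": pupil_change / duration,
    }
```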
3.4. Experiment Equipment
3.5. Experiment Procedure
3.5.1. Drowsiness Scenario
- Normal manual driving (30–60 min). Prior to the experiment, the participants in the evaluation group limited their sleep to less than 4 h the day before the evaluation. Within 30 min of starting the experiment using the simulator, the participants were made to drive along a tedious road at 80 km/h; here, the vehicle was driven manually until drowsiness set in. The participants’ drowsiness was judged by the participants themselves. If the participant did not feel drowsy even after driving for more than 60 min, the experimenter would stop the experiment.
- Drowsy autonomous driving (20 min). The participants, recognizing that they felt drowsy, passed driving control to the vehicle by pressing the “autonomous driving” button; thereafter, the participant did not do anything related to driving or NDRTs but remained comfortable in the autonomous driving state.
- Takeover request (5 s, three repetitions). System failure occurred on the road and the autonomous driving system provided a visual and auditory takeover alarm. When the participants recognized the alarm (the alarm repeated three times with a ringing duration >5 s), they held the steering wheel with both hands and looked forward.
- Manual driving (5 min). The participants, upon transferring control of the vehicle back to themselves, manually drove the vehicle for approximately 5 min before the scenario ended.
- Usability evaluation questionnaire (10 min). After completing the scenario, a questionnaire related to different aspects such as the effectiveness and recognition of the drowsiness alarm was provided to the participants.
3.5.2. Distracted Scenario
- Normal manual driving (10 min). Using the simulator, the participants drove at 80 km/h in a normal state without performing any other tasks.
- Distracted autonomous driving section (20 min). The participants pressed the “autonomous driving” button and transferred driving control to the vehicle. During autonomous driving, the participants listened to a song and entered its lyrics on a smartphone (visual/auditory/manual task). In this situation, the participants were not looking in the forward direction, their hearing was focused on the music, and their hands were away from the steering wheel, constituting a combined visual/auditory/manual distraction state.
- Takeover request (5 s, three repetitions). When a system failure occurred on the road, the autonomous driving system provided a visual and auditory takeover alarm to the participants. When the participants recognized the alarm (repeated three times with a ringing duration >5 s), they held the steering wheel with both hands and looked forward.
- Manual driving (5 min). After retaking control of the vehicle, the participant manually drove the vehicle for roughly 5 min before the scenario ended.
- Usability evaluation questionnaire (10 min). After completing the scenario, a questionnaire was provided to the participants to evaluate various aspects such as the effectiveness and recognition of the distraction alarm.
3.6. Participants
4. Results and Discussion
4.1. Results of Visual and Auditory Recognition
4.2. Results of Reaction Time Detection
4.3. Results of Blink Counting
4.4. Results of Gaze Distance Measurements
4.5. Results of Pupil-Diameter-Change Measurements
4.6. EEG-Based Drowsiness Verification Results
5. Proposed Advanced Alarm Method
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
| Visual | Icon | Text | Auditory | Voice | Beep |
|---|---|---|---|---|---|
| Size (HUD) | ≥1 inch | ≥0.5 inch | Volume (dB) | 90 dB | 115 dB |
| Size (CID) | ≥2 inches | ≥1 inch | Rate | 250 words/min | 4 times/s |
| Flash | 4 times/s | 4 times/s | Frequency (Hz) | – | 1500 Hz |
| Location | Within 5° of the driver’s field of view | | Method | 1. Beep → 2. Voice | |
| Color | Red | Red | Voice type | Female voice | – |
| Method | – | Same phrase as the voice message | | Same phrase as the text | – |
| Visual | Icon | Text | Auditory | Voice | Beep |
|---|---|---|---|---|---|
| Size (HUD) | ≥0.5 inch | ≥0.3 inch | Volume (dB) | 90 dB | – |
| Size (CID) | ≥1 inch | ≥0.6 inch | Rate | 200 words/min | 3 times/s |
| Flash | 3 times/s | 3 times/s | Frequency (Hz) | – | 1000 Hz |
| Location | Within 15° of the driver’s field of view | | Method | 1. Beep → 2. Voice | |
| Color | Orange | Orange | Voice type | Female voice | – |
| Method | – | Same phrase as the voice message | | Same phrase as the text | – |
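Read together, the two alarm parameter tables above amount to a single lookup keyed by alarm severity. The sketch below encodes them as a dictionary; the key names and the “emergency”/“warning” labels are this example’s interpretation of the red and orange configurations, not terms from the paper.

```python
# Hypothetical lookup of the alarm parameters tabulated above.
# Key names and severity labels are this sketch's own.

ALARM_PARAMS = {
    "emergency": {  # red alarm configuration
        "icon_hud_in": 1.0, "text_hud_in": 0.5,
        "icon_cid_in": 2.0, "text_cid_in": 1.0,
        "flash_per_s": 4, "location_deg": 5, "color": "red",
        "voice_db": 90, "beep_db": 115,
        "voice_wpm": 250, "beep_per_s": 4, "beep_hz": 1500,
    },
    "warning": {    # orange alarm configuration
        "icon_hud_in": 0.5, "text_hud_in": 0.3,
        "icon_cid_in": 1.0, "text_cid_in": 0.6,
        "flash_per_s": 3, "location_deg": 15, "color": "orange",
        "voice_db": 90, "beep_db": None,  # beep volume not given in the table
        "voice_wpm": 200, "beep_per_s": 3, "beep_hz": 1000,
    },
}


def alarm_for(severity):
    """Return the full parameter set for the given alarm severity."""
    return ALARM_PARAMS[severity]
```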
| Classification | Variable | Description |
|---|---|---|
| Qualitative | Visual recognition | The level of visually recognizing the TOR, evaluated on a five-point Likert scale (1 = bad to 5 = good) |
| Qualitative | Auditory recognition | The level of auditorily recognizing the TOR, evaluated on a five-point Likert scale (1 = bad to 5 = good) |
| Quantitative | Reaction time (s) | Time from the point of TOR generation to the completion of control takeover by the driver |
| Quantitative | Blink count (count/s) | The number of eye blinks per second from TOR generation to completion of control takeover |
| Quantitative | Gaze distance (mm/s) | The gaze distance per second from TOR generation to completion of control takeover |
| Quantitative | Pupil diameter (mm/s) | The pupil diameter change per second from TOR generation to completion of control takeover |
| Dependent Variable | Drowsy State, Mean (SD) | Distracted State, Mean (SD) | p-Value (α = 0.05) |
|---|---|---|---|
| Visual recognition (score) | 3.60 (1.02) | 3.92 (1.19) | – |
| Auditory recognition (score) | 4.44 (0.82) | 4.58 (0.64) | – |
| Reaction time (s) | 4.15 (1.66) | 5.63 (2.84) | 0.03 |
| Blink count (count/s) | 0.25 (0.24) | 0.35 (0.29) | 0.20 |
| Gaze distance (mm/s) | 9.84 (3.13) | 13.22 (7.44) | 0.03 |
| Pupil diameter (mm/s) | 0.0075 (0.0027) | 0.0097 (0.0038) | 0.02 |
| Design Element | Method |
|---|---|
| Location | |
| Flashing method | |
| Icon and text | |
| Voice and beep sound | |
Han, J.-H.; Ju, D.-Y. Advanced Alarm Method Based on Driver’s State in Autonomous Vehicles. Electronics 2021, 10, 2796. https://doi.org/10.3390/electronics10222796