Article

Persona-PhysioSync AV: Personalized Interaction through Personality and Physiology Monitoring in Autonomous Vehicles

1 Advanced Reality Lab, Sammy Ofer School of Communication, Reichman University, 8 Ha’Universita st., Herzliya 4610101, Israel
2 The Research Center for Internet Psychology (CIP), Sammy Ofer School of Communication, Reichman University, 8 Ha’Universita st., Herzliya 4610101, Israel
3 Department of Mathematics and Computer Science, Open University of Israel, 1 University Road P.O. Box 808, Raanana 4353701, Israel
4 Adelson School of Entrepreneurship, Reichman University, 8 Ha’Universita st., Herzliya 4610101, Israel
* Author to whom correspondence should be addressed.
Sensors 2024, 24(6), 1977; https://doi.org/10.3390/s24061977
Submission received: 31 January 2024 / Revised: 8 March 2024 / Accepted: 13 March 2024 / Published: 20 March 2024

Abstract
The emergence of autonomous vehicles (AVs) marks a transformative leap in transportation technology. Central to the success of AVs is ensuring user safety, but this endeavor is accompanied by the challenge of establishing trust and acceptance of this novel technology. The traditional “one size fits all” approach to AVs may limit their broader societal, economic, and cultural impact. Here, we introduce the Persona-PhysioSync AV (PPS-AV). It adopts a comprehensive approach by combining personality traits with physiological and emotional indicators to personalize the AV experience to enhance trust and comfort. A significant aspect of the PPS-AV framework is its real-time monitoring of passenger engagement and comfort levels within AVs. It considers a passenger’s personality traits and their interaction with physiological and emotional responses. The framework can alert passengers when their engagement drops to critical levels or when they exhibit low situational awareness, ensuring they regain attentiveness promptly, especially during Take-Over Request (TOR) events. This approach fosters a heightened sense of Human–Vehicle Interaction (HVI), thereby building trust in AV technology. While the PPS-AV framework currently provides a foundational level of state diagnosis, future developments are expected to include interaction protocols that utilize interfaces like haptic alerts, visual cues, and auditory signals. In summary, the PPS-AV framework is a pivotal tool for the future of autonomous transportation. By prioritizing safety, comfort, and trust, it aims to make AVs not just a mode of transport but a personalized and trusted experience for passengers, accelerating the adoption and societal integration of autonomous vehicles.

1. Introduction

Autonomous vehicle (AV) systems are experiencing rapid advancements, as highlighted by Chan (2017), showing remarkable technological progress [1]. However, they are still limited in achieving complete automation [2]. Industry research promises a revolution in traffic safety, mobility, and quality of life. However, the success of AVs depends on their acceptance [3]. Amidst this progress, there remains a gap in our comprehension of driver preferences within the decision scenarios that semi-autonomous vehicles (semi-AVs) encounter [4], i.e., all automation levels except Level 5. The Society of Automotive Engineers (SAE) International introduced a classification system in 2016 for vehicle automation levels, ranging from Level 0 (no automation) to Level 5 (full driving automation). The levels are defined as follows: (0) Level 0—No Automation: traditional vehicles without any automation; (1) Level 1—Minimal and Optional Automation: includes features like adaptive cruise control; (2) Level 2—Partial Automation: the vehicle’s automated system can control the car, but the driver maintains responsibility and must be ready to intervene if necessary; (3) Level 3—Conditional Automation: the car is primarily operated by the automated system, but the driver must be present to take control in emergencies; (4) Level 4—High Automation: the automated system handles all driving tasks, including parking, although the driver has the option to take control; (5) Level 5—Full Automation: the vehicle is entirely autonomous, functioning like a robotic taxi [5]. As described, despite significant technological advancements, current AVs up to Level 4 still require active passenger interaction to facilitate safe traveling. The process of AV decision making, shaped by passengers’ inclinations and preferences, underscores Human–Vehicle Interaction (HVI) as a critical domain within the realm of these emerging challenges [6].
The progression towards empowering AVs, particularly those operating in dense, highly populated areas and responsible for transporting individuals safely from one location to another [7], is contingent on establishing robust trust and social acceptance towards these robotic entities. The level of trust that would result in high acceptance and product penetration has yet to be achieved [8]. This aspect is of paramount importance for the AV industry, as low levels of trust translate to decreased demand, subsequently leading to higher production costs. Conversely, as consumer trust in new technology grows, demand is expected to increase, driving production levels up and, ideally, reducing costs. The paradox within this context arises from the dual nature of the intent to incorporate autonomous driving agents into society. While there is a drive to design and incorporate autonomous machines and vehicles into daily life, over-reliance on autonomous control has been linked to accidents [9]. One of the central goals of autonomous driving is to provide its passengers with the leisure to participate in other activities such as work, watching a video, or simply enjoying the ride without the stress and demands of driving [10]. Consequently, and undesirably, the human component within this interaction assumes a vital role, especially in Level 3 [5], diminishing the potential of AVs to provide this leisure. Thus, the autonomous component cannot rely on technological aid alone; human involvement must be maintained.
The experience within an autonomous vehicle should enable passengers and drivers to engage in non-driving-related tasks (NDRTs) while maintaining relevant engagement with the journey, limiting motion sickness, and preserving high comfort [11]. Presently, numerous sensors are being integrated into the cabins of autonomous vehicles (AVs) to facilitate advanced interaction, enabling the understanding of driver and/or passenger actions, emotions, and personal preferences. This integration aims to provide adequate functionalities and services for a safe and enjoyable drive [6]. Among these interfaces, some focus on detecting attention, emotion recognition, drowsiness, and mental workload [6]. These interfaces prioritize evaluating and ensuring the proper state of the driver/passenger to maximize safety levels during the drive using driving assistance systems [6].
A critical aspect of the interaction between the autonomous vehicle and the passenger/driver is the Take-Over Request (TOR); TORs are crucial and must be acted upon quickly [12]. The TOR transition necessitates the passive passenger’s shift into an active driver role, which poses a recurring challenge. During this transition, the driver, who has been in a passive role, is asked by a driving assistance system to take control of the vehicle navigation system and wheel. One of the risks is a loss of the situational awareness (SA) [13] required to assume the active role swiftly [14,15]. Moreover, as outlined in Vogelpohl et al. (2018), individuals who are highly distracted need more time to restore their situational awareness and respond appropriately to the TOR [16]. The TOR represents one of the most critical interaction challenges between an autonomous vehicle and its driver/passenger. This interaction is paramount for ensuring passenger safety and, in turn, bolstering trust in the system. Therefore, confirming passenger engagement is essential to guarantee appropriate reaction times when necessary, such as during a TOR [17,18].
Du, Yang, and Zhou (2020) presented evidence indicating that cognitive workload, emotional states, attention levels, and engagement play pivotal roles in the TOR process; as such, it is imperative to monitor these factors [12]. In the paper by Melcher et al. (2015), the authors present the challenge of adequately designing TOR strategies. These include different strategies for the initiation of the TOR as well as integration with brake jerk and auditory stimuli [19]. Although they did not find a significant difference between their test groups, they report on the importance of integrating the TOR request as part of the NDRT (in their case, a mobile application in which participants were engaged), which results in higher perceived safety of the system [19]. In a study conducted by Zue et al. (2023), TOR design was assessed while passengers were reading or watching videos as NDRTs. Their findings indicate that incorporating tactile-auditory or tactile-visual cues with 7-s and 9-s TOR lead times produced positive outcomes in reading scenarios, while tactile cues were effective in video watching scenarios [20]. As presented, the process of reacquiring control of a vehicle following automation is of critical importance. This process necessitates that the driver maintain awareness of the specific emergency situation, process information conveyed by the system, and discern and comprehend the prevailing traffic conditions. Several factors contribute to the complexity of this task, encompassing objective factors such as traffic conditions, road conditions, and the control interface, as well as subjective factors including NDRTs, age, trust, urgency, the human–machine interface (HMI) employed, and SA [21]. Current solutions include the evaluation of different interaction paradigms that aim to recruit passenger engagement at low response times [6] and the incorporation of human-like behaviors into the operational parameters of the vehicles [22].
The Mediator project (European Union’s Horizon 2020 research and innovation programme) was designed to develop a system that can determine, at any given moment, whether the automated system or the human driver is best suited to drive. In our current work, we suggest expanding on current perspectives and unlocking a deeper understanding of users’ state and engagement levels by incorporating additional subjective factors into the measurement process, such as their personality traits [23] and psychophysiological responses [24,25]. Our goal is to introduce a novel interface to aid in this aspect for AVs up to Level 4, encompassing those involving TOR interaction. The integration of subjective measures related to personality and physiological traits may evoke apprehensions regarding data privacy [26]. To mitigate this, it is imperative that the system guarantee full privacy and encryption in data collection. By offering complete assurance of data privacy, the system can leverage the potential benefits of data collection while fostering a balanced interaction with the AV, thereby nurturing trust and bolstering acceptance.

2. Trust in the Safety of the Autonomous Car

Trust in and acceptance of technology are significantly influenced by an individual’s personality [23], disposition [27], and physiological tendencies [28]. A fundamental aspect of engaging in semi-AV driving involves transitioning from an active driver role to a passive passenger role and vice versa. This distinction gives rise to substantial concerns regarding the adoption of AVs and their market expansion [29]. Although highly advanced and capable, the autonomous vehicle would, at certain times, choose to relinquish control and activate a TOR strategy in order to negotiate the current obstacle or path with the user’s help. These reservations manifest themselves in reduced levels of trust and societal acceptance of AVs, thereby limiting the potential for the widespread adoption of such systems. Ganesh (2020) further highlights the necessity of incorporating human controls into the vehicle’s operational framework as a complementary mechanism to enhance agency and elevate safety, specifically in terms of information exchange between the vehicle, its occupants, and drivers [30]. Effective communication is vital, ideally through interactions that engage users without overtaxing their cognitive and emotional resources, based on a human-centered design approach [31]. Interaction mechanisms vary, ranging from voice commands [32] and touch-based inputs to gestures [6]. These interactions can be explicit, where the user is fully aware of the interaction, or implicit, occurring subconsciously. This assumption was displayed in the work by Nakade et al. (2023), who use haptic co-driving to demonstrate such concepts [9]. Applying this approach can lead to a smoother, more comfortable experience for passengers and fewer issues with the flawed operation of TORs.
It is suggested that the interaction with autonomous vehicles (AVs) might reflect the subtle, implicit interactions found in human–human collaborations during joint physical tasks or be akin to the way individuals use glasses for reading, where there is an underlying sense of merging or collaboration with the vehicle. Researchers such as Fuchs, Ganesh, and Nakade have explored these concepts [9,30,33]. The development and application of suitable implicit interfaces could potentially foster these sensations. By integrating human elements into the interaction systems between humans and vehicles, particularly through the ongoing monitoring of physiological responses, movement, eye tracking, and more, trust and acceptance are likely to increase. To cultivate trust, AVs should not solely rely on their sophisticated visual capabilities for navigation. They should also take into account the intentions of the passengers [34], their personalities [23,35,36,37], and levels of engagement [18]. In Nakade et al. (2023), a driver-centric automation strategy was introduced with the aim of achieving collaborative steering through haptic co-driving [9].
Our interaction framework involves two distinct stages, each tailored to different aspects of the autonomous vehicle (AV) experience. Firstly, our framework prioritizes safety during TOR transitions. This phase is crucial as it marks the shift from autonomous driving to human control. To ensure the safety of the drive, the AV employs comprehensive safety protocols, basing the operation on real-time monitoring systems provided by the framework. This includes rigorous testing and validation of the AV’s sensor fusion algorithms and decision making processes to accurately assess the driving environment and respond to potential hazards effectively. Additionally, clear communication channels are established to alert passengers about the transition and provide reassurance regarding the safety measures in place. Secondly, the framework seamlessly transitions into a mode focused on enhancing user comfort throughout the drive. This involves optimizing the AV’s driving behavior to prioritize passenger comfort, such as smooth acceleration, braking, and steering. Furthermore, the framework incorporates personalized settings and preferences to tailor the driving experience to individual passengers, including options for climate control, entertainment, and seating adjustments. By prioritizing passenger comfort, the framework aims to enhance the overall user experience and promote trust in and acceptance of autonomous driving technology. Based on the suggested implicit interaction framework, in this paper, we will propose a set of measures that can be incorporated into autonomous vehicles. These sensors, in conjunction with personality trait measurement, can offer an appropriate interface to monitor passenger engagement levels. This can help in adapting driving strategies to achieve the successful transportation of riders as well as TOR transitions. By implementing this method, the AV will be able to interpret both explicit (verbal, touch, movement, etc.)
and implicit signals from users. This will lead to enhanced HVI that is expected to bolster user acceptance and trust in AV technology. The AVs will adjust their driving and navigation based on these signals to ensure that users are engaged and comfortable. For example, the AV may alert users if they are drifting away or not paying attention, or it may adapt its driving patterns to reduce discomfort for passengers. The operation should be unobtrusive yet engaging, providing necessary control to avert potential disasters. We suggest the use of an array of sensors that will provide an implicit interface between the vehicle and the passenger/driver, resulting in a more natural way of interaction, one that induces a cooperative interaction within the HVI. We propose a conceptual framework referred to as the Persona-PhysioSync AV (PPS-AV) in Figure 1, designed to improve how users interact with semi-AVs. The PPS-AV framework is based on aligning the interaction with the user’s personality traits, as described by Amichai-Hamburger et al. (2022) [23]. Additionally, the PPS-AV framework considers users’ current psychophysiological and emotional state, indicated by factors like facial expressions. By integrating personality traits with real-time emotional and psychophysiological data that capture the user’s momentary state, the PPS-AV framework aims to tailor the interaction style between the user and the vehicle. This tailored approach is intended to enhance the user’s trust in and acceptance of the autonomous vehicle. The current work aims to elucidate the hypothesis that natural cooperative interaction with autonomous vehicles is crucial and might be even more critical than safety measures in elevating trust in and acceptance of the technology.
The PPS-AV framework draws on research by Pascale et al. (2021), which indicates that trust in technology, specifically autonomous vehicles (AVs), can be enhanced through direct experience [38]. Their study reveals that after individuals ride in an autonomous car, they tend to have a more favorable attitude towards this technology, showing increased acceptance and decreased risk perception. This change is particularly significant, as participants noted less false hazard detection during their experience [38]. These findings underscore the importance of human interaction with cars, suggesting that aligning driving patterns with passengers’ expectations and comfort levels could lead to greater trust in and acceptance of AVs.
Tailored interactions and interfaces, which are non-intrusive and require minimal cognitive effort [39] but elevate motivation [40], are particularly vital in the context of self-driving cars. Hartwich et al. (2021) conducted a simulator-based study wherein they assessed the impact of different human–machine interfaces (HMIs) on passenger experiences, with a particular focus on trust [29]. The study involved a comparison between a context-adaptive HMI and a fixed (permanent) HMI. Additionally, the researchers controlled for initial trust levels, distinguishing between low-trust and high-trust groups among their participants. The findings revealed a significant reliance on participants’ initial levels of trust and their perception of trust in various HMIs. Specifically, the low-trust group displayed a preference for the permanent HMI, while the high-trust group exhibited greater comfort with the context-adaptive HMI [29]. These results suggest the potential benefits of tailoring autonomous systems to individual users to enhance trust and overall user experience. Inducing trust in and acceptance of a system relies on the perception of risk and uncertainty [41].

3. Sensing Users and Tailoring the Experience of the Autonomous Car

It has been demonstrated that an individual’s personality influences how they are likely to use technology [42,43], specifically AVs [3,23,35,44]. Physiological tendencies were also found to play a role in how users interact with and use technology [45], and even more so in driving styles and interaction with AVs [46]. The concept of creating interfaces based on personality was developed by Amichai-Hamburger (2002) [42]. The link between personality traits and physiological responses, and the ability to predict one from the other, has been previously studied [45,47,48,49,50], specifically in the AV setting [51]. Engagement with the technology and attention to it are critical for inducing adequate interaction [52] and elevating levels of trust in and acceptance of it [53]. Personality measures provide valuable trait classification of users [54], while physiological tendencies and ongoing responses provide the dynamic aspect of user interaction [54]. Incorporating these tendencies can be a key predictor of users’ level of engagement and attention while interacting with the technology.
We suggest a framework that will be based on the development of a continuous monitoring protocol of passengers’ comfort [55], engagement [56,57], stress [58,59], and fatigue levels [60] through multimodal physiological and emotional parameters [61] adjusted to their personality traits (Figure 1) [62]. By proposing this protocol, we aim to create a new way for passengers and drivers to interact with their AVs. This approach will help design a personalized human–machine interface that gathers essential information about the user’s engagement and comfort. The vehicle can then use these data to adjust its driving style accordingly. This strategy is expected to elevate trust in and acceptance of this technology, as it tailors the driving experience to individual needs and preferences.
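To make the kind of fusion this monitoring protocol implies concrete, the following minimal sketch combines normalized multimodal features into a single engagement score and checks it against an alert threshold. All feature names, weights, and thresholds here are illustrative assumptions, not values prescribed by the PPS-AV framework:

```python
# Hypothetical sketch: fuse normalized physiological features into one
# engagement score, with a per-trait weight adjustment. Names, weights,
# and thresholds are illustrative assumptions, not part of the PPS-AV spec.

def engagement_score(features, traits):
    """features: dict of sensor readings normalized to [0, 1]
       traits:   dict of Big Five scores normalized to [0, 1]"""
    base_weights = {"eda_arousal": 0.3, "hrv_rmssd": 0.3,
                    "gaze_on_road": 0.25, "blink_duration": 0.15}
    # Illustrative personalization: discount arousal for neurotic
    # passengers, whose baseline arousal tends to run higher.
    w = dict(base_weights)
    w["eda_arousal"] *= 1.0 - 0.5 * traits.get("neuroticism", 0.5)
    total = sum(w.values())
    return sum(w[k] * features.get(k, 0.0) for k in w) / total

def needs_alert(score, threshold=0.4):
    # Trigger an attentiveness alert (e.g., ahead of a TOR) below threshold.
    return score < threshold
```

In practice, the weights would be learned per user rather than fixed, but the structure — trait-conditioned weighting of state features — is the core idea.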

3.1. Personality

Amichai-Hamburger et al. (2020, 2022) suggested that individualizing the autonomous travel experience using personality trait measures can build passengers’ confidence in the car [23,63]. One of the most influential models of personality is the Big Five, which posits five major personality factors: (1) openness to experience (inventive/curious vs. consistent/cautious), (2) conscientiousness (efficient/organized vs. easy-going/careless), (3) extroversion (outgoing/energetic vs. solitary/reserved), (4) agreeableness (friendly/compassionate vs. challenging/detached), and (5) neuroticism (sensitive/nervous vs. secure/confident) [64]. These factors were found to be linked to driving styles [23] and crucial in designing intelligent systems and engagement classification [65]. Furthermore, individuals with different needs for control respond differently to a given situation: those with a high need for control would feel less comfortable relinquishing control to the car. When it comes to individuals’ “locus of control”, the tendency is to blur the lines and abdicate responsibility to the technology. This results in overconfidence in the technology’s ability to keep them safe, leading to slower intervention in automated functions such as cruise control even when there are clear signs of system failure [66]. Similarly, reckless driving patterns have been linked to the belief that the chances of getting into an accident are out of one’s control [67]. With regard to sensation seeking, individuals scoring high on thrill and adventure seeking and on experience seeking (i.e., sensation seekers) were found to drive faster, leave less headway between cars, and tend to brake later and harder [3,67].
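As an illustration of how Big Five scores might parameterize the interaction, the sketch below maps normalized trait scores to a few driving and alerting parameters. The parameter names, formulas, and cut-offs are hypothetical examples for demonstration, not validated mappings from the cited literature:

```python
# Illustrative sketch only: map normalized Big Five scores (0-1) to
# interaction parameters. Parameter names and formulas are assumptions.

def interaction_profile(big_five):
    n = big_five.get("neuroticism", 0.5)
    o = big_five.get("openness", 0.5)
    return {
        # More anxious passengers get earlier, longer take-over warnings.
        "tor_lead_time_s": 7.0 + 4.0 * n,
        # Less open passengers get a more conservative driving style.
        "max_accel_ms2": 1.0 + 1.5 * o,
        # High neuroticism favors redundant (multimodal) alerts.
        "alert_modalities": ["auditory", "haptic", "visual"] if n > 0.6
                            else ["auditory"],
    }
```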

3.2. Physiological Sensing Indices

Physiological measures provide a window into the emotional state of an individual. As indicators of the activation of the autonomic nervous system, these measures provide a glimpse into the dynamic activation of the sympathetic nervous system under unease or stress and the default activation of the parasympathetic system at ease [68]. The extraction of engagement and trust from such physiological parameters was explored by Mühl and colleagues (2020) [69]. Using an autonomous vehicle and a simulator, they aimed to test whether trust is reflected in physiological responses. They measured trust through self-reports, skin conductance, and driving scenarios. Participants experienced the same route with both human and autonomous drivers. Results indicated higher trust in human drivers, confirmed by physiological measures like skin conductance. The study also found a preference for human drivers using a defensive driving style [69]. Feasibility has been demonstrated in measuring individual responses using physiological indices, including stress and emotions while driving, in real-life scenarios [69]. Further evidence of the possibility of evaluating users’ emotional responses using physiology was shown in [61,70,71,72] using relevant simulators [73,74]. Tavakoli et al. (2021) presented the HARMONY experimental platform, which can analyze drivers’ states in naturalistic driving studies using multimodal data collection [71]. This includes in-cabin parameters such as light, noise, and ambiance, driver measures such as heart rate, facial, gaze, and pose features, and outside measures based on the global positioning system (GPS) and video-based AI [71]. Utilizing physiological sensing technologies such as heart rate, electrodermal activity, pupillometry, and others has proven valuable for revealing cognitive states, motion sickness [75], and drowsiness [76] as well as reducing distraction levels while measuring engagement levels while interacting with an AV [6].
Automatic engagement detection was shown using wearable sensors [77]; the classification of engagement while learning using these methods reached up to 85% classification rates. Belle et al. (2011) present a signal processing system aimed at detecting optimal engagement and attention with successful results in distinguishing between attentional states [78]. Katsis et al. (2008) performed an evaluation of car-racing drivers’ emotional states using facial electromyograms, electrocardiograms, respiration, and electrodermal activity, reaching 75% accuracy rates [79]. As presented, an array of measurements can provide the required interface and support the suggested implicit interface.

3.3. Validated Psychophysiological Sensors

The PPS-AV framework should be equipped with advanced sensors designed to track and assess passenger engagement and comfort inside the AV. We advocate for a comprehensive multimodal approach, which simultaneously analyzes multiple measures. This strategy is based on the following key indices.

3.3.1. Electrodermal Activity (EDA)

Electrodermal activity (EDA) is a biosignal that reflects an individual’s level of arousal by measuring electrical properties of the skin, such as changes in conductance induced by sweat, typically at the hands [80]. This method has been extensively researched and validated in numerous studies, including those specifically evaluating arousal in autonomous vehicles (AVs) [81]. Su and Jia (2022) used EDA features to detect the comfort levels of passengers in an AV [82]. In a study by Zheng et al. (2015), various biosignals were used to quantify drivers’ stress responses in autonomous trucks [83]. The researchers utilized palmar perspiration, EDA, masseter electromyography, and teeth clenching to assess the stress responses of drivers in autonomous trucks, establishing a correlation between the inter-vehicular distance (distance between trucks) and the corresponding psychophysiological reactions. This study was contextualized within an operational framework to minimize the spatial gap between trucks to reduce aerodynamic drag caused by wind resistance. The primary focus of the investigation was to evaluate the impact of this proximity reduction strategy on the well-being of drivers and passengers within these vehicles. They concluded that reported stress significantly increases as the gap between trucks decreases [83].
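A minimal sketch of the kind of EDA feature extraction such studies rely on, assuming a skin-conductance signal in microsiemens sampled at a few hertz; the moving-average tonic estimate and the fixed phasic threshold are deliberate simplifications of the decomposition methods used in practice:

```python
# EDA feature sketch (assumptions: 4 Hz skin-conductance samples in
# microsiemens; a trailing moving average approximates the tonic level,
# and phasic peaks are counted as threshold crossings of the residual).

def eda_features(sc, fs=4, win_s=5, phasic_thresh=0.05):
    win = max(1, int(win_s * fs))
    tonic = [sum(sc[max(0, i - win + 1):i + 1]) /
             len(sc[max(0, i - win + 1):i + 1]) for i in range(len(sc))]
    phasic = [s - t for s, t in zip(sc, tonic)]
    # Count upward crossings of the phasic threshold (SCR-like events).
    peaks = sum(1 for a, b in zip(phasic, phasic[1:])
                if a < phasic_thresh <= b)
    return {"mean_tonic": sum(tonic) / len(tonic), "scr_count": peaks}
```

The tonic level tracks slow arousal trends, while the count of skin conductance responses (SCRs) indexes momentary reactions such as those elicited by a sudden maneuver.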

3.3.2. Electroencephalography (EEG)

EEG is a technique for recording the macroscopic activity of the brain’s surface layer through electrical activity on the scalp, using dry-contact electrodes or conductive gel. It was found to be a useful tool in driver distraction research, showing promise in detecting driver fatigue or sleepiness [84,85]. EEG activity is differentiated into beta, alpha, delta, and theta bands based on frequency range. Theta waves mark sleepiness onset, while delta activity indicates a sleep state [86]. Some examples of suggested in-vehicle interaction are the recording of EEG for measuring fatigue and sleepiness [84,86,87]. Furthermore, the use of EEG in a work scenario has proved helpful as well [88].
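The band distinction above can be operationalized as band power. The sketch below uses a naive discrete Fourier transform for clarity (real driver-monitoring pipelines would use windowed estimators such as Welch's method, and the exact band edges vary across studies):

```python
import cmath, math

# Band-power sketch for a single EEG channel (naive DFT, O(n^2), for
# illustration only; production code would use an FFT/Welch estimator).

def band_power(x, fs, lo, hi):
    n = len(x)
    power = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if lo <= f < hi:
            c = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
            power += abs(c) ** 2 / n
    return power

def drowsiness_ratio(x, fs):
    # Rising theta (4-8 Hz) relative to alpha (8-13 Hz) is a common
    # drowsiness-onset marker.
    theta = band_power(x, fs, 4.0, 8.0)
    alpha = band_power(x, fs, 8.0, 13.0)
    return theta / alpha if alpha > 0 else float("inf")
```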

3.3.3. Electrooculography (EOG)

The use of optical sensors to track eye movement and blinking patterns has been validated to provide insights into cognitive alertness in drivers [89]. Rapid eye movements signify alertness, while drowsiness leads to slower movements and prolonged blinks. Using EOG was shown to provide up to 80% effectiveness in detecting driver drowsiness [90]. Other work has shown that EOG can be used to predict cognitive alertness and drowsiness [91].
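As a sketch of how such blink-based indicators can be derived, assume an eye-openness time series (0 = closed, 1 = open) from an optical or EOG-based tracker; the closure threshold follows the general PERCLOS idea, and its exact value is an assumption:

```python
# Blink/drowsiness indicator sketch from an eye-openness time series
# (assumption: openness sampled at `fs` Hz; the 0.2 closure threshold
# is illustrative, in the spirit of the PERCLOS metric).

def blink_stats(openness, fs, closed_thresh=0.2):
    closed = [o < closed_thresh for o in openness]
    perclos = sum(closed) / len(closed)      # fraction of time eyes closed
    blinks, cur = [], 0
    for c in closed:
        if c:
            cur += 1
        elif cur:
            blinks.append(cur / fs)          # blink duration in seconds
            cur = 0
    if cur:
        blinks.append(cur / fs)
    mean_blink_s = sum(blinks) / len(blinks) if blinks else 0.0
    return {"perclos": perclos, "mean_blink_s": mean_blink_s}
```

Rising PERCLOS and lengthening mean blink duration are the "prolonged blinks" the text refers to.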

3.3.4. Electromyography (EMG)

The recording of EMG involves an electrical sensor that records muscle electrical activity. Studies such as that by Katsis et al. (2008) show that driver distraction is marked by reduced EMG signal amplitude and frequency, making it a useful tool for assessing alertness levels [79]. The work by Hussain et al. (2023) describes the use of EMG for intended-action prediction as an addition to camera-, radar-, and lidar-based advanced driver assistance systems in AVs [92]. Additionally, the research showcases the effectiveness of the suggested concept through experiments aimed at categorizing online and offline EMG data in real-world environments [92]. The assessment of a driver’s capability to adequately respond to TORs using EMG measurements was demonstrated in [93]. The SafeDriving system by Fan et al. (2022) employs EMG sensors and deep learning to identify real-time abnormal driving behaviors [94]. The system utilizes a wearable EMG sensor on the driver’s forearm to gather data related to five predefined abnormal driving actions, each of which is subsequently labeled [94].
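The two descriptors the distraction finding refers to, amplitude and frequency, can be sketched as follows. RMS is the standard amplitude measure; here a zero-crossing rate stands in as a simple frequency proxy (a simplification of the spectral estimators used in the cited work):

```python
import math

# EMG descriptor sketch: RMS amplitude and a zero-crossing-rate frequency
# proxy. Window lengths and any decision thresholds would be
# application-specific assumptions.

def emg_features(x, fs):
    rms = math.sqrt(sum(v * v for v in x) / len(x))
    crossings = sum(1 for a, b in zip(x, x[1:]) if a * b < 0)
    zcr_hz = crossings * fs / (2 * len(x))   # approx. dominant frequency
    return {"rms": rms, "zcr_hz": zcr_hz}
```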

3.3.5. Electrocardiography (ECG)

ECG monitoring involves a sensor that records heart activity and rate, offering a more straightforward way to detect drivers’ alertness [95]. It can also indicate a driver’s mood, with high heart rates suggesting excitement or anger and normal rates indicating calmness. ECG was effective in identifying alertness [95] and stress [73]. In the work described by Healey and Picard (2005), driver stress was extracted from heart rate measures known as heart rate variability (HRV), a measure reflecting the balance between sympathetic and parasympathetic activation [81]. Dillen et al. (2021) present compelling evidence of the differences in responses elicited by various autonomous driving styles [96]. Their paper describes ongoing physiological measurements (EDA and heart rate) recorded while the autonomous vehicle took different driving strategies, including acceleration thresholds or lane change behavior. Their results show that the physiological responses were proportionally related to the magnitude of the acceleration and jerk [96]. The AUTOMOTIVE case study by Esteves et al. (2021) focused on using signal processing and machine learning methods to detect drowsiness in individual drivers within AVs, utilizing immersive driving simulators [97]. They employed ECG and facial data to continuously develop personalized drowsiness models, enhancing monitoring efficiency [97].
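The HRV idea can be illustrated with the standard time-domain indices computed over successive R-R intervals (the spacing between heartbeats detected in the ECG). SDNN and RMSSD are conventional definitions; RMSSD in particular tracks parasympathetic (vagal) activity and tends to fall under stress:

```python
import math

# Time-domain HRV sketch from successive R-R intervals in milliseconds.
# SDNN: standard deviation of all intervals; RMSSD: root mean square of
# successive differences (a parasympathetic-activity index).

def hrv_time_domain(rr_ms):
    n = len(rr_ms)
    mean_rr = sum(rr_ms) / n
    sdnn = math.sqrt(sum((r - mean_rr) ** 2 for r in rr_ms) / (n - 1))
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return {"mean_hr_bpm": 60000.0 / mean_rr, "sdnn": sdnn, "rmssd": rmssd}
```

The sympathetic/parasympathetic balance mentioned above is usually estimated in the frequency domain (LF/HF ratio), which would require a spectral step beyond this sketch.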

3.3.6. Body Temperature

Evaluating the comfort levels of drivers and passengers can involve assessing their body temperature, which can be used to fine-tune the car’s internal climate settings to suit individual preferences. Several wearable systems are available to measure body temperature, with some advanced models offering wireless data transmission to smartphones or in-car electronic devices [98,99,100]. Gwak et al. (2015, 2019) present evidence that changes in indoor ambient temperature can enhance both thermal comfort and occupants’ arousal levels [101,102]. Furthermore, as described by Sunagawa et al. (2023), during brief driving sessions, the individual’s thermal surroundings may exert a more significant impact on the progression of drowsiness than the duration of driving [103].
Variations in body temperature can be indicative of stress, anxiety, or discomfort, all of which can significantly impair cognitive functions and decision-making abilities [104]. In the context of autonomous vehicles, where the interaction between human and machine is nuanced and complex, recognizing and responding to such physiological cues becomes essential [105]. For instance, an increase in body temperature could signal stress or anxiety, prompting the vehicle’s systems to initiate calming interventions, such as adjusting the interior lighting or climate, or even suggesting a break in the journey. Furthermore, monitoring body temperature can enhance the personalization of the user experience [106]. By understanding an individual’s typical physiological responses under various conditions, the vehicle can preemptively adjust settings for optimal comfort and performance. This level of personalization not only improves the user’s experience but also fosters deeper trust and rapport between the user and the autonomous system, which is crucial for the widespread acceptance and success of these vehicles.
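The idea of comparing a reading against a passenger's personal baseline and triggering an intervention can be sketched as follows. The z-score threshold and the action names are hypothetical placeholders, not values or interfaces from the framework.

```python
from statistics import mean, stdev

def comfort_intervention(baseline_temps_c, current_temp_c, z_threshold=2.0):
    """Compare the current skin-temperature reading against the passenger's
    personal baseline; a large deviation is treated as a possible
    stress/discomfort cue. Threshold and actions are illustrative only."""
    mu = mean(baseline_temps_c)
    sigma = stdev(baseline_temps_c)
    z = (current_temp_c - mu) / sigma if sigma > 0 else 0.0
    if z >= z_threshold:
        return "cool_cabin_and_dim_lighting"  # hypothetical calming action
    if z <= -z_threshold:
        return "warm_cabin"
    return "no_action"

# Hypothetical per-passenger baseline collected over earlier rides.
baseline = [36.4, 36.5, 36.5, 36.6, 36.4, 36.5]
```

Anchoring the decision to an individual baseline, rather than a universal cutoff, mirrors the personalization argument above: the same absolute temperature may be normal for one passenger and anomalous for another.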

3.4. Emotional Measurement

Facial Action Coding System (FACS)

Implementing continuous camera input aimed at the driver or passenger’s face allows for the analysis of ongoing emotional states. The effectiveness of using FACS [107] in an autonomous vehicle (AV) environment was confirmed in a study by Meza-Garcia et al. (2021), which demonstrated that simultaneously recording EDA and FACS is highly effective in capturing emotional responses from participants [108]. Additionally, in-cabin recording and emotion extraction from drivers’ expressions were further validated in the work by Liu et al. (2021). Their study describes the concept of an “empathic car”, which is capable of reading and responding to users’ emotions during the driving experience [109]. In the study by Beggiato, Rauh, and Krems (2020), facial expressions were used to indicate discomfort during automated driving; the authors concluded that facial action unit analysis can contribute to the automatic detection of discomfort while riding in AVs [110].
Evaluating experiences based on a single signal measurement often falls short of accurately representing an individual’s actual state. By integrating a range of signals, as described above, including those related to personality traits, and applying multimodal analysis techniques [111], a deeper and more comprehensive understanding can be achieved. This approach is particularly beneficial when subjective reports are unavailable or are influenced by social apprehensions.
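A minimal late-fusion sketch of this multimodal integration: each modality is assumed to have been normalized upstream to a score in [0, 1], and the fused state estimate is their weighted average. The modality names and weights are illustrative, not calibrated values from [111].

```python
def fuse_modalities(features, weights):
    """Late-fusion sketch: combine per-modality discomfort scores
    (each already normalized to [0, 1]) into one weighted estimate."""
    assert set(features) == set(weights), "modalities must match"
    total_w = sum(weights.values())
    return sum(features[m] * weights[m] for m in features) / total_w

# Hypothetical per-modality discomfort scores for one time window.
window = {"eda": 0.8, "hrv": 0.6, "facs": 0.7}
w = {"eda": 0.4, "hrv": 0.3, "facs": 0.3}
score = fuse_modalities(window, w)
```

In practice, the weights themselves could be learned per passenger, so that modalities that are more reliable for a given individual (or personality profile) contribute more to the fused estimate.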

3.5. Personality and Physiological Indices

Personality traits and their interactions with physiological responses have been studied previously. Stemmler and Wacker (2010) describe personality–physiology relationships for measures of central and peripheral nervous system activity. They suggest an interactionist conceptualization of traits as dispositions whose expression is moderated by the situational context and its subjective representation by the participants, and they conclude by outlining the implications of the interactionist approach for physiological personality research [50]. Evin et al. (2022) describe a machine learning process that could predict drivers’ personality trait levels using EDA and driving behaviors. Their study evaluated behaviors in risky urban situations simultaneously with EDA activity. The machine learning protocols employed in the study achieved prediction accuracies of 0.968 to 0.974, with better detection of neuroticism, extroversion, and conscientiousness [48]. In a study by Childs, White, and de Wit (2014), individuals with high negative emotionality exhibited considerably greater emotional distress and lower blood pressure responses to the Trier Social Stress Test [47]. Individuals with high agentic positive emotionality exhibited prolonged heart rate responses to stress, whereas those with high communal positive emotionality exhibited smaller cortisol responses. Shui et al. (2023) report on the ability to predict Big Five personality traits using physiological signals. In their experiment, they recorded HR-based features in five situations averaged across ten days [49]. This evaluation revealed prediction correlations of 0.32 and 0.26 for the dimensions of openness and extroversion, respectively, with the prediction correlations trending toward significance for conscientiousness and neuroticism. Their findings demonstrate the link between personality and daily heart rate measures using state-of-the-art commercial devices [49].
Physiology and driver characteristics were found to be valid for classifying sleep deprivation and cell phone use, and the combination of all three feature groups (physiology, personal characteristics including personality measures, and vehicle kinematics) yielded the highest classification accuracy [51]. As illustrated, the strong interconnection between personality and physiological responses underscores the necessity of considering both aspects in tandem. For instance, an individual with high scores on thrill-seeking traits will exhibit distinct physiological responses compared to someone who scores low on the same measures. Personality traits constitute the stable component of evaluation, while physiological indices contribute the dynamic elements of measurement. Together, they provide a comprehensive account of the entire experience. Hence, personalization is crucial in fostering a robust interaction between humans and vehicles.
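The combined-feature approach reported in [51] amounts to concatenating the stable (personality), dynamic (physiology), and contextual (kinematics) feature groups into a single vector before classification. The sketch below shows only that assembly step; all field names are illustrative assumptions, not the features used in the cited study.

```python
def build_feature_vector(personality, physiology, kinematics):
    """Concatenate the stable (personality), dynamic (physiology), and
    contextual (vehicle kinematics) feature groups into one vector for
    a downstream classifier. Field names are hypothetical."""
    order_p = ["extraversion", "neuroticism", "conscientiousness"]
    order_s = ["mean_hr_bpm", "eda_peaks_per_min"]
    order_k = ["mean_speed_kmh", "std_lane_offset_m"]
    return ([personality[k] for k in order_p]
            + [physiology[k] for k in order_s]
            + [kinematics[k] for k in order_k])

vec = build_feature_vector(
    {"extraversion": 3.5, "neuroticism": 2.0, "conscientiousness": 4.0},
    {"mean_hr_bpm": 72.0, "eda_peaks_per_min": 4.0},
    {"mean_speed_kmh": 58.0, "std_lane_offset_m": 0.3},
)
```

Fixing an explicit feature order keeps the vector layout consistent across rides, which matters because the personality block changes rarely while the physiological and kinematic blocks are refreshed every window.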

4. Conclusions

The introduction of AVs is a significant technological breakthrough with the potential to transform the transportation sector. User safety is the cornerstone of any mobility technology, including AVs, making it the top priority. While ensuring safety, a secondary goal for AVs is to offer a comfortable experience that enables users to engage in non-driving-related tasks (NDRTs), as outlined in the AV mandate. Despite advancements in safety features, trust in and acceptance of AV technology remain relatively low, which could impede broader adoption, resulting in low demand and higher production costs. Widespread adoption of AVs may also be hindered by the traditional “one size fits all” approach, which could limit their social, economic, and cultural impact. A key challenge identified in the literature is the low level of trust in and acceptance of AV technology. Research indicates the feasibility of measuring users’ sensations and emotional responses during autonomous drives. As highlighted in the work of Amichai-Hamburger et al. (2022), variations in personality traits are crucial in understanding and predicting the appropriate driving style for each passenger [23]. Additionally, a growing body of research is exploring the link between individual personality traits and their psychophysiological responses. Enhancing user experience and interaction design is critical to building trust in AV technology.
The main paradox, as discussed above, revolves around the seamless transition between autonomous driving and human driver control. Autonomous vehicles (AVs) aim to offer passengers the opportunity to engage in non-driving-related tasks (NDRTs). However, until Level 5 autonomy is achieved, where complete automation is possible [5], fully disengaging passengers poses significant risks. This means that while participating in NDRTs, passengers must maintain a certain level of engagement to respond effectively to TOR events. The suggested PPS-AV framework is designed to resolve this paradox by providing passengers with a dependable monitoring protocol. This protocol allows passengers to disengage to a reasonable extent while ensuring they are promptly alerted if their disengagement becomes excessive. Furthermore, to the best of our knowledge, the integration of personality traits with physiological and emotional indices to classify passengers’ engagement and comfort in autonomous vehicles has not yet been established. This gap highlights the novelty of our proposed framework, as it pioneers linking these elements to enhance the understanding and personalization of passenger experiences in autonomous driving contexts.
For effective interaction with AVs, innovative interfaces are required. These interfaces should account for personality traits, which affect users’ experiences with AVs, and capture both explicit responses (such as speech or behavior) and implicit reactions (such as psychophysiological and emotional indices) that influence the overall sensation during the experience. Providing AVs with the right interface can foster a heightened sense of Human–Vehicle Interaction (HVI), perceived by users as collaboration, thereby building trust in the technology. In this paper, we propose a new framework for designing an implicit interface with AVs. This interface aims to monitor passengers’ states to tailor driving styles to individual preferences and track user engagement. It is designed to alert users when they reach critical levels of disengagement and low situational awareness, in which they may not respond adequately to TORs, enhancing the overall experience with AVs.
The PPS-AV framework places its emphasis on the initial assessment of passengers’ engagement and comfort levels in autonomous vehicles (AVs). We believe that monitoring individuals’ ongoing states can subsequently help identify conditions unsuitable for vehicle operation during TOR events. Additionally, it can recommend adjustments to driving patterns in response to passengers’ discomfort alerts. Personality traits form the enduring aspect of assessment, while physiological indices add the dynamic facets to measurements. When combined, they offer a holistic depiction of the overall experience. For example, the AV will notify the passenger using naturalistic notifications (haptic, visual, or auditory) when they are too engaged in the NDRTs to pay attention to the road, so that they can regain the necessary situational awareness. Without this assessment, identification and intervention would not be possible. The PPS-AV system serves as the foundational level of interaction, paving the way for advancements in Human–Vehicle Interaction within AVs. The next levels will include interaction protocols based on interfaces such as haptic alerts and visual and auditory cues, all grounded in the concept of emotional haptics [112], with the aim of counteracting low passenger engagement without interrupting passengers’ NDRTs.
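The escalating-notification logic described above can be sketched as a simple mapping from monitored state to channel. The thresholds, the channel ordering, and the [0, 1] normalization are illustrative assumptions, not calibrated values from the framework.

```python
def select_notification(engagement, situational_awareness,
                        moderate=0.6, low=0.4, critical=0.2):
    """Map monitored engagement and situational-awareness levels
    (both normalized to [0, 1]) to an escalating notification channel.
    Thresholds and channel ordering are illustrative placeholders."""
    worst = min(engagement, situational_awareness)
    if worst < critical:
        return "auditory"  # most salient: imminent TOR-readiness risk
    if worst < low:
        return "visual"
    if worst < moderate:
        return "haptic"    # gentle nudge without interrupting the NDRT
    return None            # passenger sufficiently engaged

channel = select_notification(engagement=0.35, situational_awareness=0.8)
```

Taking the minimum of the two indices means that either excessive NDRT absorption or low situational awareness alone is enough to trigger an alert, consistent with the TOR-readiness goal.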
Another potential application of the system could involve assessing a passenger’s initial condition to determine their eligibility to drive, ensuring that they can handle a TOR within the required response time to avoid a crash, in a manner aligned with their personality traits and inclinations.
A crucial concern demanding attention is the integration of sensors in the automotive sector, which presents two significant challenges: precision and cost. The precision of measurements profoundly affects safety, while cost considerations are pivotal in fully realizing the products’ potential. Therefore, it is imperative to address these challenges during the system’s development. Future research and development efforts should place significant emphasis on resolving these issues, as their successful resolution is essential for advancing the automotive industry.

5. Implications

The implications of the Persona-PhysioSync AV (PPS-AV) framework and the broader concepts discussed in this paper for the field of autonomous vehicles (AVs) are significant:
  • Enhanced User Experience: The PPS-AV framework and the focus on monitoring and accommodating individual passenger preferences and comfort levels can greatly enhance the overall user experience. This could lead to increased user satisfaction and greater adoption of AV technology.
  • Safety: By monitoring passengers’ engagement and providing alerts when necessary, the framework contributes to passenger safety. It ensures that passengers remain aware and capable of responding to TOR events promptly, reducing the risk of accidents.
  • Trust Building: AVs often face trust and acceptance challenges. The ability to tailor the driving style and level of engagement to individual passengers’ preferences and personality traits can build trust and comfort with the technology, potentially accelerating its adoption.
  • Personalization: AVs equipped with the PPS-AV framework can offer a more personalized travel experience. This personalization extends beyond just driving style and engagement levels to include entertainment, climate control, and other aspects of the passenger’s journey.
  • Human–Vehicle Interaction (HVI): The development of innovative interfaces that consider personality traits, explicit and implicit responses, and emotional indices can foster a sense of collaboration between passengers and AVs. This HVI can make the interaction with AVs more intuitive and user-friendly.
  • Research and Development: The proposed framework highlights the need for ongoing research and development in the AV field. It encourages the integration of psychophysiological and emotional data into AV systems, which can lead to more advanced and capable AV technologies.
  • Safety Regulations: As AV technology evolves, the introduction of frameworks like PPS-AV may lead to establishing safety regulations and standards that focus on monitoring passenger states and engagement.
  • Market Differentiation: Companies implementing such personalized and safety-enhancing technologies may gain a competitive edge in the AV market, attracting customers who prioritize safety, comfort, and a personalized experience.
  • Data Privacy and Security: The collection of psychophysiological and emotional data raises concerns about data privacy and security. Implications for data protection and secure handling of sensitive passenger information must be addressed in the development and implementation of such systems.
The PPS-AV framework and related concepts have the potential to revolutionize the AV industry by addressing some of its key challenges and significantly improving the way passengers interact with and trust autonomous vehicles. These implications can drive advancements in technology, regulations, and user acceptance in the AV field.
In summary, our interaction framework is designed to address both the safety and comfort aspects of the AV journey. By effectively managing the TOR transition and prioritizing passenger comfort throughout the drive, we aim to deliver a seamless and enjoyable autonomous driving experience for all passengers.

Author Contributions

Writing—original draft, J.G. and Y.A.-H.; Writing—review and editing, Y.S., L.B. and G.G.-F. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

Data are contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
HVI	Human–Vehicle Interaction
TOR	Take-Over Request
AV	autonomous vehicle
AVs	autonomous vehicles
EDA	electrodermal activity
ECG	electrocardiography
PPG	photoplethysmogram
FACS	Facial Action Coding System
EMG	electromyography
EOG	electrooculography
HRV	heart rate variability
HMI	human–machine interface
NDRTs	non-driving-related tasks
SA	situational awareness
PPS-AV	Persona-PhysioSync AV
semi-AVs	semi-autonomous vehicles
SAE	Society of Automotive Engineers International

References

  1. Chan, C.Y. Advancements, prospects, and impacts of automated driving systems. Int. J. Transp. Sci. Technol. 2017, 6, 208–216. [Google Scholar] [CrossRef]
  2. Khan, M.A.; Sayed, H.E.; Malik, S.; Zia, T.; Khan, J.; Alkaabi, N.; Ignatious, H. Level-5 Autonomous Driving—Are We There Yet? A Review of Research Literature. ACM Comput. Surv. 2022, 55, 1–38. [Google Scholar] [CrossRef]
  3. Hegner, S.M.; Beldad, A.D.; Brunswick, G.J. In Automatic We Trust: Investigating the Impact of Trust, Control, Personality Characteristics, and Extrinsic and Intrinsic Motivations on the Acceptance of Autonomous Vehicles. Int. J. Hum.–Comput. Interact. 2019, 35, 1769–1780. [Google Scholar] [CrossRef]
  4. Park, S.Y.; Moore, D.J.; Sirkin, D. What a Driver Wants: User Preferences in Semi-Autonomous Vehicle Decision-Making. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI ’20), Honolulu, HI, USA, 25–30 April 2020; pp. 1–13. [Google Scholar] [CrossRef]
  5. SAE International Releases Updated Visual Chart for Its “Levels of Driving Automation” Standard for Self-Driving Vehicles—sae.org. Available online: https://www.sae.org/news/press-room/2018/12/sae-international-releases-updated-visual-chart-for-its-%E2%80%9Clevels-of-driving-automation%E2%80%9D-standard-for-self-driving-vehicles (accessed on 7 March 2024).
  6. Murali, P.K.; Kaboli, M.; Dahiya, R. Intelligent In-Vehicle Interaction Technologies. Adv. Intell. Syst. 2022, 4, 2100122. [Google Scholar] [CrossRef]
  7. Hang, P.; Lv, C.; Xing, Y.; Huang, C.; Hu, Z. Human-like Decision Making for Autonomous Driving: A Noncooperative Game Theoretic Approach. IEEE Trans. Intell. Transp. Syst. 2020, 22, 2076–2087. [Google Scholar] [CrossRef]
  8. Oh, P.; Jung, Y. Chapter 12. A Machine-Learning Approach to Assessing Public Trust in AI-powered Technologies. In Research Handbook on Artificial Intelligence and Communication; Edward Elgar Publishing: Cheltenham, UK, 2023; p. 193. [Google Scholar]
  9. Nakade, T.; Fuchs, R.; Bleuler, H.; Schiffmann, J. Haptics Based Multi-Level Collaborative Steering Control for Automated Driving. Commun. Eng. 2023, 2, 2. [Google Scholar] [CrossRef]
  10. Krueger, R.; Rashidi, T.H.; Rose, J.M. Preferences for Shared Autonomous Vehicles. Transp. Res. Part C Emerg. Technol. 2016, 69, 343–355. [Google Scholar] [CrossRef]
  11. Htike, Z.; Papaioannou, G.; Siampis, E.; Velenis, E.; Longo, S. Minimisation of Motion Sickness in Autonomous Vehicles. In Proceedings of the 2020 IEEE Intelligent Vehicles Symposium (IV), Las Vegas, NV, USA, 19 October–13 November 2020; pp. 1135–1140. [Google Scholar] [CrossRef]
  12. Du, N.; Yang, X.J.; Zhou, F. Psychophysiological Responses to Takeover Requests in Conditionally Automated Driving. Accid. Anal. Prev. 2020, 148, 105804. [Google Scholar] [CrossRef]
  13. Merat, N.; Seppelt, B.; Louw, T.; Engström, J.; Lee, J.D.; Johansson, E.; Green, C.A.; Katazaki, S.; Monk, C.; Itoh, M.; et al. The “Out-of-the-Loop” concept in automated driving: Proposed definition, measures and implications. Cogn. Technol. Work 2019, 21, 87–98. [Google Scholar] [CrossRef]
  14. White, H.; Large, D.R.; Salanitri, D.; Burnett, G.; Lawson, A.; Box, E. Rebuilding Drivers’ Situation Awareness during Take-over Requests in Level 3 Automated Cars. In Proceedings of the Ergonomics & Human Factors 2019, Stratford-upon-Avon, UK, 29 April–1 May 2019. [Google Scholar]
  15. Röckel, C.; Hecht, H. Regular looks out the window do not maintain situation awareness in highly automated driving. Transp. Res. Part F Traffic Psychol. Behav. 2023, 98, 368–381. [Google Scholar] [CrossRef]
  16. Vogelpohl, T.; Kühn, M.; Hummel, T.; Gehlert, T.; Vollrath, M. Transitioning to manual driving requires additional time after automation deactivation. Transp. Res. Part F Traffic Psychol. Behav. 2018, 55, 464–482. [Google Scholar] [CrossRef]
  17. Funkhouser, K.; Drews, F. Reaction Times When Switching From Autonomous to Manual Driving Control: A Pilot Investigation. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2016, 60, 1854–1858. [Google Scholar] [CrossRef]
  18. Rauffet, P.; Botzer, A.; Chauvin, C.; Saïd, F.; Tordet, C. The Relationship between Level of Engagement in a Non-Driving Task and Driver Response Time When Taking Control of an Automated Vehicle. Cogn. Technol. Work 2020, 22, 721–731. [Google Scholar] [CrossRef]
  19. Melcher, V.; Rauh, S.; Diederichs, F.; Widlroither, H.; Bauer, W. Take-Over Requests for Automated Driving. Procedia Manuf. 2015, 3, 2867–2873. [Google Scholar] [CrossRef]
  20. Zhu, J.; Zhang, Y.; Ma, Y.; Lv, C.; Zhang, Y. Designing Human-machine Collaboration Interface Through Multimodal Combination Optimization to Improve Takeover Performance in Highly Automated Driving. In Proceedings of the 2023 IEEE 26th International Conference on Intelligent Transportation Systems (ITSC), Bilbao, Spain, 24–28 September 2023; pp. 4895–4900. [Google Scholar] [CrossRef]
  21. Morales-Alvarez, W.; Sipele, O.; Léberon, R.; Tadjine, H.H.; Olaverri-Monreal, C. Automated Driving: A Literature Review of the Take over Request in Conditional Automation. Electronics 2020, 9, 2087. [Google Scholar] [CrossRef]
  22. Lindorfer, M.; Mecklenbraeuker, C.F.; Ostermayer, G. Modeling the Imperfect Driver: Incorporating Human Factors in a Microscopic Traffic Model. IEEE Trans. Intell. Transp. Syst. 2017, 19, 2856–2870. [Google Scholar] [CrossRef]
  23. Amichai-Hamburger, Y.; Sela, Y.; Kaufman, S.; Wellingstein, T.; Stein, N.; Sivan, J. Personality and the autonomous vehicle: Overcoming psychological barriers to the driverless car. Technol. Soc. 2022, 69, 101971. [Google Scholar] [CrossRef]
  24. Deng, M.; Gluck, A.; Zhao, Y.; Li, D.; Menassa, C.C.; Kamat, V.R.; Brinkley, J. An Analysis of Physiological Responses as Indicators of Driver Takeover Readiness in Conditionally Automated Driving. Accid. Anal. Prev. 2024, 195, 107372. [Google Scholar] [CrossRef]
  25. Pakdamanian, E.; Sheng, S.; Baee, S.; Heo, S.; Kraus, S.; Feng, L. DeepTake: Prediction of Driver Takeover Behavior Using Multimodal Data. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (CHI ’21), Yokohama, Japan, 8–13 May 2021; pp. 1–14. [Google Scholar] [CrossRef]
  26. Brell, T.; Biermann, H.; Philipsen, R.; Ziefle, M. Conditional Privacy: Users’ Perception of Data Privacy in Autonomous Driving. In Proceedings of the VEHITS, Heraklion, Greece, 3–5 May 2019; pp. 352–359. [Google Scholar]
  27. Behrenbruch, K.; Söllner, M.; Leimeister, J.M.; Schmidt, L. Understanding Diversity—The Impact of Personality on Technology Acceptance. In Proceedings of the Human-Computer Interaction—INTERACT 2013, Cape Town, South Africa, 2–6 September 2013; Kotzé, P., Marsden, G., Lindgaard, G., Wesson, J., Winckler, M., Eds.; Lecture Notes in Computer Science. Springer: Berlin/Heidelberg, Germany, 2013; pp. 306–313. [Google Scholar] [CrossRef]
  28. Ajenaghughrure, I.B.; Sousa, S.C.; Kosunen, I.J.; Lamas, D. Predictive Model to Assess User Trust: A Psycho-Physiological Approach. In Proceedings of the 10th Indian Conference on Human-Computer Interaction (IndiaHCI ’19), Hyderabad, India, 1–3 November 2019; pp. 1–10. [Google Scholar] [CrossRef]
  29. Hartwich, F.; Hollander, C.; Johannmeyer, D.; Krems, J.F. Improving Passenger Experience and Trust in Automated Vehicles through User-Adaptive HMIs: “The More the Better” Does Not Apply to Everyone. Front. Hum. Dyn. 2021, 3, 669030. [Google Scholar] [CrossRef]
  30. Ganesh, M.I. The Ironies of Autonomy. Humanit. Soc. Sci. Commun. 2020, 7, 157. [Google Scholar] [CrossRef]
  31. Oviatt, S. Human-centered design meets cognitive load theory: Designing interfaces that help people think. In Proceedings of the 14th ACM International Conference on Multimedia (MM ’06), Santa Barbara, CA, USA, 23–27 October 2006; pp. 871–880. [Google Scholar] [CrossRef]
  32. Chang, C.C.; Sodnik, J.; Boyle, L.N. Don’t Speak and Drive: Cognitive Workload of In-Vehicle Speech Interactions. In Proceedings of the Adjunct Proceedings of the 8th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI ’16 Adjunct), Ann Arbor, MI, USA, 24–26 October 2016; pp. 99–104. [Google Scholar] [CrossRef]
  33. Fuchs, R.; Nakade, T.; Schiffmann, J. Teaming with the Driving Automation. ATZelectronics Worldw. 2023, 18, 8–13. [Google Scholar] [CrossRef]
  34. Harada, R.; Yoshitake, H.; Shino, M. Can Visual Information Reduce Anxiety During Autonomous Driving? Analysis and Reduction of Anxiety Based on Eye Movements in Passengers of Autonomous Personal Mobility Vehicles. In Proceedings of the VISIGRAPP (2: HUCAPP), Lisbon, Portugal, 19–21 February 2023; pp. 318–325. [Google Scholar]
  35. Qu, W.; Sun, H.; Ge, Y. The Effects of Trait Anxiety and the Big Five Personality Traits on Self-Driving Car Acceptance. Transportation 2021, 48, 2663–2679. [Google Scholar] [CrossRef]
  36. Sela, Y.; Amichai-Hamburger, Y. “Baby, I Can’t Drive My Car”: How Controllability Mediates the Relationship between Personality and the Acceptance of Autonomous Vehicles? Int. J. Hum.–Comput. Interact. 2023, 1–11. [Google Scholar] [CrossRef]
  37. Tan, H.; Zhao, X.; Yang, J. Exploring the Influence of Anxiety, Pleasure and Subjective Knowledge on Public Acceptance of Fully Autonomous Vehicles. Comput. Hum. Behav. 2022, 131, 107187. [Google Scholar] [CrossRef]
  38. Pascale, M.T.; Rodwell, D.; Coughlan, P.; Kaye, S.A.; Demmel, S.; Dehkordi, S.G.; Bond, A.; Lewis, I.; Rakotonirainy, A.; Glaser, S. Passengers’ Acceptance and Perceptions of Risk While Riding in an Automated Vehicle on Open, Public Roads. Transp. Res. Part F Traffic Psychol. Behav. 2021, 83, 274–290. [Google Scholar] [CrossRef]
  39. Ju, W.; Leifer, L. The Design of Implicit Interactions: Making Interactive Systems Less Obnoxious. Des. Issues 2008, 24, 72–84. [Google Scholar] [CrossRef]
  40. Cincuegrani, S.M.; Jordà, S.; Väljamäe, A. Physiopucks: Increasing User Motivation by Combining Tangible and Implicit Physiological Interaction. ACM Trans. Comput.-Hum. Interact. 2016, 23, 1–22. [Google Scholar] [CrossRef]
  41. Li, X.; Hess, T.J.; Valacich, J.S. Why Do We Trust New Technology? A Study of Initial Trust Formation with Organizational Information Systems. J. Strateg. Inf. Syst. 2008, 17, 39–71. [Google Scholar] [CrossRef]
  42. Amichai-Hamburger, Y. Internet and personality. Comput. Hum. Behav. 2002, 18, 1–10. [Google Scholar] [CrossRef]
  43. Amichai-Hamburger, Y.; Ben-Artzi, E. Loneliness and Internet Use. Comput. Hum. Behav. 2003, 19, 71–80. [Google Scholar] [CrossRef]
  44. Bellem, H.; Thiel, B.; Schrauf, M.; Krems, J.F. Comfort in Automated Driving: An Analysis of Preferences for Different Automated Driving Styles and Their Dependence on Personality Traits. Transp. Res. Part F Traffic Psychol. Behav. 2018, 55, 90–100. [Google Scholar] [CrossRef]
  45. Xu, J.; Montague, E. Psychophysiology of the Passive User: Exploring the Effect of Technological Conditions and Personality Traits. Int. J. Ind. Ergon. 2012, 42, 505–512. [Google Scholar] [CrossRef]
  46. Beggiato, M.; Hartwich, F.; Krems, J. Physiological Correlates of Discomfort in Automated Driving. Transp. Res. Part F Traffic Psychol. Behav. 2019, 66, 445–458. [Google Scholar] [CrossRef]
  47. Childs, E.; White, T.L.; de Wit, H. Personality Traits Modulate Emotional and Physiological Responses to Stress. Behav. Pharmacol. 2014, 25, 493–502. [Google Scholar] [CrossRef]
  48. Evin, M.; Hidalgo-Munoz, A.; Béquet, A.J.; Moreau, F.; Tattegrain, H.; Berthelon, C.; Fort, A.; Jallais, C. Personality Trait Prediction by Machine Learning Using Physiological Data and Driving Behavior. Mach. Learn. Appl. 2022, 9, 100353. [Google Scholar] [CrossRef]
  49. Shui, X.; Chen, Y.; Hu, X.; Wang, F.; Zhang, D. Personality in Daily Life: Multi-Situational Physiological Signals Reflect Big-Five Personality Traits. IEEE J. Biomed. Health Inform. 2023, 27, 2853–2863. [Google Scholar] [CrossRef]
  50. Stemmler, G.; Wacker, J. Personality, emotion, and individual differences in physiological responses. Biol. Psychol. 2010, 84, 541–551. [Google Scholar] [CrossRef]
  51. Darzi, A.; Gaweesh, S.M.; Ahmed, M.M.; Novak, D. Identifying the Causes of Drivers’ Hazardous States Using Driver Characteristics, Vehicle Kinematics, and Physiological Measurements. Front. Neurosci. 2018, 12, 392979. [Google Scholar] [CrossRef]
  52. Fan, L.; Liu, X.; Wang, B.; Wang, L. Interactivity, Engagement, and Technology Dependence: Understanding Users’ Technology Utilisation Behaviour. Behav. Inf. Technol. 2017, 36, 113–124. [Google Scholar] [CrossRef]
  53. Victor, T.W.; Tivesten, E.; Gustavsson, P.; Johansson, J.; Sangberg, F.; Ljung Aust, M. Automation Expectation Mismatch: Incorrect Prediction Despite Eyes on Threat and Hands on Wheel. Hum. Factors 2018, 60, 1095–1116. [Google Scholar] [CrossRef]
  54. Ozer, D.J.; Benet-Martínez, V. Personality and the Prediction of Consequential Outcomes. Annu. Rev. Psychol. 2006, 57, 401–421. [Google Scholar] [CrossRef] [PubMed]
  55. Martínez, M.V.; Del Campo, I.; Echanobe, J.; Basterretxea, K. Driving Behavior Signals and Machine Learning: A Personalized Driver Assistance System. In Proceedings of the 2015 IEEE 18th International Conference on Intelligent Transportation Systems, Gran Canaria, Spain, 15–18 September 2015; pp. 2933–2940. [Google Scholar]
Figure 1. The interface diagram of the PPS-AV illustrates the initial login process for static components like environmental features and trait personality measures. As the user experience unfolds, dynamic components, including psychophysiological and emotional responses, as well as driving events, are logged. These dynamic components encompass heart rate, skin conductance, facial expressions, emotional state, and specific driving events like sudden stops, accelerations, or changes in traffic conditions. The system continuously monitors, records, and synchronizes these data, processes them, and extracts key features. These features are then classified and used to inform the autonomous vehicle (AV), enabling it to respond dynamically to the user’s current state and the driving context. Responses include alerting the user in the case of a loss of situational awareness, adapting the vehicle’s driving style to align with the user’s psychophysiological and emotional conditions, and reacting appropriately to the driving events occurring in real time.

Share and Cite

MDPI and ACS Style

Giron, J.; Sela, Y.; Barenboim, L.; Gilboa-Freedman, G.; Amichai-Hamburger, Y. Persona-PhysioSync AV: Personalized Interaction through Personality and Physiology Monitoring in Autonomous Vehicles. Sensors 2024, 24, 1977. https://doi.org/10.3390/s24061977
