Coverage of Emotion Recognition for Common Wearable Biosensors
Abstract
1. Introduction
2. Related Work
3. Materials and Methods
3.1. emotionWear Framework
- (i) Wearable Sensing Glove: Biomedical sensors (PPG, photoplethysmogram; EDA, electrodermal activity; SKT, fingertip temperature; EMG, electromyography) are mounted on a sensing glove made from a wrist support with Velcro straps for easy installation and setup. The PPG and EDA sensors are attached to the inside of the glove, touching the thenar region of the subject's hand; the SKT digital temperature sensor is attached to the tip of the middle finger; and the EMG sensor is attached to the triceps area of the same arm. All these sensors are wired to a SPHERE (Sensor Platform for HEalthcare in a Residential Environment) (http://www.irc-sphere.ac.uk) module ⓐ, a wireless wearable platform with a built-in gyroscope and accelerometer for activity recognition and wireless connectivity using Bluetooth Smart [61]. The PPG ⓓ, EDA ⓔ and EMG ⓕ sensors produce analogue outputs and are connected to a 4-channel A/D (analogue-to-digital) converter ⓒ, allowing the SPHERE module to sample the readings at 100 Hz. The SKT sensor ⓑ, because its output varies only slightly (less than 1 degree centigrade), is connected directly to the SPHERE module through the I2C (Inter-Integrated Circuit) bus and sampled at the same rate. When the SPHERE module is connected to and subscribed by a BLE (Bluetooth Low Energy) client, in this case an Android smartphone, each 100-Hz sample is sent as a 20-byte notification packet containing all four sensor readings (an illustrative decoding sketch follows the data-analysis description in (iii)). All wearable sensors used in the emotionWear sensing glove are common off-the-shelf biomedical sensors; details are listed in Appendix A.
- (ii) Android smartphone: An Android smartphone (model: Google Pixel running Android OS version 7.1.2) with tailor-made application software acts as the central controller in the emotionWear framework. The application software comprises four modules. Module ⓖ allows manual selection of multimedia contents previously downloaded to a particular folder as separate files and displays them as two images on the smartphone's screen, so that when wearing a VR headset the user sees a 2-dimensional image presented to each eye. Audio content is delivered through an earphone; the subject can therefore be treated as isolated from the outside world and focussed only on the multimedia contents. During multimedia playback, module ⓗ collects the corresponding signals measured by the wearable biosensors in (i) through the BLE connection and saves them to a file. Ambient sound is kept to a minimum during the study, and module ⓚ uses the smartphone's built-in microphone to record any sound generated by the subject during the test. Once a study session is over, module ⓜ pushes the related data files (i.e., the biosignals and the context) to the Internet through a WiFi or cellular connection, and they are stored in the cloud ⓝ.
- (iii) Data analysis: Data analysis is done using interactive Python (IPython) in the Jupyter Notebook application (http://www.jupyter.org), which has an active community providing third-party libraries for data science [62]. The IPython code presents a dashboard view of the emotionWear framework for analysing the synchronised stimulation (i.e., the multimedia contents, both visual and audio) and the accompanying physiological responses (i.e., the signals picked up by the four wearable biosensors, together with the captured environmental sound) under emotional perception. The visual stimulus is displayed as still images, which are screenshots of the video at 100-ms intervals ⓞ; the audio part of the video is displayed as an amplitude-envelope plot ⓟ; the environmental sound generated by the subject during the study is plotted on the same scale ⓠ; and a set of interactive control widgets from ipywidgets (https://ipywidgets.readthedocs.io/en/stable/) moves the dashboard view to specific time instances according to the following criteria:
- Slider: slide to anywhere on the time scale in 100-ms steps, and the corresponding stimulus (video screenshot and audio segment) and responses (waveforms of the biosignals) at the selected time instance are displayed.
- Threshold-level detection for the four biosignals activates comparison logic against small, medium and high threshold amplitudes preset by the program. Once the comparison output is positive, the slider is moved to the time instance of the match.
- Threshold-level detection for ANS specificity activates comparison logic against ANS patterns comprising the four biosignals, preset by the program. Once the comparison output is positive, the slider is moved to the time instance of the match.
The actual waveforms of the corresponding biosignals in the time and frequency domains are displayed in ⓣ. These waveforms show the biosignals, sampled at 100 Hz, synchronised with the multimedia stimuli in ⓞ and ⓟ and with the sound capture in ⓠ. A minimal sketch of such a slider-driven dashboard is given below, followed by the notification-decoding sketch referred to in (i).
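The following is a minimal sketch, for illustration only, of how the dashboard slider and threshold-jump behaviour could be wired up with ipywidgets. The variable names (`frames`, `audio`, `signals`), the 10-Hz audio envelope and the ±5-s plotting window are assumptions, not details taken from the emotionWear implementation.

```python
# Minimal dashboard sketch. Assumed, pre-loaded session data (illustrative names):
#   frames  - list of file paths to the 100-ms video screenshots
#   audio   - 1-D numpy array, audio amplitude envelope (assumed one point per 100 ms)
#   signals - pandas DataFrame with columns "PPG", "EDA", "SKT", "EMG" sampled at 100 Hz
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import ipywidgets as widgets
from IPython.display import Image, display

def show_instant(step):
    """Display the stimulus and responses at time instance `step` (100-ms units)."""
    display(Image(filename=frames[step]))                 # video screenshot at this instance
    t0 = step * 10                                        # 100 ms = 10 samples at 100 Hz
    window = signals.iloc[max(0, t0 - 500):t0 + 500]      # +/- 5 s of biosignals
    fig, axes = plt.subplots(len(window.columns) + 1, 1, figsize=(8, 8))
    axes[0].plot(audio[max(0, step - 50):step + 50])      # audio envelope around the instance
    axes[0].set_title("audio envelope")
    for ax, col in zip(axes[1:], window.columns):
        ax.plot(window[col].to_numpy())
        ax.set_title(col)
    plt.tight_layout()
    plt.show()

slider = widgets.IntSlider(min=0, max=len(frames) - 1, step=1, description="100-ms step")
widgets.interact(show_instant, step=slider)

def jump_to_threshold(series, level):
    """Move the slider to the first 100-ms step where `series` exceeds `level`."""
    hits = np.flatnonzero(series.to_numpy() > level)
    if hits.size:
        slider.value = int(hits[0] // 10)                 # convert samples to 100-ms steps
```

In a notebook, `jump_to_threshold(signals["EDA"], level)` would emulate the preset small/medium/high amplitude comparisons by jumping the slider to the first positive comparison.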
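For the sensing glove in (i) and the data-collection module ⓗ in (ii), the sketch below shows how a 20-byte BLE notification might be decoded and appended to a session log. The packet layout (a sequence counter followed by four 4-byte readings) and the file name are purely assumptions for illustration; the paper does not specify the SPHERE notification format.

```python
# Illustrative decoding of one 20-byte notification (assumed layout, not the SPHERE spec):
# a 4-byte little-endian sequence counter, then four 4-byte unsigned integers holding
# the PPG, EDA, SKT and EMG readings for that sample.
import csv
import struct

FIELDS = ("seq", "ppg", "eda", "skt", "emg")

def decode_notification(payload: bytes) -> dict:
    """Unpack a hypothetical 20-byte sensor notification into named readings."""
    if len(payload) != 20:
        raise ValueError("expected a 20-byte notification")
    return dict(zip(FIELDS, struct.unpack("<5I", payload)))

def append_to_log(sample: dict, path: str = "session_biosignals.csv") -> None:
    """Append one decoded sample to the session log file (cf. module (h))."""
    with open(path, "a", newline="") as f:
        csv.DictWriter(f, fieldnames=FIELDS).writerow(sample)

# Example with a fabricated payload:
# append_to_log(decode_notification(struct.pack("<5I", 1, 512, 300, 3450, 120)))
```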
3.2. Biosensors and Features Selection
- (a) PPG, EDA and SKT: PPG captures the blood volume pulse, from which features are derived in both the time and frequency domains. EDA captures the skin conductance, and both its mean and standard deviation are extracted as features. Similarly, SKT captures the fingertip temperature, with features derived from its mean and standard deviation. Table 1 lists all features with the associated calculations (a sketch of the peak-detection step behind the time-domain features follows item (b)). All features are calculated over rolling windows defined by the "window" parameter, which sets the resolution of the temporal analysis. A short window such as 10 s is applied for most of the time-domain features, but it is not reliable for frequency-domain features, especially the low-frequency (LF) band (0.04 Hz–0.15 Hz) [63]. While there are obvious variations in the high-frequency (HF) features, the LF features are not stable, since a frequency transform over such a short window contains insufficient data for low-frequency components; e.g., a period of 25 s is required for one cycle of a 0.04-Hz signal in the LF band. In normal film watching, as well as in real-time situations, emotional scenes are usually very short compared with the 5-min rule used in clinical cardiac evaluation [64,65].
- (b) EMG: EMG indicates the degree of muscle contraction. Ekman demonstrated that facial muscle contraction is concomitant with ANS specificity, such that a motor program is initiated and generates a specific facial expression [66]. Beyond facial muscles, skeletal muscles of other body parts have also been studied, leading to a system of emotional muscle responses termed the Body Action Coding System (BACS) [67,68]; however, there is little empirical research supporting this system for emotion recognition. Facial-expression detection is not easily implemented in a wearable sensor, so in this study the EMG sensor was placed on the forearm, biceps and triceps, but no significant relationship between the signals at those sites and the elicited emotions could be found. The EMG data were therefore discarded in the current study.
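As a complement to item (a) and Table 1, the following is a minimal sketch of the peak-detection step that produces the interbeat-interval (IBI) series and of the rolling time-domain features derived from it. The `find_peaks` parameters, the cubic interpolation back onto the 100-Hz grid and the RMSSD approximation on that grid are assumptions for illustration, not the exact processing used in the study.

```python
# Sketch: derive an interbeat-interval (IBI) series and rolling time-domain features
# from a raw PPG trace sampled at 100 Hz. Peak-detection parameters are illustrative.
import numpy as np
import pandas as pd
from scipy.signal import find_peaks

FS = 100                                   # sampling frequency in Hz

def ppg_features(ppg: np.ndarray, window: int = 10 * FS) -> pd.DataFrame:
    peaks, _ = find_peaks(ppg, distance=0.4 * FS, prominence=np.std(ppg))
    ibi = np.full(len(ppg), np.nan)
    ibi[peaks[1:]] = np.diff(peaks) / FS * 1000.0          # interbeat intervals in ms
    ibi = pd.Series(ibi).interpolate(method="cubic")       # fill onto the 100-Hz grid
    roll = ibi.rolling(window, min_periods=1, center=True)
    return pd.DataFrame({
        "IBI": ibi,
        "HR": 60000.0 / ibi,                               # beats per minute
        "SDNN": roll.std(),
        # RMSSD approximated from successive differences on the resampled grid
        "RMSSD": np.sqrt(ibi.diff().pow(2)
                         .rolling(window, min_periods=1, center=True).mean()),
    })
```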
3.3. Stimulation of Emotions
3.4. Procedures
4. Results
- (a) Still pictures: This visual stimulus contained 20 still pictures without auditory content, so a relatively simple attention-to-perception process was established. The pictures were separated into two groups with similar valence ratings within each group but a high valence contrast between the groups. The RSS results showed the responses of all participants while viewing individual high- and low-valence pictures, as well as during the switching of pictures within and between groups.
- (b) Short film clips: Film clips with dedicated emotional rankings and short durations (see Table 2) stimulated specific perceived emotions in participants, and the purposes of using RSS were (i) validation of emotion elicitation, (ii) comparison between perceived and felt emotions and (iii) pattern matching of biosignals against ANS specificity for successful and unsuccessful emotion elicitations.
- (c) Longer version film clip: A longer film clip of around 30 min with multiple 'joy' stimulation moments caught the viewers' attention and elicited the corresponding emotions, and the RSS demonstrated the whole stimulation, attention, perception and autonomic response process. In addition, synchronisation was extended to the environmental context by demonstrating how the recorded background sound was synchronised with the physiological responses during the emotional perceptions.
4.1. Still Pictures
4.2. Short Film Clips
4.3. Longer Version Film Clip
5. Discussions
6. Conclusions
Acknowledgments
Author Contributions
Conflicts of Interest
Appendix A. Off-The-Shelf Biomedical Sensors
- PPG (Photoplethysmography): Pulse Sensor, an open-source hardware project by Joel Murphy and Yury Gitman; the specification can be obtained from https://pulsesensor.com/pages/open-hardware.
- EDA (Electrodermal Activity): Grove GSR Sensor from Seeed; the specification can be obtained from http://wiki.seeed.cc/Grove-GSR_Sensor/.
- SKT (Skin Temperature): Si7051 digital temperature sensor with I2C interface from Silicon Labs; the specification can be obtained from https://www.silabs.com/products/sensors/si705x-temperature-sensors.
- EMG (Surface Electromyography): MyoWare Muscle Sensor from Advancer Technologies; the specification can be obtained from http://www.advancertechnologies.com/p/myoware.html.
References
- James, W. What is an Emotion? Mind 1884, 9, 188–205. [Google Scholar] [CrossRef]
- Deigh, J. William James and the Rise of the Scientific Study of Emotion. Emot. Rev. 2014, 6, 4–12. [Google Scholar] [CrossRef]
- Keltner, D.; Gross, J.J. Functional Accounts of Emotions. Cogn. Emot. 1999, 13, 467–480. [Google Scholar] [CrossRef]
- Levenson, R.W. The Intrapersonal Functions of Emotion. Cogn. Emot. 1999, 13, 481–504. [Google Scholar] [CrossRef]
- Frijda, N.H. The evolutionary emergence of what we call “emotions”. Cogn. Emot. 2016, 30, 609–620. [Google Scholar] [CrossRef] [PubMed]
- Cacioppo, J.T.; Tassinary, L.G.; Berntson, G. Handbook of Psychophysiology; Cambridge University Press: New York, NY, USA, 2007. [Google Scholar]
- Wells, A.; Matthews, G. Attention and Emotion (Classic Edition): A Clinical Perspective; Psychology Press: New York, NY, USA, 2014. [Google Scholar]
- Pessoa, L. Précis on The Cognitive-Emotional Brain. Behav. Brain Sci. 2015, 38, e71. [Google Scholar] [CrossRef] [PubMed]
- Maren, S.; Phan, K.L.; Liberzon, I. The contextual brain: Implications for fear conditioning, extinction and psychopathology. Nat. Rev. Neurosci. 2013, 14, 417–428. [Google Scholar] [CrossRef] [PubMed]
- Bergado, J.A.; Lucas, M.; Richter-Levin, G. Emotional tagging—A simple hypothesis in a complex reality. Prog. Neurobiol. 2011, 94, 64–76. [Google Scholar] [CrossRef] [PubMed]
- Critchley, H.; Harrison, N. Visceral Influences on Brain and Behavior. Neuron 2013, 77, 624–638. [Google Scholar] [CrossRef] [PubMed]
- Shiota, M.N.; Neufeld, S.L. My heart will go on: Aging and autonomic nervous system responding in emotion. In The Oxford Handbook of Emotion, Social Cognition, and Problem Solving in Adulthood; Verhaeghen, P., Hertzog, C., Eds.; Oxford University Press: New York, NY, USA, 2014; pp. 225–235. [Google Scholar]
- Henry, J.D.; Castellini, J.; Moses, E.; Scott, J.G. Emotion regulation in adolescents with mental health problems. J. Clin. Exp. Neuropsychol. 2016, 38, 197–207. [Google Scholar] [CrossRef] [PubMed]
- Barrett, L.F. The Conceptual Act Theory: A Précis. Emot. Rev. 2014, 6, 292–297. [Google Scholar] [CrossRef]
- Quigley, K.S.; Barrett, L.F. Is there consistency and specificity of autonomic changes during emotional episodes? Guidance from the Conceptual Act Theory and psychophysiology. Biol. Psychol. 2014, 98, 82–94. [Google Scholar] [CrossRef] [PubMed]
- Marwitz, M.; Stemmler, G. On the status of individual response specificity. Psychophysiology 1998, 35, 1–15. [Google Scholar] [CrossRef] [PubMed]
- Stemmler, G.; Wacker, J. Personality, emotion, and individual differences in physiological responses. Biol. Psychol. 2010, 84, 541–551. [Google Scholar] [CrossRef] [PubMed]
- Duclot, F.; Perez-Taboada, I.; Wright, K.N.; Kabbaj, M. Prediction of individual differences in fear response by novelty seeking, and disruption of contextual fear memory reconsolidation by ketamine. Neuropharmacology 2016, 109, 293–305. [Google Scholar] [CrossRef] [PubMed]
- Schubert, E. Emotion felt by the listener and expressed by the music: Literature review and theoretical perspectives. Front. Psychol. 2013, 4, 837. [Google Scholar] [CrossRef] [PubMed]
- Tian, L.; Muszynski, M.; Lai, C.; Moore, J.D.; Kostoulas, T.; Lombardo, P.; Pun, T.; Chanel, G. Recognizing induced emotions of movie audiences: Are induced and perceived emotions the same? In Proceedings of the Seventh International Conference on Affective Computing and Intelligent Interaction (ACII), San Antonio, TX, USA, 23–26 October 2017; pp. 28–35. [Google Scholar]
- Picard, R.W.; Healey, J. Affective Computing; MIT Press: Cambridge, MA, USA, 1997; Volume 252. [Google Scholar]
- Picard, R.W. Emotion Research by the People, for the People. Emot. Rev. 2010, 2, 250–254. [Google Scholar] [CrossRef]
- Hui, T.K.L.; Sherratt, R.S.; Sánchez, D.D. Major requirements for building Smart Homes in Smart Cities based on Internet of Things technologies. Future Gener. Comput. Syst. 2017, 76, 358–369. [Google Scholar] [CrossRef]
- Hui, T.K.L.; Sherratt, R.S. Towards disappearing user interfaces for ubiquitous computing: Human enhancement from sixth sense to super senses. J. Ambient Intell. Humaniz. Comput. 2017, 8, 449–465. [Google Scholar] [CrossRef]
- Winkielman, P.; Berridge, K.C. Unconscious Emotion. Curr. Dir. Psychol. Sci. 2004, 13, 120–123. [Google Scholar] [CrossRef]
- Smith, R.; Lane, R.D. Unconscious emotion: A cognitive neuroscientific perspective. Neurosci. Biobehav. Rev. 2016, 69, 216–238. [Google Scholar] [CrossRef] [PubMed]
- Ax, A.F. The physiological differentiation between fear and anger in humans. Psychosom. Med. 1953, 15, 433–442. [Google Scholar] [CrossRef] [PubMed]
- Levenson, R.W. Autonomic Nervous System Differences among Emotions. Psychol. Sci. 1992, 3, 23–27. [Google Scholar] [CrossRef]
- Kreibig, S.D. Autonomic nervous system activity in emotion: A review. Biol. Psychol. 2010, 84, 394–421. [Google Scholar] [CrossRef] [PubMed]
- Cutmore, T.; James, D.A. Sensors and Sensor Systems for Psychophysiological Monitoring: A Review of Current Trends. J. Psychophysiol. 2007, 21, 51–71. [Google Scholar] [CrossRef]
- Norman, G.J.; Berntson, G.G.; Cacioppo, J.T. Emotion, Somatovisceral Afference, and Autonomic Regulation. Emot. Rev. 2014, 6, 113–123. [Google Scholar] [CrossRef]
- Picard, R.W.; Healey, J. Affective wearables. In Proceedings of the Digest of Papers. First International Symposium on Wearable Computers, Cambridge, MA, USA, 13–14 October 1997; pp. 90–97. [Google Scholar]
- Cima, M.J. Next-generation wearable electronics. Nat. Biotechnol. 2014, 32, 642–643. [Google Scholar] [CrossRef] [PubMed]
- Yoon, S.; Sim, J.K.; Cho, Y.H. A Flexible and Wearable Human Stress Monitoring Patch. Sci. Rep. 2016, 6, 23468. [Google Scholar] [CrossRef] [PubMed]
- Wac, K.; Tsiourti, C. Ambulatory Assessment of Affect: Survey of Sensor Systems for Monitoring of Autonomic Nervous Systems Activation in Emotion. IEEE Trans. Affect. Comput. 2014, 5, 251–272. [Google Scholar] [CrossRef]
- Chen, M.; Ma, Y.; Li, Y.; Wu, D.; Zhang, Y.; Youn, C.H. Wearable 2.0: Enabling Human-Cloud Integration in Next Generation Healthcare Systems. IEEE Commun. Mag. 2017, 55, 54–61. [Google Scholar] [CrossRef]
- Mauss, I.B.; Robinson, M.D. Measures of emotion: A review. Cogn. Emot. 2009, 23, 209–237. [Google Scholar] [CrossRef] [PubMed]
- Landowska, A. Emotion Monitoring—Verification of Physiological Characteristics Measurement Procedures. Metrol. Meas. Syst. 2014, 21, 719–732. [Google Scholar] [CrossRef]
- Kragel, P.A.; LaBar, K.S. Advancing Emotion Theory with Multivariate Pattern Classification. Emot. Rev. 2014, 6, 160–174. [Google Scholar] [CrossRef] [PubMed]
- Verma, G.K.; Tiwary, U.S. Multimodal fusion framework: A multiresolution approach for emotion classification and recognition from physiological signals. NeuroImage 2014, 102, 162–172. [Google Scholar] [CrossRef] [PubMed]
- Khezri, M.; Firoozabadi, M.; Sharafat, A.R. Reliable emotion recognition system based on dynamic adaptive fusion of forehead biopotentials and physiological signals. Comput. Methods Prog. Biomed. 2015, 122, 149–164. [Google Scholar] [CrossRef] [PubMed]
- Kim, J.; André, E. Emotion recognition based on physiological changes in music listening. IEEE Trans. Pattern Anal. Mach. Intell. 2008, 30, 2067–2083. [Google Scholar] [PubMed]
- Godin, C.; Prost-Boucle, F.; Campagne, A.; Charbonnier, S.; Bonnet, S.; Vidal, A. Selection of the most relevant physiological features for classifying emotion. Emotion 2015, 40, 20. [Google Scholar]
- He, C.; Yao, Y.J.; Ye, X.S. An Emotion Recognition System Based on Physiological Signals Obtained by Wearable Sensors. In Wearable Sensors and Robots; Springer: Singapore, 2017; pp. 15–25. [Google Scholar]
- Howley, T.; Madden, M.G.; O’Connell, M.L.; Ryder, A.G. The effect of principal component analysis on machine learning accuracy with high-dimensional spectral data. Knowl. Based Syst. 2006, 19, 363–370. [Google Scholar] [CrossRef]
- Shlens, J. A tutorial on principal component analysis. arXiv, 2014; arXiv:1404.1100. [Google Scholar]
- Wechsler, D.; Jones, H.E. A Study of Emotional Specificity. Am. J. Psychol. 1928, 40, 600–606. [Google Scholar] [CrossRef]
- Averill, J.R. Autonomic response patterns during sadness and mirth. Psychophysiology 1969, 5, 399–414. [Google Scholar] [CrossRef]
- Stemmler, G. The Autonomic Differentiation of Emotions Revisited: Convergent and Discriminant Validation. Psychophysiology 1989, 26, 617–632. [Google Scholar] [CrossRef] [PubMed]
- Levenson, R.W.; Ekman, P.; Friesen, W.V. Voluntary Facial Action Generates Emotion-Specific Autonomic Nervous System Activity. Psychophysiology 1990, 27, 363–384. [Google Scholar] [CrossRef] [PubMed]
- Kret, M.E.; Stekelenburg, J.J.; Roelofs, K.; de Gelder, B. Perception of Face and Body Expressions Using Electromyography, Pupillometry and Gaze Measures. Front. Psychol. 2013, 4, 1–12. [Google Scholar] [CrossRef] [PubMed]
- Gothard, K.M. The amygdalo-motor pathways and the control of facial expressions. Front. Neurosci. 2014, 8, 1–7. [Google Scholar] [CrossRef] [PubMed]
- Shafir, T.; Tsachor, R.P.; Welch, K.B. Emotion Regulation through Movement: Unique Sets of Movement Characteristics are Associated with and Enhance Basic Emotions. Front. Psychol. 2016, 6, 1–15. [Google Scholar] [CrossRef] [PubMed]
- Graham, F.K.; Clifton, R.K. Heart-rate change as a component of the orienting response. Psychol. Bull. 1966, 65, 305–320. [Google Scholar] [CrossRef] [PubMed]
- Bradley, M.M. Natural selective attention: Orienting and emotion. Psychophysiology 2009, 46, 1–11. [Google Scholar] [CrossRef] [PubMed]
- Barry, R.J.; Steiner, G.Z.; De Blasio, F.M. Reinstating the Novelty P3. Sci. Rep. 2016, 6, 31200. [Google Scholar] [CrossRef] [PubMed]
- MacDonald, B.; Barry, R.J. Significance and Novelty effects in single-trial ERP components and autonomic responses. Int. J. Psychophysiol. 2017, 117, 48–64. [Google Scholar] [CrossRef] [PubMed]
- Codispoti, M.; Surcinelli, P.; Baldaro, B. Watching emotional movies: Affective reactions and gender differences. Int. J. Psychophysiol. 2008, 69, 90–95. [Google Scholar] [CrossRef] [PubMed]
- Rooney, B.; Benson, C.; Hennessy, E. The apparent reality of movies and emotional arousal: A study using physiological and self-report measures. Poetics 2012, 40, 405–422. [Google Scholar] [CrossRef]
- Bradley, M.M.; Keil, A.; Lang, P.J. Orienting and Emotional Perception: Facilitation, Attenuation, and Interference. Front. Psychol. 2012, 3, 493. [Google Scholar] [CrossRef] [PubMed]
- Fafoutis, X.; Vafeas, A.; Janko, B.; Sherratt, R.S.; Pope, J.; Elsts, A.; Mellios, E.; Hilton, G.; Oikonomou, G.; Piechocki, R.; et al. Designing Wearable Sensing Platforms for Healthcare in a Residential Environment. EAI Endorsed Trans. Pervasive Health Technol. 2017, 17, 1–12. [Google Scholar]
- Shen, H. Interactive notebooks: sharing the code: The free IPython notebook makes data analysis easier to record, understand and reproduce. Nature 2014, 515, 151–152. [Google Scholar] [CrossRef] [PubMed]
- Heathers, J.A.J. Everything Hertz: Methodological issues in short-term frequency-domain HRV. Front. Physiol. 2014, 5, 1–15. [Google Scholar] [CrossRef] [PubMed]
- Camm, A.J.; Malik, M.; Bigger, J.; Breithardt, G.; Cerutti, S.; Cohen, R.J.; Coumel, P.; Fallen, E.L.; Kennedy, H.L.; Kleiger, R.E. Heart rate variability: Standards of measurement, physiological interpretation and clinical use. Task Force of the European Society of Cardiology and the North American Society of Pacing and Electrophysiology. Circulation 1996, 93, 1043–1065. [Google Scholar]
- Sassi, R.; Cerutti, S.; Lombardi, F.; Malik, M.; Huikuri, H.V.; Peng, C.K.; Schmidt, G.; Yamamoto, Y.; Gorenek, B.; Lip, G.Y.H.; et al. Advances in heart rate variability signal analysis: Joint position statement by the e-Cardiology ESC Working Group and the European Heart Rhythm Association co-endorsed by the Asia Pacific Heart Rhythm Society. EP Eur. 2015, 17, 1341–1353. [Google Scholar] [CrossRef] [PubMed]
- Ekman, P. Facial expression and emotion. Am. Psychol. 1993, 48, 384–392. [Google Scholar] [CrossRef] [PubMed]
- Huis in ‘t Veld, E.M.J.; Van Boxtel, G.J.M.; de Gelder, B. The Body Action Coding System I: Muscle activations during the perception and expression of emotion. Soc. Neurosci. 2014, 9, 249–264. [Google Scholar] [CrossRef] [PubMed]
- Huis In ‘t Veld, E.M.J.; van Boxtel, G.J.M.; de Gelder, B. The Body Action Coding System II: Muscle activations during the perception and expression of emotion. Front. Behav. Neurosci. 2014, 8, 1–13. [Google Scholar] [CrossRef]
- Lang, P.J. The emotion probe: Studies of motivation and attention. Am. Psychol. 1995, 50, 372–385. [Google Scholar] [CrossRef] [PubMed]
- Lang, P.; Bradley, M.M. The International Affective Picture System (IAPS) in the study of emotion and attention. In Handbook of Emotion Elicitation and Assessment; Coan, J.A., Allen, J.J.B., Eds.; Oxford University Press: New York, NY, USA, 2007; pp. 29–46. [Google Scholar]
- Lang, P.J.; Bradley, M.M.; Cuthbert, B.N. International Affective Picture System (IAPS): Affective Ratings of Pictures and Instruction Manual; Technical Report A-8; University of Florida: Gainesville, FL, USA, 2008. [Google Scholar]
- Gross, J.J.; Levenson, R.W. Emotion elicitation using films. Cogn. Emot. 1995, 9, 87–108. [Google Scholar] [CrossRef]
- Schaefer, A.; Nils, F.; Sanchez, X.; Philippot, P. Assessing the effectiveness of a large database of emotion-eliciting films: A new tool for emotion researchers. Cogn. Emot. 2010, 24, 1153–1172. [Google Scholar] [CrossRef]
- Uhrig, M.K.; Trautmann, N.; Baumgärtner, U.; Treede, R.D.; Henrich, F.; Hiller, W.; Marschall, S. Emotion Elicitation: A Comparison of Pictures and Films. Front. Psychol. 2016, 7, 1–12. [Google Scholar] [CrossRef] [PubMed]
- Zupan, B.; Babbage, D.R. Film clips and narrative text as subjective emotion elicitation techniques. J. Soc. Psychol. 2017, 157, 194–210. [Google Scholar] [CrossRef] [PubMed]
- Yiend, J. The effects of emotion on attention: A review of attentional processing of emotional information. Cogn. Emot. 2010, 24, 3–47. [Google Scholar] [CrossRef]
- Samanez-Larkin, G.R.; Robertson, E.R.; Mikels, J.A.; Carstensen, L.L.; Gotlib, I.H. Selective attention to emotion in the aging brain. Psychol. Aging 2009, 24, 519–529. [Google Scholar] [CrossRef] [PubMed]
- Yiend, J.; Mathews, A.; Burns, T.; Dutton, K.; Fernández-Martín, A.; Georgiou, G.A.; Luckie, M.; Rose, A.; Russo, R.; Fox, E. Mechanisms of selective attention in generalized anxiety disorder. Clin. Psychol. Sci. 2015, 3, 758–771. [Google Scholar] [CrossRef] [PubMed]
| PPG Features | Calculations in Python (import numpy as np, import pandas as pd, from scipy import signal) |
|---|---|
| IBI | Peak detection of the raw PPG signal gives a pandas Series `ppgnn`; interbeat interval `IBI = ppgnn.interpolate(method="cubic")` |
| HR | Heart rate = (60 s × sampling frequency) / peak-to-peak duration; `HR = IBI.rolling(window, min_periods=1, center=True).mean()` |
| SDNN | Standard deviation of IBI; `SDNN = IBI.rolling(window, min_periods=1, center=True).std()` |
| SDSD | Standard deviation of the differences between adjacent `ppgnn` values; `ppgdiff = pd.DataFrame(np.abs(np.ediff1d(ppgnn)))`; `ppgdiff = ppgdiff.interpolate(method="cubic")`; `SDSD = ppgdiff.rolling(window, min_periods=1, center=True).std()` |
| RMSSD | Root mean square of the differences between adjacent `ppgnn` values; `ppgsqdiff = pd.DataFrame(np.power(np.ediff1d(ppgnn), 2))`; `ppgsqdiff = ppgsqdiff.interpolate(method="cubic")`; `RMSSD = np.sqrt(ppgsqdiff.rolling(window, min_periods=1, center=True).mean())` |
| SDNN/RMSSD | Ratio between SDNN and RMSSD; `SDNN_RMSSD = SDNN / RMSSD` |
| LF | Power spectral density (PSD) for the low-frequency range (0.04 Hz to 0.15 Hz); `Y = np.fft.fft(IBI) / window`; `Y = Y[range(window // 2)]`; `LF = np.trapz(np.abs(Y[(freq >= 0.04) & (freq <= 0.15)]))` |
| HF | PSD for the high-frequency range (0.15 Hz to 0.4 Hz); `HF = np.trapz(np.abs(Y[(freq >= 0.15) & (freq <= 0.4)]))` |
| LF/HF | PSD ratio between LF and HF; `LHF = LF / HF` |

| EDA Features | Calculations in Python (import numpy as np, import pandas as pd, from scipy import signal) |
|---|---|
| EDA (filtered) | `eda` = raw EDA signal sampled at 100-ms intervals; `B, A = signal.butter(2, 0.005, output="ba")`; `EDAf = pd.Series(signal.filtfilt(B, A, eda))` |
| EDA (mean) | Rolling mean of the filtered EDA signal `EDAf`; `EDAmean = EDAf.rolling(window, min_periods=1, center=True).mean()` |
| EDA (std) | Rolling standard deviation of the filtered EDA signal `EDAf`; `EDAstd = EDAf.rolling(window, min_periods=1, center=True).std()` |

| SKT Features | Calculations in Python (import numpy as np, import pandas as pd, from scipy import signal) |
|---|---|
| SKT (filtered) | `skt` = raw SKT signal sampled at 100-ms intervals; `B, A = signal.butter(2, 0.005, output="ba")`; `SKTf = pd.Series(signal.filtfilt(B, A, skt))` |
| SKT (mean) | Rolling mean of the filtered SKT signal `SKTf`; `SKTmean = SKTf.rolling(window, min_periods=1, center=True).mean()` |
| SKT (std) | Rolling standard deviation of the filtered SKT signal `SKTf`; `SKTstd = SKTf.rolling(window, min_periods=1, center=True).std()` |
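The LF and HF rows of Table 1 use a frequency axis `freq` that the table does not define. The sketch below shows one way it could be constructed; the one-sided `rfft`, the mean removal and the assumption of a NaN-free, evenly resampled IBI array are illustrative choices rather than details from the paper.

```python
# Sketch of the LF/HF band-power calculation behind the last rows of Table 1.
import numpy as np

FS = 100  # Hz, assumed resampling rate of the IBI series

def band_powers(ibi: np.ndarray, window: int):
    """Return LF, HF and LF/HF for the latest `window` samples of a NaN-free IBI array."""
    seg = ibi[-window:] - np.mean(ibi[-window:])       # latest window, mean removed
    Y = np.fft.rfft(seg) / window                      # one-sided spectrum
    freq = np.fft.rfftfreq(window, d=1.0 / FS)         # matching frequency axis in Hz
    lf = np.trapz(np.abs(Y[(freq >= 0.04) & (freq <= 0.15)]))
    hf = np.trapz(np.abs(Y[(freq > 0.15) & (freq <= 0.40)]))
    return lf, hf, (lf / hf if hf else np.nan)
```

With a 10-s window (1000 samples at 100 Hz) the frequency resolution is only 0.1 Hz, which illustrates why the LF band (0.04–0.15 Hz) is unreliable over the short windows discussed in Section 3.2.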
| Emotions | Pictures (Numbers Refer to the IAPS Database [71]) | Film Clips (Names and Durations Refer to Schaefer et al. [73]) |
|---|---|---|
| Happiness/Joy | High valence rating: #1710 (Puppies), #1750 (Bunnies), #5833 (Beach), #1460 (Kitten), #2050 (Baby), #1440 (Seal), #2040 (Baby), #2070 (Baby), #8190 (Skier), #2080 (Babies) | (1) Something About Mary [2]; (2) A Fish Called Wanda; (3) When Harry Met Sally |
| Anger | | (1) Schindler's List [2]; (2) Sleepers; (3) Leaving Las Vegas |
| Fear | Low valence rating: #3053 (BurnVictim), #3102 (BurnVictim), #3000 (Mutilation), #3064 (Mutilation), #3170 (BabyTumor), #3080 (Mutilation), #3063 (Mutilation), #9410 (Soldier), #3131 (Mutilation), #3015 (Accident) | (1) The Blair Witch Project; (2) The Shining; (3) Misery |
| Disgust | | (1) Trainspotting [2]; (2) Seven [3]; (3) Hellraiser |
| Sadness | | (1) City of Angels; (2) Dangerous Minds; (3) Philadelphia |
| Film Clips Target Emotions (and Distribution) | Emotion (Target = Subjective) | Emotion Elicitation (Subjective) | Emotion Elicitation (Measure) | Hit-Rate (Measure = Subjective) | Subjective Arousal (Average) | Subjective Valence (Average) |
|---|---|---|---|---|---|---|
| Joy (19%) | 100% | 19% | 12% | 60% | 3.50 | 3.75 |
| Anger (23%) | 0% | 0% | 4% | 0% | 0.00 | 0.00 |
| Fear (15%) | 43% | 27% | 12% | 43% | 5.43 | 7.43 |
| Disgust (21%) | 71% | 26% | 19% | 57% | 5.57 | 8.14 |
| Sadness (23%) | 57% | 18% | 7% | 43% | 4.20 | 7.14 |
| Reasons for Unsuccessful Emotion Elicitation | Percentage |
|---|---|
| Saw the film clip before (many times) | 17% |
| Film clip too short | 42% |
| Could not understand the language | 4% |
| Did not feel the target emotion | 33% |
| Others | 4% |
| Biosignal | Responses Match with ANS Specificity Before Successful Emotion Elicitation | Responses Match with ANS Specificity After Successful Emotion Elicitation |
|---|---|---|
| HR | p < 0.992 | p < 0.126 |
| EDA | p < 0.989 | p < 0.042 |
| SKT | p < 0.362 | p < 0.036 |
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).