Automated Affective Computing Based on Bio-Signals Analysis and Deep Learning Approach
Abstract
1. Introduction
2. Materials and Methods
2.1. Participants
2.2. Emotional Stimuli Dataset
2.3. Procedure
2.4. Physiological Signals Acquisition and Processing
2.4.1. Cardiac Features: Heart Rate Variability and Blood Volume Pulse
2.4.2. Respiration Features
2.4.3. Thermal Infrared Imaging Features
- Co-registration between the visible and thermal optics. The co-registration was performed by applying a geometric transformation to the visible-image coordinates, computed from the two optics' different fields of view, resolutions, and positions.
- Identification of the facial landmarks on the corresponding IR images by applying the geometric transformation obtained from the optical co-registration process (Figure 3b).
- ROI identification with respect to the facial landmarks in the IR images (Figure 3c); a minimal code sketch of the landmark mapping and ROI read-out follows this list.
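To make the landmark-mapping step concrete, below is a minimal, hedged Python sketch of how visible-domain landmark coordinates could be projected onto the thermal frames with a least-squares affine transform and then used to read out a square ROI around each projected landmark. The function names, the square ROI geometry, and the pure-NumPy implementation are illustrative assumptions of ours; the paper specifies the co-registration only at the level of the steps listed above.

```python
import numpy as np

def estimate_affine(vis_pts, ir_pts):
    """Least-squares 2x3 affine transform mapping visible-image points to
    thermal (IR) image points, given (N, 2) arrays of corresponding
    calibration points from the optical co-registration."""
    n = vis_pts.shape[0]
    A = np.hstack([vis_pts, np.ones((n, 1))])       # homogeneous visible coords
    sol, *_ = np.linalg.lstsq(A, ir_pts, rcond=None)
    return sol.T                                     # shape (2, 3)

def map_landmarks(vis_landmarks, M):
    """Apply the affine transform M to (N, 2) visible-domain facial landmarks."""
    n = vis_landmarks.shape[0]
    homog = np.hstack([vis_landmarks, np.ones((n, 1))])
    return homog @ M.T                               # (N, 2) IR-image coordinates

def roi_mean(ir_frame, center, half_size=5):
    """Mean value of a square ROI centred on an IR-image landmark (hypothetical
    ROI shape; the paper defines its own facial ROIs)."""
    x, y = np.round(np.asarray(center)).astype(int)
    patch = ir_frame[max(y - half_size, 0):y + half_size + 1,
                     max(x - half_size, 0):x + half_size + 1]
    return float(patch.mean())
```

In practice, the calibration points for estimate_affine would come from the co-registration of the two cameras' fields of view, and the ROI definition would follow the facial regions used in the study.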
2.5. Classification Algorithms
2.5.1. Machine Learning—Random Forest
2.5.2. Deep Learning—Feed Forward Neural Network
3. Results
3.1. Random Forest Classification Results
3.2. Feed Forward Neural Network Classification Results
Feature Analysis Results
4. Discussion
Study’s Limitations
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
Variables | Formula
---|---
Mean RR | $\overline{RR} = \frac{1}{N}\sum_{i=1}^{N} RR_i$
RMSSD | $\mathrm{RMSSD} = \sqrt{\frac{1}{N-1}\sum_{i=1}^{N-1}\left(RR_{i+1}-RR_i\right)^2}$
SDNN | $\mathrm{SDNN} = \sqrt{\frac{1}{N-1}\sum_{i=1}^{N}\left(RR_i-\overline{RR}\right)^2}$
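As a quick illustration of the time-domain formulas in the table above, the hedged Python sketch below (with function and variable names of our own choosing) computes them from a series of RR intervals expressed in milliseconds.

```python
import numpy as np

def hrv_time_domain(rr_ms):
    """Mean RR, RMSSD, and SDNN from an RR-interval series in milliseconds."""
    rr = np.asarray(rr_ms, dtype=float)
    diff = np.diff(rr)                        # successive RR-interval differences
    return {
        "mean_rr": rr.mean(),                 # mean RR interval
        "rmssd": np.sqrt(np.mean(diff ** 2)), # root mean square of successive differences
        "sdnn": rr.std(ddof=1),               # SD of normal-to-normal intervals
    }

# Example (hypothetical values): hrv_time_domain([812, 790, 805, 830, 818])
# returns mean_rr = 811.0 ms together with the corresponding RMSSD and SDNN.
```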
Signals | Features * | Description
---|---|---
HRV | LF | Power in the low-frequency range (0.04–0.15 Hz)
 | HF | Power in the high-frequency range (0.15–0.4 Hz)
 | LF/HF ratio | Ratio of LF to HF power
 | Mean RR | Mean RR interval
 | RMSSD | Root mean square of successive RR-interval differences
 | SDNN | Standard deviation of normal-to-normal intervals
BVP, RSP, IR (× 4 ROIs) | STD | Signal's standard deviation
 | Mean | Signal's average value
 | Δ | Difference between the signal's average value during task and rest
 | SampEn | Sample entropy, which measures the regularity and complexity of a time series
IR (× 4 ROIs) | Der1 | Signal's first time derivative, which describes the slope of the signal's tangent line
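To show how the segment-level features in this table could be obtained from a pre-processed signal, here is a hedged Python sketch. The function names, the SampEn parameters (m = 2, tolerance r = 0.2 times the signal's standard deviation), and the choice to summarise the first derivative by its mean slope are our assumptions for illustration, not details reported by the authors.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy of a 1-D series (regularity/complexity measure).
    m is the template length; r is the tolerance (default 0.2 * SD, a common choice)."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)
    n = len(x)

    def match_count(length):
        templates = np.array([x[i:i + length] for i in range(n - length)])
        count = 0
        for i in range(len(templates) - 1):
            # Chebyshev distance to all later templates (self-matches excluded)
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= r)
        return count

    b, a = match_count(m), match_count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def segment_features(task, rest, fs):
    """STD, Mean, Delta, SampEn, and Der1 for one task segment, given a rest
    baseline and the sampling frequency fs in Hz."""
    task = np.asarray(task, dtype=float)
    return {
        "std": task.std(ddof=1),
        "mean": task.mean(),
        "delta": task.mean() - np.mean(rest),                  # task vs. rest average
        "sampen": sample_entropy(task),
        "der1": float(np.mean(np.gradient(task, 1.0 / fs))),   # mean slope (assumed summary)
    }
```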
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).