Assessment of a Person’s Emotional State Based on His or Her Posture Parameters
Abstract
1. Introduction
- We developed a posturometric chair to detect sitting posture (body tilt, weight distribution, degree of leaning on the backrest) using sensors embedded in the seat and backrest.
- We applied several machine learning methods to the task of emotion recognition from sitting posture and compared their effectiveness.
- We proposed an approach to estimating a person's emotional state from changes in the body position of the person sitting in the chair. The approach builds on the concept of co-evolving hybrid intelligence, which makes it possible to take the individual characteristics of the tested people into account.
- We formed the core of a database of sitting postures characterizing six emotions, each classified according to three levels of manifestation (a possible record layout is sketched after this list).
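To make the structure of such a database concrete, the sketch below shows one way a labeled posture sample could be represented in Python. The field names and types are our illustrative assumptions, not the schema used in the actual study.

```python
from dataclasses import dataclass
from enum import Enum

class Emotion(Enum):
    JOY = "joy"
    SADNESS = "sadness"
    FEAR = "fear"
    ANGER = "anger"
    DISGUST = "disgust"
    ASTONISHMENT = "astonishment"

@dataclass
class PostureSample:
    subject_id: str
    timestamp: float                  # seconds since session start
    seat_loads: tuple[float, ...]     # readings of the seat load cells, kg
    backrest_loads: tuple[float, ...] # readings of the backrest load cells, kg
    emotion: Emotion                  # one of the six target emotions
    level: int                        # manifestation level, 1 (weak) to 3 (strong)
```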
2. Literature Review
3. Materials and Methods
3.1. The Concept of Co-Evolutionary Hybrid Intelligence and a Cognitive Architecture for Its Implementation
- Participants and elements of the system should have cognitive interoperability;
- The system has to be self-developing;
- The system and its participants must have the property of self-reflexivity;
- Feedback within the system must be mandatory;
- The participants of the system should contribute to the development of each other and the system as a whole.
- Data sources. These are primary sources of data about a person and various objects in his or her environment. In our emotional state assessment system, such sources are the sensors of the posturometric chair, a video camera, and sensors monitoring the person's temperature and breathing parameters. Sources can also be databases and other means of storing and accumulating information. In this article, we describe the processing of data from only one source: the group of sensors of the posturometric chair.
- Narrow AI. A module that implements data preprocessing and processing methods. In our case, it provides the primary filtering of the sensor signals, which we describe in this article, and video preprocessing.
- Multimodal data. Multimodal data processing is implemented in a module that collects data about the object at a given point in time from different sources, aligns these data on a timeline or another common scale, and selects from the whole set the data relevant to the current situation (a minimal filtering-and-alignment sketch follows this list).
- Hybrid system identification. This operation is implemented in a module that identifies the state parameters of the system and its individual components. For example, in our system, this module can identify the parameters of the human state.
- Activity models. The activity model module processes the existing models of actions in the system. For example, in our system, this module selects appropriate decision-making models depending on the known parameters and generates action scenarios.
- Generalized modeling module. The generalized modeling module handles the models, refines them, and combines them into a generalized system model.
- Generative AI. The module modifies existing models and creates new ones.
- Decision-making and action planning. In this module, an optimal model is selected from a set of models, and a single decision is chosen from a group of decisions. Decisions can be of different kinds: if some condition is identified, the decision is a statement of that fact; if a scheme for correcting the condition is proposed, the decision is a plan. Other kinds of decisions are also possible.
- Execution and management. The module executes an action plan when one is needed. In our system, this could be, for example, automatic adjustment of the chair to the position of the person, taking his or her current state and the target state into account. This module is currently under development.
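As an illustration of the narrow AI and multimodal data modules, the sketch below median-filters the chair's load-cell channels and then aligns them with video-derived features on a shared timeline. It is a minimal sketch assuming pandas DataFrames with a `timestamp` column in seconds; the column layout and tolerance value are hypothetical, not the system's actual data format.

```python
import pandas as pd

def align_modalities(chair_df: pd.DataFrame,
                     video_df: pd.DataFrame,
                     tolerance_s: float = 0.5) -> pd.DataFrame:
    """Filter the chair signals, then join the nearest video feature
    vector to each chair sample on a common timeline."""
    chair_df = chair_df.sort_values("timestamp")
    video_df = video_df.sort_values("timestamp")

    # Primary filtering: a rolling median suppresses single-sample spikes
    # in the load-cell channels without smearing slow posture changes.
    sensor_cols = [c for c in chair_df.columns if c != "timestamp"]
    chair_df[sensor_cols] = (chair_df[sensor_cols]
                             .rolling(window=3, min_periods=1)
                             .median())

    # Multimodal alignment: for every chair sample, take the closest
    # video feature row within the tolerance window (NaN if none).
    return pd.merge_asof(chair_df, video_df, on="timestamp",
                         direction="nearest", tolerance=tolerance_s)
```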
3.2. System for Defining the Characteristics of the Human Body Position
- A force transmission element that transmits the force exerted by the sitting person to the strain gauge.
- The strain gauge with a built-in bridge generates an analog signal proportional to the applied force. The outputs of the supply and measurement diagonals of the bridge are connected to the digitizer unit through a cable kept short to minimize interference. The measuring system uses sensors with the following characteristics: maximum permissible load, 10 kg; error, 0.05%; operating temperature, −10 to +50 °C; supply voltage, 3–12 V.
- The analog-to-digital converter digitizes the bridge mismatch signal. The system uses HX711 converters. The converter has a built-in multiplexer that selects one of two input channels: channel A can be programmed with a gain of 128 or 64, while channel B has a fixed gain of 32. The chip has a built-in voltage regulator, eliminating the need for an external one. The clock input is multifunctional; in this design, we use it to supply external clock pulses (a minimal readout sketch follows this list).
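For reference, the conventional HX711 readout sequence looks as follows. This is a minimal sketch that assumes the converter's data and clock lines are wired to a Raspberry Pi; the host platform and pin numbers are illustrative assumptions, not the wiring of the chair's actual digitizer unit.

```python
import RPi.GPIO as GPIO  # assumption: HX711 lines wired to a Raspberry Pi

DOUT, PD_SCK = 5, 6  # hypothetical BCM pin numbers

GPIO.setmode(GPIO.BCM)
GPIO.setup(DOUT, GPIO.IN)
GPIO.setup(PD_SCK, GPIO.OUT, initial=GPIO.LOW)

def read_hx711_raw() -> int:
    """Clock one 24-bit two's-complement sample out of the HX711."""
    while GPIO.input(DOUT):            # DOUT goes low when a conversion is ready
        pass
    value = 0
    for _ in range(24):                # shift out the 24 data bits, MSB first
        GPIO.output(PD_SCK, GPIO.HIGH)
        GPIO.output(PD_SCK, GPIO.LOW)
        value = (value << 1) | GPIO.input(DOUT)
    GPIO.output(PD_SCK, GPIO.HIGH)     # a 25th pulse selects channel A, gain 128,
    GPIO.output(PD_SCK, GPIO.LOW)      # for the next conversion (26 -> B/32, 27 -> A/64)
    if value & 0x800000:               # sign-extend the 24-bit result
        value -= 1 << 24
    return value
```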
- Calibration mode. In this mode, the sensors are polled with no load applied in order to form corrective values that compensate for the additive error.
- Measuring mode. In this mode, the measurement results are read out; signals for starting and stopping sensor polling control the mode. Ten measurements per second are performed and then averaged in accordance with the sensor manufacturer's recommendation, so the resulting measurement rate is 1 sample per second (both modes are sketched below).
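The two modes can be summarized in code. This is a minimal sketch assuming a `read_raw` callable such as the HX711 routine above; apart from the 10-reading averaging window, the sample counts and scale handling are illustrative assumptions.

```python
from statistics import mean

def calibrate(read_raw, n_samples: int = 50) -> float:
    """Calibration mode: poll the unloaded sensor and keep the mean
    reading as a corrective value compensating the additive error."""
    return mean(read_raw() for _ in range(n_samples))

def measurements_1hz(read_raw, offset: float, scale: float = 1.0):
    """Measuring mode: at the HX711's 10 samples/s rate, averaging 10
    consecutive readings (per the manufacturer's recommendation)
    yields one corrected value per second."""
    while True:
        window = [read_raw() for _ in range(10)]   # ~1 s of raw readings
        yield (mean(window) - offset) * scale      # scale: units per LSB
```

Here `calibrate` would run once at startup with the chair empty, after which iterating over `measurements_1hz` produces the 1 sample-per-second stream described above.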
3.3. Formation and Identification of an Emotional State
- Questionnaire-based assessment, for example, using Spielberger anxiety tests and their modifications [49], the MFI-20 subjective asthenia assessment scale [50], Holmes and Rahe's stress tolerance and social adaptation methodology [51], the Medical Outcomes Study Short Form (MOS SF-36) health assessment questionnaire [52], and so on;
- Analysis of visual information, performed both by a specialist and automatically, for example, using neural network technologies.
- Joy—happiness, cheerfulness, enjoyment, contentment, bliss, pride, excitement, fascination, pleasure, euphoria, acceptance, friendliness, trust, kindness, sympathy, enthusiasm, admiration.
- Sadness—concern, joylessness, regret, guilt, shame, loneliness, sadness, despair.
- Fear—anxiety, apprehension, nervousness, uneasiness, fright, misgivings, suspicion, doubt, suspense, terror, panic.
- Anger—irritation, resentment, indignation, hostility, annoyance, nervousness, aggression.
- Disgust—contempt, disdain, aversion, dislike.
- Astonishment—amazement, excitement, shock.
3.4. Testing Procedure
- Modeling of the initial neutral state in the subject and fixation of this state on the emotion panel.
- Reproduction of the required emotional state.
- Return of the subject to the initial state of comfort (a stage-labeling sketch follows this list).
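When processing the recordings, each 1 Hz posture sample has to be attributed to one of these three stages so that only the relevant segment feeds the emotion classifier. The sketch below shows one way to do this; the stage names, boundary times, and sample layout are hypothetical, not the timings used in the actual sessions.

```python
# Hypothetical stage boundaries in seconds from the start of a session.
STAGES = [("neutral", 0, 60), ("reproduction", 60, 180), ("return", 180, 240)]

def stage_of(t: float) -> str:
    """Map a sample timestamp to the testing-procedure stage it falls in."""
    for name, start, end in STAGES:
        if start <= t < end:
            return name
    return "outside"

def reproduction_samples(samples):
    """Keep only samples recorded while the emotion was being reproduced."""
    return [s for s in samples if stage_of(s["t"]) == "reproduction"]
```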
4. Processing of Measurement Results
5. Discussion
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
1. Dodig Crnkovic, G. Info-computational Constructivism and Cognition. Constr. Found. 2014, 9, 2223–2231.
2. Tao, J.; Tan, T.; Picard, R.W. Affective Computing and Intelligent Interaction; Springer: Berlin/Heidelberg, Germany, 2005.
3. Scherer, K.R. What are emotions? And how can they be measured? Soc. Sci. Inf. 2005, 44, 695–729.
4. Dzedzickis, A.; Kaklauskas, A.; Bucinskas, V. Human Emotion Recognition: Review of Sensors and Methods. Sensors 2020, 20, 592.
5. Shu, L.; Xie, J.; Yang, M.; Li, Z.; Li, Z.; Liao, D.; Xu, X.; Yang, X. A Review of Emotion Recognition Using Physiological Signals. Sensors 2018, 18, 2074.
6. Alarcão, S.M.; Fonseca, M.J. Emotions Recognition Using EEG Signals: A Survey. IEEE Trans. Affect. Comput. 2019, 10, 374–393.
7. García-Martínez, B.; Martínez-Rodrigo, A.; Alcaraz, R.; Fernández-Caballero, A. A Review on Nonlinear Methods Using Electroencephalographic Recordings for Emotion Recognition. IEEE Trans. Affect. Comput. 2021, 12, 801–820.
8. Goshvarpour, A.; Abbasi, A.; Goshvarpour, A. Dynamical analysis of emotional states from electroencephalogram signals. Biomed. Eng. Appl. Basis Commun. 2016, 28, 1650015.
9. Hasnul, M.A.; Aziz, N.A.A.; Alelyani, S.; Mohana, M.; Aziz, A.A. Electrocardiogram-Based Emotion Recognition Systems and Their Applications in Healthcare—A Review. Sensors 2021, 21, 5015.
10. Nikolova, D.; Petkova, P.; Manolova, A.; Georgieva, P. ECG-Based Emotion Recognition: Overview of Methods and Applications. In Proceedings of the Advances in Neural Networks and Applications, St. Konstantin and Elena Resort, Varna, Bulgaria, 15–17 September 2018; pp. 1–5.
11. Mithbavkar, S.A.; Shah, M.S. Analysis of EMG Based Emotion Recognition for Multiple People and Emotions. In Proceedings of the IEEE 3rd Eurasia Conference on Biomedical Engineering, Healthcare and Sustainability (ECBIOS), Tainan, Taiwan, 28–30 May 2021; pp. 1–4.
12. Chen, J.; Ro, T.; Zhu, Z. Emotion Recognition with Audio, Video, EEG, and EMG: A Dataset and Baseline Approaches. IEEE Access 2022, 10, 13229–13242.
13. Pavlidis, I.; Levine, J.; Baukol, P. Thermal Image Analysis for Anxiety Detection. In Proceedings of the IEEE International Conference on Image Processing, Thessaloniki, Greece, 7–10 October 2001; pp. 315–318.
14. Lee, J.-M.; An, Y.-E.; Bak, E.; Pan, S. Improvement of Negative Emotion Recognition in Visible Images Enhanced by Thermal Imaging. Sustainability 2022, 14, 15200.
15. Udovičić, G.; Derek, J.; Russo, M.; Sikora, M. Wearable Emotion Recognition System Based on GSR and PPG Signals. In Proceedings of the 2nd International Workshop on Multimedia for Personal Health and Health Care, Mountain View, CA, USA, 23 October 2017; pp. 53–59.
16. Homma, I.; Masaoka, Y. Breathing rhythms and emotions. Exp. Physiol. 2008, 93, 1011–1021.
17. Ritsert, F.; Elgendi, M.; Galli, V.; Menon, C. Heart and Breathing Rate Variations as Biomarkers for Anxiety Detection. Bioengineering 2022, 9, 711.
18. Klein, R.J.; Towers, C.; Robinson, M.D. Emotion-related variations in motor tremor: Magnitude, time course, and links to emotional temperament. Emotion 2020, 20, 1020–1030.
19. Bureneva, O.; Safyannikov, N. Strain Gauge Measuring System for Subsensory Micromotions Analysis as an Element of a Hybrid Human–Machine Interface. Sensors 2022, 22, 9146.
20. Giannakakis, G.; Pediaditis, M.; Manousos, D.; Kazantzaki, E.; Chiarugi, F.; Simos, P.G.; Marias, K.; Tsiknakis, M. Stress and anxiety detection using facial cues from videos. Biomed. Signal Process. Control 2017, 31, 89–101.
21. Ko, B.C. A Brief Review of Facial Emotion Recognition Based on Visual Information. Sensors 2018, 18, 401.
22. Chowdary, M.K.; Nguyen, T.N.; Hemanth, D.J. Deep learning-based facial emotion recognition for human–computer interaction applications. Neural Comput. Appl. 2021, 2021, 1–18.
23. Babu, E.K.; Mistry, K.; Anwar, M.N.; Zhang, L. Facial Feature Extraction Using a Symmetric Inline Matrix-LBP Variant for Emotion Recognition. Sensors 2022, 22, 8635.
24. Khalil, R.A.; Jones, E.; Babar, M.I. Speech emotion recognition using deep learning techniques: A review. IEEE Access 2019, 7, 117327–117345.
25. Wani, T.M.; Gunawan, T.S.; Qadri, S.A.A.; Kartiwi, M.; Ambikairajah, E. A Comprehensive Review of Speech Emotion Recognition Systems. IEEE Access 2021, 9, 47795–47814.
26. Yang, Y.; Xu, F. Review of Research on Speech Emotion Recognition. In Machine Learning and Intelligent Communications, Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering; Jiang, X., Ed.; Springer: Cham, Switzerland, 2022; Volume 438, pp. 315–326.
27. Kleinsmith, A.; Bianchi-Berthouze, N. Affective Body Expression Perception and Recognition: A Survey. IEEE Trans. Affect. Comput. 2013, 4, 15–33.
28. Ahmed, F.; Bari, A.S.M.H.; Gavrilova, M.L. Emotion recognition from body movement. IEEE Access 2019, 8, 11761–11781.
29. Jianwattanapaisarn, N.; Sumi, K.; Utsumi, A.; Khamsemanan, N.; Nattee, C. Emotional characteristic analysis of human gait while real-time movie viewing. Front. Artif. Intell. 2022, 5, 989860.
30. Coulson, M. Attributing Emotion to Static Body Postures: Recognition Accuracy, Confusions, and Viewpoint Dependence. J. Nonverbal Behav. 2004, 28, 117–139.
31. Laban, R. The Mastery of Movement, 3rd ed.; Macdonald & Evans: Braintree, MA, USA, 1971.
32. Aristidou, A.; Charalambous, P.; Chrysanthou, Y. Emotion Analysis and Classification: Understanding the Performers' Emotions Using the LMA Entities. Comput. Graph. Forum 2015, 36, 262–276.
33. Fu, X.; Cai, T.; Xue, C.; Zhang, Y.; Xu, Y. Research on body-gesture affective computing method based on BGRU-FUS-NN neural network. J. Comput. Aided Des. Comput. Graph. 2020, 32, 1070–1079.
34. Wang, S.; Wang, X.; Wang, Z.; Xiao, R. Emotion Recognition Based on Static Human Posture Features. In Lecture Notes in Electrical Engineering, Proceedings of the 6th International Technical Conference on Advances in Computing, Control and Industrial Engineering, Hangzhou, China, 16–17 October 2021; Shmaliy, S.Y., Abdelnaby Zekry, A., Eds.; Springer: Singapore, 2022; Volume 920, pp. 529–539.
35. Pal, S.; Mukhopadhyay, S.; Suryadevara, N. Development and Progress in Sensors and Technologies for Human Emotion Recognition. Sensors 2021, 21, 5554.
36. Elvitigala, D.S.; Matthies, D.J.C.; Nanayakkara, S. StressFoot: Uncovering the Potential of the Foot for Acute Stress Sensing in Sitting Posture. Sensors 2020, 20, 2882.
37. Bourahmoune, K.; Ishac, K.; Amagasa, T. Intelligent Posture Training: Machine-Learning-Powered Human Sitting Posture Recognition Based on a Pressure-Sensing IoT Cushion. Sensors 2022, 22, 5337.
38. Meyer, J.; Arnrich, B.; Schumm, J.; Troster, G. Design and Modeling of a Textile Pressure Sensor for Sitting Posture Classification. IEEE Sens. J. 2010, 10, 1391–1398.
39. Jeong, H.; Park, W. Developing and Evaluating a Mixed Sensor Smart Chair System for Real-Time Posture Classification: Combining Pressure and Distance Sensors. IEEE J. Biomed. Health Inform. 2021, 25, 1805–1813.
40. Arnrich, B.; Setz, C.; La Marca, R.; Tröster, G.; Ehlert, U. What Does Your Chair Know About Your Stress Level? IEEE Trans. Inf. Technol. Biomed. 2010, 14, 207–214.
41. Hu, Q.; Tang, X.; Tang, W. A smart chair sitting posture recognition system using flex sensors and FPGA implemented artificial neural network. IEEE Sens. J. 2020, 20, 8007–8016.
42. Engelbart, D.C. Augmenting Human Intellect: A Conceptual Framework; Summary Report AFOSR-3223 under Contract AF 49(638)-1024, SRI Project 3578 for Air Force Office of Scientific Research; Stanford Research Institute: Menlo Park, CA, USA, 1962.
43. Krinkin, K.; Shichkina, Y.; Ignatyev, A. Co-evolutionary hybrid intelligence. In Proceedings of the 2021 5th Scientific School Dynamics of Complex Networks and their Applications (DCNA), Kaliningrad, Russia, 2021; pp. 112–115.
44. Krinkin, K.; Shichkina, Y.; Ignatyev, A. Co-Evolutionary Hybrid Intelligence Is a Key Concept for the World Intellectualization. Kybernetes 2022, ahead of print.
45. Grigsby, S.S. Artificial Intelligence for Advanced Human-Machine Symbiosis. In Lecture Notes in Computer Science, Proceedings of the Augmented Cognition: Intelligent Technologies, 12th International Conference AC 2018, Las Vegas, NV, USA, 15–20 July 2018; Schmorrow, D., Fidopiastis, C., Eds.; Springer: Cham, Switzerland, 2018; Volume 10915.
46. Parrott, W.G. Emotions in Social Psychology, 1st ed.; Psychology Press: London, UK, 2000.
47. Egger, M.; Ley, M.; Hanke, S. Emotion Recognition from Physiological Signal Analysis: A Review. Electron. Notes Theor. Comput. Sci. 2019, 343, 35–55.
48. Plutchik, R. The Nature of Emotions: Human emotions have deep evolutionary roots, a fact that may explain their complexity and provide tools for clinical practice. Am. Sci. 2001, 89, 344–350.
49. Zsido, A.N.; Teleki, S.A.; Csokasi, K.; Rozsa, S.; Bandi, S.A. Development of the short version of the Spielberger State–Trait Anxiety Inventory. Psychiatry Res. 2020, 291, 113223.
50. Smets, E.M.; Garssen, B.; Bonke, B.; De Haes, J.C. The Multidimensional Fatigue Inventory (MFI) psychometric qualities of an instrument to assess fatigue. J. Psychosom. Res. 1995, 39, 315–325.
51. Holmes, T.H.; Rahe, R.H. The Social Readjustment Rating Scale. J. Psychosom. Res. 1967, 11, 213–218.
52. Ware, J.E., Jr.; Sherbourne, C.D. The MOS 36-item short-form health survey (SF-36): I. Conceptual framework and item selection. Med. Care 1992, 30, 473–483.
53. Karle, H.W.A.; Boy, J.H. Hypnotherapy: A Practical Handbook, 2nd ed.; Free Association Books: London, UK, 2010.
54. Gilligan, S. Therapeutic Trances: The Cooperation Principle in Ericksonian Hypnotherapy, 1st ed.; Routledge: New York, NY, USA, 2018.
55. Plutchik, R. The multifactor-analytic theory of emotion. J. Psychol. 1960, 50, 153–171.
56. Izard, C.E. Basic emotions, natural kinds, emotion schemas, and a new paradigm. Perspect. Psychol. Sci. 2007, 2, 260–280.
57. Erickson, M.H.; Rossi, E.L. Hypnotherapy: An Exploratory Casebook, 1st ed.; Irvington Publishers Inc.: North Stratford, NH, USA, 1979.
58. Kumar, A.; Mase, J.M.; Rengasamy, D.; Rothwell, B.; Torres, M.T.; Winkler, D.A.; Figueredo, G.P. EFI: A Toolbox for Feature Importance Fusion and Interpretation in Python. In Proceedings of the Machine Learning, Optimization, and Data Science: 8th International Conference, Certosa di Pontignano, Italy, 18–22 September 2022; pp. 249–264.
59. Lopez, L.D.; Reschke, P.J.; Knothe, J.M.; Walle, E.A. Postural Communication of Emotion: Perception of Distinct Poses of Five Discrete Emotions. Front. Psychol. 2017, 8, 710.
60. Jeon, J.; Kwon, S.-Y.; Lee, Y.-M.; Hong, J.; Yu, J.; Kim, J.; Kim, S.-G.; Lee, D. Influence of the Hawthorne effect on spatiotemporal parameters, kinematics, ground reaction force, and the symmetry of the dominant and nondominant lower limbs during gait. J. Biomech. 2023, 152, 111555.
| Emotion | Sadness | Fear | Anger | Disgust | Astonishment |
| --- | --- | --- | --- | --- | --- |
| Joy | 0.944996544 | 0.82354998 | 0.97826379 | 0.94674795 | 0.90222697 |
| Sadness | | 0.770252641 | 0.93095542 | 0.95747521 | 0.817582839 |
| Fear | | | 0.83155515 | 0.72745024 | 0.903278 |
| Anger | | | | 0.97278435 | 0.932596804 |
| Disgust | | | | | 0.865337527 |
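If these values are pairwise separability scores between the six emotions (the exact metric is defined in the full text), a matrix of this shape can be produced by cross-validating a binary classifier on each pair of classes. The sketch below is our assumption about such a procedure, not a reconstruction of the paper's pipeline; `X` and `y` stand for hypothetical feature vectors and emotion labels.

```python
from itertools import combinations
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def pairwise_scores(X: np.ndarray, y: np.ndarray, labels: list) -> dict:
    """For every pair of emotions, cross-validate a binary classifier on
    just those two classes; higher scores mean the corresponding sitting
    postures are easier to tell apart."""
    scores = {}
    for a, b in combinations(labels, 2):
        mask = np.isin(y, [a, b])                  # keep only the two classes
        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        scores[(a, b)] = cross_val_score(clf, X[mask], y[mask], cv=5).mean()
    return scores
```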
| Method | Joy | Sadness | Fear | Anger | Disgust | Astonishment |
| --- | --- | --- | --- | --- | --- | --- |
| LogisticRegression | 0.9370 | 0.9692 | 0.9468 | 0.9780 | 0.9782 | 0.9889 |
| KNeighborsClassifier | 0.9451 | 0.9652 | 0.9572 | 0.9658 | 0.9808 | 0.9811 |
| SVM | 0.9689 | 0.9760 | 0.9769 | 0.9720 | 0.9848 | 0.9904 |
| GaussianNB | 0.5902 | 0.7791 | 0.8059 | 0.7431 | 0.8374 | 0.9443 |
| DecisionTreeClassifier | 0.9447 | 0.9690 | 0.9608 | 0.9604 | 0.9749 | 0.9805 |
| RandomForestClassifier | 0.9721 | 0.9794 | 0.9773 | 0.9745 | 0.9854 | 0.9915 |
| XGBClassifier | 0.9719 | 0.9789 | 0.9772 | 0.9740 | 0.9844 | 0.9908 |
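The method names in the table match scikit-learn and XGBoost estimators, so a comparison of this kind can be reproduced with a loop such as the one below. This is a minimal sketch: the feature matrix, the labels, and the choice of default hyperparameters with 5-fold cross-validation are our assumptions, not the paper's exact protocol.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

MODELS = {
    "LogisticRegression": LogisticRegression(max_iter=1000),
    "KNeighborsClassifier": KNeighborsClassifier(),
    "SVM": SVC(),
    "GaussianNB": GaussianNB(),
    "DecisionTreeClassifier": DecisionTreeClassifier(random_state=0),
    "RandomForestClassifier": RandomForestClassifier(random_state=0),
    "XGBClassifier": XGBClassifier(),
}

def compare_models(X: np.ndarray, y: np.ndarray, cv: int = 5) -> dict:
    """Cross-validated accuracy for each classifier from the table above.
    X holds chair-sensor feature vectors; y holds integer emotion
    labels 0-5 (XGBClassifier requires integer-encoded classes)."""
    return {name: cross_val_score(model, X, y, cv=cv).mean()
            for name, model in MODELS.items()}
```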