Effective Emoticon Suggestion Technique Based on Active Emotional Input Using Facial Expressions and Heart Rate Signals
Abstract
1. Introduction
- Active Emotional Input (AEI): A user-centered approach to emotion recognition that encourages users to intentionally express their emotions.
- A design that incorporates psychological factors and usability concerns for a seamless and efficient user experience in computer-mediated communication environments.
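The AEI concept above can be summarized as a short capture-classify-confirm loop. The following is a minimal sketch rather than the paper's implementation: every helper (`capture_face`, `capture_heart_rate`, `classify_emotion`, `suggest_emoticons`) is a hypothetical stub standing in for the prototype's actual facial-expression and heart-rate components.

```python
def capture_face() -> str:
    """Stub for a front-camera expression label (hypothetical)."""
    return "smile"

def capture_heart_rate() -> list[int]:
    """Stub for recent heart-rate samples in BPM (hypothetical)."""
    return [72, 74, 71]

def classify_emotion(face: str, hr: list[int]) -> str:
    """Toy classifier standing in for the prototype's recognizer."""
    return "happiness" if face == "smile" else "neutral"

def suggest_emoticons(emotion: str) -> list[str]:
    """Map the recognized emotion to candidate emoticons."""
    catalog = {"happiness": ["😊", "😄"], "neutral": ["🙂"]}
    return catalog.get(emotion, [])

def active_emotional_input() -> list[str]:
    # Sensing starts only when the user deliberately triggers input
    # (active, not passive); candidates are suggested, not auto-sent.
    emotion = classify_emotion(capture_face(), capture_heart_rate())
    return suggest_emoticons(emotion)

print(active_emotional_input())  # ['😊', '😄']
```

The key design point is that sensing begins only when the user deliberately invokes it, and the suggested emoticons still require explicit confirmation, keeping the user in control.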
2. Related Work
2.1. Emoticon Suggestion Systems
2.2. Commercial Products for Emoticon Suggestion
2.3. User Interfaces for Emoticons using Multi-Modal Signals
3. Background
3.1. Emoticons and Their Impact on Communication
3.2. Role of Emotional Expression
- Intended emotional expression (ITE): ITE, in contrast, involves more detailed information processing for intentional purposes. This type of emotional expression is typically deliberate and controlled, and is used for social or communicative purposes, such as smiling to ingratiate oneself with others regardless of one's actual feelings [57,58].
3.3. Heart Rate-Based Emotion Recognition
4. Materials and Methods
4.1. Usability of Emoticons in Mobile Environment
4.2. Emotion Recognition Mechanism
4.3. Active Emotional Input Interface
4.4. Prototype Implementation
5. User Study
5.1. Study Design
5.2. Emotion Recognition Accuracy
5.3. Completion Time for Expressing Emotions
5.4. Task Workload Analysis for Expressing Emotions
6. Discussion
- Expression differences: In addition to explicit and implicit emotional expressions, the gap between imagined and actual expressions should be considered an important characteristic. For example, this gap significantly affects the performance and workload for the surprise emotion: most participants replied that they had difficulty producing a facial expression for surprise on demand. This implies that the mismatch between acquired knowledge and actual situations can lead to cognitive dissonance, which can disrupt the use of emotions as part of the system.
- Emotional loading time: Although emotional changes may regulate HR through the nervous system via electrical signals, these changes take some time to stabilize and become observable, as demonstrated in Section 5.3. This implies that emotion recognition based on changes in physical body condition carries a higher recognition cost for emotions such as anger. However, this cost need not be incurred for every recognition, because emotions persist; continuously expressing the same emotion may therefore decrease the emotional expression cost by reducing the emotional loading time (see the sketch after this list).
- Explicit control of implicit signals: Because human emotional expression plays both implicit and explicit roles, our proposed technique also uses HR variation to capture implicit signals. However, although this methodology achieves high accuracy, as described in Section 5.2, it tends to cause stress to the user. For instance, participants reported being stressed when they purposefully recalled anger; more specifically, they felt nervous when the system recognized their anger, in addition to the displeasure associated with the anger itself. However, when these implicit signals were changed by external stimuli (e.g., watching videos), participants completed the task faster and reported a lower workload, resulting in a more natural representation. This implies that different emotion recognition techniques should be applied to IDE and ITE, and that the two cannot complement each other, owing to the proportional relationship between completion time and task workload.
- The emotional impact of active emotional input on computer-mediated communication: We have demonstrated through our proposed technique and the user study that users' emotions can be utilized as active input to the system. However, this establishes feasibility only from a system perspective; how it affects conversations or the overall interaction between users cannot be predicted here and goes beyond the scope of this study. For example, in our qualitative observations, participants expressed reluctance to deliberately evoke negative emotions, which could affect the flow of the overall conversation or the interaction with the system. Therefore, when designing an AEI system, the emotional impact on the user should be considered in addition to system accuracy and performance time.
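Following the emotional loading time observation above, the stabilization delay could be amortized across consecutive inputs of the same emotion. Below is a minimal sketch of such a persistence cache, assuming an illustrative 30 s window and a caller-supplied `recognize_fn`; neither the window length nor the helper name comes from the paper.

```python
import time

PERSISTENCE_WINDOW_S = 30.0  # illustrative window; not a value from the paper

_last_emotion = None  # most recently recognized emotion label
_last_time = 0.0      # monotonic timestamp of that recognition

def recognize_with_persistence(recognize_fn):
    """Reuse the last recognized emotion while it plausibly persists,
    skipping the HR 'emotional loading time' for repeated expressions."""
    global _last_emotion, _last_time
    now = time.monotonic()
    if _last_emotion is not None and now - _last_time < PERSISTENCE_WINDOW_S:
        return _last_emotion          # cache hit: no stabilization wait
    _last_emotion = recognize_fn()    # full (slower) HR-based recognition
    _last_time = now
    return _last_emotion
```

Usage: `recognize_with_persistence(run_full_hr_recognition)`, where `run_full_hr_recognition` is whatever performs the full HR-based recognition; only the first call in a burst pays the stabilization cost.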
7. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Gantiva, C.; Araujo, A.; Castillo, K.; Claro, L.; Hurtado-Parrado, C. Physiological and affective responses to emoji faces: Effects on facial muscle activity, skin conductance, heart rate, and self-reported affect. Biol. Psychol. 2021, 163, 108142. [Google Scholar] [CrossRef] [PubMed]
- Derks, D.; Fischer, A.H.; Bos, A.E.R. The role of emotion in computer-mediated communication: A review. Comput. Hum. Behav. 2008, 24, 766–785. [Google Scholar] [CrossRef]
- Zhang, L.; Tjondronegoro, D. Facial expression recognition using facial movement features. IEEE Trans. Affect. Comput. 2011, 2, 219–229. [Google Scholar] [CrossRef]
- Lakens, D. Using a smartphone to measure heart rate changes during relived happiness and anger. IEEE Trans. Affect. Comput. 2013, 4, 238–241. [Google Scholar] [CrossRef]
- Griggio, C.F.; McGrenere, J.; Mackay, W.E. Customizations and expression breakdowns in ecosystems of communication apps. Proc. ACM Hum.-Comput. Interact. 2019, 3, 1–26. [Google Scholar]
- Shan, C.; Gong, S.; McOwan, P.W. Facial expression recognition based on local binary patterns: A comprehensive study. Image Vis. Comput. 2009, 27, 803–816. [Google Scholar] [CrossRef]
- Krothapalli, S.R.; Koolagudi, S.G. Characterization and recognition of emotions from speech using excitation source information. Int. J. Speech Technol. 2013, 16, 181–201. [Google Scholar] [CrossRef]
- Milton, A.; Sharmy Roy, S.; Selvi, S.T. SVM scheme for speech emotion recognition using MFCC feature. Int. J. Comput. Appl. 2013, 69, 1–6. [Google Scholar] [CrossRef]
- Castellano, G.; Kessous, L.; Caridakis, G. Emotion recognition through multiple modalities: Face, body gesture, speech. In Affect and Emotion in Human-Computer Interaction; Springer: Berlin/Heidelberg, Germany, 2008; pp. 92–103. [Google Scholar]
- Lin, Y.-P.; Wang, C.-H.; Jung, T.-P.; Wu, T.-L.; Jeng, S.-K.; Duann, J.-R.; Chen, J.-H. EEG-based emotion recognition in music listening. IEEE Trans. Biomed. Eng. 2010, 57, 1798–1806. [Google Scholar]
- Canento, F.; Fred, A.; Silva, H.; Gamboa, H.; Lourenço, A. Multimodal biosignal sensor data handling for emotion recognition. In Proceedings of the SENSORS, 2011 IEEE, Limerick, Ireland, 28–31 October 2011. [Google Scholar]
- Kim, J.; Gong, T.; Kim, B.; Park, J.; Kim, W.; Huang, E.; Han, K.; Kim, J.; Ko, J.; Lee, S.-J. No more one liners: Bringing context into emoji recommendations. ACM Trans. Soc. Comput. 2020, 3, 1–25. [Google Scholar] [CrossRef]
- Murphy, S.T.; Zajonc, R.B. Affect, cognition, and awareness: Affective priming with optimal and suboptimal stimulus exposures. J. Personal. Soc. Psychol. 1993, 64, 723. [Google Scholar] [CrossRef]
- Hassib, M.; Buschek, D.; Wozniak, P.W.; Alt, F. HeartChat: Heart rate augmented mobile chat to support empathy and awareness. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, Denver, CO, USA, 6–11 May 2017. [Google Scholar]
- Appelhans, B.M.; Luecken, L.J. Heart rate variability as an index of regulated emotional responding. Rev. Gen. Psychol. 2006, 10, 229–240. [Google Scholar] [CrossRef]
- Miller, H.; Thebault-Spieker, J.; Chang, S.; Johnson, I.; Terveen, L.; Hecht, B. “Blissfully happy” or “ready to fight”: Varying interpretations of emoji. In Proceedings of the International AAAI Conference on Web and Social Media, Cologne, Germany, 17–20 May 2016; Volume 10. [Google Scholar]
- Pohl, H.; Stanke, D.; Rohs, M. EmojiZoom: Emoji entry via large overview maps. In Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services, Florence, Italy, 6–9 September 2016. [Google Scholar]
- Miller, H.; Kluver, D.; Thebault-Spieker, J.; Terveen, L.; Hecht, B. Understanding emoji ambiguity in context: The role of text in emoji-related miscommunication. In Proceedings of the International AAAI Conference on Web and Social Media, Montreal, QC, Canada, 15–18 May 2017; Volume 11. [Google Scholar]
- Wu, C.; Wu, F.; Wu, S.; Huang, Y.; Xie, X. Tweet emoji prediction using hierarchical model with attention. In Proceedings of the 2018 ACM International Joint Conference and 2018 International Symposium on Pervasive and Ubiquitous Computing and Wearable Computers, Singapore, 8–12 October 2018. [Google Scholar]
- Cappallo, S.; Svetlichnaya, S.; Garrigues, P.; Mensink, T.; Snoek, C.G.M. New modality: Emoji challenges in prediction, anticipation, and retrieval. IEEE Trans. Multimed. 2018, 21, 402–415. [Google Scholar] [CrossRef]
- Chen, Z.; Lu, X.; Ai, W.; Li, H.; Mei, Q.; Liu, X. Through a gender lens: Learning usage patterns of emojis from large-scale android users. In Proceedings of the 2018 World Wide Web Conference, Lyon, France, 23–27 April 2018. [Google Scholar]
- Barbieri, F.; Anke, L.E.; Camacho-Collados, J.; Schockaert, S.; Saggion, H. Interpretable emoji prediction via label-wise attention LSTMs. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing; Association for Computational Linguistics: Brussels, Belgium, 2018; pp. 4766–4771. [Google Scholar]
- Liebeskind, C.; Liebeskind, S. Emoji prediction for hebrew political domain. In Proceedings of the Companion Proceedings of The 2019 World Wide Web Conference, San Francisco, CA, USA, 13–17 May 2019. [Google Scholar]
- Chen, Z.; Shen, S.; Hu, Z.; Lu, X.; Mei, Q.; Liu, X. Emoji-powered representation learning for cross-lingual sentiment classification. In Proceedings of the World Wide Web Conference, San Francisco, CA, USA, 13–17 May 2019. [Google Scholar]
- Li, M.; Guntuku, S.; Jakhetiya, V.; Ungar, L. Exploring (dis-)similarities in emoji-emotion association on twitter and weibo. In Proceedings of the Companion Proceedings of The 2019 World Wide Web Conference, San Francisco, CA, USA, 13–17 May 2019. [Google Scholar]
- Zhao, G.; Liu, Z.; Chao, Y.; Qian, X. CAPER: Context-aware personalized emoji recommendation. IEEE Trans. Knowl. Data Eng. 2020, 33, 3160–3172. [Google Scholar] [CrossRef]
- Apple. Animoji. 2017. Available online: https://www.apple.com/newsroom/2017/09/the-future-is-here-iphone-x/ (accessed on 5 March 2023).
- LINE Corporation. LINE: Free Calls & Messages. 2011. Available online: https://line.me/en/ (accessed on 5 March 2023).
- Google LLC. Gboard—The Google Keyboard. 2018. Available online: https://apps.apple.com/us/app/gboard-the-google-keyboard/id1091700242 (accessed on 5 March 2023).
- TouchPal. TouchPal Keyboard. 2013. Available online: https://www.touchpal.com/ (accessed on 5 March 2023).
- Microsoft. Word Flow Keyboard. 2016. Available online: https://www.microsoft.com/en-us/garage/profiles/word-flow-keyboard/ (accessed on 5 March 2023).
- TouchType Ltd. SwiftKey. 2010. Available online: https://swiftkey.com/en (accessed on 5 March 2023).
- Whirlscape. Minuum Keyboard. 2015. Available online: http://minuum.com/ (accessed on 5 March 2023).
- Khandekar, S.; Higg, J.; Bian, Y.; Won Ryu, C.; Talton III, J.O.; Kumar, R. Opico: A study of emoji-first communication in a mobile social app. In Proceedings of the Companion Proceedings of the 2019 World Wide Web Conference, San Francisco, CA, USA, 13–17 May 2019. [Google Scholar]
- Alvina, J.; Qu, C.; McGrenere, J.; Mackay, W.E. Mojiboard: Generating parametric emojis with gesture keyboards. In Proceedings of the Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019. [Google Scholar]
- Liu, M.; Wong, A.; Pudipeddi, R.; Hou, B.; Wang, D.; Hsieh, G. ReactionBot: Exploring the effects of expression-triggered emoji in text messages. Proc. ACM Hum.-Comput. Interact. 2018, 2, 1–16. [Google Scholar]
- El Ali, A.; Wallbaum, T.; Wasmann, M.; Heuten, W.; Boll, S.C.J. Face2emoji: Using facial emotional expressions to filter emojis. In Proceedings of the 2017 Chi Conference Extended Abstracts on Human Factors in Computing Systems, Denver, CO, USA, 6–11 May 2017. [Google Scholar]
- Urabe, Y.; Rzepka, R.; Araki, K. Emoticon recommendation system for effective communication. In Proceedings of the 2013 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining, Niagara, ON, Canada, 25–28 August 2013. [Google Scholar]
- Kim, J.; Ojha, A.; Jin, Y.; Lee, M. Pictogram generator from Korean sentences using emoticon and saliency map. In Proceedings of the 3rd International Conference on Human-Agent Interaction, Daegu, Republic of Korea, 21–24 October 2015. [Google Scholar]
- Hu, J.; Xu, Q.; Fu, L.P.; Xu, Y. Emojilization: An automated method for speech to emoji-labeled text. In Proceedings of the Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019. [Google Scholar]
- Zhang, M.R.; Wang, R.; Xu, X.; Li, Q.; Sharif, A.; Wobbrock, J.O. Voicemoji: Emoji entry using voice for visually impaired people. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, Yokohama, Japan, 8–13 May 2021. [Google Scholar]
- Tigwell, G.W.; Gorman, B.M.; Menzies, R. Emoji accessibility for visually impaired people. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020. [Google Scholar]
- Mittal, P.; Aggarwal, K.; Sahu, P.P.; Vatsalya, V.; Mitra, S.; Singh, V.; Veera, V.; Venkatesan, S.M. Photo-realistic emoticon generation using multi-modal input. In Proceedings of the 25th International Conference on Intelligent User Interfaces, Cagliari, Italy, 17–20 March 2020. [Google Scholar]
- Dear, B. PLATO Emoticons. Available online: http://www.platohistory.org/blog/2012/09/plato-emoticons-revisited.html (accessed on 19 April 2023).
- Fahlman, S. Smiley Lore :-). Available online: https://www.cs.cmu.edu/~sef/sefSmiley.htm (accessed on 19 April 2023).
- Chen, Z.; Cao, Y.; Yao, H.; Lu, X.; Peng, X.; Mei, H.; Liu, X. Emoji-powered sentiment and emotion detection from software developers’ communication data. ACM Trans. Softw. Eng. Methodol. TOSEM 2021, 30, 1–48. [Google Scholar] [CrossRef]
- Zhang, A.X.; Igo, M.; Facciotti, M.; Karger, D. Using student annotated hashtags and emojis to collect nuanced affective states. In Proceedings of the Fourth (2017) ACM Conference on Learning@Scale, Cambridge, MA, USA, 20–21 April 2017. [Google Scholar]
- Hagen, L.; Falling, M.; Lisnichenko, O.; Elmadany, A.A.; Mehta, P.; Abdul-Mageed, M.; Costakis, J.; Keller, T.E. Emoji use in Twitter white nationalism communication. In Proceedings of the Conference Companion Publication of the 2019 on Computer Supported Cooperative Work and Social Computing, Austin, TX, USA, 9–13 November 2019. [Google Scholar]
- Kelly, R.; Watts, L. Characterising the inventive appropriation of emoji as relationally meaningful in mediated close personal relationships. In Experiences of Technology Appropriation: Unanticipated Users, Usage, Circumstances, and Design; University of Bath: Bath, UK, 2015. [Google Scholar]
- Zhou, R.; Hentschel, J.; Kumar, N. Goodbye text, hello emoji: Mobile communication on WeChat in China. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, Denver, CO, USA, 6–11 May 2017. [Google Scholar]
- Griggio, C.F.; Sato, A.J.; Mackay, W.E.; Yatani, K. Mediating intimacy with DearBoard: A co-customizable keyboard for everyday messaging. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, Yokohama, Japan, 8–13 May 2021. [Google Scholar]
- Yamamoto, H.; Kawahara, M.; Kret, M.; Tanaka, A. Cultural differences in emoticon perception: Japanese see the eyes and Dutch the mouth of emoticons. Lett. Evol. Behav. Sci. 2020, 11, 40–45. [Google Scholar] [CrossRef]
- Ekman, P. Are there basic emotions? Psychol. Rev. 1992, 99, 550–553. [Google Scholar] [CrossRef]
- Russell, J.A.; Barrett, L.F. Core affect, prototypical emotional episodes, and other things called emotion: Dissecting the elephant. J. Personal. Soc. Psychol. 1999, 76, 805. [Google Scholar] [CrossRef]
- Rodrigues, D.; Prada, M.; Gaspar, R.; Garrido, M.V.; Lopes, D. Lisbon Emoji and Emoticon Database (LEED): Norms for emoji and emoticons in seven evaluative dimensions. Behav. Res. Methods 2018, 50, 392–405. [Google Scholar] [CrossRef]
- Winkielman, P.; Berridge, K.C.; Wilbarger, J.L. Unconscious affective reactions to masked happy versus angry faces influence consumption behavior and judgments of value. Personal. Soc. Psychol. Bull. 2005, 31, 121–135. [Google Scholar] [CrossRef]
- Jakobs, E.; Fischer, A.H.; Manstead, A.S.R. Emotional experience as a function of social context: The role of the other. J. Nonverbal Behav. 1997, 21, 103–130. [Google Scholar] [CrossRef]
- Russell, J.A. Is there universal recognition of emotion from facial expression? A review of the cross-cultural studies. Psychol. Bull. 1994, 115, 102. [Google Scholar] [CrossRef] [PubMed]
- Yin, D.; Bond, S.D.; Zhang, H. Keep your cool or let it out: Nonlinear effects of expressed arousal on perceptions of consumer reviews. J. Mark. Res. 2017, 54, 447–463. [Google Scholar] [CrossRef]
- Goldin, P.R.; McRae, K.; Ramel, W.; Gross, J.J. The neural bases of emotion regulation: Reappraisal and suppression of negative emotion. Biol. Psychiatry 2008, 63, 577–586. [Google Scholar] [CrossRef] [PubMed]
- MacKenzie, I.S. Fitts’ throughput and the remarkable case of touch-based target selection. In Proceedings of Human-Computer Interaction: Interaction Technologies, 17th International Conference, HCI International 2015, Los Angeles, CA, USA, 2–7 August 2015; Part II. [Google Scholar]
- CloverStudio. SPIKA3-Next Generation Opensource Messenger. 2018. Available online: https://www.spika.chat/ (accessed on 5 March 2023).
- Microsoft. Face API. 2019. Available online: https://azure.microsoft.com/en-us/products/cognitive-services/face (accessed on 5 March 2023).
| Prime Emotions | Facial Characteristic | HR Characteristic | Technical Constraint |
|---|---|---|---|
| Neutral | Expressionless | Steady | Facial expression conflicts with anger |
| Happiness | Distinguishable | Decrease | Sufficient with facial expression only |
| Surprise | Distinguishable | Increase | Sufficient with facial expression only |
| Sadness | Relatively hard to distinguish | Increase | No conflict, but hard to recognize from facial expression |
| Anger | Expressionless | Increase | Facial expression conflicts with neutral |
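The constraints in this table imply a simple decision rule: the face alone suffices for happiness and surprise, while the expressionless conflict between neutral and anger, and the hard-to-read sadness case, fall back on the HR trend. Below is a minimal sketch of that rule; the string labels, the decision order, and the fallback are illustrative assumptions rather than the paper's recognizer.

```python
def fuse(facial: str, hr_trend: str) -> str:
    """Combine a facial-expression label with an HR trend
    ('steady' | 'increase' | 'decrease') following the table above.

    Labels and decision order are illustrative assumptions; the
    paper's actual recognizer is not reproduced here.
    """
    # Distinguishable expressions: the face alone is sufficient.
    if facial in ("happiness", "surprise"):
        return facial
    # Expressionless face: the neutral/anger conflict is resolved by HR.
    if facial == "expressionless":
        return "anger" if hr_trend == "increase" else "neutral"
    # Sadness is hard to read from the face; require HR support.
    if facial == "sadness" and hr_trend == "increase":
        return "sadness"
    return "neutral"  # conservative fallback

assert fuse("expressionless", "increase") == "anger"
assert fuse("expressionless", "steady") == "neutral"
```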
| Actual \ Recognized | Happiness (IDE) | Happiness (ITE) | Sadness (IDE) | Sadness (ITE) | Anger (IDE) | Anger (ITE) | Surprise (IDE) | Surprise (ITE) | Neutral (IDE) | Neutral (ITE) |
|---|---|---|---|---|---|---|---|---|---|---|
| Happiness | 0.95 | 0.72 | 0 | 0 | 0.05 | 0.18 | 0 | 0 | 0 | 0.09 |
| Sadness | 0 | 0 | 0.86 | 0.77 | 0.09 | 0.07 | 0 | 0 | 0.05 | 0.13 |
| Anger | 0 | 0 | 0.09 | 0.05 | 0.90 | 0.80 | 0 | 0 | 0 | 0.15 |
| Surprise | 0 | 0 | 0 | 0 | 0.05 | 0.05 | 0.82 | 0.86 | 0.14 | 0.09 |
| Neutral | 0 | 0 | 0.05 | 0.05 | 0 | 0 | 0 | 0 | 0.95 | 0.95 |
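Averaging the diagonal of this confusion matrix summarizes per-condition accuracy. A quick sketch over the values above; macro averaging is our choice here, and the paper may aggregate differently.

```python
# Diagonal (actual == recognized) values from the confusion matrix above.
diag = {
    "happiness": {"IDE": 0.95, "ITE": 0.72},
    "sadness":   {"IDE": 0.86, "ITE": 0.77},
    "anger":     {"IDE": 0.90, "ITE": 0.80},
    "surprise":  {"IDE": 0.82, "ITE": 0.86},
    "neutral":   {"IDE": 0.95, "ITE": 0.95},
}

for cond in ("IDE", "ITE"):
    macro = sum(v[cond] for v in diag.values()) / len(diag)
    print(f"{cond} macro accuracy: {macro:.3f}")
# IDE macro accuracy: 0.896
# ITE macro accuracy: 0.820
```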