The Meso-Expression Test (MET): A Novel Assessment of Emotion Perception
Abstract
1. Introduction
1.1. Emotion Perception as the Primary Facet of Emotional Intelligence
1.2. Meso-Expressions
1.3. Emotions Are a ‘Fuzzy Set’
1.4. Concealed Emotions
1.5. Reducing Bias in Emotion Perception Assessments
1.6. Reducing Methodological Bias
2. The Present Research
3. Study 1
3.1. Methods
3.1.1. Expressers/Actors
3.1.2. Emotion Induction
3.1.3. Recording and Technical Procedures
3.1.4. Validation of Stimuli
3.1.5. Participants
3.1.6. Procedures
3.1.7. Ratings
3.1.8. Analyses
3.2. Results
3.3. Discussion
4. Study 2
4.1. Methods
4.1.1. Participants
4.1.2. Selection of Distractor Items
4.1.3. IRT Scoring
4.1.4. Measurement Equivalence
4.1.5. Results
4.1.6. Nonverbal Displays
4.1.7. Verbal Displays
4.1.8. Concealed Displays
4.1.9. Measurement Equivalence
4.2. Discussion
5. Study 3
5.1. Methods
5.1.1. Participants
5.1.2. Measures
5.2. Results
5.3. Discussion
6. Study 4
6.1. Methods
6.1.1. Participants
6.1.2. Measures
6.2. Results
Gender
6.3. Discussion
7. General Discussion
7.1. Limitations and Future Directions
7.2. Practical Contributions and Research Implications
8. Conclusions
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Notes

1. The most popular assessment of EI is the Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT; Mayer 2002), which has over 3000 citations and evidence of validity across cultures. Assessments of emotion perception include the Japanese and Caucasian Facial Expressions of Emotion instrument (JACFEE; Matsumoto and Ekman 1988), the Diagnostic Analysis of Nonverbal Accuracy (DANVA; Nowicki and Duke 1994), and, more recently, the Geneva Emotion Recognition Test (GERT; Schlegel and Scherer 2016). These assessments all examine macro-expressions.

2. In this study, we were interested in both male and female expressors and included actors who identified as male or female. We also included actors who identified as Asian, Black, Hispanic, and White. We note that other gender and racial identities exist and that the groups included in our study were a starting point for better understanding the interplay of intersectionality in emotion perception ability.

3. Each participant rated 4 randomly selected nonverbal stimuli, 3 randomly selected verbal stimuli, and 1 randomly selected concealed stimulus from each of the four actor groups (Caucasian, African-American, Asian, and Hispanic), for 32 stimuli in total.
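As a concrete illustration of the sampling described in Note 3, the following minimal Python sketch (illustrative only; the actual assignment was handled by the survey software, and the stimulus pools and IDs below are hypothetical) draws one participant's 16 nonverbal, 12 verbal, and 4 concealed stimuli from pools keyed by display type and actor group.

```python
import random

# Hypothetical stimulus pools: {(display_type, actor_group): [stimulus IDs]}
pools = {
    (display, group): [f"{display[:2]}_{group[:2]}_{i}" for i in range(20)]
    for display in ("nonverbal", "verbal", "concealed")
    for group in ("Caucasian", "African-American", "Asian", "Hispanic")
}

# Number of stimuli drawn per actor group for each display type (from Note 3).
draws_per_group = {"nonverbal": 4, "verbal": 3, "concealed": 1}

def assign_stimuli(seed=None):
    """Return one participant's randomly assigned stimulus set (16 + 12 + 4 = 32 items)."""
    rng = random.Random(seed)
    assignment = []
    for (display, group), stimuli in pools.items():
        assignment.extend(rng.sample(stimuli, draws_per_group[display]))
    return assignment

print(len(assign_stimuli(seed=1)))  # 32
```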
4. Some have argued that perceivers may simply select the same emotion label repeatedly, which can lead to artificially high recognition rates above chance (Laukka et al. 2016; Wagner 1993; see also Gendron et al. 2018). For example, if perceivers always select ‘anger,’ all anger stimuli will be recognized above chance. Research shows, however, that such perceivers are actually failing to distinguish anger from other emotions (Elfenbein and Ambady 2003). In this study, we were interested in patterns of confusion between emotions that may share phenomenological and cognitive-appraisal similarities. In fact, during assessment development we intentionally included emotions that are easily confused (e.g., embarrassment and shame), with the intention of both (1) increasing the difficulty of the test and (2) giving perceivers credit for answers that are close to the intended emotion (i.e., embarrassment may be a plausible answer for a shame stimulus). This Graded Response Model approach, which we elaborate on in Study 2, treats responses as falling on a continuum of correctness (as opposed to a correct/incorrect dichotomy). Because different discrete emotions share phenomenological and appraisal similarities, we believe a graded approach is more appropriate for our study than controlling for confusions or false positives. In other words, we treat confusions between similar emotions as evidence of emotion perception ability and as evidence that emotion concepts are a fuzzy set.
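To illustrate the scoring logic behind the Graded Response Model (Samejima 1969) mentioned in Note 4, the sketch below computes ordered-category probabilities for a single hypothetical item, where the categories are ordered by closeness to the intended emotion. The discrimination and threshold parameters are assumed values for illustration, not MET estimates.

```python
import numpy as np

def grm_category_probs(theta, a, b):
    """Samejima (1969) graded response model.

    theta : ability level
    a     : item discrimination
    b     : increasing thresholds for the K ordered categories (length K - 1)

    Returns the probability of each of the K ordered response categories,
    e.g., 0 = implausible distractor, 1 = near-miss emotion, 2 = intended emotion.
    """
    b = np.asarray(b, dtype=float)
    # Boundary curves P*(X >= k), with P*(X >= 0) = 1 and P*(X >= K) = 0.
    p_star = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    boundaries = np.concatenate(([1.0], p_star, [0.0]))
    return boundaries[:-1] - boundaries[1:]

# Illustrative item with 3 ordered categories and assumed parameters.
print(grm_category_probs(theta=0.5, a=1.2, b=[-1.0, 0.8]))
```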
5. Because the expressors are concealing their true emotion, we expected that perceivers would view these displays as less authentic on our believability rating.

6. These results are for the nonverbal and verbal stimuli. The concealed displays were also rated as moderate in intensity (2.73 out of 5) but, as expected, low on believability (1.05 out of 5), because these displays involve concealment of emotions.
7. We calculated the total amount of information provided by each item by integrating the item information function from θ = −3 to θ = +3. This area under the item information curve allows us to compare the relative value of each item in the test and determine which items contribute the most information and, therefore, should be retained.
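As a minimal sketch of the computation in Note 7, assuming for simplicity a dichotomous 2PL item rather than the graded items used here, the area under the item information curve can be approximated numerically:

```python
import numpy as np

def item_information_2pl(theta, a, b):
    """Item information for a 2PL item: I(theta) = a^2 * P(theta) * (1 - P(theta))."""
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    return a**2 * p * (1.0 - p)

# Evaluate the information curve on a grid from theta = -3 to theta = +3
# and approximate the area under it with the trapezoid rule.
theta = np.linspace(-3, 3, 601)
a, b = 1.2, 0.3  # assumed (purely illustrative) item parameters
info = item_information_2pl(theta, a, b)
total_information = float(np.sum((info[:-1] + info[1:]) / 2 * np.diff(theta)))
print(round(total_information, 2))
```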
8. We use the mean item information as a starting place to examine whether an item relatively improves the information value of our assessment, rather than as a rule-of-thumb cut-off. See below.
9. Because the chi-square test is sensitive to sample size and may be insensitive to model misfit, it is recommended to examine the ratio between the chi-square and its degrees of freedom (Tay et al. 2011).
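For illustration, this ratio can be computed directly from the item fit statistics reported in the tables below; the cut-off of about 3 referenced in the comment is a common rule of thumb offered here as an assumption, not the criterion used in this paper.

```python
# Chi-square / degrees-of-freedom ratios for a few items from the nonverbal table.
item_fit = {"NV1": (160.27, 141), "NV12": (194.65, 140), "NV28": (197.73, 144)}

for item, (chi_sq, df) in item_fit.items():
    ratio = chi_sq / df
    # Ratios well above ~3 are often read as a sign of notable misfit (assumed heuristic).
    print(f"{item}: chi2/df = {ratio:.2f}")
```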
References
- American Educational Research Association, American Psychological Association, and National Council on Measurement in Education. 2014. Standards for Educational and Psychological Testing. Washington, DC: American Educational Research Association.
- Andrich, David, and Irene Styles. 2011. Distractors with information in multiple choice items: A rationale based on the Rasch model. Journal of Applied Measurement 12: 67–95.
- Bagby, R. Michael, James D. A. Parker, and Graeme J. Taylor. 1994. The twenty-item Toronto Alexithymia Scale—I. Item selection and cross-validation of the factor structure. Journal of Psychosomatic Research 38: 23–32.
- Bänziger, Tanja, Marcello Mortillaro, and Klaus R. Scherer. 2012. Introducing the Geneva Multimodal expression corpus for experimental research on emotion perception. Emotion 12: 1161–79.
- Barrett, Lisa Feldman. 2006. Are emotions natural kinds? Perspectives on Psychological Science 1: 28–58.
- Barrett, Lisa Feldman. 2017a. Functionalism cannot save the classical view of emotion. Social Cognitive and Affective Neuroscience 12: 34–36.
- Barrett, Lisa Feldman. 2017b. How Emotions Are Made: The Secret Life of the Brain. London: Pan Macmillan.
- Barrett, Lisa Feldman, Ralph Adolphs, Stacy Marsella, Aleix M. Martinez, and Seth D. Pollak. 2019. Emotional expressions reconsidered: Challenges to inferring emotion from human facial movements. Psychological Science in the Public Interest 20: 1–68.
- Barsade, Sigal, A. P. Brief, S. E. Spataro, and J. Greenberg. 2003. The affective revolution in organizational behavior: The emergence of a paradigm. In Organizational Behavior: The State of the Science. Edited by J. Greenberg. Mahwah: Lawrence Erlbaum Associates Publishers, pp. 3–52.
- Bijlstra, Gijsbert, Rob W. Holland, and Daniël H. J. Wigboldus. 2010. The social face of emotion recognition: Evaluations versus stereotypes. Journal of Experimental Social Psychology 46: 657–63.
- Blanck, Peter D., Robert Rosenthal, Sara E. Snodgrass, Bella M. DePaulo, and Miron Zuckerman. 1981. Sex differences in eavesdropping on nonverbal cues: Developmental changes. Journal of Personality and Social Psychology 41: 391–96.
- Brackett, Marc A., John D. Mayer, and Rebecca M. Warner. 2004. Emotional intelligence and its relation to everyday behaviour. Personality and Individual Differences 36: 1387–402.
- Brackett, Marc A., Susan E. Rivers, Sara Shiffman, Nicole Lerner, and Peter Salovey. 2006. Relating emotional abilities to social functioning: A comparison of self-report and performance measures of emotional intelligence. Journal of Personality and Social Psychology 91: 780–95.
- Brazeau, Hannah Jasmine. 2021. Being “in Touch”: The Role of Daily Empathic Accuracy and Affectionate Touch Fulfillment in Shaping Well-Being. Ph.D. dissertation, Carleton University, Ottawa, ON, Canada.
- Broderick, Joan E., Esi Morgan DeWitt, Nan Rothrock, Paul K. Crane, and Christopher B. Forrest. 2013. Advances in patient-reported outcomes: The NIH PROMIS® measures. Egems 1: 1015.
- Brody, Leslie R., and Judith A. Hall. 2008. Gender and emotion in context. Handbook of Emotions 3: 395–408.
- Buck, Ross, Stacie R. Powers, and Kyle S. Hull. 2017. Measuring emotional and cognitive empathy using dynamic, naturalistic, and spontaneous emotion displays. Emotion 17: 1120–36.
- Cabello, Rosario, Miguel A. Sorrel, Irene Fernández-Pinto, Natalio Extremera, and Pablo Fernández-Berrocal. 2016. Age and gender differences in ability emotional intelligence in adults: A cross-sectional study. Developmental Psychology 52: 1486.
- Campos, Belinda, Michelle N. Shiota, Dacher Keltner, Gian C. Gonzaga, and Jennifer L. Goetz. 2013. What is shared, what is different? Core relational themes and expressive displays of eight positive emotions. Cognition & Emotion 27: 37–52.
- Coan, James, and John J. B. Allen. 2007. Handbook of Emotion Elicitation and Assessment. Oxford: Oxford University Press.
- Coan, James A., and David A. Sbarra. 2015. Social baseline theory: The social regulation of risk and effort. Current Opinion in Psychology 1: 87–91.
- Cohen, Jacob. 1988. Statistical Power Analysis for the Behavioral Sciences. New York: Routledge Academic.
- Cohen, Sheldon, Tom Kamarck, and Robin Mermelstein. 1994. Perceived stress scale. In Measuring Stress: A Guide for Health and Social Scientists. Oxford: Oxford University Press, vol. 10.
- Cordaro, Daniel Thomas. 2014. Universals and Cultural Variations in Emotional Expression. Ph.D. dissertation, UC Berkeley, Berkeley, CA, USA.
- Cordaro, Daniel T., Rui Sun, Dacher Keltner, Shanmukh Kamble, Niranjan Huddar, and Galen McNeil. 2018. Universals and cultural variations in 22 emotional expressions across five cultures. Emotion 18: 75–93.
- Cordaro, Daniel T., Rui Sun, Shanmukh Kamble, Niranjan Hodder, Maria Monroy, Alan Cowen, Yang Bai, and Dacher Keltner. 2020. The recognition of 18 facial-bodily expressions across nine cultures. Emotion 20: 1292–300.
- Council of National Psychological Associations for the Advancement of Ethnic Minority Interests. 2016. Testing and Assessment with Persons & Communities of Color. Washington, DC: American Psychological Association. Available online: https://www.apa.org/pi/oema (accessed on 1 July 2023).
- Cowen, Alan S., and Dacher Keltner. 2017. Self-report captures 27 distinct categories of emotion bridged by continuous gradients. Proceedings of the National Academy of Sciences 114: E7900–E7909.
- Cowen, Alan S., and Dacher Keltner. 2020. What the face displays: Mapping 28 emotions conveyed by naturalistic expression. American Psychologist 75: 349–64.
- Cowen, Alan S., and Dacher Keltner. 2021. Semantic space theory: A computational approach to emotion. Trends in Cognitive Sciences 25: 124–36.
- Cowen, Alan S., Dacher Keltner, Florian Schroff, Brendan Jou, Hartwig Adam, and Gautam Prasad. 2021. Sixteen facial expressions occur in similar contexts worldwide. Nature 589: 251–57.
- Cronbach, Lee J., and Paul E. Meehl. 1955. Construct validity in psychological tests. Psychological Bulletin 52: 281–302.
- Dael, Nele, Marcello Mortillaro, and Klaus R. Scherer. 2012. Emotion expression in body action and posture. Emotion 12: 1085–101.
- Davis, Mark H. 1980. A multidimensional approach to individual differences in empathy. JSAS Catalog of Selected Documents in Psychology 10: 85.
- Dovidio, John F., and Agata Gluszek. 2012. Accents, nonverbal behavior, and intergroup bias. In The Handbook of Intergroup Communication. London: Routledge, pp. 109–21.
- Ekman, Paul. 1992. An argument for basic emotions. Cognition & Emotion 6: 169–200.
- Ekman, Paul. 1997. What the Face Reveals: Basic and Applied Studies of Spontaneous Expression Using the Facial Action Coding System (FACS). Oxford: Oxford University Press.
- Ekman, Paul. 2003. Emotions Revealed: Recognizing Faces and Feelings to Improve Communication and Emotional Life. New York: Times Books/Henry Holt and Co.
- Ekman, Paul. 2009. Lie catching and microexpressions. The Philosophy of Deception 1: 5.
- Ekman, Paul, and Wallace V. Friesen. 1971. Constants across cultures in the face and emotion. Journal of Personality and Social Psychology 17: 124–29.
- Ekman, Paul, and Wallace V. Friesen. 1974. Nonverbal behavior and psychopathology. In The Psychology of Depression: Contemporary Theory and Research. New York: Halsted Press, pp. 3–31.
- Ekman, Paul, and Wallace V. Friesen. 1986. A new pan-cultural facial expression of emotion. Motivation and Emotion 10: 159–68.
- Ekman, Paul, E. Richard Sorenson, and Wallace V. Friesen. 1969. Pan-cultural elements in facial displays of emotion. Science 164: 86–88.
- Ekman, Paul, Richard J. Davidson, and Wallace V. Friesen. 1990. The Duchenne smile: Emotional expression and brain physiology: II. Journal of Personality and Social Psychology 58: 342–53.
- Ekman, Paul, Wallace V. Friesen, Maureen O’Sullivan, Anthony Chan, Irene Diacoyanni-Tarlatzis, Karl Heider, Rainer Krause, William Ayhan LeCompte, Tom Pitcairn, and Pio Enrico Ricci-Bitti. 1987. Universals and cultural differences in the judgments of facial expressions of emotion. Journal of Personality and Social Psychology 53: 712–17.
- El Haj, Mohamad, Pascal Antoine, and Jean Louis Nandrino. 2016. More emotional facial expressions during episodic than during semantic autobiographical retrieval. Cognitive, Affective, & Behavioral Neuroscience 16: 374–81.
- Elfenbein, Hillary Anger, and Nalini Ambady. 2002a. On the universality and cultural specificity of emotion recognition: A meta-analysis. Psychological Bulletin 128: 203.
- Elfenbein, Hillary Anger, and Nalini Ambady. 2002b. Predicting workplace outcomes from the ability to eavesdrop on feelings. Journal of Applied Psychology 87: 963–71.
- Elfenbein, Hillary Anger, and Nalini Ambady. 2003. Universals and cultural differences in recognizing emotions. Current Directions in Psychological Science 12: 159–64.
- Ellsworth, Phoebe C., and Klaus R. Scherer. 2003. Appraisal processes in emotion. In Handbook of Affective Sciences. Oxford: Oxford University Press, vol. 572, p. V595.
- Fernández-Berrocal, Pablo, and Natalio Extremera. 2016. Ability emotional intelligence, depression, and well-being. Emotion Review 8: 311–15.
- Fischer, Agneta H., and Antony S. R. Manstead. 2016. Social functions of emotion and emotion regulation. In Handbook of Emotions. New York: The Guilford Press, vol. 4, pp. 424–39.
- Frijda, Nico H. 2007. The Laws of Emotion. London: Psychology Press.
- Frijda, Nico H. 2008. The psychologists’ point of view. In Handbook of Emotions. Edited by Michael Lewis, Jeannette M. Haviland Jones and Lisa Barrett Feldman. New York: The Guilford Press, pp. 68–87.
- Furman, Wyndol, and Duane Buhrmester. 2010. Network of Relationships Questionnaire Manual. Unpublished Manuscript. Denver: University of Denver. Dallas: University of Texas at Dallas.
- Gendron, Maria, Carlos Crivelli, and Lisa Feldman Barrett. 2018. Universality reconsidered: Diversity in making meaning of facial expressions. Current Directions in Psychological Science 27: 211–19.
- Gitter, A. George, Harvey Black, and David Mostofsky. 1972. Race and sex in the perception of emotion. Journal of Social Issues 28: 63–78.
- Grandey, Alicia A., and Robert C. Melloy. 2017. The state of the heart: Emotional labor as emotion regulation reviewed and revised. Journal of Occupational Health Psychology 22: 407–22.
- Gregory, Amy J. P., Jason F. Anderson, and Shelly L. Gable. 2020. You don’t know how it feels: Accuracy in emotion perception predicts responsiveness of support. Emotion 20: 343–52.
- Halberstadt, A. G., A. N. Cooke, P. W. Garner, S. A. Hughes, D. Oertwig, and S. D. Neupert. 2020. Racialized emotion recognition accuracy and anger bias of children’s faces. Emotion 22: 403.
- Hall, Judith A., Susan A. Andrzejewski, and Jennelle E. Yopchick. 2009. Psychosocial correlates of interpersonal sensitivity: A meta-analysis. Journal of Nonverbal Behavior 33: 149–80.
- Hall, Judith A., Susan A. Andrzejewski, Nora A. Murphy, Marianne Schmid Mast, and Brian A. Feinstein. 2008. Accuracy of judging others’ traits and states: Comparing mean levels across tests. Journal of Research in Personality 42: 1476–89.
- Hochschild, Arlie. 1983. The Managed Heart. Berkeley: University of California Press.
- Hugenberg, Kurt. 2005. Social categorization and the perception of facial affect: Target race moderates the response latency advantage for happy faces. Emotion 5: 267–76.
- Izard, C. E. 1971. The Face of Emotion. East Norwalk: Appleton-Century-Crofts.
- Jordan, Sarah, Laure Brimbal, D. Brian Wallace, Saul M. Kassin, Maria Hartwig, and Chris NH Street. 2019. A test of the micro-expressions training tool: Does it improve lie detection? Journal of Investigative Psychology and Offender Profiling 16: 222–35.
- Joseph, Dana L., and Daniel A. Newman. 2010. Emotional intelligence: An integrative meta-analysis and cascading model. Journal of Applied Psychology 95: 54–78.
- Joseph, Dana L., Micaela Y. Chan, Samantha J. Heintzelman, Louis Tay, Ed Diener, and Victoria S. Scotney. 2020. The manipulation of affect: A meta-analysis of affect induction procedures. Psychological Bulletin 146: 275–355.
- Juslin, Patrik N., and Petri Laukka. 2003. Communication of emotions in vocal expression and music performance: Different channels, same code? Psychological Bulletin 129: 770–814.
- Keltner, D. 1995. Signs of appeasement: Evidence for the distinct displays of embarrassment, amusement, and shame. Journal of Personality and Social Psychology 68: 441.
- LaPalme, Matthew L., Wei Wang, Dana L. Joseph, Donald H. Saklofske, and Gonggu Yan. 2016. Measurement equivalence of the Wong and Law Emotional Intelligence Scale across cultures: An item response theory approach. Personality and Individual Differences 90: 190–98.
- Laukka, Petri, and Hillary Anger Elfenbein. 2012. Emotion appraisal dimensions can be inferred from vocal expressions. Social Psychological and Personality Science 3: 529–36.
- Laukka, Petri, and Hillary Anger Elfenbein. 2021. Cross-cultural emotion recognition and in-group advantage in vocal expression: A meta-analysis. Emotion Review 13: 3–11.
- Laukka, Petri, Hillary Anger Elfenbein, Nutankumar S. Thingujam, Thomas Rockstuhl, Frederick K. Iraki, Wanda Chui, and Jean Althoff. 2016. The expression and recognition of emotions in the voice across five nations: A lens model analysis based on acoustic features. Journal of Personality and Social Psychology 111: 686–705.
- Le, Bonnie M., and Emily A. Impett. 2016. The costs of suppressing negative emotions and amplifying positive emotions during parental caregiving. Personality and Social Psychology Bulletin 42: 323–36.
- Lee, Richard M., and Steven B. Robbins. 1995. Measuring belongingness: The social connectedness and the social assurance scales. Journal of Counseling Psychology 42: 232–41.
- Levenson, Robert W., Laura L. Carstensen, Wallace V. Friesen, and Paul Ekman. 1991. Emotion, physiology, and expression in old age. Psychology and Aging 6: 28–35.
- Liliana, Dewi Yanti, Tjan Basaruddin, M. Rahmat Widyanto, and Imelda Ika Dian Oriza. 2019. Fuzzy emotion: A natural approach to automatic facial expression recognition from psychological perspective using fuzzy system. Cognitive Processing 20: 391–403.
- Lord, Frederic M. 2012. Applications of Item Response Theory to Practical Testing Problems. London: Routledge.
- Lynn, Spencer K., and Lisa Feldman Barrett. 2014. “Utilizing” signal detection theory. Psychological Science 25: 1663–73.
- MacCann, Carolyn, Yixin Jiang, Luke E. R. Brown, Kit S. Double, Micaela Bucich, and Amirali Minbashian. 2020. Emotional intelligence predicts academic performance: A meta-analysis. Psychological Bulletin 146: 150–86.
- Matsumoto, David. 1992. American-Japanese cultural differences in the recognition of universal facial expressions. Journal of Cross-Cultural Psychology 23: 72–84.
- Matsumoto, David, ed. 2001. The Handbook of Culture and Psychology. Oxford: Oxford University Press.
- Matsumoto, David, and Hyi Sung Hwang. 2011. Reading Facial Expressions of Emotion. APA. Available online: https://www.apa.org/science/about/psa/2011/05/facial-expressions (accessed on 1 July 2023).
- Matsumoto, David, and Paul Ekman. 1988. Japanese and Caucasian Facial Expressions of Emotion (JACFEE) [Slides]. San Francisco: Intercultural and Emotion Research Laboratory, Department of Psychology, San Francisco State University.
- Matsumoto, David, and Hyi Sung Hwang. 2014. Judgments of subtle facial expressions of emotion. Emotion 14: 349–57.
- Matsumoto, David, Jeff LeRoux, Carinda Wilson-Cohn, Jake Raroque, Kristie Kooken, Paul Ekman, Nathan Yrizarry, Sherry Loewinger, Hideko Uchida, Albert Yee, and et al. 2000. A new test to measure emotion recognition ability: Matsumoto and Ekman’s Japanese and Caucasian Brief Affect Recognition Test (JACBART). Journal of Nonverbal Behavior 24: 179–209.
- Mayer, John D. 2002. MSCEIT: Mayer-Salovey-Caruso Emotional Intelligence Test. Toronto: Multi-Health Systems.
- Mayer, John D., David R. Caruso, and Peter Salovey. 2016. The ability model of emotional intelligence: Principles and updates. Emotion Review 8: 290–300.
- Mayer, John D., Richard D. Roberts, and Sigal G. Barsade. 2008. Human abilities: Emotional intelligence. Annual Review of Psychology 59: 507–36.
- Mehrabian, Albert. 1971. Silent Messages. Belmont: Wadsworth, vol. 8.
- Mehrabian, Albert. 2017. Nonverbal Communication. London: Routledge.
- Mehrabian, Albert, and Susan R. Ferris. 1967. Inference of attitudes from nonverbal communication in two channels. Journal of Consulting Psychology 31: 248.
- Nowicki, Stephen, and Marshall P. Duke. 1994. Individual differences in the nonverbal communication of affect: The Diagnostic Analysis of Nonverbal Accuracy Scale. Journal of Nonverbal Behavior 18: 9–35.
- Nye, C. 2011. The Development and Validation of Effect Size Measure for IRT and CFA Studies of Measurement Equivalence. Ph.D. thesis, University of Illinois at Urbana-Champaign, Champaign, IL, USA.
- Olderbak, Sally, Oliver Wilhelm, Andrea Hildebrandt, and Jordi Quoidbach. 2019. Sex differences in facial emotion perception ability across the lifespan. Cognition and Emotion 33: 579–88.
- Perkins, Adam M., Sophie L. Inchley-Mort, Alan D. Pickering, Philip J. Corr, and Adrian P. Burgess. 2012. A facial expression for anxiety. Journal of Personality and Social Psychology 102: 910–24.
- Pietromonaco, Paula R., Bert Uchino, and Christine Dunkel Schetter. 2013. Close relationship processes and health: Implications of attachment theory for health and disease. Health Psychology 32: 499–513.
- Porter, Stephen, and Leanne Ten Brinke. 2008. Reading between the lies: Identifying concealed and falsified emotions in universal facial expressions. Psychological Science 19: 508–14.
- Puccinelli, Nancy M., and Linda Tickle-Degnen. 2004. Knowing too much about others: Moderators of the relationship between eavesdropping and rapport in social interaction. Journal of Nonverbal Behavior 28: 223–43.
- Rasch, George. 1960. Studies in Mathematical Psychology: I. Probabilistic Models for Some Intelligence and Attainment Tests. Copenhagen: Nielsen & Lydiche.
- Rinn, William E. 1984. The neuropsychology of facial expression: A review of the neurological and psychological mechanisms for producing facial expressions. Psychological Bulletin 95: 52–77.
- Rodriguez, Michael C. 2005. Three options are optimal for multiple-choice items: A meta-analysis of 80 years of research. Educational Measurement: Issues and Practice 24: 3–13.
- Rosenthal, Robert, and Donald B. Rubin. 1989. Effect size estimation for one-sample multiple-choice-type data: Design, analysis, and meta-analysis. Psychological Bulletin 106: 332–37.
- Russell, James A. 1980. A circumplex model of affect. Journal of Personality and Social Psychology 39: 1161–78.
- Ryff, Carol D., and Corey Lee M. Keyes. 1995. The structure of psychological well-being revisited. Journal of Personality and Social Psychology 69: 719–27.
- Salovey, Peter. 1992. Mood-induced self-focused attention. Journal of Personality and Social Psychology 62: 699–707.
- Salovey, Peter, and John D. Mayer. 1990. Emotional intelligence. Imagination, Cognition and Personality 9: 185–211.
- Samejima, Fumiko. 1969. Estimation of latent ability using a response pattern of graded scores. In Psychometrika Monograph Supplement. Champaign: Psychometric Society.
- Sánchez-Álvarez, Nicolás, Natalio Extremera, and Pablo Fernández-Berrocal. 2016. The relation between emotional intelligence and subjective well-being: A meta-analytic investigation. The Journal of Positive Psychology 11: 276–85.
- Sbarra, David A., and James A. Coan. 2018. Relationships and health: The critical role of affective science. Emotion Review 10: 40–54.
- Schalet, Benjamin D., Paul A. Pilkonis, Lan Yu, Nathan Dodds, Kelly L. Johnston, Susan Yount, William Riley, and David Cella. 2016. Clinical validity of PROMIS Depression, Anxiety, and Anger across diverse clinical samples. Journal of Clinical Epidemiology 73: 119–27.
- Scherer, K. R., and H. Ellgring. 2007. Multimodal expression of emotion: Affect programs or componential appraisal patterns? Emotion 7: 158–71.
- Schlegel, Katja, and Klaus R. Scherer. 2016. Introducing a short version of the Geneva Emotion Recognition Test (GERT-S): Psychometric properties and construct validation. Behavior Research Methods 48: 1383–92.
- Schlegel, Katja, Didier Grandjean, and Klaus R. Scherer. 2014. Introducing the Geneva emotion recognition test: An example of Rasch-based test development. Psychological Assessment 26: 666–72.
- Shaver, Phillip, Judith Schwartz, Donald Kirson, and Cary O’Connor. 1987. Emotion knowledge: Further exploration of a prototype approach. Journal of Personality and Social Psychology 52: 1061–86.
- Simpson, J. A., M. M. Oriña, and W. Ickes. 2003. When accuracy hurts, and when it helps: A test of the empathic accuracy model in marital interactions. Journal of Personality and Social Psychology 85: 881.
- Stanislavski, Constantin. 1989. An Actor Prepares. London: Routledge.
- Sternberg, Robert J., Linda Jarvin, Damian P. Birney, Adam Naples, Steven E. Stemler, Tina Newman, Renate Otterbach, Carolyn Parish, Judy Randi, and Elena L. Grigorenko. 2014. Testing the theory of successful intelligence in teaching grade 4 language arts, mathematics, and science. Journal of Educational Psychology 106: 881–99.
- Sternglanz, R. Weylin, and Bella M. DePaulo. 2004. Reading nonverbal cues to emotions: The advantages and liabilities of relationship closeness. Journal of Nonverbal Behavior 28: 245–66.
- Tamietto, Marco, and Beatrice De Gelder. 2010. Neural bases of the non-conscious perception of emotional signals. Nature Reviews Neuroscience 11: 697–709.
- Tay, Louis, Usama S. Ali, Fritz Drasgow, and Bruce Williams. 2011. Fitting IRT models to dichotomous and polytomous data: Assessing the relative model–data fit of ideal point and dominance models. Applied Psychological Measurement 35: 280–95.
- Tracy, Jessica L., and Richard W. Robins. 2008a. The automaticity of emotion recognition. Emotion 8: 81–95.
- Tracy, Jessica L., and Richard W. Robins. 2008b. The nonverbal expression of pride: Evidence for cross-cultural recognition. Journal of Personality and Social Psychology 94: 516–30.
- Vandenberg, Robert J., and Charles E. Lance. 2000. A review and synthesis of the measurement invariance literature: Suggestions, practices, and recommendations for organizational research. Organizational Research Methods 3: 4–70.
- Vigliocco, Gabriella, Stavroula-Thaleia Kousta, Pasquale Anthony Della Rosa, David P. Vinson, Marco Tettamanti, Joseph T. Devlin, and Stefano F. Cappa. 2014. The neural representation of abstract words: The role of emotion. Cerebral Cortex 24: 1767–77.
- Wagner, Hugh L. 1993. On measuring performance in category judgment studies of nonverbal behavior. Journal of Nonverbal Behavior 17: 3–28.
- Wang, Wei, Louis Tay, and Fritz Drasgow. 2013. Detecting differential item functioning of polytomous items for an ideal point response process. Applied Psychological Measurement 37: 316–35.
- Warren, Gemma, Elizabeth Schertler, and Peter Bull. 2009. Detecting deception from emotional and unemotional cues. Journal of Nonverbal Behavior 33: 59–69.
- Waters, Sara F., Helena Rose Karnilowicz, Tessa V. West, and Wendy Berry Mendes. 2020. Keep it to yourself? Parent emotion suppression influences physiological linkage and interaction behavior. Journal of Family Psychology 34: 784–93.
- Watson, David, Lee Anna Clark, and Auke Tellegen. 1988. Development and validation of brief measures of positive and negative affect: The PANAS scales. Journal of Personality and Social Psychology 54: 1063–70.
- Wonderlic, Inc. 2007. Wonderlic Cognitive Ability Pretest (WPT-Q) Administrator’s Guide. Vernon Hills: Wonderlic, Inc.
- Yan, Wen-Jing, Qi Wu, Jing Liang, Yu-Hsin Chen, and Xiaolan Fu. 2013. How fast are the leaked facial expressions: The duration of micro-expressions. Journal of Nonverbal Behavior 37: 217–30.
- Young, Steven G., Kurt Hugenberg, Michael J. Bernstein, and Donald F. Sacco. 2012. Perception and motivation in face recognition: A critical review of theories of the cross-race effect. Personality and Social Psychology Review 16: 116–42.
- Zadeh, L. A. 1965. Fuzzy sets. Information and Control 8: 338–53.

Emotion | Amu | Ang | Anx | Awe | Bor | Cmp | Con | Dis | Emb | Fea | Joy | Neu | Pri | Rel | Sad | Sha | Sup | Sym |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Amusement (Amu) | 0.92 | 0.03 | 0.12 | 0.26 | 0.06 | 0.27 | 0.63 | 0.07 | 0.30 | 0.05 | 0.88 | 0.33 | 0.39 | 0.39 | 0.02 | 0.09 | 0.21 | 0.09 |
Anger (Ang) | 0.04 | 0.93 | 0.31 | 0.07 | 0.33 | 0.74 | 0.21 | 0.69 | 0.13 | 0.41 | 0.04 | 0.59 | 0.09 | 0.15 | 0.49 | 0.36 | 0.04 | 0.11 |
Anxiety (Anx) | 0.03 | 0.42 | 0.85 | 0.12 | 0.38 | 0.41 | 0.16 | 0.50 | 0.59 | 0.74 | 0.06 | 0.60 | 0.08 | 0.25 | 0.66 | 0.65 | 0.23 | 0.18 |
Awe | 0.49 | 0.06 | 0.51 | 0.76 | 0.24 | 0.35 | 0.43 | 0.26 | 0.23 | 0.51 | 0.49 | 0.75 | 0.24 | 0.60 | 0.23 | 0.18 | 0.80 | 0.20 |
Boredom (Bor) | 0.03 | 0.34 | 0.39 | 0.10 | 0.90 | 0.62 | 0.17 | 0.60 | 0.39 | 0.20 | 0.05 | 0.65 | 0.07 | 0.35 | 0.65 | 0.59 | 0.13 | 0.21 |
Contempt (Cmp) | 0.05 | 0.85 | 0.35 | 0.09 | 0.47 | 0.77 | 0.28 | 0.80 | 0.25 | 0.33 | 0.04 | 0.65 | 0.14 | 0.15 | 0.47 | 0.48 | 0.10 | 0.16 |
Content (Con) | 0.57 | 0.05 | 0.23 | 0.27 | 0.59 | 0.50 | 0.82 | 0.24 | 0.18 | 0.06 | 0.66 | 0.80 | 0.41 | 0.69 | 0.21 | 0.14 | 0.08 | 0.30 |
Disgust (Dis) | 0.03 | 0.67 | 0.46 | 0.18 | 0.31 | 0.70 | 0.18 | 0.91 | 0.27 | 0.47 | 0.03 | 0.58 | 0.08 | 0.10 | 0.56 | 0.41 | 0.24 | 0.14 |
Embarrassment (Emb) | 0.70 | 0.09 | 0.62 | 0.27 | 0.27 | 0.38 | 0.43 | 0.40 | 0.80 | 0.41 | 0.49 | 0.48 | 0.29 | 0.49 | 0.41 | 0.71 | 0.32 | 0.32 |
Fear (Fea) | 0.03 | 0.48 | 0.78 | 0.26 | 0.19 | 0.35 | 0.10 | 0.49 | 0.31 | 0.89 | 0.05 | 0.58 | 0.04 | 0.20 | 0.57 | 0.31 | 0.65 | 0.17 |
Joy | 0.82 | 0.02 | 0.22 | 0.34 | 0.09 | 0.29 | 0.74 | 0.06 | 0.22 | 0.09 | 0.89 | 0.55 | 0.49 | 0.54 | 0.03 | 0.03 | 0.19 | 0.11 |
Neutral (Neu) | 0.01 | 0.46 | 0.26 | 0.14 | 0.67 | 0.54 | 0.35 | 0.32 | 0.04 | 0.26 | 0.06 | 0.96 | 0.12 | 0.13 | 0.35 | 0.14 | 0.18 | 0.12 |
Pride (Pri) | 0.72 | 0.11 | 0.35 | 0.31 | 0.17 | 0.42 | 0.78 | 0.12 | 0.24 | 0.19 | 0.69 | 0.66 | 0.78 | 0.68 | 0.05 | 0.07 | 0.20 | 0.18 |
Relief (Rel) | 0.11 | 0.09 | 0.51 | 0.16 | 0.72 | 0.43 | 0.38 | 0.47 | 0.22 | 0.30 | 0.14 | 0.56 | 0.10 | 0.92 | 0.50 | 0.34 | 0.14 | 0.31 |
Sadness (Sad) | 0.03 | 0.19 | 0.49 | 0.12 | 0.43 | 0.39 | 0.17 | 0.45 | 0.33 | 0.54 | 0.08 | 0.71 | 0.04 | 0.15 | 0.92 | 0.63 | 0.11 | 0.42 |
Shame (Sha) | 0.03 | 0.30 | 0.49 | 0.04 | 0.32 | 0.40 | 0.12 | 0.54 | 0.68 | 0.46 | 0.07 | 0.47 | 0.03 | 0.08 | 0.84 | 0.87 | 0.11 | 0.30 |
Surprise (Sup) | 0.45 | 0.02 | 0.39 | 0.71 | 0.22 | 0.24 | 0.37 | 0.21 | 0.23 | 0.48 | 0.49 | 0.51 | 0.10 | 0.70 | 0.13 | 0.09 | 0.91 | 0.13 |
Sympathy (Sym) | 0.04 | 0.16 | 0.53 | 0.15 | 0.55 | 0.47 | 0.30 | 0.40 | 0.38 | 0.50 | 0.05 | 0.70 | 0.04 | 0.34 | 0.83 | 0.62 | 0.13 | 0.77 |

Emotion | Amu | Ang | Anx | Awe | Bor | Cmp | Con | Dis | Emb | Fea | Joy | Neu | Pri | Rel | Sad | Sha | Sup | Sym |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Amusement (Amu) | 0.78 | 0.08 | 0.33 | 0.26 | 0.36 | 0.30 | 0.56 | 0.17 | 0.28 | 0.27 | 0.82 | 0.80 | 0.32 | 0.41 | 0.49 | 0.13 | 0.49 | 0.19 |
Anger (Ang) | 0.20 | 0.59 | 0.38 | 0.13 | 0.66 | 0.63 | 0.53 | 0.45 | 0.22 | 0.27 | 0.33 | 0.89 | 0.40 | 0.31 | 0.50 | 0.25 | 0.13 | 0.40 |
Anxiety (Anx) | 0.17 | 0.09 | 0.79 | 0.19 | 0.49 | 0.39 | 0.50 | 0.25 | 0.45 | 0.66 | 0.35 | 0.83 | 0.20 | 0.45 | 0.49 | 0.34 | 0.44 | 0.39 |
Awe | 0.19 | 0.03 | 0.61 | 0.49 | 0.39 | 0.27 | 0.56 | 0.17 | 0.27 | 0.53 | 0.57 | 0.83 | 0.29 | 0.57 | 0.38 | 0.28 | 0.66 | 0.58 |
Boredom (Bor) | 0.17 | 0.21 | 0.45 | 0.19 | 0.87 | 0.52 | 0.43 | 0.41 | 0.25 | 0.29 | 0.19 | 0.84 | 0.24 | 0.36 | 0.61 | 0.33 | 0.17 | 0.21 |
Contempt (Cmp) | 0.18 | 0.27 | 0.42 | 0.22 | 0.73 | 0.56 | 0.59 | 0.43 | 0.20 | 0.29 | 0.34 | 0.88 | 0.26 | 0.35 | 0.57 | 0.28 | 0.34 | 0.42 |
Content (Con) | 0.29 | 0.08 | 0.39 | 0.31 | 0.62 | 0.40 | 0.67 | 0.16 | 0.26 | 0.28 | 0.50 | 0.90 | 0.27 | 0.43 | 0.58 | 0.25 | 0.21 | 0.50 |
Disgust (Dis) | 0.16 | 0.26 | 0.48 | 0.15 | 0.69 | 0.59 | 0.50 | 0.46 | 0.30 | 0.35 | 0.31 | 0.89 | 0.25 | 0.33 | 0.60 | 0.28 | 0.25 | 0.34 |
Embarrassment (Emb) | 0.41 | 0.09 | 0.65 | 0.26 | 0.64 | 0.41 | 0.53 | 0.26 | 0.48 | 0.57 | 0.52 | 0.83 | 0.21 | 0.41 | 0.56 | 0.34 | 0.30 | 0.43 |
Fear (Fea) | 0.18 | 0.09 | 0.77 | 0.25 | 0.43 | 0.22 | 0.44 | 0.15 | 0.37 | 0.77 | 0.31 | 0.81 | 0.19 | 0.48 | 0.62 | 0.33 | 0.38 | 0.43 |
Joy | 0.67 | 0.06 | 0.40 | 0.23 | 0.39 | 0.31 | 0.63 | 0.13 | 0.19 | 0.21 | 0.80 | 0.85 | 0.40 | 0.39 | 0.35 | 0.17 | 0.45 | 0.39 |
Neutral (Neu) | 0.23 | 0.18 | 0.18 | 0.19 | 0.58 | 0.42 | 0.68 | 0.26 | 0.13 | 0.08 | 0.48 | 0.94 | 0.39 | 0.34 | 0.40 | 0.14 | 0.18 | 0.36 |
Pride (Pri) | 0.35 | 0.20 | 0.27 | 0.20 | 0.57 | 0.47 | 0.66 | 0.29 | 0.11 | 0.16 | 0.66 | 0.90 | 0.51 | 0.29 | 0.51 | 0.21 | 0.35 | 0.30 |
Relief (Rel) | 0.31 | 0.15 | 0.59 | 0.29 | 0.69 | 0.47 | 0.50 | 0.38 | 0.31 | 0.45 | 0.51 | 0.75 | 0.26 | 0.73 | 0.54 | 0.30 | 0.31 | 0.47 |
Sadness (Sad) | 0.07 | 0.09 | 0.63 | 0.14 | 0.60 | 0.35 | 0.35 | 0.21 | 0.34 | 0.70 | 0.18 | 0.82 | 0.12 | 0.30 | 0.81 | 0.48 | 0.20 | 0.51 |
Shame (Sha) | 0.12 | 0.07 | 0.58 | 0.19 | 0.71 | 0.37 | 0.46 | 0.24 | 0.40 | 0.57 | 0.21 | 0.85 | 0.14 | 0.36 | 0.72 | 0.46 | 0.23 | 0.48 |
Surprise (Sup) | 0.58 | 0.05 | 0.46 | 0.37 | 0.38 | 0.22 | 0.58 | 0.12 | 0.22 | 0.34 | 0.80 | 0.77 | 0.36 | 0.51 | 0.23 | 0.12 | 0.76 | 0.34 |
Sympathy (Sym) | 0.17 | 0.06 | 0.45 | 0.25 | 0.53 | 0.37 | 0.59 | 0.19 | 0.29 | 0.53 | 0.42 | 0.86 | 0.18 | 0.45 | 0.64 | 0.33 | 0.36 | 0.70 |

Emotion | Amu | Ang | Anx | Awe | Bor | Cmp | Con | Dis | Emb | Fea | Joy | Neu | Pri | Rel | Sad | Sha | Sup | Sym |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Concealed emotion | ||||||||||||||||||
Anger (Ang) | 0.61 | 0.33 | 0.22 | 0.17 | 0.45 | 0.40 | 0.75 | 0.34 | 0.15 | 0.15 | 0.85 | 0.80 | 0.20 | 0.55 | 0.26 | 0.12 | 0.19 | 0.24 |
Content (Con) | 0.38 | 0.81 | 0.29 | 0.16 | 0.50 | 0.62 | 0.50 | 0.58 | 0.19 | 0.21 | 0.56 | 0.82 | 0.20 | 0.37 | 0.33 | 0.22 | 0.37 | 0.26 |
Sadness (Sad) | 0.71 | 0.10 | 0.15 | 0.18 | 0.22 | 0.22 | 0.71 | 0.13 | 0.17 | 0.11 | 0.92 | 0.65 | 0.15 | 0.49 | 0.40 | 0.15 | 0.17 | 0.29 |
Joy | 0.44 | 0.26 | 0.36 | 0.15 | 0.47 | 0.37 | 0.61 | 0.31 | 0.33 | 0.34 | 0.70 | 0.85 | 0.17 | 0.38 | 0.72 | 0.34 | 0.27 | 0.39 |
Displayed emotion | ||||||||||||||||||
Content (Con) | 0.48 | 0.76 | 0.64 | 0.10 | 0.58 | 0.57 | 0.20 | 0.67 | 0.39 | 0.58 | 0.50 | 0.35 | 0.11 | 0.26 | 0.73 | 0.34 | 0.28 | 0.09 |
Anger (Ang) | 0.62 | 0.82 | 0.51 | 0.14 | 0.48 | 0.58 | 0.27 | 0.66 | 0.32 | 0.56 | 0.63 | 0.40 | 0.11 | 0.26 | 0.52 | 0.29 | 0.43 | 0.07 |
Joy | 0.50 | 0.32 | 0.64 | 0.09 | 0.55 | 0.35 | 0.24 | 0.41 | 0.49 | 0.55 | 0.56 | 0.36 | 0.11 | 0.28 | 0.89 | 0.47 | 0.24 | 0.14 |
Sadness (Sad) | 0.67 | 0.33 | 0.56 | 0.19 | 0.38 | 0.27 | 0.30 | 0.44 | 0.47 | 0.66 | 0.70 | 0.43 | 0.15 | 0.23 | 0.83 | 0.47 | 0.35 | 0.17 |

Type of Concealed Expression | Mean pi |
---|---|
Felt anger but displayed contentment | 0.23 | |||||||||||||||||
Felt contentment but displayed anger | 0.85 | |||||||||||||||||
Felt sadness but displayed joy | 0.83 | |||||||||||||||||
Felt joy but displayed sadness | 0.95 |
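If “Mean pi” in the table above denotes Rosenthal and Rubin’s (1989) proportion index (this reading, and the use of k = 18 response options, are assumptions made for illustration), the index can be computed from raw accuracy as in the following sketch; it rescales accuracy on a k-alternative judgment so that chance performance always corresponds to .50.

```python
def proportion_index(p_correct, k):
    """Rosenthal and Rubin's (1989) proportion index (pi).

    Converts raw accuracy on a k-alternative judgment task to the equivalent
    proportion correct on a two-alternative task, so chance performance maps
    to 0.50 regardless of the number of response options.
    """
    return p_correct * (k - 1) / (1 + p_correct * (k - 2))

# Chance-level accuracy with 18 response options maps to 0.50:
print(round(proportion_index(1 / 18, k=18), 2))  # 0.5
```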

Item Name | Emotion | Item Information | Factor Loadings | χ² | d.f. | p | χ²/d.f. |
---|---|---|---|---|---|---|---|
NV1 | Amusement | 1.26 | 0.54 | 160.27 | 141 | 0.13 | 1.14 |
NV2 | Amusement | 0.94 | 0.48 | 154.68 | 140 | 0.19 | 1.1 |
NV3 | Amusement | 0.95 | 0.48 | 162.45 | 143 | 0.13 | 1.14 |
NV4 | Amusement | 0.73 | 0.38 | 156.58 | 158 | 0.52 | 0.99 |
NV5 | Amusement | 0.93 | 0.44 | 153.74 | 157 | 0.56 | 0.98 |
NV6 | Amusement | 0.99 | 0.46 | 173.61 | 146 | 0.06 | 1.19 |
NV7 | Boredom | 0.72 | 0.42 | 171.35 | 148 | 0.09 | 1.16 |
NV8 | Boredom | 0.72 | 0.41 | 169.86 | 155 | 0.2 | 1.1 |
NV9 | Boredom | 0.49 | 0.32 | 169.5 | 163 | 0.35 | 1.04 |
NV10 | Content | 0.59 | 0.35 | 132.62 | 158 | 0.93 | 0.84 |
NV11 | Content | 0.65 | 0.37 | 146.13 | 153 | 0.64 | 0.96 |
NV12 | Content | 0.65 | 0.4 | 194.65 | 140 | 0 | 1.39 |
NV13 | Content | 0.83 | 0.42 | 211.77 | 152 | 0 | 1.39 |
NV14 | Disgust | 0.75 | 0.45 | 160.62 | 141 | 0.12 | 1.14 |
NV15 | Disgust | 0.77 | 0.41 | 144.67 | 155 | 0.71 | 0.93 |
NV16 | Embarrassment | 0.61 | 0.38 | 137.34 | 155 | 0.84 | 0.89 |
NV17 | Embarrassment | 0.92 | 0.44 | 151.92 | 152 | 0.49 | 1 |
NV18 | Embarrassment | 0.68 | 0.38 | 170.56 | 152 | 0.14 | 1.12 |
NV19 | Relief | 0.51 | 0.38 | 166.48 | 143 | 0.09 | 1.16 |
NV20 | Relief | 0.65 | 0.46 | 147.98 | 127 | 0.1 | 1.17 |
NV21 | Sad | 0.65 | 0.41 | 131.95 | 139 | 0.65 | 0.95 |
NV22 | Sad | 0.46 | 0.32 | 175.37 | 152 | 0.09 | 1.15 |
NV23 | Shame | 0.74 | 0.39 | 181.41 | 157 | 0.09 | 1.16 |
NV24 | Shame | 0.6 | 0.37 | 160.47 | 149 | 0.25 | 1.08 |
NV25 | Shame | 0.68 | 0.39 | 153.55 | 154 | 0.5 | 1 |
NV26 | Surprise | 0.37 | 0.29 | 161.45 | 158 | 0.41 | 1.02 |
NV27 | Surprise | 0.74 | 0.41 | 128.21 | 153 | 0.93 | 0.84 |
NV28 | Sympathy | 1.05 | 0.5 | 197.73 | 144 | 0 | 1.37 |
NV29 | Sympathy | 0.77 | 0.41 | 143.07 | 149 | 0.62 | 0.96 |
NV30 | Sympathy | 0.9 | 0.47 | 126.54 | 142 | 0.82 | 0.89 |
NV31 | Sympathy | 0.84 | 0.41 | 145.04 | 155 | 0.71 | 0.94 |

Item Name | Emotion | Item Information | Factor Loadings | χ² | d.f. | p | χ²/d.f. |
---|---|---|---|---|---|---|---|
V2 | Amusement | 0.49 | 0.31 | 73.37 | 53 | 0.03 | 1.38 |
V3 | Amusement | 0.32 | 0.27 | 77.53 | 52 | 0.01 | 1.49 |
V5 | Anger | 0.87 | 0.45 | 61.88 | 51 | 0.14 | 1.21 |
V6 | Anger | 1.10 | 0.47 | 78.89 | 52 | 0.01 | 1.52 |
V9 | Boredom | 0.89 | 0.45 | 56.5 | 50 | 0.24 | 1.13 |
V10 | Boredom | 0.65 | 0.37 | 46.21 | 53 | 0.73 | 0.87 |
V13 | Content | 0.35 | 0.27 | 48.11 | 54 | 0.70 | 0.89 |
V14 | Content | 0.46 | 0.33 | 75.84 | 49 | 0.01 | 1.55 |
V17 | Disgust | 0.93 | 0.46 | 66.34 | 51 | 0.07 | 1.30 |
V19 | Disgust | 1.27 | 0.51 | 63.87 | 48 | 0.06 | 1.33 |
V22 | Embarrassment | 0.49 | 0.31 | 59.12 | 54 | 0.29 | 1.09 |
V23 | Embarrassment | 0.55 | 0.33 | 52.56 | 53 | 0.49 | 0.99 |
V25 | Fear | 0.80 | 0.39 | 54.98 | 52 | 0.36 | 1.06 |
V28 | Fear | 0.42 | 0.29 | 60.83 | 55 | 0.27 | 1.11 |
V33 | Sadness | 0.63 | 0.36 | 74.03 | 53 | 0.03 | 1.40 |
V34 | Sadness | 0.25 | 0.22 | 61.67 | 55 | 0.25 | 1.12 |
V38 | Surprise | 0.83 | 0.42 | 60.87 | 52 | 0.19 | 1.17 |
V39 | Surprise | 0.78 | 0.4 | 94.03 | 53 | 0.00 | 1.77 |
V41 | Sympathy | 0.97 | 0.44 | 60.5 | 52 | 0.20 | 1.16 |
V42 | Sympathy | 0.57 | 0.35 | 53.94 | 54 | 0.48 | 1.00 |

Item Name | Emotion | Item Information | Factor Loadings | χ² | d.f. | p | χ²/d.f. |
---|---|---|---|---|---|---|---|
C1 | Anger X Content | 1.66 | 0.54 | 77.15 | 67 | 0.19 | 1.15 |
C2 | Anger X Content | 0.48 | 0.32 | 79.24 | 64 | 0.09 | 1.24 |
C3 | Anger X Content | 0.95 | 0.42 | 75.04 | 61 | 0.11 | 1.23 |
C4 | Anger X Content | 0.98 | 0.44 | 85.39 | 60 | 0.02 | 1.42 |
C5 | Anger X Content | 0.96 | 0.42 | 79.68 | 67 | 0.14 | 1.19 |
C6 | Content X Anger | 0.77 | 0.39 | 153.81 | 76 | 0.00 | 2.02 |
C7 | Joy X Sadness | 0.83 | 0.40 | 158.21 | 79 | 0.00 | 2.00 |
C8 | Joy X Sadness | 0.52 | 0.32 | 155.54 | 78 | 0.00 | 1.99 |
C9 | Sad X Joy | 1.75 | 0.57 | 79.24 | 62 | 0.07 | 1.28 |
C10 | Sad X Joy | 1.58 | 0.54 | 75.31 | 57 | 0.05 | 1.32 |
C11 | Sad X Joy | 1.10 | 0.46 | 63.79 | 67 | 0.59 | 0.95 |
C12 | Sad X Joy | 1.52 | 0.53 | 92.94 | 72 | 0.05 | 1.29 |
C13 | Sad X Joy | 1.22 | 0.48 | 72.75 | 64 | 0.21 | 1.14 |

Comparison | d-dif |
---|---|
Non-Verbal | |
Asian versus Hispanic | 0.10 |
Black versus Hispanic | 0.11 |
Black versus Asian | 0.11 |
White versus Black | 0.20 |
White versus Hispanic | 0.13 |
White versus Asian | 0.14 |
Men versus Women | 0.12 |
Verbal | |
Asian versus Hispanic | 0.29 |
Black versus Hispanic | 0.14 |
Black versus Asian | 0.34 |
White versus Black | 0.11 |
White versus Hispanic | 0.23 |
White versus Asian | 0.40 |
Men versus Women | 0.12 |
Concealed | |
Asian versus Hispanic | 0.15 |
Black versus Hispanic | 0.16 |
Black versus Asian | 0.17 |
White versus Black | 0.23 |
White versus Hispanic | 0.12 |
White versus Asian | 0.19 |
Men versus Women | 0.08 |

Variable | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 |
---|---|---|---|---|---|---|---|---|---|
1. MET total score | 1 | ||||||||
2. MET Non-Verbal | 0.85 * | 1 | |||||||
3. MET Concealed | 0.82 * | 0.48 * | 1 | ||||||
4. MET Verbal | 0.78 * | 0.51 * | 0.50 * | 1 | |||||
5. GERT | 0.68 * | 0.59 * | 0.53 * | 0.52 * | 1 | ||||
6. DANVA total score | 0.59 * | 0.50 * | 0.41 * | 0.55 * | 0.62 * | 1 | |||
7. DANVA faces | 0.39 * | 0.34 * | 0.26 * | 0.34 * | 0.41 * | 0.73 * | 1 | ||
8. DANVA voices | 0.55 * | 0.45 * | 0.40 * | 0.52 * | 0.56 * | 0.86 * | 0.29 * | 1 | |
9. WPTQ | 0.36 * | 0.34 * | 0.27 * | 0.29 * | 0.46 * | 0.25 * | 0.17 * | 0.23 * | 1 |
10. Age | −0.47 * | −0.35 * | −0.32 * | −0.50 * | −0.45 * | −0.29 * | −0.13 | −0.32 * | −0.32 * |

Variable | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1. MET total score | 1 | ||||||||||||||
2. MET Non-Verbal | 0.92 * | 1 | |||||||||||||
3. MET Concealed | 0.82 * | 0.57 * | 1 | ||||||||||||
4. MET Verbal | 0.83 * | 0.67 * | 0.49 * | 1 | |||||||||||
5. Social well-being | 0.16 * | 0.19 * | 0.06 | 0.12 * | 1 | ||||||||||
6. Empathic con. | 0.38 * | 0.38 * | 0.24 * | 0.31 * | 0.46 * | 1 | |||||||||
7. Social connect. | 0.28 * | 0.29 * | 0.17 * | 0.23 * | 0.59 * | 0.36 * | 1 | ||||||||
8. Alexithymia | −0.42 * | −0.42 * | −0.24 * | −0.37 * | −0.49 * | −0.41 * | −0.75 * | 1 | |||||||
9. Stress | −0.13 * | −0.17 * | −0.05 | −0.09 * | −0.42 * | −0.13 * | −0.55 * | 0.56 * | 1 | ||||||
10. Depression | −0.33 * | −0.35 * | −0.19 * | −0.27 * | −0.41 * | −0.27 * | −0.70 * | 0.68 * | 0.65 * | 1 | |||||
11. Anxiety | −0.33 * | −0.34 * | −0.20 * | −0.27 * | −0.37 * | −0.23 * | −0.67 * | 0.65 * | 0.66 * | 0.87 * | 1 | ||||
12. Positive rel. | −0.10 * | −0.09 * | −0.07 * | −0.10 * | 0.36 * | 0.19 * | 0.01 | 0.07 | −0.01 | 0.10 * | 0.13 * | 1 |
13. Negative rel. | −0.54 * | −0.53 * | −0.34 * | −0.48 * | −0.23 * | −0.35 * | −0.53 * | 0.61 * | 0.35 * | 0.57 * | 0.55 * | 0.26 * | 1 | ||
14. Age | 0.15 * | 0.24 * | −0.03 | 0.11 * | 0.17 * | 0.09 * | 0.17 * | 0.21 * | −0.29 * | −0.29 * | −0.28 * | −0.02 | −0.29 * | 1 |
15. Gender | −0.30 * | −0.26 * | −0.17 * | −0.26 * | −0.07 * | −0.18 * | −0.11 * | 0.20 * | −0.02 | 0.10 * | 0.10 * | 0.08 * | 0.26 * | 0.09 * | 1 |

Outcome and Predictor | β | R² |
---|---|---|
Social Well-being | ||
MET score | 0.35 * | |
Gender | −0.14 * | |
Gender × MET score | −0.26 † | 0.03 |
Empathic concern | ||
MET score | 0.59 * | |
Gender | −0.20 * | |
Gender × MET score | −0.30 * | 0.16 |
Social connectedness | ||
MET score | 0.20 † | |
Gender | 0.01 | |
Gender × MET score | 0.10 | 0.08 |
Alexithymia | ||
MET score | −0.30 * | |
Gender | 0.05 | |
Gender × MET score | −0.12 | 0.18 |
Stress | ||
MET score | 0.24 * | |
Gender | −0.26 * | |
Gender × MET score | −0.51 * | 0.04 |
Depression | ||
MET score | 0.02 | |
Gender | −0.17 * | |
Gender × MET score | −0.44 * | 0.12 |
Anxiety | ||
MET score | −0.03 | |
Gender | −0.14 * | |
Gender × MET score | −0.38 * | 0.12 |
Positive relationships | ||
MET score | 0.12 | |
Gender | −0.08 | |
Gender × MET score | −0.34 * | 0.02 |
Negative relationships | ||
MET score | −0.31 * | |
Gender | 0.01 | |
Gender × MET score | −0.26 * | 0.31 |
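The gender-moderated effects in the table above correspond to regression models that include a Gender × MET interaction term. A minimal sketch of fitting such a model with statsmodels is shown below; the variable names, toy data, and gender coding are hypothetical, not the authors' exact pipeline.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data frame with one row per participant.
df = pd.DataFrame({
    "met": [0.2, -1.1, 0.8, 1.4, -0.3, 0.5, -0.9, 1.1],           # standardized MET score
    "gender": [0, 1, 0, 1, 0, 1, 0, 1],                           # 0 = women, 1 = men (assumed coding)
    "empathic_concern": [3.8, 2.9, 4.1, 4.5, 3.2, 3.9, 2.7, 4.3],
})

# Moderated regression: outcome ~ MET + gender + MET x gender interaction.
model = smf.ols("empathic_concern ~ met * gender", data=df).fit()
print(model.params)  # coefficients, including the met:gender interaction term
```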

Variable | r |
---|---|
Women MET scores with | |
Social well-being | 0.21 * |
Empathic concern | 0.40 * |
Social connectedness | 0.25 * |
Alexithymia | −0.35 * |
Stress | 0.00 |
Depression | −0.20 * |
Anxiety | −0.21 * |
Neg Social interactions | −0.46 * |
Pos Social interactions | 0.02 |
Men MET scores with | |
Social well-being | 0.06 |
Empathic concern | 0.28 * |
Social connectedness | 0.27 * |
Alexithymia | −0.41 * |
Depression | −0.43 * |
Stress | −0.33 * |
Anxiety | −0.41 * |
Neg Social interactions | −0.54 * |
Pos Social interactions | −0.19 * |

Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).