Recent Advances in Understanding Facial Expression Processing: New Methods, Measures and Models

A special issue of Behavioral Sciences (ISSN 2076-328X). This special issue belongs to the section "Cognition".

Deadline for manuscript submissions: closed (30 November 2024) | Viewed by 3854

Special Issue Editor


Prof. Dr. Charles Collin
Guest Editor
School of Psychology, University of Ottawa, Ottawa, ON K1N 5Y3, Canada
Interests: face processing; facial expressions; emojis; psychopathology; psychophysics; visual cognition

Special Issue Information

Dear Colleagues,

The human face is arguably the most important visual stimulus we encounter on a daily basis. From each other's physiognomies we extract a wealth of socially vital information regarding identity, health, and attention, among much else. One of the most important types of information we garner from a face concerns a person's emotional state. Researchers have been interested in gaining a scientific understanding of our ability to transmit emotional signals via facial muscle contractions since at least the time of Darwin. During the 20th century, facial expression research saw a number of important advances, including the proposal of universal expressions, the debate over face specialness, and the discovery of micro-expressions. More recent directions in facial expression research have been driven by social events, such as the need to wear face masks and communicate via video chat that emerged from the COVID-19 pandemic, and the emergence of generative AI applications that can quickly render convincing (if often imperfect) images of human faces based solely on short verbal descriptions. Similarly, the move towards greater consideration of diversity, equity, and inclusion has fostered increasing attempts to explore facial expression processing in a way that takes into account differences in ethnicity, culture, and gender. Such new ideas and directions are emerging continually in the facial expression literature, and this Special Issue aims to bring attention to them. These novel concepts include advances in the methodologies, measures, and models we employ in our ongoing endeavor to better understand how we send and receive information about our internal emotional states via our faces.

Prof. Dr. Charles Collin
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Behavioral Sciences is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2200 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • facial expression processing
  • face processing models
  • measures of facial expression recognition performance
  • social cognition
  • generative AI
  • effects of race
  • effects of gender
  • effects of psychopathology

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (6 papers)


Research

11 pages, 620 KiB  
Article
Effects of Explicit Knowledge and Attentional-Perceptual Processing on the Ability to Recognize Fear and Surprise
by Mylène Michaud, Annie Roy-Charland and Mélanie Perron
Behav. Sci. 2025, 15(2), 166; https://doi.org/10.3390/bs15020166 - 2 Feb 2025
Viewed by 380
Abstract
When participants are asked to identify expressed emotions from pictures, fear is often confused with surprise. The present study explored this confusion using one prototype of surprise and three prototypes of fear that varied as a function of distinctive cues in the fear prototype (a cue in the eyebrows, in the mouth, or in both zones). Participants were presented with equal numbers of pictures expressing surprise and fear, and their eye movements were monitored while they decided whether each picture showed fear or surprise. Following each trial, explicit knowledge was assessed by asking whether each of five regions (mouth, nose, eyebrows, eyes, cheeks) was important (yes vs. no) in recognizing the expression. Results revealed that fear with both distinctive cues was recognized most accurately, followed, at similar levels, by the prototype of surprise and fear with a distinctive cue in the mouth; fear with a distinctive cue in the eyebrows was recognized least accurately. Explicit knowledge discriminability results revealed that participants were aware of the relevant areas for each prototype, but not equally so for all prototypes. Specifically, participants judged the eyebrow area as more important when the distinctive cue was in the eyebrows (fear–eyebrow) than when the cue was in the mouth (fear–mouth) or when both cues were present (fear–both). Results are discussed in light of the attentional-perceptual and explicit knowledge limitation hypotheses. Full article

15 pages, 6893 KiB  
Article
Effects of Closed Mouth vs. Exposed Teeth on Facial Expression Processing: An ERP Study
by Nicolas M. Brunet and Alexandra R. Ackerman
Behav. Sci. 2025, 15(2), 163; https://doi.org/10.3390/bs15020163 - 1 Feb 2025
Viewed by 364
Abstract
The current study examines the neural mechanisms underlying facial recognition, focusing on how emotional expression and mouth display modulate event-related potential (ERP) waveforms. Forty-two participants categorized faces by gender in one of two experimental setups: one featuring full-face images and another with cropped faces presented against neutral gray backgrounds. The stimuli included 288 images balanced across gender, race/ethnicity, emotional expression (“Fearful”, “Happy”, “Neutral”), and mouth display (“closed mouth” vs. “open mouth with exposed teeth”). Results revealed that N170 amplitude was significantly greater for open-mouth (exposed teeth) conditions (p < 0.01), independent of emotional expression, and no interaction between emotional expression and mouth display was found. However, the P100 amplitude exhibited a significant interaction between these variables (p < 0.05). Monte Carlo simulations analyzing N170 latency differences showed that fearful faces elicited a faster response than happy and neutral faces, with a 2 ms delay unlikely to occur by chance (p < 0.01). While these findings challenge prior research suggesting that N170 is directly influenced by emotional expression, they also highlight the potential role of emotional intensity as an alternative explanation. This underscores the importance of further studies to disentangle these effects. This study highlights the critical need to control for mouth display when investigating emotional face processing. The results not only refine our understanding of the neural dynamics of face perception but also confirm that the brain processes fearful expressions more rapidly than happy or neutral ones. These insights offer valuable methodological considerations for future neuroimaging research on emotion perception. Full article
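The Monte Carlo approach to latency differences described in this abstract can be illustrated with a generic permutation test: shuffle the condition labels many times and count how often a mean difference at least as large as the observed one arises by chance. This is a minimal sketch under assumed data (the latency values below are hypothetical), not the authors' actual analysis pipeline:

```python
import random

def permutation_test(latencies_a, latencies_b, n_iter=10_000, seed=0):
    """Estimate how often a mean latency difference at least as large as
    the observed one (b minus a) arises when condition labels are shuffled."""
    rng = random.Random(seed)
    observed = (sum(latencies_b) / len(latencies_b)
                - sum(latencies_a) / len(latencies_a))
    pooled = list(latencies_a) + list(latencies_b)
    n_a = len(latencies_a)
    count = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)  # random reassignment of condition labels
        diff = (sum(pooled[n_a:]) / (len(pooled) - n_a)
                - sum(pooled[:n_a]) / n_a)
        if diff >= observed:
            count += 1
    return count / n_iter

# Hypothetical N170 peak latencies (ms): fearful ~2 ms faster than neutral
fearful = [168, 170, 169, 171, 167, 170, 168, 169]
neutral = [171, 172, 170, 173, 169, 172, 171, 170]
p = permutation_test(fearful, neutral)
```

A small resulting p indicates that the observed ~2 ms delay would rarely occur under random labeling, which is the logic behind the abstract's reported p < 0.01.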

11 pages, 790 KiB  
Article
Exploring the Role of Foveal and Extrafoveal Processing in Emotion Recognition: A Gaze-Contingent Study
by Alejandro J. Estudillo
Behav. Sci. 2025, 15(2), 135; https://doi.org/10.3390/bs15020135 - 26 Jan 2025
Viewed by 418
Abstract
Although the eye-tracking technique has been widely used to passively study emotion recognition, no studies have utilised this technique to actively manipulate eye-gaze strategies during the recognition of facial emotions. The present study aims to fill this gap by employing a gaze-contingent paradigm. Observers were asked to determine the emotion displayed by centrally presented upright or inverted faces. Under the window condition, only a single fixated facial feature was available at a time, only allowing for foveal processing. Under the mask condition, the fixated facial feature was masked while the rest of the face remained visible, thereby disrupting foveal processing but allowing for extrafoveal processing. These conditions were compared with a full-view condition. The results revealed that while both foveal and extrafoveal information typically contribute to emotion identification, at a standard conversation distance, the latter alone generally suffices for efficient emotion identification. Full article
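The window and mask conditions described above amount to deciding, on every display refresh, which facial features to draw given the current gaze sample. A minimal sketch of that selection logic, with hypothetical region names and coordinates (not the study's actual stimuli or eye-tracker API):

```python
# Facial feature regions as bounding boxes: (x_min, y_min, x_max, y_max).
# All coordinates are illustrative placeholders.
REGIONS = {
    "left_eye":  (100, 120, 180, 160),
    "right_eye": (220, 120, 300, 160),
    "nose":      (160, 170, 240, 230),
    "mouth":     (150, 250, 250, 300),
}

def fixated_region(gaze_x, gaze_y):
    """Return the feature the observer is currently fixating, if any."""
    for name, (x0, y0, x1, y1) in REGIONS.items():
        if x0 <= gaze_x <= x1 and y0 <= gaze_y <= y1:
            return name
    return None

def visible_regions(gaze_x, gaze_y, condition):
    """Window: only the fixated feature is shown (foveal input only).
    Mask: the fixated feature is hidden (extrafoveal input only)."""
    fixated = fixated_region(gaze_x, gaze_y)
    if condition == "window":
        return {fixated} if fixated else set()
    if condition == "mask":
        return set(REGIONS) - ({fixated} if fixated else set())
    return set(REGIONS)  # full-view condition

shown = visible_regions(170, 140, "mask")  # fixating the left eye
```

In a real experiment this selection would run inside the eye-tracker's sampling loop so the display updates contingently with each new gaze sample.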

15 pages, 923 KiB  
Article
Perceptual, Not Attentional, Guidance Drives Happy Superiority in Complex Visual Search
by Sjoerd M. Stuit, M. Alejandra Pardo Sanchez and David Terburg
Behav. Sci. 2025, 15(2), 124; https://doi.org/10.3390/bs15020124 - 24 Jan 2025
Viewed by 453
Abstract
Emotional facial expressions are thought to attract attention differentially based on their emotional content. While anger is thought to attract the most attention during visual search, happy superiority effects are reported as well. As multiple studies point out confounds associated with such emotional superiority, further investigation into the underlying mechanisms is required. Here, we tested visual search behaviors when searching for angry faces, happy faces, or either happy or angry faces simultaneously using diverse distractors displaying many other expressions. We teased apart visual search behaviors into attentional and perceptual components using eye-tracking data and subsequently predicted these behaviors using low-level visual features of the distractors. The results show an overall happy superiority effect that can be traced back to the time required to identify distractors and targets. Search behavior is guided by task-based, emotion-specific search templates that are reliably predictable based on the spatial frequency content. Thus, when searching, we employ specific templates that drive attentional as well as perceptual elements of visual search. Only the perceptual elements contribute to happy superiority. In conclusion, we show that template-guided search underlies perceptual, but not attentional, happy superiority in visual search. Full article
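Spatial frequency content of the kind used above to predict search behavior can be summarized, in a simple form, as the proportion of Fourier energy falling in concentric frequency bands of an image's amplitude spectrum. A rough NumPy sketch assuming a grayscale image array (an illustration, not the authors' actual feature extraction):

```python
import numpy as np

def spatial_frequency_profile(image, n_bands=4):
    """Summarize an image's spatial-frequency content as the proportion of
    energy in concentric bands of its centered 2D Fourier amplitude spectrum."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image)))
    h, w = spectrum.shape
    cy, cx = h // 2, w // 2
    yy, xx = np.ogrid[:h, :w]
    radius = np.hypot(yy - cy, xx - cx)  # distance from DC component
    max_r = radius.max()
    bands = []
    for i in range(n_bands):
        lo, hi = max_r * i / n_bands, max_r * (i + 1) / n_bands
        mask = (radius >= lo) & (radius < hi)
        bands.append(float(spectrum[mask].sum()))
    total = sum(bands)
    return [b / total for b in bands]  # low- to high-frequency bands

# Hypothetical grayscale "stimulus": a smooth gradient, dominated by low SF
img = np.outer(np.linspace(0, 1, 64), np.linspace(0, 1, 64))
profile = spatial_frequency_profile(img)
```

Band-wise energy vectors like this, computed per distractor image, could then serve as predictors of identification times in a regression or classification model.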

13 pages, 2787 KiB  
Article
Methodological Variations to Explore Conflicting Results in the Existing Literature of Masking Smile Judgment
by Annalie Pelot, Adèle Gallant, Marie-Pier Mazerolle and Annie Roy-Charland
Behav. Sci. 2024, 14(10), 944; https://doi.org/10.3390/bs14100944 - 14 Oct 2024
Viewed by 776
Abstract
Although a smile can serve as an expression of genuine happiness, it can also be generated to conceal negative emotions. The traces of negative emotion present in these types of smiles can produce micro-expressions, subtle movements of the facial muscles manifested in the upper or lower half of the face. Studies examining the judgment of smiles masking negative emotions have mostly employed dichotomous rating measures, while also assuming that dichotomous categorization of a smile as happy or not is synonymous with judgments of the smile’s authenticity. The aim of the two studies was to explore the judgment of enjoyment and masking smiles using unipolar and bipolar continuous rating measures and to examine differences in judgment when instructions varied between judgments of happiness and authenticity. In Experiment 1, participants rated smiles on 7-point scales of perceived happiness and authenticity. In Experiment 2, participants rated the smiles on bipolar 7-point scales between happiness and a negative emotion label. In both studies, similar patterns were observed: faces with traces of fear were rated significantly less happy/authentic, and those with traces of anger in the brows were rated significantly happier/more authentic. Regarding instruction type, no effect was found, indicating that participants perceive and judge enjoyment and masking smiles similarly under both instructions. Additionally, the use of bipolar scales anchored between a negative emotion label and happiness was not consistently effective in influencing the judgment of masking smiles. Full article

11 pages, 1454 KiB  
Article
Beyond the Whole: Reduced Empathy for Masked Emotional Faces Is Not Driven by Disrupted Configural Face Processing
by Sarah D. McCrackin and Jelena Ristic
Behav. Sci. 2024, 14(9), 850; https://doi.org/10.3390/bs14090850 - 20 Sep 2024
Viewed by 879
Abstract
Sharing of emotional states is reduced for individuals wearing face coverings, but the mechanism behind this reduction remains unknown. Here, we investigated if face occlusion by masks reduces empathy by disrupting configural processing of emotional faces. Participants rated their empathy for happy and neutral faces which were presented in upright or inverted orientation and wore opaque, clear, or no face masks. Empathy ratings were reduced for masked faces (opaque or clear) as well as for inverted faces. Importantly, face inversion disrupted empathy more for faces wearing opaque masks relative to those wearing clear or no masks, which stands in contrast to the predictions generated by the classic configural processing models. We discuss these data within the context of classic and novel configural face perception models, and highlight that studying inverted occluded faces presents an informative case worthy of further investigation. Full article
