Understanding the Recognition of Emotional Facial Expressions: Influence of Eye Visibility, Contexts, and Attention Dynamics

Summary

The mechanisms underpinning the ability to recognise emotional facial expressions are multiple and complex. This work focuses on what an observer perceives, how they interpret it, and how they respond behaviourally, cognitively, and neurally to an emotional facial expression and its context.

The first area of interest concerns the observer's interpretation of the perceived face. The study will explore differences in the ability to recognise the expressed emotion depending on the accessibility of the eyes. The literature broadly agrees that certain facial features are more decisive for recognising specific emotions: the eyes help us identify anger and fear, while the mouth is more diagnostic of joy (Bombari et al., 2013; Calvo et al., 2014, 2018; Eisenbarth & Alpers, 2011). However, few studies have considered what happens to our ability to recognise these emotions when this information is impaired, and in particular whether it leads to a change of categorisation. Indeed, studies have shown that the gaze direction of an emotional face affects the interpretation of the emotion's source: a face expressing anger while looking to the left does not provoke the same reactions as one looking directly at us (Mumenthaler & Sander, 2012, 2015). The question therefore arises of the interpretation associated with impaired perception of the eyes: what does the observer interpret when the eyes of the emotional face are closed? The source of the emotion can no longer be external. Does this affect the perceived emotion? And what about faces wearing opaque sunglasses, where the eyes are hidden rather than closed? This creates an ambiguous situation in which perception and interpretation become entangled. This first area will therefore investigate these differences to understand the mechanisms involved.

The second area will focus on understanding the observer's attentional dynamics during the recognition of these faces. Using psychophysiological techniques (eye-tracking and electroencephalography), the study will examine differences in oculomotor behaviour when exploring emotional faces with impaired eye visibility, as well as the differences in brain activation associated with the underlying cognitive processes. It is crucial to determine whether attention is directed towards the eyes in the same way when they are closed, hidden behind sunglasses, or open, in order to establish whether attentional processes can explain, or merely co-occur with, differences in the recognition of emotional facial expressions.

Finally, the third area will examine the impact of context. Studies have shown that contextual information, including social cues, can influence the ability to recognise an ambiguous emotional facial expression (Mumenthaler & Sander, 2012). Can this effect still be observed when the ambiguity lies not in the expression itself but in the observer's ability to perceive all of the available information, as with sunglasses?

The areas addressed in this thesis explore complementary research paths to untangle the links between perception, attention, and neural responses in the recognition of emotional facial expressions and their context. This work contributes to our understanding of the mechanisms governing the detection and interpretation of emotions on faces, highlighting how these processes interact and how they are influenced by external factors. The thesis thus offers a new perspective on the foundations of human non-verbal communication and on the role of visual perception in social interactions.


Bombari, D., Schmid, P. C., Schmid Mast, M., Birri, S., Mast, F. W., & Lobmaier, J. S. (2013). Emotion recognition: The role of featural and configural face information. Quarterly Journal of Experimental Psychology, 66(12), 2426‑2442. https://doi.org/10/gt4hqj

Calvo, M. G., Fernández-Martín, A., Gutiérrez-García, A., & Lundqvist, D. (2018). Selective eye fixations on diagnostic face regions of dynamic emotional expressions: KDEF-dyn database. Scientific Reports, 8(1), 17039. https://doi.org/10/gh3j6n

Calvo, M. G., Fernández-Martín, A., & Nummenmaa, L. (2014). Facial expression recognition in peripheral versus central vision: Role of the eyes and the mouth. Psychological Research, 78(2), 180‑195. https://doi.org/10/gtz4ms

Eisenbarth, H., & Alpers, G. W. (2011). Happy mouth and sad eyes: Scanning emotional facial expressions. Emotion, 11(4), 860‑865. https://doi.org/10/d5dvzt

Mumenthaler, C., & Sander, D. (2012). Social appraisal influences recognition of emotions. Journal of Personality and Social Psychology, 102(6), 1118‑1135. https://doi.org/10.1037/a0026885

Mumenthaler, C., & Sander, D. (2015). Automatic integration of social information in emotion recognition. Journal of Experimental Psychology: General, 144(2), 392‑399. https://doi.org/10/f678s9

Team

Funding

UNIGE doctoral thesis fund.

Program

PhD in Neuroscience - Lemanic Neuroscience Doctoral School (LNDS)

Selected Scientific Outputs