Understanding the Recognition of Emotional Facial Expressions: Influence of Eye Occlusion, Contexts, and Attention Dynamics

The mechanisms underlying the ability to recognise emotional facial expressions are multiple and complex. This work focuses on what an observer perceives when viewing an emotional facial expression, how that information is interpreted, and how the associated behavioural, attentional, and neural responses unfold.

The overall aim of this thesis is to characterise how reduced eye visibility influences the perception and interpretation of emotional facial expressions, and to test whether the nature of this reduction matters. In everyday social interactions, the eyes may become unavailable either through a natural eye state (for example sustained eye closure) or through an artificial occluder (for example opaque sunglasses). Although the literature has long established the eye region as a key source of diagnostic and socially relevant information, these different forms of occlusion have often been treated as interchangeable, implicitly assuming that comparable reductions in visual information produce comparable psychological effects.

A first axis focuses on the interpretation that observers construct from a face with reduced access to the eyes. It is widely accepted that certain facial features are more informative for recognising specific emotions: cues from the eye and brow region contribute strongly to the recognition of threat-related expressions such as fear and anger, whereas the mouth is particularly informative for happiness. However, this thesis examines the possibility that some forms of “missing eyes” cannot be reduced to a simple loss of perceptual information. In particular, closed eyes are not merely the absence of ocular cues: they may themselves constitute an expressive signal that shapes emotional appraisal, for instance by suggesting a shift towards internally oriented attention or reduced interpersonal engagement. By contrast, opaque sunglasses impose an external barrier that limits access to ocular information and may increase ambiguity regarding what the gaze conveys. This first axis therefore investigates whether these two situations lead to distinct consequences for emotion recognition and attribution.

A second axis examines the attentional dynamics associated with these judgements. Using psychophysiological techniques such as eye-tracking and electroencephalography (EEG), the project investigates how observers visually explore emotional faces when access to the eye region is reduced, and how the neural processes underlying these interpretations unfold over time. A key question is whether observers continue to sample the eye region when the eyes are closed, or whether the presence of an artificial occluder changes the functional value of that region and promotes a redistribution of attention towards alternative facial cues. At the neural level, the aim is to identify when, in the temporal cascade of face and emotion processing, the closed-eyes and sunglasses conditions begin to diverge, thereby distinguishing mechanisms related to early perceptual encoding from those linked to later interpretative or evaluative processes.

Finally, a third axis investigates the role of context. When facial information is incomplete, the interpretation of emotional expressions increasingly relies on contextual knowledge and situational expectations. This thesis therefore explores how contextual information may modulate the effects of reduced eye visibility, and whether contextual cues can disambiguate emotional interpretation when the ambiguity arises not from the expression itself but from restricted access to diagnostic facial information.

Taken together, the three axes addressed in this thesis aim to clarify the relationships between perceptual constraints, attentional sampling, and neural dynamics in the recognition of emotional facial expressions. By explicitly distinguishing different forms of eye occlusion, this work contributes to a more precise understanding of how perceptual information and socio-affective meaning jointly shape emotion perception and non-verbal communication.


Team

Program

PhD in Neuroscience - Lemanic Neuroscience Doctoral School (LNDS)

Selected Scientific Outputs