Research

Emotional processes and coherence

Psychological theories of emotion have frequently posited that emotions are "coherent" states. Among other aspects, this concept captures the idea that an emotion should consist of a pattern of mental and bodily changes that is concurrent in time, involves the whole body, is qualitatively separable from non-emotional states and from other emotional states, is reliably observed, and/or is driven as a whole by underlying organising factors. Empirical research into emotional coherence has faced a number of important obstacles, however: (a) the absence of a clear theoretical definition of emotional coherence, (b) the weakness with which emotional states are typically induced in the lab, (c) the lack of comprehensive measurement across mental and bodily subsystems, and (d) the statistical complexities associated with analysing these multivariate measurements.

At the MMEF lab, we investigate emotional coherence empirically and address these challenges by (a) defining coherence more precisely, (b) inducing strong emotions with virtual reality, (c) measuring emotions comprehensively, and (d) using machine learning and clustering methods to analyse the resulting data.
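
To illustrate the analysis step (d), the sketch below shows one generic way multivariate emotion measurements could be clustered to probe for coherent states. It is a minimal sketch, not the lab's actual pipeline: the data are simulated, the channel names are assumptions, and scikit-learn's KMeans stands in for whichever clustering method is used in practice.

```python
# Illustrative sketch only: simulated multichannel emotion data,
# clustered to ask whether trials group into coherent states.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)

# Hypothetical per-trial features across subsystems (names assumed):
# heart rate, skin conductance, gaze dispersion, self-reported fear.
X = rng.normal(size=(200, 4))

# Standardise so no single channel dominates the distance metric.
X_std = StandardScaler().fit_transform(X)

# If emotional states are coherent, trials should separate into
# well-defined clusters (e.g., "fear" vs. "neutral"); the silhouette
# score gives a simple index of how well-separated the clusters are.
for k in (2, 3, 4):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X_std)
    print(k, silhouette_score(X_std, labels))
```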

The primary paradigm we have developed for inducing emotion is the Virtual Height Experience (VHE). Programmed in Unity3D, the VHE allows controlled induction of strong fear through (repeated) exposure to virtual heights. Parameters of the VHE can be altered to change, e.g., the size of the viewing platform, the depth of the virtual drop, the number of trials, and the duration of each exposure. In addition, the user can either perform a passive viewing task or walk actively by navigating a U-shaped track over the virtual drop. Integrated measurements include gaze tracking, in-VR self-report, and scripts for compatibility with physiological recording.
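
The VHE itself is implemented in Unity3D; purely to make the parameter surface described above concrete, the sketch below models the configurable settings in Python. All names and default values are hypothetical, chosen only to mirror the parameters listed in the text.

```python
# Minimal sketch of the VHE's configurable parameters; names and
# defaults are hypothetical. The actual paradigm is built in Unity3D.
from dataclasses import dataclass
from enum import Enum, auto

class Task(Enum):
    PASSIVE_VIEWING = auto()  # observe the virtual drop from the platform
    ACTIVE_WALKING = auto()   # navigate a U-shaped track over the drop

@dataclass
class VHEConfig:
    platform_size_m: float = 1.0       # size of the viewing platform
    drop_depth_m: float = 40.0         # depth of the virtual heights
    n_trials: int = 5                  # number of (repeated) exposures
    exposure_duration_s: float = 60.0  # duration of each exposure
    task: Task = Task.PASSIVE_VIEWING  # passive viewing or active walking
    gaze_tracking: bool = True         # integrated gaze recording
    self_report: bool = True           # in-VR self-report prompts

# Example: a session with repeated active-walking exposures.
config = VHEConfig(task=Task.ACTIVE_WALKING, n_trials=10)
```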