Developing continuous measures of audience immersion

Student thesis: Doctoral Thesis, Doctor of Philosophy (PhD)

Abstract

When engaging with media, be it a film, book, game, or other media form, a pervasive experience is that of immersion. While immersed, we may lose awareness of our surroundings as our attention narrows towards the media. This state of immersion is dynamic and may fluctuate across the course of a programme or film. However, much of the existing literature relies only on retrospective questionnaires to measure this experience, which are not sensitive to changes in immersion across a viewing experience. Further, immersion likely relies on several cognitive processes, from attention to emotion to mental representation. To triangulate viewer immersion, researchers should therefore rely on multiple measures sensitive to these underlying dimensions. In addition, viewing should be as naturalistic as possible, using measures that are unobtrusive, to ensure that immersion is not disrupted.

This thesis presents a series of cognitive psychology experiments which develop and validate several continuous physiological and behavioural measures of audience immersion in film and television content. The measures explored are heart rate, skin conductance, body movement, dual-task reaction times, visual recognition memory, and immersion questionnaires. Of particular interest are synchronous responses: that is, correlated activity across viewers watching the same content, which suggests a common response to the content. Throughout this thesis, synchronous responses in audience heart rate emerged as an especially useful measure which relates to audiences' self-reported immersion, particularly their attentional and emotional engagement with the story. Dual-task reaction times, which reflect the attentional resources dedicated to the content, were also sensitive to self-reported immersion. Finally, this thesis indicates that synchrony in heart rate is largely driven by high-level engagement with the narrative, rather than by the audio-visual features of the content.
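The synchrony described above can be summarised computationally as the average pairwise correlation between viewers' physiological time series. The sketch below is illustrative only and is not drawn from the thesis: it assumes evenly sampled, time-aligned heart-rate traces, and the `pairwise_synchrony` helper is a hypothetical name introduced here for clarity.

```python
import numpy as np
from itertools import combinations

def pairwise_synchrony(signals: np.ndarray) -> float:
    """Mean pairwise Pearson correlation across viewers.

    signals: array of shape (n_viewers, n_samples), each row a
    heart-rate time series aligned to the same content.
    """
    correlations = [
        np.corrcoef(signals[i], signals[j])[0, 1]
        for i, j in combinations(range(len(signals)), 2)
    ]
    return float(np.mean(correlations))

# Example: simulated heart-rate traces for 5 viewers, 300 samples each,
# built from a shared content-driven component plus viewer-specific noise.
rng = np.random.default_rng(0)
shared = np.sin(np.linspace(0, 10, 300))
viewers = shared + 0.5 * rng.standard_normal((5, 300))
print(f"Mean pairwise synchrony: {pairwise_synchrony(viewers):.2f}")
```

Higher values of this mean pairwise correlation would indicate that viewers' responses are being driven in common by the content rather than by idiosyncratic factors.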
Date of Award: 20 Jun 2023
Original language: English
Awarding Institution:
  • University of Bristol
Sponsors: BBC Research and Development Department
Supervisors: Iain D Gilchrist & David R Bull

