Tap the ShapeTones: Exploring the Effects of Crossmodal Congruence in an Audio-Visual Interface

Oussama Metatla, Nuno Correia, Fiore Martin, Nick Bryan-Kinns, Tony Stockman

Research output: Chapter in Book/Report/Conference proceeding › Conference Contribution (Conference Proceeding)

Abstract

There is growing interest in applying crossmodal perception to interface design. However, most research has focused on task performance measures and has often ignored user experience and engagement. We present an examination of crossmodal congruence in terms of performance and engagement in the context of a memory task involving audio, visual, and audio-visual stimuli. Participants in a first study showed improved performance with a congruent visual mapping, which was cancelled out when audio was added to the baseline conditions, as well as a subjective preference for the audio-visual stimulus that was not reflected in the objective data. Based on these findings, we designed an audio-visual memory game to examine the effects of crossmodal congruence on user experience and engagement. Results showed higher engagement levels with congruent displays, although some participants reported a preference for the potential challenge and enjoyment that an incongruent display may support, particularly as task complexity increased.
Original language: English
Title of host publication: Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems
Publisher: Association for Computing Machinery (ACM)
Pages: 1055-1066
Number of pages: 12
ISBN (Print): 9781450333627
DOIs
Publication status: Published - 7 May 2016

Structured keywords

  • Engineering Education Research Group
