Cross-modal Interactive Tools for Inclusive Learning

Project Details


A large number of visually impaired and blind children in the UK are educated in mainstream schools, often as one or two learners in a class of fully sighted peers. However, the typical accessibility solutions available in mainstream schools are designed to be used by visually impaired learners alone, not by their sighted peers, and so can force them to learn as isolated individuals, excluded from group learning activities. These solutions emphasise accessibility over inclusion, focusing on an individual's disability rather than on the variety of abilities present in the social context of group learning involving students, teachers and technology.

The CRITICAL project aims to research and develop interactive learning tools to make group work in mixed classrooms more inclusive of visually impaired students.

The project explores questions such as: How do people learn together when they have access to different sets of sensory modalities? And how can we exploit crossmodal interaction to design more inclusive collaborative educational technologies?

Non-visual modalities (e.g. audio, gestures, haptic and tactile feedback, smell) have shown varying degrees of potential to support accessible interactions, but limitations remain in their applicability in real-world settings, such as crossmodal effects that arise when groups work together using different senses. We will use an iterative user-centred approach combining participatory design activities with empirical research into crossmodal interaction to find out how different senses can be effectively integrated with visual capabilities to support group work. The tools will be designed to address gaps in technological support for accommodating curriculum requirements and the social processes surrounding collaborative learning, and will be validated in classroom settings to find out how they can improve group learning activities and influence teaching practices.
Effective start/end date: 1/03/16 – 31/01/21


  • Research Output

    • 9 Conference Contributions (Conference Proceedings)
    • 1 Conference Paper

    Review of Experimental Evaluation Methods of Technology for Visually Impaired People

    Brulé, E., Tomlinson, B., Metatla, O., Serrano, M. & Jouffrais, C., 2020.

    Research output: Contribution to conference › Conference Paper

  • Robots for Inclusive Play: Co-designing an Educational Game With Visually Impaired and Sighted Children

    Metatla, O., Bardot, S., Cullen, C., Serrano, M. & Jouffrais, C., 2020, CHI 2020 - Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery (ACM)

    Research output: Chapter in Book/Report/Conference proceeding › Conference Contribution (Conference Proceeding)

  • Voice User Interfaces in Schools: Co-designing for Inclusion with Visually-Impaired and Sighted Pupils

    Metatla, O., Oldfield, A., Ahmed, T., Vafeas, A. & Miglani, S., 2 May 2019, CHI 2019 - Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery (ACM), 15 p., 378

    Research output: Chapter in Book/Report/Conference proceeding › Conference Contribution (Conference Proceeding)

    Open Access