Voice User Interfaces, such as Amazon's Echo devices, have recently opened up an exciting design space for understanding how people interact with and through conversational voice agents. In this project, we intend to explore this design space and its implications within the context of mainstream education and Special Educational Needs (SEN). In collaboration with researchers at the Department of Education, we aim to design and develop Mixed-Reality/Augmented-Reality experiences, with conversational interfaces at the centre of a tangible multisensory tabletop, to enable collaborative and personalised learning experiences for children with and without visual impairments. The intention is to use the voice/tangible interface to support an ecosystem of personalities, or 'helper bots', each abstracted according to its desired functionality; these will emerge from a blended Participatory Design process and in-the-wild development at local schools. The underlying aim of this technology is, tentatively, to enhance 'SEN mainstreaming pedagogies' and to improve inclusive learning, collaboration and turn-taking.
Effective start/end date: 1/02/18 → 31/07/18
- SoE Centre for Knowledge, Culture, and Society