This paper presents a novel method of fixation identification for mobile eye trackers. The most significant benefit of our method over the state of the art is that it achieves high accuracy for low-sample-rate devices worn during locomotion. This in turn delivers higher-quality datasets for further use in human behaviour research, robotics and the development of guidance aids for the visually impaired. The proposed method combines temporal characteristics of the eye positions with statistical visual features extracted from foveal and peripheral regions around the eye positions by a deep convolutional neural network inspired by models of the primate visual system. The results show that the proposed method outperforms existing methods by up to 16% in terms of classification accuracy.
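For context, fixation identification groups raw gaze samples into periods where the eye is roughly stationary. The sketch below is a generic dispersion-threshold (I-DT) baseline, not the paper's method (which additionally uses CNN visual features); the threshold and duration values are hypothetical.

```python
# Illustrative sketch: a classic dispersion-threshold (I-DT) fixation
# detector. NOT the paper's proposed method, which augments temporal
# characteristics with deep visual features. Thresholds are hypothetical.

def dispersion(window):
    """Sum of horizontal and vertical spread of a gaze-point window."""
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(samples, max_dispersion=1.0, min_duration=3):
    """samples: list of (x, y) gaze points at a fixed sample rate.
    Returns (start, end) index pairs marking detected fixations."""
    fixations = []
    i, n = 0, len(samples)
    while i < n:
        j = i + min_duration
        if j > n:
            break
        if dispersion(samples[i:j]) <= max_dispersion:
            # Grow the window while dispersion stays under the threshold.
            while j < n and dispersion(samples[i:j + 1]) <= max_dispersion:
                j += 1
            fixations.append((i, j))
            i = j
        else:
            i += 1
    return fixations
```

On a trace with two tight clusters separated by a saccade, the detector returns one (start, end) pair per cluster; low-sample-rate trackers make such temporal grouping harder, which is the regime the paper targets.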
Title of host publication: IEEE International Conference on Image Processing (ICIP), 2016
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Publication status: Published - 19 Aug 2016
- Cognitive Science
- Visual Perception