Visual salience and priority estimation for locomotion using a deep convolutional neural network

Research output: Chapter in Book/Report/Conference proceeding › Conference Contribution (Conference Proceeding)


Abstract

This paper presents a novel method of salience and priority estimation for the human visual system during locomotion. This visual information contains dynamic content derived from a moving viewpoint. The priority map, which ranks key areas of the image, is created from probabilities of gaze fixations, merged from bottom-up features and top-down control of locomotion. Two deep convolutional neural networks (CNNs), inspired by models of the primate visual system, are employed to capture local salience features and compute probabilities. The first network operates on the foveal and peripheral areas around the eye positions. The second network estimates the importance of fixated points with long durations or multiple visits; such areas need more time to process, or to be rechecked, to ensure smooth locomotion. The results show that our proposed method outperforms the state of the art by up to 30%, computed from the average of four well-known saliency-estimation metrics.
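The abstract describes a priority map built by merging bottom-up salience with fixation-based importance from a second network. The paper's exact merging rule is not given here, so the following is only a minimal sketch under stated assumptions: each network's output is stood in for by a small 2-D map, and the combination weight `w_fixation` is a hypothetical parameter, not one from the paper.

```python
def normalize(m):
    """Scale a 2-D map (list of rows) into [0, 1]."""
    lo = min(min(row) for row in m)
    hi = max(max(row) for row in m)
    rng = (hi - lo) or 1.0  # avoid division by zero on flat maps
    return [[(v - lo) / rng for v in row] for row in m]

def priority_map(salience, fixation, w_fixation=0.5):
    """Convex combination of a bottom-up salience map and a
    fixation-importance map; `w_fixation` is an assumed weight."""
    s, f = normalize(salience), normalize(fixation)
    return [[(1 - w_fixation) * sv + w_fixation * fv
             for sv, fv in zip(srow, frow)]
            for srow, frow in zip(s, f)]

# Stand-in outputs for the two CNNs (illustrative values only).
salience = [[0.1, 0.9], [0.4, 0.2]]
fixation = [[0.8, 0.1], [0.3, 0.6]]
pm = priority_map(salience, fixation)
```

Ranking the resulting `pm` values then yields the priority ordering of image regions that the abstract refers to.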
Original language: English
Title of host publication: 2016 IEEE International Conference on Image Processing (ICIP 2016)
Subtitle of host publication: Proceedings of a meeting held 25-28 September 2016, Phoenix, Arizona, USA
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Pages: 1599-1603
Number of pages: 5
ISBN (Electronic): 9781467399616
ISBN (Print): 9781467399623
DOIs
Publication status: Published - Mar 2017

Publication series

Name
ISSN (Electronic): 2381-8549

Structured keywords

  • Cognitive Science
  • Visual Perception

Keywords

  • Salience
  • convolutional neural network
  • deep learning
  • locomotion

