Eeny Meeny Artsy Fartsy: eye tracking to explore preference for paintings generated by deep neural networks

Research output: Contribution to conference › Conference Poster › peer-review


A frequently asked question in experimental aesthetics and psychology is whether we can isolate aesthetic primitives in paintings from their content and measure their ‘beauty’ independently. To answer this question, we applied deep neural networks (Gatys, Ecker and Bethge, 2015) to systematically separate and recombine the content and style of artistic images, adhering to formal stylistic rules while keeping the compositional and semantic content untouched. We implemented a 2AFC task and a variant of the drift-diffusion model (DDM) (Krajbich et al., 2010), in addition to eye tracking, to establish whether style and content contribute independently to the general liking of paintings. We integrated accumulated behavioural evidence (i.e. image preference) with evidence based on cumulative fixations to extract bias coefficients towards different styles and contents, respectively. These coefficients had higher predictive power than the initial values associated with the paintings in determining the ultimately chosen image. In addition, we showed that an extended DDM (Krajbich et al., 2010), which uses the weighted difference between the initial values of fixated and unfixated images, does not extend to novel stimulus choices.
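For context, the fixation-weighted evidence accumulation referenced above follows the attentional DDM of Krajbich, Armel and Rangel (2010): the relative decision value drifts towards the currently fixated item, with the unfixated item's value discounted by a factor θ, until a decision barrier is crossed. The sketch below is an illustrative simulation, not the authors' code; the parameter magnitudes (d, θ, σ) are assumptions in the ballpark of values reported for the model.

```python
import random

def simulate_addm(v_left, v_right, fixations, d=0.0002, theta=0.3,
                  sigma=0.02, barrier=1.0, seed=0):
    """Simulate one trial of the attentional drift-diffusion model.

    v_left, v_right: initial (pre-decision) values of the two images.
    fixations: sequence of ('left' | 'right', duration_ms) pairs from
        the eye tracker.
    Returns (choice, reaction_time_ms); choice is None if no barrier
    is crossed within the recorded fixations.
    """
    rng = random.Random(seed)
    rdv = 0.0  # relative decision value (left minus right)
    t = 0
    for side, duration in fixations:
        for _ in range(duration):  # integrate in 1 ms steps
            # Drift towards the fixated item; the unfixated item's
            # value is down-weighted by theta.
            if side == 'left':
                drift = d * (v_left - theta * v_right)
            else:
                drift = -d * (v_right - theta * v_left)
            rdv += drift + rng.gauss(0.0, sigma)
            t += 1
            if rdv >= barrier:
                return 'left', t
            if rdv <= -barrier:
                return 'right', t
    return None, t
```

For example, `simulate_addm(3, 1, [('left', 400), ('right', 300), ('left', 600)])` returns a choice and a reaction time for one simulated trial; running many such trials over an observed fixation record yields the choice probabilities against which bias coefficients can be fit.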
Original language: English
Publication status: Published - 2017
Event: European Conference on Visual Perception 2017 - Berlin, Germany
Duration: 27 Aug 2017 – 31 Aug 2017


Conference: European Conference on Visual Perception 2017
Abbreviated title: ECVP

Structured keywords

  • Brain and Behaviour
  • Cognitive Science
  • Visual Perception


