
Pupil Dilation Reflects Task Relevance Prior to Search

Open Access | Jan 2018

Abstract

When observers search for a specific target, it is assumed that they activate a representation of the task-relevant object in visual working memory (VWM). This representation – often referred to as the template – guides attention towards matching visual input. In two experiments we tested whether the pupil response can be used to differentiate stimuli that match the task-relevant template from irrelevant input. Observers memorized a target color to be searched for in a multi-color visual search display, presented after a delay period. In Experiment 1, one color appeared at the start of the trial, which was then automatically the search template. In Experiment 2, two colors were presented, and a retro-cue indicated which of these was relevant for the upcoming search task. Crucially, before the search display appeared, we briefly presented one colored probe stimulus. The probe could match the template color (relevant), the non-cued color (irrelevant), or be a new color not presented in the trial. We measured the pupil response to the probe as a signature of task relevance. Experiment 1 showed significantly smaller pupil size in response to probes matching the search template than to irrelevant colors. Experiment 2 replicated the template-matching effect and allowed us to rule out that it was solely due to repetition priming. Taken together, we show that the pupil responds selectively to participants’ target template prior to search.
DOI: https://doi.org/10.5334/joc.12 | Journal eISSN: 2514-4820
Language: English
Submitted on: Nov 15, 2017 | Accepted on: Jan 3, 2018 | Published on: Jan 26, 2018
Published by: Ubiquity Press
In partnership with: Paradigm Publishing Services
Publication frequency: 1 issue per year

© 2018 Katya Olmos-Solis, Anouk M. van Loon, Christian N.L. Olivers, published by Ubiquity Press
This work is licensed under the Creative Commons Attribution 4.0 License.