Tuesday, May 8, 2012

Dynamic integration of information about salience and value for saccadic eye movements


Alexander C. Schütz, Julia Trommershäuser, and Karl R. Gegenfurtner
PNAS May 8, 2012 vol. 109 no. 19 7547-7552

Humans shift their gaze to a new location several times per second. It is still unclear what determines where they look next. Fixation behavior is influenced by the low-level salience of the visual stimulus, such as luminance, contrast, and color, but also by high-level task demands and prior knowledge. Under natural conditions, different sources of information might conflict with each other and have to be combined. In our paradigm, we trade off visual salience against expected value. We show that both salience and value information influence the saccadic end point within an object, but with different time courses. The relative weights of salience and value are not constant but vary from eye movement to eye movement, depending critically on the availability of the value information at the time when the saccade is programmed. Short-latency saccades are determined mainly by salience, but value information is taken into account for long-latency saccades. We present a model that describes these data by dynamically weighting and integrating detailed topographic maps of visual salience and value. These results support the notion of independent neural pathways for the processing of visual information and value.
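The model described above combines a salience map and a value map with weights that shift toward value as saccade latency grows. A minimal sketch of that idea is below; the exponential weighting function, the time constant `tau`, and the argmax read-out are assumptions for illustration, not the fitted model from the paper:

```python
import numpy as np

def combined_priority(salience, value, latency_ms, tau=150.0):
    """Combine a salience map and a value map with a latency-dependent weight.

    Illustrative sketch only: the weighting rule and tau are assumptions,
    not the parameters reported by Schütz et al. (2012).
    """
    # Weight on value grows with saccade latency: short-latency saccades
    # are dominated by salience, long-latency ones take value into account.
    w_value = 1.0 - np.exp(-latency_ms / tau)
    return (1.0 - w_value) * salience + w_value * value

def saccade_endpoint(salience, value, latency_ms):
    """Pick the saccadic end point as the peak of the combined map."""
    priority = combined_priority(salience, value, latency_ms)
    return np.unravel_index(np.argmax(priority), priority.shape)
```

With a salience peak and a value peak at different locations, a short-latency saccade lands on the salient location while a long-latency one lands on the valuable location, mirroring the time courses reported in the abstract.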
