How do top-down mechanisms and action intentions guide visual attention?
Final Report Abstract
To cope with the vast amount of incoming information, the visual system has to select relevant objects for further, more detailed processing. Observers are able to direct their attention to objects that are important for their current task or for an intended action. This project examined how such top-down induced selection relates to other attentional control mechanisms when relevant objects are selected and irrelevant objects are ignored in a visual scene. The first experimental series examined the dynamics of goal-driven selection and, in particular, its temporal relation to bottom-up control. Results showed that salient but irrelevant information can increase competition on the priority map that precedes the decision of where to attend next, but it does not necessarily capture visual attention. Instead, selective visual attention is under top-down control, which prioritizes sensory signals very early in the processing stream. The second and third experimental series focused on action goals as one particular type of intentional, goal-based selection. These experiments examined the impact of action planning on an unrelated perceptual selection task, an experimental logic that follows directly from ideomotor theories, which assume a common coding format for sensory and motor events and therefore predict such interaction effects. Results not only demonstrated the existence of such action-perception congruency effects, but also revealed the origin of this action-induced perception bias, the dynamics of the effect and its role in action planning, and its neuronal underpinnings. Taken together, the results indicate a strong coupling between action planning and the selection of visual signals, and they support the idea that action-perception congruency effects reflect a fundamental characteristic of human action control.
Publications
- Wykowska, A., Hommel, B., & Schubö, A. (2011). Action-induced effects on perception depend neither on element-level nor on set-level similarity between stimulus and response sets. Attention, Perception & Psychophysics, 73, 1034–1041.
- Wykowska, A., Maldonado, A., Beetz, M., & Schubö, A. (2011). How humans optimize their interaction with the environment: the impact of action context on human perception. International Journal of Social Robotics, 3, 223–231.
- Wykowska, A., & Schubö, A. (2011). Irrelevant singletons in visual search do not capture attention but can produce non-spatial filtering costs. Journal of Cognitive Neuroscience, 23, 645–660.
- Wykowska, A., & Schubö, A. (2012). Action intentions modulate allocation of visual attention: electrophysiological evidence. Frontiers in Psychology, 3:379.
- Wykowska, A., Hommel, B., & Schubö, A. (2012). Imaging when acting: picture but not word cues induce action-related biases of visual attention. Frontiers in Psychology, 3:388.
- Wykowska, A., & Schubö, A. (2012). Perception and action as two sides of the same coin. A review of the importance of action-perception links in humans for social robot design and research. International Journal of Social Robotics, 4, 5–14. (See online at https://doi.org/10.1007/s12369-011-0127-6)
- Akyürek, E., & Schubö, A. (2013). Electrophysiological correlates of early attentional feature selection and distractor filtering. Biological Psychology, 93, 269–278.
- Wykowska, A., Anderl, C., Schubö, A., & Hommel, B. (2013). Motivation modulates visual attention: Evidence from pupillometry. Frontiers in Psychology, 4:59.