EXPRESS: Are animates special? Exploring the effects of selective attention and animacy on visual statistical learning

2022
pp. 174702182210746
Author(s):  
Jolene Alexa Cox ◽  
Timothy Walter Cox ◽  
Anne Marie Aimola Davies

Our visual system is built to extract regularities in how objects within our visual environment appear in relation to each other across time and space (‘visual statistical learning’). Existing research indicates that visual statistical learning is modulated by selective attention. Our attentional system prioritises information that enables behaviour; for example, animates are prioritised over inanimates (the ‘animacy advantage’). The present study examined the effects of selective attention and animacy on visual statistical learning in young adults (N = 284). We tested visual statistical learning of attended and unattended information across four animacy conditions: (i) living things that can self-initiate movement (animals); (ii) living things that cannot self-initiate movement (fruits and vegetables); (iii) non-living things that can generate movement (vehicles); and (iv) non-living things that cannot generate movement (tools and kitchen utensils). We implemented a four-point confidence-rating scale as an assessment of participants’ awareness of the regularities in the visual statistical learning task. There were four key findings. First, selective attention plays a critical role by modulating visual statistical learning. Second, animacy does not play a special role in visual statistical learning. Third, visual statistical learning of attended information cannot be exclusively accounted for by unconscious knowledge. Fourth, performance on the visual statistical learning task is associated with the proportion of stimuli that were named or labelled. Our findings support the notion that visual statistical learning is a powerful mechanism by which our visual system resolves an abundance of sensory input over time.

2021
Author(s):  
Francisco Vicente-Conesa ◽  
Tamara Giménez-Fernández ◽  
David Luque ◽  
Miguel A. Vadillo

The additional singleton task has become a popular paradigm to explore visual statistical learning and selective attention. In this task, participants are instructed to find a different-shaped target among a series of distractors as fast as possible. In some trials, the search display includes a singleton distractor with a different colour, making search harder. This singleton distractor appears more often in one location than in all the remaining locations. The typical results of these experiments show that participants learn to ignore the area of the screen that is more likely to contain the singleton distractor. It is often claimed that this learning takes place unconsciously, because at the end of the experiment participants seem to be unable to identify the location where the singleton distractor appeared most frequently during the task. In the present study, we tested participants’ awareness in three high-powered experiments using alternative measures. Contrary to previous studies, the results show clear evidence of explicit knowledge about which area of the display was more likely to contain the singleton distractor, suggesting that this type of learning might not be unconscious.


2020
Author(s):  
Katie Richards ◽  
Povilas Karvelis ◽  
Stephen Lawrie ◽  
Peggy Series

Deficits in statistical learning and predictive processing could in principle explain inattention and distractibility in attention deficit hyperactivity disorder (ADHD). To test this, we evaluated whether adults diagnosed with ADHD (n = 17) differed from controls (n = 30) in implicitly learning and using low-level perceptual priors to guide sensory processing. We used a visual statistical learning task in which participants had to estimate the direction of coherently moving dots. Unbeknown to the participants, two directions were presented more frequently than the others, creating an implicit bias (prior) towards those directions. This task had previously revealed differences in autism spectrum disorder and schizophrenia. Both groups acquired the prior expectations for the most frequent directions and, except for some subtle differences over time, there were no group differences in how much the priors influenced performance. This suggests that ADHD symptoms do not stem from difficulties in developing and/or using perceptual priors.


Author(s):  
Christopher M. Conway ◽  
Robert L. Goldstone ◽  
Morten H. Christiansen

2012
Vol 25 (0)
pp. 200
Author(s):  
David J. Lewkowicz

Human infancy is a time of rapid neural and behavioral development, and multisensory perceptual skills emerge during this time. Both animal and human early-deprivation studies have shown that experience contributes critically to the development of multisensory perception. Unfortunately, because the human deprivation studies have only examined adult responsiveness, little is known about the more immediate effects of early experience on multisensory development. Consequently, we have embarked on a program of research to investigate how early experience affects the development of multisensory perception in human infants. To do so, we have focused on multisensory perceptual narrowing, an experience-dependent process in which initially broad perceptual tuning is narrowed to match the infant’s native environment. In this talk, I first review our work demonstrating that multisensory narrowing characterizes infants’ response to non-native (i.e., monkey) faces and voices, that the initially broad tuning is present at birth, that narrowing also occurs in the audiovisual speech domain, and that multisensory narrowing is an evolutionarily novel process. In the second part of the talk, I present findings from our most recent studies indicating that experience has a seemingly paradoxical effect on infant response to audiovisual synchrony, that experience narrows infant response to amodal language and intonational prosody cues, and that experience interacts with developmental changes in selective attention during the first year of life, resulting in dramatic developmental shifts in human infants’ selective attention to the eyes and mouth of their interlocutors’ talking faces.

