Sound source similarity influences change perception during complex scene perception

2015 ◽  
Vol 137 (4) ◽  
pp. 2226-2226 ◽  
Author(s):  
Kelly Dickerson ◽  
Jeremy Gaston ◽  
Ashley Foots ◽  
Timothy Mermagen

2016 ◽
Author(s):  
Kelly Dickerson ◽  
Jeremy R. Gaston ◽  
Brandon S. Perelman ◽  
Timothy Mermagen ◽  
Ashley N. Foots

2004 ◽  
Author(s):  
John M. Henderson

2020 ◽  
Author(s):  
Yaelan Jung ◽  
Dirk B. Walther

Natural scenes deliver rich sensory information about the world. Decades of research have shown that the scene-selective network in the visual cortex represents various aspects of scenes. It is, however, unknown how such complex scene information is processed beyond the visual cortex, for example in the prefrontal cortex. It is also unknown how task context impacts scene perception, modulating which scene content is represented in the brain. In this study, we investigate these questions using scene images from four natural scene categories which also depict two types of global scene properties: temperature (warm or cold) and sound level (noisy or quiet). Healthy human subjects of both sexes participated in the present fMRI study. Participants viewed scene images under two task conditions: temperature judgment and sound-level judgment. We analyzed how different scene attributes (scene categories, temperature, and sound-level information) are represented across the brain under these task conditions. Our findings show that global scene properties are represented in the brain, especially in the prefrontal cortex, only when they are task-relevant. Scene categories, however, are represented in both the parahippocampal place area and the prefrontal cortex regardless of task context. These findings suggest that the prefrontal cortex selectively represents scene content according to task demands, but that this task selectivity depends on the type of scene content: task modulates neural representations of global scene properties but not of scene categories.
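As an illustration of the kind of analysis such a design implies, the sketch below runs cross-validated decoding of scene attributes from region-of-interest voxel patterns. The data, dimensions, and label layout are placeholder assumptions for illustration, not the authors' actual pipeline.

```python
# Minimal decoding sketch: classify scene attributes from (simulated)
# fMRI voxel patterns in a region of interest. All data are placeholders.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_trials, n_voxels = 96, 500                          # assumed dimensions
roi_patterns = rng.normal(size=(n_trials, n_voxels))  # stand-in for ROI data

# One label vector per scene attribute (values are simulated placeholders).
labels = {
    "scene category": rng.integers(0, 4, n_trials),   # four categories
    "temperature":    rng.integers(0, 2, n_trials),   # warm vs. cold
    "sound level":    rng.integers(0, 2, n_trials),   # noisy vs. quiet
}

# Above-chance cross-validated accuracy for an attribute is taken as
# evidence that the region represents that attribute.
for attribute, y in labels.items():
    acc = cross_val_score(LinearSVC(), roi_patterns, y, cv=8).mean()
    print(f"{attribute}: mean decoding accuracy = {acc:.2f}")
```

Comparing such accuracies across regions and task conditions is one standard way to ask whether a representation is task-dependent, as the abstract describes.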


eLife ◽  
2020 ◽  
Vol 9 ◽  
Author(s):  
Christopher A Henry ◽  
Mehrdad Jazayeri ◽  
Robert M Shapley ◽  
Michael J Hawken

Complex scene perception depends upon the interaction between signals from the classical receptive field (CRF) and the extra-classical receptive field (eCRF) in primary visual cortex (V1) neurons. Although much is known about V1 eCRF properties, we do not yet know how the underlying mechanisms map onto the cortical microcircuit. We probed the spatio-temporal dynamics of eCRF modulation using a reverse correlation paradigm, and found three principal eCRF mechanisms: tuned-facilitation, untuned-suppression, and tuned-suppression. Each mechanism had a distinct timing and spatial profile. Laminar analysis showed that the timing, orientation-tuning, and strength of eCRF mechanisms had distinct signatures within magnocellular and parvocellular processing streams in the V1 microcircuit. The existence of multiple eCRF mechanisms provides new insights into how V1 responds to spatial context. Modeling revealed that the differences in timing and scale of these mechanisms predicted distinct patterns of net modulation, reconciling many previous disparate physiological and psychophysical findings.
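A minimal sketch of a reverse-correlation analysis in this spirit appears below: a random sequence of surround orientations is correlated with a spike train to recover an orientation-by-lag modulation kernel, in which facilitation appears as positive entries and suppression as negative ones. All data and parameters here are simulated assumptions, not the recordings reported in the paper.

```python
# Reverse-correlation sketch for surround (eCRF) modulation.
# Each frame shows a random surround orientation; the kernel estimates
# how spiking at a later time depends on the orientation shown earlier.
import numpy as np

rng = np.random.default_rng(1)

n_frames = 20000
orientations = np.arange(0, 180, 22.5)           # assumed orientation set
stim = rng.integers(0, len(orientations), n_frames)  # random surround sequence
spikes = rng.poisson(0.1, n_frames)              # stand-in spike counts

max_lag = 12  # frames of stimulus history to examine

# Spike-triggered average: for each lag, the mean spike count that
# followed each surround orientation, relative to the overall rate.
kernel = np.zeros((max_lag, len(orientations)))
for lag in range(1, max_lag + 1):
    for k in range(len(orientations)):
        mask = stim[:-lag] == k
        kernel[lag - 1, k] = spikes[lag:][mask].mean() - spikes.mean()

# Timing and orientation tuning of facilitation/suppression can be read
# off the sign and shape of the kernel across lags.
print(kernel.round(3))
```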


1999 ◽  
Vol 58 (3) ◽  
pp. 170-179 ◽  
Author(s):  
Barbara S. Muller ◽  
Pierre Bovet

Twelve blindfolded subjects localized two different pure tones, randomly played by eight sound sources in the horizontal plane. Subjects either had access to the information supplied by their pinnae (external ears) and their head movements or did not. We found that both pinnae and head movements had a marked influence on auditory localization performance with this type of sound. The effects of pinnae and head movements seemed to be additive; the absence of either factor produced the same loss of localization accuracy and much the same error pattern. Head-movement analysis showed that subjects turned their faces towards the emitting sound source, except for sources directly in front of or directly behind them, which were identified by turning the head to both sides. Head-movement amplitude increased smoothly as the sound source moved from the anterior to the posterior quadrant.
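For illustration, the sketch below scores localization responses for eight equally spaced azimuths using circular (wrap-around) angular error, the natural error measure for this design. The response values are simulated placeholders, not the study's data.

```python
# Score localization responses for eight sources in the horizontal plane
# using the smallest wrap-around angular difference (circular error).
import numpy as np

rng = np.random.default_rng(2)

sources = np.arange(0, 360, 45)                 # 8 azimuths, degrees
responses = sources + rng.normal(0, 20, 8)      # simulated pointing responses

def circular_error(a, b):
    """Smallest absolute angular difference between two azimuths."""
    d = (a - b) % 360
    return np.minimum(d, 360 - d)

errors = circular_error(responses, sources)
for az, err in zip(sources, errors):
    print(f"source {az:3d} deg: error = {err:5.1f} deg")
```

Plain subtraction would misscore, say, a 350-degree response to a 0-degree source as a 350-degree error; the modulo step keeps it at the correct 10 degrees.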


2012 ◽  
Author(s):  
Elizabeth Keenan ◽  
Lisa Zaval ◽  
Ye Li ◽  
Eric J. Johnson

2013 ◽  
Author(s):  
Susanne Mayr ◽  
Gunnar Regenbrecht ◽  
Kathrin Lange ◽  
Albert-Georg Lang ◽  
Axel Buchner
