Wide-Field Feedback Neurons Dynamically Tune Early Visual Processing

Neuron ◽  
2014 ◽  
Vol 82 (4) ◽  
pp. 887-895 ◽  
Author(s):  
John C. Tuthill ◽  
Aljoscha Nern ◽  
Gerald M. Rubin ◽  
Michael B. Reiser
2018 ◽  
Author(s):  
Bernard J E Evans ◽  
David C O'Carroll ◽  
Joseph M Fabian ◽  
Steven D Wiederman

An important task for any aerial creature is the ability to ascertain its own movement (ego-motion) through its environment. Neurons thought to underlie this behaviour have been well characterised in many insect models, including flies, moths and bees. However, dragonfly wide-field motion pathways remain undescribed. Some species of dragonflies, such as Hemicordulia tau, engage in hawking behaviour, hovering in a single area for extended periods whilst also undertaking fast-moving patrols and highly dynamic pursuits of prey and conspecifics. These varied flight behaviours place very different constraints on establishing ego-motion from optic flow cues, hinting at a sophisticated wide-field motion analysis system capable of detecting both fast and slow motion. We characterised wide-field motion-sensitive neurons via intracellular recordings in Hemicordulia dragonflies, finding properties similar to those described in other species. The spatial and temporal tuning of these neurons was broadly similar across cells, but the cells differed significantly in their adaptation to sustained motion. We categorised three subclasses, which differed in their motion adaptation and in their responses to the broadband statistics of natural images. These differences correspond well with the dynamics of the varied behavioural tasks that hawking dragonflies perform. Our findings may underpin the exquisite flight behaviours of dragonflies, and they hint at why dragonfly early visual processing is so complex.


The construction of directionally selective units, and their use in the processing of visual motion, are considered. The zero crossings of ∇²G(x, y) ∗ I(x, y) are located, as in Marr & Hildreth (1980). That is, the image is filtered through centre-surround receptive fields, and the zero values in the output are found. In addition, the time derivative ∂[∇²G(x, y) ∗ I(x, y)]/∂t is measured at the zero crossings, and serves to constrain the local direction of motion to within 180°. The direction of motion can be determined in a second stage, for example by combining the local constraints. The second part of the paper suggests a specific model of the information processing by the X and Y cells of the retina and lateral geniculate nucleus, and certain classes of cortical simple cells. A number of psychophysical and neurophysiological predictions are derived from the theory.
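A minimal sketch of the filtering scheme described above, assuming two grayscale frames held as NumPy arrays and using SciPy's gaussian_laplace as the ∇²G centre-surround filter; the function name, frame variables, and parameter values are illustrative choices, not taken from the paper.

```python
# Sketch only: zero crossings of the centre-surround-filtered image, plus the
# sign of the temporal derivative measured at those crossings.
import numpy as np
from scipy.ndimage import gaussian_laplace

def edge_motion_constraints(frame_t0, frame_t1, sigma=2.0, dt=1.0):
    """Locate zero crossings of nabla^2 G * I and return the sign of the
    temporal derivative there, which constrains the local direction of
    motion to within 180 degrees."""
    # Centre-surround (Laplacian of Gaussian) filtering of both frames.
    s0 = gaussian_laplace(frame_t0.astype(float), sigma)
    s1 = gaussian_laplace(frame_t1.astype(float), sigma)

    # Zero crossings: pixels where the filtered image changes sign
    # relative to a horizontal or vertical neighbour.
    zc = np.zeros_like(s0, dtype=bool)
    zc[:, :-1] |= np.signbit(s0[:, :-1]) != np.signbit(s0[:, 1:])
    zc[:-1, :] |= np.signbit(s0[:-1, :]) != np.signbit(s0[1:, :])

    # Temporal derivative of the filtered signal, kept only at the crossings;
    # its sign supplies the local (half-plane) motion constraint.
    dSdt = (s1 - s0) / dt
    return zc, np.where(zc, np.sign(dSdt), 0.0)
```

As the abstract notes, the sign of ∂/∂t at a zero crossing only restricts motion to one side of the edge; a second, combining stage is still needed to recover the actual direction of motion.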


1997 ◽  
Vol 8 (2) ◽  
pp. 95-100 ◽  
Author(s):  
Kimron Shapiro ◽  
Jon Driver ◽  
Robert Ward ◽  
Robyn E. Sorensen

When people must detect several targets in a very rapid stream of successive visual events at the same location, detection of an initial target induces misses for subsequent targets within a brief period. This attentional blink may serve to prevent interruption of ongoing target processing by temporarily suppressing vision for subsequent stimuli. We examined the level at which the attentional blink operates, specifically, whether it prevents early visual processing or instead prevents quite substantial processing from reaching awareness. Our data support the latter view. We observed priming from missed letter targets, benefiting detection of a subsequent target with the same identity but a different case. In a second study, we observed semantic priming from word targets that were missed during the blink. These results demonstrate that attentional gating within the blink operates only after substantial stimulus processing has already taken place. The results are discussed in terms of two forms of visual representation, namely, types and tokens.


The existence of multiple channels, or multiple receptive field sizes, in the visual system does not commit us to any particular theory of spatial encoding in vision. However, distortions of apparent spatial frequency and width in a wide variety of conditions favour the idea that each channel carries a width- or frequency-related code or ‘label’ rather than a ‘local sign’ or positional label. When distortions of spatial frequency occur without prior adaptation (e.g. at low contrast or low luminance) they are associated with lowered sensitivity, and may be due to a mismatch between the perceptual labels and the actual tuning of the channels. A low-level representation of retinal space could be constructed from the spatial information encoded by the channels, rather than being projected intact from the retina.
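As a hedged illustration of the labelled-channel idea (not code from the paper): a bank of centre-surround filters of different sizes can report which channel is most active, pooled over position, so the readout carries a size- or frequency-related label rather than a positional one. The channel sizes, the σ² scale normalisation, and the grating stimuli below are arbitrary choices made for the sketch.

```python
# Illustrative sketch only: multi-size centre-surround channels whose pooled
# activity provides a frequency-related "label", not a "local sign".
import numpy as np
from scipy.ndimage import gaussian_laplace

def preferred_channel(image, sigmas=(1, 2, 4, 8, 16)):
    """Return the size (sigma) of the most active channel, pooled over the
    whole image, so no positional information survives in the readout."""
    energies = []
    for s in sigmas:
        # Scale-normalised Laplacian-of-Gaussian response for this channel.
        response = (s ** 2) * gaussian_laplace(image.astype(float), s)
        energies.append(np.mean(response ** 2))
    return sigmas[int(np.argmax(energies))]

# A fine grating labels a small channel, a coarse grating a larger one,
# regardless of where in the image the contrast happens to lie.
x = np.arange(128)
fine = np.tile(np.sin(2 * np.pi * x / 8), (128, 1))
coarse = np.tile(np.sin(2 * np.pi * x / 64), (128, 1))
print(preferred_channel(fine), preferred_channel(coarse))
```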


2011 ◽  
Vol 33 (1) ◽  
pp. 63-74 ◽  
Author(s):  
Karsten Rauss ◽  
Gilles Pourtois ◽  
Patrik Vuilleumier ◽  
Sophie Schwartz

1993 ◽  
pp. 171-175 ◽  
Author(s):  
Harry G. Barrow ◽  
Alistair J. Bray

2015 ◽  
Vol 27 (4) ◽  
pp. 832-841 ◽  
Author(s):  
Amanda K. Robinson ◽  
Judith Reinhard ◽  
Jason B. Mattingley

Sensory information is initially registered within anatomically and functionally segregated brain networks but is also integrated across modalities in higher cortical areas. Although considerable research has focused on uncovering the neural correlates of multisensory integration for the modalities of vision, audition, and touch, much less attention has been devoted to understanding interactions between vision and olfaction in humans. In this study, we asked how odors affect neural activity evoked by images of familiar visual objects associated with characteristic smells. We employed scalp-recorded EEG to measure visual ERPs evoked by briefly presented pictures of familiar objects, such as an orange, mint leaves, or a rose. During presentation of each visual stimulus, participants inhaled either a matching odor, a nonmatching odor, or plain air. The N1 component of the visual ERP was significantly enhanced for matching odors in women, but not in men. This is consistent with evidence that women are superior in detecting, discriminating, and identifying odors and that they have a higher gray matter concentration in olfactory areas of the OFC. We conclude that early visual processing is influenced by olfactory cues because of associations between odors and the objects that emit them, and that these associations are stronger in women than in men.

