Cerebral Asymmetry in the Perceived Duration of Color Stimuli

1980 ◽  
Vol 50 (3_suppl) ◽  
pp. 1239-1246
Author(s):  
Sue A. Koch ◽  
Donald J. Polzella ◽  
Frank Da Polito

Twenty right-handed males judged the duration of small and large colored circles, briefly exposed in the left, center, and right visual fields. Perceived duration was a logarithmic function of exposure duration and a positive function of size and chromaticity. Overall accuracy was equivalent in the left and right visual fields, but the effects of chromaticity and duration on subjects' judgments were asymmetrical. These and other findings suggest a two-process model of time perception in which there is right-hemispheric control over a visual information processor and left-hemispheric control over a timer.
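The reported relationships can be sketched in a minimal model; the coefficients and gain factors below are hypothetical illustrations, not values fitted by the study:

```python
import math

def perceived_duration(exposure_ms, size_gain=1.0, chroma_gain=1.0,
                       a=50.0, b=120.0):
    # Hypothetical coefficients: judged duration grows with the log of
    # exposure time and is scaled up multiplicatively by stimulus size
    # and chromaticity, mirroring the "positive function" of both.
    return (a + b * math.log(exposure_ms)) * size_gain * chroma_gain

# Under a logarithmic law, doubling the exposure adds a constant
# increment to the judged duration rather than doubling it.
d100 = perceived_duration(100)
d200 = perceived_duration(200)
d400 = perceived_duration(400)
```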

2020 ◽  
Vol 6 (2) ◽  
pp. eaay6036 ◽  
Author(s):  
R. C. Feord ◽  
M. E. Sumner ◽  
S. Pusdekar ◽  
L. Kalra ◽  
P. T. Gonzalez-Bellido ◽  
...  

The camera-type eyes of vertebrates and cephalopods exhibit remarkable convergence, but it is currently unknown whether the mechanisms for visual information processing in these brains, which exhibit wildly disparate architecture, are also shared. To investigate stereopsis in a cephalopod species, we affixed “anaglyph” glasses to cuttlefish and used a three-dimensional perception paradigm. We show that (i) cuttlefish have also evolved stereopsis (i.e., the ability to extract depth information from the disparity between left and right visual fields); (ii) when stereopsis information is intact, the time and distance covered before striking at a target are shorter; (iii) stereopsis in cuttlefish works differently from that in vertebrates, as cuttlefish can extract stereopsis cues from anticorrelated stimuli. These findings demonstrate that although there is convergent evolution in depth computation, cuttlefish stereopsis is likely afforded by a different algorithm than in humans, and not just a different implementation.
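The anticorrelation result can be illustrated with a toy random-dot sketch (signal length, disparity value, and the correlation-magnitude rule are assumptions for illustration). A matcher that seeks the peak positive correlation recovers the disparity of a normal stereo pair; for an anticorrelated pair the correlation at the true disparity is strongly negative instead, so only an algorithm sensitive to correlation magnitude can still extract depth:

```python
import random

def correlation_at_shift(left, right, shift):
    # Zero-mean cross-correlation of the two "retinal" signals at a
    # candidate disparity (in pixels), without wraparound at the edge.
    pairs = [(left[i], right[i + shift]) for i in range(len(left) - shift)]
    ml = sum(l for l, _ in pairs) / len(pairs)
    mr = sum(r for _, r in pairs) / len(pairs)
    return sum((l - ml) * (r - mr) for l, r in pairs)

random.seed(0)
true_disparity = 3
left = [random.choice([-1.0, 1.0]) for _ in range(500)]      # random-dot pattern
right = [0.0] * true_disparity + left[:-true_disparity]      # shifted copy
anti_right = [-v for v in right]                             # contrast-inverted pair

shifts = range(8)
# Correlation peaks at the true disparity for the normal pair...
best = max(shifts, key=lambda s: correlation_at_shift(left, right, s))
# ...and is most *negative* there for the anticorrelated pair, so a
# magnitude-sensitive matcher still locates the same disparity.
best_anti = max(shifts, key=lambda s: abs(correlation_at_shift(left, anti_right, s)))
```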


1981 ◽  
Vol 53 (1) ◽  
pp. 311-316 ◽  
Author(s):  
Stephen M. Rao ◽  
Daniel Rourke ◽  
R. Douglas Whitman

Normal right-handed subjects were presented with luminance patterns varying sinusoidally in both space and time to the left and right visual fields. With no temporal variation in the stimuli, detection thresholds for the left visual field were lower than those for the right visual field at all spatial frequencies. However, with increasing temporal variation, a reversal in detection thresholds occurred, with the right visual field surpassing the left. This finding suggests that the left and right hemispheres may be differentially efficient at processing temporal and spatial visual information.


2019 ◽  
Vol 5 (1) ◽  
Author(s):  
Marta Suárez-Pinilla ◽  
Kyriacos Nikiforou ◽  
Zafeirios Fountas ◽  
Anil K. Seth ◽  
Warrick Roseboom

The neural basis of time perception remains unknown. A prominent account is the pacemaker-accumulator model, wherein regular ticks of some physiological or neural pacemaker are read out as time. Putative candidates for the pacemaker have been suggested in physiological processes (heartbeat) or dopaminergic midbrain neurons, whose activity has been associated with spontaneous blinking. However, such proposals have difficulty accounting for observations that time perception varies systematically with perceptual content. We examined physiological influences on human duration estimates for naturalistic videos lasting between 1 and 64 seconds, using cardiac and eye recordings. Duration estimates were biased by the amount of change in scene content. Contrary to previous claims, heart rate and blinking were not related to duration estimates. Our results support a recent proposal that tracking change in perceptual classification networks provides a basis for human time perception, and suggest that previous assertions of the importance of physiological factors should be tempered.
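The contrast between the two accounts can be sketched minimally; the tick rate, change counts, and per-change increment are illustrative assumptions, not parameters from the study:

```python
def pacemaker_estimate(duration_s, tick_rate_hz=10.0):
    # Pacemaker-accumulator account: ticks accumulate at a fixed rate,
    # so the readout tracks physical time regardless of scene content.
    ticks = duration_s * tick_rate_hz
    return ticks / tick_rate_hz

def change_based_estimate(n_changes, seconds_per_change=0.8):
    # Content-based account: each salient change detected in perceptual
    # classification adds a fixed increment to judged duration.
    return n_changes * seconds_per_change

# Two 30-second videos that differ only in scene dynamics: only the
# change-based model predicts the busy scene will feel longer.
busy_estimate = change_based_estimate(n_changes=40)   # busy city street
quiet_estimate = change_based_estimate(n_changes=10)  # quiet office
clock_estimate = pacemaker_estimate(30.0)
```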


2018 ◽  
Vol 119 (5) ◽  
pp. 1981-1992 ◽  
Author(s):  
Laura Mikula ◽  
Valérie Gaveau ◽  
Laure Pisella ◽  
Aarlenne Z. Khan ◽  
Gunnar Blohm

When reaching to an object, information about the target location as well as the initial hand position is required to program the motor plan for the arm. The initial hand position can be determined by proprioceptive information as well as visual information, if available. Bayes-optimal integration posits that we utilize all information available, with greater weighting on the sense that is more reliable, thus generally weighting visual information more than the usually less reliable proprioceptive information. The criterion by which information is weighted has not been explicitly investigated; it has been assumed that the weights are based on task- and effector-dependent sensory reliability requiring an explicit neuronal representation of variability. However, the weights could also be determined implicitly through learned modality-specific integration weights and not on effector-dependent reliability. While the former hypothesis predicts different proprioceptive weights for left and right hands, e.g., due to different reliabilities of dominant vs. nondominant hand proprioception, we would expect the same integration weights if the latter hypothesis was true. We found that the proprioceptive weights for the left and right hands were extremely consistent regardless of differences in sensory variability for the two hands as measured in two separate complementary tasks. Thus we propose that proprioceptive weights during reaching are learned across both hands, with high interindividual range but independent of each hand’s specific proprioceptive variability. NEW & NOTEWORTHY How visual and proprioceptive information about the hand are integrated to plan a reaching movement is still debated. The goal of this study was to clarify how the weights assigned to vision and proprioception during multisensory integration are determined. We found evidence that the integration weights are modality specific rather than based on the sensory reliabilities of the effectors.
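The reliability-based rule that the study tests against is the standard inverse-variance weighting; the sketch below writes it out with illustrative numbers (positions in cm, standard deviations chosen arbitrarily):

```python
def reliability_weights(sigma_vision, sigma_prop):
    # Bayes-optimal weights are proportional to each cue's inverse
    # variance (its reliability) and sum to one.
    r_v = 1.0 / sigma_vision ** 2
    r_p = 1.0 / sigma_prop ** 2
    return r_v / (r_v + r_p), r_p / (r_v + r_p)

def integrate(x_vision, x_prop, sigma_vision, sigma_prop):
    # Fused estimate of initial hand position: a weighted average of
    # the visual and proprioceptive position signals.
    w_v, w_p = reliability_weights(sigma_vision, sigma_prop)
    return w_v * x_vision + w_p * x_prop

# Vision twice as precise as proprioception: the fused estimate sits
# closer to the visual cue. The study's finding is that actual weights
# are modality-specific and fixed, rather than tracking these per-hand
# reliabilities.
w_v, w_p = reliability_weights(sigma_vision=0.5, sigma_prop=1.0)
fused = integrate(x_vision=10.0, x_prop=12.0, sigma_vision=0.5, sigma_prop=1.0)
```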


1995 ◽  
Vol 74 (3) ◽  
pp. 1083-1094 ◽  
Author(s):  
V. J. Brown ◽  
R. Desimone ◽  
M. Mishkin

1. The tail of the caudate nucleus and adjacent ventral putamen (ventrocaudal neostriatum) are major projection sites of the extrastriate visual cortex. Visual information is then relayed, directly or indirectly, to a variety of structures with motor functions. To test for a role of the ventrocaudal neostriatum in stimulus-response association learning, or habit formation, neuronal responses were recorded while monkeys performed a visual discrimination task. Additional data were collected from cells in cortical area TF, which served as a comparison and control for the caudate data. 2. Two monkeys were trained to perform an asymmetrically reinforced go/no-go visual discrimination. The stimuli were complex colored patterns, randomly assigned to be either positive or negative. The monkey was rewarded with juice for releasing a bar when a positive stimulus was presented, whereas a negative stimulus signaled that no reward was available and that the monkey should withhold its response. Neuronal responses were recorded both while the monkey performed the task with previously learned stimuli and while it learned the task with new stimuli. In some cases, responses were recorded during reversal learning. 3. There was no evidence that cells in the ventrocaudal neostriatum were influenced by the reward contingencies of the task. Cells did not fire preferentially to the onset of either positive or negative stimuli; neither did cells fire in response to the reward itself or in association with the motor response of the monkey. Only visual responses were apparent. 4. The visual properties of cells in these structures resembled those of cells in some of the cortical areas projecting to them. Most cells responded selectively to different visual stimuli. The degree of stimulus selectivity was assessed with discriminant analysis and was found to be quantitatively similar to that of inferior temporal cells tested with similar stimuli.
Like inferior temporal cells, many cells in the ventrocaudal neostriatum had large, bilateral receptive fields. Some cells had "doughnut"-shaped receptive fields, with stronger responses in the periphery of both visual fields than at the fovea, similar to the fields of some cells in the superior temporal polysensory area. Although the absence of task-specific responses argues that ventrocaudal neostriatal cells are not themselves the mediators of visual learning in the task employed, their cortical-like visual properties suggest that they might relay visual information important for visuomotor plasticity in other structures. (ABSTRACT TRUNCATED AT 400 WORDS)


SAGE Open ◽  
2020 ◽  
Vol 10 (3) ◽  
pp. 215824402093990
Author(s):  
Lingjing Li ◽  
Yu Tian

In the domain of aesthetic preference, previous studies focused primarily on the factors that influence aesthetic preference while neglecting to investigate whether aesthetic preference affects other psychological activities. This study sought to expand our understanding of time perception by examining whether aesthetic preference in viewing paintings influenced their perceived duration. Participants who preferred Chinese paintings (n = 20) and participants who preferred Western paintings (n = 21) were recruited to complete a temporal reproduction task that measured their time perception of Chinese and of Western paintings. Participants who preferred Chinese paintings perceived Chinese paintings as lasting longer than Western paintings, whereas participants who preferred Western paintings showed the opposite pattern. These results suggest that aesthetic preference can modulate the perceived duration of a painting's presentation: individuals perceive longer presentation durations for stimuli that match their aesthetic preferences.


2010 ◽  
Vol 104 (5) ◽  
pp. 2624-2633 ◽  
Author(s):  
Catherine A. Dunn ◽  
Carol L. Colby

Our eyes are constantly moving, allowing us to attend to different visual objects in the environment. With each eye movement, a given object activates an entirely new set of visual neurons, yet we perceive a stable scene. One neural mechanism that may contribute to visual stability is remapping. Neurons in several brain regions respond to visual stimuli presented outside the receptive field when an eye movement brings the stimulated location into the receptive field. The stored representation of a visual stimulus is remapped, or updated, in conjunction with the saccade. Remapping depends on neurons being able to receive visual information from outside the classic receptive field. In previous studies, we asked whether remapping across hemifields depends on the forebrain commissures. We found that, when the forebrain commissures are transected, behavior dependent on accurate spatial updating is initially impaired but recovers over time. Moreover, neurons in lateral intraparietal cortex (LIP) continue to remap information across hemifields in the absence of the forebrain commissures. One possible explanation for the preserved across-hemifield remapping in split-brain animals is that neurons in a single hemisphere could represent visual information from both visual fields. In the present study, we measured receptive fields of LIP neurons in split-brain monkeys and compared them with receptive fields in intact monkeys. We found a small number of neurons with bilateral receptive fields in the intact monkeys. In contrast, we found no such neurons in the split-brain animals. We conclude that bilateral representations in area LIP following forebrain commissure transection cannot account for remapping across hemifields.

