Visual Cues, Decoding, Literacy, and the Brain

2007 ◽  
Vol 17 (2) ◽  
pp. 9-12
Author(s):  
Judith A Curtin
2021 ◽  
Vol 7 (1) ◽  
pp. eabd6127
Author(s):  
Gwangsu Kim ◽  
Jaeson Jang ◽  
Seungdae Baek ◽  
Min Song ◽  
Se-Bum Paik

Number sense, the ability to estimate numerosity, is observed in naïve animals, but how this cognitive function emerges in the brain remains unclear. Here, using an artificial deep neural network that models the ventral visual stream of the brain, we show that number-selective neurons can arise spontaneously, even in the complete absence of learning. We also show that the responses of these neurons can give rise to an abstract number sense, the ability to discriminate numerosity independently of low-level visual cues. We found that number tuning in a randomly initialized network originates from a combination of monotonically decreasing and monotonically increasing neuronal activities, which emerge spontaneously from the statistical properties of bottom-up projections. We confirmed that the responses of these number-selective neurons show the single- and multi-neuron characteristics observed in the brain and enable the network to perform number comparison tasks. These findings provide insight into the origin of innate cognitive functions.
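The mechanism the abstract describes, tuned responses assembled from monotonically increasing and decreasing units in an untrained network, can be illustrated with a toy model. The sketch below is illustrative only (image size, unit count, noise levels and the min-combination rule are invented here, and this is not the authors' network): random dot images pass through a single randomly initialized ReLU layer, units whose mean response rises or falls with numerosity are identified, and combining one of each yields a unit tuned to an intermediate numerosity.

```python
import numpy as np

rng = np.random.default_rng(0)

D = 100                       # pixels per image (10x10, illustrative)
N_UNITS = 500                 # units in the random layer
NUMEROSITIES = np.arange(1, 11)
TRIALS = 50                   # images per numerosity

def dot_image(n):
    """Binary image with n 'dots' (single on-pixels) at random positions."""
    img = np.zeros(D)
    img[rng.choice(D, size=n, replace=False)] = 1.0
    return img

# Randomly initialized, never-trained feedforward layer.
W = rng.normal(0.0, 1.0 / np.sqrt(D), size=(N_UNITS, D))
b = rng.normal(0.0, 0.7, size=N_UNITS)

# Mean ReLU response of every unit at each numerosity.
mean_resp = np.zeros((N_UNITS, NUMEROSITIES.size))
for j, n in enumerate(NUMEROSITIES):
    acts = [np.maximum(0.0, W @ dot_image(n) + b) for _ in range(TRIALS)]
    mean_resp[:, j] = np.mean(acts, axis=0)

# Correlation of each unit's mean response with numerosity
# (silent units give NaN correlations, which are zeroed out).
with np.errstate(invalid="ignore", divide="ignore"):
    corr = np.nan_to_num(
        [np.corrcoef(r, NUMEROSITIES)[0, 1] for r in mean_resp])

inc = mean_resp[np.argmax(corr)]   # a monotonically increasing unit
dec = mean_resp[np.argmin(corr)]   # a monotonically decreasing unit

# A downstream unit combining normalized increasing and decreasing inputs
# responds most strongly at an intermediate numerosity, i.e. it is "tuned".
norm = lambda r: (r - r.min()) / (r.max() - r.min() + 1e-12)
tuned = np.minimum(norm(inc), norm(dec))
preferred = NUMEROSITIES[np.argmax(tuned)]
```

Even with purely random weights, some units increase and others decrease with dot count, and their combination peaks at an interior numerosity, mirroring the paper's claim that tuning can precede learning.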


Author(s):  
Jason McCarthy ◽  
Patricia Castro ◽  
Rachael Cottier ◽  
Joseph Buttell ◽  
Qadeer Arshad ◽  
...  

Abstract: A coherent perception of spatial orientation is key to maintaining postural control. To achieve this, the brain must access sensory inputs encoding both body and head position and integrate them with incoming visual information. Here we isolated the contribution of proprioception to verticality perception and further investigated whether changing body position without moving the head can modulate visual dependence, the extent to which an individual relies on visual cues for spatial orientation. Spatial orientation was measured in ten healthy individuals [6 female; 25–47 years (SD 7.8 years)] using a virtual reality-based subjective visual vertical (SVV) task. Individuals aligned an arrow to their perceived gravitational vertical, initially against a static black background (10 trials), and then in further conditions with clockwise and counterclockwise background rotations (10 trials each). In all conditions, subjects were seated first upright, then with the trunk tilted 20° to the right, and finally 20° to the left, while the head remained vertically aligned throughout. SVV error was modulated by trunk position: it was greater with the trunk tilted to the left than in the right-tilted or upright positions (p < 0.001). Likewise, background rotation affected SVV errors, which were greater with counterclockwise visual rotation than with a static background or clockwise roll motion (p < 0.001). Our results show that the interaction between neck and trunk proprioception can modulate how visual inputs affect spatial orientation.


2021 ◽  
Author(s):  
Yusuke Ujitoko ◽  
Takahiro Kawabe

Humans can judge the softness of elastic materials from visual cues alone, but the factors contributing to this judgement of visual softness are not yet fully understood. We conducted a psychophysical experiment to determine which stimulus factors and motion features contribute to the apparent softness of materials. Observers watched video clips in which materials were indented from the top surface to a certain depth and reported the apparent softness of the materials. The depth and speed of indentation were systematically manipulated, and material compliance, a physical characteristic of the materials, was also controlled. Higher indentation speeds resulted in higher softness ratings, and this variation with indentation speed was well explained by image motion speed. Indentation depth had a strong effect on the softness ratings, and its influence was consistently explained by motion features related to overall deformation. Higher material compliance also produced higher ratings, although this effect was not straightforwardly explained by the motion features. We conclude that the brain judges the softness of materials under indentation on the basis of motion speed and deformation magnitude, while the motion features related to material compliance require further study.


2021 ◽  
Vol 12 ◽  
Author(s):  
Elisa Rigosi ◽  
David C. O’Carroll

Cholinergic pesticides, such as the neonicotinoid imidacloprid, are the most important insecticides used for plant protection worldwide. In recent decades, concerns have been raised about side effects on non-target insect species, including altered foraging behavior and navigation. Although pollinators rely on visual cues to forage and navigate their environment, the effects of neonicotinoids on visual processing have been largely overlooked. To test the effect of acute treatment with imidacloprid at known concentrations in the brain, we developed a modified electrophysiological setup that allows recording of visually evoked responses while perfusing the brain in vivo. We obtained long-lasting recordings from direction-selective, wide-field, motion-sensitive neurons of the hoverfly pollinator Eristalis tenax. Neurons were treated with imidacloprid (3.9 μM or 0.39 μM) or with a sham control treatment using the solvent (dimethyl sulfoxide) only. Exposure to a high yet sub-lethal concentration of imidacloprid significantly altered the physiological response to motion stimuli: imidacloprid at 3.9 μM increased spontaneous activity, reduced contrast sensitivity and weakened directional tuning to wide-field moving stimuli, with likely implications for errors in flight control, hovering and routing. Our electrophysiological approach reveals both the robustness of the fly visual pathway against cholinergic perturbation (at 0.39 μM) and the potentially threatening effects of cholinergic pesticides (evident at 3.9 μM) for the visual motion detecting system of an important pollinator.


2019 ◽  
Author(s):  
Sophie Smit ◽  
Anina N. Rich ◽  
Regine Zopf

Abstract: Body ownership relies on spatiotemporal correlations between multisensory signals and on visual cues specifying oneself, such as body form and orientation. The mechanism for the integration of bodily signals remains unclear. One influential approach to modelling multisensory integration is Bayesian causal inference, which specifies that the brain integrates spatial and temporal signals coming from different modalities when it infers a common cause for the inputs. For example, the rubber hand illusion shows that visual form and orientation cues can promote the inference of a common cause (one's body), leading to spatial integration evidenced by a proprioceptive drift of the perceived location of the real hand towards the rubber hand. Recent studies investigating the effect of visual cues on temporal integration, however, have produced conflicting findings, which could be due to task differences, variation in the ecological validity of stimuli, and/or small samples. In this pre-registered study, we investigated the influence of visual information on temporal integration using a visuo-tactile temporal order judgement task with realistic stimuli and a sample size determined by Bayesian analysis. Participants viewed videos of a touch being applied to visual stimuli that were plausible or implausible for one's own hand (a hand oriented plausibly, a hand rotated 180 degrees, or a sponge) while also being touched at varying stimulus onset asynchronies, and judged which stimulus came first: the viewed or the felt touch. Results show that visual cues do not modulate visuo-tactile temporal order judgements. This is not in line with the idea that bodily signals indicating oneself influence the integration of multisensory signals in the temporal domain. The current study emphasises the importance of methodological and analytical rigour in advancing our understanding of how the properties of multisensory events affect the encoding of temporal information in the brain.
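The Bayesian causal inference model mentioned in this abstract has a compact standard form for two Gaussian-corrupted cues: the brain compares the likelihood of the two measurements under a single shared cause against their likelihood under independent causes, and weights integration by the resulting posterior. A minimal sketch applied to visual and tactile onset times follows; all noise and prior parameter values are invented for illustration and are not fitted to this study's data.

```python
import math

def p_common(x_v, x_t, sig_v=0.05, sig_t=0.05, sig_p=0.3,
             prior_common=0.5, mu_p=0.0):
    """Posterior probability that a visual onset x_v and a tactile onset x_t
    (in seconds) arose from one common event, under Bayesian causal
    inference with Gaussian measurement noise (sig_v, sig_t) and a
    Gaussian prior N(mu_p, sig_p^2) over onset times. Illustrative values."""
    vv, tt, pp = sig_v ** 2, sig_t ** 2, sig_p ** 2
    # Likelihood of the pair under C = 1 (single shared latent onset).
    d1 = vv * tt + vv * pp + tt * pp
    l1 = math.exp(-0.5 * ((x_v - x_t) ** 2 * pp
                          + (x_v - mu_p) ** 2 * tt
                          + (x_t - mu_p) ** 2 * vv) / d1) \
        / (2 * math.pi * math.sqrt(d1))
    # Likelihood under C = 2 (two independent latent onsets).
    l2 = math.exp(-0.5 * ((x_v - mu_p) ** 2 / (vv + pp)
                          + (x_t - mu_p) ** 2 / (tt + pp))) \
        / (2 * math.pi * math.sqrt((vv + pp) * (tt + pp)))
    # Bayes' rule over the two causal structures.
    return l1 * prior_common / (l1 * prior_common + l2 * (1 - prior_common))
```

With these parameters, a small visuo-tactile asynchrony yields a high posterior probability of a common cause, while a large asynchrony drives it toward zero; on this account, visual plausibility cues would act by shifting the prior over a common cause.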


eLife ◽  
2015 ◽  
Vol 4 ◽  
Author(s):  
Adhira Sunkara ◽  
Gregory C DeAngelis ◽  
Dora E Angelaki

As we navigate through the world, eye and head movements add rotational velocity patterns to the retinal image. When such rotations accompany observer translation, the rotational velocity patterns must be discounted to accurately perceive heading. The conventional view holds that this computation requires efference copies of self-generated eye/head movements. Here we demonstrate that the brain implements an alternative solution in which retinal velocity patterns are themselves used to dissociate translations from rotations. These results reveal a novel role for visual cues in achieving a rotation-invariant representation of heading in the macaque ventral intraparietal area. Specifically, we show that the visual system utilizes both local motion parallax cues and global perspective distortions to estimate heading in the presence of rotations. These findings further suggest that the brain is capable of performing complex computations to infer eye movements and discount their sensory consequences based solely on visual cues.
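The parallax cue in this abstract has a crisp geometric core in the standard instantaneous-flow equations for a pinhole observer (Longuet-Higgins and Prazdny): rotational flow is independent of scene depth, so the difference between flow vectors in the same visual direction but at different depths cancels rotation and depends only on translation. The toy values below (translation, rotation, depths) are invented for illustration.

```python
import numpy as np

def flow(x, y, Z, T, W):
    """Instantaneous retinal flow at image point (x, y), focal length 1,
    for a scene point at depth Z, observer translation T and rotation W."""
    Tx, Ty, Tz = T
    wx, wy, wz = W
    # Depth-dependent translational term + depth-independent rotational term.
    u = (-Tx + x * Tz) / Z + (x * y * wx - (1 + x ** 2) * wy + y * wz)
    v = (-Ty + y * Tz) / Z + ((1 + y ** 2) * wx - x * y * wy - x * wz)
    return np.array([u, v])

T = (0.1, 0.0, 1.0)        # heading: mostly forward, slightly rightward
W = (0.0, 0.05, 0.01)      # eye/head rotation (e.g. a pursuit eye movement)
x, y = 0.2, -0.1           # one visual direction
Z_near, Z_far = 2.0, 8.0   # two depths along that direction

# Motion parallax: flow difference at one image point across depths.
parallax = flow(x, y, Z_near, T, W) - flow(x, y, Z_far, T, W)
# The same difference computed without any rotation is identical,
# because the rotational term does not depend on Z and cancels.
parallax_no_rot = (flow(x, y, Z_near, T, (0, 0, 0))
                   - flow(x, y, Z_far, T, (0, 0, 0)))
```

The parallax vector is also parallel to the purely translational flow direction at that point, so such depth-contingent differences, together with global perspective distortions, could in principle recover heading without an efference copy of the eye movement, consistent with the abstract's claim.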


2018 ◽  
Author(s):  
Pierre Mégevand ◽  
Manuel R. Mercier ◽  
David M. Groppe ◽  
Elana Zion Golumbic ◽  
Nima Mesgarani ◽  
...  

Abstract: Natural conversation is multisensory: when we can see the speaker's face, visual speech cues influence our perception of what is being said. The neuronal basis of this phenomenon remains unclear, though there are indications that phase modulation of neuronal oscillations, the ongoing excitability fluctuations of neuronal populations in the brain, provides a mechanistic contribution. Investigating this question using naturalistic audiovisual speech with intracranial recordings in humans, we show that neuronal populations in auditory cortex track the temporal dynamics of unisensory visual speech using the phase of their slow oscillations and phase-related modulations in high-frequency activity. Auditory cortex thus builds a representation of the speech stream's envelope based on visual speech alone, at least in part by resetting the phase of its ongoing oscillations. Phase reset could amplify the representation of the speech stream and organize the information contained in neuronal activity patterns.
Significance statement: Watching the speaker can facilitate our understanding of what is being said. The mechanisms responsible for this influence of visual cues on the processing of speech remain incompletely understood. We studied those mechanisms by recording the human brain's electrical activity through electrodes implanted surgically inside the skull. We found that some regions of cerebral cortex that process auditory speech also respond to visual speech, even when it is shown as a silent movie without a soundtrack. This response can occur through a reset of the phase of ongoing oscillations, which helps augment the response of auditory cortex to audiovisual speech. Our results contribute to uncovering the mechanisms by which the brain merges auditory and visual speech into a unitary perception.


Author(s):  
Salomé Le Franc ◽  
Isabelle Bonan ◽  
Mathis Fleury ◽  
Simon Butet ◽  
Christian Barillot ◽  
...  

Abstract
Background: The illusion of movement induced by tendon vibration is commonly used in rehabilitation and appears valuable for motor rehabilitation after stroke by playing a role in cerebral plasticity. Our aim was to study whether congruent visual cues delivered in virtual reality (VR) could enhance the illusion of movement induced by tendon vibration of the wrist among participants with stroke.
Methods: We included 20 chronic stroke participants. They experienced tendon vibration of the wrist (100 Hz, 30 times) inducing an illusion of movement. Three VR visual conditions were added to the vibration: a congruent moving virtual hand (Moving condition); a static virtual hand (Static condition); or no virtual hand at all (Hidden condition). For each visual condition, participants rated the intensity of the illusory movement on a Likert scale, estimated the perceived wrist movement in degrees, and answered a questionnaire about their preferred condition.
Results: The Moving condition was significantly superior to both the Hidden and the Static conditions in terms of illusion of movement (p < 0.001) and perceived wrist extension (p < 0.001). There was no significant difference between the Hidden and the Static conditions on these two criteria. The Moving condition was rated the best for increasing the illusion of movement by 70% of the participants. Two participants did not feel any illusion of movement.
Conclusions: This study demonstrates the value of congruent visual cues in VR for enhancing the consistency of the illusion of movement induced by tendon vibration in participants after stroke, regardless of their clinical severity. By stimulating brain motor areas, this visuo-proprioceptive feedback could be a useful tool in motor rehabilitation. Trial registration: ClinicalTrials.gov NCT04130711, registered on October 17th, 2019 (https://clinicaltrials.gov/ct2/show/NCT04130711?id=NCT04130711&draw=2&rank=1).


2019 ◽  
Author(s):  
Andrea Adden ◽  
Sara Wibrand ◽  
Keram Pfeiffer ◽  
Eric Warrant ◽  
Stanley Heinze

Abstract: Every year, millions of Australian Bogong moths (Agrotis infusa) complete an astonishing journey: in spring, they migrate over 1000 km from their breeding grounds to the alpine regions of the Snowy Mountains, where they endure the hot summer in the cool climate of alpine caves. In autumn, the moths return to their breeding grounds, where they mate, lay eggs and die. These moths can use visual cues in combination with the geomagnetic field to guide their flight, but how these cues are processed and integrated in the brain to drive migratory behavior is unknown. To generate an access point for functional studies, we provide a detailed description of the Bogong moth's brain. Based on immunohistochemical stainings against synapsin and serotonin (5HT), we describe the overall layout as well as the fine structure of all major neuropils, including the regions that have previously been implicated in compass-based navigation. The resulting average brain atlas consists of 3D reconstructions of 25 separate neuropils, comprising the most detailed account of a moth brain to date. Our results show that the Bogong moth brain follows the typical lepidopteran ground pattern, with no major specializations that can be attributed to their spectacular migratory lifestyle. These findings suggest that migratory behavior does not require widespread modifications of brain structure, but might be achievable via small adjustments of neural circuitry in key brain areas. Locating these subtle changes will be a challenging task for the future, for which our study provides an essential anatomical framework.

