Auditory-Visual Interactions in the Blind with Artificial Vision: Are Multisensory Perceptions Restored After Decades of Blindness?

2019 ◽  
Author(s):  
Noelle R. B. Stiles ◽  
Vivek R. Patel ◽  
James D. Weiland

Abstract
In the sighted, auditory and visual perception typically interact strongly and influence each other significantly. Blindness acquired in adulthood alters these multisensory pathways. During blindness, it has been shown that the senses functionally reorganize, enabling visual cortex to be recruited for auditory processing. It is yet unknown whether this reorganization is permanent, or whether auditory-visual interactions can be re-established in cases of partial visual recovery.

Retinal prostheses restore visual perception to the late blind and provide an opportunity to determine if these auditory-visual connections and interactions are still viable after years of plasticity and neglect. We tested Argus II retinal prosthesis patients (N = 7) for an auditory-visual illusion, the ventriloquist effect, in which the perceived location of an auditory stimulus is modified by the presence of a visual stimulus. Prosthetically-restored visual perception significantly modified patients’ auditory perceptions, comparable to results with sighted control participants (N = 10). Furthermore, the auditory-visual interaction strength in retinal prosthesis patients exhibited a significant partial anti-correlation with patient age, as well as a significant partial correlation with duration of prosthesis use.

These results indicate that auditory-visual interactions can be restored after decades of blindness, and that auditory-visual processing pathways and regions can be re-engaged. Furthermore, they indicate the resilience of multimodal interactions to plasticity during blindness, and that this plasticity can either be partially reversed or at least does not prevent auditory-visual interactions. Finally, this study provides hope for the restoration of sensory perception, complete with multisensory integration, even after years of visual deprivation.

Significance
Retinal prostheses restore visual perception to the blind by means of an implanted retinal stimulator wirelessly connected to a camera mounted on glasses. Individuals with prosthetic vision can locate and identify simple objects, and identify the direction of visual motion. A key question is whether this prosthetic vision will interact with the other senses, such as audition, in the same way that natural vision does. We found that artificial vision, like natural vision, can alter auditory localization. This suggests that the brain processes prosthetic vision similarly to natural vision despite altered visual processing in the retina. In addition, it implies that reorganization of the senses during blindness may be reversible, allowing for the rehabilitation of crossmodal interactions after visual restoration.
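The ventriloquist effect described above is commonly modelled as reliability-weighted cue combination: the perceived sound location is pulled toward the visual stimulus in proportion to the relative precision of the two cues. The sketch below illustrates that standard textbook account; it is not the authors' analysis pipeline, and the parameter names are illustrative.

```python
def combined_location(x_aud, sigma_aud, x_vis, sigma_vis):
    """Maximum-likelihood estimate of a stimulus location from two cues.

    Each cue's reported position (x_aud, x_vis) is weighted by its
    reliability, i.e. the inverse of its variance. A precise visual cue
    (small sigma_vis) therefore "captures" an imprecise auditory one.
    """
    w_aud = 1.0 / sigma_aud ** 2
    w_vis = 1.0 / sigma_vis ** 2
    return (w_aud * x_aud + w_vis * x_vis) / (w_aud + w_vis)
```

With a sound reported at 10° (sigma 4°) and a flash at 0° (sigma 1°), the combined estimate lands near the flash, which is the ventriloquist shift.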

2020 ◽  
Author(s):  
Jacob Thomas Thorn ◽  
Enrico Migliorini ◽  
Diego Ghezzi

Abstract
Objective: Retinal prostheses hold the potential to restore artificial vision in blind patients suffering from outer retinal dystrophies. The optimal number, density, and coverage of the electrodes that a retinal prosthesis should have to provide adequate artificial vision in daily activities is still an open question and an important design parameter for developing better implants.
Approach: To address this question, we investigated the interaction between the visual angle, the pixel number, and the pixel density without being limited by a small electrode count, as in previous research reports. We implemented prosthetic vision in a virtual reality environment in order to simulate the real-life experience of using a retinal prosthesis. We designed four tasks simulating object recognition, word reading, perception of a descending step, and crossing a street.
Main results: In all four tasks, the visual angle played the most significant role in improving participant performance.
Significance: The design of new retinal prostheses should take into account the relevance of the restored visual angle to provide a helpful and valuable visual aid to profoundly or totally blind patients.
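The trade-off studied above can be made concrete with a toy simulation: a scene spanning the full visual field is cropped to the restored visual angle and block-averaged onto an n-by-n phosphene grid, so that pixel density is tied to n and the visual angle together. This is a hypothetical sketch, not the authors' virtual-reality implementation; all parameter names are illustrative.

```python
import numpy as np

def simulate_prosthetic_view(scene, field_deg, visual_angle_deg, n_pixels):
    """Crude simulation of prosthetic vision.

    scene: 2D grayscale array covering field_deg degrees of visual field.
    Crops the central visual_angle_deg window and averages it down to an
    n_pixels x n_pixels "phosphene" grid. Assumes n_pixels does not exceed
    the cropped region's size in pixels.
    """
    h, w = scene.shape
    frac = visual_angle_deg / field_deg
    ch, cw = max(1, int(round(h * frac))), max(1, int(round(w * frac)))
    top, left = (h - ch) // 2, (w - cw) // 2
    crop = scene[top:top + ch, left:left + cw]
    # block-average the crop onto the phosphene grid
    ys = np.linspace(0, ch, n_pixels + 1).astype(int)
    xs = np.linspace(0, cw, n_pixels + 1).astype(int)
    out = np.empty((n_pixels, n_pixels))
    for i in range(n_pixels):
        for j in range(n_pixels):
            out[i, j] = crop[ys[i]:ys[i + 1], xs[j]:xs[j + 1]].mean()
    return out
```

Varying `visual_angle_deg` at fixed `n_pixels` (or vice versa) reproduces the angle/count/density triangle the study explores.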


2020 ◽  
Author(s):  
Amandine Lassalle ◽  
Michael X Cohen ◽  
Laura Dekkers ◽  
Elizabeth Milne ◽  
Rasa Gulbinaite ◽  
...  

Background: People with an Autism Spectrum Condition diagnosis (ASD) are hypothesized to show atypical neural dynamics, reflecting differences in neural structure and function. However, previous results regarding neural dynamics in autistic individuals have not converged on a single pattern of differences. It is possible that the differences are cognitive-set-specific, and we therefore measured EEG in autistic individuals and matched controls during three different cognitive states: resting, visual perception, and cognitive control.

Methods: Young adults with and without an ASD (N=17 in each group) matched on age (range 20 to 30 years), sex, and estimated Intelligence Quotient (IQ) were recruited. We measured their behavior and their EEG during rest, a task requiring low-level visual perception of gratings of varying spatial frequency, and the “Simon task” to elicit activity in the executive control network. We computed EEG power and Inter-Site Phase Clustering (ISPC; a measure of connectivity) in various frequency bands.

Results: During rest, there were no ASD vs. controls differences in EEG power, suggesting typical oscillation power at baseline. During visual processing, without pre-baseline normalization, we found decreased broadband EEG power in ASD vs. controls, but this was not the case during the cognitive control task. Furthermore, the behavioral results of the cognitive control task suggest that autistic adults were better able to ignore irrelevant stimuli.

Conclusions: Together, our results defy a simple explanation of overall differences between ASD and controls, and instead suggest a more nuanced pattern of altered neural dynamics that depend on which neural networks are engaged.
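Inter-Site Phase Clustering, the connectivity measure named above, is the magnitude of the trial-averaged phase-angle difference between two channels. The sketch below (assuming NumPy/SciPy and band-pass-filtered input) illustrates the measure itself; it is not the authors' analysis code.

```python
import numpy as np
from scipy.signal import hilbert

def ispc(x, y):
    """Inter-Site Phase Clustering between two channels, computed over trials.

    x, y: arrays of shape (n_trials, n_samples), already band-pass filtered
    at the frequency of interest. Returns one value per time point:
    1 means a perfectly consistent phase lag across trials, 0 means none.
    """
    phase_x = np.angle(hilbert(x, axis=-1))  # instantaneous phase per trial
    phase_y = np.angle(hilbert(y, axis=-1))
    # average the unit phasors of the phase differences across trials
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y)), axis=0))
```

Two channels locked at any constant lag give ISPC near 1; channels with unrelated phases across trials give values near 1/sqrt(n_trials).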


The construction of directionally selective units, and their use in the processing of visual motion, are considered. The zero crossings of ∇²G(x, y) ∗ I(x, y) are located, as in Marr & Hildreth (1980). That is, the image is filtered through centre-surround receptive fields, and the zero values in the output are found. In addition, the time derivative ∂[∇²G(x, y) ∗ I(x, y)]/∂t is measured at the zero crossings, and serves to constrain the local direction of motion to within 180°. The direction of motion can be determined in a second stage, for example by combining the local constraints. The second part of the paper suggests a specific model of the information processing by the X and Y cells of the retina and lateral geniculate nucleus, and certain classes of cortical simple cells. A number of psychophysical and neurophysiological predictions are derived from the theory.
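The first stage described above can be sketched in a few lines: filter each frame with a Laplacian-of-Gaussian (centre-surround) kernel, find the zero crossings, and take the sign of the temporal derivative there. This is a minimal illustration assuming SciPy's `gaussian_laplace`; function names and the finite-difference time derivative are choices of this sketch, not the paper's.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

def zero_crossings(f):
    """Boolean mask where f changes sign between horizontal or vertical neighbours."""
    sign = np.sign(f)
    zc = np.zeros(f.shape, dtype=bool)
    zc[:, :-1] |= sign[:, :-1] != sign[:, 1:]
    zc[:-1, :] |= sign[:-1, :] != sign[1:, :]
    return zc

def motion_constraints(frame_t0, frame_t1, sigma=2.0):
    """Zero crossings of the LoG-filtered frame and the sign of d/dt[LoG * I] there.

    The sign at each zero crossing constrains the local direction of motion
    to within 180 degrees; entries away from zero crossings are set to 0.
    """
    f0 = gaussian_laplace(frame_t0.astype(float), sigma)
    f1 = gaussian_laplace(frame_t1.astype(float), sigma)
    zc = zero_crossings(f0)
    dt = f1 - f0  # finite-difference approximation of the time derivative
    return zc, np.sign(dt) * zc
```

The paper's second stage (combining these local half-plane constraints into a motion direction) would operate on the returned sign map.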


2008 ◽  
Vol 364 (1516) ◽  
pp. 463-470 ◽  
Author(s):  
Devi Stuart-Fox ◽  
Adnan Moussalli

Organisms capable of rapid physiological colour change have become model taxa in the study of camouflage because they are able to respond dynamically to the changes in their visual environment. Here, we briefly review the ways in which studies of colour changing organisms have contributed to our understanding of camouflage and highlight some unique opportunities they present. First, from a proximate perspective, comparison of visual cues triggering camouflage responses and the visual perception mechanisms involved can provide insight into general visual processing rules. Second, colour changing animals can potentially tailor their camouflage response not only to different backgrounds but also to multiple predators with different visual capabilities. We present new data showing that such facultative crypsis may be widespread in at least one group, the dwarf chameleons. From an ultimate perspective, we argue that colour changing organisms are ideally suited to experimental and comparative studies of evolutionary interactions between the three primary functions of animal colour patterns: camouflage; communication; and thermoregulation.


2018 ◽  
Vol 1 ◽  
pp. 205920431877823 ◽  
Author(s):  
Linda Becker

Musical expertise can lead to neural plasticity in specific cognitive domains (e.g., in auditory music perception). However, not much is known about whether the visual perception of simple musical symbols (e.g., notes) already differs between musicians and non-musicians. This was the aim of the present study. Therefore, the Familiarity Effect (FE) – an effect which occurs quite early during visual processing and which is based on prior knowledge or expertise – was investigated. The FE describes the phenomenon that it is easier to find an unfamiliar element (e.g., a mirrored eighth note) in familiar elements (e.g., normally oriented eighth notes) than to find a familiar element in a background of unfamiliar elements. It was examined whether the strength of the FE for eighth notes differs between note readers and non-note readers. Furthermore, it was investigated at which component of the event-related brain potential (ERP) the FE occurs. Stimuli that consisted of either eighth notes or vertically mirrored eighth notes were presented to the participants (28 note readers, 19 non-note readers). A target element was embedded in half of the trials. Reaction times, sensitivity, and three ERP components (the N1, N2p, and P3) were recorded. For both the note readers and the non-note readers, strong FEs were found in the behavioral data. However, no differences in the strength of the FE between groups were found. Furthermore, for both groups, the FE was found for the same ERP components (target-absent trials – N1 latency; target-present trials – N2p latency, N2p amplitude, P3 amplitude). It is concluded that the early visual perception of eighth note symbols does not differ between note readers and non-note readers. However, future research is needed to verify this for more complex musical stimuli and for professional musicians.


2009 ◽  
Vol 26 (1) ◽  
pp. 35-49 ◽  
Author(s):  
Thorsten Hansen ◽  
Karl R. Gegenfurtner

Abstract
Form vision is traditionally regarded as processing primarily achromatic information. Previous investigations into the statistics of color and luminance in natural scenes have claimed that luminance and chromatic edges are not independent of each other and that any chromatic edge most likely occurs together with a luminance edge of similar strength. Here we computed the joint statistics of luminance and chromatic edges in over 700 calibrated color images from natural scenes. We found that isoluminant edges exist in natural scenes and were not rarer than pure luminance edges. Most edges combined luminance and chromatic information but to varying degrees such that luminance and chromatic edges were statistically independent of each other. Independence increased along successive stages of visual processing from cones via postreceptoral color-opponent channels to edges. The results show that chromatic edge contrast is an independent source of information that can be linearly combined with other cues for the proper segmentation of objects in natural and artificial vision systems. Color vision may have evolved in response to the natural scene statistics to gain access to this independent information.
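The kind of joint edge statistics described above can be illustrated with a toy version of the analysis: split an RGB image into a luminance channel and a red-green opponent channel, compute a gradient-magnitude edge map for each, and correlate the two maps. The channel definitions below are simplified stand-ins for the cone-based color-opponent channels used in the study.

```python
import numpy as np

def edge_maps(rgb):
    """Gradient-magnitude edge maps of toy luminance and red-green channels.

    rgb: float array of shape (H, W, 3) with values in [0, 1].
    """
    lum = rgb[..., 0] + rgb[..., 1]   # L+M-like luminance proxy
    rg = rgb[..., 0] - rgb[..., 1]    # L-M-like red-green proxy

    def grad_mag(channel):
        gy, gx = np.gradient(channel)
        return np.hypot(gx, gy)

    return grad_mag(lum), grad_mag(rg)

def edge_correlation(rgb):
    """Pearson correlation between the luminance and chromatic edge maps."""
    e_lum, e_chr = edge_maps(rgb)
    return np.corrcoef(e_lum.ravel(), e_chr.ravel())[0, 1]
```

An isoluminant red-green boundary produces a strong chromatic edge with no luminance edge at all, which is exactly the kind of edge the study found to be common in natural scenes.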


Retinal prostheses are devices that receive an environmental visual stimulus, process it, and stimulate the degenerated retinal areas in order to produce a functionally efficient visual perception. Indications for implantation of these devices include hereditary retinal degenerations like retinitis pigmentosa, choroideremia, and rod-cone dystrophy, and acquired macular diseases like geographic atrophy or fibrosis due to age-related macular degeneration. Clinically applied retinal prosthesis approaches can be classified as epiretinal, subretinal, suprachoroidal, and scleral (transscleral suprachoroidal). In this paper, the approaches of retinal prosthesis research groups, the results of clinical trials, and the latest advances in their projects are reviewed.


2020 ◽  
Vol 104 (12) ◽  
pp. 1730-1734 ◽  
Author(s):  
Sandra R Montezuma ◽  
Susan Y Sun ◽  
Arup Roy ◽  
Avi Caspi ◽  
Jessy D Dorn ◽  
...  

Aim: To demonstrate the potential clinically meaningful benefits of a thermal camera integrated with the Argus II, an artificial vision therapy system, for assisting Argus II users in localising and discriminating heat-emitting objects.
Methods: Seven blind patients implanted with the Argus II retinal prosthesis participated in the study. Two tasks were investigated: (1) localising up to three heat-emitting objects by indicating their locations and (2) discriminating a specific heated object out of three presented on a table. Heat-emitting objects placed on the table included a toaster, a flat iron, an electric kettle, a heating pad and a mug of hot water. Subjects completed the two tasks using the unmodified Argus II system with a visible-light camera and the thermal-camera-integrated Argus II.
Results: Subjects more accurately localised heated objects displayed on a table (p=0.011) and discriminated a specific type of object (p=0.005) with the thermal camera integrated with the Argus II versus the unmodified Argus II camera.
Conclusions: The thermal camera integrated with the artificial vision therapy system helps users to locate and differentiate heat-emitting objects more precisely than a visible-light sensor. The integration of the thermal camera with the Argus II may have significant benefits in patients’ daily life.

