Retinal Image Motion
Recently Published Documents


TOTAL DOCUMENTS: 63 (FIVE YEARS: 6)

H-INDEX: 18 (FIVE YEARS: 1)

2021 ◽  
Vol 21 (9) ◽  
pp. 2046
Author(s):  
Michele A. Cox ◽  
Janis Intoy ◽  
Benjamin Moon ◽  
Ruei-Jr Wu ◽  
Jonathan D. Victor ◽  
...  

Author(s):  
Alexander Fanning ◽  
Amin Shakhawat ◽  
Jennifer L Raymond

The climbing fiber input to the cerebellum conveys instructive signals that can induce synaptic plasticity and learning by triggering complex spikes, accompanied by large calcium transients, in Purkinje cells. In the cerebellar flocculus, which supports oculomotor learning, complex spikes are driven by image motion on the retina, which could indicate an oculomotor error. In the same neurons, complex spikes can also be driven by non-visual signals. Because the calcium transients accompanying each complex spike can vary in amplitude, even within a given cell, we compared the calcium responses associated with the visual and non-visual inputs to floccular Purkinje cells. The calcium indicator GCaMP6f was selectively expressed in Purkinje cells, and fiber photometry was used to record the calcium responses from a population of Purkinje cells in the flocculus of awake, behaving mice. During visual (optokinetic) stimulation and during pairing of vestibular and visual stimuli, the calcium level increased during contraversive retinal image motion. During performance of the vestibulo-ocular reflex in the dark, calcium increased during contraversive head rotation and the associated ipsiversive eye movements. The amplitude of this non-visual calcium response was comparable to that observed under learning-inducing conditions in which retinal image motion was present. Thus, the population calcium responses of Purkinje cells in the cerebellar flocculus to visual and non-visual input resemble what has been reported previously for complex spikes, suggesting that multimodal instructive signals control the synaptic plasticity supporting oculomotor learning.
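
As an illustration of the kind of analysis behind such population measurements, the sketch below computes a dF/F trace from a raw fiber-photometry signal and averages it around stimulus onsets. This is a generic outline, not the authors' pipeline: the sampling rate, sliding-percentile baseline, window lengths, and event times are all illustrative assumptions.

```python
import numpy as np

def delta_f_over_f(raw, fs, baseline_s=10.0):
    """Compute dF/F from a raw photometry trace using a sliding baseline
    (10th percentile over the preceding `baseline_s` seconds). The
    percentile and window choices are assumptions, not from the paper."""
    win = int(baseline_s * fs)
    f0 = np.array([np.percentile(raw[max(0, i - win):i + 1], 10)
                   for i in range(len(raw))])
    return (raw - f0) / f0

def stimulus_triggered_average(dff, onsets, fs, pre_s=1.0, post_s=3.0):
    """Average the dF/F trace in a window around each stimulus onset
    (onsets given in samples)."""
    pre, post = int(pre_s * fs), int(post_s * fs)
    segments = [dff[t - pre:t + post] for t in onsets
                if t - pre >= 0 and t + post <= len(dff)]
    return np.mean(segments, axis=0)

# Example: response around hypothetical contraversive-motion onsets.
fs = 100.0                                   # sampling rate (Hz), assumed
trace = np.random.randn(60_000) * 0.01 + 1.0 # placeholder raw signal
dff = delta_f_over_f(trace, fs)
contra_onsets = [5_000, 15_000, 25_000]      # hypothetical event times
contra_avg = stimulus_triggered_average(dff, contra_onsets, fs)
```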


2020 ◽  
Vol 20 (7) ◽  
pp. 34
Author(s):  
Alexander G. Anderson ◽  
Kavitha Ratnam ◽  
Austin Roorda ◽  
Bruno A. Olshausen

2019 ◽  
Vol 19 (10) ◽  
pp. 148
Author(s):  
Michele A Cox ◽  
Norick R Bowers ◽  
Janis Intoy ◽  
Martina Poletti ◽  
Michele Rucci

2019 ◽  
Author(s):  
Shih-Jung Hsu ◽  
Bo Cheng

In the presence of wind or background image motion, flies maintain a constant retinal-image velocity by regulating their flight speed, to the extent permitted by their locomotor capacity. Here we investigated the speed regulation of semi-tethered blue-bottle flies (Calliphora vomitoria) flying along an annular corridor in a magnetically levitated flight mill enclosed by two motorized cylindrical walls. We perturbed the flies' retinal-image motion by spinning the cylindrical walls, generating bilaterally averaged velocity perturbations from −0.3 to 0.3 m·s⁻¹. Flies compensated for retinal-image velocity perturbations by adjusting their airspeed by up to 20%, thereby maintaining a relatively constant retinal-image velocity. When the retinal-image velocity perturbation exceeded ∼0.1 m·s⁻¹, the compensation weakened as airspeed plateaued, suggesting that the flies were unable to change airspeed further. The compensation gain, i.e., the ratio of airspeed compensation to retinal-image velocity perturbation, depended on the spatial frequency of the grating patterns and was largest at 12 m⁻¹.
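
A minimal sketch of the compensation-gain calculation this abstract describes: the gain is the ratio of the fly's airspeed change to the imposed retinal-image velocity perturbation. All numbers below are illustrative placeholders, not data from the study.

```python
import numpy as np

# Imposed wall-velocity perturbations (m/s), spanning the range used above.
perturbation = np.array([-0.3, -0.2, -0.1, 0.1, 0.2, 0.3])

# Hypothetical airspeed changes relative to unperturbed flight (m/s);
# full compensation would match the perturbation exactly (gain = 1).
airspeed_change = np.array([-0.08, -0.07, -0.06, 0.06, 0.07, 0.08])

# Compensation gain: ratio of airspeed compensation to perturbation.
gain = airspeed_change / perturbation
print(gain)  # gain near 1 means complete compensation; the airspeed
             # plateau at large perturbations appears as gain falling
             # toward 0 at the ends of the range
```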


2019 ◽  
Vol 19 (4) ◽  
pp. 2
Author(s):  
Adela S. Y. Park ◽  
Andrew B. Metha ◽  
Phillip A. Bedggood ◽  
Andrew J. Anderson

2018 ◽  
Vol 18 (10) ◽  
pp. 372
Author(s):  
Janis Intoy ◽  
Norick Bowers ◽  
Jonathan Victor ◽  
Martina Poletti ◽  
Michele Rucci

2017 ◽  
Vol 17 (1) ◽  
pp. 30
Author(s):  
Kavitha Ratnam ◽  
Niklas Domdei ◽  
Wolf M. Harmening ◽  
Austin Roorda

2016 ◽  
Vol 116 (3) ◽  
pp. 1449-1467
Author(s):  
HyungGoo R. Kim ◽  
Xaq Pitkow ◽  
Dora E. Angelaki ◽  
Gregory C. DeAngelis

Sensory input reflects events that occur in the environment, but multiple events may be confounded in sensory signals. For example, under many natural viewing conditions, retinal image motion reflects some combination of self-motion and movement of objects in the world. To estimate one stimulus event and ignore others, the brain can perform marginalization operations, but the neural bases of these operations are poorly understood. Using computational modeling, we examine how multisensory signals may be processed to estimate the direction of self-motion (i.e., heading) and to marginalize out effects of object motion. Multisensory neurons represent heading based on both visual and vestibular inputs and come in two basic types: “congruent” and “opposite” cells. Congruent cells have matched heading tuning for visual and vestibular cues and have been linked to perceptual benefits of cue integration during heading discrimination. Opposite cells have mismatched visual and vestibular heading preferences and are ill-suited for cue integration. We show that decoding a mixed population of congruent and opposite cells substantially reduces errors in heading estimation caused by object motion. In addition, we present a general formulation of an optimal linear decoding scheme that approximates marginalization and can be implemented biologically by simple reinforcement learning mechanisms. We also show that neural response correlations induced by task-irrelevant variables may greatly exceed intrinsic noise correlations. Overall, our findings suggest a general computational strategy by which neurons with mismatched tuning for two different sensory cues may be decoded to perform marginalization operations that dissociate possible causes of sensory inputs.
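
The decoding idea can be sketched in a few lines: simulate congruent and opposite cells whose responses mix heading (carried by vestibular and visual inputs) with object motion (visual only), then fit a linear readout that reports heading while cancelling the object-motion component. This toy model uses ordinary least squares in place of the reinforcement-learning rule discussed in the paper; all tuning parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate trials with a random heading and random object motion.
n_trials, n_cells = 5_000, 40
heading = rng.uniform(-30, 30, n_trials)     # deg, task-relevant
obj_motion = rng.uniform(-30, 30, n_trials)  # deg/s, to marginalize out

# Vestibular input depends only on heading; visual input (retinal image
# motion) confounds heading and object motion.
vestibular = heading
visual = heading + obj_motion

# Congruent cells: visual and vestibular tuning slopes of the same sign.
# Opposite cells: slopes of opposite sign (second half of the population).
w_vest = rng.normal(1.0, 0.2, n_cells)
sign = np.where(np.arange(n_cells) < n_cells // 2, 1.0, -1.0)
w_vis = sign * rng.normal(1.0, 0.2, n_cells)
responses = (np.outer(vestibular, w_vest) + np.outer(visual, w_vis)
             + rng.normal(0.0, 1.0, (n_trials, n_cells)))

# Linear decoder trained to report heading regardless of object motion.
weights, *_ = np.linalg.lstsq(responses, heading, rcond=None)
estimate = responses @ weights
print("decoding error (deg):", np.std(estimate - heading))
```

Because opposite cells carry the visual confound with inverted sign, the fitted weights can combine the two cell types so the object-motion component cancels, which is the marginalization the abstract describes; a population of congruent cells alone could not achieve this.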


2016 ◽  
Vol 367 ◽  
pp. 128-130
Author(s):  
Jeong-Yoon Choi ◽  
Jeong-Eun Kim ◽  
Jung Ho Han ◽  
Chang-Ho Yun ◽  
Christopher Kennard ◽  
...  
