Neuronal activity in the lateral cerebellum of trained monkeys, related to visual stimuli or to eye movements.

1990 ◽  
Vol 428 (1) ◽  
pp. 595-614 ◽  
Author(s):  
D E Marple-Horvat ◽  
J F Stein


Sensors ◽  
2021 ◽  
Vol 21 (4) ◽  
pp. 1394
Author(s):  
Asad Ali ◽  
Sanaul Hoque ◽  
Farzin Deravi

Presentation attack artefacts can be used to subvert the operation of biometric systems by being presented to the sensors of such systems. In this work, we propose the use of visual stimuli with randomised trajectories to stimulate eye movements for the detection of such spoofing attacks. The presentation of a moving visual challenge is used to ensure that some pupillary motion is stimulated and then captured with a camera. Various types of challenge trajectories are explored on different planar geometries representing prospective devices where the challenge could be presented to users. To evaluate the system, photo, 2D mask and 3D mask attack artefacts were used and pupillary movement data were captured from 80 volunteers performing genuine and spoofing attempts. The results support the potential of the proposed features for the detection of biometric presentation attacks.


Sensors ◽  
2021 ◽  
Vol 21 (15) ◽  
pp. 5178
Author(s):  
Sangbong Yoo ◽  
Seongmin Jeong ◽  
Seokyeon Kim ◽  
Yun Jang

Gaze movement and visual stimuli have been utilized to analyze human visual attention intuitively. Gaze behavior studies mainly show statistical analyses of eye movements and human visual attention. During these analyses, eye movement data and the saliency map are presented to the analysts as separate views or merged views. However, analysts become frustrated when they need to memorize all of the separate views or when the eye movements obscure the saliency map in the merged views. Therefore, it is not easy to analyze how visual stimuli affect gaze movements, since existing techniques focus excessively on the eye movement data. In this paper, we propose a novel visualization technique for analyzing gaze behavior that uses saliency features as visual clues to express the visual attention of an observer. The visual clues that represent visual attention are analyzed to reveal which saliency features are prominent for the visual stimulus analysis. We visualize the gaze data with the saliency features to interpret the visual attention. We analyze gaze behavior with the proposed visualization to show that embedding saliency features within the visualization helps us understand the visual attention of an observer.
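The core operation of attaching saliency features to gaze data can be sketched simply: sample the saliency map under each fixation so that every gaze point carries a visual-clue value. This is a toy illustration under assumed data shapes, not the paper's actual visualization pipeline.

```python
import numpy as np

def saliency_at_fixations(saliency_map, fixations):
    """Sample the saliency map under each fixation to obtain a visual-clue value.

    saliency_map : (H, W) array of per-pixel saliency in [0, 1]
    fixations    : (N, 2) array of (x, y) gaze coordinates in pixels
    """
    h, w = saliency_map.shape
    xs = np.clip(fixations[:, 0].astype(int), 0, w - 1)
    ys = np.clip(fixations[:, 1].astype(int), 0, h - 1)
    return saliency_map[ys, xs]

# Toy saliency map: a single bright (salient) square region.
saliency = np.zeros((100, 100))
saliency[40:60, 40:60] = 1.0

# Three fixations: two on the salient object, one on the background.
fixations = np.array([[50, 50], [10, 10], [55, 45]])
clues = saliency_at_fixations(saliency, fixations)
print(clues)  # [1. 0. 1.]
```

The returned clue values could then drive the rendering of each gaze point (size, color) so that saliency information is embedded in the gaze visualization rather than shown in a separate view.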


1996 ◽  
Vol 76 (3) ◽  
pp. 1439-1456 ◽  
Author(s):  
P. Mazzoni ◽  
R. M. Bracewell ◽  
S. Barash ◽  
R. A. Andersen

1. The lateral intraparietal area (area LIP) of the monkey's posterior parietal cortex (PPC) contains neurons that are active during saccadic eye movements. These neurons' activity includes visual and saccade-related components. These responses are spatially tuned, and the location of a neuron's visual receptive field (RF) relative to the fovea generally overlaps its preferred saccade amplitude and direction (i.e., its motor field, MF). When a delay is imposed between the presentation of a visual stimulus and a saccade made to its location (memory saccade task), many LIP neurons maintain elevated activity during the delay (memory activity, M), which appears to encode the metrics of the next intended saccadic eye movement. Recent studies have alternatively suggested that LIP neurons encode the locations of visual stimuli regardless of where the animal intends to look. We examined whether the M activity of LIP neurons specifically encodes movement intention, the locations of recent visual stimuli, or a combination of both. In the accompanying study, we investigated whether the intended-movement activity reflects changes in motor plan. 2. We trained monkeys (Macaca mulatta) to memorize the locations of two visual stimuli and plan a sequence of two saccades, one to each remembered target, as we recorded the activity of single LIP neurons. Two targets were flashed briefly while the monkey maintained fixation; after a delay the fixation point was extinguished, and the monkey made two saccades in sequence, one to each target's remembered location, in the order in which the targets had been presented. This "delayed double saccade" (DDS) paradigm allowed us to dissociate the location of visual stimulation from the direction of the planned saccade and thus distinguish neuronal activity related to the target's location from activity related to the saccade plan.
By imposing a delay, we eliminated the confounding effect of any phasic responses coincident with the appearance of the stimulus and with the saccade. 3. We arranged the two visual stimuli so that in one set of conditions at least the first one was in the neuron's visual RF, and thus the first saccade was in the neuron's motor field (MF). M activity should be high in these conditions according to both the sensory memory and motor plan hypotheses. In another set of conditions, the second stimulus appeared in the RF but the first one was presented outside the RF, instructing the monkey to plan the first saccade away from the neuron's MF. If the M activity encodes the motor plan, it should be low in these conditions, reflecting the plan for the first saccade (away from the MF). If it is a sensory trace of the stimulus' location, it should be high, reflecting stimulation of the RF by the second target. 4. We tested 49 LIP neurons (in 3 hemispheres of 2 monkeys) with M activity on the DDS task. Of these, 38 (77%) had M activity related to the next intended saccade. They were active in the delay period, as expected, if the first saccade was in their preferred direction. They were less active or silent if the next saccade was not in their preferred direction, even when the second stimulus appeared in their RF. 5. The M activity of 8 (16%) of the remaining neurons specifically encoded the location of the most recent visual stimulus. Their firing rate during the delay reflected stimulation of the RF independently of the saccade being planned. The remaining 3 neurons had M activity that did not consistently encode either the next saccade or the stimulus' location. 6. We also recorded the activity of a subset of neurons (n = 38) in a condition in which no stimulus appeared in a neuron's RF, but the second saccade was in the neuron's MF. 
In this case the majority of neurons tested (23/38, 60%) became active in the period between the first and second saccade, even if neither stimulus had appeared in their RF. Moreover, this activity appeared only after the first saccade had started in all but two of these neurons.


2000 ◽  
Vol 83 (1) ◽  
pp. 625-629 ◽  
Author(s):  
Stefano Ferraina ◽  
Martin Paré ◽  
Robert H. Wurtz

Information about depth is necessary to generate saccades to visual stimuli located in three-dimensional space. To determine whether monkey frontal eye field (FEF) neurons play a role in the visuo-motor processes underlying this behavior, we studied their visual responses to stimuli at different disparities. Disparity sensitivity was tested from 3° of crossed disparity (near) to 3° of uncrossed disparity (far). The responses of about two-thirds of FEF visual and visuo-movement neurons were sensitive to disparity and showed a broad tuning in depth for near or far disparities. Early phasic and late tonic visual responses often displayed different disparity sensitivity. These findings provide evidence of depth-related signals in FEF and suggest a role for FEF in the control of disconjugate as well as conjugate eye movements.


1999 ◽  
Vol 82 (5) ◽  
pp. 2612-2632 ◽  
Author(s):  
Pierre A. Sylvestre ◽  
Kathleen E. Cullen

The mechanics of the eyeball and its surrounding tissues, which together form the oculomotor plant, have been shown to be the same for smooth pursuit and saccadic eye movements. Hence it was postulated that similar signals would be carried by motoneurons during slow and rapid eye movements. In the present study, we directly addressed this proposal by determining which eye movement–based models best describe the discharge dynamics of primate abducens neurons during a variety of eye movement behaviors. We first characterized abducens neuron spike trains, as has been classically done, during fixation and sinusoidal smooth pursuit. We then systematically analyzed the discharge dynamics of abducens neurons during and following saccades, during step-ramp pursuit, and during high-velocity slow-phase vestibular nystagmus. We found that the commonly utilized first-order description of abducens neuron firing rates (FR = b + kE + rĖ, where FR is firing rate, E and Ė are eye position and velocity, respectively, and b, k, and r are constants) provided an adequate model of neuronal activity during saccades, smooth pursuit, and slow-phase vestibular nystagmus. However, the use of a second-order model, which included an exponentially decaying term or "slide" (FR = b + kE + rĖ + uË − cḞR, where Ë is eye acceleration and ḞR is the time derivative of the firing rate), notably improved our ability to describe neuronal activity when the eye was moving and also enabled us to model abducens neuron discharges during the postsaccadic interval. We also found that, for a given model, a single set of parameters could not be used to describe neuronal firing rates during both slow and rapid eye movements. Specifically, the eye velocity and position coefficients (r and k in the above models, respectively) consistently decreased as a function of the mean (and peak) eye velocity that was generated. In contrast, the bias (b, the firing rate when looking straight ahead) invariably increased with eye velocity.
Although these trends are likely to reflect, in part, nonlinearities that are intrinsic to the extraocular muscles, we propose that these results can also be explained by considering the time-varying resistance to movement that is generated by the antagonist muscle. We conclude that to create realistic and meaningful models of the neural control of horizontal eye movements, it is essential to consider the activation of the antagonist, as well as agonist motoneuron pools.
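The first-order description FR = b + kE + rĖ is linear in its constants, so given recorded eye position, velocity, and firing rate, b, k, and r can be recovered by ordinary least squares. A minimal sketch on simulated sinusoidal-pursuit data (all parameter values are illustrative, not taken from the study):

```python
import numpy as np

# Simulated sinusoidal smooth pursuit: eye position E (deg) and velocity Edot (deg/s).
t = np.linspace(0.0, 2.0, 500)
E = 10.0 * np.sin(2.0 * np.pi * 0.5 * t)
Edot = np.gradient(E, t)

# Generate a firing rate from the first-order model with known constants.
b_true, k_true, r_true = 100.0, 4.0, 0.9
FR = b_true + k_true * E + r_true * Edot

# Recover (b, k, r) by ordinary least squares on the regressors [1, E, Edot].
X = np.column_stack([np.ones_like(t), E, Edot])
b_hat, k_hat, r_hat = np.linalg.lstsq(X, FR, rcond=None)[0]
print(round(b_hat, 3), round(k_hat, 3), round(r_hat, 3))  # 100.0 4.0 0.9
```

Fitting the same regression separately on slow and rapid movements would expose the behavior described in the abstract: no single (b, k, r) accounts for both regimes, with r and k falling and b rising as eye velocity increases.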


2015 ◽  
Vol 12 (3) ◽  
pp. 036014 ◽  
Author(s):  
Shogo Ohmae ◽  
Toshimitsu Takahashi ◽  
Xiaofeng Lu ◽  
Yasunori Nishimori ◽  
Yasushi Kodaka ◽  
...  

2021 ◽  
pp. 2150048
Author(s):  
Hamidreza Namazi ◽  
Avinash Menon ◽  
Ondrej Krejcar

Our eyes are constantly exploring the surrounding environment, and the brain controls their activity through the nervous system. Hence, analyzing the correlation between the activities of the eyes and the brain is an important area of research in vision science. This paper evaluates the coupling between the reactions of the eyes and the brain in response to different moving visual stimuli. Since both eye movements and EEG signals (as indicators of brain activity) contain information, we employed Shannon entropy to decode the coupling between them. Ten subjects looked at four moving objects (dynamic visual stimuli) with different information contents while we recorded their EEG signals and eye movements. The results demonstrated that the changes in the information contents of eye movements and EEG signals are strongly correlated, which indicates a strong correlation between brain and eye activities. This analysis could be extended to evaluate the correlation between the activities of other organs and the brain.
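The entropy-based analysis can be illustrated with synthetic data. This is a toy sketch under assumed signal models, not the authors' pipeline: Shannon entropy is computed over a fixed amplitude histogram for each modality and each stimulus, and the resulting entropy series are then correlated across stimuli.

```python
import numpy as np

def shannon_entropy(signal, edges):
    """Shannon entropy (bits) of a signal's amplitude distribution over fixed bins."""
    counts, _ = np.histogram(signal, bins=edges)
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(0)
edges = np.linspace(-10.0, 10.0, 33)  # fixed bin edges shared by both modalities

# Toy stand-ins for per-stimulus EEG and eye-movement recordings whose
# variability grows with the information content of the moving stimulus.
h_eeg, h_eye = [], []
for scale in [0.5, 1.0, 2.0, 4.0]:
    h_eeg.append(shannon_entropy(rng.normal(scale=scale, size=2000), edges))
    h_eye.append(shannon_entropy(rng.normal(scale=scale, size=2000), edges))

# Correlate the information contents of the two modalities across stimuli.
r = np.corrcoef(h_eeg, h_eye)[0, 1]
print(round(r, 2))
```

Because both entropy series grow monotonically with the stimulus variability in this toy setup, their Pearson correlation is close to 1, mirroring the kind of strong eye–brain coupling the abstract reports.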

