visual reference frame
Recently Published Documents


TOTAL DOCUMENTS: 7 (five years: 0) ◽ H-INDEX: 3 (five years: 0)

2012 ◽ Vol 25 (0) ◽ pp. 103
Author(s): Michael J. Carnevale, Lisa M. Pritchett, Laurence R. Harris

Eccentric gaze systematically biases touch localization on the arm and waist. These perceptual errors suggest that touch location is at least partially coded in a visual reference frame. Here we investigated whether touches to non-visible parts of the body are also affected by gaze position. If so, can the direction of mislocalization tell us how those parts are laid out in the visual representation? To test this, an array of vibrotactors was attached to either the lower back or the forehead. During trials, participants were guided to orient their head (90° left, right, or straight ahead for touches on the lower back) or head and eyes (combinations of ±15° left, right, or straight-ahead head and eye positions for touches on the forehead) using LED fixation targets and a head-mounted laser. Participants then reoriented to straight ahead and reported perceived touch location on a visual scale using a mouse and computer screen. As in earlier experiments on the arm and waist, perceived touch location on the forehead and lower back was biased in the same direction as eccentric head and eye position. This is evidence that perceived touch location is at least partially coded in a visual reference frame, even for parts of the body that are not typically seen.


2007 ◽ Vol 98 (2) ◽ pp. 966-983
Author(s): Aaron P. Batista, Gopal Santhanam, Byron M. Yu, Stephen I. Ryu, Afsheen Afshar, ...

When a human or animal reaches out to grasp an object, the brain rapidly computes a pattern of muscular contractions that can acquire the target. This computation involves a reference frame transformation because the target's position is initially available only in a visual reference frame, yet the required control signal is a set of commands to the musculature. One of the core brain areas involved in visually guided reaching is the dorsal aspect of the premotor cortex (PMd). Using chronically implanted electrode arrays in two rhesus monkeys, we studied the contributions of PMd to the reference frame transformation for reaching. PMd neurons are influenced by the locations of reach targets relative to both the arm and the eyes. Some neurons encode reach goals using limb-centered reference frames, whereas others employ eye-centered reference frames. Some cells encode reach goals in a reference frame best described by the combined position of the eyes and hand. In addition to neurons like these, for which a reference frame could be identified, PMd also contains cells that are influenced by both the eye- and limb-centered locations of reach goals but for which a distinct reference frame could not be determined. We propose two interpretations for these neurons. First, they may encode reach goals using a reference frame we did not investigate, such as an intrinsic reference frame. Second, they may not be adequately characterized by any single reference frame.


1981 ◽ Vol 52 (2) ◽ pp. 455-458
Author(s): Colin B. Pitblado, Charles S. Mirabile, John E. Richard

Judgments of the visual vertical, made without a visual reference framework from a tilted-body position, result in systematic constant errors (Aubert effects). Pitblado and Mirabile (1977) showed that these errors vary with motion-sickness susceptibility, with persons of intermediate susceptibility showing the greatest error. Recent exploratory work suggested patterns of progressive intra-session change in Aubert effects that might further differentiate groups of differing susceptibility. The raw data from Pitblado and Mirabile's 1977 study were reanalyzed for possible progressive change. This new analysis showed significant progressive reductions in Aubert effects for the groups with originally high and low errors, but a nearly significant increase in the intermediate group. New implications concerning group differences in vestibular function are discussed.

