Identification of lexical–phonological networks in the superior temporal sulcus using functional magnetic resonance imaging

Neuroreport, 2006, Vol 17 (12), pp. 1293-1296
Author(s): Kayoko Okada, Gregory Hickok

2008, Vol 20 (1), pp. 108-119
Author(s): Simone Materna, Peter W. Dicke, Peter Thier

Previous imaging work has shown that the superior temporal sulcus (STS) region and the intraparietal sulcus (IPS) are specifically activated during the passive observation of shifts in eye gaze [Pelphrey, K. A., Singerman, J. D., Allison, T., & McCarthy, G. Brain activation evoked by perception of gaze shifts: The influence of context. Neuropsychologia, 41, 156–170, 2003; Hoffman, E. A., & Haxby, J. V. Distinct representations of eye gaze and identity in the distributed human neural system for face perception. Nature Neuroscience, 3, 80–84, 2000; Puce, A., Allison, T., Bentin, S., Gore, J. C., & McCarthy, G. Temporal cortex activation in humans viewing eye and mouth movements. Journal of Neuroscience, 18, 2188–2199, 1998; Wicker, B., Michel, F., Henaff, M. A., & Decety, J. Brain regions involved in the perception of gaze: A PET study. Neuroimage, 8, 221–227, 1998]. Are the same brain regions also involved in extracting gaze direction in order to establish joint attention? In an event-related functional magnetic resonance imaging experiment, healthy human subjects actively followed the directional cue provided by the eyes of another person toward an object in space or, in the control condition, used a nondirectional symbolic cue to make an eye movement toward an object in space. Our results show that the posterior part of the STS region and the cuneus are specifically involved in extracting and using detailed directional information from the eyes of another person to redirect one's own gaze and establish joint attention. The IPS, on the other hand, seems to be involved in encoding spatial direction and mediating shifts of spatial attention independent of the type of cue that triggers this process.
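
To make the event-related contrast described above more concrete, the following is a minimal, self-contained sketch in Python (NumPy/SciPy) of a single-voxel general linear model comparing a gaze-cue regressor with a symbolic-cue regressor. The event timings, regressor names, and simulated voxel response are illustrative assumptions, not parameters taken from the study.

```python
# Sketch: event-related GLM contrast (gaze cue vs. symbolic cue) on synthetic data.
import numpy as np
from scipy.stats import gamma, t as t_dist

rng = np.random.default_rng(0)
tr, n_scans = 2.0, 200                    # repetition time (s) and number of volumes
frame_times = np.arange(n_scans) * tr     # acquisition times: 0, 2, ..., 398 s

def hrf(times):
    """Canonical double-gamma haemodynamic response function (SPM-style)."""
    return gamma.pdf(times, 6) - gamma.pdf(times, 16) / 6.0

def event_regressor(onsets, frame_times):
    """Convolve a stick function of event onsets with the canonical HRF."""
    dt = 0.1                                               # high-resolution grid (s)
    hires = np.zeros(int(frame_times[-1] / dt) + 1)
    hires[np.round(np.asarray(onsets) / dt).astype(int)] = 1.0
    conv = np.convolve(hires, hrf(np.arange(0, 30, dt)))[: hires.size]
    return np.interp(frame_times, np.arange(hires.size) * dt, conv)

# Hypothetical alternating onsets (s) for the two cue types -- not the study's timings
gaze_onsets = np.arange(10.0, 380.0, 40.0)
symbolic_onsets = gaze_onsets + 20.0

X = np.column_stack([
    event_regressor(gaze_onsets, frame_times),
    event_regressor(symbolic_onsets, frame_times),
    np.ones(n_scans),                                      # intercept
])

# Simulated voxel time series that responds more strongly to the gaze cue
y = X @ np.array([1.5, 0.4, 100.0]) + rng.normal(0.0, 1.0, n_scans)

# Ordinary least-squares fit and a t-statistic for the [gaze - symbolic] contrast
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta
dof = n_scans - X.shape[1]
sigma2 = resid @ resid / dof
c = np.array([1.0, -1.0, 0.0])
t_stat = c @ beta / np.sqrt(sigma2 * c @ np.linalg.inv(X.T @ X) @ c)
print(f"gaze > symbolic: t({dof}) = {t_stat:.2f}, "
      f"p = {2 * t_dist.sf(abs(t_stat), dof):.3g}")
```

In an actual analysis this contrast would be estimated at every voxel and the resulting statistical map thresholded; the sketch only illustrates the design-matrix and contrast logic for one voxel.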


1998, Vol 41 (3), pp. 538-548
Author(s): Sean C. Huckins, Christopher W. Turner, Karen A. Doherty, Michael M. Fonte, Nikolaus M. Szeverenyi

Functional magnetic resonance imaging (fMRI) holds exciting potential as a research and clinical tool for exploring the human auditory system. This noninvasive technique measures discrete changes in cerebral cortical blood flow in response to sensory stimuli, permitting determination of the precise neuroanatomical locations of the underlying brain parenchymal activity. Application of fMRI in auditory research, however, has been limited. One problem is that fMRI using echo-planar imaging (EPI) generates intense acoustic noise that could affect the results of auditory experiments. In addition, issues relating to the reliability of fMRI for listeners with normal hearing need to be resolved before the technique can be used to study listeners with hearing loss. This preliminary study examines the feasibility of using fMRI in auditory research through a simple set of experiments testing the reliability of scanning parameters with higher resolution and signal-to-noise ratio than previously reported in the literature. We used consonant-vowel (CV) speech stimuli to investigate whether we could observe reproducible and consistent changes in cortical blood flow in listeners during a single scanning session, across more than one scanning session, and in more than one listener. In addition, we wanted to determine whether there were differences between CV speech and nonspeech complex stimuli across listeners. Our study shows reproducibility within and across listeners for CV speech stimuli: results were reproducible within fMRI scanning sessions for 5 of 9 listeners and across fMRI scanning sessions for 6 of 8 listeners. For nonspeech complex stimuli, activity was observed in 4 of 9 individuals tested.
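
One simple way to quantify the kind of within- and across-session reproducibility reported above is the overlap (Dice coefficient) between thresholded activation maps from two sessions. The sketch below uses synthetic z-score maps and an assumed z > 3.1 threshold; the study's actual reproducibility criteria are not specified here.

```python
# Sketch: Dice overlap between thresholded activation maps from two sessions.
import numpy as np

rng = np.random.default_rng(1)

def dice(mask_a, mask_b):
    """Dice coefficient between two boolean activation masks."""
    overlap = np.logical_and(mask_a, mask_b).sum()
    return 2.0 * overlap / (mask_a.sum() + mask_b.sum())

# Synthetic z-score maps for two sessions that share one "auditory" cluster
shape = (32, 32, 16)
signal = np.zeros(shape)
signal[10:16, 8:14, 6:10] = 4.0                  # shared active region
session1 = signal + rng.normal(0.0, 1.0, shape)
session2 = signal + rng.normal(0.0, 1.0, shape)

threshold = 3.1                                  # assumed cluster-forming threshold
print(f"Dice overlap: {dice(session1 > threshold, session2 > threshold):.2f}")
```

A Dice value near 1 indicates that the same voxels exceed threshold in both sessions, while a value near 0 indicates little overlap; counting how many listeners exceed a chosen overlap criterion is one way to summarize reproducibility across a group.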

