The Neural Bases of Egocentric Spatial Representation for Extracorporeal and Corporeal Tasks: An fMRI Study

2021, Vol 11 (8), pp. 963
Author(s): Stephanie Leplaideur, Annelise Moulinet-Raillon, Quentin Duché, Lucie Chochina, Karim Jamal, ...

(1) Background: Humans use reference frames to elaborate the spatial representations needed for all space-oriented behaviors, such as postural control, walking, or grasping. We investigated the neural bases of two egocentric tasks, the extracorporeal subjective straight-ahead task (SSA) and the corporeal subjective longitudinal body plane task (SLB), in healthy participants using functional magnetic resonance imaging (fMRI). This work was an ancillary part of a study involving stroke patients. (2) Methods: Seventeen healthy participants underwent a 3T fMRI examination. During the SSA, participants had to divide the extracorporeal space into two equal parts. During the SLB, they had to divide their body along the midsagittal plane. (3) Results: Both tasks elicited a parieto-occipital network encompassing the superior and inferior parietal lobules and the lateral occipital cortex, with right hemispheric dominance. Additionally, the SLB > SSA contrast revealed activations of the left angular and premotor cortices. These areas, involved in attention and motor imagery, suggest a greater complexity of the corporeal processes engaging body representation. (4) Conclusions: This was the first fMRI study to explore SLB-related activity and its complementarity with the SSA. Our results pave the way for the exploration of spatial cognitive impairment in patients.
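The SLB > SSA contrast described above is a standard first-level GLM contrast. As a hedged illustration only (not the authors' actual pipeline), here is a minimal nilearn sketch; the file name, TR, and event timings are invented for the example:

```python
# Hypothetical sketch of a first-level GLM contrast (SLB > SSA) with nilearn.
# The BOLD file name, TR, and event timings below are assumptions for
# illustration, not the study's real design.
import pandas as pd
from nilearn.glm.first_level import FirstLevelModel

events = pd.DataFrame({
    "onset":      [0, 30, 60, 90],        # seconds (assumed block design)
    "duration":   [20, 20, 20, 20],
    "trial_type": ["SSA", "SLB", "SSA", "SLB"],
})

model = FirstLevelModel(t_r=2.0, hrf_model="spm")  # assumed TR of 2 s
model = model.fit("sub-01_task-egocentric_bold.nii.gz", events=events)

# Weighting SLB at +1 and SSA at -1 isolates corporeal-specific activity.
z_map = model.compute_contrast("SLB - SSA", output_type="z_score")
z_map.to_filename("slb_gt_ssa_zmap.nii.gz")
```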

2014, Vol 26 (11), pp. 2469-2478
Author(s): Michal Bernstein, Jonathan Oron, Boaz Sadeh, Galit Yovel

Faces and bodies are processed by distinct category-selective brain areas. Neuroimaging studies have so far presented isolated faces and headless bodies, so little is known about whether and where faces and headless bodies are grouped into a single object, as they appear in the real world. The current study examined whether a face presented above a body is represented as two separate images or as an integrated face–body representation in face- and body-selective brain areas by employing an fMRI competition paradigm. This paradigm has been shown to reveal a higher fMRI response to sequential than to simultaneous presentation of multiple stimuli (i.e., the competition effect), indicating competitive interactions among simultaneously presented stimuli. We therefore hypothesized that if a face above a body is integrated into an image of a person whereas a body above a face is represented as two separate objects, the competition effect would be larger for the latter than for the former. Consistent with this hypothesis, our findings reveal a competition effect when a body is presented above a face, but not when a face is presented above a body, suggesting that a body above a face is represented as two separate objects whereas a face above a body is represented as an integrated image of a person. Interestingly, this integration of a face and a body into an image of a person was found in the fusiform, but not the lateral-occipital, face and body areas. We conclude that faces and bodies are processed separately at early stages and are integrated into a unified image of a person at mid-level stages of object processing.
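To make the competition-effect logic concrete, here is a minimal Python sketch (not from the study; all response values are invented) computing the effect as the sequential-minus-simultaneous response difference:

```python
# Competition effect = response(sequential) - response(simultaneous).
# Mean % signal change per condition; numbers are hypothetical.
sequential   = {"face_above_body": 1.10, "body_above_face": 1.15}
simultaneous = {"face_above_body": 1.08, "body_above_face": 0.95}

for config in sequential:
    competition = sequential[config] - simultaneous[config]
    # A large positive value indicates competition (two separate objects);
    # a value near zero suggests an integrated representation.
    print(f"{config}: competition effect = {competition:.2f}")
```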


2010, Vol 104 (4), pp. 2075-2081
Author(s): Lars Strother, Adrian Aldcroft, Cheryl Lavell, Tutis Vilis

Functional MRI (fMRI) studies of the human object recognition system commonly identify object-selective cortical regions by comparing blood oxygen level–dependent (BOLD) responses to objects versus scrambled objects. Object selectivity distinguishes human lateral occipital cortex (LO) from earlier visual areas. Recent studies suggest that, in addition to being object selective, LO is retinotopically organized; LO represents both object and location information. Although LO responses to objects have been shown to depend on location, it is not known whether responses to scrambled objects vary similarly. This matters because comparable position dependence for scrambled objects would imply that the degree of object selectivity in LO does not vary with retinal stimulus position. We used a conventional functional localizer to identify human visual area LO by comparing BOLD responses to objects versus scrambled objects presented to either the upper (UVF) or lower (LVF) visual field. In agreement with recent findings, we found evidence of position-dependent responses to objects. However, we observed the same degree of position dependence for scrambled objects, and thus object selectivity did not differ for UVF and LVF stimuli. We conclude that, in terms of BOLD response, LO discriminates objects from non-objects equally well in either visual field location, despite stronger responses to objects in the LVF.
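As an illustration of the localizer logic, a sketch under invented response values (not the study's data): object selectivity can be computed per visual field as the object-minus-scrambled BOLD difference:

```python
# Hypothetical mean BOLD responses in LO; all numbers invented.
bold = {
    ("object",    "UVF"): 0.80, ("object",    "LVF"): 1.00,
    ("scrambled", "UVF"): 0.40, ("scrambled", "LVF"): 0.60,
}

for field in ("UVF", "LVF"):
    selectivity = bold[("object", field)] - bold[("scrambled", field)]
    print(f"{field}: object selectivity = {selectivity:.2f}")

# Equal differences across fields illustrate the paper's conclusion:
# position-dependent raw responses, but position-invariant selectivity.
```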


NeuroImage, 2009, Vol 47, pp. S63
Author(s): VG van de Ven, B Jans, M Been, R Goebel, P de Weerd

2020
Author(s): Nazia Jassim, Simon Baron-Cohen, John Suckling

Sensory sensitivities occur in up to 90% of autistic individuals. With the recent inclusion of sensory symptoms in the diagnostic criteria for autism, there is now a need to develop neural hypotheses related to autistic sensory perception. Using activation likelihood estimation (ALE), we meta-analysed 52 task-based fMRI studies investigating differences between autistic (n=891) and control (n=967) participants during non-social sensory perception. During complex perception, autistic groups showed more activity in the secondary somatosensory and occipital cortices, insula, caudate, superior temporal gyrus, and inferior parietal lobule, while control groups showed more activity in frontal and parietal regions. During basic sensory processing, autistic groups showed hyperactivity in the lateral occipital cortex, primary somatosensory and motor cortices, insula, caudate, and thalamus, while controls showed heightened activity in the precentral gyrus, middle frontal gyrus, precuneus, and anterior cingulate cortex. We conclude that autistic individuals, on average, show distinct engagement of sensory-related brain networks during sensory perception. These findings may help guide future research toward the neurobiological mechanisms underpinning the autistic experience.
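A minimal sketch of the ALE idea, for orientation only: each study's reported peak coordinates are smoothed with a 3D Gaussian into a modeled-activation (MA) map, and voxelwise ALE values combine the MA maps as a union of probabilities. The grid size, smoothing width, and foci below are invented; real analyses use dedicated tools such as GingerALE or NiMARE:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

GRID = (20, 20, 20)   # toy voxel grid (assumption)
SIGMA = 1.5           # Gaussian width in voxels (assumption)

def ma_map(foci):
    """Modeled-activation map for one study's peak coordinates."""
    vol = np.zeros(GRID)
    for x, y, z in foci:
        vol[x, y, z] = 1.0
    vol = gaussian_filter(vol, SIGMA)
    return vol / vol.max()  # rescale to a probability-like map

# Two hypothetical studies with invented foci.
studies = [[(5, 5, 5), (10, 10, 10)], [(6, 5, 5)]]
mas = [ma_map(f) for f in studies]

# ALE per voxel: probability that at least one study activates it.
ale = 1.0 - np.prod([1.0 - m for m in mas], axis=0)
print(ale.max())
```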


2007, Vol 46 (02), pp. 247-250
Author(s): H. Takahashi, N. Yahata, M. Matsuura, K. Asai, Y. Okubo, ...

Summary. Objectives: In our previous functional magnetic resonance imaging (fMRI) study, we determined that there was distinct left hemispheric dominance for lexical-semantic processing, without the influence of human voice perception, in right-handed healthy subjects. However, the degree of right-handedness in these subjects ranged from 52 to 100 on the Edinburgh Handedness Inventory (EHI). In the present study, we aimed to clarify the correlation between the degree of right-handedness and language dominance in the fronto-temporo-parietal cortices by examining cerebral activation during lexical-semantic processing. Methods: Twenty-seven right-handed healthy subjects were scanned by fMRI while listening to sentences (SEN), reversed sentences (rSEN), and identifiable non-vocal sounds (SND). Fronto-temporo-parietal activation was observed in the left hemisphere under the SEN − rSEN contrast, which isolated lexical-semantic processing from the influence of human voice perception. The laterality index was calculated as LI = (L − R)/(L + R) × 100, where L and R denote left- and right-hemisphere activation. Results: The laterality index in the fronto-temporo-parietal cortices did not correlate with the degree of right-handedness (EHI score). Conclusions: The present study indicated that degrees of right-handedness from 52 to 100 on the EHI had no effect on the degree of left hemispheric dominance for lexical-semantic processing in right-handed healthy subjects.
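The laterality index formula from the Methods is simple enough to state in code. A direct implementation (the voxel counts in the example call are invented):

```python
def laterality_index(left, right):
    """LI = (L - R) / (L + R) * 100; positive values indicate
    left-hemispheric dominance."""
    return (left - right) / (left + right) * 100

# Hypothetical activation counts for illustration only.
print(laterality_index(left=420, right=180))  # -> 40.0
```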


NeuroImage, 1998, Vol 7 (4), pp. S121
Author(s): G. Buccino, F. Binkofski, S. Posse, K.M. Stephan, H.-J. Freund, ...
