Different Patterns of Attention Modulation in Early N140 and Late P300 sERPs Following Ipsilateral vs. Contralateral Stimulation at the Fingers and Cheeks

2021 · Vol 15
Author(s): Laura Lindenbaum, Sebastian Zehe, Jan Anlauff, Thomas Hermann, Johanna Maria Kissler

Intra-hemispheric interference has often been observed when body parts with neighboring representations within the same hemisphere are stimulated. However, patterns of interference in early and late somatosensory processing stages due to the stimulation of different body parts have not been explored. Here, we explore functional similarities and differences between attention modulation of the somatosensory N140 and P300 elicited at the fingers vs. the cheeks. In an active oddball paradigm, 22 participants received vibrotactile intensity-deviant stimulation either ipsilaterally (within-hemisphere) or contralaterally (between-hemisphere) at the fingers or cheeks. The ipsilateral deviant always covered a larger area of skin than the contralateral deviant. Overall, both N140 and P300 amplitudes were higher following stimulation at the cheek, and N140 topographies differed between finger and cheek stimulation. For the N140, results showed higher deviant ERP amplitudes following contralateral than ipsilateral stimulation, regardless of the stimulated body part. N140 peak latency differed between stimulated body parts, with shorter latencies for stimulation at the fingers. Regarding P300 amplitudes, contralateral deviant stimulation at the fingers replicated the N140 pattern, showing higher responses and shorter latencies than ipsilateral stimulation at the fingers. For stimulation at the cheeks, ipsilateral deviants elicited higher P300 amplitudes and longer latencies than contralateral ones. These findings indicate that, at the fingers, ipsilateral deviant stimulation leads to intra-hemispheric interference, with significantly smaller ERP amplitudes than contralateral stimulation, at both early and late processing stages. By contrast, at the cheeks, intra-hemispheric interference is selective for early processing stages. Therefore, the mechanisms of intra-hemispheric processing differ from inter-hemispheric ones, and the pattern of intra-hemispheric interference in early and late processing stages is body-part specific.

1988 · Vol 98 (2) · pp. 116-120
Author(s): Akira Inokuchi, Thomas V. Boran, Charles P. Kimmelman, James B. Snow

The effects of electrical stimulation of the olfactory bulb and the locus ceruleus on olfactory tubercle neurons were examined in rats. Ipsilateral stimulation of the olfactory bulb produced excitation in 31% of the olfactory tubercle neurons tested and inhibition in 17%. Twenty-two percent of the olfactory tubercle neurons were excited, whereas 9% were inhibited, by ipsilateral stimulation of the locus ceruleus. Contralateral stimulation of the locus ceruleus produced similar responses in the same neurons. A negative-positive evoked potential was recorded in the olfactory tubercle after ipsilateral and contralateral stimulation of the locus ceruleus. Thirty-three percent of the olfactory tubercle neurons that responded orthodromically or antidromically to stimulation of the olfactory bulb were excited by ipsilateral stimulation of the locus ceruleus. In contrast, among the olfactory tubercle neurons that were unresponsive to stimulation of the olfactory bulb, only 10% responded with excitation to ipsilateral stimulation of the locus ceruleus. These findings suggest that olfactory tubercle neurons that receive input from or send output to the olfactory bulb are influenced by the noradrenergic system of the locus ceruleus. A possible role of the olfactory tubercle in olfactory transduction is also discussed.


1985 · Vol 53 (6) · pp. 1467-1482
Author(s): M. N. Semple, L. M. Kitzes

Monaural excitatory responses of 181 single units in the central nucleus of the inferior colliculus of 15 anesthetized gerbils (Meriones unguiculatus) were examined quantitatively. Pure-tone stimuli were presented monaurally through sealed, calibrated sound-delivery systems. Most units were excited only by contralateral stimulation (EO); 23% were bilaterally excitable (EE). The threshold frequency tuning curves for contralateral stimulation of EE units were significantly broader than those produced by ipsilateral stimulation of EE units and those produced by contralateral stimulation of EO units. The frequency at which threshold was lowest (best frequency, or BF) was very similar for ipsilateral and contralateral stimulation of individual EE units; however, ipsilateral BFs were slightly but significantly lower than contralateral BFs. For EE units, ipsilateral BF thresholds (mean: 29.2 dB SPL) were significantly higher than contralateral BF thresholds (mean: 14.9 dB SPL). Monotonic and nonmonotonic relationships between discharge rate and stimulus intensity at BF were observed in responses evoked by both contralateral and ipsilateral stimulation. Interestingly, for individual EE units it was not uncommon for the rate/intensity function for one monaural condition to be monotonic although the relationship for stimulation of the other ear was markedly nonmonotonic. There was no qualitative difference between rate/intensity functions evoked by contralateral stimulation in EO and EE units. Ipsilateral discharge rates were characteristically much lower than contralateral rates for a given stimulus intensity. For 50 BF tones of 100 ms duration, the median peak numbers of discharges for contralateral stimulation of EO and EE units were 361 and 339, respectively; the median for ipsilateral stimulation of EE units was 102. The dynamic range of each rate/intensity function was calculated by measuring the intensity range associated with an increase in spike count from 10 to 90% of the peak rate. No differences were detected between the distributions of dynamic range for contralateral stimulation in EO and EE units, or between contralateral and ipsilateral dynamic ranges within individual EE units. For all response types, the distributions of dynamic range were approximately normal, with means near 20 dB. The minimum mean latency to the first spike at BF was generally longer for ipsilateral than for contralateral responses. (ABSTRACT TRUNCATED AT 400 WORDS)
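As a back-of-the-envelope illustration of the 10-90% dynamic-range criterion described above, the sketch below computes that span for a toy rate/intensity function in Python. The curve, the function name, and all numbers are hypothetical and are not data from the study.

```python
import numpy as np

def dynamic_range_db(intensities_db, spike_counts):
    """Estimate dynamic range as the intensity span over which the spike
    count rises from 10% to 90% of its peak (the criterion described in
    the abstract). Inputs are 1-D arrays ordered by increasing intensity,
    and only the rising limb up to the peak is used, so the counts passed
    to np.interp are monotonically increasing."""
    intensities_db = np.asarray(intensities_db, dtype=float)
    counts = np.asarray(spike_counts, dtype=float)
    peak = counts.max()
    lo_target, hi_target = 0.10 * peak, 0.90 * peak
    rising = slice(0, counts.argmax() + 1)
    lo_intensity = np.interp(lo_target, counts[rising], intensities_db[rising])
    hi_intensity = np.interp(hi_target, counts[rising], intensities_db[rising])
    return hi_intensity - lo_intensity

# Hypothetical monotonic rate/intensity function: spike count vs. level (dB SPL).
intensities = np.arange(0.0, 80.0, 5.0)
counts = 350.0 / (1.0 + np.exp(-(intensities - 30.0) / 5.0))  # sigmoid-like growth

# Prints the estimated dynamic range in dB for this toy curve.
print(f"dynamic range ~ {dynamic_range_db(intensities, counts):.1f} dB")
```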


2018 · Vol 30 (12) · pp. 1858-1869
Author(s): Guannan Shen, Nathan J. Smyk, Andrew N. Meltzoff, Peter J. Marshall

The focus of the current study is on a particular aspect of tactile perception: categorical segmentation of the body surface into discrete body parts. The mismatch negativity (MMN) has been shown to be sensitive to categorical boundaries and language experience in the auditory modality. Here we recorded the somatosensory MMN (sMMN) using two tactile oddball protocols and compared sMMN amplitudes elicited by within- and across-boundary oddball pairs. Both protocols employed the identity MMN method, which controls for responsivity at each body location. In the first protocol, we investigated the categorical segmentation of tactile space at the wrist by presenting pairs of tactile oddball stimuli across equal spatial distances, either across the wrist or within the forearm. The amplitude of the sMMN elicited by stimuli presented across the wrist boundary was significantly greater than for stimuli presented within the forearm, suggesting a categorical effect at an early stage of somatosensory processing. The second protocol was designed to investigate the generality of this MMN effect and involved three digits on one hand. The amplitude of the sMMN elicited by a contrast of the third digit and the thumb was significantly larger than that elicited by a contrast between the third and fifth digits, suggesting a functional boundary effect that may derive from the way that objects are typically grasped. These findings demonstrate that the sMMN is a useful index of somatosensory spatial discrimination that can be used to study body part categories.


2012 · Vol 25 (0) · pp. 135
Author(s): Elisa Canzoneri, Elisa Magosso, Amedeo Amoresano, Andrea Serino

Multisensory representations of the body and of the space around it (i.e., peripersonal space, PPS) depend on the physical structure of the body, in that they are constructed from incoming multisensory signals from different body parts. However, little is known about how, after a sudden change in the physical structure of the body such as limb amputation, multimodal representations of the body and of the PPS adapt to the loss of a body part, and how partially restoring the function of the missing body part by means of prosthesis implantation affects these multimodal body representations. We assessed body representation in a group of upper limb amputees by means of a tactile distance perception task, measuring the implicitly perceived length of the arm, and PPS representation by means of an audio–tactile interaction task, assessing the extent of the multisensory space within which environmental stimuli interact with somatosensory processing. When patients performed the task on the amputated limb, without the prosthesis, the perceived arm length shrank, with a concurrent shift of PPS boundaries towards the stump. Wearing the prosthesis increased the perceived length of the stump and extended the boundaries of the PPS so as to include the prosthetic hand. The representations of the healthy limb were comparable to those of healthy controls. These results suggest that a modification of the physical body affects multisensory body and PPS representations on the amputated side; such representations are further shaped when prostheses are used to replace the lost body part.


Author(s): Carol Priestley

This chapter discusses body part nouns, a part of language that is central to human life, and the polysemy that arises in connection with them. Examples from everyday speech and narrative in various contexts are examined in the Papuan language Koromu, and semantic characteristics of body part nouns in other studies are also considered. Semantic templates are developed for nouns that represent highly visible body parts: for example, wapi ‘hands/arms’, ehi ‘feet/legs’, and their related parts. Culture-specific explications are expressed in a natural metalanguage that can be translated into Koromu, both to avoid the cultural bias inherent in using other languages and to reveal distinctive semantic components as well as similarities to cross-linguistic examples.


Author(s): Laura Mora, Anna Sedda, Teresa Esteban, Gianna Cocchini

The representation of the metrics of the hands is distorted, but is susceptible to malleability due to expert dexterity (magicians) and long-term tool use (baseball players). However, it remains unclear whether modulation leads to a stable representation of the hand that is adopted in every circumstance, or whether the modulation is closely linked to the spatial context where the expertise occurs. To this aim, a group of 10 experienced Sign Language (SL) interpreters was recruited to study the selective influence of expertise and space localisation on the metric representation of the hands. Experiment 1 explored differences in hand size representation between the SL interpreters and 10 age-matched controls in near-reaching (Condition 1) and far-reaching (Condition 2) space, using a localisation task. SL interpreters presented reduced hand size in the near-reaching condition, with characteristic underestimation of finger lengths and reduced overestimation of hand and wrist widths in comparison with controls. This difference was lost in far-reaching space, confirming that the effect of expertise on hand representations is closely linked to the spatial context where an action is performed. As SL interpreters are also experts in the use of their face for communication purposes, the effects of expertise on the metrics of the face were also studied (Experiment 2). SL interpreters were more accurate than controls, with an overall reduction of width overestimation. Overall, expertise modifies the representation of relevant body parts in a specific and context-dependent manner. Hence, different representations of the same body part can coexist simultaneously.


2021 · Vol 11 (7) · pp. 946
Author(s): Won-Mo Jung, In-Seon Lee, Ye-Seul Lee, Yeonhee Ryu, Hi-Joon Park, ...

Emotional perception can be shaped by inferences about bodily states. Here, we investigated whether exteroceptive inferences about bodily sensations in the chest area influence the perception of fearful faces. Twenty-two participants received pseudo-electrical acupuncture stimulation at three different acupoints: CV17 (chest), CV23 (chin), and PC6 (left forearm). All stimuli were delivered with corresponding visual cues, and the control condition included visual cues that did not match the stimulated body sites. After the stimulation, the participants were shown images with one of five morphed facial expressions, ranging from 100% fear to 100% disgust, and asked to classify them as fearful or disgusted. Brain activity was measured using functional magnetic resonance imaging during the facial expression classification task. When the participants expected that they would receive stimulation of the chest (CV17), the ratio of fearful to non-fearful classifications decreased compared to the control condition, and brain activity within the periaqueductal gray and the default mode network decreased when they viewed fearful faces. Our findings suggest that bodily sensations around the chest, but not the other tested body parts, were selectively associated with fear perception and that altering external inferences inhibited the perception of fearful faces.


2010 · Vol 16 (4) · pp. 112-121
Author(s): Brennen W. Mills, Owen B. J. Carter, Robert J. Donovan

The objective of this case study was to experimentally manipulate the impact on arousal and recall of two characteristics frequently occurring in gruesome depictions of body parts in smoking cessation advertisements: the presence or absence of an external physical insult to the body part depicted, and whether or not the image contains a clear figure/ground demarcation. Three hundred participants (46% male, 54% female; mean age 27.3 years, SD = 11.4) took part in a two-stage online study in which they viewed and responded to a series of gruesome 4-s video images. Seventy-two video clips were created to provide a sample of images across the two conditions: physical insult versus no insult, and clear figure/ground demarcation versus merged or no clear figure/ground demarcation. In stage one, participants viewed a randomly ordered series of 36 video clips and rated how “confronting” they considered each to be. Seven days later (stage two), to test recall of each video image, participants viewed all 72 clips and were asked to identify those they had seen previously. Images containing a physical insult were consistently rated as more confronting and were remembered more accurately than images with no physical insult. Images with a clear figure/ground demarcation were rated as no more confronting, but were consistently recalled with greater accuracy, than those with unclear figure/ground demarcation. Makers of gruesome health warning television advertisements should incorporate some form of physical insult and use a clear figure/ground demarcation to maximize image recall and subsequent potential advertising effectiveness.


Author(s): Toshiki Kusano, Hiroki Kurashige, Isao Nambu, Yoshiya Moriguchi, Takashi Hanakawa, ...

Several functional magnetic resonance imaging (fMRI) studies have demonstrated that resting-state brain activity consists of multiple components, each corresponding to the spatial pattern of brain activity induced by performing a task. In movement tasks in particular, such components have been shown to correspond to the brain activity pattern of the relevant anatomical region, meaning that the voxels that are cooperatively activated while a body part (e.g., foot, hand, or tongue) is used also behave cooperatively in the resting state. However, it is unclear whether the components involved in resting-state brain activity correspond to those induced by the movement of discrete body parts. To address this issue, in the present study we focused on wrist and finger movements of the hand: a cross-decoding technique trained to discriminate between the multi-voxel patterns induced by wrist and finger movement was applied to resting-state fMRI data. We found that the multi-voxel pattern in resting-state brain activity corresponds to either wrist or finger movements in the motor-related areas of each hemisphere of the cerebrum and cerebellum. These results suggest that resting-state brain activity in the motor-related areas consists of components corresponding to the elementary movements of individual body parts. Resting-state brain activity may therefore have a finer structure than previously considered.
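The abstract does not name the classifier or software used; as a minimal, assumption-laden sketch of the general cross-decoding idea (train a decoder on task-evoked multi-voxel patterns, then apply it to resting-state volumes), here is a scikit-learn example with made-up array shapes and random data standing in for the fMRI patterns.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Hypothetical task data: trials x voxels, labelled 0 = wrist, 1 = finger movement.
X_task = rng.normal(size=(120, 500))
y_task = rng.integers(0, 2, size=120)

# Hypothetical resting-state data: volumes x voxels from the same region of interest.
X_rest = rng.normal(size=(300, 500))

# Train a linear decoder on the task-evoked multi-voxel patterns...
decoder = make_pipeline(StandardScaler(), LinearSVC(dual=False))
decoder.fit(X_task, y_task)

# ...then apply it to resting-state volumes (cross-decoding): each volume is
# assigned to whichever movement pattern ("wrist-like" or "finger-like") it
# resembles more under the trained decision boundary.
rest_labels = decoder.predict(X_rest)
print(np.bincount(rest_labels, minlength=2))  # volumes resembling each pattern
```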


2020 · Vol 27 (1) · pp. 184-208
Author(s): Dorothea Hoffmann

In this paper I provide a description of the role of body-part terms in expressions of emotion and other semantic extensions in MalakMalak, a non-Pama-Nyungan language of the Daly River area. Body-based expressions denote events, emotions, personality traits, significant places and people, and are used to refer to times and number. Particularly central in the language are men ‘stomach’, pundu ‘head’, and tjewurr ‘ear’, associated respectively with basic emotions, states of mind, and reason. The figurative extensions of these body parts are discussed systematically and compared with what is known for other languages of the Daly River region. The article also explores the grammatical make-up of body-based emotional collocations, and in particular the role of noun incorporation. In MalakMalak, noun incorporation is a central part of forming predicates with body parts but is uncommon in any other semantic domain of the language, and only lexemes denoting basic emotions may also incorporate closed-class adjectives.

