Cortical Mu Rhythms During Action and Passive Music Listening

Author(s):  
Jessica M. Ross ◽  
Daniel C. Comstock ◽  
John R. Iversen ◽  
Scott Makeig ◽  
Ramesh Balasubramaniam

Brain systems supporting body movement are active during music listening in the absence of overt movement. This covert motor activity is not well understood, but some theories propose a role in auditory timing prediction facilitated by motor simulation. One question is how music-related covert motor activity relates to motor activity during overt movement. We address this question with scalp electroencephalography (EEG) by measuring mu rhythms, cortical field phenomena associated with the somatomotor system that appear over sensorimotor cortex. Lateralized mu enhancement over hand sensorimotor cortex during or just before foot movement in foot vs. hand movement paradigms is thought to reflect inhibition of hand movement during current or prospective movement of another effector. How mu behaves during music listening with movement suppressed has yet to be determined. We recorded 32-channel EEG (N = 17) during silence without movement, overt movement (foot or hand), and music listening without movement. Using an Independent Component Analysis (ICA)-based equivalent dipole source clustering technique, we identified three mu-related clusters, localized to left primary motor cortex and to right and midline premotor cortices. Right foot tapping was accompanied by mu enhancement in the left lateral source cluster, replicating previous work. Music listening was accompanied by similar mu enhancement in the left cluster, as well as in the midline cluster. We are the first to report, and to source-resolve, music-related mu modulation in the absence of overt movement. Covert music-related motor activity has been shown to play a role in beat perception (1). Our current results show enhancement of somatotopically organized mu, supporting overt motor inhibition during beat perception.
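
To make the analysis concrete, the sketch below illustrates one way such a pipeline could look in MNE-Python: an ICA decomposition of the scalp EEG followed by a mu-band (8–13 Hz) power comparison between a silent-rest and a music-listening condition. The file names, component count, and band limits are illustrative assumptions, and the equivalent-dipole clustering step reported above is not reproduced here.

```python
# Minimal sketch (not the authors' pipeline): ICA decomposition of scalp EEG
# followed by a mu-band (8-13 Hz) power comparison between two conditions.
# File names, the component count, and band limits are illustrative assumptions.
import numpy as np
import mne
from mne.preprocessing import ICA
from scipy.signal import welch

rest = mne.io.read_raw_fif("rest_raw.fif", preload=True)    # silence, no movement
music = mne.io.read_raw_fif("music_raw.fif", preload=True)  # music listening, no movement

# Decompose the band-limited resting data into independent components.
ica = ICA(n_components=20, random_state=42)
ica.fit(rest.copy().filter(1.0, 40.0))

def mu_band_power(raw, ica, fs):
    """Mean 8-13 Hz power of each independent component's time course."""
    sources = ica.get_sources(raw).get_data()        # (n_components, n_times)
    freqs, psd = welch(sources, fs=fs, nperseg=int(2 * fs))
    band = (freqs >= 8.0) & (freqs <= 13.0)
    return psd[:, band].mean(axis=1)                 # one value per component

fs = rest.info["sfreq"]
mu_rest = mu_band_power(rest, ica, fs)
mu_music = mu_band_power(music, ica, fs)

# Mu "enhancement" during listening relative to silence, per component (in dB).
print(10 * np.log10(mu_music / mu_rest))
```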

2020 ◽  
Vol 132 (5) ◽  
pp. 1358-1366
Author(s):  
Chao-Hung Kuo ◽  
Timothy M. Blakely ◽  
Jeremiah D. Wander ◽  
Devapratim Sarma ◽  
Jing Wu ◽  
...  

OBJECTIVE: The activation of the sensorimotor cortex as measured by electrocorticographic (ECoG) signals has been correlated with contralateral hand movements in humans, as precisely as the level of individual digits. However, the relationship between individual and multiple synergistic finger movements and the neural signal detected by ECoG has not been fully explored. The authors used intraoperative high-resolution micro-ECoG (µECoG) over the sensorimotor cortex to link neural signals to finger movements across several context-specific motor tasks.

METHODS: Three neurosurgical patients with cortical lesions over eloquent regions participated. During awake craniotomy, a sensorimotor cortex area of hand movement was localized by high-frequency responses measured with an 8 × 8 µECoG grid of 3-mm interelectrode spacing. Patients performed a flexion movement of the thumb or index finger, or a pinch movement of both, based on a visual cue. High-gamma (HG; 70–230 Hz) filtered µECoG was used to identify dominant electrodes associated with thumb and index movement. Hand movements were recorded by a dataglove simultaneously with the µECoG recording.

RESULTS: In all 3 patients, the electrodes associated with thumb and index finger movements were identifiable approximately 3–6 mm apart by the HG-filtered µECoG signal. For HG power of cortical activation measured with µECoG, the thumb and index signals in the pinch movement were similar to those observed during thumb-only and index-only movement, respectively (all p > 0.05). Index finger movements, measured by the dataglove joint angles, were similar in both the index-only and pinch movements (p > 0.05). However, despite similar activation across conditions, markedly decreased thumb movement was observed in pinch relative to independent thumb-only movement (all p < 0.05).

CONCLUSIONS: HG-filtered µECoG signals effectively identify dominant regions associated with thumb and index finger movement. For pinch, the µECoG signal comprises a combination of the signals from the individual thumb and index movements. However, while the relationship between the index finger joint angle and the HG-filtered signal remains consistent between conditions, there is no fixed relationship for thumb movement. Although the HG-filtered µECoG signal is similar in the thumb-only and pinch conditions, the actual thumb movement is markedly smaller in the pinch condition than in the thumb-only condition. This implies a nonlinear relationship between the cortical signal and the motor output for some, but importantly not all, movement types. This analysis provides insight into the tuning of the motor cortex toward specific types of motor behaviors.
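
As a rough illustration of the signal processing described above, the sketch below band-pass filters µECoG channels in the high-gamma range (70–230 Hz), takes the Hilbert envelope, and ranks electrodes by mean HG power in an assumed movement window. The sampling rate, filter design, movement window, and placeholder data are assumptions, not the study's actual parameters.

```python
# Sketch of high-gamma (70-230 Hz) envelope extraction from micro-ECoG channels
# and a per-electrode power ranking. The sampling rate, filter order, movement
# window, and placeholder data are illustrative assumptions, not the study's code.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

FS = 1000.0  # assumed sampling rate in Hz

def high_gamma_envelope(x, fs=FS, band=(70.0, 230.0), order=4):
    """Band-pass filter a signal and return its analytic-amplitude envelope."""
    nyq = fs / 2.0
    b, a = butter(order, [band[0] / nyq, band[1] / nyq], btype="bandpass")
    return np.abs(hilbert(filtfilt(b, a, x)))

# ecog: (n_channels, n_samples) for one trial; here random placeholder data
# standing in for a flattened 8 x 8 grid.
ecog = np.random.randn(64, int(5 * FS))
envelopes = np.vstack([high_gamma_envelope(ch) for ch in ecog])

# "Dominant" electrodes: largest mean HG power inside an assumed movement window.
movement = slice(int(1 * FS), int(3 * FS))
hg_power = envelopes[:, movement].mean(axis=1)
print("Top three electrode indices:", np.argsort(hg_power)[-3:])
```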


2021 ◽  
Author(s):  
Max van den Boom ◽  
Kai J. Miller ◽  
Nicholas M. Gregg ◽  
Gabriela Ojeda ◽  
Kendall H. Lee ◽  
...  

Abstract Electrophysiological signals in the human motor system may change in different ways after deafferentation, with some studies emphasizing reorganization while others propose retained physiology. Understanding whether motor electrophysiology is retained over longer periods of time can be invaluable for patients with paralysis (e.g., ALS or brainstem stroke) when signals from sensorimotor areas may be used for communication or control over neural prosthetic devices. In addition, maintained electrophysiology could benefit the treatment of phantom limb pain through prolonged use of these signals in a brain-computer interface (BCI).

Here, we were presented with the unique opportunity to investigate the physiology of the sensorimotor cortex in a patient with an amputated arm using electrocorticographic (ECoG) measurements. While implanted with an ECoG grid for clinical evaluation of electrical stimulation for phantom limb pain, the patient performed attempted finger movements with the contralateral (lost) hand and executed finger movements with the ipsilateral (healthy) hand.

The electrophysiology of the sensorimotor cortex contralateral to the amputated hand remained very similar to that of hand movement in healthy people, with a spatially focused increase of high-frequency band (65–175 Hz; HFB) power over the hand region and a distributed decrease in low-frequency band (15–28 Hz; LFB) power. The representation of the three different fingers (thumb, index, and little) remained intact, and HFB patterns could be decoded using support vector learning at single-trial classification accuracies of >90%, based on the first 1–3 s of the HFB response. These results demonstrate that hand representations are largely retained in the motor cortex. The intact physiological response of the amputated hand, the high distinguishability of the fingers, and the fast temporal peak are encouraging for neural prosthetic devices that target the sensorimotor cortex.
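
The sketch below shows the general shape of such a single-trial decoding analysis: a linear support vector machine with cross-validation applied to per-channel HFB power features. The feature matrix is synthetic placeholder data, and the classifier settings are assumptions rather than the authors' configuration.

```python
# Sketch of single-trial finger decoding from high-frequency band (HFB) power
# features with a linear support vector machine. The feature matrix is synthetic
# placeholder data; classifier settings are assumptions, not the authors' setup.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials_per_finger, n_channels = 30, 64

# One HFB power value per channel per trial; three classes (thumb, index, little).
X = np.vstack([rng.normal(loc=mu, size=(n_trials_per_finger, n_channels))
               for mu in (0.0, 0.5, 1.0)])
y = np.repeat([0, 1, 2], n_trials_per_finger)

clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean cross-validated accuracy: {scores.mean():.2f}")
```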


Author(s):  
Ahmed H. Aliwy ◽  
Ahmed A. Alethary

Arabic Sign Language (ArSL) is the natural language of the deaf community in Arabic countries. ArSL suffers from a lack of resources such as unified dictionaries and corpora. In this work, a dictionary from Arabic to ArSL has been constructed as part of a translation system. Arabic words are converted into the Hamburg Notation System (HamNoSys) using the eSign editor software. HamNoSys was used to encode the manual parameters (handshape, hand orientation, hand location, and hand movement), while the non-manual parameters (facial expressions, shoulder raising, mouthing gestures, head tilting, and body movement) were added using the mouth, face, and limbs components of the eSign editor software. Each sign is then converted to a Signing Gesture Markup Language (SiGML) file, and a 3D avatar interprets the SiGML script as an animated sign. The constructed dictionary contains three thousand signs; it can therefore be adopted in a translation system in which written text is transformed into sign language, and it can be used in the education of deaf people. The dictionary will be available as a free resource for researchers. Building it is hard and time-consuming work, but it is an essential step toward machine translation of full Arabic text into ArSL with 3D animations.
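
As a hedged illustration of the dictionary-lookup step of such a system, the sketch below maps Arabic words to prebuilt SiGML files and reports which words have no sign entry. The dictionary entries and file paths are invented for illustration only, and the HamNoSys/SiGML encoding itself is not reproduced.

```python
# Illustrative sketch of the dictionary-lookup step of a text-to-ArSL pipeline:
# map Arabic words to prebuilt SiGML files and report words with no sign entry.
# The dictionary contents and file paths are invented for illustration only.
from pathlib import Path

SIGN_DICTIONARY = {
    "بيت": Path("signs/house.sigml"),     # hypothetical entry: "house"
    "مدرسة": Path("signs/school.sigml"),  # hypothetical entry: "school"
}

def words_to_sigml(words):
    """Return SiGML files for known words and list the words with no entry."""
    files, missing = [], []
    for word in words:
        path = SIGN_DICTIONARY.get(word)
        if path is None:
            missing.append(word)
        else:
            files.append(path)
    return files, missing

files, missing = words_to_sigml(["بيت", "مدرسة", "كتاب"])
print("Play in order:", [str(f) for f in files])
print("No sign found for:", missing)
```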


2000 ◽  
Vol 59 (2) ◽  
pp. 102-107 ◽  
Author(s):  
Keiichiro Tsuji ◽  
Keikichi Hayashibe ◽  
Masatoshi Hara ◽  
Tetsuro Matsuzawa

This study examines the effectiveness of cues of visual depth and distance in the course of development and how this process depends on visuo-motor development. In the visual pitfall situation, i.e. a modification of Gibson's visual cliff, eight Japanese monkeys (Macaca fuscata) were observed with respect to their depth avoidance and visuo-motor activity. The tests were run once a week from the first until the sixteenth week after birth. Binocular parallax, motion parallax, and texture density rates were manipulated to examine their effectiveness as cues. It was shown that for the first two months depth perception depended exclusively on motion parallax, whereas in the third month cues of motion and texture were added. Binocular cues did not have any effect in this age range. Three items of behaviour, i.e. visual regard of depth, head movement, and body movement, were checked and measured to obtain information that could explain the development of the cue function. The three items showed different developmental curves. During the first month, visual regard closely concurred with head and body movements; then visual activity suppressed motor behaviour, and after the end of the second month the two became almost independent of each other. These analyses demonstrated that at a later stage pictorial cues produced an effect additional to the primary motion cues and that the effective cue function was based on the development of visuo-motor activity.


2013 ◽  
Vol 110 (5) ◽  
pp. 1158-1166 ◽  
Author(s):  
Mitsuaki Takemi ◽  
Yoshihisa Masakado ◽  
Meigen Liu ◽  
Junichi Ushiba

There is increasing interest in electroencephalogram (EEG)-based brain-computer interface (BCI) as a tool for rehabilitation of upper limb motor functions in hemiplegic stroke patients. This type of BCI often exploits mu and beta oscillations in EEG recorded over the sensorimotor areas, and their event-related desynchronization (ERD) following motor imagery is believed to represent increased sensorimotor cortex excitability. However, it remains unclear whether the sensorimotor cortex excitability is actually correlated with ERD. Thus we assessed the association of ERD with primary motor cortex (M1) excitability during motor imagery of right wrist movement. M1 excitability was tested by motor evoked potentials (MEPs), short-interval intracortical inhibition (SICI), and intracortical facilitation (ICF) with transcranial magnetic stimulation (TMS). Twenty healthy participants were recruited. The participants performed 7 s of rest followed by 5 s of motor imagery and received online visual feedback of the ERD magnitude of the contralateral hand M1 while performing the motor imagery task. TMS was applied to the right hand M1 when ERD exceeded predetermined thresholds during motor imagery. MEP amplitudes, SICI, and ICF were recorded from the agonist muscle of the imagined hand movement. Results showed that the large ERD during wrist motor imagery was associated with significantly increased MEP amplitudes and reduced SICI but no significant changes in ICF. Thus ERD magnitude during wrist motor imagery represents M1 excitability. This study provides electrophysiological evidence that a motor imagery task involving ERD may induce changes in corticospinal excitability similar to changes accompanying actual movements.
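
For readers unfamiliar with the ERD measure, the sketch below computes ERD as the percentage drop in band power from a rest baseline to a motor imagery segment and checks it against a threshold of the kind that could gate feedback or TMS delivery. The sampling rate, band limits, threshold, and placeholder signals are assumptions, not the study's parameters.

```python
# Sketch of event-related desynchronization (ERD) computed as the percentage
# drop in mu/beta-band power from a rest baseline to a motor imagery segment,
# with a simple threshold check of the kind that could gate feedback or TMS.
# Sampling rate, band limits, threshold, and signals are illustrative assumptions.
import numpy as np
from scipy.signal import welch

FS = 256.0            # assumed sampling rate (Hz)
ERD_THRESHOLD = 30.0  # assumed threshold (percent)

def band_power(x, fs=FS, band=(8.0, 13.0)):
    """Mean power of x within the given frequency band."""
    freqs, psd = welch(x, fs=fs, nperseg=int(fs))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def erd_percent(rest_segment, imagery_segment):
    """ERD as the percentage power decrease from rest to motor imagery."""
    p_rest = band_power(rest_segment)
    return 100.0 * (p_rest - band_power(imagery_segment)) / p_rest

rest = np.random.randn(int(7 * FS))      # 7 s rest period, placeholder signal
imagery = np.random.randn(int(5 * FS))   # 5 s motor imagery, placeholder signal
erd = erd_percent(rest, imagery)
if erd > ERD_THRESHOLD:
    print(f"ERD {erd:.1f}% exceeds threshold: deliver feedback / trigger TMS")
```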


1998 ◽  
Vol 86 (1) ◽  
pp. 243-249 ◽  
Author(s):  
Asghar Dadkhah

Dohsa-hou, a Japanese psychorehabilitation method for motor training, was introduced to 10 subjects with cerebral palsy in a pre-post (6-wk.) design. Four expert raters were asked to judge the improvement (range of motion, ease and smoothness of movement, and correctness of posture) of these subjects. Findings suggest that the training method had a significant effect on body movement as compared to body posture. Since the effect may be peculiar to this subject group, further studies are suggested.


1974 ◽  
Vol 39 (1) ◽  
pp. 279-293 ◽  
Author(s):  
Thomas Blass ◽  
Norbert Freedman ◽  
Irving Steingart

The purpose of the study was to examine the prevalence of object- and body-focused hand movements in congenitally blind individuals engaged in an encoding task and to determine the relation of these movements to verbal performance. Ten Ss participated in a 5-min. videotaped monologue. The video portion was coded for hand movements using Freedman's categories of analysis. The audio portion was scored for grammatical complexity according to a system developed by Steingart and Freedman. It was found that: (1) Blind Ss engaged only in body-focused movements; object-focused movements were almost completely absent. (2) Blind Ss displayed significantly greater amounts of body-focused (primarily finger-to-hand) movements than a group of sighted Ss observed in a previous study. (3) There was a correlation of .51 between finger-to-hand movements and verbal fluency and a correlation of –.53 between body touching and verbal fluency. (4) Ss with a prevalence of finger-to-hand movements showed significantly greater language skill at encoding complex sentences that portray descriptions of patterned interrelationships among experiences, while Ss with a predominance of continuous body touching gave a less skillful language product in this regard. The findings indicate the central role of motor activity in ongoing thought construction. They also indicate that for the blind, finger-to-hand motions contribute to the evocation of sensory experiences as a necessary precondition for linguistic representation.


2021 ◽  
Vol 2021 ◽  
pp. 1-10
Author(s):  
Maria L. Rangel ◽  
Lidiane Souza ◽  
Erika C. Rodrigues ◽  
José M. Oliveira ◽  
Michelle F. Miranda ◽  
...  

Predicting upcoming sensorimotor events means creating forward estimates of the body and the surrounding world. This ability is a fundamental aspect of skilled motor behavior and requires an accurate and constantly updated representation of the body and the environment. To test whether these prediction mechanisms can be affected by a peripheral injury, we employed an action observation and electroencephalography (EEG) paradigm to assess the occurrence of prediction markers in anticipation of observed sensorimotor events in healthy participants and participants with brachial plexus injury (BPI). Nine healthy subjects and six BPI patients watched a series of video clips showing an actor's hand and a colored ball from an egocentric perspective. The color of the ball indicated whether the hand would grasp it (hand movement), the ball would roll toward the hand and touch it (ball movement), or no event would occur (no movement). In healthy participants, we expected to find distinct electroencephalographic activation patterns (EEG signatures) specific to the prediction of each of these situations. Cluster analysis of EEG signals recorded from electrodes placed over the sensorimotor cortex of control participants showed that predicting either an upcoming hand movement or the occurrence of a tactile event yielded specific neural signatures. In BPI participants, the EEG signals from the sensorimotor cortex contralateral to the dominant hand in the hand movement condition differed from those in the other conditions. Furthermore, there were no differences between the ball movement and no movement conditions in the sensorimotor cortex contralateral to the dominant hand, suggesting that BPI specifically blurred the ability to predict upcoming tactile events for the dominant hand. These results highlight the role of the sensorimotor cortex in creating estimates of both actions and tactile interactions in the space around the body and suggest plastic effects on predictive coding following peripheral sensorimotor loss.
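
The sketch below illustrates, in very general terms, a cluster analysis of single-trial EEG features of the kind mentioned above: k-means applied to per-epoch feature vectors, followed by a cross-tabulation of cluster membership against condition. The features and data are synthetic placeholders and do not reproduce the study's analysis.

```python
# Sketch of a cluster analysis over single-trial EEG feature vectors: k-means
# on per-epoch features, then a cross-tabulation of cluster membership against
# the three anticipation conditions. Features and data are synthetic placeholders
# and do not reproduce the study's analysis.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n_epochs_per_condition, n_features = 40, 16   # e.g. channels x bands (assumed)

X = np.vstack([rng.normal(loc=mu, size=(n_epochs_per_condition, n_features))
               for mu in (0.0, 0.6, 1.2)])    # hand / ball / no movement
conditions = np.repeat(["hand", "ball", "none"], n_epochs_per_condition)

labels = KMeans(n_clusters=3, n_init=10, random_state=1).fit_predict(
    StandardScaler().fit_transform(X))

# How each condition distributes over the three clusters.
for cond in ("hand", "ball", "none"):
    print(cond, np.bincount(labels[conditions == cond], minlength=3))
```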


2014 ◽  
Vol 12 (1) ◽  
pp. 45-57 ◽  
Author(s):  
Joanna Piskorz ◽  
Marcin Czub ◽  
Katarzyna Urbańska ◽  
Małgorzata Mrula ◽  
Paweł Hodowaniec ◽  
...  

Abstract This study investigates the effectiveness of virtual reality (VR) technology in distracting attention from pain. We tested how body engagement related to navigating the virtual environment (VE) influences the intensity of pain. Two different interfaces were used to play the same VE, and a cold pressor test was used for pain stimulation. A mixed design was used for the experiment. Sixty-six undergraduate students participated. One group navigated the game using a rotation sensor, head tracker, and foot pedals (Body Movement Interface). The other group navigated using only their hands (Hand Movement Interface). Objective and subjective measures of pain were collected: the amount of time participants kept their hand in a container of cold water, and participants' ratings of pain intensity on a visual analog scale (VAS). Participants also filled in questionnaires designed to measure feelings of presence in the VE and emotional attitudes toward the game. We found no significant difference in analgesic efficacy between the two interfaces. In both groups, participants showed significantly higher levels of pain endurance with VR distraction than without it.

