The Effect of Meditation on Visual and Auditory Sustained Attention

2021 ◽  
Author(s):  
◽  
Paige Badart

Failures of attention can be hazardous, especially in the workplace, where sustaining attention has become an increasingly important skill. This has created a need for methods to improve attention. One such method is the practice of meditation. Previous research has shown that meditation can produce beneficial changes in attention and its associated brain regions. In particular, sustained attention has been shown to improve significantly with meditation. While this effect has been demonstrated in the visual modality, there is less research on the effects of meditation on auditory sustained attention. Furthermore, no research to date has examined the effect of meditation on crossmodal sustained attention. This is relevant not only because visual and auditory stimuli are perceived simultaneously in everyday life, but also because it may inform the debate over whether sustained attention is managed by modality-specific systems or by a single overarching supramodal system. The current research examined the effects of meditation on visual, auditory, and audiovisual crossmodal sustained attention using variants of the Sustained Attention to Response Task. In these tasks, subjects were presented with visual, auditory, or combined visual and auditory stimuli and were required to respond to infrequent targets over an extended period of time. For all of the tasks, meditators differed significantly in accuracy from non-meditating control groups: they made fewer errors without sacrificing response speed, with the exception of the Auditory-target crossmodal task. This demonstrates the benefit of meditation for improving sustained attention across sensory modalities and lends support to the argument that sustained attention is governed by a supramodal system rather than by modality-specific systems.


2020 ◽  
Vol 117 (13) ◽  
pp. 7437-7446 ◽  
Author(s):  
Gaëtan Sanchez ◽  
Thomas Hartmann ◽  
Marco Fuscà ◽  
Gianpaolo Demarchi ◽  
Nathan Weisz

An increasing number of studies highlight common brain regions and processes in mediating conscious sensory experience. While most studies have been performed in the visual modality, it is implicitly assumed that similar processes are involved in other sensory modalities. However, the existence of supramodal neural processes related to conscious perception has not been convincingly shown so far. Here, we aim to directly address this issue by investigating whether neural correlates of conscious perception in one modality can predict conscious perception in a different modality. In two separate experiments, we presented participants with successive blocks of near-threshold tasks involving subjective reports of tactile, visual, or auditory stimuli during the same magnetoencephalography (MEG) acquisition. Using decoding analysis in the poststimulus period between sensory modalities, our first experiment uncovered supramodal spatiotemporal neural activity patterns predicting conscious perception of the feeble stimulation. Strikingly, these supramodal patterns included activity in primary sensory regions not directly relevant to the task (e.g., neural activity in visual cortex predicting conscious perception of auditory near-threshold stimulation). We carefully replicate our results in a control experiment, which furthermore shows that the relevant patterns are independent of the type of report (i.e., whether conscious perception was reported by pressing or withholding a button press). Using standard paradigms for probing neural correlates of conscious perception, our findings reveal a common signature of conscious access across sensory modalities and illustrate the temporally late and widespread broadcasting of neural representations, even into task-unrelated primary sensory processing regions.


2012 ◽  
Vol 25 (0) ◽  
pp. 17
Author(s):  
Magdalena Chechlacz ◽  
Anna Terry ◽  
Pia Rotshtein ◽  
Wai-Ling Bickerton ◽  
Glyn Humphreys

Extinction is diagnosed when patients respond to a single contralesional item but fail to detect this item when an ipsilesional item is present concurrently. It is considered to be a disorder of attention characterized by a striking bias for the ipsilesional stimulus at the expense of the contralesional stimulus. Extinction has been studied mainly in the visual modality, but it also occurs in other sensory modalities (touch, audition) and hence can be considered a multisensory phenomenon. The functional and neuroanatomical relations between extinction in different modalities are poorly understood. It could be hypothesised that extinction deficits in different modalities emerge after damage to both common (attention-specific) and distinct (modality-specific) brain regions. Here, we used voxel-based morphometry to examine the neuronal substrates of visual versus tactile extinction in a large group of stroke patients. We found that extinction deficits in the two modalities were significantly correlated. Lesions to the inferior parietal lobule and middle frontal gyrus were linked to visual extinction, while lesions involving the superior temporal gyrus were associated with tactile extinction. Damage within the middle temporal gyrus was linked to both types of deficits, but interestingly these lesions extended into the middle occipital gyrus in patients with visual but not tactile extinction. White matter damage within the temporal lobe was associated with both types of deficits, including lesions within long association pathways involved in spatial attention. Our findings indicate both common and distinct neural mechanisms of visual and tactile extinction.


2017 ◽  
Author(s):  
Gaëtan Sanchez ◽  
Thomas Hartmann ◽  
Marco Fuscà ◽  
Gianpaolo Demarchi ◽  
Nathan Weisz

Abstract. An increasing number of studies highlight common brain regions and processes in mediating conscious sensory experience. While most studies have been performed in the visual modality, it is implicitly assumed that similar processes are involved in other sensory modalities. However, the existence of supramodal neural processes related to conscious perception has not been convincingly shown so far. Here, we aim to directly address this issue by investigating whether neural correlates of conscious perception in one modality can predict conscious perception in a different modality. In two separate experiments, we presented participants with successive blocks of near-threshold tasks involving tactile, visual, or auditory stimuli during the same magnetoencephalography (MEG) acquisition. Using decoding analysis in the post-stimulus period between sensory modalities, our first experiment uncovered supramodal spatio-temporal neural activity patterns predicting conscious perception of the feeble stimulation. Strikingly, these supramodal patterns included activity in primary sensory regions not directly relevant to the task (e.g. neural activity in visual cortex predicting conscious perception of auditory near-threshold stimulation). We carefully replicate our results in a control experiment, which furthermore shows that the relevant patterns are independent of the type of report (i.e. whether conscious perception was reported by pressing or withholding a button-press). Using standard paradigms for probing neural correlates of conscious perception, our findings reveal a common signature of conscious access across sensory modalities and illustrate the temporally late and widespread broadcasting of neural representations, even into task-unrelated primary sensory processing regions.


Author(s):  
Anna Conci ◽  
Merim Bilalić ◽  
Robert Gaschler

Abstract. Previous research on inattentional blindness (IB) has focused almost entirely on the visual modality. This study extends the paradigm by pairing visual with auditory stimuli. New visual and auditory stimuli were created to investigate the phenomenon of inattention in the visual, auditory, and paired modalities. The goal of the study was to assess to what extent pairing the visual and auditory modalities fosters the detection of change. Participants watched a video sequence and counted predetermined words in a spoken text. IB and inattentional deafness occurred in about 40% of participants when attention was engaged by this difficult (auditory) counting task. Most importantly, participants detected the changes considerably more often (88%) when the change occurred in both modalities rather than just one. One possible reason for the drastic reduction of IB or deafness in a multimodal context is that the discrepancy between the expected and encountered course of events increases proportionally with the number of sensory modalities involved.


2016 ◽  
Vol 2016 ◽  
pp. 1-10 ◽  
Author(s):  
Jodie Naim-Feil ◽  
John L. Bradshaw ◽  
Dianne M. Sheppard ◽  
Oded Rosenberg ◽  
Yechiel Levkovitz ◽  
...  

While Major Depressive Disorder (MDD) is primarily characterized by mood disturbances, impaired attentional control is increasingly identified as a critical feature of depression. Deep transcranial magnetic stimulation (deepTMS), a noninvasive neuromodulatory technique, can modulate neural activity and induce neuroplasticity changes in brain regions recruited by attentional processes. This study examined whether acute and long-term high-frequency repetitive deepTMS to the dorsolateral prefrontal cortex (DLPFC) can attenuate attentional deficits associated with MDD. Twenty-one MDD patients and 26 matched control subjects (CS) were administered the Beck Depression Inventory and the Sustained Attention to Response Task (SART) at baseline. MDD patients were readministered the SART and depressive assessments following a single session (n=21) and after 4 weeks (n=13) of high-frequency (20 Hz) repetitive deepTMS applied to the DLPFC. To control for the practice effect, CS (n=26) were readministered the SART a further two times. The MDD group exhibited deficits in sustained attention and cognitive inhibition. Both acute and long-term high-frequency repetitive frontal deepTMS ameliorated sustained attention deficits in the MDD group. Improvement after acute deepTMS was related to attentional recovery after long-term deepTMS. Longer-term improvement in sustained attention was not related to antidepressant effects of deepTMS treatment.


2010 ◽  
Vol 24 (1) ◽  
pp. 1-6 ◽  
Author(s):  
Oscar H. Hernández ◽  
Muriel Vogel-Sprott

A missing stimulus task requires an immediate response to the omission of a regular recurrent stimulus. The task evokes a subclass of event-related potential known as the omitted stimulus potential (OSP), which reflects cognitive processes such as expectancy. The behavioral response to a missing stimulus is referred to as omitted stimulus reaction time (RT). This total RT measure is known to include cognitive and motor components. The cognitive component (premotor RT) is measured as the time from the missing stimulus until the onset of motor action. The motor RT component is measured as the time from the onset of muscle action until the completion of the response. Previous research showed that RT is faster to auditory than to visual stimuli, and that the premotor RT to a missing auditory stimulus is correlated with the duration of an OSP. Although this observation suggests that similar cognitive processes might underlie these two measures, no research has tested this possibility. If similar cognitive processes are involved in the premotor RT and OSP duration, these two measures should be correlated in the visual and somatosensory modalities, and the premotor RT to missing auditory stimuli should be fastest. This hypothesis was tested in 17 young male volunteers who performed a missing stimulus task in which trains of auditory, visual, and somatosensory stimuli were presented while the OSP and RT measures were recorded. The results showed that premotor RT and OSP duration were consistently related, and that both measures were shorter for auditory stimuli than for visual or somatosensory stimuli. This provides the first evidence that the premotor RT is related to an attribute of the OSP in all three sensory modalities.


Author(s):  
William S. Helton ◽  
Nicole Lopez ◽  
Sarah Tamminga

Author(s):  
Aaron Crowson ◽  
Zachary H. Pugh ◽  
Michael Wilkinson ◽  
Christopher B. Mayhorn

The development of head-mounted display virtual reality systems (e.g., Oculus Rift, HTC Vive) has resulted in an increasing need to represent the physical world while immersed in the virtual. Current research has focused on representing static objects in the physical room, but there has been little research into notifying VR users of changes in the environment. This study investigates how different sensory modalities affect the noticeability and comprehension of notifications designed to alert head-mounted display users when a person enters their area of use. In addition, this study investigates how the use of an orientation-type notification aids the perception of alerts that manifest outside a virtual reality user's visual field. Results of a survey indicated that participants perceived the auditory modality as more effective regardless of notification type. An experiment corroborated these findings for the person notifications; however, the visual modality was in practice more effective for orientation notifications.


2011 ◽  
Vol 259 (6) ◽  
pp. 1191-1198 ◽  
Author(s):  
E. P. Hart ◽  
E. M. Dumas ◽  
R. H. A. M. Reijntjes ◽  
K. Hiele ◽  
S. J. A. Bogaard ◽  
...  
