auditory stimuli
Recently Published Documents


TOTAL DOCUMENTS: 1359 (FIVE YEARS: 245)
H-INDEX: 74 (FIVE YEARS: 6)

2022 ◽  
Author(s):  
Long Li ◽  
Yanlong Zhang ◽  
Liming Fan ◽  
Jie Zhao ◽  
Jing Guo ◽  
...  

Abstract
Background: Auditory feedback is one of the most important forms of feedback in cognitive processing and plays an important guiding role in cognitive-motor processes. However, previous studies of auditory stimuli have focused mainly on their cognitive effects on the cortex, and the role of auditory feedback stimuli in motor imagery tasks remains unclear.
Methods: Eighteen healthy subjects were recruited to complete motor imagery tasks cued by meaningful words and by meaningless words. To explore the role of auditory stimuli in motor imagery, we examined the EEG power spectrum, the frontal-parietal mismatch negativity (MMN), and inter-trial phase-locking consistency (ITPC). One-way analysis of variance (ANOVA) with least significant difference (LSD) correction was used to test differences between the two experimental conditions and across frequency bands within each condition.
Results: Power spectrum analysis showed that activity over the contralateral motor cortex increased significantly under meaningful-word stimulation, and the MMN amplitude also increased significantly. ITPC was concentrated mainly in the μ, α, and γ bands during motor imagery guided by meaningful-word stimuli, and mainly in the β band under meaningless-word stimulation.
Conclusions: These results may reflect the influence of auditory cognitive processing on motor imagery. We speculate that the effect of auditory stimulation on inter-trial phase-locking consistency involves a more complex mechanism: when the stimulus sound carries a meaning corresponding to the motor action, the parietal motor cortex may be more strongly influenced by the prefrontal cognitive cortex, changing its normal response pattern. This change appears to arise from the joint action of motor imagery, cognition, and auditory stimulation. The study offers new insight into the neural mechanism of motor imagery guided by auditory stimuli and provides additional information on brain-network activity during motor imagery driven by cognitive auditory feedback.
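As context for the analyses named above, a minimal sketch of how inter-trial phase-locking consistency can be computed from epoched EEG is given below, using the MNE-Python library; the epochs file name, electrode picks, and frequency range are illustrative assumptions rather than details from this study.

```python
# Minimal sketch: inter-trial phase-locking consistency (ITPC) from epoched EEG
# with MNE-Python. The epochs file, channel picks, and frequency range are
# illustrative assumptions, not parameters taken from the study.
import numpy as np
import mne
from mne.time_frequency import tfr_morlet

epochs = mne.read_epochs("motor_imagery-epo.fif")   # hypothetical epochs file
freqs = np.arange(4.0, 40.0, 1.0)                   # theta through gamma (Hz)
n_cycles = freqs / 2.0                              # wavelet cycles per frequency

# tfr_morlet returns average power and inter-trial coherence (ITC, i.e. ITPC)
power, itc = tfr_morlet(epochs, freqs=freqs, n_cycles=n_cycles,
                        use_fft=True, return_itc=True, decim=2)

# Example summary: mean ITPC in the mu band (8-13 Hz) over central electrodes
mu_mask = (freqs >= 8) & (freqs <= 13)
ch_idx = [itc.ch_names.index(ch) for ch in ("C3", "C4")]
mu_itpc = itc.data[ch_idx][:, mu_mask, :].mean()
print(f"Mean mu-band ITPC over C3/C4: {mu_itpc:.3f}")
```

Per-subject, per-band summaries of this kind could then be entered into a one-way ANOVA (e.g., scipy.stats.f_oneway) to compare conditions, as the abstract describes.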


2022 ◽  
Vol 416 ◽  
pp. 113534
Author(s):  
K. Stenstrom ◽  
H.U. Voss ◽  
K. Tokarev ◽  
M.L. Phan ◽  
M.E. Hauber

2021 ◽  
Vol 15 ◽  
Author(s):  
Girija Kadlaskar ◽  
Sophia Bergmann ◽  
Rebecca McNally Keehn ◽  
Amanda Seidl ◽  
Brandon Keehn

Behavioral differences in responding to tactile and auditory stimuli are widely reported in individuals with autism spectrum disorder (ASD). However, the neural mechanisms underlying distinct tactile and auditory reactivity patterns in ASD remain unclear, with theories implicating differences in both perceptual and attentional processes. The current study sought to investigate (1) the neural indices of early perceptual and later attentional factors underlying tactile and auditory processing in children with and without ASD, and (2) the relationship between neural indices of tactile and auditory processing and ASD symptomatology. Participants included 14 6- to 12-year-old children with ASD and 14 age- and nonverbal-IQ-matched typically developing (TD) children. Children participated in an event-related potential (ERP) oddball paradigm during which they watched a silent video while being presented with tactile and auditory stimuli (i.e., 80% standard speech sound /a/; 10% oddball speech sound /i/; 10% novel vibrotactile stimuli on the fingertip paired with the standard speech sound /a/). Children's early and later ERP responses to tactile (P1 and N2) and auditory stimuli (P1, P3a, and P3b) were examined. Non-parametric analyses showed that children with ASD displayed differences in early perceptual processing of auditory stimuli (i.e., lower amplitudes at the central region of interest), but not tactile stimuli. Analysis of later attentional components showed no group differences in response to tactile or auditory stimuli. Together, these results suggest that differences in auditory responsivity patterns could be related to perceptual factors in children with ASD. However, despite differences in caregiver-reported sensory measures, children with ASD did not differ from TD children in their neural reactivity to infrequent touch-speech stimuli. Nevertheless, correlational analyses confirmed that inter-individual differences in neural responsivity to tactile and auditory stimuli were related to social skills in all children. Finally, we discuss how the paradigm and stimulus type used in the current study may have impacted our results. These findings have implications for everyday life, where individual differences in responding to tactile and auditory stimuli may impact social functioning.
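To illustrate how an early ERP component such as the auditory P1 might be quantified and compared non-parametrically between groups, a sketch is shown below; the time window, data arrays, and group sizes are placeholder assumptions, not values from the study.

```python
# Sketch: mean ERP amplitude within a component latency window, compared across
# groups with a non-parametric test. All data and windows here are illustrative.
import numpy as np
from scipy.stats import mannwhitneyu

def mean_amplitude(erp, times, window):
    """Mean amplitude of an ERP trace within a latency window (in seconds)."""
    mask = (times >= window[0]) & (times <= window[1])
    return erp[mask].mean()

# Shared time axis and hypothetical per-subject averages at a central electrode
times = np.linspace(-0.1, 0.8, 901)
asd_erps = np.random.randn(14, times.size)   # placeholder ASD data
td_erps = np.random.randn(14, times.size)    # placeholder TD data

p1_window = (0.05, 0.10)                     # assumed P1 latency window (s)
asd_p1 = np.array([mean_amplitude(e, times, p1_window) for e in asd_erps])
td_p1 = np.array([mean_amplitude(e, times, p1_window) for e in td_erps])

# Non-parametric between-group comparison (Mann-Whitney U)
stat, p = mannwhitneyu(asd_p1, td_p1, alternative="two-sided")
print(f"P1 amplitude ASD vs TD: U = {stat:.1f}, p = {p:.3f}")
```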


2021 ◽  
Author(s):  
Gregory Shay

In well-documented studies, walking and music have independently shown substantial medical, health, productivity, and other human benefits. When music is combined with walking, and especially when the walking is done in synchrony with the beat, the music can stimulate faster walking without apparent awareness, the "velocity effect". Some studies have reported that music that is familiar, more enjoyable, and/or has higher "groove" tends to be more stimulating, and that some music can actually be sedating, resulting in a slower speed than walking to a metronome at the same cadence. Research illuminating the velocity effect has mostly been conducted over relatively short stepping distances in a laboratory or similar outdoor setting. The current study examines walking on a real-world, long-distance outdoor track with a single genre of music that was at least somewhat familiar and somewhat enjoyable to the test subject. In this study, the test subject stepped in self-instructed synchrony, with confirmed high accuracy, to two types of auditory stimuli: either the beat of a metronome (a presumed neutral source, or what might be considered the most rudimentary form of music) or the beat of a broad spectrum of country music, continuously over a 2-mile course. Nine metronome tempos and twenty-one country-music tempos were examined in a walkable range of 90 to 130 beats per minute (BPM), and the effects of the music and metronome on walking performance were examined and quantified. Overall, the mix of country music was significantly more energizing than the metronome, providing a relatively consistent 10% increase in step length and a resulting 10% increase in speed over the entire tempo/cadence range. Speed as a function of tempo was essentially linear in this range for both auditory stimuli, with an apparent increase in speed relative to the trendlines near 120 BPM, a preferred human response frequency reported in some prior investigations.
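Because walking speed is simply step length times cadence, a consistent 10% increase in step length at a matched cadence yields a 10% increase in speed; the sketch below illustrates that arithmetic and a linear speed-versus-tempo trendline fit, with all numeric values invented for illustration.

```python
# Sketch: speed = step length x cadence, plus a linear speed-vs-tempo trendline.
# All numeric values here are illustrative, not data from the study.
import numpy as np

tempos_bpm = np.arange(90, 131, 5)               # walkable tempo range (steps/min)
step_len_metronome_m = 0.70                      # assumed step length to a metronome
step_len_music_m = step_len_metronome_m * 1.10   # ~10% longer steps with music

def speed_m_per_s(step_length_m, cadence_bpm):
    """Walking speed from step length and cadence (steps per minute)."""
    return step_length_m * cadence_bpm / 60.0

speed_metronome = speed_m_per_s(step_len_metronome_m, tempos_bpm)
speed_music = speed_m_per_s(step_len_music_m, tempos_bpm)

# With cadence locked to the beat, a 10% longer step gives a 10% faster walk
print(speed_music / speed_metronome)             # -> array of 1.10

# Linear trendline of speed vs tempo (slope in m/s per BPM, plus intercept)
slope, intercept = np.polyfit(tempos_bpm, speed_music, 1)
print(f"speed ~= {slope:.4f} * tempo + {intercept:.3f}")
```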


2021 ◽  
pp. 1-15
Author(s):  
Silvia Ceccacci

Driver behaviour recognition is of paramount importance for in-car automation assistance. It is widely recognized that not only attentional states but also emotional ones have an impact on the safety of driving behaviour. This research work proposes an emotion-aware in-car architecture in which the driver's emotional state can be related to the vehicle dynamics, investigating the correlations between negative emotional states and driving performance and suggesting a system that regulates the driver's engagement through a unique user experience (e.g., music or LED lighting) in the car cabin. The relationship between altered emotional states induced through auditory stimuli and vehicle dynamics is investigated in a driving simulator. The results confirm that both types of information are needed to improve the robustness of the driver-state recognition function, and suggest that auditory stimuli can, to some extent, modify driving performance.
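A minimal sketch of the kind of correlation analysis described here, relating a negative-emotion score to a driving-performance metric across simulator sessions, is shown below; the variable names and data are invented for illustration, not taken from this work.

```python
# Sketch: correlating a negative-emotion score with a driving-performance metric
# across simulator sessions. Variable names and data are illustrative assumptions.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
negative_affect = rng.uniform(0, 1, size=30)            # hypothetical per-session emotion score
lane_deviation_m = 0.2 + 0.5 * negative_affect + rng.normal(0, 0.05, 30)  # simulated lane-keeping error

r, p = pearsonr(negative_affect, lane_deviation_m)
print(f"Emotion vs lane deviation: r = {r:.2f}, p = {p:.4f}")
```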


2021 ◽  
Vol 3 (2) ◽  
pp. 95-102
Author(s):  
Ediwarman Ediwarman ◽  
Syafrizal Syafrizal ◽  
John Pahamzah

This paper examined speech perception using audio-visual stimuli and sine-wave replicas for students of Sultan Ageng Tirtayasa University. The research addresses face-to-face conversation, in which speech is perceived by both the ears and the eyes. The prerequisites for audio-visual perception of speech are studied in detail, using perceptually ambiguous sine-wave replicas of natural speech as auditory stimuli. When the subjects were unaware that the auditory stimuli were speech, they showed only negligible integration of the auditory and visual stimuli. Once the same subjects learned to perceive the same auditory stimuli as speech, they integrated the auditory and visual stimuli in the same way as for natural speech. These results suggest a special mode of perception for multisensory speech.
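Sine-wave replicas of speech are commonly built by replacing the lowest formants with time-varying sinusoids; a toy synthesis sketch is given below, with the formant tracks invented for illustration (a real replica would use formant tracks measured from a recorded utterance).

```python
# Toy sketch: synthesising a sine-wave "replica" from hypothetical formant tracks.
# Real sine-wave speech uses formant frequencies tracked from a recording.
import numpy as np

fs = 16000                                  # sample rate (Hz)
dur = 0.5                                   # duration (s)
n = int(fs * dur)

# Invented time-varying tracks for the first three formants (Hz)
f1 = np.linspace(700, 300, n)
f2 = np.linspace(1200, 2200, n)
f3 = np.full(n, 2500.0)

def tone_from_track(freq_track, fs):
    """Sinusoid whose instantaneous frequency follows the given track."""
    phase = 2 * np.pi * np.cumsum(freq_track) / fs
    return np.sin(phase)

replica = sum(tone_from_track(f, fs) for f in (f1, f2, f3)) / 3.0
# `replica` could be written to a WAV file (e.g. with scipy.io.wavfile.write) and
# presented as an ambiguous, non-speech-like auditory stimulus.
```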


2021 ◽  
Author(s):  
Intan K. Wardhani ◽  
Britt Hendrik Janssen ◽  
C. Nico Boehler

The present study investigates the effect of background luminance on self-reported valence ratings of auditory stimuli, as suggested by some earlier work. A secondary aim was to better characterise the effect of auditory valence on pupillary responses, on which the literature is inconsistent. Participants were randomly presented with sounds of different valence categories (negative, neutral, and positive) obtained from the IADS-E database. At the same time, the background luminance of the computer screen (in a blue hue) was manipulated across three levels (low, medium, and high), with pupillometry confirming the expected strong effect of luminance on pupil size. Participants were asked to rate the valence of the presented sound under these different luminance levels. On a behavioural level, we found trend-level evidence for a small effect of background luminance on self-reported valence ratings, with generally more positive ratings as background luminance increased. Turning to valence effects on pupil size, irrespective of background luminance, we observed that pupils were smallest in the positive-valence condition and largest in the negative-valence condition, with neutral sounds in between. In sum, the present findings provide some evidence concerning the relationship between luminance perception (and hence pupil size) and the self-reported valence of auditory stimuli, indicating a possible cross-modal interaction of auditory valence processing with completely task-irrelevant visual background luminance. The present experiment furthermore contributes new data on the relationship between valence and pupil size for auditory stimuli.
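As an illustration of how pupil traces are typically baseline-corrected and summarised by valence condition, a sketch with an assumed sample rate, baseline window, and placeholder data is given below; none of these values are taken from the study.

```python
# Sketch: baseline-correcting pupil traces and summarising them per valence
# condition. Sample rate, baseline window, and data arrays are assumptions.
import numpy as np

fs = 60                                       # assumed eye-tracker sample rate (Hz)
baseline_s = 0.5                              # assumed pre-stimulus baseline window (s)
n_baseline = int(baseline_s * fs)

def baseline_correct(trace, n_baseline):
    """Subtract the mean of the pre-stimulus baseline from a pupil trace."""
    return trace - trace[:n_baseline].mean()

# Hypothetical trials: condition -> (n_trials x n_samples) pupil-diameter arrays
rng = np.random.default_rng(1)
trials = {cond: rng.normal(3.5, 0.2, size=(40, 4 * fs))
          for cond in ("negative", "neutral", "positive")}

for cond, data in trials.items():
    corrected = np.array([baseline_correct(tr, n_baseline) for tr in data])
    mean_change = corrected[:, n_baseline:].mean()
    print(f"{cond}: mean baseline-corrected pupil size = {mean_change:.3f}")
```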


Author(s):  
Elise Demeter ◽  
Brittany Glassberg ◽  
Marissa L. Gamble ◽  
Marty G. Woldorff

2021 ◽  
Author(s):  
Paige Badart

Failures of attention can be hazardous, especially within the workplace, where sustaining attention has become an increasingly important skill. This has created a need for methods of improving attention. One such method is the practice of meditation. Previous research has shown that meditation can produce beneficial changes in attention and the associated brain regions. In particular, sustained attention has been shown to be significantly improved by meditation. While this effect has been shown to occur in the visual modality, there is less research on the effects of meditation on auditory sustained attention. Furthermore, there is currently no research examining the effect of meditation on crossmodal sustained attention. This is relevant not only because visual and auditory stimuli are perceived simultaneously in everyday life, but also because it may inform the debate as to whether sustained attention is managed by modality-specific systems or by a single overarching supramodal system. The current research examined the effects of meditation on visual, auditory, and audiovisual crossmodal sustained attention using variants of the Sustained Attention to Response Task. In these tasks, subjects were presented with visual, auditory, or combined visual and auditory stimuli and were required to respond to infrequent targets over an extended period of time. For all of the tasks, meditators differed significantly in accuracy from non-meditating control groups. The meditators made fewer errors without sacrificing response speed, with the exception of the Auditory-target crossmodal task. This demonstrates the benefit of meditation for improving sustained attention across sensory modalities and also lends support to the argument that sustained attention is governed by a supramodal system rather than by modality-specific systems.
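To illustrate how accuracy and response speed in such a sustained-attention task might be summarised by group and modality, a sketch using invented trial-level data is shown below; the column names and values are assumptions for illustration only.

```python
# Sketch: accuracy and response-time summaries for a sustained-attention task,
# split by group and stimulus modality. Column names and data are illustrative.
import pandas as pd

# Hypothetical trial-level log: one row per trial
df = pd.DataFrame({
    "group":    ["meditator", "control"] * 4,
    "modality": ["visual", "visual", "auditory", "auditory",
                 "crossmodal", "crossmodal", "visual", "auditory"],
    "correct":  [1, 0, 1, 1, 1, 0, 1, 1],
    "rt_ms":    [412, 455, 430, 470, 440, 480, 405, 460],
})

summary = (df.groupby(["group", "modality"])
             .agg(accuracy=("correct", "mean"), mean_rt_ms=("rt_ms", "mean"))
             .reset_index())
print(summary)
```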

