auditory stimulus
Recently Published Documents

TOTAL DOCUMENTS: 390 (last five years: 58)
H-INDEX: 39 (last five years: 3)

Author(s): A. B. Rebreikina, D. F. Kleeva, G. A. Soghoyan, O. V. Sysoeva

2021, Vol 11 (1)
Author(s): Dana Maslovat, Christin M. Sadler, Victoria Smith, Allison Bui, Anthony N. Carlsen

Abstract: In a simple reaction time task, the presentation of a startling acoustic stimulus has been shown to trigger the prepared response at short latency, a phenomenon known as the StartReact effect. However, it is unclear under what conditions the loud stimulus can be assumed to trigger the response. The purpose of the present study was to examine how auditory stimulus intensity and preparation level affect the probability of involuntary response triggering and the incidence of activation in the sternocleidomastoid (SCM), a startle reflex indicator. In two reaction time experiments, participants were presented with a task-irrelevant auditory stimulus of varying intensity at various time points prior to the visual go-signal. Responses were independently categorized by whether they were made to the auditory or the visual stimulus, and by whether SCM activation was present (SCM+) or absent (SCM−). Both the incidence of response triggering and the proportion of SCM+ trials increased with stimulus intensity and with presentation closer to the go-signal. Participants also reacted to the auditory stimulus at a much higher rate on trials in which it elicited SCM activity than on trials in which it did not, and a logistic regression analysis confirmed that SCM activation is a reliable predictor of response triggering across all conditions.
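
A minimal sketch (not the authors' code) of how such a trial-level logistic regression could be run in Python with statsmodels; the data, column names, and coefficient values below are hypothetical:

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical trial-level data: SCM activation (0/1), stimulus intensity
# (dB), and whether the prepared response was triggered (0/1).
rng = np.random.default_rng(0)
n = 200
scm_active = rng.integers(0, 2, n)
intensity = rng.choice([90, 105, 120], n)
# Assumed relationship: triggering is more likely with SCM activity
# and with higher stimulus intensity.
p = 1 / (1 + np.exp(-(-12 + 3.0 * scm_active + 0.08 * intensity)))
triggered = rng.binomial(1, p)

trials = pd.DataFrame({"triggered": triggered,
                       "scm_active": scm_active,
                       "intensity": intensity})
model = smf.logit("triggered ~ scm_active + intensity", data=trials).fit(disp=False)
print(model.summary())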


i-Perception, 2021, Vol 12 (6), pp. 204166952110582
Author(s): Jing Fu, Xuanru Guo, Xiaoyu Tang, Aijun Wang, Ming Zhang, ...

Attention comprises three functional network subcomponents: alerting, orienting, and executive control. The attention network test (ANT) is commonly used to measure the efficiency of these three subcomponents. Previous research has focused on unimodal attention using visual or auditory ANT paradigms; however, it remains unclear how an auditory stimulus influences the visual attention networks. This study investigated the effects of bilateral auditory stimuli (Experiment 1) and an ipsilateral auditory stimulus (Experiment 2) on the visual attention subcomponents. We employed an ANT paradigm and manipulated the target modality (visual versus audiovisual). Participants were instructed to report the direction of a central arrow surrounded by distractor arrows. In Experiment 1, simultaneous bilateral auditory stimuli reduced the efficiency of visual alerting and orienting but had no significant effect on the efficiency of visual executive control. In Experiment 2, the ipsilateral auditory stimulus reduced the efficiency of visual executive control but had no significant effect on the efficiency of visual alerting and orienting. We also observed a reduced relative multisensory response enhancement (rMRE) effect in the cue condition relative to the no-cue condition (Experiment 1) and an increased rMRE effect in the congruent condition compared with the incongruent condition (Experiment 2). These results provide the first evidence for alerting, orienting, and executive control effects in an audiovisual condition, and show that bilateral and ipsilateral auditory stimuli affect the subcomponents of visual attention differently.
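
For context, the ANT network efficiencies are simple reaction-time subtractions. The sketch below shows the standard scoring logic together with one common definition of rMRE (the paper may define it differently); all condition names and RT values are hypothetical:

def network_scores(rt):
    # rt: dict mapping condition name -> mean reaction time (ms)
    return {
        "alerting":  rt["no_cue"] - rt["double_cue"],
        "orienting": rt["center_cue"] - rt["spatial_cue"],
        "executive": rt["incongruent"] - rt["congruent"],
    }

def rmre(rt_visual, rt_audiovisual):
    # Percent RT gain of audiovisual targets over visual-only targets.
    return 100.0 * (rt_visual - rt_audiovisual) / rt_visual

# Hypothetical mean RTs (ms) for illustration only.
rt = {"no_cue": 620, "double_cue": 585, "center_cue": 600,
      "spatial_cue": 560, "incongruent": 660, "congruent": 570}
print(network_scores(rt))  # alerting: 35, orienting: 40, executive: 90
print(rmre(600, 565))      # ~5.8% multisensory enhancement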


PLoS ONE, 2021, Vol 16 (8), pp. e0256914
Author(s): Yuto Tanaka, Yuri Terasawa, Satoshi Umeda

Interoceptive accuracy is an index of the ability to perceive one's internal bodily state, including heartbeat and respiration. Individual differences in interoceptive accuracy influence emotion recognition through autonomic nervous activity. However, the precise mechanism by which interoceptive accuracy affects autonomic reactivity remains unclear. Here, we investigated how cardiac reactivity induced by a non-affective external rhythm differs among individuals, using a heartbeat counting task. Because individuals with poor interoceptive accuracy cannot distinguish an external rhythm from their own cardiac cycle, it has been hypothesized that such a rhythm affects heart rate differently in individuals with good versus poor interoceptive accuracy. Participants observed a visual or auditory stimulus presented at a rhythm similar to their resting heart rate. The stimulus rhythm was then gradually changed from that of the resting heart rate, and we recorded electrocardiograms while participants were exposed to the stimuli. Individuals with good interoceptive accuracy exhibited a deceleration in heart rate when the rhythm of the auditory stimulus changed. In contrast, in the group with poor interoceptive accuracy, heart rate decreased only when the stimulus became faster. Because these individuals were unable to distinguish the rhythm of their own heartbeat from the external rhythm, we propose that they perceived the stimuli as keeping pace with their heart rate, whereas individuals with good interoceptive accuracy could distinguish their heart rate from the external rhythm. No modality difference was observed, suggesting that both visual and auditory stimuli can mimic heart rate. These results may provide physiological evidence that autonomic reactivity influences the perception of the internal bodily state, and that interoception and the autonomic state interact to some degree.
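
A minimal sketch of the widely used Schandry-style heartbeat-counting score, by which interoceptive accuracy groups are typically defined; the interval data are hypothetical:

def interoceptive_accuracy(recorded, counted):
    # Mean of 1 - |recorded - counted| / recorded across counting intervals.
    scores = [1 - abs(r - c) / r for r, c in zip(recorded, counted)]
    return sum(scores) / len(scores)

recorded = [32, 45, 58]  # heartbeats measured by ECG in each interval
counted  = [28, 40, 50]  # heartbeats reported by the participant
print(round(interoceptive_accuracy(recorded, counted), 3))  # 0.875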


Author(s): Ryo Tachibana, Kazumichi Matsumiya

Abstract: Virtual reality (VR) is a new methodology for behavioral studies. In such studies, millisecond accuracy and precision of stimulus presentation are critical for data replicability. Recently, Python, a programming language widely used in scientific research, has contributed to reliable accuracy and precision in experimental control. However, little is known about whether modern VR environments achieve millisecond accuracy and precision for stimulus presentation, since most standard methods in laboratory studies are not optimized for VR environments. The purpose of this study was to systematically evaluate the accuracy and precision of visual and auditory stimuli generated in modern VR head-mounted displays (HMDs) from HTC and Oculus using Python 2 and 3. We used the newest Python tools for VR and the Black Box Toolkit to measure the actual time lag and jitter. The results showed an 18-ms time lag for the visual stimulus in both HMDs. For the auditory stimulus, the time lag varied between 40 and 60 ms depending on the HMD. The jitter of these time lags was 1 ms for the visual stimulus and 4 ms for the auditory stimulus, which is sufficiently low for general experiments. These time lags remained the same even when auditory and visual stimuli were presented simultaneously. Interestingly, all results were perfectly consistent between the Python 2 and 3 environments. The present study should therefore help establish more reliable stimulus control for psychological and neuroscientific research run in Python environments.
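
A minimal sketch of how lag and jitter can be summarized once external-measurement timestamps are available; the trigger and detection times below are hypothetical, and this is not the authors' analysis pipeline:

import numpy as np

# Hypothetical timestamps (ms): when the software issued each stimulus
# trigger vs. when the photodiode/microphone actually detected it.
trigger_ms  = np.array([0.0, 1000.0, 2000.0, 3000.0, 4000.0])
detected_ms = np.array([18.2, 1018.9, 2017.5, 3018.1, 4019.0])

lag = detected_ms - trigger_ms
print(f"mean lag = {lag.mean():.1f} ms, jitter (SD) = {lag.std(ddof=1):.2f} ms")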


Perception, 2021, pp. 030100662110221
Author(s): Piers Douglas Lionel Howe, Serene Bee Wen Lee

Individuals are often unable to report an attribute of an object to which they recently attended if they expected to report a different attribute, a phenomenon known as attribute amnesia (AA). To date, all AA studies have been conducted in the visual domain. The purpose of this study was to explore the boundary conditions of AA by testing whether AA also occurs in the auditory domain and, if so, for which attributes. AA was present when reporting the location (p = .003) and the number of tones (p < .001) of an auditory stimulus, but not when reporting its pitch (p = .383). These findings can be understood in terms of the organisation of the primary cortical areas and help explain the differences between visual working memory and auditory working memory.
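
Because each participant contributes only a single surprise trial, AA is typically assessed by comparing accuracy on the surprise trial against a matched control trial across participants. A minimal sketch with hypothetical counts and a Fisher's exact test (the paper's exact test may differ):

from scipy.stats import fisher_exact

# Hypothetical 2x2 counts; rows: surprise trial, first control trial;
# columns: correct, incorrect reports of the probed attribute.
table = [[8, 16],   # surprise trial: 8 of 24 participants correct
         [20, 4]]   # control trial: 20 of 24 participants correct
odds_ratio, p = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p:.4f}")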


Perception, 2021, pp. 030100662110186
Author(s): Aijun Wang, Heng Zhou, Wei Yu, Fan Zhang, Hanbin Sang, ...

The sound-induced flash illusion (SiFI) is the illusion in which the perceived number of visual flashes matches the number of auditory sounds when the flashes are accompanied by an unequal number of sounds presented within 100 ms. The effect of repetition suppression (RS), an adaptive effect caused by stimulus repetition, on the SiFI has not been investigated. Based on the classic SiFI paradigm, the present study investigated whether RS affects the SiFI differently by adding preceding stimuli in the visual and auditory modalities prior to the audiovisual stimuli. The results showed that the auditory RS effect on the SiFI varied with the number of preceding auditory stimuli: the hit rate for the fission illusion was higher with two preceding auditory stimuli than with one, whereas the size of the fusion illusion was unaffected. Visual RS had no effect on the size of either the fission or the fusion illusion. These findings suggest that RS can affect the SiFI and that RS effects in different modalities influence its magnitude differently; in multisensory integration, the visual and auditory modalities show asymmetrical RS effects.
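
A minimal sketch of how fission and fusion rates can be scored from trial data in the classic SiFI paradigm; the trial coding and responses below are hypothetical, not the authors' data:

def illusion_rate(trials, flashes, beeps, illusory_report):
    # Proportion of matching trials on which the illusory percept was reported.
    relevant = [t for t in trials if t["flashes"] == flashes and t["beeps"] == beeps]
    hits = sum(t["report"] == illusory_report for t in relevant)
    return hits / len(relevant)

trials = [  # hypothetical responses
    {"flashes": 1, "beeps": 2, "report": 2},
    {"flashes": 1, "beeps": 2, "report": 1},
    {"flashes": 1, "beeps": 2, "report": 2},
    {"flashes": 2, "beeps": 1, "report": 1},
    {"flashes": 2, "beeps": 1, "report": 2},
]
print("fission rate:", illusion_rate(trials, 1, 2, 2))  # 2/3: reported 2 flashes
print("fusion rate:", illusion_rate(trials, 2, 1, 1))   # 1/2: reported 1 flash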


2021, pp. 104421
Author(s): Alex R. Seigel, Isabelle G. DeVriendt, Savanna J. Hohenstein, Mark B. Lueders, Ananda Shastri, ...

2021
Author(s): Leonardo Versaci, Rodrigo Laje

Finger tapping is a task widely used in a variety of experimental paradigms, in particular to study sensorimotor synchronization and time processing in the range of hundreds of milliseconds (millisecond timing). Normally, subjects do not receive any instruction about what to attend to, and the results are seldom interpreted in light of the possible effects of attention. In this work we show that attention can be oriented to the purely temporal aspects of a paced finger tapping task and that this orientation affects performance. Specifically, time-oriented attention improves accuracy in paced finger tapping and increases resynchronization efficiency after a period perturbation. We use two markers of attention level: auditory ERPs and subjective reports of mental workload. In addition, we propose a novel algorithm to separate the auditory, stimulus-related EEG components from the somatosensory, response-related ones, which naturally overlap in the recorded signal.
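
A minimal sketch of the basic synchronization measures involved: the tap-stimulus asynchrony (accuracy) and its recovery after a period perturbation (resynchronization). All times below are hypothetical:

import numpy as np

# Hypothetical tap and stimulus times (ms) with a 500-ms pacing period;
# a period perturbation is assumed to occur at the sixth stimulus.
stimulus_onsets = np.arange(10) * 500.0
tap_offsets = np.array([-42, -38, -45, -40, -35, -80, -60, -48, -43, -41])
tap_times = stimulus_onsets + tap_offsets

asynchrony = tap_times - stimulus_onsets  # negative: tap precedes the tone
print(f"mean asynchrony = {asynchrony.mean():.1f} ms")
print("resynchronization after the perturbation:", asynchrony[5:])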

