auditory detection
Recently Published Documents


TOTAL DOCUMENTS: 151 (five years: 21)
H-INDEX: 22 (five years: 2)

2021 · Vol 150 (4) · pp. A140-A140
Author(s): Frank S. Mobley, Eric R. Thompson, Margaret H. Ugolini, Elizabeth L. Fox, Hilary Gallagher

2021 · Vol 150 (4) · pp. A301-A301
Author(s): Margaret H. Ugolini, Eric R. Thompson, Frank S. Mobley

2021 · Vol 12
Author(s): Ting Lu, Jingjing Yang, Xinyu Zhang, Zihan Guo, Shengnan Li, ...

Depression is associated with deficits in emotion processing, and emotional processing is crossmodal. This article investigates whether audiovisual emotional integration differs between a depression group and a normal control group, using a high-resolution event-related potential (ERP) technique. We designed a visual and/or auditory detection task. Behaviourally, responses to bimodal audiovisual stimuli were faster than responses to unimodal auditory or visual stimuli, indicating that crossmodal integration of emotional information occurred in both the depression and control groups. The ERP results showed that the N2 amplitude evoked by sadness was significantly larger than that evoked by happiness. Participants in the depression group showed larger N1 and P2 amplitudes, and the mean amplitude of the late positive potential (LPP) evoked over frontocentral sites was significantly lower in the depression group than in the control group. These results indicate that audiovisual emotional processing mechanisms differ between depressed and non-depressed college students.
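The behavioural contrast reported here (faster bimodal than unimodal responses) can be illustrated with a short analysis sketch. The subject count, reaction-time values, and variable names below are hypothetical, not the study's data; the sketch simply shows one way to test bimodal facilitation against each subject's best unimodal condition.

```python
# A minimal sketch of a bimodal-vs-unimodal reaction-time comparison.
# All data here are simulated for illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_subjects = 24  # hypothetical sample size

# Hypothetical per-subject mean RTs (ms) for each modality.
rt_audio = rng.normal(520, 40, n_subjects)
rt_visual = rng.normal(500, 40, n_subjects)
rt_audiovisual = rng.normal(460, 40, n_subjects)

# Crossmodal facilitation: bimodal RT vs. the best unimodal RT per subject.
best_unimodal = np.minimum(rt_audio, rt_visual)
t, p = stats.ttest_rel(rt_audiovisual, best_unimodal)
print(f"bimodal - best unimodal: {np.mean(rt_audiovisual - best_unimodal):.1f} ms, "
      f"t = {t:.2f}, p = {p:.4f}")
```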


2021 · pp. 100014
Author(s): Emmanuel Biau, Danying Wang, Hyojin Park, Ole Jensen, Simon Hanslmayr

2021 · Vol 11 (1)
Author(s): Pilar Montes-Lourido, Manaswini Kar, Isha Kumbam, Srivatsun Sadagopan

Estimates of detection and discrimination thresholds are often used to explore broad perceptual similarities between human subjects and animal models. Pupillometry shows great promise as a non-invasive, easily deployable method of comparing human and animal thresholds. Using pupillometry, previous studies in animal models have obtained threshold estimates for simple stimuli such as pure tones, but have not explored whether similar pupil responses can be evoked by complex stimuli, what other stimulus contingencies might affect stimulus-evoked pupil responses, and whether pupil responses can be modulated by experience or short-term training. In this study, we used an auditory oddball paradigm to estimate detection and discrimination thresholds across a wide range of stimuli in guinea pigs. We demonstrate that pupillometry yields reliable detection and discrimination thresholds across a range of simple (tones) and complex (conspecific vocalizations) stimuli; that pupil responses can be robustly evoked using different stimulus contingencies (low-level acoustic changes or higher-level categorical changes); and that pupil responses are modulated by short-term training. These results lay the foundation for using pupillometry as a reliable method of estimating thresholds in large experimental cohorts, and unveil the full potential of using pupillometry to explore broad similarities between humans and animal models.
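As a sketch of how a detection threshold might be read off pupillometry data in an oddball paradigm of this kind, one can fit a psychometric function to stimulus-evoked pupil response magnitudes and take the level at the sigmoid's midpoint. The levels, response values, and fitting choices below are assumptions for illustration, not the authors' pipeline.

```python
# A minimal sketch of threshold estimation from pupil responses.
# Levels and response magnitudes are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(x, x0, k, base, amp):
    """Logistic psychometric function over stimulus level (dB)."""
    return base + amp / (1.0 + np.exp(-k * (x - x0)))

levels_db = np.array([0, 10, 20, 30, 40, 50, 60], dtype=float)
# Hypothetical mean evoked pupil dilation (a.u.) at each oddball level.
pupil_resp = np.array([0.02, 0.03, 0.05, 0.18, 0.35, 0.42, 0.45])

popt, _ = curve_fit(sigmoid, levels_db, pupil_resp, p0=[30.0, 0.3, 0.0, 0.4])
x0, k, base, amp = popt
# One convention: threshold = level evoking 50% of the maximal response.
print(f"estimated detection threshold ≈ {x0:.1f} dB")
```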


BJS Open · 2020 · Vol 5 (2)
Author(s): M Thomaschewski, M Heldmann, J C Uter, D Varbelow, T F Münte, ...

Abstract
Background: Increasing familiarity and practice might free up mental resources during laparoscopic surgical skills training. The aim of this study was to track changes in mental resource allocation during the acquisition of laparoscopic surgical skills.
Methods: Medical students with no previous experience in laparoscopic surgery took part in a 5-week laparoscopic training curriculum. At the beginning and end of the training period, one of the training tasks was combined with a secondary auditory detection task that required pressing a foot switch for defined target tones, creating a dual-task situation. During execution of the two concurrent tasks, continuous electroencephalographic measurements were made, with special attention to the P300 component, an index of mental resources. Accuracy and reaction times for the secondary task were determined.
Results: All 14 participants successfully completed the training curriculum. Target times for successful completion of individual tasks decreased significantly during training sessions (P < 0.001 for all tasks). Comparing results before and after training showed a significant decrease in event-related brain potential amplitude at the parietal electrode cluster (P300 component; W = 67, P = 0.026), but there were no differences in accuracy (percentage of correct responses: W = 48, P = 0.518) or reaction times (W = 42, P = 0.850) in the auditory detection task.
Conclusion: The decrease of the secondary-task P300 over training demonstrates a shift of mental resources to the primary task, the surgical exercise. This indicates that, with more practice, mental resources are freed up for additional tasks.
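The before/after comparisons above report Wilcoxon W statistics; a minimal sketch of that test on per-participant P300 amplitudes follows. The amplitude values are simulated, not the study's data.

```python
# A minimal sketch of a paired Wilcoxon signed-rank test on P300
# amplitudes before vs. after training. Values are simulated.
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(1)
n = 14  # participants who completed the curriculum

# Hypothetical mean parietal P300 amplitudes (µV) per participant.
p300_before = rng.normal(6.0, 1.5, n)
p300_after = p300_before - rng.normal(1.0, 0.8, n)  # simulated decrease

w, p = wilcoxon(p300_before, p300_after)
print(f"Wilcoxon signed-rank: W = {w:.0f}, p = {p:.3f}")
```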



2020
Author(s): Emmanuel Biau, Danying Wang, Hyojin Park, Ole Jensen, Simon Hanslmayr

Audiovisual speech perception relies, among other things, on our expertise in mapping a speaker's lip movements onto speech sounds. This multimodal matching is facilitated by salient syllable features that align lip movements and acoustic envelope signals in the 4-8 Hz theta band. Although non-exclusive, the predominance of theta rhythms in speech processing has been firmly established by studies showing that neural oscillations track the acoustic envelope in the primary auditory cortex. Equivalently, theta oscillations in the visual cortex entrain to lip movements, and the auditory cortex is recruited during silent speech perception. These findings suggest that neuronal theta oscillations may play a functional role in organising information flow across visual and auditory sensory areas. We presented silent speech movies while participants performed a pure-tone detection task to test whether entrainment to lip movements directs the auditory system and drives behavioural outcomes. We showed that auditory detection varied depending on the ongoing theta phase conveyed by lip movements in the movies. In a complementary experiment presenting the same movies while recording participants' electroencephalogram (EEG), we found that silent lip movements entrained neural oscillations in the visual and auditory cortices, with the visual phase leading the auditory phase. These results support the idea that the visual cortex, entrained by lip movements, filtered the sensitivity of the auditory cortex via theta phase synchronisation.
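A rough sketch of the kind of theta-band phase analysis this abstract describes: band-pass two signals at 4-8 Hz, extract instantaneous phase with the Hilbert transform, and compute the visual-auditory phase lag and phase-locking value. The synthetic signals below merely stand in for visual- and auditory-cortex activity; the authors' actual EEG pipeline is not reproduced here.

```python
# A minimal sketch of theta-band phase-lag and phase-locking estimation.
# Signals are synthetic stand-ins for visual/auditory cortex activity.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 250.0  # sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(2)

# Synthetic 6 Hz activity; the "auditory" signal lags the "visual" one.
visual = np.sin(2 * np.pi * 6 * t) + 0.5 * rng.standard_normal(t.size)
auditory = np.sin(2 * np.pi * 6 * t - 0.8) + 0.5 * rng.standard_normal(t.size)

# 4-8 Hz band-pass, then instantaneous phase via the analytic signal.
b, a = butter(4, [4 / (fs / 2), 8 / (fs / 2)], btype="band")
phase_v = np.angle(hilbert(filtfilt(b, a, visual)))
phase_a = np.angle(hilbert(filtfilt(b, a, auditory)))

z = np.mean(np.exp(1j * (phase_v - phase_a)))
lag = np.angle(z)   # circular mean phase lag (rad); positive = visual leads
plv = np.abs(z)     # phase-locking value in [0, 1]
print(f"mean visual-auditory phase lag: {lag:.2f} rad, PLV = {plv:.2f}")
```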


2020 · Vol 29 (1) · pp. 23-34
Author(s): Kelly N. Jahn, Molly D. Bergan, Julie G. Arenberg

Purpose: The goal of this study was to evaluate differences in the electrode–neuron interface as a function of hearing loss etiology in pediatric cochlear implant (CI) listeners with enlarged vestibular aqueduct (EVA) syndrome and in those with autosomal recessive connexin-26 mutations (DFNB1).
Method: Fifteen implanted ears (9 participants; 5 ears with EVA, 10 ears with DFNB1) were assessed. Single-channel auditory detection thresholds were measured using broad and spatially focused electrode configurations (steered quadrupolar; focusing coefficients = 0 and 0.9). Cochlear resistivity estimates were obtained via electrode impedances and electrical field imaging. Between-group differences were evaluated using linear mixed-effects models.
Results: Children with EVA had significantly higher auditory detection thresholds than children with DFNB1, irrespective of electrode configuration. Between-group differences in thresholds were more pronounced on apical electrodes than on basal electrodes. In the apex, electrode impedances and electrical field imaging values were higher for children with EVA than for those with DFNB1.
Conclusions: The electrode–neuron interface differs between pediatric CI listeners with DFNB1 and those with EVA. Optimal clinical interventions may therefore depend, in part, on hearing loss etiology. Future investigations with large samples should investigate individualized CI programming strategies for listeners with EVA and DFNB1.
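The between-group analysis named here (linear mixed-effects models) can be sketched as follows, with threshold modelled against etiology and electrode region and a random intercept per participant. The data frame, column names, group sizes, and effect sizes below are fabricated for illustration, not the study's data.

```python
# A minimal sketch of a linear mixed-effects comparison of detection
# thresholds by etiology and electrode region. Data are fabricated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
rows = []
for subj in range(9):
    etiology = "EVA" if subj < 4 else "DFNB1"
    offset = 5.0 if etiology == "EVA" else 0.0  # simulated EVA elevation
    for electrode in range(1, 15):
        region = "apical" if electrode <= 7 else "basal"
        # Simulate a larger EVA-DFNB1 difference on apical electrodes.
        thr = 40 + offset + (2.0 if etiology == "EVA" and region == "apical" else 0.0)
        rows.append({"subject": subj, "etiology": etiology,
                     "region": region, "threshold": thr + rng.normal(0, 2)})
df = pd.DataFrame(rows)

# Random intercept per participant; fixed effects for etiology x region.
model = smf.mixedlm("threshold ~ etiology * region", df, groups=df["subject"])
result = model.fit()
print(result.summary())
```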

