auditory task
Recently Published Documents


TOTAL DOCUMENTS: 158 (five years: 56)
H-INDEX: 26 (five years: 2)

2021
Vol 12 (1)
Author(s): Lluís Hernández-Navarro, Ainhoa Hermoso-Mendizabal, Daniel Duque, Jaime de la Rocha, Alexandre Hyafil

Abstract Standard models of perceptual decision-making postulate that a response is triggered in reaction to stimulus presentation when the accumulated stimulus evidence reaches a decision threshold. However, this framework excludes the possibility that informed responses are generated proactively, at a time independent of the stimulus. Here, we find that, in a free-reaction-time auditory task in rats, reactive and proactive responses coexist, suggesting that choice selection and motor initiation, commonly viewed as serial processes, are in general decoupled. We capture this behavior with a novel model in which proactive and reactive responses are triggered whenever either of two competing processes, Action Initiation or Evidence Accumulation respectively, reaches a bound. In both types of response, the choice is ultimately informed by the Evidence Accumulation process. The Action Initiation process readily explains premature responses, contributes to urgency effects at long reaction times, and mediates the slowing of responses as animals become satiated and tired during sessions. Moreover, it successfully predicts reaction-time distributions when the stimulus is delayed, advanced, or omitted. Overall, these results fundamentally extend standard evidence-accumulation models of decision-making by showing that proactive and reactive processes compete for the generation of responses.
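The two-bound race described in this abstract can be illustrated with a toy simulation. This is a hypothetical sketch of the general idea (all parameter values are made-up assumptions, not the authors' fitted model): two processes run in parallel, a response fires when either reaches its bound, and the choice is always read out from the Evidence Accumulation sign.

```python
import random

def race_model_trial(ea_drift=0.03, bound=1.0, noise=0.1,
                     ai_rate=0.005, dt=1.0, max_t=2000):
    """One trial of a race between Evidence Accumulation (EA, a noisy
    drift process) and Action Initiation (AI, a stimulus-independent
    ramp). A response fires when either process reaches its bound; the
    choice is always read out from the sign of EA."""
    ea, ai, t = 0.0, 0.0, 0.0
    while t < max_t:
        ea += ea_drift * dt + noise * random.gauss(0, 1)
        ai += ai_rate * dt  # deterministic ramp toward the bound
        t += dt
        if abs(ea) >= bound:  # evidence hit its bound first: reactive
            return t, ("right" if ea > 0 else "left"), "reactive"
        if ai >= bound:       # ramp hit its bound first: proactive
            return t, ("right" if ea > 0 else "left"), "proactive"
    return max_t, ("right" if ea > 0 else "left"), "timeout"
```

Setting `ea_drift=0` and `noise=0` silences the evidence process entirely, so every response becomes proactive and fires when the ramp reaches the bound (near t = 200 with these made-up defaults), mimicking the stimulus-omitted condition.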


2021
Author(s): Grace M Clements, Mate Gyurkovics, Kathy A Low, Diane M Beck, Monica Fabiani, ...

In the face of multiple sensory streams, there may be competition for processing resources in multimodal cortical areas devoted to establishing representations. In such cases, alpha oscillations may serve to maintain the relevant representations and protect them from interference, whereas theta oscillations may facilitate their updating when needed. These oscillations can be hypothesized to differ in response to an auditory stimulus depending on whether the eyes are open or closed, as intermodal resource competition may be more prominent in the former case than in the latter. Across two studies we investigated the role of alpha and theta power in multimodal competition using an auditory task with the eyes open and closed, respectively enabling and disabling visual processing in parallel with the incoming auditory stream. In a passive listening task (Study 1a), we found alpha suppression following a pip tone with both eyes open and eyes closed, but subsequent alpha enhancement only with eyes closed. We replicated this eyes-closed alpha enhancement in an independent sample (Study 1b). In an active auditory oddball task (Study 2), we again observed the eyes-open/eyes-closed alpha pattern found in Study 1 and also demonstrated that the more attentionally demanding oddball trials elicited the largest oscillatory effects. Theta power did not interact with eye status in either study. We propose a hypothesis to account for these findings in which alpha may be endemic to multimodal cortical areas in addition to visual ones.
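Alpha (roughly 8–12 Hz) and theta (roughly 4–7 Hz) power are conventionally extracted from the EEG spectrum. A minimal stdlib-only sketch of band power via a naive DFT follows; the band edges and the toy signal are illustrative assumptions, not the studies' actual analysis pipeline:

```python
import cmath
import math

def band_power(signal, fs, f_lo, f_hi):
    """Sum of |X(k)|^2 over DFT bins whose frequency (Hz) falls inside
    [f_lo, f_hi]. Naive O(n^2) DFT, computed only for in-band bins of
    the one-sided spectrum -- fine for short illustrative signals."""
    n = len(signal)
    total = 0.0
    for k in range(n // 2 + 1):
        f = k * fs / n  # frequency of bin k in Hz
        if f_lo <= f <= f_hi:
            xk = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                     for t in range(n))
            total += abs(xk) ** 2
    return total
```

For a pure 10 Hz tone sampled at 100 Hz, essentially all power lands in the alpha band and almost none in theta, which is the kind of contrast the band-power measure is meant to capture.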


2021
Author(s): Nikolas Francis, Shoutik Mukherjee, Loren Kocillari, Stefano Panzeri, Behtash Babadi, ...

During auditory task performance, cortical processing of task-relevant information enables mammals to recognize sensory input and flexibly select behavioral responses. In mouse auditory cortex, small neuronal networks encode behavioral choice during a pure-tone detection task, but it is poorly understood how neuronal networks encode behavioral choice during a pure-tone discrimination task, in which tones must be categorized into targets and non-targets. While the interactions between networked neurons are thought to encode behavioral choice, it remains unclear how patterns of neuronal network activity indicate the transmission of task-relevant information within the network. To this end, we trained mice to behaviorally discriminate target vs. non-target pure tones while we used in vivo 2-photon imaging to record neuronal population activity in primary auditory cortex layer 2/3. We found that during task performance, a specialized subset of neurons transiently encoded intersection information, i.e., sensory information that was used to inform behavioral choice. Granger causality analysis showed that these neurons formed functional networks in which task-relevant information was transmitted sequentially between neurons. Differences in network structure between target and non-target sounds encoded behavioral choice. Correct behavioral choices were associated with shorter-timescale communication between neurons. In summary, we find that specialized neuronal populations in auditory cortex form functional networks during auditory task performance whose structure depends on both sensory input and behavioral choice.


Author(s): Alice Bollini, Davide Esposito, Claudio Campus, Monica Gori

Abstract The human brain creates a representation of the external world based on magnitude judgments, by estimating distance, numerosity, or size. Magnitude and spatial representation are hypothesized to rely on common mechanisms shared by different sensory modalities. We explored the relationship between magnitude and spatial representation using two different sensory systems. We hypothesized that the interaction between space and magnitude is combined differently depending on sensory modality. Furthermore, we aimed to understand the role of the spatial reference frame in magnitude representation. We used stimulus–response compatibility (SRC) to investigate these processes, assuming that performance improves if stimulus and response share common features. We designed an auditory and a tactile SRC task with conflicting spatial and magnitude mappings. Our results showed that sensory modality modulates the relationship between space and magnitude. A larger effect of magnitude over spatial congruency occurred in the tactile task. However, magnitude and space carried similar weight in the auditory task, with neither spatial congruency nor magnitude congruency having a significant effect. Moreover, we observed that the spatial frame activated during the tasks was determined by the sensory inputs. Participants' performance in the tactile task reversed between uncrossed and crossed-hands postures, suggesting an internal coordinate system, whereas crossing the hands did not alter auditory performance (i.e., an allocentric frame of reference was used). Overall, these results suggest that the interaction between space and magnitude differs in the auditory and tactile modalities, supporting the idea that these sensory modalities use different magnitude and spatial representation mechanisms.


Life
2021
Vol 11 (8)
pp. 850
Author(s): Federico Linassi, David Peter Obert, Eleonora Maran, Paola Tellaroli, Matthias Kreuzer, ...

General anesthesia should induce unconsciousness and provide amnesia. Amnesia refers to the absence of explicit and implicit memories. Unlike explicit memory, implicit memory is not consciously recalled, and it can affect behavior/performance at a later time. The impact of general anesthesia in preventing implicit memory formation is not well established. We performed a systematic review with meta-analysis of studies reporting implicit memory occurrence in adult patients after deep sedation (Observer's Assessment of Alertness/Sedation score of 0–1 with spontaneous breathing) or general anesthesia. We also evaluated the impact of different anesthetic/analgesic regimens and of the time point of auditory task delivery on implicit memory formation. The meta-analysis included the estimation of odds ratios (ORs) and 95% confidence intervals (CIs). We included a total of 61 studies with 3906 patients and 119 different cohorts. For 43 cohorts (36.1%), implicit memory events were reported. American Society of Anesthesiologists (ASA) physical status III–IV was associated with a higher likelihood of implicit memory formation (OR: 3.48; 95% CI: 1.18–10.25, p < 0.05) than ASA physical status I–II. Further, there was a lower likelihood of implicit memory formation for deep sedation cases compared to general anesthesia (OR: 0.10; 95% CI: 0.01–0.76, p < 0.05), and for patients premedicated with benzodiazepines compared to patients without premedication before general anesthesia (OR: 0.35; 95% CI: 0.13–0.93, p = 0.05).
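Odds ratios like those reported above are computed from 2×2 tables of event counts. A minimal sketch of the standard log-odds (Woolf) confidence interval follows; the counts in the usage note are made up for illustration and are not taken from the meta-analysis:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and z-based CI from a 2x2 table:
         a = exposed with event,    b = exposed without event
         c = unexposed with event,  d = unexposed without event
    SE of log(OR) is sqrt(1/a + 1/b + 1/c + 1/d); the CI is computed
    on the log scale and exponentiated back."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

For example, `odds_ratio_ci(10, 20, 5, 40)` gives OR = 4.0 with a 95% CI of roughly 1.2–13.3; a CI excluding 1 corresponds to p < 0.05 for the two-sided test.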


2021
Vol 12
Author(s): Hisako W. Yamamoto, Misako Kawahara, Akihiro Tanaka

Due to the COVID-19 pandemic, the significance of online research has been rising in the field of psychology. However, online experiments with child participants are rare compared to those with adults. In this study, we investigated the validity of web-based experiments with child participants aged 4–12 years and with adult participants. They performed simple emotion perception tasks in an experiment designed and conducted on the Gorilla Experiment Builder platform. After a short communication with each participant via Zoom videoconferencing software, participants performed an auditory task (judging emotion from vocal expressions) and a visual task (judging emotion from facial expressions). The data collected were compared with data from our previous, similar laboratory experiment, and similar tendencies were found. For the auditory task in particular, we replicated the differences between age groups in the accuracy of perceiving vocal expressions, and we also found the same native-language advantage. Furthermore, we discuss the possibility of using online cognitive studies in future developmental research.


2021
Vol 15
Author(s): Nicole H. Yuen, Fred Tam, Nathan W. Churchill, Tom A. Schweizer, Simon J. Graham

Introduction: Driving motor vehicles is a complex task that depends heavily on how visual stimuli are received and subsequently processed by the brain. The potential impact of distraction on driving performance is well known and poses a safety concern, especially for individuals with cognitive impairments who may be clinically unfit to drive. The present study is the first to combine functional magnetic resonance imaging (fMRI) and eye-tracking during simulated driving with distraction, providing oculomotor metrics to enhance scientific understanding of the brain activity that supports driving performance.

Materials and Methods: As initial work, twelve healthy young, right-handed participants performed turns ranging in complexity, including simple right and left turns without oncoming traffic, and left turns with oncoming traffic. Distraction was introduced as an auditory task during straight driving, and during left turns with oncoming traffic. Eye-tracking data were recorded during fMRI to characterize fixations, saccades, pupil diameter, and blink rate.

Results: Brain activation maps for right turns, left turns without oncoming traffic, left turns with oncoming traffic, and the distraction conditions were largely consistent with previous literature reporting the neural correlates of simulated driving. When the effects of distraction were evaluated for left turns with oncoming traffic, increased activation was observed in areas involved in executive function (e.g., middle and inferior frontal gyri) as well as decreased activation in the posterior brain (e.g., middle and superior occipital gyri). Whereas driving performance remained mostly unchanged (e.g., turn speed, time to turn, collisions), the oculomotor measures showed that distraction resulted in more consistent gaze at oncoming traffic in a small area of the visual scene; less time spent gazing at off-road targets (e.g., speedometer, rear-view mirror); more time spent performing saccadic eye movements; and a decreased blink rate.

Conclusion: Oculomotor behavior modulated with driving task complexity and distraction in a manner consistent with the brain activation features revealed by fMRI. The results suggest that eye-tracking technology should be included in future fMRI studies of simulated driving behavior in targeted populations, such as the elderly and individuals with cognitive complaints, ultimately toward developing better technology to assess and enhance fitness to drive.
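Saccade metrics like those above are commonly derived with a velocity-threshold rule applied to the gaze trace. A deliberately simplified one-dimensional sketch follows; the 30 deg/s threshold and the sampling rate in the test are illustrative assumptions, not this study's eye-tracker pipeline:

```python
def detect_saccades(gaze_deg, fs_hz, vel_thresh_deg_s=30.0):
    """Flag samples whose point-to-point gaze velocity (deg/s) exceeds
    a threshold -- a minimal 1-D version of velocity-based saccade
    detection. Returns one boolean per sample; the first sample has no
    preceding sample, so it is never flagged."""
    flags = [False]
    for i in range(1, len(gaze_deg)):
        velocity = abs(gaze_deg[i] - gaze_deg[i - 1]) * fs_hz
        flags.append(velocity > vel_thresh_deg_s)
    return flags
```

Real pipelines additionally smooth the trace, require a minimum saccade duration, and handle blinks, but the velocity criterion is the core idea.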


2021
Author(s): Paula Ríos López, Andreas Widmann, Aurélie Bidet-Caulet, Nicole Wetzel

Everyday cognitive tasks are rarely performed in a quiet environment. Quite the contrary: very diverse surrounding acoustic signals, such as speech, can involuntarily divert our attention from the task at hand. Despite its tight relation to attentional processes, pupillometry has remained a rather underexploited method for measuring attention allocation towards irrelevant speech. In the present study, we registered changes in pupil diameter to quantify the effect of the meaningfulness of background speech on performance in an attentional task. We recruited 41 native German speakers who had neither received formal instruction in French nor had extensive informal contact with this language. The focal task was an auditory oddball task: participants performed an animal-sound duration discrimination task containing frequently repeated standard sounds and rarely presented deviant sounds while a story was read in German or (non-meaningful) French in the background. Our results revealed that, whereas effects of language meaningfulness on attention were not detectable at the behavioural level, participants' pupils dilated more in response to the sounds of the auditory task when background speech was played in non-meaningful French compared to German, independent of sound type. This could suggest that semantic processing of the native language required attentional resources, which led to fewer resources being devoted to processing the sounds of the focal task. Our results highlight the potential of the pupil dilation response for investigating subtle cognitive processes that might not surface when only behaviour is measured.
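Pupil responses such as these are typically expressed relative to a pre-stimulus baseline. A minimal sketch of subtractive baseline correction is shown below; the window length and sampling rate are illustrative assumptions rather than this study's preprocessing parameters:

```python
def baseline_corrected_dilation(trace, sample_rate_hz, baseline_ms=200):
    """Subtractive baseline correction of a pupil-diameter trace: the
    mean of the pre-stimulus baseline window is subtracted from every
    sample, so positive values index stimulus-evoked dilation."""
    n_base = max(1, int(baseline_ms / 1000 * sample_rate_hz))
    baseline = sum(trace[:n_base]) / n_base
    return [sample - baseline for sample in trace]
```

Divisive correction (percent change from baseline) is a common alternative; either way, the correction removes slow drifts in tonic pupil size so that trials and conditions can be compared.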


2021
Vol 15
Author(s): Luis M. Rivera-Perez, Julia T. Kwapiszewski, Michael T. Roberts

The inferior colliculus (IC), the midbrain hub of the central auditory system, receives extensive cholinergic input from the pontomesencephalic tegmentum. Activation of nicotinic acetylcholine receptors (nAChRs) in the IC can alter acoustic processing and enhance auditory task performance. However, how nAChRs affect the excitability of specific classes of IC neurons remains unknown. Recently, we identified vasoactive intestinal peptide (VIP) neurons as a distinct class of glutamatergic principal neurons in the IC. Here, in experiments using male and female mice, we show that cholinergic terminals are routinely located adjacent to the somas and dendrites of VIP neurons. Using whole-cell electrophysiology in brain slices, we found that acetylcholine drives surprisingly strong and long-lasting excitation and inward currents in VIP neurons. This excitation was unaffected by the muscarinic receptor antagonist atropine. Application of nAChR antagonists revealed that acetylcholine excites VIP neurons mainly via activation of α3β4∗ nAChRs, a nAChR subtype that is rare in the brain. Furthermore, we show that acetylcholine excites VIP neurons directly and does not require intermediate activation of presynaptic inputs that might express nAChRs. Lastly, we found that low frequency trains of acetylcholine puffs elicited temporal summation in VIP neurons, suggesting that in vivo-like patterns of cholinergic input can reshape activity for prolonged periods. These results reveal the first cellular mechanisms of nAChR regulation in the IC, identify a functional role for α3β4∗ nAChRs in the auditory system, and suggest that cholinergic input can potently influence auditory processing by increasing excitability in VIP neurons and their postsynaptic targets.
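The temporal summation reported for low-frequency acetylcholine puff trains can be illustrated with a closed-form leaky-integrator toy model: each puff adds a fixed depolarization that decays exponentially, so slow decay relative to the inter-puff interval lets successive responses stack. Amplitudes and time constants below are arbitrary assumptions, not measured values:

```python
import math

def summation_peaks(amplitude, tau_ms, interval_ms, n_puffs):
    """Peak response right after each puff in a train, for a response
    that decays exponentially with time constant tau between puffs
    (closed form of a leaky integrator sampled at the puff times)."""
    peaks = []
    v = 0.0
    for _ in range(n_puffs):
        v = v * math.exp(-interval_ms / tau_ms) + amplitude
        peaks.append(v)
    return peaks
```

With a decay much slower than the inter-puff interval the peaks grow across the train (summation); with fast decay each puff starts from rest and no summation occurs.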


2021
Author(s): Zhe Chen, Yue Zhang, Zhikai Zhang, Lei Ren, Chaogang Wei, ...

Abstract In this study, we aimed to investigate, by pupillometry, the effect of mild to moderate U-shaped hearing loss on listening effort in children with Alport syndrome (AS). Subjects were required to answer questions after listening to conversations that simulate real scenes in daily life. We recorded the accuracy rate and pupil data under two conditions: SNR = +15 dB and SNR = -2 dB. A mixed-effects model was established to analyze the influence of SNR, mid-frequency energy proportion, and hearing status on accuracy and pupil response. The results showed that SNR had a main effect on accuracy. The baseline pupil diameter of AS children was consistently smaller than that of normal-hearing children. When analyzing the time window spanning the stages of listening to the conversation, listening to the question, and thinking, SNR and hearing status had main effects on mean pupil dilation. We concluded that AS children with hearing loss were often in a state of low arousal before the auditory task. Both hearing status and task difficulty have an impact on the listening effort of AS children. The effort of AS children with U-shaped hearing loss might come mainly from subsequent cognitive processing (as a consequence of effortful listening) rather than from passive listening during speech communication.

