Fluctuations in Arousal Correlate with Neural Activity in the Human Thalamus

Author(s):  
Tetsuya Iidaka

Abstract. The neural basis of consciousness has been explored in humans and animals; however, the exact nature of consciousness remains elusive. In this study, we aimed to elucidate which brain regions are relevant to arousal in humans. Simultaneous recordings of brain activity and eye-tracking were conducted in 20 healthy human participants. Brain activity was measured by resting-state functional magnetic resonance imaging with a multiband acquisition protocol. The subjective levels of arousal were investigated based on the degree of eyelid closure recorded using a near-infrared eye camera within the scanner. The results showed that the participants were in an aroused state for 79% of the scan time, and that the bilateral thalami were significantly associated with the arousal condition. Among the major thalamic subnuclei, the mediodorsal nucleus showed greater involvement in arousal than the other subnuclei. A receiver operating characteristic analysis with leave-one-out cross-validation, conducted using template-based brain activity and arousal level data from eye-tracking, showed that in most participants, thalamic activity significantly predicted the subjective levels of arousal. These results indicate that the thalamus, and in particular the mediodorsal nucleus, which has rich connectivity with the prefrontal cortices and the limbic system, plays a significant role in human consciousness.
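As a concrete illustration of the analysis described above, the following minimal Python sketch runs a leave-one-out ROC analysis on simulated data. The logistic model, the variable names, and the simulated thalamic signal are illustrative assumptions, not the authors' actual pipeline:

```python
# Hypothetical sketch: leave-one-out ROC analysis relating a thalamic
# fMRI time course to binary arousal labels derived from eye-tracking.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_volumes = 200
# Binary arousal labels: ~79% of scan time spent aroused, as reported
arousal = (rng.random(n_volumes) < 0.79).astype(int)
# Simulated thalamic signal: elevated during aroused volumes, plus noise
thalamus = arousal + rng.normal(0.0, 0.8, n_volumes)

X = thalamus.reshape(-1, 1)
scores = np.empty(n_volumes)
for train, test in LeaveOneOut().split(X):
    # Fit on all volumes except one, score the held-out volume
    model = LogisticRegression().fit(X[train], arousal[train])
    scores[test] = model.predict_proba(X[test])[:, 1]

auc = roc_auc_score(arousal, scores)
print(f"leave-one-out ROC AUC: {auc:.2f}")
```

An AUC near 0.5 would indicate chance-level prediction; values approaching 1.0 indicate that the thalamic signal discriminates aroused from non-aroused volumes.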

2021
Vol 11 (2)
pp. 196
Author(s):
Sébastien Laurent
Laurence Paire-Ficout
Jean-Michel Boucheix
Stéphane Argon
Antonio Hidalgo-Muñoz

The question of the possible impact of deafness on temporal processing remains unanswered. Different findings, based on behavioral measures, show contradictory results. The goal of the present study is to analyze the brain activity underlying time estimation by using functional near infrared spectroscopy (fNIRS) techniques, which allow examination of the frontal, central and occipital cortical areas. A total of 37 participants (19 deaf) were recruited. The experimental task involved processing a road scene to determine whether the driver had time to safely execute a driving task, such as overtaking. The road scenes were presented in animated format, or in sequences of 3 static images showing the beginning, mid-point, and end of a situation. The latter presentation required a clocking mechanism to estimate the time between the samples to evaluate vehicle speed. The results show greater frontal region activity in deaf people, which suggests that more cognitive effort is needed to process these scenes. The central region, which is involved in clocking according to several studies, is particularly activated by the static presentation in deaf people during the estimation of time lapses. Exploration of the occipital region yielded no conclusive results. Our results on the frontal and central regions encourage further study of the neural basis of time processing and its links with auditory capacity.


2015
Vol 29 (4)
pp. 135-146
Author(s):
Miroslaw Wyczesany
Szczepan J. Grzybowski
Jan Kaiser

Abstract. In this study, the neural basis of emotional reactivity was investigated. Reactivity was operationalized as the impact of emotional pictures on the self-reported ongoing affective state and was used to divide the subjects into high- and low-responder groups. Independent sources of brain activity were identified, localized with the DIPFIT method, and clustered across subjects to analyse the visual evoked potentials to affective pictures. Four of the identified clusters revealed effects of reactivity. The earliest two started about 120 ms after stimulus onset and were located in the occipital lobe and the right temporoparietal junction. Another two, with a latency of 200 ms, were found in the orbitofrontal and the right dorsolateral cortices. Additionally, differences in pre-stimulus alpha level over the visual cortex were observed between the groups. The attentional modulation of perceptual processes is proposed as an early source of emotional reactivity, forming an automatic mechanism of affective control. The role of top-down processes in affective appraisal and, finally, in the experience of ongoing emotional states is also discussed.


2019
Author(s):
Shannon Burns
Lianne N. Barnes
Ian A. McCulloh
Munqith M. Dagher
Emily B. Falk
...  

The large majority of social neuroscience research uses WEIRD populations – participants from Western, educated, industrialized, rich, and democratic locations. This makes it difficult to determine whether neuropsychological functions are universal or culture-specific. In this study, we demonstrate one approach to addressing the imbalance by using portable neuroscience equipment in a study of persuasion conducted in Jordan with an Arabic-speaking sample. Participants were shown persuasive videos on various health and safety topics while their brain activity was measured using functional near infrared spectroscopy (fNIRS). Self-reported persuasiveness ratings for each video were then recorded. Consistent with previous research conducted with American subjects, this work found that activity in the dorsomedial and ventromedial prefrontal cortex predicted how persuasive participants found the videos and how much they intended to engage in the messages’ endorsed behaviors. Further, activity in the left ventrolateral prefrontal cortex was associated with persuasiveness ratings, but only in participants for whom the message was personally relevant. Implications of these results for the understanding of the brain basis of persuasion and for future directions in neuroimaging of diverse populations are discussed.


2012
Vol 24 (9)
pp. 1867-1883
Author(s):
Bradley R. Buchsbaum
Sabrina Lemire-Rodger
Candice Fang
Hervé Abdi

When we have a rich and vivid memory of a past experience, it often feels as if we are transported back in time to witness the event once again. Indeed, a perfect memory would exactly mimic the experiential quality of direct sensory perception. We used fMRI and multivoxel pattern analysis to map and quantify the similarity between patterns of activation evoked by direct perception of a diverse set of short video clips and the vivid remembering, with closed eyes, of these clips. We found that the patterns of distributed brain activation during vivid memory mimicked the patterns evoked during sensory perception. Using whole-brain patterns of activation evoked by perception of the videos, we were able to accurately classify brain patterns that were elicited when participants tried to vividly recall those same videos. A discriminant analysis of the activation patterns associated with each video revealed a high degree of shared representational similarity between perception and memory, explaining over 80% of the variance. These results show that complex, multifeatured memory involves a partial reinstatement of the whole pattern of brain activity evoked during initial perception of the stimulus.


Author(s):
Federico Cassioli
Laura Angioletti
Michela Balconi

Abstract. Human–computer interaction (HCI) is particularly interesting because full-immersive technology may be approached differently by users, depending on the complexity of the interaction, users’ personality traits, and the inclination of their motivational systems. Therefore, this study investigated the relationship between psychological factors and attention towards specific tech-interactions in a smart home system (SHS). The relation between personal psychological traits and eye-tracking metrics was investigated through self-report measures [locus of control (LoC), user experience (UX), behavioral inhibition system (BIS), and behavioral activation system (BAS)] and a wearable, wireless, near-infrared-illumination-based eye-tracking system applied to an Italian sample (n = 19). Participants were asked to activate and interact with five tech-interaction areas of differing complexity (entrance, kitchen, living room, bathroom, and bedroom) in the SHS while their eye-gaze behavior was recorded. Data showed significant differences between a simpler interaction (entrance) and a more complex one (living room) in terms of number of fixations. Moreover, a slower time to first fixation was found in a multifaceted interaction (bathroom) compared with simpler ones (kitchen and living room). Additionally, in two interaction conditions (living room and bathroom), negative correlations were found between external LoC and fixation count, and between BAS reward responsiveness scores and fixation duration. These findings point to a two-way process in which both the complexity of the tech-interaction and the subject’s personality traits shape the user’s visual exploration behavior. This research contributes to understanding user responsiveness, adding first insights that may help create more human-centered technology.
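The reported trait–gaze relationship is a simple bivariate correlation. The sketch below, with made-up scores for n = 19 participants, shows the form of the test; the numbers are invented solely so the correlation comes out negative, matching the study's direction of effect:

```python
# Illustrative sketch (fabricated scores, n = 19): Pearson correlation
# between external locus-of-control scores and fixation counts.
from scipy.stats import pearsonr

external_loc = [12, 15, 9, 18, 11, 14, 20, 8, 13, 16, 10, 17, 19, 7, 12, 15, 9, 14, 11]
fixations    = [34, 28, 40, 22, 36, 30, 20, 42, 31, 26, 38, 24, 21, 45, 33, 27, 41, 29, 37]

r, p = pearsonr(external_loc, fixations)
print(f"r = {r:.2f}, p = {p:.4f}")
```

A significant negative r here would mean that participants scoring higher on external LoC produced fewer fixations, which is the pattern the abstract reports for the living room and bathroom conditions.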


Information
2021
Vol 12 (6)
pp. 226
Author(s):
Lisa-Marie Vortmann
Leonid Schwenke
Felix Putze

Augmented reality is the fusion of virtual components and our real surroundings. The simultaneous visibility of generated and natural objects often requires users to direct their selective attention to a specific target that is either real or virtual. In this study, we investigated whether this target was real or virtual by using machine learning techniques to classify electroencephalographic (EEG) and eye-tracking data collected in augmented reality scenarios. A shallow convolutional neural net classified 3-second EEG data windows from 20 participants in a person-dependent manner with an average accuracy above 70% when the testing data and training data came from different trials. This accuracy could be significantly increased to 77% using a multimodal late-fusion approach that included the recorded eye-tracking data. Person-independent EEG classification was possible above chance level for 6 out of 20 participants. Thus, the reliability of such a brain–computer interface is high enough for it to be treated as a useful input mechanism for augmented reality applications.
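A late-fusion step of the kind described can be as simple as averaging per-window class probabilities from the two modalities. The sketch below uses toy probabilities (not the study's classifier outputs) and an assumed equal weighting:

```python
# Hedged sketch of multimodal late fusion: combine class probabilities
# from an EEG classifier and an eye-tracking classifier per 3-second window.
import numpy as np

def late_fusion(p_eeg, p_eye, w_eeg=0.5):
    """Weighted average of per-window class probabilities (real vs. virtual)."""
    p = w_eeg * np.asarray(p_eeg) + (1 - w_eeg) * np.asarray(p_eye)
    return p.argmax(axis=1)  # 0 = real target, 1 = virtual target

# Three windows, toy softmax outputs from each modality
p_eeg = [[0.6, 0.4], [0.45, 0.55], [0.3, 0.7]]
p_eye = [[0.8, 0.2], [0.4, 0.6], [0.45, 0.55]]
print(late_fusion(p_eeg, p_eye))  # → [0 1 1]
```

Averaging probabilities is only one of several fusion rules (others weight modalities by validation accuracy or learn the combination); the study's exact scheme may differ.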


2003
Vol 89 (5)
pp. 2516-2527
Author(s):
Laurent Petit
Michael S. Beauchamp

We used event-related fMRI to measure brain activity while subjects performed saccadic eye, head, and gaze movements to visually presented targets. Two distinct patterns of response were observed. One set of areas was equally active during eye, head, and gaze movements and consisted of the superior and inferior subdivisions of the frontal eye fields, the supplementary eye field, the intraparietal sulcus, the precuneus, area MT in the lateral occipital sulcus, and, subcortically, the basal ganglia, thalamus, and superior colliculus. These areas have been previously observed in functional imaging studies of human eye movements, suggesting that a common set of brain areas subserves both oculomotor and head-movement control in humans. This is consistent with data from single-unit recording and microstimulation studies in nonhuman primates that have described overlapping eye- and head-movement representations in oculomotor control areas. A second set of areas was active during head and gaze movements but not during eye movements. This set included the posterior part of the planum temporale and the cortex at the temporoparietal junction, known as the parieto-insular vestibular cortex (PIVC). Activity in PIVC has been observed during imaging studies of invasive vestibular stimulation, and we confirm its role in processing the vestibular cues accompanying natural head movements. Our findings demonstrate that fMRI can be used to study the neural basis of head movements and show that areas that control eye movements also control head movements. In addition, we provide the first evidence for brain activity associated with vestibular input produced by natural head movements, as opposed to invasive caloric or galvanic vestibular stimulation.

