Modeling Brain Dynamics During Virtual Reality-Based Emergency Response Learning Under Stress

Author(s): Oshin Tyagi, Sarah Hopko, John Kang, Yangming Shi, Jing Du, ...

Background: Stress affects learning during training, and virtual reality (VR)-based training systems that manipulate stress can improve retention and retrieval performance for firefighters. Brain imaging using functional near-infrared spectroscopy (fNIRS) can facilitate the development of VR-based adaptive training systems that continuously assess the trainee's states of learning and cognition. Objective: The aim of this study was to model the neural dynamics associated with learning and retrieval under stress in a VR-based emergency response training exercise. Methods: Forty firefighters underwent emergency shutdown training in VR and were randomly assigned to either a control or a stress group. The stress group experienced stressors, including smoke, fire, and explosions, during the familiarization and training phases. Both groups underwent a stress and a no-stress memory-retrieval condition. Participants' performance scores, fNIRS-based neural activity, and functional connectivity between the prefrontal cortex (PFC) and motor regions were obtained for the training and retrieval phases. Results: The performance scores indicate that the rate of learning was slower in the stress group than in the control group, but both groups performed similarly during each retrieval condition. Compared to the control group, the stress group exhibited suppressed PFC activation but showed stronger connectivity within PFC regions during training and between the PFC and motor regions during the retrieval phases. Discussion: While stress impaired performance during training, the adoption of stress-adaptive neural strategies (i.e., stronger brain connectivity) was associated with comparable performance between the stress and control groups during the retrieval phase.

2022, Vol. 3
Author(s): Luciënne A. de With, Nattapong Thammasan, Mannes Poel

To enable virtual reality exposure therapy (VRET), which treats anxiety disorders by gradually exposing the patient to fear stimuli using virtual reality (VR), it is important to monitor the patient's fear levels during the exposure. Despite evidence of a fear circuit in the brain as reflected by functional near-infrared spectroscopy (fNIRS), measurement of the fear response in highly immersive VR using fNIRS is limited, especially in combination with a head-mounted display (HMD). In particular, it is unclear to what extent fNIRS can differentiate users with and without anxiety disorders and detect fear responses in a highly ecological setting using an HMD. In this study, we investigated fNIRS signals captured from participants with and without a fear-of-heights response. To examine the extent to which the fNIRS signals of the two groups differ, we conducted an experiment in which participants with a moderate fear of heights and participants without it were exposed to VR scenarios involving heights and no heights. The between-group statistical analysis shows that the fNIRS data of the control group and the experimental group differ significantly only in the channel located close to the right frontotemporal lobe, where the grand-average oxygenated hemoglobin (Δ[HbO]) contrast signal of the experimental group exceeds that of the control group. The within-group statistical analysis shows significant differences between the grand-average Δ[HbO] contrast values during fear responses and those during no-fear responses, with the Δ[HbO] contrast values of the fear responses significantly higher in the channels located toward the frontal part of the prefrontal cortex. The channel located close to the frontocentral region also showed a significant difference in the grand-average deoxygenated hemoglobin contrast signals. A support vector machine-based classifier detected fear responses with accuracies of up to 70% and 74% in subject-dependent and subject-independent classification, respectively. The results demonstrate that the cortical hemodynamic responses of the control and experimental groups differ to a considerable extent, showing the feasibility and ecological validity of combining a VR HMD with fNIRS to elicit and detect fear responses. This research thus paves the way toward a brain-computer interface to effectively manipulate and control VRET.


2021, pp. 1-9
Author(s): Kyeong Joo Song, Min Ho Chun, Junekyung Lee, Changmin Lee

OBJECTIVE: To investigate the effects of robot-assisted gait training on cortical activation and functional outcomes in stroke patients. METHODS: Patients were randomly assigned to training with Morning Walk® (Morning Walk group; n = 30) or to conventional physiotherapy (control group; n = 30). Rehabilitation was performed five times a week for 3 weeks. The primary outcome was cortical activation in the Morning Walk group. The secondary outcomes included gait speed, the 10-Meter Walk Test (10MWT), Functional Ambulation Category (FAC), Motricity Index–Lower (MI–Lower), Modified Barthel Index (MBI), Rivermead Mobility Index (RMI), and Berg Balance Scale (BBS). RESULTS: Thirty-six subjects were analyzed, 18 in the Morning Walk group and 18 in the control group. At the beginning of robot rehabilitation, cortical activation was lower in the affected hemisphere than in the unaffected hemisphere. After training, the affected hemisphere showed a greater increase in cortical activation than the unaffected hemisphere; consequently, cortical activation in the affected hemisphere was significantly higher than that in the unaffected hemisphere (P = 0.036). FAC, MBI, BBS, and RMI scores improved significantly in both groups. The Morning Walk group had significantly greater improvements than the control group in 10MWT (P = 0.017), gait speed (P = 0.043), BBS (P = 0.010), and MI–Lower (P = 0.047) scores. CONCLUSION: Robot-assisted gait training not only improved functional outcomes but also increased cortical activation in stroke patients.

