Preliminary test of affective virtual reality scenes with head mount display for emotion elicitation experiment

Author(s):  
Kenta Hidaka ◽  
Haoyu Qin ◽  
Jun Kobayashi
2021 ◽  
pp. 93-114
Author(s):  
Radiah Rivu ◽  
Ruoyu Jiang ◽  
Ville Mäkelä ◽  
Mariam Hassib ◽  
Florian Alt

Author(s):  
Sahinya Susindar ◽  
Mahnoosh Sadeghi ◽  
Lea Huntington ◽  
Andrew Singer ◽  
Thomas K. Ferris

Classical methods for eliciting emotional responses, including the use of emotionally charged pictures and films, have been used to study the influence of affective states on human decision-making and other cognitive processes. Advanced multisensory display systems, such as Virtual Reality (VR) headsets, offer a degree of immersion that may support more reliable elicitation of emotional experiences than less immersive displays, and can provide a powerful yet relatively safe platform for inducing negative emotions such as fear and anger. However, it is not well understood how the presentation medium influences the degree to which emotions are elicited. In this study, emotionally charged stimuli were presented via two display configurations (a desktop computer and a VR system) and were evaluated based on performance in a decision task. Results show that VR can be a more effective method for emotion elicitation when studying decision-making under the influence of emotions.


2020 ◽  
Author(s):  
Sayyed Amir Hossain Maghool ◽  
Mitra Homolja ◽  
Marc Aurel Schnabel

In contrast to reductionist investigations of the interrelation between emotion and architecture, we have proposed a new concept for creating an adaptive architecture system that employs biosensors and virtual reality (VR). We have generated a dynamic audio-visual Virtual Environment (VE) that has the potential to manipulate the emotional arousal level of users, measured via the electrodermal activity (EDA) of their skin. Much like a second-order cybernetics system, our simulations have actuators, sensors, and an adaptation mechanism, whereby participants' real-time biofeedback is interpreted and loops back into the simulation to moderate the user experience. The results of our preliminary test show that our system is capable of manipulating the emotional arousal level of participants through its dynamic VE.
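
The adaptation mechanism described above can be pictured as a simple control loop: arousal estimated from EDA is compared with a target level, and the difference is used to adjust the intensity of the audio-visual VE. The sketch below is a minimal illustration of that idea; the functions read_eda_sample and set_scene_intensity are hypothetical placeholders for the biosensor and VE interfaces, not the authors' implementation.

```python
# Minimal sketch of a closed-loop EDA-driven adaptation mechanism.
# read_eda_sample and set_scene_intensity are hypothetical stand-ins
# for the biosensor and VE hooks; they are not the authors' API.

import random
import time


def read_eda_sample() -> float:
    """Hypothetical EDA reading in microsiemens; replace with real sensor I/O."""
    return random.uniform(0.5, 10.0)


def set_scene_intensity(level: float) -> None:
    """Hypothetical hook that drives the audio-visual VE parameters."""
    print(f"VE intensity set to {level:.2f}")


def adaptation_loop(target_arousal: float = 5.0, gain: float = 0.05,
                    steps: int = 10) -> None:
    """Interpret real-time biofeedback and loop it back into the simulation."""
    intensity = 0.5  # normalized scene intensity in [0, 1]
    for _ in range(steps):
        eda = read_eda_sample()
        error = target_arousal - eda                    # distance from target arousal
        intensity = min(1.0, max(0.0, intensity + gain * error))
        set_scene_intensity(intensity)
        time.sleep(0.1)                                 # pacing of the control loop


if __name__ == "__main__":
    adaptation_loop()
```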


2017 ◽  
Vol 12 (5) ◽  
pp. 882-890 ◽  
Author(s):  
Takuzo Yamashita ◽  
Mahendra Kumar Pal ◽  
Kazutoshi Matsuzaki ◽  
Hiromitsu Tomozawa ◽  
...  

To construct a virtual reality (VR) experience system for interior damage due to an earthquake, VR image contents were created by obtaining images, sounds, and vibration data, together with synchronization information, from multiple devices in a room on the 10th floor of a 10-story RC structure tested on the E-Defense shake table. An application for displaying 360-degree images of the interior damage on a head mount display (HMD) was developed. The developed system was exhibited at public disaster prevention events, and a questionnaire survey was then conducted to assess the usefulness of the VR experience in disaster prevention education.
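
Because the images, sounds, and vibration data come from multiple devices, the contents have to be aligned on a common timeline using the recorded synchronization information. The sketch below illustrates one way to look up the nearest-in-time sample across streams; the data layout and field names are assumptions for illustration, not the authors' actual format.

```python
# Minimal sketch of aligning multi-device recordings on a shared timeline.
# The Sample structure and payload strings are illustrative assumptions.

import bisect
from dataclasses import dataclass
from typing import List


@dataclass
class Sample:
    timestamp: float  # seconds since a common trigger (e.g., shake-table start)
    payload: str      # e.g., image filename, audio frame id, vibration reading


def nearest_sample(stream: List[Sample], t: float) -> Sample:
    """Return the sample whose timestamp is closest to time t (stream sorted)."""
    times = [s.timestamp for s in stream]
    i = bisect.bisect_left(times, t)
    candidates = stream[max(0, i - 1):i + 1]
    return min(candidates, key=lambda s: abs(s.timestamp - t))


# Example: find which 360-degree frame and vibration reading belong together
frames = [Sample(0.00, "frame_0000.jpg"), Sample(0.10, "frame_0001.jpg")]
vibration = [Sample(0.02, "acc=0.1g"), Sample(0.09, "acc=0.4g")]
t = 0.10
print(nearest_sample(frames, t).payload, nearest_sample(vibration, t).payload)
```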


2016 ◽  
Vol 16 (1) ◽  
pp. 58-63 ◽  
Author(s):  
Bartal Henriksen ◽  
Ronni Nielsen ◽  
Laszlo Szabo ◽  
Nicolaj Evers ◽  
Martin Kraus ◽  
...  

This paper describes the implementation of a home-based phantom limb pain (PLP) system using virtual reality (VR) and a motion sensor to immerse users in a virtual environment (VE). The work is inspired by mirror therapy (MT), which has been used to relieve PLP. The target patient group is unilateral upper-limb amputees with phantom pain. Using a motion sensor, the system tracks the movement of a user's hand and translates it onto the virtual hand. The system consists of exercises including opening and closing the hand, rotating the hand, and finer finger movements. These exercises are conveyed in the VR as three games: (1) a bending game, where patients have to bend a rod; (2) a box game, where patients pick up and place boxes with their hands; and (3) a button memory game, where patients have to push buttons in a given sequence. These games were tested on twelve healthy participants to evaluate whether they encouraged similar movements as in MT. Prior to the experiment, a preliminary test was conducted on an amputee with PLP to gather qualitative feedback from an end user. The results indicated that the games did convey the exercises from MT, although further testing is needed.
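
The core of such a system is mapping the tracked pose of the intact hand onto the virtual hand, mirrored as in mirror therapy. The sketch below is a minimal illustration of that mapping; the HandPose structure, joint names, and the mirroring convention are assumptions, not the authors' implementation or the API of any particular motion sensor.

```python
# Minimal sketch of driving a virtual hand from a tracked hand pose,
# mirrored across the sagittal plane as in mirror therapy.
# All types and hooks here are hypothetical placeholders.

from dataclasses import dataclass
from typing import Dict, Tuple

Vec3 = Tuple[float, float, float]


@dataclass
class HandPose:
    palm_position: Vec3
    finger_flexion: Dict[str, float]  # per-finger flexion, 0 = open, 1 = closed


def mirror_for_missing_limb(pose: HandPose) -> HandPose:
    """Mirror the intact hand across the sagittal plane (x -> -x)."""
    x, y, z = pose.palm_position
    return HandPose(palm_position=(-x, y, z),
                    finger_flexion=dict(pose.finger_flexion))


def apply_to_virtual_hand(pose: HandPose) -> None:
    """Hypothetical hook into the VE; here we just print the driven values."""
    print("palm:", pose.palm_position, "flexion:", pose.finger_flexion)


# Example frame: an intact right hand, half-closed, drives the virtual left hand
tracked = HandPose(palm_position=(0.12, 0.30, 0.05),
                   finger_flexion={"index": 0.5, "middle": 0.5, "thumb": 0.3})
apply_to_virtual_hand(mirror_for_missing_limb(tracked))
```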


PM&R ◽  
2019 ◽  
Vol 12 (3) ◽  
pp. 257-262 ◽  
Author(s):  
Seung Hak Lee ◽  
Hae‐Yoon Jung ◽  
Seo Jung Yun ◽  
Byung‐Mo Oh ◽  
Han Gil Seo

2019 ◽  
Vol 9 (12) ◽  
pp. 2501 ◽  
Author(s):  
Jeong-Youn Kim ◽  
Jae-Beom Son ◽  
Hyun-Sung Leem ◽  
Seung-Hwan Lee

Brain functional changes can be observed in people after a virtual reality (VR) experience. The present study investigated cybersickness and changes in regional brain activity using electroencephalogram (EEG)-based source localization, before and after a VR experience involving a smartphone-assisted head mount display. Thirty participants (mean age = 25 years) were recruited. All were physically healthy, had no ophthalmological diseases, and had corrected vision better than 20/20. Resting-state EEG and the simulator sickness questionnaire (SSQ) were measured before and after the VR experience. Source activity in each frequency band was calculated using the sLORETA program. After the VR experience, the SSQ total score and sub-scores (nausea, oculomotor symptoms, and disorientation) increased significantly, as did brain source activations: alpha1 activity in the cuneus and alpha2 activity in the cuneus and posterior cingulate gyrus (PCG). The change in SSQ score (after minus before) showed a significant negative correlation with the change in PCG activation (after minus before) in the alpha2 band. The study demonstrated increased cybersickness and increased alpha band power in the cuneus and PCG after the VR experience. Reduced PCG activation in the alpha band may be associated with the symptom severity of cybersickness.
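
The reported correlation can be reproduced in outline by taking, for each participant, the after-minus-before difference in SSQ score and in PCG alpha2 source activity, then correlating the two difference scores. The sketch below illustrates this with made-up placeholder values, not the study's data.

```python
# Minimal sketch of correlating change scores (after minus before).
# All values below are fabricated placeholders for illustration only.

from scipy.stats import pearsonr

ssq_before = [12.0, 18.7, 7.5, 22.4]
ssq_after = [37.4, 29.9, 26.2, 41.1]
pcg_alpha2_before = [0.42, 0.35, 0.51, 0.30]
pcg_alpha2_after = [0.61, 0.58, 0.57, 0.39]

ssq_change = [a - b for a, b in zip(ssq_after, ssq_before)]
pcg_change = [a - b for a, b in zip(pcg_alpha2_after, pcg_alpha2_before)]

r, p = pearsonr(ssq_change, pcg_change)  # the study reported a negative correlation
print(f"Pearson r = {r:.3f}, p = {p:.3f}")
```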


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Seo-Young Choi ◽  
Jae-Hwan Choi ◽  
Eun Hye Oh ◽  
Se-Joon Oh ◽  
Kwang-Dong Choi

To determine the effect of customized vestibular exercise (VE) and optokinetic stimulation (OS) using a virtual reality system in patients with persistent postural-perceptual dizziness (PPPD), patients diagnosed with PPPD were randomly assigned to the VE group or the VE with OS group. All participants received VE for 20 min using a virtual reality system with a head mount display once a week for 4 weeks. Patients in the VE with OS group additionally received OS for 9 min. We analysed the questionnaires, the timed up-and-go (TUG) test, and posturography scores at baseline and after 4 weeks. A total of 28 patients (median age = 74.5, IQR 66-78; 12 men) completed the intervention. From baseline to 4 weeks, the dizziness handicap inventory, activities of daily living (ADL), visual vertigo analogue scale, and TUG improved in the VE group, whereas only ADL and TUG improved in the VE with OS group. Patients with severe visual vertigo improved more in their symptoms than patients with milder visual vertigo (Pearson's r = 0.716, p < 0.001). Our VE program can improve dizziness, quality of life, and gait function in PPPD; however, additional optokinetic stimuli should be applied for individuals with visual vertigo symptoms.
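
A baseline versus 4-week comparison of an outcome such as the TUG time could be analysed roughly as sketched below. The abstract does not name the statistical test; the Wilcoxon signed-rank test and the values shown are illustrative assumptions only, not the study's analysis or data.

```python
# Minimal sketch of a paired baseline vs. 4-week within-group comparison.
# Test choice and all numbers are illustrative assumptions.

from scipy.stats import wilcoxon

tug_baseline = [14.2, 16.8, 13.1, 15.4, 17.9, 12.6]  # seconds, hypothetical
tug_week4 = [12.1, 14.9, 12.2, 13.0, 15.3, 11.8]

stat, p = wilcoxon(tug_baseline, tug_week4)
print(f"Wilcoxon W = {stat:.1f}, p = {p:.3f}")
```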

