I See Your Gesture: A VR-Based Study of Bidirectional Communication between Pedestrians and Automated Vehicles

2021 ◽  
Vol 2021 ◽  
pp. 1-10
Author(s):  
Michael R. Epke ◽  
Lars Kooijman ◽  
Joost C. F. de Winter

Automated vehicles (AVs) are able to detect pedestrians reliably but still have difficulty in predicting pedestrians’ intentions from their implicit body language. This study examined the effects of using explicit hand gestures and receptive external human-machine interfaces (eHMIs) in the interaction between pedestrians and AVs. Twenty-six participants interacted with AVs in a virtual environment while wearing a head-mounted display. The participants’ movements in the virtual environment were visualized using a motion-tracking suit. The first independent variable was the participants’ opportunity to use a hand gesture to increase the probability that the AV would stop for them. The second independent variable was the AV’s response “I SEE YOU,” displayed on an eHMI when the vehicle yielded. Accordingly, one-way communication (gesture or eHMI) and two-way communication (gesture and eHMI combined) were investigated. The results showed that the participants decided to use hand gestures in 70% of the trials. Furthermore, the eHMI improved the predictability of the AV’s behavior compared to no eHMI, as inferred from self-reports and hand-use behavior. A postexperiment questionnaire indicated that two-way communication was the most preferred condition and that the eHMI alone was more preferred than the gesture alone. The results further indicate limitations of hand gestures regarding false-positive detection and confusion if the AV decides not to yield. It is concluded that bidirectional human-robot communication has considerable potential.

Information ◽  
2019 ◽  
Vol 10 (12) ◽  
pp. 386 ◽  
Author(s):  
Lars Kooijman ◽  
Riender Happee ◽  
Joost de Winter

In future traffic, automated vehicles may be equipped with external human-machine interfaces (eHMIs) that can communicate with pedestrians. Previous research suggests that, during first encounters, pedestrians regard text-based eHMIs as clearer than light-based eHMIs. However, in much of the previous research, pedestrians were asked to imagine crossing the road and were unable or not allowed to actually do so. We investigated the effects of eHMIs on participants’ crossing behavior. Twenty-four participants were immersed in a virtual urban environment using a head-mounted display coupled to a motion-tracking suit. We manipulated the approaching vehicles’ behavior (yielding, nonyielding) and eHMI type (None, Text, Front Brake Lights). Participants could cross the road whenever they felt safe enough to do so. The results showed that forward walking velocities, as recorded at the pelvis, were, on average, higher when an eHMI was present compared to no eHMI if the vehicle yielded. In nonyielding conditions, participants showed a slight forward motion and refrained from crossing. An analysis of participants’ thorax angle indicated rotation towards the approaching vehicles and subsequent rotation towards the crossing path. It is concluded that results obtained via a setup in which participants can cross the road are similar to results from survey studies, with eHMIs yielding a higher crossing intention compared to no eHMI. The motion suit makes it possible to investigate pedestrian behaviors related to bodily attention and hesitation.


Sensors ◽  
2021 ◽  
Vol 21 (2) ◽  
pp. 397
Author(s):  
Qimeng Zhang ◽  
Ji-Su Ban ◽  
Mingyu Kim ◽  
Hae Won Byun ◽  
Chang-Hun Kim

We propose a low-asymmetry interface to improve the presence of non-head-mounted-display (non-HMD) users in shared virtual reality (VR) experiences with HMD users. The low-asymmetry interface ensures that the HMD and non-HMD users’ perception of the VR environment is almost similar. That is, the point-of-view asymmetry and behavior asymmetry between HMD and non-HMD users are reduced. Our system comprises a portable mobile device as a visual display to provide a changing PoV for the non-HMD user and a walking simulator as an in-place walking detection sensor to enable the same level of realistic and unrestricted physical-walking-based locomotion for all users. Because this allows non-HMD users to experience the same level of visualization and free movement as HMD users, both of them can engage as the main actors in movement scenarios. Our user study revealed that the low-asymmetry interface enables non-HMD users to feel a presence similar to that of the HMD users when performing equivalent locomotion tasks in a virtual environment. Furthermore, our system can enable one HMD user and multiple non-HMD users to participate together in a virtual world; moreover, our experiments show that the non-HMD user satisfaction increases with the number of non-HMD participants owing to increased presence and enjoyment.


Author(s):  
Linda Talley ◽  
Samuel R Temple

Nonverbal immediacy is a core element of a leader’s ability to lead followers. Nevertheless, there are no empirical studies regarding a link between a leader’s hand gestures and followers’ perceptions of immediacy (attraction to someone) or nonimmediacy (distancing). Guided by Mehrabian’s theory of nonverbal behavior, this study included one independent variable segmented into seven levels (three positive hand gestures, defined as community hand, humility hands, and steepling hands; three defensive gestures, defined as hands in pockets, arms crossed over chest, and hands behind back; and neutral/no hand gestures) to test for immediacy or nonimmediacy. In this experimental study, participants (n = 300; male = 164; female = 143) were shown one of seven pictures of a leader. Four hypotheses were tested for main and interactional effects, and all were supported by the results. Immediate communication received strong support, meaning that immediacy on the part of a leader is likely to lead to an increased emotional connection and thereby desirable outcomes. This study advances theory beyond previous research by showing that specific hand gestures are more effective than others at creating immediacy between leaders and followers.


Ergonomics ◽  
1996 ◽  
Vol 39 (11) ◽  
pp. 1370-1380 ◽  
Author(s):  
TETSUO KAWARA ◽  
MASAO OHMI ◽  
TATSUYA YOSHIZAWA

Author(s):  
Iurii Krak ◽  
Ruslan Bahrii ◽  
Olexander Barmak

The article describes an information technology for alternative communication implemented by non-contact text entry using a limited number of simple dynamic gestures. Non-contact text entry technologies and motion tracking devices are analysed. A model of the human hand is proposed, which provides information on the position of the hand at each moment in time. Parameters sufficient for recognizing static and dynamic gestures are identified. The process of calculating the features of the various components of movement that occur when showing dynamic hand gestures is considered. Common methods for selecting letters with non-contact text entry are analysed. To implement the user interaction interface, it is proposed to use a radial virtual keyboard whose keys contain grouped alphabet letters. A functional model and a model of human-computer interaction for non-contact text entry have been developed. This enabled the development of an easy-to-use software system for alternative communication, implemented by non-contact text entry using hand gestures. The developed software system provides a communication mechanism for people with disabilities.
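The selection step on such a radial keyboard can be sketched as mapping a tracked hand-movement direction to an angular sector of letter groups. The grouping and sector layout below are illustrative assumptions; the abstract does not specify the actual layout used by the authors:

```python
import math

# Hypothetical letter groups laid out around a radial virtual keyboard
# (the actual grouping in the described system is not given in the abstract).
GROUPS = ["ABCD", "EFGH", "IJKL", "MNOP", "QRST", "UVWX", "YZ._"]

def select_group(dx, dy, groups=GROUPS):
    """Map a 2D hand-movement direction to one of the radial key groups.

    dx, dy: horizontal and vertical components of the tracked hand motion.
    Returns the letter group whose angular sector contains that direction.
    """
    angle = math.atan2(dy, dx) % (2 * math.pi)  # direction in [0, 2*pi)
    sector = 2 * math.pi / len(groups)          # angular width of each group
    return groups[int(angle // sector)]
```

A second, analogous selection within the chosen group would then pick the individual letter, which is the usual two-stage pattern for grouped-key radial keyboards.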


2021 ◽  
Author(s):  
Silvia Arias ◽  
Axel Mossberg ◽  
Daniel Nilsson ◽  
Jonathan Wahlqvist

Comparing results obtained in Virtual Reality to those obtained in physical experiments is key for validation of Virtual Reality as a research method in the field of Human Behavior in Fire. A series of experiments based on similar evacuation scenarios in a high-rise building with evacuation elevators was conducted. The experiments consisted of a physical experiment in a building, and two Virtual Reality experiments in a virtual representation of the same building: one using a Cave Automatic Virtual Environment (CAVE), and one using a head-mounted display (HMD). The data obtained in the HMD experiment is compared to data obtained in the CAVE and physical experiment. The three datasets were compared in terms of pre-evacuation time, noticing escape routes, walking paths, exit choice, waiting times for the elevators and eye-tracking data related to emergency signage. The HMD experiment was able to reproduce the data obtained in the physical experiment in terms of pre-evacuation time and exit choice, but there were large differences with the results from the CAVE experiment. Possible factors affecting the data produced using Virtual Reality are identified, such as spatial orientation and movement in the virtual environment.


2021 ◽  
Vol 10 (5) ◽  
pp. 3546-3551
Author(s):  
Tamanna Nurai

Cybersickness continues to be a negative consequence that degrades the interface for users of virtual worlds created for Virtual Reality (VR). Various abnormalities can cause quantifiable changes in body awareness when donning a head-mounted display (HMD) in a virtual environment (VE). VR headsets provide a VE that matches the actual world and allows users to have a range of experiences. Motion sickness and simulator sickness measures provide self-report assessments of cybersickness in VEs. In this study, a simulator sickness questionnaire is used to measure the aftereffects of the virtual environment. This research aims to answer whether immersive VR induces cybersickness and impacts equilibrium coordination. The present research is designed as a cross-sectional observational analysis. According to the selection criteria, a total of 40 subjects would be recruited from AVBRH, Sawangi Meghe. With the intervention being used, the experiment lasted 6 months. The simulator sickness questionnaire is administered at a single point to measure motion sickness, and the equilibrium tests are evaluated twice: at exit and after 10 minutes. Virtual reality in video games is still in its development; integrating gameplay action into the VR experience will necessitate a significant amount of study and development. The study has evaluated whether immersive VR induces cybersickness and impacts equilibrium coordination. Numerous scales have been developed to measure cybersickness, and the essence of cybersickness has been revealed owing to work on motion sickness in simulated systems.
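For context, the simulator sickness questionnaire (SSQ) is conventionally scored with the subscale weights from Kennedy et al. (1993). A minimal scoring sketch, assuming the raw subscale sums (items rated 0-3) have already been computed from the 16 questionnaire items; the abstract does not state which scoring variant the study used:

```python
# Conventional SSQ weights (Kennedy et al., 1993) - illustrative only.
WEIGHT_N, WEIGHT_O, WEIGHT_D, WEIGHT_TOTAL = 9.54, 7.58, 13.92, 3.74

def ssq_scores(nausea_raw, oculomotor_raw, disorientation_raw):
    """Convert raw SSQ subscale sums into weighted subscale and total scores."""
    return {
        "nausea": nausea_raw * WEIGHT_N,
        "oculomotor": oculomotor_raw * WEIGHT_O,
        "disorientation": disorientation_raw * WEIGHT_D,
        # The total score is the sum of the raw subscale sums times 3.74.
        "total": (nausea_raw + oculomotor_raw + disorientation_raw) * WEIGHT_TOTAL,
    }
```

Higher scores indicate more severe symptoms, which is why the questionnaire is suited to before/after comparisons such as the exit and 10-minute measurements described above.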


2016 ◽  
Vol 48 ◽  
pp. 261-266 ◽  
Author(s):  
Maxime T. Robert ◽  
Laurent Ballaz ◽  
Martin Lemay

Sensors ◽  
2019 ◽  
Vol 19 (16) ◽  
pp. 3548 ◽  
Author(s):  
Piotr Kaczmarek ◽  
Tomasz Mańkowski ◽  
Jakub Tomczyński

In this paper, we present the putEMG dataset, intended for the evaluation of hand gesture recognition methods based on the sEMG signal. The dataset was acquired from 44 able-bodied subjects and includes eight gestures (three full-hand gestures, four pinches, and idle). It consists of uninterrupted recordings of 24 sEMG channels from the subject’s forearm, an RGB video stream and depth camera images used for hand motion tracking. Moreover, exemplary processing scripts are also published. The putEMG dataset is available under a Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0) license. The dataset was validated with regard to sEMG amplitudes and gesture recognition performance. The classification was performed using state-of-the-art classifiers and feature sets. An accuracy of 90% was achieved for an SVM classifier utilising the RMS feature and for an LDA classifier using Hudgins’ and Du’s feature sets. An analysis of performance for particular gestures showed that the LDA/Du combination has significantly higher accuracy for full-hand gestures, while SVM/RMS performs better for pinch gestures. The presented dataset can be used as a benchmark for various classification methods, the evaluation of electrode localisation concepts, or the development of classification methods invariant to user-specific features or electrode displacement.
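The RMS feature mentioned above is typically computed per channel over sliding windows of the sEMG signal. A minimal sketch follows; the window and step sizes are illustrative, not the values used for putEMG, and the resulting feature vectors would then be fed to a classifier such as an SVM:

```python
import math

def rms_features(emg, window=256, step=128):
    """Compute per-channel RMS features over sliding windows.

    emg: list of samples, each a list of channel values
         (n_samples x n_channels), e.g. 24 channels for putEMG.
    window, step: window length and stride in samples (illustrative values).
    Returns a list of feature vectors, one per window.
    """
    feats = []
    for start in range(0, len(emg) - window + 1, step):
        segment = emg[start:start + window]
        # Transpose the window so each entry is one channel's samples,
        # then take the root of the mean squared value per channel.
        feats.append([
            math.sqrt(sum(x * x for x in channel) / window)
            for channel in zip(*segment)
        ])
    return feats
```

With features in this form, training an SVM (for example scikit-learn’s `SVC`) on labelled windows reproduces the kind of RMS/SVM pipeline the abstract reports 90% accuracy for.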

