Gesture Interfaces
Recently Published Documents

TOTAL DOCUMENTS: 61 (five years: 11)
H-INDEX: 8 (five years: 1)

Electronics ◽  
2021 ◽  
Vol 10 (24) ◽  
pp. 3078
Author(s):  
Huanwei Wu ◽  
Yi Han ◽  
Yanyin Zhou ◽  
Xiangliang Zhang ◽  
Jibin Yin ◽  
...  

To improve the efficiency of computer input, extensive research has been conducted on hand movement in a spatial region. Most of it has focused on the technologies rather than on users' spatial controllability. To assess this, we analyze a user's common operational area through partitioning, including a one-dimensional layered array and a two-dimensional spatial region array. In addition, to determine the difference in spatial controllability between a sighted person and a visually impaired person, we designed two experiments: target selection under a visual scenario and under a non-visual scenario. Furthermore, we explored two factors: the size and the position of the target. Results showed the following: the 5 × 5 target blocks, which measured 60.8 mm × 48 mm, could be easily controlled by both the sighted and the visually impaired person; the sighted person could most easily select the bottom-right area, whereas for the visually impaired person the most easily selected area was the upper right. Based on these results, we propose two interaction techniques (a non-visual selection technique and a spatial gesture recognition technique for surgery) and four spatial partitioning strategies for human-computer interaction designers, which can improve users' spatial controllability.
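As a rough illustration of the partitioning described above, the sketch below maps a hand position within the operational area to one of the 5 × 5 target blocks (60.8 mm × 48 mm each, as reported in the abstract); the coordinate origin, axis orientation, and Python implementation are assumptions for illustration only.

    # Minimal sketch of the 5 x 5 spatial partitioning discussed above.
    # Block size comes from the abstract; an origin at the top-left corner
    # of the operational area and millimetre units are assumptions.
    BLOCK_W_MM = 60.8   # width of one target block
    BLOCK_H_MM = 48.0   # height of one target block
    GRID_COLS = 5
    GRID_ROWS = 5

    def block_for_position(x_mm: float, y_mm: float):
        """Map a hand position (mm from the top-left corner) to a
        (row, col) block index, or None if it falls outside the grid."""
        col = int(x_mm // BLOCK_W_MM)
        row = int(y_mm // BLOCK_H_MM)
        if 0 <= col < GRID_COLS and 0 <= row < GRID_ROWS:
            return row, col
        return None

    # Example: a point 150 mm right and 100 mm down lands in block (2, 2).
    print(block_for_position(150.0, 100.0))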


2021 ◽  
pp. 82-99
Author(s):  
D.A. Ryumin ◽  
I.A. Kagirov

In this paper, hardware and software solutions for automatic gesture recognition are considered. Trends in image analysis within current computer vision-based approaches are analysed, and each approach is examined to reveal its advantages and drawbacks. Research papers on the usability of gesture interfaces were also reviewed. It was found that sensor-based systems, although quite accurate and fast at recognition, have limited application due to the specificity of the devices involved (gloves, suits) and their relatively narrow distribution. At the same time, computer vision-based approaches can be successfully applied only once the problems of occlusion and dataset availability are solved. The results obtained can be used for designing training systems.
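To make the vision-based side of this comparison concrete, here is a minimal appearance-based sketch in Python/OpenCV (skin-colour segmentation followed by convexity-defect counting); the colour thresholds and depth cut-off are assumptions, and this kind of pipeline breaks down under exactly the occlusion conditions the paper points out.

    # Minimal appearance-based gesture cue: skin-colour segmentation plus
    # convexity-defect counting to estimate extended fingers. HSV thresholds
    # and the defect-depth cut-off are illustrative assumptions.
    import cv2
    import numpy as np

    def count_extended_fingers(frame_bgr: np.ndarray) -> int:
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, (0, 30, 60), (20, 150, 255))  # rough skin range
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return 0
        hand = max(contours, key=cv2.contourArea)        # largest blob = hand
        hull = cv2.convexHull(hand, returnPoints=False)
        defects = cv2.convexityDefects(hand, hull)
        if defects is None:
            return 0
        # Deep convexity defects roughly correspond to gaps between fingers.
        deep = sum(1 for d in defects[:, 0] if d[3] / 256.0 > 20.0)
        return min(deep + 1, 5)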


2021 ◽  
Author(s):  
Yu Wai Chau

To investigate gestural behavior during human-computer interaction, the designs of current interaction methods are first examined. This information is then compared against emerging gesture databases to determine whether their gesture designs follow the guidelines identified in that examination. The comparison also notes common trends across existing gesture databases, such as similar gestures being assigned to specific commands. To observe gestural behavior during interaction with computer interfaces directly, an experiment was devised to observe and record gestures intended for gesture databases using a hardware sensor device. It was found that factors such as opposing adjacent fingers and gestures that simulate object manipulation influence user comfort. The results of this study inform guidelines for designing new gestures for hand gesture interfaces.
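As a sketch of how such recorded samples might be organised for comparison across databases, the record below pairs a gesture with the command it maps to and the comfort-related factors the study identifies; all field names are hypothetical, not the study's actual schema.

    # Hypothetical layout for one entry of a recorded-gesture database;
    # the field names are illustrative assumptions, not the study's schema.
    from dataclasses import dataclass, field

    @dataclass
    class GestureSample:
        command: str                    # e.g. "zoom-in", "rotate"
        gesture_label: str              # e.g. "pinch", "grab-and-turn"
        joint_frames: list = field(default_factory=list)      # per-frame joint data from the sensor
        opposes_adjacent_fingers: bool = False        # comfort factor noted in the results
        simulates_object_manipulation: bool = False   # comfort factor noted in the results
        comfort_rating: int = 0         # e.g. 1 (uncomfortable) to 7 (comfortable)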


2021 ◽  
Author(s):  
Sai Chaitanya Cherukumilli

Human-computer interaction systems have been providing new ways for amateurs to compose music using traditional computer peripherals as well as gesture interfaces. Vibro-tactile patterns, a vibrational art form analogous to auditory music, can also be composed through human-computer interfaces. This thesis discusses Vibro-Motion, a gesture interface system that facilitates real-time composition of vibro-tactile patterns on an existing tactile sensory substitution system, the Emoti-Chair. Vibro-Motion allows users to control the pitch, the magnitude, and the position of the vibration. A usability evaluation of the Vibro-Motion system showed it to be intuitive, comfortable, and enjoyable for the participants.
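A rough sketch of the kind of gesture-to-vibration mapping such a system performs is shown below: hand height drives pitch, forward reach drives magnitude, and lateral position selects which actuator vibrates. The actuator count, frequency band, and coordinate normalisation are assumptions for illustration, not details of Vibro-Motion or the Emoti-Chair.

    # Illustrative mapping from normalised hand coordinates (0..1 each) to
    # vibrotactile parameters. Actuator count and frequency band are
    # assumptions, not taken from the thesis.
    NUM_ACTUATORS = 8
    PITCH_RANGE_HZ = (40.0, 400.0)

    def map_hand_to_vibration(x_norm: float, y_norm: float, z_norm: float):
        lo, hi = PITCH_RANGE_HZ
        pitch_hz = lo + y_norm * (hi - lo)          # higher hand -> higher pitch
        magnitude = max(0.0, min(1.0, z_norm))      # hand pushed forward -> stronger
        channel = min(int(x_norm * NUM_ACTUATORS), NUM_ACTUATORS - 1)  # left-right position
        return pitch_hz, magnitude, channel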


2021 ◽  
Author(s):  
Amanda Powell

Many studies in the field of human-computer interaction (HCI) point to gesture-based interaction (GBI) as a transformative method for communicating with computers. GBI allows people to use common body language, such as waving or pointing, to manipulate devices without physically touching them. Current research suggests that moving beyond traditional mechanical devices such as the mouse or keyboard may create richer and more 'natural' user experiences. Despite this, the interaction mode has not seen broad-scale adoption. A critical analysis of the work taking place in both industry and HCI research reveals tension between the theory and practice of creating gesture-centric interfaces. This study provides a critical overview of the technological, design, and social issues that pose a challenge to the wide-scale adoption of this technology.


Author(s):  
Sherrie Holder ◽  
Leia Stirling

There are many robotic scenarios that require real-time operation in large or unconstrained environments, for example the robotic arm on the International Space Station (ISS). Fully wearable gesture-control systems are well suited to human-robot interaction scenarios where users are mobile and must keep their hands free. A human study examined operation of a simulated ISS robotic arm using three different gesture input mappings compared with the traditional joystick interface. Two gesture mappings permitted multiple simultaneous inputs (multi-input), while the third was a single-input method. Experimental results support performance advantages of multi-input gesture methods over single-input methods. Differences between the two multi-input methods in task completion and workload indicate an effect of user-directed attention on interface success. Mappings based on natural human arm movement are promising for gesture interfaces in mobile robotic applications. This study also highlights challenges in gesture mapping, including how users align gestures with their body and environment.
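The multi-input versus single-input distinction can be sketched as follows: a multi-input mapping lets several tracked arm angles drive several arm axes simultaneously, while a single-input mapping drives only one pre-selected axis per gesture. The specific features and gains below are assumptions, not the study's actual mappings.

    # Contrast between multi-input and single-input gesture mappings for a
    # robotic-arm velocity command; features and gains are illustrative.
    import numpy as np

    def multi_input_command(wrist_pitch: float, wrist_yaw: float,
                            elbow_flex: float) -> np.ndarray:
        """All tracked arm angles drive separate arm axes at the same time."""
        gains = np.array([0.5, 0.5, 0.3])
        return gains * np.array([wrist_pitch, wrist_yaw, elbow_flex])

    def single_input_command(active_axis: int, wrist_pitch: float) -> np.ndarray:
        """Only one pre-selected axis is driven per gesture."""
        cmd = np.zeros(3)
        cmd[active_axis] = 0.5 * wrist_pitch
        return cmd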


2020 ◽  
Vol 2 (2) ◽  
pp. 153-161
Author(s):  
Egemen Ertugrul ◽  
Ping Li ◽  
Bin Sheng
