Dots — An Inclusive Natural User Interface (NUI) for Spatial Computing

Author(s): Weilun Gong, Lan Xiao, Xiaohui Wang, Chang Hee Lee
2021 · Vol 1 · pp. 283-292
Author(s): Jakob Harlan, Benjamin Schleich, Sandro Wartzack

Abstract: The increased availability of affordable virtual-reality hardware in recent years has boosted research and development of such systems across many fields of application. While extended-reality systems are well established for the visualization of product data, immersive authoring tools that can create and modify that data have yet to see widespread productive use. By making use of building blocks, we see the possibility that such tools allow quick expression of spatial concepts, even for non-expert users. Optical hand-tracking technology allows this immersive modeling to be implemented with natural user interfaces, in which users manipulate virtual objects with their bare hands. In this work, we present a systematic collection of natural interactions suited for immersive building-block-based modeling systems. The interactions are conceptually described and categorized by the task they fulfil.
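The organizing idea above — natural interactions described conceptually and grouped by the modeling task they fulfil — can be sketched as a small catalog. This is only an illustrative sketch: the task categories, gesture names, and descriptions below are assumptions for the example, not taken from the paper.

```python
# Hypothetical sketch of a task-categorized catalog of natural interactions.
# All task/gesture names are illustrative, not the paper's actual taxonomy.
from collections import defaultdict


class InteractionCatalog:
    """Collects hand-gesture interactions, grouped by the task they fulfil."""

    def __init__(self):
        self._by_task = defaultdict(list)

    def register(self, task, gesture, description):
        """File one interaction under its task category."""
        self._by_task[task].append({"gesture": gesture, "description": description})

    def for_task(self, task):
        """Return all interactions registered for a given task."""
        return list(self._by_task[task])


catalog = InteractionCatalog()
catalog.register("create", "pinch-and-pull", "pull a new block out of a palette")
catalog.register("transform", "two-hand stretch", "scale a block between both hands")
catalog.register("assemble", "snap-release", "release a block near another to snap it on")

print([i["gesture"] for i in catalog.for_task("create")])  # → ['pinch-and-pull']
```

A structure along these lines would let an immersive modeling system look up which bare-hand interactions are available for each authoring task.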


2015 · Vol 25 (1) · pp. 17-34
Author(s): Juan-Fernando Martin-SanJose, M.-Carmen Juan, Ramón Mollá, Roberto Vivó

Author(s): Shannon K. T. Bailey, Daphne E. Whitmer, Bradford L. Schroeder, Valerie K. Sims

Human-computer interfaces are changing to meet the evolving needs of users and to overcome the limitations of previous generations of computer systems. The current state of computing consists largely of graphical user interfaces (GUIs) that incorporate windows, icons, menus, and pointers (WIMP) as visual representations of computer interactions, controlled via user input on a mouse and keyboard. Although this model of interface has dominated human-computer interaction for decades, WIMP interfaces require an extra step between the user's intent and the computer action, both limiting the interaction and introducing cognitive demands (van Dam, 1997). Alternatively, natural user interfaces (NUIs) employ input methods such as speech, touch, and gesture commands. With NUIs, users can interact directly with the computer without an intermediary device (e.g., mouse, keyboard). Using the body as an input device may be more "natural" because it allows the user to apply existing knowledge of how to interact with the world (Roupé, Bosch-Sijtsema, & Johansson, 2014). To realize the potential of natural interfaces, research must first determine which interactions can be considered natural. For the purposes of this paper, we focus on the naturalness of gesture-based interfaces. The purpose of this study was to determine how people perform natural gesture-based computer actions. To answer this question, we first narrowed down the potential gestures that would be considered natural for an action. In a previous study, participants (n = 17) were asked how they would gesture to interact with a computer to complete a series of actions. After narrowing down the potential natural gestures by identifying the most frequently performed gesture for each action, we asked participants (n = 188) to rate the naturalness of those gestures in the current study.
Participants each watched 26 videos of gestures (3-5 seconds each) and were asked how natural or arbitrary they found each gesture for the series of computer commands (e.g., move object left, shrink object, select object). The videos included the 17 gestures most often performed in the previous study, in which participants were asked what gesture they would naturally use to complete the computer actions. Nine arbitrarily created gestures were also included as a comparison to the natural gestures. By analyzing the ratings on a continuum from "Completely Arbitrary" to "Completely Natural," we found that the natural gestures people produced in the first study were also interpreted as the intended action by this separate sample of participants. All gestures rated as either "Mostly Natural" or "Completely Natural" corresponded to how the object manipulation would be performed physically. For example, the video depicting a closing fist was rated as "natural" for the action of "selecting an object." All of the arbitrarily created gestures were interpreted as "arbitrary" when they did not correspond to the physical action. Determining how people naturally gesture computer commands, and how people interpret those gestures, is useful because it can inform the development of NUIs and contributes to the literature on what makes gestures seem "natural."
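The two-step method described above — elicit gestures, keep the most frequently produced gesture per action, then have a separate sample rate each candidate's naturalness — can be sketched roughly as follows. The elicitation data, gesture names, and 1-5 rating scale are invented placeholders for illustration, not the study's actual data or instrument.

```python
# Rough sketch of the gesture-elicitation-then-rating analysis.
# All data below is made up; only the two-step structure mirrors the text.
from collections import Counter
from statistics import mean

# Step 1 (elicitation study): which gesture each participant produced per action.
elicited = {
    "select object": ["close fist", "close fist", "point", "close fist"],
    "move object left": ["swipe left", "swipe left", "grab and drag"],
}

# The most frequently produced gesture becomes the candidate "natural" gesture.
candidates = {
    action: Counter(gestures).most_common(1)[0][0]
    for action, gestures in elicited.items()
}

# Step 2 (rating study): a separate sample rates each candidate gesture on a
# hypothetical scale from 1 ("Completely Arbitrary") to 5 ("Completely Natural").
ratings = {"close fist": [5, 4, 5, 4], "swipe left": [5, 5, 4]}
mean_naturalness = {gesture: mean(scores) for gesture, scores in ratings.items()}

print(candidates["select object"])  # → close fist
```

With real data, a high mean naturalness for an elicited gesture would support the paper's finding that frequently produced gestures are also interpreted as natural by an independent sample.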


Author(s): Nicolás Jofré, Graciela Rodríguez, Yoselie Alvarado, Jacqueline Fernández, Roberto Guerrero

2015 · Vol 57 (1)
Author(s): Johannes Schöning

Abstract: My research interest lies at the intersection of human-computer interaction (HCI) and geoinformatics. I am interested in developing new methods and novel user interfaces for navigating spatial information. This article gives a brief overview of my past and current research topics and streams. Generally speaking, geography is playing an increasingly important role in computer science, and in the field of HCI in particular, ranging from social computing to natural user interfaces (NUIs). At the same time, research in geography has focused more and more on technology-mediated interaction with spatiotemporal phenomena. By bridging the two fields, my aim is to exploit this fruitful intersection and to develop, design, and evaluate user interfaces that help people solve their daily tasks more enjoyably and effectively.

