Development of Gesture-based Commands for Natural User Interfaces

Author(s):  
Shannon K. T. Bailey ◽  
Daphne E. Whitmer ◽  
Bradford L. Schroeder ◽  
Valerie K. Sims

Human-computer interfaces are changing to meet the evolving needs of users and to overcome limitations of previous generations of computer systems. The current state of computing consists largely of graphical user interfaces (GUIs) that incorporate windows, icons, menus, and pointers (WIMP) as visual representations of computer interactions controlled via user input on a mouse and keyboard. Although this model of interface has dominated human-computer interaction for decades, WIMP interfaces require an extra step between the user’s intent and the computer action, both limiting the interaction and introducing cognitive demands (van Dam, 1997). Alternatively, natural user interfaces (NUIs) employ input methods such as speech, touch, and gesture commands. With NUIs, users can interact directly with the computer without an intermediary device (e.g., mouse, keyboard). Using the body as an input device may be more “natural” because it allows the user to apply existing knowledge of how to interact with the world (Roupé, Bosch-Sijtsema, & Johansson, 2014). To realize the potential of natural interfaces, research must first determine which interactions can be considered natural. For the purposes of this paper, we focus on the naturalness of gesture-based interfaces. The purpose of this study was to determine how people perform natural gesture-based computer actions. To answer this question, we first narrowed down potential gestures that would be considered natural for an action. In a previous study, participants (n = 17) were asked how they would gesture to interact with a computer to complete a series of actions. After narrowing down the potential natural gestures by calculating the most frequently performed gesture for each action, we asked participants (n = 188) in the current study to rate the naturalness of those gestures.
Participants each watched 26 videos of gestures (3-5 seconds each) and were asked how natural or arbitrary they found each gesture for a series of computer commands (e.g., move object left, shrink object, select object). The videos included the 17 gestures most often performed in the previous study, in which participants were asked what gesture they would naturally use to complete the computer actions. Nine arbitrarily created gestures were also included as a comparison to the natural gestures. By analyzing the ratings on a continuum from “Completely Arbitrary” to “Completely Natural,” we found that the natural gestures people produced in the first study were also interpreted as the intended actions by this separate sample of participants. All of the gestures rated either “Mostly Natural” or “Completely Natural” corresponded to how the object manipulation would be performed physically. For example, the video depicting a fist closing was rated as “natural” for the action of “selecting an object.” All of the arbitrarily created gestures were interpreted as “arbitrary” when they did not correspond to the physical action. Determining how people naturally gesture computer commands, and how people interpret those gestures, is useful because it can inform the development of NUIs and contributes to the literature on what makes gestures seem “natural.”
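The narrowing step the abstract describes, selecting the most frequently performed gesture for each action across participants, can be sketched as a simple modal-choice computation. This is an illustrative sketch only; the gesture labels and data below are hypothetical, not taken from the study.

```python
from collections import Counter

def most_frequent_gestures(elicited):
    """For each action, return the gesture performed most often across
    participants (ties broken by order of first occurrence)."""
    return {action: Counter(gestures).most_common(1)[0][0]
            for action, gestures in elicited.items()}

# Hypothetical elicitation data: action -> gestures produced by participants
elicited = {
    "select object": ["close fist", "point", "close fist", "tap"],
    "move object left": ["swipe left", "swipe left", "point left"],
}
natural_candidates = most_frequent_gestures(elicited)
# e.g. "select object" maps to "close fist"
```

In the study itself this selection was done over 17 participants' elicited gestures; the resulting modal gestures became the "natural" candidates rated by the second sample.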

Author(s):  
John Fulcher

Much has changed in computer interfacing since the early days of computing—or has it? Admittedly, gone are the days of punched cards and paper tape readers as input devices; likewise, monitors (displays) have superseded printers as the primary output device. Nevertheless, the QWERTY keyboard shows little sign of falling into disuse—it is essentially the same input device as those used on the earliest (electromechanical) TeleTYpewriters, in which the “worst” key layout was deliberately chosen to slow down fast typists. The three major advances since the 1950s have been (1) the rise of low-cost (commodity off-the-shelf) CRT monitors in the 1960s (and, in more recent times, LCD ones), (2) the replacement of (text-based) command-line interfaces with graphical user interfaces in the 1980s, and (3) the rise of the Internet/World Wide Web during the 1990s. In recent times, while speech recognition (and synthesis) has made some inroads (e.g., McTear, 2002; O’Shaughnessy, 2003), the QWERTY keyboard and mouse remain the dominant input modalities.


2013 ◽  
Vol 8-9 ◽  
pp. 535-542
Author(s):  
Georgiana Simion

The use of the hand as a direct input device has evolved with the development of Natural User Interfaces (NUI). Touch screens are well integrated into our daily lives, and the new challenge is to implement interfaces that involve no direct contact. This paper presents such a solution, implemented within the framework of sparse techniques. Feature vectors representing key distances and angles are extracted and used to detect fingers. The experimental results demonstrate that this technique achieves an error rate of about 5% in finger detection.
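The abstract's feature vectors of "key distances and angles" can be illustrated with a minimal sketch: for each candidate fingertip, compute its distance from the palm centre and its direction angle. This is a hypothetical illustration of that feature-extraction idea, not the paper's actual pipeline; the coordinates are invented.

```python
import math

def finger_features(candidates, palm):
    """Build a feature vector of key distances and angles: for each
    candidate fingertip point, the distance to the palm centre and the
    direction angle (radians) of the fingertip relative to the palm."""
    features = []
    for (x, y) in candidates:
        dx, dy = x - palm[0], y - palm[1]
        features.append(math.hypot(dx, dy))   # distance to palm centre
        features.append(math.atan2(dy, dx))   # direction angle
    return features

# Hypothetical fingertip candidates and palm centre (pixel coordinates)
tips = [(120, 40), (150, 35), (180, 50)]
palm = (150, 120)
vec = finger_features(tips, palm)  # 2 features per candidate point
```

A classifier (in the paper, built within a sparse-representation framework) would then decide from such a vector which candidates are true fingers.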


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Shreeya Sriram ◽  
Shitij Avlani ◽  
Matthew P. Ward ◽  
Shreyas Sen

Continuous multi-channel monitoring of biopotential signals is vital to understanding the body as a whole, facilitating accurate models and predictions in neural research. The current state of the art in wireless technologies for untethered biopotential recording relies on radiative electromagnetic (EM) fields. In such transmissions, only a small fraction of the energy is received, since the EM fields are widely radiated, resulting in lossy, inefficient systems. Using the body as a communication medium (similar to a ’wire’) allows the energy to be contained within the body, yielding order(s)-of-magnitude lower energy than radiative EM communication. In this work, we introduce Animal Body Communication (ABC), which brings the concept of using the body as a medium into the domain of untethered animal biopotential recording. This work, for the first time, develops the theory and models for animal body communication circuitry and channel loss. Using this theoretical model, a sub-inch³ [1″ × 1″ × 0.4″], custom-designed sensor node is built from off-the-shelf components, capable of sensing and transmitting biopotential signals through the body of a rat at significantly lower power than traditional wireless transmission. In-vivo experimental analysis shows that ABC successfully transmits acquired electrocardiogram (EKG) signals through the body with correlation >99% compared to traditional wireless communication modalities, with a 50× reduction in power consumption.
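The >99% figure the abstract reports is a correlation between the EKG signal recovered over the body channel and a reference recording. A standard way to compute such a fidelity metric is the Pearson correlation coefficient, sketched below on toy signals; the traces are invented for illustration, not the paper's data.

```python
import math

def pearson_r(x, y):
    """Pearson correlation between a received signal and a reference."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Toy traces: a reference EKG-like signal and a slightly noisy received copy
ref = [0.0, 0.2, 1.0, 0.3, 0.0, -0.1, 0.0]
rx  = [0.0, 0.21, 0.98, 0.31, 0.0, -0.1, 0.01]
r = pearson_r(ref, rx)  # close to 1 for a faithful transmission
```

A correlation above 0.99 against the reference, as reported in the paper, indicates the body-channel link preserves the waveform essentially intact.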


2021 ◽  
pp. 136787792110035
Author(s):  
Mari Lehto ◽  
Susanna Paasonen

This article investigates the affective power of social media by analysing everyday encounters with parenting content among mothers. Drawing on data composed of diaries of social media use and follow-up interviews with six women, we ask how our study participants make sense of their experiences of parenting content and the affective intensities connected to it. Despite the negativity involved in reading and participating in parenting discussions, the participants find themselves wanting to maintain the very connections that irritate them, or even evoke a sense of failure, as these also yield pleasure, joy and recognition. We suggest that the ambiguities addressed in our research data speak of something broader than the specific experiences of the women in question. We argue that they point to the necessity of focusing on, and working through, affective ambiguity in social media research in order to gain a fuller understanding of the complex appeal of platforms and exchanges.


2021 ◽  
Vol 1 ◽  
pp. 283-292
Author(s):  
Jakob Harlan ◽  
Benjamin Schleich ◽  
Sandro Wartzack

The increased availability of affordable virtual reality hardware in recent years has boosted research and development of such systems for many fields of application. While extended reality systems are well established for the visualization of product data, immersive authoring tools that can create and modify that data have yet to see widespread productive use. Making use of building blocks, we see the possibility that such tools allow quick expression of spatial concepts, even for non-expert users. Optical hand-tracking technology allows this immersive modeling to be implemented using natural user interfaces, in which users manipulate the virtual objects with their bare hands. In this work, we present a systematic collection of natural interactions suited for immersive building-block-based modeling systems. The interactions are conceptually described and categorized by the task they fulfil.


2021 ◽  
pp. 003329412199777
Author(s):  
Robin Besse ◽  
Whitney K. Whitaker ◽  
Laura A. Brannon

While many facets of loneliness have been explored, research examining the efficacy of loneliness interventions among young adults has been overlooked. The study of loneliness among young adults has become increasingly important considering the current state of isolation and stay-at-home orders issued to prevent the spread of COVID-19. Preliminary reports suggest an increase in loneliness as a result of the current health pandemic, especially among young adults, who have reported feeling lonelier than any other age group. Such findings warrant the study of ways to help reduce loneliness among young adults. The current study examined the efficacy of strategies that might be used to help young adults manage feelings of loneliness. Two hundred and seventy-eight young adults completed the study. Participants read one of four messages: mindfulness, social cognitions, coping behaviors, or a control. Participants in the mindfulness condition felt better equipped to manage future instances of loneliness and held better attitudes toward this intervention. The current research helps to advance understanding of effective ways of helping young adults cope with loneliness.


Semantic Web ◽  
2021 ◽  
pp. 1-16
Author(s):  
Esko Ikkala ◽  
Eero Hyvönen ◽  
Heikki Rantala ◽  
Mikko Koho

This paper presents a new software framework, Sampo-UI, for developing user interfaces for semantic portals. The goal is to provide the end-user with multiple application perspectives to Linked Data knowledge graphs, and a two-step usage cycle based on faceted search combined with ready-to-use tooling for data analysis. For the software developer, the Sampo-UI framework makes it possible to create highly customizable, user-friendly, and responsive user interfaces using current state-of-the-art JavaScript libraries and data from SPARQL endpoints, while saving substantial coding effort. Sampo-UI is published on GitHub under the open MIT License and has been utilized in several internal and external projects. The framework has been used thus far in creating six published and five forthcoming portals, mostly related to the Cultural Heritage domain, that have had tens of thousands of end-users on the Web.
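The faceted-search cycle the abstract describes (count the values of each facet, then filter results by the user's selections) can be illustrated independently of Sampo-UI itself. The sketch below is a language-agnostic illustration of that idea over simplified result bindings such as a SPARQL endpoint might return; the data, field names, and functions are hypothetical and are not Sampo-UI's actual API.

```python
from collections import Counter

def facet_counts(results, facet):
    """Count how many results carry each value of a facet
    (e.g. 'place' or 'period' in a Cultural Heritage dataset)."""
    return Counter(r[facet] for r in results if facet in r)

def apply_facets(results, selections):
    """Keep only the results matching every selected facet value."""
    return [r for r in results
            if all(r.get(f) == v for f, v in selections.items())]

# Hypothetical, simplified bindings from a SPARQL endpoint
results = [
    {"label": "Sword",  "place": "Finland", "period": "Iron Age"},
    {"label": "Brooch", "place": "Finland", "period": "Bronze Age"},
    {"label": "Coin",   "place": "Sweden",  "period": "Iron Age"},
]
counts = facet_counts(results, "place")
hits = apply_facets(results, {"place": "Finland", "period": "Iron Age"})
```

In a portal built with such a framework, the facet counts drive the selector widgets, and the filtered hit list feeds the result views and data-analysis tooling of the second step of the usage cycle.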


2015 ◽  
Vol 25 (1) ◽  
pp. 17-34 ◽  
Author(s):  
Juan-Fernando Martin-SanJose ◽  
M.-Carmen Juan ◽  
Ramón Mollá ◽  
Roberto Vivó

2020 ◽  
Vol 11 (1) ◽  
pp. 147-160
Author(s):  
Leonardo Mariano Gomes ◽  
Rita Wu

In this article, we present TouchYou, a pair of wearable interfaces that enable affective touch interactions between people over long distances. Through a touch-sensitive interface, which senses touch, pressure, and capacitance, the body becomes the input for stimulating the other body, which wears a stimulation interface that produces the feeling of being touched. The receiving person gets electrical muscle stimulation, along with thermal and mechanical stimulation, reacting to the touch sensed by the first interface. By using TouchYou, people can stimulate each other using their own bodies, not only for sexual relations at a distance but also for producing affection and another way of feeling. We discuss the importance of touch for human relationships, the current state of the art in haptic interfaces, and how technology can be used for the remote transmission of affection. We present the design process of the TouchYou touch-sensitive and stimulation interfaces, contributing a method for developing custom touch sensors; we explore usage scenarios for the technology, including sex toys and sex robots; and we present the concept of using the body as a remote sex interface.

