touch gestures
Recently Published Documents

TOTAL DOCUMENTS: 124 (FIVE YEARS: 36)
H-INDEX: 10 (FIVE YEARS: 2)

Machines ◽  
2021 ◽  
Vol 10 (1) ◽  
pp. 15
Author(s):  
Akiyoshi Hayashi ◽  
Liz Katherine Rincon-Ardila ◽  
Gentiane Venture

In a future society where robots and humans live together, human–robot interaction (HRI) will be an important field of research. While most HRI studies focus on appearance and dialogue, touch communication has received little attention despite its central role in human–human communication. This paper investigates how and where humans touch an inorganic, non-zoomorphic robot arm. Based on these results, we install touch sensors on the robot arm and conduct experiments to collect data on users' impressions of the robot when touching it. Our results suggest two main findings. First, touch gestures can be collected with two sensors, and the collected data can be classified into gestures using machine learning. Second, touch-based communication between humans and robots can improve the user's impression of the robots.
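As a rough illustration of the classification step the abstract describes, the following is a minimal Python sketch: windowed readings from two touch sensors are reduced to summary features and fed to an off-the-shelf classifier. The window shape, feature set, and gesture labels are illustrative assumptions, not the authors' actual pipeline.

```python
# Minimal sketch: classify touch gestures from two-sensor windows.
# All shapes, features, and labels below are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def extract_features(window: np.ndarray) -> np.ndarray:
    """Summarize one (samples x 2-sensor) window of touch readings."""
    return np.concatenate([
        window.mean(axis=0),   # mean reading per sensor
        window.std(axis=0),    # variability per sensor
        window.max(axis=0),    # peak intensity per sensor
        [window.shape[0]],     # contact duration in samples
    ])

# Hypothetical dataset: 200 windows, each 50 samples x 2 sensors,
# labeled with one of four touch gestures (e.g., pat, stroke, poke, hold).
rng = np.random.default_rng(0)
windows = rng.random((200, 50, 2))
labels = rng.integers(0, 4, size=200)

X = np.stack([extract_features(w) for w in windows])
clf = RandomForestClassifier(n_estimators=100, random_state=0)
print(cross_val_score(clf, X, labels, cv=5).mean())  # ~chance on random data
```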


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Ioannis Stylios ◽  
Spyros Kokolakis ◽  
Andreas Skalkos ◽  
Sotirios Chatzis

Purpose: The purpose of this paper is to present a new paradigm, named BioGames, for the extraction of behavioral biometrics (BB) in a convenient and entertaining way. To apply the BioGames paradigm, the authors developed a BB collection tool for mobile devices named BioGames App. The BioGames App collects keystroke dynamics, touch gestures, and motion modalities and is available on GitHub. Interested researchers and practitioners may use it to create their own datasets for research purposes.

Design/methodology/approach: One major challenge for BB and continuous authentication (CA) research is the lack of real BB datasets for research purposes. The compilation and refinement of an appropriate set of BB data constitute a challenge and an open problem. The issue is aggravated by the fact that most users are reluctant to participate in the long, demanding procedures entailed in the collection of research biometric data. As a result, they either do not complete the data collection procedure or do not complete it correctly. Therefore, the authors propose a new paradigm and introduce a BB collection tool, called BioGames, for the extraction of biometric features in a convenient way. The BioGames paradigm proposes a methodology where users play games without participating in a painstaking experimental process. The BioGames App collects keystroke dynamics, touch gestures, and motion modalities.

Findings: The authors proposed a new paradigm for the collection of BB on mobile devices and created the BioGames application. The BioGames App is an Android application that collects BB data on mobile devices and sends them to a database. The database design allows multiple users to store their sensor data at any time, so there is no concern about data separation and synchronization. The BioGames App is General Data Protection Regulation (GDPR) compliant, as it collects and processes only anonymous data.

Originality/value: The BioGames App is a publicly available tool that combines keystroke dynamics, touch gestures, and motion modalities. In addition, it uses a methodology where users play games without participating in a painstaking experimental process.
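To make the keystroke-dynamics modality concrete, here is a minimal sketch of the classic dwell-time and flight-time features such a collector might compute. The event format and feature names are assumptions for illustration; they are not the BioGames App's actual schema.

```python
# Minimal sketch of keystroke-dynamics feature extraction.
# KeyEvent and the feature names are hypothetical, not the app's schema.
from dataclasses import dataclass

@dataclass
class KeyEvent:
    key: str
    down_ms: float  # timestamp of key press
    up_ms: float    # timestamp of key release

def keystroke_features(events: list[KeyEvent]) -> dict:
    """Compute dwell (hold) and flight (inter-key gap) times."""
    dwell = [e.up_ms - e.down_ms for e in events]
    flight = [b.down_ms - a.up_ms for a, b in zip(events, events[1:])]
    return {
        "mean_dwell_ms": sum(dwell) / len(dwell),
        "mean_flight_ms": sum(flight) / len(flight) if flight else 0.0,
    }

sample = [KeyEvent("b", 0, 95), KeyEvent("i", 180, 260), KeyEvent("o", 340, 430)]
print(keystroke_features(sample))
```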


2021 ◽  
Vol 34 (4) ◽  
pp. 157-171
Author(s):  
Huhn Kim ◽  
Hojeong Im

2021 ◽  
pp. 41-41
Author(s):  
Barbara Juskow

2021 ◽  
Vol 12 ◽  
Author(s):  
Mary L. Courage ◽  
Lynn M. Frizzell ◽  
Colin S. Walsh ◽  
Megan Smith

Although very young children have unprecedented access to touchscreen devices, there is limited research on how successfully they operate these devices for play and learning. For infants and toddlers, whose cognitive, fine motor, and executive functions are immature, several basic questions are significant: (1) Can they operate a tablet purposefully to achieve a goal? (2) Can they acquire operating skills and learn new information from commercially available apps? (3) Do individual differences in executive functioning predict success in using and learning from the apps? Accordingly, 31 2-year-olds (M = 30.82 months, SD = 2.70; 18 female) were compared with 29 3-year-olds (M = 40.92 months, SD = 4.82; 13 female) on two commercially available apps with different task and skill requirements: (1) a shape-matching app performed across 3 days, and (2) a storybook app, with performance compared to that on a matched paper storybook. Children also completed (3) the Minnesota Executive Functioning Scale. An adult provided minimal scaffolding throughout. The results showed that: (1) toddlers could produce the simple goal-directed touch gestures and manual interactions needed to operate the tablet; (2) after controlling for prior experience with shape matching, toddlers became more successful and efficient across trials, making fewer errors, completing trials faster, and requiring less scaffolding; (3) they recognized more story content from the e-book than from the paper book and were less distracted; (4) executive functioning contributed unique variance to the outcome measures on both apps; and (5) 3-year-olds outperformed 2-year-olds on all measures. The results are discussed in terms of the potential of interactive devices to support toddlers' learning.
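The "unique variance" claim in finding (4) is usually tested with a hierarchical regression, comparing model fit with and without the executive functioning (EF) score. Below is a minimal sketch on synthetic data; the variable names, effect sizes, and model form are illustrative assumptions, not the study's actual analysis.

```python
# Minimal sketch: does EF add unique variance beyond age?
# Synthetic data and variable names are illustrative only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 60
age_months = rng.uniform(28, 46, n)
ef_score = rng.normal(50, 10, n)
app_success = 0.3 * age_months + 0.4 * ef_score + rng.normal(0, 5, n)

base = sm.OLS(app_success, sm.add_constant(age_months)).fit()
full = sm.OLS(app_success,
              sm.add_constant(np.column_stack([age_months, ef_score]))).fit()
print(f"Delta R^2 attributable to EF: {full.rsquared - base.rsquared:.3f}")
```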


2021 ◽  
Author(s):  
Jeffrey Haber

This thesis presents a reconfigurable Ground Control Station designed for Unmanned Aerial Vehicle use, which accepts multi-touch gesture inputs and allows the operator to personalize where the instruments they interact with are located on screen. The Ground Control Station was designed and developed in Ryerson University's Mixed-Reality Immersive Motion Simulation Laboratory using commercial off-the-shelf programs supplied by Presagis. Presagis' VAPS XT 4.1 beta was used to design and develop the Ground Control Station's user interface, owing to its ability to create high-quality interfaces for aircraft that harness multi-touch gestures, while FlightSIM 14 was used to simulate a high-fidelity aircraft being controlled by the Ground Control Station. The final interface comprised six key features and 12 different instrument panels that the operator could manipulate to control a simulated aircraft throughout a virtual environment.
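For readers unfamiliar with multi-touch input, the sketch below shows the geometry behind one common gesture such an interface relies on, a two-finger pinch. The computation is generic and not taken from the thesis or from VAPS XT.

```python
# Minimal sketch: scale factor implied by a two-finger pinch.
# Generic geometry; coordinates below are made-up example values.
import math

def pinch_scale(p0_start, p1_start, p0_now, p1_now) -> float:
    """Ratio of current to initial distance between two touch points."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    start = dist(p0_start, p1_start)
    now = dist(p0_now, p1_now)
    return now / start if start else 1.0

# Fingers move apart: scale > 1 could enlarge an instrument panel.
print(pinch_scale((100, 100), (200, 100), (80, 100), (220, 100)))  # 1.4
```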



Author(s):  
Marc Hesenius ◽  
Markus Kleffmann ◽  
Volker Gruhn

To gain a common understanding of an application's layouts, dialogs, and interaction flows, development teams often sketch user interfaces (UIs). Nowadays, they must also define multi-touch gestures, but tools for sketching UIs often lack support for custom gestures and typically integrate only a basic predefined gesture set, which might not suffice to tailor the interaction to the desired use cases. Furthermore, sketching can be enhanced with digital means, but it remains unclear whether digital sketching is actually beneficial when designing gesture-based applications. We extended the AugIR, a digital sketching environment, with GestureCards, a hybrid gesture notation, to allow software engineers to define custom gestures when sketching UIs. We evaluated our approach in a user study contrasting digital and analog sketching of gesture-based UIs.
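As a purely illustrative aside, a machine-readable custom-gesture definition in a sketching tool might look like the following. GestureCards itself is a hybrid graphical/textual notation, and this structure is hypothetical rather than its actual format.

```python
# Hypothetical custom-gesture record for a UI sketching tool;
# not the GestureCards notation, just an illustrative structure.
from dataclasses import dataclass, field

@dataclass
class GestureDefinition:
    name: str
    fingers: int                                       # simultaneous touch points
    strokes: list[str] = field(default_factory=list)   # ordered stroke shapes
    triggers: str = ""                                 # UI action bound to it

two_finger_rotate = GestureDefinition(
    name="rotate-widget", fingers=2,
    strokes=["arc-cw", "arc-cw"],
    triggers="rotate selected dialog element")
print(two_finger_rotate)
```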


Sensors ◽  
2021 ◽  
Vol 21 (4) ◽  
pp. 1328
Author(s):  
Jorge Martin-Gutierrez ◽  
Marta Sylvia Del Rio Guerra

There has been a conscious shift towards developing increasingly inclusive applications. Despite this, most research has focused on supporting those with visual or hearing impairments, and less attention has been paid to cognitive impairments. The purpose of this study is to analyse touch gestures used on touchscreens and identify which gestures are suitable for individuals living with Down syndrome (DS) or other forms of physical or cognitive impairment. With this information, app developers can satisfy Design for All (DfA) requirements by selecting adequate gestures from existing gesture sets. Twenty touch gestures were defined for this study, and a sample group of eighteen individuals with Down syndrome was used. A tool was developed to measure the performance of touch gestures, and participants were asked to perform simple tasks that involved the repeated use of these twenty gestures. Three variables were analysed to establish whether they influence the success rates or completion times of gestures, as they could have a collateral effect on the skill with which gestures are performed: Gender, Type of Down syndrome, and Socioeconomic Status. The analysis reveals significant differences in pairwise comparisons between gestures, meaning that individuals with DS cannot perform all gestures with the same ease. Gender and Socioeconomic Status do not influence success rates or completion times, but Type of DS does.
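The pairwise analysis alluded to above can be illustrated with a minimal sketch: completion times for each pair of gestures are compared with Wilcoxon signed-rank tests under a Bonferroni correction. The data, gesture names, and choice of test here are assumptions for illustration, not the study's actual procedure.

```python
# Minimal sketch: Bonferroni-corrected pairwise gesture comparisons.
# Synthetic completion times; gesture names and test choice are assumed.
from itertools import combinations
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(2)
n_participants = 18
gestures = ["tap", "double-tap", "drag", "pinch"]
times = {g: rng.normal(loc=2 + i, scale=0.5, size=n_participants)
         for i, g in enumerate(gestures)}

pairs = list(combinations(gestures, 2))
alpha = 0.05 / len(pairs)  # Bonferroni-corrected threshold
for a, b in pairs:
    stat, p = wilcoxon(times[a], times[b])
    print(f"{a} vs {b}: p={p:.4f} {'*' if p < alpha else ''}")
```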

