Fast 3D hand estimation for mobile interactions

Author(s):  
Yuru Pei ◽  
Gengyu Ma
Author(s):  
Lisbeth Amhag

The aim of this study is to analyze mobile technologies for student-centered learning in a distance higher education program, with a focus on mobile online webinars (web-based seminars or conferencing) accessed from mobile devices such as laptops, smartphones, or tablets. As wearable technologies mature, this could well extend to smart glasses, smartwatches, and similar devices. These tools can support face-to-face interaction, recorded flipped-classroom sessions, and parallel chat communication. The data collection consists of observations of ten online face-to-face webinars with 22 students, six interviews, and two surveys. Theoretically, the study joins the research tradition of Computer-Supported Collaborative Learning, with its emphasis on collaboration, and of Computer Self-Efficacy, concerned with individuals' media and information literacy. The study's main conclusions are that mobile interactions increased student-centered learning of theoretical concepts, strengthened students' ability to review information critically, and provided experiences that bridge professional teaching practices.


2018 ◽  
pp. 1084-1094
Author(s):  
Stefan Schneegass ◽  
Thomas Olsson ◽  
Sven Mayer ◽  
Kristof van Laerhoven

Wearable computing has huge potential to shape the way we interact with mobile devices in the future. Interaction with mobile devices is still mainly limited to visual output and tactile finger-based input. Despite the visions of next-generation mobile interaction, the hand-held form factor hinders new interaction techniques from becoming commonplace. In contrast, wearable devices and sensors are intended for more continuous and close-to-body use. This makes it possible to design novel wearable-augmented mobile interaction methods, both explicit and implicit. For example, the ECG signal from a wearable chest strap could be used to infer the user's state and change the device state accordingly (implicit), while optical tracking with a head-mounted camera could be used to recognize gestural input (explicit). In this paper, the authors outline the design space for how existing and envisioned wearable devices and sensors could augment mobile interaction techniques. Based on designs and discussions in a recently organized workshop on the topic, as well as other related work, the authors present an overview of this design space and highlight use cases that underline its potential.
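To make the implicit/explicit distinction concrete, the following is a minimal Python sketch of how a device might route wearable sensor events to actions. All names here (WearableEvent, Device, handle_event) and the heart-rate threshold are illustrative assumptions, not an API or design from the paper.

from dataclasses import dataclass

@dataclass
class WearableEvent:
    source: str    # e.g. "chest_strap" or "head_camera"
    kind: str      # "heart_rate" (implicit) or "gesture" (explicit)
    value: object  # sensor reading or recognized gesture label

class Device:
    """Stand-in for the mobile device whose state is adapted."""
    def set_mode(self, mode: str) -> None:
        print(f"device mode -> {mode}")
    def dismiss_notification(self) -> None:
        print("notification dismissed")

def handle_event(event: WearableEvent, device: Device) -> None:
    if event.kind == "heart_rate":
        # Implicit interaction: no deliberate user action; the device
        # infers the user's state from the sensed signal and adapts.
        device.set_mode("do_not_disturb" if event.value > 120 else "normal")
    elif event.kind == "gesture":
        # Explicit interaction: the user intentionally performs a
        # gesture that the head-mounted camera recognizes as a command.
        if event.value == "swipe_left":
            device.dismiss_notification()

device = Device()
handle_event(WearableEvent("chest_strap", "heart_rate", 135), device)        # implicit
handle_event(WearableEvent("head_camera", "gesture", "swipe_left"), device)  # explicit

The point of the sketch is the mapping: implicit input turns a sensed user state into a device-state change without any deliberate action, while explicit input turns an intentional action into a command; a real system would replace the fixed threshold with a learned classifier.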


2020 ◽  
Vol 21 (2) ◽  
pp. 1-13
Author(s):  
Hea-Suk Kim ◽  
Yoonjung Cha ◽  
Na-Young Kim

2013 ◽  
Vol 9 (2) ◽  
pp. 281-294 ◽  
Author(s):  
Ralph Barthel ◽  
Alexander Kröner ◽  
Jens Haupert

Author(s):  
Mayra Donaji Barrera Machuca ◽  
Winyu Chinthammit ◽  
Yi Yang ◽  
Henry Duh

Author(s):  
Andrew Molineux ◽  
Keith Cheverst

In recent years, vision recognition applications have made the transition from desktop computers to mobile phones. This has allowed a new range of mobile interactions and applications to be realised. However, the shift has also unearthed new issues in mobile hardware, interaction, and usability. As such, the authors present a survey of mobile vision recognition, outlining a number of academic and commercial applications and analysing what tasks they are able to perform and how they achieve them. The authors conclude with a discussion of the issues and trends found in the survey.


2016 ◽  
Vol 58 (5) ◽  
Author(s):  
Daniel Buschek

Abstract: This essay contributes an extended view of user information inferred by personal devices to motivate applications of biometrics beyond user identification. We unfold a new design space in two parts: first, we take inspiration from the shared focus on individuality in both biometrics and Belk's …

