Role Assignment Via Physical Mobile Interaction Techniques in Mobile Multi-user Applications for Children

Author(s): Karin Leichtenstern, Elisabeth André, Thurid Vogt
Author(s): Andrew Dekker, Justin Marrington, Stephen Viller

Unlike traditional forms of Human-Computer Interaction research (such as desktop or Web-based design), mobile design by its nature affords little control over the contextual variables of a study. Short-term evaluations of novel mobile interaction techniques are abundant, but these controlled studies address only limited contexts through artificial deployments, which cannot hope to reveal the patterns of use that arise as people appropriate a tool and take it with them into the varying social and physical contexts of their lives. The authors propose a rapid and reflective model of in-situ deployment of high-fidelity prototypes, borrowing the tested habits of industry, in which researchers relinquish tight control over their prototypes in exchange for the opportunity to observe patterns of use that would be intractable to plan for in controlled studies. The approach shifts the emphasis in prototyping away from evaluation and towards exploration and reflection, promoting an iterative prototyping methodology that captures the complexities of the real world.
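As a hint of what such a deployment involves in practice, the following is a minimal Kotlin sketch, not drawn from the paper: the kind of lightweight, append-only usage logging a high-fidelity prototype might embed so that patterns of use can be reconstructed and reflected on after an uncontrolled in-the-wild deployment. The event names and log location are hypothetical.

```kotlin
import java.io.File
import java.time.Instant

// Hypothetical, minimal usage logger a high-fidelity prototype might embed
// so researchers can reconstruct in-the-wild patterns of use after the fact.
// Event names and the log location are illustrative assumptions, not part of
// the authors' described system.
class UsageLogger(private val logFile: File) {

    // Append one timestamped event per line; a flat append-only log keeps the
    // record intact even if the prototype crashes in uncontrolled settings.
    fun log(event: String, detail: String = "") {
        logFile.appendText("${Instant.now()}\t$event\t$detail\n")
    }
}

fun main() {
    val logger = UsageLogger(File("usage.log"))
    // The prototype calls the logger at the interaction points it cares
    // about; analysis and reflection happen later, off-device.
    logger.log("app_opened")
    logger.log("feature_used", "share_photo")
    logger.log("app_backgrounded")
}
```

The design choice matches the abstract's emphasis: instrumentation stays simple and robust so the prototype can be handed over and observed, rather than tightly controlled.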


2018, pp. 1084-1094
Author(s): Stefan Schneegass, Thomas Olsson, Sven Mayer, Kristof van Laerhoven

Wearable computing has huge potential to shape the way we interact with mobile devices in the future, yet interaction with mobile devices is still mainly limited to visual output and tactile finger-based input. Despite visions of next-generation mobile interaction, the hand-held form factor hinders new interaction techniques from becoming commonplace. In contrast, wearable devices and sensors are intended for more continuous, close-to-body use, which makes it possible to design novel wearable-augmented mobile interaction methods, both explicit and implicit. For example, the ECG signal from a wearable chest strap could be used to identify the user's state and change the device state accordingly (implicit), while optical tracking with a head-mounted camera could be used to recognize gestural input (explicit). In this paper, the authors outline the design space of how existing and envisioned wearable devices and sensors could augment mobile interaction techniques. Based on designs and discussions from a recently organized workshop on the topic, as well as other related work, the authors present an overview of this design space and highlight use cases that underline its potential.
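To make the implicit path concrete, here is a minimal Kotlin sketch, not taken from the paper: simulated heart-rate samples stand in for a wearable chest-strap signal, and a sustained elevated window flips a hypothetical device state without any explicit user input. The threshold, window size, and DeviceState names are illustrative assumptions.

```kotlin
// Implicit wearable-augmented interaction: a physiological signal drives a
// device-state change. All names and values here are illustrative.
enum class DeviceState { NORMAL, DO_NOT_DISTURB }

class ImplicitStateController(
    private val threshold: Int = 100,   // bpm above which we infer high load
    private val window: Int = 5         // consecutive samples required
) {
    private val recent = ArrayDeque<Int>()
    var state = DeviceState.NORMAL
        private set

    // Feed one sensor sample; flip the device state only when the whole
    // recent window agrees, so momentary spikes do not toggle the state.
    fun onHeartRateSample(bpm: Int) {
        recent.addLast(bpm)
        if (recent.size > window) recent.removeFirst()
        if (recent.size == window) {
            state = when {
                recent.all { it > threshold } -> DeviceState.DO_NOT_DISTURB
                recent.all { it <= threshold } -> DeviceState.NORMAL
                else -> state
            }
        }
    }
}

fun main() {
    val controller = ImplicitStateController()
    // Simulated chest-strap readings: rest, sustained elevation, recovery.
    val samples = listOf(72, 75, 110, 112, 115, 118, 120, 90, 85, 80, 78, 76)
    for (bpm in samples) {
        controller.onHeartRateSample(bpm)
        println("bpm=$bpm -> ${controller.state}")
    }
}
```

The windowed agreement rule is one simple way to keep an implicit interaction from feeling erratic, which is a recurring concern in this design space.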


Author(s): Enrico Rukzio, Karin Leichtenstern, Vic Callaghan, Paul Holleis, Albrecht Schmidt, ...

2016, Vol 8 (4), pp. 104-114
Author(s): Stefan Schneegass, Thomas Olsson, Sven Mayer, Kristof van Laerhoven



2013, Vol 18 (4), pp. 1013-1026
Author(s): Julian Seifert, David Dobbelstein, Dominik Schmidt, Paul Holleis, Enrico Rukzio

Author(s): Sean T. Hayes, Julie A. Adams

Linear changes in position are difficult to measure using only a mobile device's onboard sensors, so prior research has relied on external sensors or known environmental references to develop mobile phone interaction techniques. The unique head-tracking capabilities of the Amazon Fire Phone® were leveraged and evaluated for navigating large application spaces using device-motion gestures. Although touch interaction is shown to outperform device motion, this research demonstrates the feasibility of effective device-motion gestures that rely on changes in the device's position and orientation. Design guidance for future device-motion interaction capabilities is provided.
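To illustrate the kind of mapping such a study evaluates, here is a minimal Kotlin sketch, not the authors' implementation: simulated lateral head-to-device offsets, of the sort the Fire Phone's head tracking exposed, are passed through a dead zone and turned into discrete pan commands for navigating a large application space. The normalized offset encoding, dead-zone value, and gesture names are assumptions for illustration.

```kotlin
// Device-motion gesture recognition from relative head-device offsets.
// Offsets are simulated; the encoding and thresholds are illustrative.
enum class Pan { NONE, LEFT, RIGHT }

class MotionGestureRecognizer(private val deadZone: Float = 0.15f) {
    // Convert a normalized lateral offset (-1..1) into a pan command.
    // The dead zone suppresses sensor noise and unintended drift, one
    // reason touch tends to outperform device motion in practice.
    fun recognize(lateralOffset: Float): Pan = when {
        lateralOffset > deadZone -> Pan.RIGHT
        lateralOffset < -deadZone -> Pan.LEFT
        else -> Pan.NONE
    }
}

fun main() {
    val recognizer = MotionGestureRecognizer()
    // Simulated samples: small drift, a deliberate tilt right, a tilt left.
    val offsets = listOf(0.02f, -0.05f, 0.30f, 0.45f, 0.10f, -0.40f)
    for (offset in offsets) {
        println("offset=$offset -> ${recognizer.recognize(offset)}")
    }
}
```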

