Kinetic User Interfaces

2009 ◽  
pp. 1015-1035 ◽  
Author(s):  
Vincenzo Pallotta ◽  
Pascal Bruegger ◽  
Béat Hirsbrunner

This chapter presents a conceptual framework for an emerging type of user interfaces for mobile ubiquitous computing systems, and focuses in particular on the interaction through motion of people and objects in physical space. We introduce the notion of Kinetic User Interface as a unifying framework and a middleware for the design of pervasive interfaces, in which motion is considered as the primary input modality.
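The idea of motion as the primary input modality can be illustrated with a minimal, hypothetical sketch: motion events detected in physical space are dispatched to handlers, much as click events are in a graphical interface. The names below (`MotionEvent`, `KineticDispatcher`, the `"enter_zone"` gesture) are illustrative assumptions, not API from the chapter's middleware.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

@dataclass
class MotionEvent:
    entity: str            # person or object being tracked
    gesture: str           # classified motion pattern, e.g. "enter_zone"
    location: Tuple[float, float]  # (x, y) position in physical space

class KineticDispatcher:
    """Routes motion events to handlers, analogous to click handlers in a GUI."""
    def __init__(self) -> None:
        self._handlers: Dict[str, List[Callable[[MotionEvent], None]]] = {}

    def on(self, gesture: str, handler: Callable[[MotionEvent], None]) -> None:
        # Register a handler for a classified motion pattern.
        self._handlers.setdefault(gesture, []).append(handler)

    def dispatch(self, event: MotionEvent) -> None:
        # Deliver the event to every handler registered for its gesture.
        for handler in self._handlers.get(event.gesture, []):
            handler(event)

# Usage: an ambient device reacts when a tracked person enters a zone.
log: List[str] = []
ui = KineticDispatcher()
ui.on("enter_zone", lambda e: log.append(f"light on for {e.entity}"))
ui.dispatch(MotionEvent("alice", "enter_zone", (2.0, 3.5)))
```

The point of the sketch is that no explicit command is issued: the user's movement through space is itself the input.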


Author(s):  
Vincenzo Pallotta

Unobtrusiveness is a key factor in the usability of mobile and ubiquitous computing systems. These systems comprise several ambient and mobile devices whose goal is to support users’ everyday activities, ideally without interfering with them. We address the topic of obtrusiveness by assessing its impact on the design of interfaces for mobile and ubiquitous computing systems. We make the case for how unobtrusive interfaces can be designed by means of Kinetic User Interfaces: an emerging interaction paradigm in which input to the system is provided through the coordinated motion of objects and people in physical space.


Sensors ◽  
2016 ◽  
Vol 16 (7) ◽  
pp. 1049 ◽  
Author(s):  
Gervasio Varela ◽  
Alejandro Paz-Lopez ◽  
Jose Becerra ◽  
Richard Duro

2020 ◽  
Author(s):  
Alvaro Pastor

Typing remains the primary input modality for computing systems. Most typical Virtual Reality (VR) setups replace users' capable hands and fingers with cumbersome hand-held controllers (HC). This study examines the hypothesis that finger interaction and a realistic representation of users' hands increase typing performance, sense of Presence, and the usability of a typing system for a text transcription task in VR. We developed a hand and finger tracking and visualization system (VH) to help users interact with on-screen keyboards in VR, and compared it against participants' typing performance with HC. We found that the VH paradigm in VR significantly increased typing performance for inexperienced typists and that HC users were more prone to errors. Further research may delve deeper into the utility of the VH input paradigm for people unable to grasp HCs and for other symbolic communication such as sign language.


Author(s):  
Andreas Hartl

Ubiquitous computing, with its multitude of devices, makes it necessary to supplant the desktop metaphor of graphical user interfaces with other kinds of user interfaces. Applications must adapt themselves to many modalities: they must support a wide variety of devices and interaction languages. Software engineering methods and tools also need to embrace this change so that developers can build usable adaptive applications more easily. This chapter presents three different software engineering approaches that address this challenge: extensions to Web-based approaches, abstract user interface definitions that add a level of abstraction to the user interface definition, and model-based approaches that extend model-based application development to integrate user interface issues as well.
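The abstract-user-interface idea can be sketched as follows: one modality-neutral widget description is rendered differently for each target modality. This is a minimal, hypothetical example; the widget schema and renderer names are assumptions for illustration, not the chapter's actual notation.

```python
# A modality-neutral description of a single interaction: the user
# must choose one option. Nothing here commits to a visual layout.
abstract_ui = {
    "type": "choice",
    "prompt": "Select playback mode",
    "options": ["shuffle", "repeat", "single"],
}

def render_gui(widget: dict) -> str:
    # A graphical renderer might map a "choice" widget to a drop-down list.
    items = ", ".join(widget["options"])
    return f"[dropdown] {widget['prompt']}: {items}"

def render_voice(widget: dict) -> str:
    # A voice renderer maps the same widget to a spoken prompt instead.
    items = ", or ".join(widget["options"])
    return f"{widget['prompt']}. Say one of: {items}"

gui_out = render_gui(abstract_ui)
voice_out = render_voice(abstract_ui)
```

Because the abstract definition carries intent ("the user chooses one option") rather than presentation, each renderer is free to pick the idiomatic concrete control for its modality.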


Author(s):  
Jörn Loviscach

An annual and highly visible event of the HCI community, the Association for Computing Machinery’s Conference on Human Factors in Computing Systems (CHI) demonstrated the state of the art and trends in user interface technology. Even though the April 2009 conference did not focus specifically on visual user interfaces, which form the focus of IJCICG, about two dozen of the contributions—ranging from posters to full papers—presented promising ideas or addressed vital but as yet mostly overlooked issues in graphics-based interaction. This event review summarizes some of the most interesting of those aspects seen at CHI 2009, and provides a set of event-related references as context.

