Bodily Engagement in Multimodal Interaction

Author(s):  
Kai Tuuri ◽  
Antti Pirhonen ◽  
Pasi Välkkynen

The creative processes of interaction design operate in the terms we generally use for conceptualising human-computer interaction (HCI). The prevailing design paradigm therefore provides a framework that essentially shapes and guides the design process. We argue that the current mainstream design paradigm for multimodal user interfaces treats human sensory-motor modalities and the related user-interface technologies as separate channels of communication between the user and an application. Within such a conceptualisation, multimodality implies the use of different technical devices in interaction design. This chapter outlines an alternative design paradigm, based on an action-oriented perspective on human perception and the meaning-creation process. The proposed perspective stresses the integrated sensory-motor experience and the active embodied involvement of a subject in perception, coupled as a natural part of interaction. The outlined paradigm provides a new conceptual framework for the design of multimodal user interfaces. A key motivation for this new framework is to acknowledge multimodality as an inevitable quality of interaction and interaction design, one whose existence does not depend on, for example, the number of implemented presentation modes in an HCI application. We see the need for such an interaction- and experience-derived perspective as amplified by the trend of computing moving into smaller devices of various forms that are being embedded into our everyday life. As a brief illustration of the proposed framework in practice, one case study of sonic interaction design is presented.

Author(s):  
Marco Blumendorf ◽  
Grzegorz Lehmann ◽  
Dirk Roscher ◽  
Sahin Albayrak

The widespread use of computing technology raises the need for interactive systems that adapt to the user, device, and environment. Multimodal user interfaces provide the means to support the user in various situations and to adapt the interaction to the user's needs. In this chapter we present a system that utilizes design-time user interface models at runtime to provide flexible multimodal user interfaces. The server-based system allows the combination and integration of multiple devices to support multimodal interaction, and the adaptation of the user interface to the devices used, the user, and the environment. Utilizing the user interface models at runtime makes it possible to exploit the design information for advanced adaptation. An implementation of the system has been successfully deployed in a smart home environment throughout the Service Centric Home project (www.sercho.de).


Author(s):  
Xiaojun Bi ◽  
Andrew Howes ◽  
Per Ola Kristensson ◽  
Antti Oulasvirta ◽  
John Williamson

This chapter introduces the field of computational interaction and traces its long tradition of research on human interaction with technology, drawing on human factors engineering, cognitive modelling, artificial intelligence and machine learning, design optimization, formal methods, and control theory. It discusses how the book as a whole argues that, embedded in an iterative design process, computational interaction design has the potential to complement human strengths and provide a means to generate inspiring and elegant designs without denying the part played by the complicated and uncertain behaviour of humans. The chapters in this book manifest intellectual progress in the study of computational principles of interaction, demonstrated in diverse and challenging application areas such as input methods, interaction techniques, graphical user interfaces, information retrieval, information visualization, and graphic design.


Author(s):  
Hamdi Dibeklioğlu ◽  
Elif Surer ◽  
Albert Ali Salah ◽  
Thierry Dutoit

Author(s):  
António Teixeira ◽  
Carlos Pereira ◽  
Miguel Oliveira e Silva ◽  
Joaquim Alvarelhão ◽  
Anabela G. Silva ◽  
...  

The world’s population is getting older, with the percentage of people over 60 increasing more rapidly than any other age group. Telerehabilitation may help minimise the pressure this puts on the traditional healthcare system, but recent studies have shown ease of use, usability, and accessibility to be unsolved problems, especially for older people, who may have little experience or confidence in using technology. The current migration towards multimodal interaction has benefits for seniors, allowing hearing and vision problems to be addressed by exploiting the redundancy and complementarity of modalities. This chapter presents and contextualizes work in progress on a new telerehabilitation service targeting the combined needs of the elderly: to have professionally monitored exercises without leaving their homes, and to interact in ways suited to age-related effects on, for example, vision, hearing, and cognitive capabilities. After a brief general overview of the service, additional information on its two supporting applications is presented, including information on user interfaces. First results from a preliminary evaluation are also included.


2009 ◽  
pp. 127-142
Author(s):  
Stefano Forti ◽  
Barbara Purin ◽  
Claudio Eccher

This chapter presents a case study of using interaction design methods to explore and test the usability and user experience of a Personal Health Record (PHR) user interface based on visual and graphical elements. To identify problems and improve the design of the PHR user interface, we conducted two task-oriented usability tests based on the think-aloud technique for observing users during their interaction with a high-fidelity PHR prototype, together with questionnaires and semi-structured interviews for measuring user satisfaction. Our study demonstrates that a user-centered approach to interaction design, involving the final users in an iterative design-evaluation process, is important for exploring innovative user interfaces and for identifying problems in the early stages of the development cycle of a PHR.


2008 ◽  
pp. 996-1005
Author(s):  
Christopher J. Pavlovski ◽  
Stella Mitchell

In this article we discuss multimodal technologies that address the technical and usability constraints of the mobile phone or PDA. These environments pose several challenges beyond those of general mobility solutions, including the computational strength of the device, bandwidth constraints, and screen size restrictions. We outline the requirements of mobile multimodal solutions involving cellular phones. Drawing upon several trial deployments, we summarize the key design points from both a technology and a usability standpoint, and identify the outstanding problems in these designs. We also outline several future trends in how this technology is being deployed in various application scenarios, ranging from simple voice-activated search engines through to comprehensive mobile office applications.

