Towards Augmented Reality Driven Human-City Interaction: Current Research on Mobile Headsets and Future Challenges

2022, Vol 54 (8), pp. 1-38
Author(s): Lik-Hang Lee, Tristan Braud, Simo Hosio, Pan Hui

Interaction design for Augmented Reality (AR) is gaining attention from both academia and industry. This survey discusses 260 articles (68.8% of which were published between 2015 and 2019) to review the field of human interaction in connected cities, with an emphasis on augmented reality-driven interaction. We provide an overview of Human-City Interaction and related technological approaches, followed by a review of the latest trends in information visualization, constrained interfaces, and embodied interaction for AR headsets. We highlight under-explored issues in interface design and input techniques that warrant further research, and conjecture that AR with complementary Conversational User Interfaces (CUIs) is a crucial enabler for ubiquitous interaction with immersive systems in smart cities. Our work helps researchers understand the current potential and future needs of AR in Human-City Interaction.

Author(s): Xiaojun Bi, Andrew Howes, Per Ola Kristensson, Antti Oulasvirta, John Williamson

This chapter introduces the field of computational interaction and explains its long tradition of research on human interaction with technology, a tradition that draws on human factors engineering, cognitive modelling, artificial intelligence and machine learning, design optimization, formal methods, and control theory. It discusses how the book as a whole argues that, embedded in an iterative design process, computational interaction design has the potential to complement human strengths and to generate inspiring and elegant designs without denying the part played by the complicated and uncertain behaviour of humans. The chapters in this book manifest intellectual progress in the study of computational principles of interaction, demonstrated in diverse and challenging application areas such as input methods, interaction techniques, graphical user interfaces, information retrieval, information visualization, and graphic design.
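The chapter itself contains no code, but the flavour of computational interaction is easy to illustrate with a toy design-optimization exercise: enumerate candidate button widths and keep the one that minimises predicted pointing time under Fitts' law. The sketch below is a minimal illustration of that idea only; the Fitts' law constants, pointing distance, and layout budget are assumed values, not material from the chapter.

```kotlin
import kotlin.math.log2

// Toy computational-interaction example: design optimization over a tiny
// design space (button widths) using a Fitts' law cost model.
// All constants below are illustrative assumptions.
fun fittsTimeMs(distancePx: Double, widthPx: Double,
                aMs: Double = 100.0, bMs: Double = 150.0): Double =
    aMs + bMs * log2(distancePx / widthPx + 1.0)  // Shannon formulation

fun main() {
    val distancePx = 600.0                        // assumed pointer-to-target distance
    val layoutBudgetPx = 160.0                    // assumed maximum width the layout allows
    val candidates = (20..200 step 10).map { it.toDouble() }

    // Enumerate the design space and keep the best feasible design.
    val bestWidth = candidates
        .filter { it <= layoutBudgetPx }          // respect the design constraint
        .minByOrNull { fittsTimeMs(distancePx, it) }

    println("Best width: $bestWidth px, predicted time: " +
            "${bestWidth?.let { "%.0f".format(fittsTimeMs(distancePx, it)) }} ms")
}
```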


1997, Vol 6 (6), pp. 687-700
Author(s): Véronique Normand, Didier Pernel, Béatrice Bacconnet

The Thomson-CSF Corporate Research Laboratories are investigating the issues of user-interface design, spoken and multimodal interaction design and realization in virtual environments. This paper describes our technical approach to speech-enabled multimodal virtual environments, based on our past achievements in the multimodal interaction domain, and presents our main supporting projects in this area. These projects involve augmented reality for maintenance, military situation building and assessment, and collaborative virtual environments.
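As a rough sketch of the kind of speech-enabled multimodal interaction described above (the paper does not publish code), the fragment below fuses a spoken deictic command with the most recent pointing event inside a short time window, in the classic "put that there" style. The event types, the 800 ms window, and the naive command parse are all illustrative assumptions.

```kotlin
// Minimal multimodal-fusion sketch: pair a spoken command with the most
// recent pointing event inside a time window. Names and thresholds are
// illustrative assumptions, not from the paper.
data class SpeechEvent(val utterance: String, val timestampMs: Long)
data class PointingEvent(val targetId: String, val timestampMs: Long)
data class FusedCommand(val action: String, val targetId: String)

class MultimodalFuser(private val windowMs: Long = 800) {
    private val recentPointing = ArrayDeque<PointingEvent>()

    fun onPointing(e: PointingEvent) {
        recentPointing.addLast(e)
        // Drop pointing events that are too old to serve as referents.
        while (recentPointing.isNotEmpty() &&
               e.timestampMs - recentPointing.first().timestampMs > windowMs) {
            recentPointing.removeFirst()
        }
    }

    fun onSpeech(e: SpeechEvent): FusedCommand? {
        // A deictic utterance such as "select that" needs a pointing referent.
        val referent = recentPointing.lastOrNull {
            e.timestampMs - it.timestampMs <= windowMs
        } ?: return null
        val action = e.utterance.substringBefore(' ')  // naive command parse
        return FusedCommand(action, referent.targetId)
    }
}

fun main() {
    val fuser = MultimodalFuser()
    fuser.onPointing(PointingEvent("tank-42", timestampMs = 1000))
    println(fuser.onSpeech(SpeechEvent("select that", timestampMs = 1400)))
    // -> FusedCommand(action=select, targetId=tank-42)
}
```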


2014, Vol 602-605, pp. 3630-3634
Author(s): Xiao Jia Zou, Xiang Dong You, Hao Pan, Xu Zhang, Qian Luo

In this paper, we explore how to design and implement the user interface of an Android-based Electricity Operation Information System. The work proceeds in four stages: requirements analysis, UI design, interaction design, and programmatic implementation. To keep the interface responsive and friendly to user actions, we add modules that handle exceptions. Finally, we briefly test the system UI to confirm that it runs smoothly and with few errors. Few studies combine the flow design of a UI with its programmatic implementation, and the UI design and implementation methodology presented here is a useful reference for the early stages of application development, especially on Android platforms.
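The exception-handling modules mentioned above can be illustrated, in broad strokes, as running fallible work off the UI thread and routing any failure to a single handler so the interface stays responsive. The sketch below is plain Kotlin with hypothetical names, not the system's actual code; a real Android implementation would typically use coroutines or a Handler to post results back to the main thread.

```kotlin
import kotlin.concurrent.thread

// Sketch of the pattern: run a fallible operation off the "UI" thread and
// funnel any exception into one handler so the interface can show a
// friendly message instead of freezing or crashing. Hypothetical names.
fun <T> runOffUiThread(
    work: () -> T,
    onSuccess: (T) -> Unit,
    onError: (Exception) -> Unit
) {
    thread {
        try {
            onSuccess(work())   // in a real app, post this back to the UI thread
        } catch (e: Exception) {
            onError(e)          // centralised exception-handling module
        }
    }
}

fun main() {
    runOffUiThread(
        work = {
            // Simulated data fetch that sometimes fails.
            if (System.currentTimeMillis() % 2 == 0L) "meter readings: [12.3, 14.1]"
            else throw RuntimeException("meter data service unavailable")
        },
        onSuccess = { println("Loaded: $it") },
        onError = { println("Showing friendly error dialog: ${it.message}") }
    )
    Thread.sleep(200)  // demo only: give the worker thread time to finish
}
```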


2020, Vol 8 (6), pp. 4667-4673

Virtual Reality, Augmented Reality, and other immersive environments have gained popularity with the technological advances of the past decade. As they become widely used, human-computer interface design and its design criteria emerge as challenging tasks. Virtual and Augmented Reality support a wide range of applications, from improving learning and educational experiences to complex industrial and medical operations. Virtual reality is a viable direction for future interface design because it can replace generic and complex physical interfaces with an alternative, sensory-relayed form of input, providing a natural and efficient mode of interaction for users. Virtual and Augmented Reality also reduce the need to develop different acceptable standards for user interfaces, as they can provide a whole, generic interface that accommodates the work setting. In this paper, we investigate various prospective applications for user interaction in Virtual and Augmented Reality and the limitations in the respective domains. The paper outlines how the new era of human-computer interaction is leading to cognition-based communication, and how Virtual and Augmented Reality can be tailored to user needs and address future demands, replacing command-based interaction between humans and computers.


Author(s): Marva Angélica Mora-Lumbreras, Norma Sánchez-Sánchez, Carolina Rocío Sánchez-Pérez

An Integrative Activity uses the skills and knowledge provided in various subjects to solve practical problems, with an individual and/or group approach. Specifically, in this article we work on a Point of Sale system and the Diffusion of Products with Augmented Reality, developed over three semesters and involving the courses Software Engineering, Computer Human Interaction, Design of Virtual Environments, and Computing for Mobile Devices. The activity applies knowledge of software development models, usability, 3D modeling, augmented reality, and the development of web applications for mobile devices. By the end of this activity, the students had developed complete software, from the planning phase through testing.


Author(s): Mikael Wiberg

Whether we think of interaction design as a design tradition aimed at giving form to our interaction with computational objects, or simply as user interface design, it is hard to escape the fact that the user interface to a large extent defines the scene and the form of the interaction. Without adopting a fully deterministic perspective, it remains the case that if the user interface is screen-based and graphical and the input modality is mouse-based, then the form of that interaction (what the turn-taking looks like and what is demanded of the user) is likely to be very similar to that of other screen-based interfaces with similar input devices. However, the design space for the form of interaction is growing fast. While command-based and text-based interfaces largely defined the whole design space in the 1970s, developments since then, including novel ways of bringing sensors, actuators, and smart materials to the user interface, have certainly opened up a broader design space for interaction design. It is not only the range of materials that has been extended over the last few decades; we have also moved through a number of form paradigms for interaction design. With this as a point of departure, I will in this chapter reflect on how we have moved from the early days of command-based user interfaces, via the use of metaphors in the design of graphical user interfaces (GUIs), towards ways of interacting with the computer via tangible user interfaces (TUIs). Further, I will describe how this movement towards TUIs was a first step away from building user interfaces on representations and metaphors and a first step towards material interactions.


Author(s): Nikola Mitrovic, Eduardo Mena, Jose Alberto Royo

Mobility for graphical user interfaces (GUIs) is a challenging problem, as different GUIs need to be constructed for different device capabilities and for changing context, preferences, and user locations. GUI developers frequently create multiple user interface versions for different devices. The solution lies in using a single abstract user-interface description from which user interfaces for different devices are generated automatically. Various techniques have been proposed to adapt GUIs from an abstract specification to a concrete interface. Design-time techniques can produce better-performing GUIs but, in contrast to run-time techniques, lack flexibility and mobility. The mobility and autonomy of run-time techniques can be significantly improved by using mobile agent technology and an indirect GUI generation paradigm. Indirect generation allows computer-human interaction to be analysed and artificial intelligence techniques to be applied at run time, increasing GUI performance and usability.
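The indirect-generation idea can be sketched as a small device-independent UI model plus per-device renderers selected at run time. The widget types and renderer names below are assumptions for illustration only; the authors' actual system additionally relies on mobile agents, which this fragment does not model.

```kotlin
// Sketch of indirect GUI generation: one abstract UI description, several
// device-specific renderers chosen at run time. All names are illustrative.
sealed interface AbstractWidget
data class TextInput(val label: String) : AbstractWidget
data class Choice(val label: String, val options: List<String>) : AbstractWidget
data class Trigger(val label: String) : AbstractWidget

data class AbstractUi(val title: String, val widgets: List<AbstractWidget>)

interface Renderer { fun render(ui: AbstractUi): String }

// Desktop: roomy layout, drop-downs for choices.
object DesktopRenderer : Renderer {
    override fun render(ui: AbstractUi) = buildString {
        appendLine("[window] ${ui.title}")
        ui.widgets.forEach {
            when (it) {
                is TextInput -> appendLine("  text field: ${it.label}")
                is Choice    -> appendLine("  drop-down:  ${it.label} ${it.options}")
                is Trigger   -> appendLine("  button:     ${it.label}")
            }
        }
    }
}

// Phone: compact layout, choices collapse to a picker.
object PhoneRenderer : Renderer {
    override fun render(ui: AbstractUi) = buildString {
        appendLine("<screen title='${ui.title}'>")
        ui.widgets.forEach {
            when (it) {
                is TextInput -> appendLine("  <edit hint='${it.label}'/>")
                is Choice    -> appendLine("  <picker label='${it.label}'/>")
                is Trigger   -> appendLine("  <button label='${it.label}'/>")
            }
        }
        appendLine("</screen>")
    }
}

fun main() {
    val login = AbstractUi("Login", listOf(
        TextInput("User name"), Choice("Language", listOf("EN", "ES")), Trigger("Sign in")))
    // Choose the concrete renderer at run time based on the target device.
    val renderer: Renderer =
        if (System.getProperty("device") == "phone") PhoneRenderer else DesktopRenderer
    println(renderer.render(login))
}
```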


2001, Vol 10 (1), pp. 96-108
Author(s): Doug A. Bowman, Ernst Kruijff, Joseph J. LaViola, Ivan Poupyrev

Three-dimensional user interface design is a critical component of any virtual environment (VE) application. In this paper, we present a broad overview of 3-D interaction and user interfaces. We discuss the effect of common VE hardware devices on user interaction, as well as interaction techniques for generic 3-D tasks and the use of traditional 2-D interaction styles in 3-D environments. We divide most user-interaction tasks into three categories: navigation, selection/manipulation, and system control. Throughout the paper, our focus is on presenting not only the available techniques but also practical guidelines for 3-D interaction design and widely held myths. Finally, we briefly discuss two approaches to 3-D interaction design and some example applications with complex 3-D interaction requirements. We also present an annotated online bibliography as a reference companion to this article.
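As a concrete instance of one of the generic 3-D tasks surveyed here, the sketch below implements simple ray-casting selection against bounding spheres, a common selection technique in virtual environment interfaces. The vector math and data types are minimal assumptions for illustration, not code from the paper.

```kotlin
import kotlin.math.sqrt

// Minimal ray-casting selection sketch for 3-D object selection.
// Objects are approximated by bounding spheres; the closest sphere hit
// by the pointing ray is selected. Illustrative only.
data class Vec3(val x: Double, val y: Double, val z: Double) {
    operator fun minus(o: Vec3) = Vec3(x - o.x, y - o.y, z - o.z)
    fun dot(o: Vec3) = x * o.x + y * o.y + z * o.z
    fun length() = sqrt(dot(this))
    fun normalized(): Vec3 { val l = length(); return Vec3(x / l, y / l, z / l) }
}
data class SceneObject(val name: String, val center: Vec3, val radius: Double)

// Returns the nearest object whose bounding sphere the ray hits, or null.
fun pick(origin: Vec3, direction: Vec3, scene: List<SceneObject>): SceneObject? {
    val d = direction.normalized()
    return scene.mapNotNull { obj ->
        val toCenter = obj.center - origin
        val tClosest = toCenter.dot(d)              // ray parameter of closest approach
        if (tClosest < 0) return@mapNotNull null    // object is behind the user
        val distSq = toCenter.dot(toCenter) - tClosest * tClosest
        if (distSq > obj.radius * obj.radius) null  // ray misses the sphere
        else obj to tClosest
    }.minByOrNull { it.second }?.first
}

fun main() {
    val scene = listOf(
        SceneObject("lamp", Vec3(0.0, 1.5, -3.0), 0.3),
        SceneObject("chair", Vec3(1.0, 0.5, -5.0), 0.8))
    // Ray from eye height straight ahead selects the lamp.
    println(pick(Vec3(0.0, 1.5, 0.0), Vec3(0.0, 0.0, -1.0), scene)?.name)  // -> lamp
}
```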

