Review: Development and Technical Design of Tangible User Interfaces in Wide-Field Areas of Application

Sensors ◽  
2021 ◽  
Vol 21 (13) ◽  
pp. 4258
Author(s):  
Alice Krestanova ◽  
Martin Cerny ◽  
Martin Augustynek

A tangible user interface (TUI) connects physical objects and digital interfaces, and is more interactive and engaging for users than a classic graphical user interface. This article presents a descriptive overview of TUIs' real-world applications, sorted into ten main application areas: teaching of traditional subjects, medicine and psychology, programming, database development, music and arts, modeling of 3D objects, modeling in architecture, literature and storytelling, adjustable TUI solutions, and commercial TUI smart toys. The paper focuses on TUIs' technical solutions and on the technical constructions that influence the applicability of TUIs in the real world. Based on the review, the technical concepts are divided into two main approaches: a sensory technical concept and technology based on computer vision algorithms. The sensory approach covers the use of wireless technology, sensors, and feedback possibilities in TUI applications. The image-processing approach covers marker-based and markerless object recognition, the choice of cameras, and the use of computer vision platforms for TUI applications.

Author(s):  
Mikael Wiberg

Whether we think of interaction design as a design tradition aimed at giving form to the interaction with computational objects, or simply as user interface design, it is hard to escape the fact that the user interface to a large extent defines the scene and the form of the interaction. Without adopting a fully deterministic perspective, it remains true that if the user interface is screen-based and graphical and the input modality is mouse-based, then the form of that interaction, that is, what the turn-taking looks like and what is demanded of the user, is very similar to that of other screen-based interfaces with similar input devices. However, the design space for the form of interaction is growing fast. While command-based and text-based interfaces more or less defined the whole design space in the 1970s, the development since then, including novel ways of bringing sensors, actuators, and smart materials to the user interface, has certainly opened up a broader design space for interaction design. And it is not only the range of materials that has been extended over the last few decades; we have also moved through a number of form paradigms for interaction design. With this as a point of departure, I reflect in this chapter on how we have moved from the early days of command-based user interfaces, via the use of metaphors in the design of graphical user interfaces (GUIs), towards ways of interacting with the computer via tangible user interfaces (TUIs). Further on, I describe how this movement towards TUIs was a first step away from building user interfaces on representations and metaphors, and a first step towards material interactions.


Author(s):  
Wolfgang Beer

Purpose - The aim of this paper is to present the architecture and prototypical implementation of a context-sensitive software system that combines the tangible user interface approach with a mobile augmented reality (AR) application.
Design/methodology/approach - The work described in this paper follows a creational approach: a prototypical implementation is used to gather further research results. The prototype allows ongoing tests of accuracy and of different context-sensitive threshold functions.
Findings - The implementation and practical use of tangible user interfaces for the outdoor selection of geographical objects are reported and discussed in detail.
Research limitations/implications - Further research is needed on context-sensitive, dynamically changing threshold functions, which would improve the accuracy of the selected tangible user interface approach.
Practical implications - Using tangible user interfaces in outdoor applications should improve the usability of AR applications.
Originality/value - Although a multitude of research results exist in the areas of gesture recognition and AR applications, this work focuses on the pointing gesture for selecting outdoor geographical objects.
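The paper's own threshold functions are not reproduced in the abstract; the following is a minimal, illustrative sketch of the underlying idea, assuming geographical objects are given as 2D coordinates and a pointing gesture as a heading angle (all names and values are hypothetical):

```python
import math

def bearing(px, py, qx, qy):
    """Bearing (radians) from point p to point q in the x-y plane."""
    return math.atan2(qy - py, qx - px)

def select_pointed_object(user, heading, objects, threshold_rad):
    """Return the object whose bearing from the user is closest to the
    pointing direction, provided it falls within the angular threshold.

    `objects` maps names to (x, y) positions. A context-sensitive system
    would vary `threshold_rad` with, e.g., object distance and density,
    which is the role of the paper's threshold functions.
    """
    ux, uy = user
    best_name, best_err = None, threshold_rad
    for name, (ox, oy) in objects.items():
        diff = bearing(ux, uy, ox, oy) - heading
        # Wrap the angular difference into (-pi, pi] before comparing.
        err = abs(math.atan2(math.sin(diff), math.cos(diff)))
        if err < best_err:
            best_name, best_err = name, err
    return best_name

landmarks = {"tower": (0.0, 100.0), "bridge": (100.0, 0.0)}
# User at the origin, pointing along +y with a 10-degree tolerance:
print(select_pointed_object((0.0, 0.0), math.pi / 2,
                            landmarks, math.radians(10)))  # prints "tower"
```

A pointing gesture is rarely exact outdoors, which is why the selection is tolerance-based rather than an exact ray intersection.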


Computers ◽  
2019 ◽  
Vol 8 (1) ◽  
pp. 14 ◽  
Author(s):  
Andrew Chen ◽  
Kevin Wang

As we improve the ability of computers to play games such as chess against humans, accurately perceiving real-world game boards and game states remains a challenge in many cases, hindering the development of game-playing robots. In this paper, we present a computer vision algorithm, developed as part of a chess robot project, that detects the chess board, squares, and piece positions in relatively unconstrained environments. Dynamically responding to lighting changes, accounting for perspective distortion, and using accurate detection methodologies yield a simple but robust algorithm that succeeds 100% of the time in standard environments and 80% of the time in extreme environments with external lighting. The key contributions of this paper are a dynamic approach to the Hough line transform and a hybrid edge- and morphology-based approach to object/occupancy detection, which enable the development of a robot chess player that relies solely on the camera for sensory input.
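The paper's dynamic Hough variant is not given in the abstract; as background, the following is a minimal, pure-Python sketch of the standard Hough line transform it builds on, voting in (rho, theta) space for a set of edge pixels (function and variable names are illustrative):

```python
import math

def hough_lines(edge_points, theta_steps=180):
    """Accumulate Hough votes for edge pixels: each point (x, y) votes
    for every line rho = x*cos(theta) + y*sin(theta) passing through it.
    A production detector (e.g. with dynamic thresholds, as the paper
    describes) would then pick accumulator peaks above a threshold.
    """
    acc = {}
    for x, y in edge_points:
        for t in range(theta_steps):
            theta = t * math.pi / theta_steps
            rho = int(round(x * math.cos(theta) + y * math.sin(theta)))
            acc[(rho, t)] = acc.get((rho, t), 0) + 1
    return acc

# A perfectly vertical line of 10 edge pixels at x = 3:
points = [(3, y) for y in range(10)]
votes = hough_lines(points)
best = max(votes, key=votes.get)
# The strongest bin sits at rho = 3, theta = 0 (a vertical line),
# where all 10 points vote for the same (rho, theta) cell.
```

Chess-board detection amounts to finding two near-orthogonal families of such peaks; responding to lighting dynamically means adapting the edge extraction and vote thresholds rather than fixing them in advance.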


Author(s):  
Kazi Tanvir Ahmed Siddiqui ◽  
David Feil-Seifer ◽  
Tianyi Jiang ◽  
Sonu Jose ◽  
Siming Liu ◽  
...  

Simulation environments for unmanned aerial vehicles (UAVs) can be very useful for prototyping user interfaces and training personnel who will operate UAVs in the real world. The more realistically such simulations behave, the greater their training value. In this paper, we present the integration of a model-based waypoint navigation controller into the Reno Rescue Simulator to provide a more realistic user interface in simulated environments. We also present potential uses for such simulations, even for real-world operation of UAVs.


Author(s):  
Ivan V. Stepanyan

More and more workers interact with graphical user interfaces for most of the working shift. However, poor ergonomic qualities or incorrect use of a graphical user interface can create a risk of adverse effects on workers' health. The authors identified and classified typical scenarios of graphical user interface use. Different types of graphical user interfaces and operator occupations are characterized by different parameters of exertion, both biomechanical and psychophysiological. The main ergonomic elements of a graphical user interface include the presence or absence of a mouse or joystick, intuitive clarity, a balanced palette, fixed positions of graphic elements, comfort level, etc. A review of various graphical user interfaces and an analysis of their characteristics demonstrated the possibility of various occupational risk factors. Some of the ergonomic problems disclosed are connected with the incorporation of graphical user interfaces into various information technologies and systems. The authors present the role of the ergonomic characteristics of graphical user interfaces in the safe and effective work of operators, and give examples of algorithms that visualize large volumes of information for easier comprehension and analysis. Correct use of interactive computer visualization, with competent design and observance of ergonomic principles, will optimize mental work in innovative activity and preserve operators' health. Promising directions in this area are ergonomic interfaces developed with consideration of information hygiene principles, big data analysis technology, and automatically generated cognitive graphics.


2020 ◽  
Vol 4 (2) ◽  
pp. 1-13
Author(s):  
Zahid Islam

The inclusion of tangible user interfaces can facilitate learning through contextual experience, interaction with the provided information, and epistemic actions, resulting in effective learning in design education. The goal of this study is to investigate how a tangible user interface (TUI) affects design learning through cognitive load. An extended reality (XR)-based TUI and a traditional desktop-based GUI were used to deliver the same information to two groups of students. The NASA TLX tool was used to measure students' perceived cognitive load after receiving information through the two modalities. Contemporary design pedagogy, the potential use of XR, design cognition, and today's design learners' experience-oriented lifestyle were combined into a theoretical framework for understanding how information delivery modalities affect design learning. The results reveal that the use of XR-based TUIs decreases cognitive load, resulting in an enhanced experience and effective learning in design studios.
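For readers unfamiliar with the instrument: the raw ("unweighted") NASA-TLX score is simply the mean of the six subscale ratings (mental, physical, and temporal demand; performance; effort; frustration), each on a 0-100 scale. A minimal sketch follows; the ratings below are illustrative placeholders, not the study's data:

```python
def raw_tlx(ratings):
    """Raw NASA-TLX workload score: the unweighted mean of the six
    subscale ratings, each rated on a 0-100 scale."""
    assert len(ratings) == 6, "NASA-TLX has exactly six subscales"
    return sum(ratings) / 6

# Hypothetical ratings for two information-delivery modalities:
tui_ratings = [40, 20, 35, 25, 30, 15]  # XR-based TUI group
gui_ratings = [70, 25, 60, 55, 65, 50]  # desktop GUI group

print(raw_tlx(tui_ratings))  # 27.5
print(raw_tlx(gui_ratings))
```

The weighted variant of NASA-TLX additionally has participants rank pairwise which subscales mattered more, then averages with those weights; raw TLX skips that step.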


Author(s):  
Manjit Singh Sidhu ◽  
Waleed Maqableh

Tangible user interfaces (TUIs) are an emerging human-machine interaction (HMI) style, and a significant number of new TUIs have been incorporated into the educational technology domain. This work presents the design of a new user interface for technology-assisted problem solving (TAPS) packages for the Engineering Mechanics subject at Universiti Tenaga Nasional (UNITEN). In this study, the TAPS packages were further enhanced by adopting a TUI and compared with the previous TAPS packages. The study found additional benefits of the TUI: it engages human interface senses such as haptics and touch, making the packages more natural to use.


2021 ◽  
Vol 1105 (1) ◽  
pp. 012076
Author(s):  
Warqaa Hashim ◽  
Ali Al-Naji ◽  
Izzat A. Al-Rayahi ◽  
Munir Oudah

2002 ◽  
Vol 11 (2) ◽  
pp. 119-133 ◽  
Author(s):  
Nicholas R. Hedley ◽  
Mark Billinghurst ◽  
Lori Postner ◽  
Richard May ◽  
Hirokazu Kato

In this paper, we describe two explorations in the use of hybrid user interfaces for collaborative geographic data visualization. Our first interface combines three technologies: augmented reality (AR), immersive virtual reality (VR), and computer vision-based hand and object tracking. Wearing a lightweight display with an attached camera, users can look at a real map and see three-dimensional virtual terrain models overlaid on the map. From this AR interface, they can fly in and experience the model immersively, or use free hand gestures or physical markers to change the data representation. Building on this work, our second interface explores alternative interface techniques, including a zoomable user interface, paddle interactions, and pen annotations. We describe the system hardware and software and the implications for GIS and spatial science applications.

