Augmented Reality Interface for Sailing Navigation: a User Study for Wind Representation

Author(s):  
Francesco Laera ◽  
Vito M. Manghisi ◽  
Alessandro Evangelista ◽  
Mario Massimo Foglia ◽  
Michele Fiorentino
2021 ◽  
Vol 11 (13) ◽  
pp. 6047
Author(s):  
Soheil Rezaee ◽  
Abolghasem Sadeghi-Niaraki ◽  
Maryam Shakeri ◽  
Soo-Mi Choi

A lack of required data resources is one of the challenges of adopting Augmented Reality (AR) to provide the right services to users, even though the amount of spatial information produced by people is increasing daily. This research aims to design a personalized AR-based tourist system that retrieves big data according to users' demographic contexts in order to enrich the AR data source in tourism. The research is conducted in two main steps. First, the type of tourist attraction in which a user is interested is predicted from the user's demographic contexts, which include age, gender, and education level, using a machine learning method. Second, the appropriate data for the user are extracted from the big data by considering time, distance, popularity, and the neighborhood of the tourist places, using the VIKOR and SWARA decision-making methods. The results show that the decision tree performs about 6% better than the SVM method at predicting the type of tourist attraction. In addition, the results of the user study show overall satisfaction of the participants in terms of ease of use (about 55%) and the system's usefulness (about 56%).
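The second step ranks candidate places with a multi-criteria decision-making method. As a rough illustration of how a VIKOR-style compromise ranking works (the criteria values and weights below are invented for the example; the paper's actual criteria are time, distance, popularity, and neighborhood), a minimal sketch:

```python
# Minimal VIKOR sketch: rank alternatives by compromise score Q (0 = best).
# Criteria flagged benefit=True are maximized; the rest are minimized.

def vikor(matrix, weights, benefit, v=0.5):
    n_alt, n_crit = len(matrix), len(weights)
    # Best (f*) and worst (f-) value per criterion.
    best = [max(row[j] for row in matrix) if benefit[j]
            else min(row[j] for row in matrix) for j in range(n_crit)]
    worst = [min(row[j] for row in matrix) if benefit[j]
             else max(row[j] for row in matrix) for j in range(n_crit)]
    S, R = [], []
    for row in matrix:
        # For cost criteria best < worst, so each term is still >= 0.
        terms = [weights[j] * (best[j] - row[j]) / (best[j] - worst[j])
                 for j in range(n_crit)]
        S.append(sum(terms))  # group utility
        R.append(max(terms))  # individual regret
    s_best, s_worst = min(S), max(S)
    r_best, r_worst = min(R), max(R)
    Q = [v * (S[i] - s_best) / (s_worst - s_best)
         + (1 - v) * (R[i] - r_best) / (r_worst - r_best)
         for i in range(n_alt)]
    return sorted(range(n_alt), key=lambda i: Q[i])  # indices, best first

# Three invented places, criteria: popularity (benefit), distance (cost).
places = [[9, 1], [5, 3], [2, 5]]
ranking = vikor(places, weights=[0.6, 0.4], benefit=[True, False])
```

Here the first place dominates on both criteria, so it ranks first; a production system would also need tie-breaking and the acceptable-advantage check that full VIKOR prescribes.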


2020 ◽  
Vol 4 (4) ◽  
pp. 78
Author(s):  
Andoni Rivera Pinto ◽  
Johan Kildal ◽  
Elena Lazkano

In the context of industrial production, a worker who wants to program a robot using the hand-guidance technique needs the robot to be available for programming and not in operation, which means that production with that robot is stopped during that time. A way around this constraint is to perform the same manual guidance steps on a holographic representation of the robot's digital twin, using augmented reality technologies. However, this approach is limited by the lack of tangibility of the visual holograms that the user tries to grab. We present an interface in which some of that tangibility is provided through ultrasound-based mid-air haptic actuation. We report a user study evaluating the impact that such haptic feedback may have on a pick-and-place task performed on the wrist of a holographic robot arm, which we found to be beneficial.


2019 ◽  
Vol 9 (23) ◽  
pp. 5123 ◽  
Author(s):  
Diego Vaquero-Melchor ◽  
Ana M. Bernardos

Nowadays, Augmented-Reality (AR) head-mounted displays (HMD) deliver a more immersive visualization of virtual contents, but the available means of interaction, mainly based on gesture and/or voice, are yet limited and obviously lack realism and expressivity when compared to traditional physical means. In this sense, the integration of haptics within AR may help to deliver an enriched experience, while facilitating the performance of specific actions, such as repositioning or resizing tasks, that are still dependent on the user’s skills. In this direction, this paper gathers the description of a flexible architecture designed to deploy haptically enabled AR applications both for mobile and wearable visualization devices. The haptic feedback may be generated through a variety of devices (e.g., wearable, graspable, or mid-air ones), and the architecture facilitates handling the specificity of each. For this reason, within the paper, it is discussed how to generate a haptic representation of a 3D digital object depending on the application and the target device. Additionally, the paper includes an analysis of practical, relevant issues that arise when setting up a system to work with specific devices like HMD (e.g., HoloLens) and mid-air haptic devices (e.g., Ultrahaptics), such as the alignment between the real world and the virtual one. The architecture applicability is demonstrated through the implementation of two applications: (a) Form Inspector and (b) Simon Game, built for HoloLens and iOS mobile phones for visualization and for UHK for mid-air haptics delivery. These applications have been used to explore with nine users the efficiency, meaningfulness, and usefulness of mid-air haptics for form perception, object resizing, and push interaction tasks. 
Results show that, although mobile interaction is preferred when this option is available, haptics turn out to be more meaningful than users initially expect for identifying shapes, and to contribute to the execution of resizing tasks. Moreover, this preliminary user study reveals some design issues when working with haptic AR. For example, users may expect a tailored interface metaphor, not necessarily one inspired by natural interaction. This was the case with our proposal of virtual pressable buttons, built to mimic real buttons by using haptics but interpreted differently by the study participants.
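The alignment issue the authors raise, registering the mid-air haptic device's workspace with the HMD's world coordinates, typically reduces to applying a calibrated rigid transform to each point of interest. A minimal sketch with an invented 4x4 homogeneous transform (no HoloLens or Ultrahaptics API is used here):

```python
# Map a point from the HMD's world frame into the haptic device's frame
# using a calibrated 4x4 homogeneous transform (row-major).

def apply_transform(T, point):
    x, y, z = point
    p = (x, y, z, 1.0)
    return tuple(sum(T[r][c] * p[c] for c in range(4)) for r in range(3))

# Invented calibration: device frame rotated 90 degrees about z and
# offset 0.2 m along x relative to the HMD frame.
T = [
    [0.0, -1.0, 0.0, 0.2],
    [1.0,  0.0, 0.0, 0.0],
    [0.0,  0.0, 1.0, 0.0],
    [0.0,  0.0, 0.0, 1.0],
]
```

In practice the transform itself would come from a calibration step, e.g. touching a few known points of the haptic device while tracked by the HMD.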


Sensors ◽  
2020 ◽  
Vol 20 (6) ◽  
pp. 1612 ◽  
Author(s):  
Sara Condino ◽  
Benish Fida ◽  
Marina Carbone ◽  
Laura Cercenelli ◽  
Giovanni Badiali ◽  
...  

Augmented reality (AR) Head-Mounted Displays (HMDs) are emerging as the most efficient output medium to support manual tasks performed under direct vision. Despite that, technological and human-factor limitations still hinder their routine use for aiding high-precision manual tasks in the peripersonal space. To overcome such limitations, in this work, we show the results of a user study aimed at qualitatively and quantitatively validating a recently developed AR platform specifically conceived for guiding complex 3D trajectory tracing tasks. The AR platform comprises a new-concept AR video see-through (VST) HMD and a dedicated software framework for the effective deployment of the AR application. In the experiments, the subjects were asked to perform 3D trajectory tracing tasks on 3D-printed replicas of planar structures or more elaborate bony anatomies. The accuracy of the trajectories traced by the subjects was evaluated using templates designed ad hoc to match the surface of the phantoms. The quantitative results suggest that the AR platform could be used to guide high-precision tasks: on average, more than 94% of the traced trajectories stayed within an error margin lower than 1 mm. The results confirm that the proposed AR platform can boost the profitable adoption of AR HMDs to guide high-precision manual tasks in the peripersonal space.
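The headline accuracy figure (the share of a traced trajectory staying within a 1 mm error margin) can be computed from per-sample deviations between the traced and the planned path. A minimal sketch, assuming the deviations are already available as distances in millimetres (the sample values are invented):

```python
def fraction_within_margin(deviations_mm, margin_mm=1.0):
    """Fraction of trajectory samples whose deviation from the planned
    path is below the given margin (e.g. 1 mm, as in the study)."""
    within = sum(1 for d in deviations_mm if d < margin_mm)
    return within / len(deviations_mm)

# Invented per-sample deviations (mm) for one traced trajectory:
devs = [0.2, 0.4, 0.9, 1.3, 0.5, 0.7, 0.1, 0.6, 0.8, 0.3]
# 9 of the 10 samples are below 1 mm -> 0.9
```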


Electronics ◽  
2020 ◽  
Vol 9 (11) ◽  
pp. 1814
Author(s):  
Yuzhao Liu ◽  
Yuhan Liu ◽  
Shihui Xu ◽  
Kelvin Cheng ◽  
Soh Masuko ◽  
...  

Despite the convenience offered by e-commerce, online apparel shopping presents various product-related risks, as consumers can neither physically see products nor try them on. Augmented reality (AR) and virtual reality (VR) technologies have been used to improve the online shopping experience. We therefore propose an AR- and VR-based try-on system that provides users with a novel shopping experience in which they can view garments fitted onto their personalized virtual body. Recorded personalized motions allow users to dynamically interact with their dressed virtual body in AR. We conducted two user studies to compare the different roles of VR- and AR-based try-on and to validate the impact of personalized motions on the virtual try-on experience. In the first user study, the mobile application with AR- and VR-based try-on is compared to a traditional e-commerce interface. In the second user study, personalized avatars with pre-defined motion and with personalized motion are compared to a personalized avatar without motion in AR-based try-on. The results show that AR- and VR-based try-on can positively influence the shopping experience compared with the traditional e-commerce interface. Overall, AR-based try-on provides a better and more realistic garment visualization than VR-based try-on. In addition, we found that personalized motions do not directly affect the user's shopping experience.


2014 ◽  
Vol 1018 ◽  
pp. 39-46 ◽  
Author(s):  
Jens Lambrecht ◽  
Jörg Krüger

In this paper, we present a robot programming system that takes into account natural communication and process-integrated simulation, as well as a unified robot control layer and an interface to the Digital Factory for program transmission. We chose an integrative approach including markerless gesture recognition and a mobile Augmented Reality simulation on common handheld devices, e.g., smartphones or tablet PCs. The user can draw poses and trajectories into the workspace of the robot, supported by simultaneous visual feedback in Augmented Reality, and can adapt the robot program by gestural manipulation of poses and trajectories. Within a task-oriented implementation of the robot program, a pick-and-place task was implemented through the programming-by-demonstration principle. With the help of a user study, we evaluate programming duration, programming errors, and subjective assessment compared with Teach-In and Offline Programming. The analysis of the results shows a significant reduction of programming duration as well as of programming errors compared with Teach-In. Furthermore, most participants favour the spatial programming system.
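The programming-by-demonstration idea, turning gesture-drawn poses into an executable pick-and-place program, can be sketched as a small data pipeline. The `Pose` type and the command strings below are invented placeholders, not the paper's actual robot control layer:

```python
# Turn demonstrated (gesture-drawn) poses into an ordered command list
# for a simple pick-and-place task.

from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float
    z: float

def build_program(pick: Pose, place: Pose, via: list) -> list:
    """Emit a command sequence: approach pick pose, grip, follow the
    demonstrated via-poses, reach the place pose, release."""
    prog = [f"MOVE {pick.x} {pick.y} {pick.z}", "GRIP CLOSE"]
    prog += [f"MOVE {p.x} {p.y} {p.z}" for p in via]
    prog += [f"MOVE {place.x} {place.y} {place.z}", "GRIP OPEN"]
    return prog
```

Gestural adaptation of the program then amounts to editing the recorded pose list and regenerating the command sequence, which is what makes the AR preview loop cheap.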


Author(s):  
Huyen Nguyen ◽  
Sarah Ketchell ◽  
Ulrich Engelke ◽  
Bruce H. Thomas ◽  
Paulo de Souza

Author(s):  
Ralph Stelzer ◽  
Erik Steindecker ◽  
Bernhard Saske ◽  
Stefanie Lässig

Augmented Reality (AR) is a technology used for maintenance support of existing products: service instructions are displayed directly in the operator's field of view, augmenting the view of the product while the item is being maintained. To provide high-quality maintenance support, the manuals have to be created and tested during the development stage of the product. This paper describes a possible solution for testing AR instructions on virtual prototypes by combining AR and Virtual Reality (VR) technology. Technical aspects of combining VR and AR displays are outlined, and a combined system is presented. Furthermore, the paper describes the development of a specific test scenario for evaluating the created system and contains design patterns for depicting AR instructions in virtual environments. Finally, the usability of the solution is tested in a user study.


2014 ◽  
Vol 668-669 ◽  
pp. 1399-1402 ◽  
Author(s):  
Yao Hua Yu

This paper presents a study of an intelligent augmented reality tourist guide application designed to run on Android smartphones. A strategy for positioning the target scene and an intelligent planning strategy for tourist routes are proposed, and important parts of the system implementation stage are described in detail. The results of a follow-up user study show that the application is very helpful for tourists, particularly in enhancing self-guided tours and improving the touring experience.


10.29007/7jch ◽  
2019 ◽  
Author(s):  
James Stigall ◽  
Sharad Sharma

Building occupants must know how to properly exit a building should the need arise. Being aware of appropriate evacuation procedures eliminates (or reduces) the risk of injury and death during an actual catastrophe. Augmented reality (AR) is increasingly sought after as a teaching and training tool because it offers visualization and interaction capabilities that capture the learner's attention and enhance the learner's capacity to retain what was learned. Given these capabilities and the need for emergency evacuation training, this paper explores a mobile AR application (MARA) constructed to help users evacuate a building in the event of an emergency such as a building fire, active shooter, or earthquake. The MARA was built for Android-based devices using Unity and Vuforia. Its features include intelligent signs (i.e., visual cues that guide users to the exits). Inter alia, the paper discusses the MARA's implementation and its evaluation through a user study based on the Technology Acceptance Model (TAM) and the System Usability Scale (SUS) frameworks. The results demonstrate the participants' opinions that the MARA is both usable and effective in helping users evacuate a building.
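The System Usability Scale used in the evaluation has a fixed scoring rule: each of the ten items is answered on a 1-to-5 scale, odd-numbered (positively worded) items contribute the response minus 1, even-numbered (negatively worded) items contribute 5 minus the response, and the sum is scaled by 2.5 onto a 0-100 range. A minimal sketch of that computation:

```python
def sus_score(responses):
    """Standard SUS scoring: responses are ten answers on a 1..5 scale,
    in questionnaire order. Returns a score in [0, 100]."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for i, r in enumerate(responses):
        # Items 1, 3, 5, 7, 9 (even zero-based index) are positively worded.
        total += (r - 1) if i % 2 == 0 else (5 - r)
    return total * 2.5

# A respondent agreeing fully with every positive item (5) and fully
# disagreeing with every negative item (1) scores the maximum, 100.
```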

