The Effect of the Smart Navigation System Based on Augmented Reality

Currently, virtual technology is applied to everyday life. Augmented Reality (AR) has become a widely adopted technology that brings virtual 3D images into the real world through a camera: simulated 3D models are overlaid on the camera view so that virtual and real objects merge into one image, which helps people understand content easily. Hence, to increase the efficiency of services and to publicize information, this paper presents a smartphone-based smart navigation system using augmented reality, with Benjakitti Park, Thailand, as a case study. The application navigates users to a point-of-interest (POI) destination, and the mobile system is composed of two parts: the navigation application and the bone collector game. The project achieved a good level of user satisfaction, and the proposed application was able to provide the information needed for navigation in terms of performance, usability, and effectiveness.
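
As a minimal illustration (not code from the paper), an AR navigation view like this one needs the distance and compass bearing from the user's GPS fix to the POI in order to place a direction marker over the camera image; the sketch below computes both with the standard haversine and bearing formulas. The coordinates are hypothetical placeholders.

```python
# Minimal sketch (not from the paper): distance and bearing from the user's
# GPS fix to a point of interest (POI), which an AR view could use to place
# a direction marker over the camera image.  Coordinates are placeholders.
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (m) and initial bearing (deg) from point 1 to point 2."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)

    # Haversine formula for the great-circle distance
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    dist = 2 * R * math.asin(math.sqrt(a))

    # Initial bearing: 0 deg = north, measured clockwise
    y = math.sin(dlmb) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlmb)
    bearing = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    return dist, bearing

# Example: user position vs. a hypothetical POI inside Benjakitti Park
user = (13.7222, 100.5580)   # placeholder GPS fix
poi = (13.7245, 100.5602)    # placeholder POI coordinates
d, b = distance_and_bearing(*user, *poi)
print(f"POI is {d:.0f} m away at bearing {b:.0f} deg")
```

On a phone, the computed bearing would then be compared with the compass heading to decide where on the screen the marker should appear.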

2018
Vol 7 (12)
pp. 463
Author(s):  
Chris Panou ◽  
Lemonia Ragia ◽  
Despoina Dimelli ◽  
Katerina Mania

In this paper, we present the software architecture of a complete mobile tourist guide for cultural heritage sites located in the old town of Chania, Crete, Greece. This includes gamified components that motivate the user to traverse the suggested interest points, as well as technically challenging outdoors augmented reality (AR) visualization features. The main focus of the AR feature is to superimpose 3D models of historical buildings in their past state onto the real world, while users walk around the Venetian part of Chania’s city, exploring historical information in the form of text and images. We examined and tested registration and tracking mechanisms based on commercial AR frameworks in the challenging outdoor, sunny environment of a Mediterranean town, addressing relevant technical challenges. Upon visiting one of three significant monuments, a 3D model displaying the monument in its past state is visualized onto the mobile phone’s screen at the exact location of the real-world monument, while the user is exploring the area. A location-based experience was designed and integrated into the application, enveloping the 3D model with real-world information at the same time. The users are urged to explore interest areas and unlock historical information, while earning points following a gamified experience. By combining AR technologies with location-aware and gamified elements, we aim to promote the technologically enhanced public appreciation of cultural heritage sites and showcase the cultural depth of the city of Chania.


Author(s):  
Alvebi Hopaliki ◽  
Yupianti Yupianti ◽  
Juju Jumadi

Augmented Reality (AR) is a variation of a virtual environment, more often called AR technology, in which users can see the real world with virtual objects added to it. Thus, users see virtual objects and real objects in the same place. Augmented reality requires streaming video from a camera, which is used as the source of image input, followed by tracking and detecting markers. After the marker is detected, a 3D model of an item appears. This 3D model was created with 3D design software, for example 3DS Max, Blender, and others. This ancient-animal learning media uses pattern recognition, which can be interpreted as taking raw data and classifying it based on the data. The problem can then be formulated as how to design 3D objects with the Blender application to introduce ancient animals. The purpose of this research is to build ancient-animal learning media in real time by using augmented reality technology.
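
A minimal sketch of the marker-based pipeline described above: read camera frames, detect the marker, and render the 3D model on top of it. This is not the authors' code; detect_marker() and render_model() are hypothetical placeholders standing in for a real marker tracker and for a renderer that loads the Blender-exported ancient-animal model.

```python
# Minimal sketch of the marker-based AR pipeline: grab camera frames,
# detect a marker, and render a 3D model on top of it.
# detect_marker() and render_model() are hypothetical placeholders.
import cv2

def detect_marker(gray_frame):
    """Placeholder: return a marker pose, or None if no marker is visible."""
    return None  # replace with a real marker tracker (e.g. an ArUco-style detector)

def render_model(frame, pose, model_path):
    """Placeholder: overlay the Blender-exported 3D model at the marker pose."""
    return frame  # replace with a real renderer for the ancient-animal model

cap = cv2.VideoCapture(0)          # camera used as the image-input source
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    pose = detect_marker(gray)     # tracking and detecting the marker
    if pose is not None:
        frame = render_model(frame, pose, "ancient_animal.obj")
    cv2.imshow("AR learning media", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```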


Sensors
2020
Vol 20 (20)
pp. 5890
Author(s):  
Bo-Chen Huang ◽  
Jiun Hsu ◽  
Edward T.-H. Chu ◽  
Hui-Mei Wu

Due to the popularity of indoor positioning technology, indoor navigation applications have been deployed in large buildings, such as hospitals, airports, and train stations, to guide visitors to their destinations. A commonly-used user interface, shown on smartphones, is a 2D floor map with a route to the destination. The navigation instructions, such as turn left, turn right, and go straight, pop up on the screen when users come to an intersection. However, owing to the restrictions of a 2D navigation map, users may face mental pressure and get confused while they are making a connection between the real environment and the 2D navigation map before moving forward. For this reason, we developed ARBIN, an augmented reality-based navigation system, which overlays navigation instructions on the real-world environment shown on the screen for ease of use. Thus, there is no need for users to make a connection between the navigation instructions and the real-world environment. In order to evaluate the applicability of ARBIN, a series of experiments were conducted in the outpatient area of the National Taiwan University Hospital YunLin Branch, which covers nearly 1800 m², with 35 destinations and points of interest, such as a cardiovascular clinic, X-ray examination room, pharmacy, and so on. Four different types of smartphone were adopted for evaluation. Our results show that ARBIN can achieve 3 to 5 m accuracy, and provide users with correct instructions on their way to the destinations. ARBIN proved to be a practical solution for indoor navigation, especially for large buildings.
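
As a small illustration (not ARBIN's actual implementation), the sketch below shows how a navigation layer can turn the angle between the user's heading and the bearing to the next waypoint into a "turn left", "turn right", or "go straight" instruction; the 30° straight-ahead tolerance is an assumed value.

```python
# Minimal sketch (not ARBIN's code): converting the angle between the user's
# heading and the bearing to the next waypoint into an on-screen instruction.
# The 30-degree straight-ahead tolerance is an assumed value.
def turn_instruction(user_heading_deg, waypoint_bearing_deg, tolerance_deg=30.0):
    """Return 'go straight', 'turn left', or 'turn right' for the next waypoint."""
    # Signed angular difference wrapped to [-180, 180); positive means the waypoint is to the right.
    diff = (waypoint_bearing_deg - user_heading_deg + 180.0) % 360.0 - 180.0
    if abs(diff) <= tolerance_deg:
        return "go straight"
    return "turn right" if diff > 0 else "turn left"

print(turn_instruction(user_heading_deg=90.0, waypoint_bearing_deg=170.0))  # turn right
print(turn_instruction(user_heading_deg=90.0, waypoint_bearing_deg=95.0))   # go straight
```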


Author(s):  
C. P. Huang ◽  
S. Agarwal ◽  
F. W. Liou

Due to the advances in computer engineering technologies, much effort has been devoted to simulating the real world in a computer-generated environment. However, there are always differences between a virtual environment and the real world, and these variations can arise from the complexities and uncertainties of initial conditions, contributing parameters, and the models employed. Before a virtual environment is put to work for design and development, some way of quantifying possible errors or uncertainties in the computer model is needed so that a robust and reliable system can be achieved. The aim of this paper is to present a case study on an augmented reality environment with 3-D tracking and dynamic simulation technologies for parts feeding systems, so that engineers can run high-fidelity simulations to test new materials, components, and systems before investing valuable resources in construction.
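
As a rough sketch of the kind of uncertainty quantification the abstract calls for (not the authors' model), one can rerun a parts-feeding simulation many times with perturbed inputs and report the spread of the prediction; simulate_feed_time() and its parameters below are hypothetical placeholders.

```python
# Minimal sketch of Monte Carlo uncertainty quantification for a virtual
# parts-feeding model: perturb uncertain inputs, rerun the simulation, and
# report the spread of the prediction.  simulate_feed_time() and its
# parameters are hypothetical placeholders, not the authors' model.
import random
import statistics

def simulate_feed_time(friction, part_mass, feeder_speed):
    """Placeholder dynamic simulation returning a predicted feed time (s)."""
    return 2.0 + 5.0 * friction + 0.8 * part_mass - 1.5 * feeder_speed  # toy model

nominal = {"friction": 0.3, "part_mass": 0.05, "feeder_speed": 0.4}
rel_uncertainty = 0.10  # assume +/-10% uncertainty on each input

samples = []
for _ in range(1000):
    perturbed = {k: v * random.uniform(1 - rel_uncertainty, 1 + rel_uncertainty)
                 for k, v in nominal.items()}
    samples.append(simulate_feed_time(**perturbed))

print(f"predicted feed time: {statistics.mean(samples):.3f} s "
      f"+/- {statistics.stdev(samples):.3f} s")
```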


2019
Vol 2019 (1)
pp. 237-242
Author(s):  
Siyuan Chen ◽  
Minchen Wei

Color appearance models have been extensively studied for characterizing and predicting the perceived color appearance of physical color stimuli under different viewing conditions. These stimuli are either surface colors reflecting illumination or self-luminous emitting radiations. With the rapid development of augmented reality (AR) and mixed reality (MR), it is critically important to understand how the color appearance of the objects that are produced by AR and MR is perceived, especially when these objects are overlaid on the real world. In this study, nine lighting conditions, with different correlated color temperature (CCT) levels and light levels, were created in a real-world environment. Under each lighting condition, human observers adjusted the color appearance of a virtual stimulus, which was overlaid on a real-world luminous environment, until it appeared the whitest. It was found that the CCT and light level of the real-world environment significantly affected the color appearance of the white stimulus, especially when the light level was high. Moreover, a lower degree of chromatic adaptation was found for viewing the virtual stimulus that was overlaid on the real world.
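
As a side illustration (not part of the study), lighting conditions such as those above are commonly characterized by their CCT; McCamy's standard approximation estimates CCT from CIE 1931 (x, y) chromaticity coordinates:

```python
# Illustration (not from the paper): McCamy's approximation for estimating the
# correlated color temperature (CCT) of a light source from its CIE 1931 (x, y)
# chromaticity coordinates, a common way lighting conditions are characterized.
def mccamy_cct(x, y):
    """Approximate CCT in kelvin from CIE 1931 chromaticity (x, y)."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# Chromaticity of CIE illuminant D65 (roughly 6500 K daylight)
print(f"{mccamy_cct(0.3127, 0.3290):.0f} K")  # about 6500 K
```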


Author(s):  
Yulia Fatma ◽  
Armen Salim ◽  
Regiolina Hayami

Along with this development, applications can be used as a medium for learning. Augmented Reality is a technology that combines two-dimensional and three-dimensional virtual objects with a real three-dimensional environment, then projects the virtual objects in real time and simultaneously. In introducing the Solar System material, students are invited to get to know the planets, which directly encourages them to imagine conditions in the Solar System. Explanations in books of the planets' forms and of how the planets revolve and rotate are considered insufficient because they only display objects in 2D. In addition, students cannot directly practice arranging the layout of the planets in the Solar System. By applying augmented reality technology, the delivery of learning information can be clarified, because these applications combine the real world and the virtual world. Beyond displaying the material, the application also presents the planets as animated 3D objects with audio.
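
A minimal sketch of the revolution-and-rotation animation logic such an AR solar-system scene could use (not the application's actual code); the orbital radii and periods below are illustrative placeholders, not real astronomical data.

```python
# Minimal sketch of the revolution/rotation logic an AR solar-system scene
# could use: each frame, advance every planet along a circular orbit and spin
# it about its own axis.  Radii and periods below are illustrative placeholders.
import math

planets = {
    # name: (orbit radius in scene units, revolution period in s, rotation period in s)
    "Mercury": (2.0, 8.8, 5.9),
    "Earth":   (5.0, 36.5, 1.0),
    "Mars":    (7.6, 68.7, 1.03),
}

def planet_state(name, t):
    """Return (x, y, spin angle in degrees) of a planet at animation time t (s)."""
    radius, rev_period, rot_period = planets[name]
    orbit_angle = 2.0 * math.pi * t / rev_period                         # revolution around the Sun
    spin_angle = math.degrees(2.0 * math.pi * t / rot_period) % 360.0    # rotation on its own axis
    return radius * math.cos(orbit_angle), radius * math.sin(orbit_angle), spin_angle

x, y, spin = planet_state("Earth", t=10.0)
print(f"Earth at ({x:.2f}, {y:.2f}), rotated {spin:.0f} deg")
```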


Author(s):  
Kevin Lesniak ◽  
Conrad S. Tucker

The method presented in this work reduces the frequency of virtual objects incorrectly occluding real-world objects in Augmented Reality (AR) applications. Current AR rendering methods cannot properly represent occlusion between real and virtual objects because the objects are not represented in a common coordinate system. These occlusion errors can lead users to have an incorrect perception of the environment around them when using an AR application, namely not knowing a real-world object is present due to a virtual object incorrectly occluding it and incorrect perception of depth or distance by the user due to incorrect occlusions. The authors of this paper present a method that brings both real-world and virtual objects into a common coordinate system so that distant virtual objects do not obscure nearby real-world objects in an AR application. This method captures and processes RGB-D data in real-time, allowing the method to be used in a variety of environments and scenarios. A case study shows the effectiveness and usability of the proposed method to correctly occlude real-world and virtual objects and provide a more realistic representation of the combined real and virtual environments in an AR application. The results of the case study show that the proposed method can detect at least 20 real-world objects with potential to be incorrectly occluded while processing and fixing occlusion errors at least 5 times per second.
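
A minimal sketch of the core depth test behind this kind of occlusion handling (not the authors' implementation): once the RGB-D depth map and the virtual object's rendered depth are expressed in the same coordinate system, a virtual pixel is drawn only where it is nearer to the camera than the real surface.

```python
# Minimal sketch (not the authors' implementation) of depth-based occlusion:
# once real and virtual depths share a coordinate system, draw a virtual pixel
# only where the virtual surface is closer to the camera than the real one.
import numpy as np

def composite_with_occlusion(camera_rgb, real_depth, virtual_rgb, virtual_depth):
    """Overlay virtual pixels on the camera image unless a real surface is nearer.

    camera_rgb, virtual_rgb: (H, W, 3) uint8 images
    real_depth, virtual_depth: (H, W) float32 depths in metres; np.inf where empty
    """
    virtual_visible = virtual_depth < real_depth          # per-pixel depth test
    out = camera_rgb.copy()
    out[virtual_visible] = virtual_rgb[virtual_visible]
    return out

# Toy example: a virtual object at 2 m, partially behind a real object at 1.5 m
h, w = 4, 4
camera = np.zeros((h, w, 3), np.uint8)
virtual = np.full((h, w, 3), 255, np.uint8)
real_d = np.full((h, w), np.inf, np.float32)
real_d[:, :2] = 1.5                                       # real surface covers the left half
virt_d = np.full((h, w), 2.0, np.float32)
print(composite_with_occlusion(camera, real_d, virtual, virt_d)[..., 0])
```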


2018
Author(s):  
Uri Korisky ◽  
Rony Hirschhorn ◽  
Liad Mudrik

Notice: a peer-reviewed version of this preprint has been published in Behavior Research Methods and is freely available at http://link.springer.com/article/10.3758/s13428-018-1162-0

Continuous Flash Suppression (CFS) is a popular method for suppressing visual stimuli from awareness for relatively long periods. Thus far, it has only been used for suppressing two-dimensional images presented on-screen. We present a novel variant of CFS, termed 'real-life CFS', with which the actual immediate surroundings of an observer – including three-dimensional, real-life objects – can be rendered unconscious. Real-life CFS uses augmented reality goggles to present subjects with CFS masks to their dominant eye, leaving their non-dominant eye exposed to the real world. In three experiments we demonstrate that real objects can indeed be suppressed from awareness using real-life CFS, and that suppression durations are comparable to those obtained using the classic, on-screen CFS. We further provide an example of experimental code, which can be modified for future studies using 'real-life CFS'. This opens the gate for new questions in the study of consciousness and its functions.
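
The published paper links the actual experimental code; as a rough illustration only, CFS masks are typically Mondrian-like patterns of random colored rectangles refreshed at roughly 10 Hz and presented to the dominant eye, which the hypothetical sketch below generates.

```python
# Rough illustration (not the authors' published code): generating Mondrian-like
# CFS masks, i.e. random colored rectangles, refreshed at roughly 10 Hz and
# shown only to the dominant eye.
import numpy as np

def mondrian_mask(height, width, n_rects=80, rng=None):
    """Return an (H, W, 3) uint8 Mondrian-style mask of random colored rectangles."""
    rng = rng or np.random.default_rng()
    mask = np.full((height, width, 3), 128, np.uint8)   # mid-grey background
    for _ in range(n_rects):
        y, x = rng.integers(0, height), rng.integers(0, width)
        h = int(rng.integers(height // 20, height // 4))
        w = int(rng.integers(width // 20, width // 4))
        color = rng.integers(0, 256, size=3, dtype=np.uint8)
        mask[y:y + h, x:x + w] = color
    return mask

# One second of masks at a 10 Hz flash rate for the dominant-eye display
mask_stream = [mondrian_mask(480, 640) for _ in range(10)]
print(len(mask_stream), mask_stream[0].shape)
```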

