Educational Augmented Reality (AR) Applications and Development Process

Author(s):  
Muzaffer Özdemir

In recent years, presenting useful information in an effective way has become a great necessity for educators. The opportunities provided by AR technologies offer practical ways to meet this need. By integrating digital objects with real-world assets in real time, AR helps to concretize abstract concepts and enhances the sense of reality, which in turn contributes greatly to learning. This chapter presents the various limitations and advantages of AR revealed by empirical studies in the literature. In addition, it provides information about AR development tools/programs and add-on packages, and presents the development stages for an exemplary AR book page, explaining the use of Unity and Vuforia as the development tools. This information is expected to be useful for those who will develop AR applications that can be easily displayed on mobile devices or desktop PCs.

Author(s):  
Elmar Peise ◽  
Paolo Bientinesi

In scientific computing, optimal use of computing resources comes at the cost of extensive coding, tuning, and benchmarking. While the classic approach of “features first, performance later” is supported by a variety of tools such as Tau, Vampir, and Scalasca, the emerging performance-centric approach, in which both features and performance are primary objectives, is still lacking suitable development tools. For dense linear algebra applications, we fill this gap with the Experimental Linear Algebra Performance Studies (ELAPS) framework, a multi-platform open-source environment for easy, fast, and yet powerful performance experimentation and prototyping. In contrast to many existing tools, ELAPS targets the beginning of the development process, assisting application developers in both algorithmic and optimization decisions. With ELAPS, users construct experiments to investigate how performance and efficiency depend on factors such as caching, algorithmic parameters, problem size, and parallelism. Experiments are designed either through Python scripts or a specialized Graphical User Interface (GUI), and run on a spectrum of architectures, ranging from laptops to accelerators and clusters. The resulting reports provide various metrics and statistics that can be analyzed both numerically and visually. In this article, we introduce ELAPS and illustrate its practical value in guiding critical performance decisions already in early development stages.
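
ELAPS itself exposes experiment design through Python scripts and a GUI; the following sketch does not use the ELAPS API, but illustrates the kind of measurement such an experiment encapsulates: timing a dense kernel across problem sizes and repetitions and summarizing the results. The kernel, sizes, and statistics shown are illustrative choices.

```python
# Minimal sketch (not the ELAPS API) of a performance experiment: time an n x n
# matrix-matrix multiplication across problem sizes and repetitions.
import time
import numpy as np

def time_gemm(n, reps=10):
    """Return per-repetition runtimes (seconds) for an n x n matrix multiplication."""
    a = np.random.rand(n, n)
    b = np.random.rand(n, n)
    times = []
    for _ in range(reps):
        start = time.perf_counter()
        a @ b
        times.append(time.perf_counter() - start)
    return times

for n in (256, 512, 1024, 2048):            # problem sizes under study
    runtimes = time_gemm(n)
    flops = 2 * n ** 3                      # flop count of a square GEMM
    gflops = flops / (min(runtimes) * 1e9)  # best-observed rate over repetitions
    print(f"n={n:5d}  median={np.median(runtimes) * 1e3:8.2f} ms  best={gflops:6.1f} GFLOP/s")
```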


2018 ◽  
Author(s):  
Rebeca Motta ◽  
Mario Bonicenha ◽  
Claudia Susie Rodrigues ◽  
Cláudia Werner

Augmented reality creates a bridge between the virtual and real worlds, providing stimulating resources for different purposes. This technology enables new teaching possibilities, since it can bring abstract concepts into reality and put knowledge from several areas, such as Software Engineering, into practice. MetricRA is a tool developed to help Software Engineering students understand Cohesion and Coupling metrics. The solution was implemented with Augmented Reality technology, allowing the user to manipulate a class diagram and observe how the metrics change. This article describes the MetricRA tool and presents a study conducted to evaluate its ability to contribute to the understanding of the proposed concepts.
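
MetricRA's internal computation is not reproduced here; as an illustration of the metrics it visualizes, the following Python sketch applies common textbook forms of coupling (CBO) and lack of cohesion (an LCOM1-style count) to a toy, hypothetical class model.

```python
# Illustrative (toy) computation of coupling and cohesion metrics of the kind
# MetricRA visualizes; the class model and metric definitions are textbook forms,
# not the tool's internal representation.
from itertools import combinations

# Hypothetical class model: each class lists the classes it references,
# and, for cohesion, which attributes each method of "Order" uses.
dependencies = {
    "Order":    {"Customer", "Product"},
    "Customer": {"Order"},
    "Product":  set(),
}
method_attribute_use = {
    "total":    {"items", "tax"},
    "add_item": {"items"},
    "ship_to":  {"address"},
}

def cbo(cls):
    """Coupling Between Objects: classes this class depends on or that depend on it."""
    coupled = set(dependencies[cls])
    coupled |= {c for c, deps in dependencies.items() if cls in deps}
    return len(coupled)

def lcom(methods):
    """LCOM1-style lack of cohesion: method pairs sharing no attribute minus pairs that share one."""
    p = q = 0
    for a, b in combinations(methods.values(), 2):
        if a & b:
            q += 1
        else:
            p += 1
    return max(p - q, 0)

print("CBO(Order)  =", cbo("Order"))                 # -> 2
print("LCOM(Order) =", lcom(method_attribute_use))   # -> 1
```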


Author(s):  
Pranav Jain ◽  
Conrad Tucker

Abstract In this paper, a mobile-based augmented reality (AR) method is presented that is capable of accurate occlusion between digital and real-world objects in real-time. AR occlusion is the process of hiding or showing virtual objects behind physical ones. Existing approaches that address occlusion in AR applications typically require the use of markers or depth sensors, coupled with compute machines (e.g., laptop or desktop). Furthermore, real-world environments are cluttered and contain motion artifacts that result in occlusion errors and improperly rendered virtual objects, relative to the real-world environment. These occlusion errors can lead users to have an incorrect perception of the environment around them while using an AR application, namely not knowing a real-world object is present. Moving the technology to mobile-based AR environments is necessary to reduce the cost and complexity of these technologies. This paper presents a mobile-based AR method that brings real and virtual objects into a similar coordinate system so that virtual objects do not obscure nearby real-world objects in an AR environment. This method captures and processes visual data in real-time, allowing the method to be used in a variety of non-static environments and scenarios. The results of the case study show that the method has the potential to reduce compute complexity, maintain high frame rates to run in real-time, and maintain occlusion efficacy.
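
The paper's mobile pipeline is not reproduced here; the sketch below only illustrates the per-pixel depth comparison that underlies most occlusion handling once real and virtual content share a camera coordinate frame, assuming a depth estimate of the real scene is available.

```python
# Minimal sketch of depth-based occlusion: a virtual pixel is drawn only where the
# virtual surface is closer to the camera than the real surface behind it.
import numpy as np

def composite(real_rgb, real_depth, virtual_rgb, virtual_depth):
    """Blend a virtual rendering into a camera frame with depth-based occlusion.

    real_rgb, virtual_rgb:     (H, W, 3) color images
    real_depth, virtual_depth: (H, W) per-pixel distances in the same camera frame
                               (np.inf where the virtual object has no coverage)
    """
    visible = virtual_depth < real_depth          # virtual surface is in front
    out = real_rgb.copy()
    out[visible] = virtual_rgb[visible]
    return out

# Tiny synthetic example: a virtual object at 1.0 m, partially hidden by a real
# surface at 0.5 m covering the left half of the frame.
h, w = 4, 4
real = np.zeros((h, w, 3), dtype=np.uint8)
virt = np.full((h, w, 3), 255, dtype=np.uint8)
real_depth = np.full((h, w), 2.0)
real_depth[:, :2] = 0.5
virt_depth = np.full((h, w), 1.0)
frame = composite(real, real_depth, virt, virt_depth)
print(frame[..., 0])   # 255 only where the virtual object is unoccluded
```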


Author(s):  
Jiří Šťastný ◽  
David Procházka ◽  
Tomáš Koubek ◽  
Jaromír Landa

An integral part of the production process in many companies is prototyping. Although these companies commonly have high-quality visualization tools (large-screen projections, virtual reality), prototyping has never been abandoned, for a number of reasons. The most important is the possibility of observing the model from any angle without physical constraints, together with its haptic feedback. The interactivity of model adjustments is important as well. Working directly with the model allows the designers to focus on the creative process rather than on working with a computer. There remains, however, the problem of the model being difficult to adjust: more significant changes demand a completely new prototype, or at least a longer time for its realization. The first part of the article describes our approach to solving this problem by means of Augmented Reality. Merging the real-world model with digital objects allows the work with the model to be streamlined and the whole production phase to be sped up significantly. The main advantage of augmented reality is the possibility of direct manipulation of the scene using a portable digital camera. Digital objects can also be added to the scene using identification markers placed on the surface of the model. It is therefore not necessary to work with special input devices and lose contact with the real-world model; adjustments are made directly on the model. The key problem of the outlined solution is the ability to identify an object within the camera picture and replace it with the digital object. The second part of the article focuses especially on identifying the exact position and orientation of the marker within the picture. The identification marker is generalized to a triple of points which represents a general plane in space. We discuss the spatial identification of these points and the representation of their position and orientation by means of a transformation matrix. This matrix is used for rendering the graphical objects (e.g., in OpenGL and Direct3D).
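
The article's own derivation covers recovering the marker points from the image; the sketch below only shows one common way to turn three non-collinear 3D points into the kind of 4x4 transformation matrix that can then be handed to OpenGL or Direct3D for rendering. The axis convention chosen here is an assumption for illustration.

```python
# Build a model transformation matrix from three non-collinear marker points,
# assuming their 3D positions have already been recovered from the camera image.
import numpy as np

def marker_transform(p0, p1, p2):
    """Return a 4x4 matrix with origin p0, x-axis toward p1, z-axis along the
    plane normal, and y-axis completing a right-handed frame."""
    p0, p1, p2 = (np.asarray(p, dtype=float) for p in (p0, p1, p2))
    x = p1 - p0
    x /= np.linalg.norm(x)
    n = np.cross(p1 - p0, p2 - p0)        # plane normal
    z = n / np.linalg.norm(n)
    y = np.cross(z, x)                    # unit length and orthogonal by construction
    m = np.eye(4)
    m[:3, 0], m[:3, 1], m[:3, 2], m[:3, 3] = x, y, z, p0
    return m                              # column-vector convention, as in OpenGL

# Marker lying in the z = 0 plane, translated by (1, 2, 0):
print(marker_transform([1, 2, 0], [2, 2, 0], [1, 3, 0]))
```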


2020 ◽  
Vol 4 (2) ◽  
pp. 32-61
Author(s):  
Adam C. Carreon ◽  
Sean J. Smith ◽  
Kavita Rao

Augmented reality (AR) continues to gain popularity within the classroom setting, lauded for the potential it brings to further engage students and contextualize instruction. AR offers an interactive experience in which digital objects, seen through various mobile devices (e.g., iPad, mobile phone), are overlaid on the real world. This literature review of 38 research studies conducted in K-12 settings examined the defining characteristics of AR, the purpose and application of the AR intervention, and the outcomes associated with the current use of AR. The results of the review reveal that studies use varying defining characteristics of AR, which leads to varying levels of application for all students in instructional settings. With no common definition leading to a wide array of classroom usage, the authors examine AR usage for students with and without disabilities. This article also provides recommendations for establishing a strong research base on the specific characteristics of AR and the impact it has on education.


2019 ◽  
Vol 2019 (1) ◽  
pp. 237-242
Author(s):  
Siyuan Chen ◽  
Minchen Wei

Color appearance models have been extensively studied for characterizing and predicting the perceived color appearance of physical color stimuli under different viewing conditions. These stimuli are either surface colors reflecting illumination or self-luminous emitting radiations. With the rapid development of augmented reality (AR) and mixed reality (MR), it is critically important to understand how the color appearance of the objects produced by AR and MR is perceived, especially when these objects are overlaid on the real world. In this study, nine lighting conditions, with different correlated color temperature (CCT) levels and light levels, were created in a real-world environment. Under each lighting condition, human observers adjusted the color appearance of a virtual stimulus, which was overlaid on a real-world luminous environment, until it appeared the whitest. It was found that the CCT and light level of the real-world environment significantly affected the color appearance of the white stimulus, especially when the light level was high. Moreover, a lower degree of chromatic adaptation was found for viewing the virtual stimulus that was overlaid on the real world.
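
The study's own model is not reproduced here; as an illustration of what a "degree of chromatic adaptation" D means, the following Python sketch applies a von Kries-style partial adaptation in CAT02 cone-like space: D = 1 corresponds to full adaptation to the surround white, while D < 1 (as reported for virtual stimuli overlaid on the real world) means the stimulus is only partially adapted. The white points and stimulus values are illustrative.

```python
# Von Kries-style partial chromatic adaptation with degree of adaptation D,
# using the CAT02 matrix to move from XYZ to cone-like responses.
import numpy as np

M_CAT02 = np.array([[ 0.7328, 0.4296, -0.1624],
                    [-0.7036, 1.6975,  0.0061],
                    [ 0.0030, 0.0136,  0.9834]])

def adapt(xyz, white_src, white_dst, D=1.0):
    """Map a stimulus XYZ from a source adapting white to a destination white,
    blending the full von Kries gains with identity according to D."""
    lms = M_CAT02 @ np.asarray(xyz, dtype=float)
    ws = M_CAT02 @ np.asarray(white_src, dtype=float)
    wd = M_CAT02 @ np.asarray(white_dst, dtype=float)
    gains = D * (wd / ws) + (1.0 - D)     # partial adaptation
    return np.linalg.inv(M_CAT02) @ (gains * lms)

# Example: a stimulus under a warm (low-CCT) white, adapted toward D65,
# once fully (D = 1) and once partially (D = 0.6).
warm_white = [109.85, 100.0, 35.58]       # CIE illuminant A white point
d65_white  = [ 95.05, 100.0, 108.88]
stimulus   = [ 60.0,   55.0,  20.0]
print(adapt(stimulus, warm_white, d65_white, D=1.0))
print(adapt(stimulus, warm_white, d65_white, D=0.6))
```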


2018 ◽  
Author(s):  
Kyle Plunkett

This manuscript provides two demonstrations of how Augmented Reality (AR), which is the projection of virtual information onto a real-world object, can be applied in the classroom and in the laboratory. Using only a smartphone and the free HP Reveal app, content-rich AR notecards were prepared. The physical notecards are based on Organic Chemistry I reactions and show only a reagent and substrate. Upon interacting with the HP Reveal app, an AR video projection shows the product of the reaction as well as a real-time, hand-drawn curved-arrow mechanism of how the product is formed. Thirty AR notecards based on common Organic Chemistry I reactions and mechanisms are provided in the Supporting Information and are available for widespread use. In addition, the HP Reveal app was used to create AR video projections onto laboratory instrumentation so that a virtual expert can guide the user during the equipment setup and operation.


10.28945/2207 ◽  
2015 ◽  
Vol 10 ◽  
pp. 021-035 ◽  
Author(s):  
Yan Lu ◽  
Joseph T. Chao ◽  
Kevin R. Parker

This project shows a creative approach to the familiar scavenger hunt game. It involved the implementation of an iPhone application, HUNT, with Augmented Reality (AR) capability for the users to play the game, as well as an administrative website that game organizers can use to create and make available games for users to play. Using the HUNT mobile app, users will first make a selection from a list of games, and they will then be shown a list of objects that they must seek. Once the user finds a correct object and scans it with the built-in camera on the smartphone, the application will attempt to verify whether it is the correct object and then display associated multimedia AR content that may include images and videos overlaid on top of real-world views. HUNT not only provides entertaining activities within an environment that players can explore, but the AR content can also serve as an educational tool. The project is designed to increase user involvement by using a familiar and enjoyable game as a basis and adding an educational dimension by incorporating AR technology and engaging, interactive multimedia to provide users with facts about the objects that they have located.
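
As a purely hypothetical sketch of the select, scan, verify, and show flow the abstract describes, the Python snippet below models a game's target objects and the verification step; the names, data model, and matching logic are illustrative assumptions, not the HUNT app's actual implementation (which performs its recognition on the phone against targets defined through the administrative website).

```python
# Hypothetical model of a scavenger-hunt game and its verify-then-show step.
from dataclasses import dataclass, field

@dataclass
class GameObject:
    name: str
    ar_content_url: str          # overlay (image/video) shown on a correct scan
    fact: str                    # educational fact shown to the player

@dataclass
class Game:
    title: str
    objects: list = field(default_factory=list)

def verify_scan(game, recognized_label):
    """Return the AR content for a correctly scanned object, or None otherwise."""
    for obj in game.objects:
        if obj.name.lower() == recognized_label.lower():
            return {"overlay": obj.ar_content_url, "fact": obj.fact}
    return None

campus_tour = Game("Campus Tour", [
    GameObject("Clock Tower", "https://example.org/clock_tower.mp4",
               "A short fact about this object, shown after a correct scan."),
])
print(verify_scan(campus_tour, "clock tower"))   # match -> AR content
print(verify_scan(campus_tour, "fountain"))      # no match -> None
```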

