Exploring the Acceptance of Augmented Reality Among TESL Teachers and Students and Its Effects on Motivation Level: A Case Study in Kuwait

2021, Vol. 8 (12), pp. 23-34
Author(s):  
Mohamed Abdelmagid Abdelmagid ◽  
Norillah bt Abdullah ◽  
Abdulmajid Mohammed Abdulwahab Aldaba


Author(s):  
Miri Ben-Amram ◽  
Nitza Davidovitch ◽  
Iryna Herasimovich ◽  
Yuri Ribakov

2021
Author(s):  
Patrick Dallasega ◽  
Felix Schulze ◽  
Andrea Revolti ◽  
Martin Martinelli

Author(s):  
Geoffrey Momin ◽  
Raj Panchal ◽  
Daniel Liu ◽  
Sharman Perera

Human error accounts for about 60% of the annual power loss due to maintenance incidents in the fossil power industry. The International Atomic Energy Agency reports that 80% of industrial accidents in the nuclear industry can be attributed to human error and 20% to equipment failure. The Personal Augmented Reality Reference System (PARRS) is a suite of computer-mediated reality applications that seeks to minimize human error by digitizing manual procedures and providing real-time monitoring of hazards present in the work environment. Our mission is to provide critical feedback to personnel in real time and protect them from avoidable hazards. PARRS aims to minimize human error and increase worker productivity by bringing innovation to safety and procedural compliance, leveraging technologies such as augmented reality, LiDAR, machine learning, and particulate mapping using remote systems.
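The abstract does not give implementation details, so the following is only a rough sketch of the kind of real-time hazard-monitoring loop it describes: remote particulate readings are polled, compared against an exposure limit, and surfaced as feedback to the worker. All names, thresholds, and the alert path are hypothetical assumptions, not part of the published system.

```python
# Illustrative sketch only: a minimal real-time hazard-monitoring cycle of the
# kind the PARRS abstract describes. All identifiers and thresholds below are
# hypothetical assumptions.
import time
from dataclasses import dataclass

# Assumed exposure limit for respirable particulate matter (mg/m^3).
PARTICULATE_THRESHOLD_MG_M3 = 5.0

@dataclass
class SensorReading:
    sensor_id: str
    particulate_mg_m3: float
    timestamp: float

def read_remote_sensors() -> list[SensorReading]:
    """Placeholder for polling remote particulate sensors; returns dummy data."""
    return [SensorReading("unit-7", 6.2, time.time())]

def alert_worker(message: str) -> None:
    """Placeholder for pushing an alert to a worker's AR display."""
    print(f"[AR ALERT] {message}")

def monitor_once() -> None:
    """One polling cycle: flag any reading that exceeds the exposure limit."""
    for reading in read_remote_sensors():
        if reading.particulate_mg_m3 > PARTICULATE_THRESHOLD_MG_M3:
            alert_worker(
                f"Sensor {reading.sensor_id}: particulate level "
                f"{reading.particulate_mg_m3:.1f} mg/m^3 exceeds limit of "
                f"{PARTICULATE_THRESHOLD_MG_M3:.1f} mg/m^3"
            )

if __name__ == "__main__":
    monitor_once()
```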


Author(s):  
Kevin Lesniak ◽  
Conrad S. Tucker

The method presented in this work reduces the frequency of virtual objects incorrectly occluding real-world objects in Augmented Reality (AR) applications. Current AR rendering methods cannot properly represent occlusion between real and virtual objects because the objects are not represented in a common coordinate system. These occlusion errors can give users an incorrect perception of the environment around them: a real-world object may go unnoticed because a virtual object incorrectly occludes it, and depth or distance may be misjudged because of the incorrect occlusions. The authors of this paper present a method that brings both real-world and virtual objects into a common coordinate system so that distant virtual objects do not obscure nearby real-world objects in an AR application. The method captures and processes RGB-D data in real time, allowing it to be used in a variety of environments and scenarios. A case study shows the effectiveness and usability of the proposed method in correctly occluding real-world and virtual objects and providing a more realistic representation of the combined real and virtual environments in an AR application. The results of the case study show that the proposed method can detect at least 20 real-world objects at risk of being incorrectly occluded while processing and fixing occlusion errors at least 5 times per second.
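The paper's full pipeline is not reproduced in this abstract, but the core idea of comparing real and virtual depth in a shared coordinate frame can be sketched as below: once the RGB-D depth map and the virtual render's depth buffer are expressed in the same camera frame, a virtual pixel is shown only where the virtual surface is closer than the real one. This is an illustrative sketch under that assumption, not the authors' actual implementation; the function name and array layout are my own.

```python
# Illustrative sketch of depth-based occlusion handling, assuming the real and
# virtual scenes are already registered in a common camera coordinate frame
# (as the abstract describes). Not the authors' actual implementation.
import numpy as np

def composite_with_occlusion(
    real_rgb: np.ndarray,       # (H, W, 3) camera color image
    real_depth: np.ndarray,     # (H, W) depth from the RGB-D sensor, in meters
    virtual_rgb: np.ndarray,    # (H, W, 3) rendered virtual objects
    virtual_depth: np.ndarray,  # (H, W) virtual render's depth buffer, in meters
) -> np.ndarray:
    """Show a virtual pixel only where the virtual surface is closer than the real one."""
    # Pixels with no virtual geometry should carry +inf in the virtual depth buffer.
    virtual_visible = virtual_depth < real_depth
    # Broadcast the boolean mask over the color channels and composite.
    mask = virtual_visible[..., np.newaxis]
    return np.where(mask, virtual_rgb, real_rgb)

# Minimal usage example with synthetic data: a virtual object 2 m away should be
# hidden behind a real wall 1 m away.
if __name__ == "__main__":
    h, w = 4, 4
    real_rgb = np.zeros((h, w, 3), dtype=np.uint8)          # black real scene
    real_depth = np.full((h, w), 1.0)                        # real wall at 1 m
    virtual_rgb = np.full((h, w, 3), 255, dtype=np.uint8)    # white virtual object
    virtual_depth = np.full((h, w), 2.0)                     # virtual object at 2 m
    out = composite_with_occlusion(real_rgb, real_depth, virtual_rgb, virtual_depth)
    assert (out == 0).all()  # the closer real wall correctly occludes the virtual object
```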

