3D tracking in unknown environments using on-line keypoint learning for mobile augmented reality

Author(s): Gerhard Schall, Helmut Grabner, Michael Grabner, Paul Wohlhart, Dieter Schmalstieg, ...


Author(s): A.-M. Boutsi, C. Ioannidis, S. Soile

Abstract. Mobile Augmented Reality (MAR) keeps pace with current technological advances through more intuitive interfaces, realistic graphic content and flexible development processes. Overlaying precise 3D representations exploits the high penetration of mobile devices to immerse users in a world where digital data are perceived as real counterparts. The work presented in this paper integrates web technologies with hybrid mobile tools to visualize high-quality and complex 3D geometry over the real environment. The implementation relies on two operational mechanisms: anchors and location-sensitive tracking. Three scenarios, for indoors and outdoors, are developed using open-source and freely distributable SDKs, APIs and rendering engines. The JavaScript-driven prototype consolidates some of the overarching principles of AR, such as pose estimation, registration and 3D tracking, into an interactive user interface built on the scene graph concept. The 3D overlays are shown to the end user i) on top of an image target, ii) on real-world planar surfaces and iii) at predefined points of interest (POI). The evaluation of performance, rendering efficacy and responsiveness is carried out with several testing strategies: system and trace logs, profiling and "end-to-end" tests. The final benchmarking identifies the slow and computationally intensive procedures induced by rendering large datasets, and optimization patterns are proposed to mitigate the performance impact on the non-native technologies.
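
A minimal sketch of what the second placement mode (overlays anchored to real-world planar surfaces from a JavaScript web app) can look like, using the WebXR Hit Test API with three.js. The library choice, scene setup and element/variable names are assumptions for illustration, not the authors' actual prototype.

```js
// Sketch only: planar-surface placement via WebXR hit testing with three.js.
import * as THREE from 'three';

const renderer = new THREE.WebGLRenderer({ antialias: true, alpha: true });
renderer.xr.enabled = true;
document.body.appendChild(renderer.domElement);

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera();

// Reticle marking the detected planar surface; the actual 3D overlay would be
// anchored at this pose once the user confirms placement.
const reticle = new THREE.Mesh(
  new THREE.RingGeometry(0.08, 0.1, 32).rotateX(-Math.PI / 2),
  new THREE.MeshBasicMaterial()
);
reticle.matrixAutoUpdate = false;
reticle.visible = false;
scene.add(reticle);

let hitTestSource = null;
let localSpace = null;

async function startAR() {
  // Requires a user gesture and a WebXR-capable mobile browser.
  const session = await navigator.xr.requestSession('immersive-ar', {
    requiredFeatures: ['hit-test'],
  });
  await renderer.xr.setSession(session);

  const viewerSpace = await session.requestReferenceSpace('viewer');
  hitTestSource = await session.requestHitTestSource({ space: viewerSpace });
  localSpace = await session.requestReferenceSpace('local');
}

// Hypothetical "Enter AR" button wiring up the session start.
document.getElementById('enter-ar')?.addEventListener('click', startAR);

renderer.setAnimationLoop((time, frame) => {
  if (frame && hitTestSource) {
    const hits = frame.getHitTestResults(hitTestSource);
    if (hits.length > 0) {
      // Align the reticle with the first detected surface hit.
      const pose = hits[0].getPose(localSpace);
      reticle.visible = true;
      reticle.matrix.fromArray(pose.transform.matrix);
    } else {
      reticle.visible = false;
    }
  }
  renderer.render(scene, camera);
});
```

The image-target and POI modes described in the abstract would replace the hit-test step with marker detection or geolocated anchors, respectively, while keeping the same scene-graph rendering loop.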


Author(s):  
A. Kharroubi ◽  
R. Billen ◽  
F. Poux

Abstract. Mobile Augmented Reality (MAR) attracts significant research and development effort from both industry and academia, yet interactions with massive 3D datasets are rarely integrated. The emergence of dedicated AR devices and powerful software development kits (ARCore for Android and ARKit for iOS) improves performance on mobile devices (smartphones and tablets). Together with new sensor integrations and advances in computer vision, this fuels the development of MAR. In this paper, we propose a direct integration of massive 3D point clouds with semantics into a web-based marker-less mobile Augmented Reality (AR) application for real-time visualization. We specifically investigate challenges linked to the point cloud data structure and semantic injection. Our solution consolidates some of the overarching principles of AR, including pose estimation, registration and 3D tracking. The developed AR system is tested in the web browsers of mobile phones, providing clear insights into its performance. Promising results show frame rates varying between 27 and 60 frames per second for a real-time point budget of 4.3 million points. The tested point cloud is composed of 29 million points and demonstrates how our indexation strategy allows massive point clouds to be streamed within the point budget. The results also point to research directions concerning the dependence on, and delays caused by, the quality of the network connection, as well as battery consumption, since the device's sensors are in continuous use.
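
A minimal sketch of a point-budget traversal over a point-cloud octree index, in the spirit of the strategy described above. The node layout (center, radius, numPoints, children), the helper names and the traversal order are assumptions for illustration, not the paper's exact indexation scheme.

```js
// Sketch only: select octree nodes for rendering until the point budget is spent.
const POINT_BUDGET = 4_300_000; // real-time budget reported in the abstract

// Rough screen-space weight: node radius divided by distance to the camera.
function priorityOf(node, cameraPos) {
  const dx = node.center.x - cameraPos.x;
  const dy = node.center.y - cameraPos.y;
  const dz = node.center.z - cameraPos.z;
  const dist = Math.sqrt(dx * dx + dy * dy + dz * dz);
  return node.radius / Math.max(dist, 1e-6);
}

// Coarse-to-fine selection: always expand the node that currently looks
// largest on screen, stopping once the cumulative point count hits the budget.
function selectNodes(root, cameraPos) {
  const queue = [root];
  const selected = [];
  let used = 0;

  while (queue.length > 0) {
    queue.sort((a, b) => priorityOf(b, cameraPos) - priorityOf(a, cameraPos));
    const node = queue.shift();

    if (used + node.numPoints > POINT_BUDGET) break;
    selected.push(node);
    used += node.numPoints;

    for (const child of node.children || []) queue.push(child);
  }
  return selected; // nodes whose point batches get fetched and rendered this frame
}
```

The sorted array stands in for a proper priority queue; a real implementation would also cull nodes against the view frustum and reuse already-downloaded batches to reduce network delay.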

