Feature Points Extraction for Camera Tracking in Augmented Reality System

Author(s): Boo-Gyum Kim ◽ Jong-Soo Choi ◽ Jin-Tae Kim
2021 ◽ Vol 9 (2) ◽ pp. 209

Author(s): Sungin Choi ◽ Jung-Seo Park

As the scale of offshore plants has gradually increased, the number of management points has grown significantly. There is therefore a need for innovative process control, quality management, and an installation support system to improve productivity and efficiency for timely construction. In this paper, we introduce a novel approach to these issues using augmented reality (AR) technology. Successful AR implementation hinges on scene matching through accurate pose (position and orientation) estimation with an AR camera. To achieve this, the paper first introduces an accurate marker registration technique that can be used on huge structures. To improve the precision of marker registration, we propose a method that uses natural feature points and marker corner points simultaneously in the optimization step. Subsequently, a method of precisely generating AR scenes from these registered markers is described. Finally, to validate the proposed method, best practices and their effects are presented. Based on the proposed AR system, construction workers can now quickly navigate to onboard destinations by themselves. In addition, they can intuitively install and inspect outfitting parts without paper drawings. Through field tests and surveys, we confirm that AR-based inspection has a significant time-saving effect compared to conventional drawing-based inspection.
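The joint optimization step described in the abstract (marker corner points and natural feature points minimized together) can be illustrated as a combined least-squares pose fit. The following is a minimal 2D sketch under stated assumptions, not the paper's implementation: the point coordinates, the `register_marker` function, and the Gauss-Newton solver are all illustrative inventions.

```python
import numpy as np

# Hypothetical marker corners and nearby natural feature points,
# both expressed in the marker's local frame (illustrative values).
CORNERS = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
FEATURES = np.array([[0.3, 0.2], [0.7, 0.8], [0.5, 0.5]])

def apply_pose(params, pts):
    """Apply a 2D rigid transform (theta, tx, ty) to a set of points."""
    theta, tx, ty = params
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    return pts @ R.T + np.array([tx, ty])

def register_marker(obs_corners, obs_features, iters=20):
    """Estimate the marker pose by minimizing the stacked residuals of
    the corner points AND the natural feature points simultaneously."""
    def residuals(x):
        return np.concatenate([
            (apply_pose(x, CORNERS) - obs_corners).ravel(),
            (apply_pose(x, FEATURES) - obs_features).ravel(),
        ])

    x = np.zeros(3)                       # initial guess: identity pose
    for _ in range(iters):                # Gauss-Newton iterations
        r = residuals(x)
        J = np.zeros((r.size, 3))         # numeric (central-difference) Jacobian
        for i in range(3):
            d = np.zeros(3)
            d[i] = 1e-6
            J[:, i] = (residuals(x + d) - residuals(x - d)) / 2e-6
        x = x - np.linalg.solve(J.T @ J, J.T @ r)
    return x

# Synthetic check: recover a known pose from noiseless observations.
true_pose = np.array([0.1, 0.5, -0.3])
pose = register_marker(apply_pose(true_pose, CORNERS),
                       apply_pose(true_pose, FEATURES))
```

Stacking both residual sets into one normal-equation solve is what lets the extra natural feature points tighten the corner-only pose estimate.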


Sensors ◽ 2021 ◽ Vol 21 (9) ◽ pp. 3061
Author(s): Alice Lo Valvo ◽ Daniele Croce ◽ Domenico Garlisi ◽ Fabrizio Giuliano ◽ Laura Giarré ◽ ...

In recent years, we have witnessed impressive advances in augmented reality systems and computer vision algorithms based on image processing and artificial intelligence. Thanks to these technologies, mainstream smartphones are able to estimate their own motion in 3D space with high accuracy. In this paper, we exploit such technologies to support the autonomous mobility of people with visual disabilities, identifying pre-defined virtual paths and providing context information, reducing the distance between the digital and real worlds. In particular, we present ARIANNA+, an extension of ARIANNA, a system explicitly designed for indoor and outdoor localization and navigation of visually impaired people. While ARIANNA is based on the assumption that landmarks, such as QR codes, and physical paths (composed of colored tapes, painted lines, or tactile pavings) are deployed in the environment and recognized by the camera of a common smartphone, ARIANNA+ eliminates the need for any physical support thanks to the ARKit library, which we exploit to build a completely virtual path. Moreover, ARIANNA+ lets users interact more richly with the surrounding environment through convolutional neural networks (CNNs) trained to recognize objects or buildings, enabling access to content associated with them. By using a common smartphone as a mediation instrument with the environment, ARIANNA+ leverages augmented reality and machine learning to enhance physical accessibility. The proposed system allows visually impaired people to navigate easily in indoor and outdoor scenarios simply by loading a previously recorded virtual path, providing automatic guidance along the route through haptic, speech, and sound feedback.
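The guidance loop sketched in the abstract — follow a previously recorded virtual path and steer the user back onto it — reduces to a simple geometric step: project the smartphone's estimated position onto the path polyline and derive a heading correction that could drive haptic or sound feedback. This is a hedged illustration, not ARIANNA+'s actual code; the waypoints and the `guidance` function are invented for the example.

```python
import math

def project_on_segment(p, a, b):
    """Closest point to p on the segment from a to b."""
    abx, aby = b[0] - a[0], b[1] - a[1]
    denom = abx * abx + aby * aby
    t = 0.0 if denom == 0 else max(0.0, min(
        1.0, ((p[0] - a[0]) * abx + (p[1] - a[1]) * aby) / denom))
    return (a[0] + t * abx, a[1] + t * aby)

def guidance(pos, heading, path):
    """Cross-track distance to the recorded path and the signed heading
    deviation (radians) from the nearest segment's direction."""
    best = None
    for a, b in zip(path, path[1:]):      # scan every polyline segment
        q = project_on_segment(pos, a, b)
        d = math.hypot(pos[0] - q[0], pos[1] - q[1])
        if best is None or d < best[0]:
            best = (d, math.atan2(b[1] - a[1], b[0] - a[0]))
    dist, desired = best
    # Wrap the heading error into (-pi, pi] so left/right cues are symmetric.
    deviation = (desired - heading + math.pi) % (2 * math.pi) - math.pi
    return dist, deviation

# Example: a previously recorded L-shaped path (invented waypoints, meters).
path = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0)]
dist, dev = guidance(pos=(5.0, 1.0), heading=0.0, path=path)
# Here the user is 1 m off the first segment and already facing along it.
```

In a real deployment the position and heading would come from the device's visual-inertial tracking (e.g. ARKit world tracking), and `dist`/`dev` would be mapped to vibration intensity or spoken cues rather than returned as numbers.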


2013 ◽ Vol 60 (9) ◽ pp. 2636-2644
Author(s): Hussam Al-Deen Ashab ◽ Victoria A. Lessoway ◽ Siavash Khallaghi ◽ Alexis Cheng ◽ Robert Rohling ◽ ...

2009 ◽ Vol 5 (4) ◽ pp. 415-422
Author(s): Ramesh Thoranaghatte ◽ Jaime Garcia ◽ Marco Caversaccio ◽ Daniel Widmer ◽ Miguel A. Gonzalez Ballester ◽ ...

2018 ◽ Vol 5
Author(s): Kaj Helin ◽ Timo Kuula ◽ Carlo Vizzi ◽ Jaakko Karjalainen ◽ Alla Vovk
