Tracking Multiple Collocated HTC Vive Setups in a Common Coordinate System

Author(s):  
Tim Weissker ◽  
Philipp Tornow ◽  
Bernd Froehlich


Author(s):  
Kevin Lesniak ◽  
Conrad S. Tucker

The method presented in this work reduces the frequency with which virtual objects incorrectly occlude real-world objects in Augmented Reality (AR) applications. Current AR rendering methods cannot properly represent occlusion between real and virtual objects because the objects are not represented in a common coordinate system. These occlusion errors can give users an incorrect perception of their surroundings: a real-world object may go unnoticed because a virtual object incorrectly occludes it, and incorrect occlusions can distort the user's perception of depth and distance. The authors of this paper present a method that brings both real-world and virtual objects into a common coordinate system so that distant virtual objects do not obscure nearby real-world objects in an AR application. The method captures and processes RGB-D data in real time, allowing it to be used in a variety of environments and scenarios. A case study shows the effectiveness and usability of the proposed method in correctly occluding real-world and virtual objects and providing a more realistic representation of the combined real and virtual environments in an AR application. The results of the case study show that the proposed method can detect at least 20 real-world objects with the potential to be incorrectly occluded while processing and fixing occlusion errors at least 5 times per second.
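At the heart of occlusion handling with RGB-D data is a per-pixel depth test between the sensed real-world depth map and the rendered virtual depth buffer. The paper's full pipeline is more involved; the sketch below (plain NumPy; the function name and the no-reading fallback are illustrative assumptions, not the authors' implementation) shows only the basic test:

```python
import numpy as np

def mask_virtual_pixels(real_depth, virtual_depth):
    """Return a boolean mask of virtual pixels that may be drawn.

    A virtual pixel is drawn only where it is closer to the camera than
    the real-world surface captured by the RGB-D sensor; otherwise the
    nearer real object would be incorrectly occluded. Depths are in
    metres; 0 in real_depth means "no sensor reading".
    """
    real = np.asarray(real_depth, dtype=float)
    virt = np.asarray(virtual_depth, dtype=float)
    valid = real > 0                    # pixels with a depth reading
    draw = virt < real                  # virtual surface is nearer
    # Where the sensor has no reading, fall back to drawing the virtual object.
    return np.where(valid, draw, True)

# Toy 1x3 image: virtual object at 2 m, real surfaces at 1 m, 3 m, no reading.
real = np.array([[1.0, 3.0, 0.0]])
virt = np.full((1, 3), 2.0)
print(mask_virtual_pixels(real, virt))  # [[False  True  True]]
```

The first pixel is suppressed because the real surface at 1 m lies in front of the 2 m virtual surface, which is exactly the error the method is designed to avoid.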


1982 ◽  
Vol 3 ◽  
pp. 353 ◽  
Author(s):  
Henry H. Brecher

While carrying out photogrammetric measurements to provide surface velocities and elevations for use in studies of the equilibrium and dynamics of Byrd Glacier, I noted that comparison of elevations obtained by ground surveys in 1978–79 with US Geological Survey topographic maps made from 1960–62 aerial photography indicated a very large apparent lowering of the glacier surface in this short time interval. The apparent lowering varied between 50 and 150 m along a 60 km section of the glacier for which data were available (Brecher 1980). The ground measurements were estimated to be in error by no more than 3 m, but the accuracy of elevations on the maps was unknown. Because these are reconnaissance maps, however, substantial errors would not be unexpected. It was therefore necessary to obtain more accurate glacier surface elevations for 1960–62 in order to determine whether the lowering is real. Photogrammetric strip triangulations of three individual strips of photography, two taken in November 1960 and the third in February 1963, which cover the region of the greatest apparent lowering, have now been completed. The old strips were oriented to fixed points on the two “banks” of the glacier derived for this purpose from the 1978–79 photogrammetric work, thus bringing the measurements from the old and new photography into a common coordinate system. The glacier surface elevations for 1960–62 are the same as those obtained from the 1978–79 ground survey and photogrammetry. While it is difficult to give measures of accuracy of the results, since no independent data are available for comparison, internal evidence indicates that precision higher than the expected 10 m has been achieved in the measurements. It can thus be stated unambiguously that no detectable surface lowering has occurred on any of the parts of the glacier which have been investigated.


2019 ◽  
Vol 11 (21) ◽  
pp. 2469
Author(s):  
Siekański ◽  
Paśko ◽  
Malowany ◽  
Malesa

Unmanned aerial vehicles (UAVs) are widely used to protect critical infrastructure objects, and they are most often equipped with one or more RGB cameras and, sometimes, with a thermal imaging camera as well. To obtain as much information as possible from these sensors, their data should be combined or fused. This article presents a situation in which data from RGB (visible, VIS) and thermovision (infrared, IR) cameras and 3D data have been combined in a common coordinate system. A specially designed calibration target was developed to enable the geometric calibration of the IR and VIS cameras in the same coordinate system. The 3D data are compatible with the VIS coordinate system when the structure from motion (SfM) algorithm is used. The main focus of this article is providing spatial coherence between these data in the case of relative camera movement, which usually results in a miscalibration of the system. Therefore, a new algorithm for the detection of sensor system miscalibration, based on phase correlation with automatic calibration correction in real time, is introduced.
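Phase correlation, the building block the miscalibration detector relies on, estimates the translation between two images from the peak of the inverse FFT of their normalized cross-power spectrum. The following is a rough, illustrative sketch of that step only (not the authors' algorithm; the function name and test images are assumptions):

```python
import numpy as np

def phase_correlation_shift(a, b):
    """Estimate the integer (row, col) shift taking image b to image a.

    The peak of the inverse FFT of the normalized cross-power spectrum
    marks the translation; a weak or drifting peak can flag miscalibration.
    """
    R = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
    R /= np.abs(R) + 1e-12              # keep phase, discard magnitude
    corr = np.fft.ifft2(R).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap peak indices to signed shifts.
    if dy > a.shape[0] // 2:
        dy -= a.shape[0]
    if dx > a.shape[1] // 2:
        dx -= a.shape[1]
    return int(dy), int(dx)

rng = np.random.default_rng(0)
img = rng.random((64, 64))
shifted = np.roll(img, shift=(3, -5), axis=(0, 1))
print(phase_correlation_shift(shifted, img))  # (3, -5)
```

Because only the phase of the spectrum is kept, the estimate is largely insensitive to global brightness differences, which matters when comparing IR and VIS imagery.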


2019 ◽  
Vol 11 (22) ◽  
pp. 2617 ◽  
Author(s):  
Nowak ◽  
Naus ◽  
Maksimiuk

The market for small drones is developing very fast. They are used for leisure activities and exploited in commercial applications. However, there is growing concern about accidental or even criminal misuse of these platforms. Dangerous incidents with drones are appearing more often and have caused many institutions to start thinking about anti-drone solutions. In many cases, building stationary systems seems aimless, since the high cost does not correspond with, for example, the threat frequency. In such cases, mobile drone countermeasure systems seem to meet demands perfectly. In modern mobile solutions, frequency-modulated continuous-wave (FMCW) radars are frequently used as detectors. Proper cooperation of many radars demands that their measurements be brought to a common coordinate system: azimuths must be measured from the same direction (preferably north). This requires calibration, understood as determining constant corrections to the measured angles. The article presents the authors' method of fast, simultaneous calibration of many mobile FMCW radars operating in a network. It was validated using 95,000 numerical tests. The results show that the proposed method significantly improves the north orientation of the radars throughout the whole range of initial errors. Therefore, it can be successfully used in practical applications.
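The "constant correction" being estimated can be illustrated with a deliberately simplified single-radar example: if a radar observes targets at known positions, its azimuth bias is the circular mean of the measured-minus-true residuals. This is not the authors' simultaneous network method, only a sketch of the underlying idea; all names and numbers are assumptions:

```python
import math

def true_azimuth(radar_xy, target_xy):
    """Geometric azimuth from radar to target, clockwise from north, radians."""
    dx = target_xy[0] - radar_xy[0]   # east component
    dy = target_xy[1] - radar_xy[1]   # north component
    return math.atan2(dx, dy) % (2 * math.pi)

def estimate_bias(radar_xy, observations):
    """Estimate a radar's constant azimuth bias from (target_xy, measured_az) pairs.

    The bias is the circular mean of the residuals (measured - true); it is
    then subtracted from future measurements to align them to north.
    """
    s = c = 0.0
    for target_xy, measured in observations:
        r = measured - true_azimuth(radar_xy, target_xy)
        s += math.sin(r)
        c += math.cos(r)
    return math.atan2(s, c)

# Hypothetical radar with a +0.05 rad bias observing three known targets.
radar = (0.0, 0.0)
bias = 0.05
targets = [(100.0, 50.0), (-30.0, 80.0), (60.0, -90.0)]
obs = [(t, (true_azimuth(radar, t) + bias) % (2 * math.pi)) for t in targets]
print(round(estimate_bias(radar, obs), 6))  # 0.05
```

Using the circular mean (via sine and cosine sums) rather than an arithmetic mean keeps the estimate correct when residuals wrap around 2π.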


1963 ◽  
Vol 16 (4) ◽  
pp. 472-475
Author(s):  
M. G. Pearson

The previous paper described how the two components of a compound system could be combined, through the medium of a pictorial display, in a common coordinate system. This essential process can be carried out by an airborne digital computer with considerable advantage, and with a degree of refinement—e.g. data smoothing— which would be beyond the scope of the simpler manually-operated system. In other words, the computer can introduce a second mode of compound operation which differs from the simple relationship between position-fixing and D.R. systems described earlier, in that by continuously combining the incoming information from the position-fixing system and the dead-reckoning device it can deliver an output of a substantially higher quality than that of either input taken separately.
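The continuous combination of position-fixing and dead-reckoning inputs described above can be caricatured as a simple complementary blend: a dead-reckoned prediction tracks short-term motion, and each incoming fix nudges it back toward the true track. This is a minimal modern sketch, not the 1963 system's actual computation; the gain and all numbers are assumptions:

```python
def fuse(fix, dr, alpha=0.2):
    """Blend a dead-reckoned prediction with an incoming position fix.

    A small alpha trusts the smooth DR propagation for short-term motion,
    while the fixes slowly correct its long-term drift.
    """
    return dr + alpha * (fix - dr)

# Hypothetical 1-D track advancing 1 unit per step, with noisy position fixes.
errors = [0.4, -0.3, 0.5, -0.2, 0.3, -0.4]
fixes = [t + e for t, e in enumerate(errors)]
velocity = 1.0               # dead-reckoning speed estimate
est = fixes[0]
for fix in fixes[1:]:
    dr = est + velocity      # propagate the last estimate by dead reckoning
    est = fuse(fix, dr)      # correct with the latest position fix
print(round(est, 3))  # 5.1
```

The fused estimate ends much closer to the true position (5.0) than the last raw fix (4.6), which is the "substantially higher quality than either input taken separately" that the text describes.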


Author(s):  
Roman Shults ◽  
Asset Urazaliev ◽  
Andriy Annenkov ◽  
Olena Nesterenko ◽  
Oksana Kucherenko ◽  
...  

During the reconstruction and restoration of city geodetic networks, a common problem arises from the nonhomogeneity of the existing networks. In any city, local authorities operate with their own coordinate systems, and such conditions lead to inconsistency between the data of different services. The only way to overcome this problem is to create and deploy a new common coordinate system for the whole city. This approach, however, has a drawback: it requires acquiring transformation parameters between the new and old coordinate systems. Insofar as the old coordinate systems were created with different accuracies, using various equipment and measuring technologies, they cannot be considered homogeneous. This means that a classical conformal Helmert transformation cannot be used to link the different coordinate systems. In the presented paper, different approaches to transformation parameter acquisition were studied. In a case study of the Almaty city coordinate system, the following methods were researched and compared: Helmert transformation, bilinear transformation, second- and third-order regression transformations, and the fourth-order conformal polynomial transformation. It was found that none of the considered methods maintains the necessary transformation accuracy (errors exceeded 5 cm). That is why the creation of a transformation field using the finite element method (FEM) was suggested. The whole city was divided into triangles using Delaunay triangulation, and for each triangle the transformation parameters were found using an affine transformation with the necessary accuracy.
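The baseline the authors compare against, a 2-D conformal (four-parameter) Helmert transformation, can be fitted to control points by linear least squares. A minimal sketch in NumPy, using synthetic control points (all names and values are illustrative assumptions):

```python
import numpy as np

def fit_helmert_2d(src, dst):
    """Fit a 4-parameter conformal (Helmert) transform dst ~ scale*R*src + t.

    Solves for a, b, tx, ty in
        x' = a*x - b*y + tx
        y' = b*x + a*y + ty
    by linear least squares; scale = hypot(a, b), rotation = atan2(b, a).
    """
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    n = len(src)
    A = np.zeros((2 * n, 4))
    A[0::2, 0] = src[:, 0]; A[0::2, 1] = -src[:, 1]; A[0::2, 2] = 1  # x rows
    A[1::2, 0] = src[:, 1]; A[1::2, 1] = src[:, 0];  A[1::2, 3] = 1  # y rows
    params, *_ = np.linalg.lstsq(A, dst.reshape(-1), rcond=None)
    return params  # a, b, tx, ty

def apply_helmert_2d(params, pts):
    a, b, tx, ty = params
    pts = np.asarray(pts, float)
    x, y = pts[:, 0], pts[:, 1]
    return np.column_stack([a * x - b * y + tx, b * x + a * y + ty])

# Synthetic control points: rotate 30 degrees, scale 1.001, shift (1000, -500).
theta, s = np.radians(30), 1.001
R = s * np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
src = np.array([[0, 0], [100, 0], [0, 100], [70, 80]], float)
dst = src @ R.T + np.array([1000.0, -500.0])
p = fit_helmert_2d(src, dst)
print(np.allclose(apply_helmert_2d(p, src), dst))  # True
```

The conformal model has a single scale and rotation for the whole area, which is exactly why it fails on nonhomogeneous networks; the per-triangle affine transformation the authors adopt adds two more parameters per triangle and so can absorb local distortions.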


2022 ◽  
Author(s):  
Andrew Jones ◽  
F. William Townes ◽  
Didong Li ◽  
Barbara E Engelhardt

Spatially-resolved genomic technologies have allowed us to study the physical organization of cells and tissues and promise an understanding of the local interactions between cells. However, it remains difficult to precisely align spatial observations across slices, samples, scales, individuals, and technologies. Here, we propose a probabilistic model that aligns a set of spatially-resolved genomics and histology slices onto a known or unknown common coordinate system, in which the samples are registered both spatially and in terms of their phenotypic readouts (e.g., gene or protein expression levels, cell density, open chromatin regions). Our method consists of a two-layer Gaussian process: the first layer maps the observed samples' spatial locations into the common coordinate system, and the second layer maps from the common coordinate system to the observed readouts. Our approach also allows slices to be mapped to a known template coordinate space if one exists. We show that our registration approach enables complex downstream spatially-aware analyses of spatial genomics data at multiple resolutions that are impossible or inaccurate with unaligned data, including an analysis of variance, differential expression across the z-axis, and association tests across multiple data modalities.
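The two-layer structure can be illustrated with a toy example in which a known per-slice shift stands in for the first (warp) layer and ordinary GP regression stands in for the second (readout) layer. This is only a structural sketch, not the paper's model, which learns the warp as a GP jointly with the readout layer; all names, kernels, and data are assumptions:

```python
import numpy as np

def rbf(X1, X2, ls=0.8):
    """Squared-exponential kernel between two sets of 2-D locations."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)

def gp_posterior_mean(X_train, y_train, X_test, noise=1e-4, ls=0.8):
    """Posterior mean of standard GP regression."""
    K = rbf(X_train, X_train, ls) + noise * np.eye(len(X_train))
    return rbf(X_test, X_train, ls) @ np.linalg.solve(K, y_train)

# Common-frame locations on a grid; an illustrative smoothly varying readout.
gx, gy = np.meshgrid(np.linspace(0, 4, 9), np.linspace(0, 4, 9))
common = np.column_stack([gx.ravel(), gy.ravel()])
expr = np.sin(common[:, 0])                    # e.g. an expression level

# Layer 1 (stand-in): the slice is measured with a slice-specific shift; the
# paper learns this warp as a GP, here we simply undo a known shift.
shift = np.array([0.5, -0.3])
observed = common + shift                      # locations as measured on the slice
warped = observed - shift                      # aligned into the common frame

# Layer 2: GP from common coordinates to the readout, queried anywhere.
query = np.array([[1.0, 2.0], [3.0, 0.5]])
pred = gp_posterior_mean(warped, expr, query)
print(np.round(pred, 3))
```

Once all slices live in the common frame, the second-layer GP can be queried at arbitrary locations, which is what enables the downstream analyses (ANOVA, differential expression along z) mentioned above.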


2008 ◽  
Author(s):  
Janakiram Dandibhotla ◽  
Kevin Gary

In a surgical environment, many tools and objects (including the patient) are in use. In computer-aided surgery, the position of each of these tools in the operating room is critical, and each is generally tracked in its own coordinate system. To show the surgeon a real-time picture of the operating environment on a computer monitor, we need to know the position of each object with respect to one common coordinate system, which can be achieved by knowing each coordinate system and the transforms between them. The Image-Guided Surgical Toolkit (IGSTK) is a software toolkit designed to enable biomedical researchers to rapidly prototype and create new applications for image-guided surgery. Although coordinate systems and transforms are used successfully in IGSTK, there is no central data structure or repository that holds all the coordinate systems and the transforms between them. Such a data structure could give IGSTK developers more confidence in the code they write. This project develops a tool that creates such a data structure and dynamically shows changes to it, helping developers write better code.
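A central repository of this kind can be sketched as a graph whose nodes are coordinate frames and whose edges store 4x4 homogeneous transforms; the transform between any two frames is composed along a path found by breadth-first search. This is an illustrative sketch, not the project's actual tool or the IGSTK API; all class, method, and frame names are assumptions:

```python
import numpy as np
from collections import deque

class TransformGraph:
    """Frames as graph nodes; each edge stores a 4x4 homogeneous transform.

    edges[a][b] maps coordinates expressed in frame b into frame a, so any
    frame-to-frame transform can be composed along a BFS path.
    """
    def __init__(self):
        self.edges = {}

    def add_transform(self, parent, child, T):
        """Register the pose of child in parent: p_parent = T @ p_child."""
        T = np.asarray(T, float)
        self.edges.setdefault(parent, {})[child] = T
        self.edges.setdefault(child, {})[parent] = np.linalg.inv(T)

    def get_transform(self, src, dst):
        """Compose the matrix taking coordinates in src to coordinates in dst."""
        prev = {src: None}
        queue = deque([src])
        while queue:
            f = queue.popleft()
            if f == dst:
                break
            for n in self.edges.get(f, {}):
                if n not in prev:
                    prev[n] = f
                    queue.append(n)
        else:
            raise KeyError(f"no transform path from {src} to {dst}")
        path = [dst]
        while prev[path[-1]] is not None:
            path.append(prev[path[-1]])
        path.reverse()                         # src, ..., dst
        T = np.eye(4)
        for a, b in zip(path, path[1:]):
            T = self.edges[b][a] @ T           # lift coordinates from a into b
        return T

def translation(x, y, z):
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

# Hypothetical frames: a tracker sees both a probe and the patient reference.
g = TransformGraph()
g.add_transform("tracker", "probe", translation(0, 0, 100))
g.add_transform("tracker", "patient", translation(50, 0, 0))
tip = g.get_transform("probe", "patient") @ np.array([0, 0, 0, 1.0])
print(tip[:3])  # probe origin expressed in patient coordinates: (-50, 0, 100)
```

Holding every registered transform in one structure like this is what would let a developer (or a visualization tool) verify at a glance that the scene's coordinate systems are connected and consistent.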

