Repair of Geological Models Based on Multiple Material Marching Cubes

Mathematics ◽  
2021 ◽  
Vol 9 (18) ◽  
pp. 2207 ◽  
Author(s):  
Benyu Li ◽  
Deyun Zhong ◽  
Liguan Wang

In this paper, we present a multi-domain implicit surface reconstruction algorithm for geological modeling based on the labeling of voxel points. The improved algorithm assigns each voxel point a label representing the type of its geological domain and then identifies all the voxel points in the void areas. After that, it modifies the labels of the voxel points in the void areas and finally reconstructs the geological models through the Multiple Material Marching Cubes (M3C) algorithm. By setting and modifying the labels of the voxel points, the improved algorithm resolves the unexpected overlaps and voids that arise in geological modeling. Our key contributions are a label-processing method that repairs the overlap and void defects generated in geological modeling and an implementation of the improved M3C algorithm. The experimental results on several geological models show the performance of the improved method. Compared with the original method, the improved method repairs the overlap and void defects in geological modeling while preserving the original structural adjacency relationships of the geological bodies.
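The abstract gives no pseudocode for the label-repair step; the minimal Python sketch below illustrates one plausible reading of it, in which each void voxel takes the majority label of its non-void neighbours before the grid is handed to M3C. The function name `repair_void_labels`, the 26-neighbour majority rule, and the convention that label 0 marks voids are assumptions, not the authors' implementation.

```python
import numpy as np

def repair_void_labels(labels, void_label=0):
    """Reassign void voxels to the most common non-void label among their
    26-connected neighbours (hypothetical repair rule, not the paper's)."""
    repaired = labels.copy()
    void_mask = repaired == void_label
    while void_mask.any():
        changed = False
        for x, y, z in zip(*np.nonzero(void_mask)):
            # Gather the labels in the 3x3x3 neighbourhood, clipped at borders.
            nb = repaired[max(x - 1, 0):x + 2,
                          max(y - 1, 0):y + 2,
                          max(z - 1, 0):z + 2]
            nb = nb[nb != void_label]
            if nb.size:
                vals, counts = np.unique(nb, return_counts=True)
                repaired[x, y, z] = vals[np.argmax(counts)]
                changed = True
        void_mask = repaired == void_label
        if not changed:
            break  # voids with no labelled neighbours are left untouched
    return repaired
```

The repaired label grid would then be meshed with an M3C-style multi-material surface extractor, which is not reproduced here.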

Mathematics ◽  
2021 ◽  
Vol 9 (15) ◽  
pp. 1819 ◽
Author(s):  
Tiandong Shi ◽  
Deyun Zhong ◽  
Liguan Wang

The effect of geological modeling largely depends on the normal estimation results of the geological sampling points. However, because geological sampling points are sparse and unevenly distributed, normal estimation carries great uncertainty. This paper proposes a geological modeling method based on the dynamic normal estimation of sparse point clouds. The improved method consists of three stages: (1) using an improved local plane fitting method to estimate the normals of the point clouds; (2) using an improved minimum spanning tree method to redirect the normals of the point clouds; (3) using an implicit function to construct a geological model. The innovation of this method is the iterative estimation of point cloud normals. The geological engineer adjusts the normal direction of some points according to geological laws, and the method then uses these corrected normals as a reference to estimate the normals of all points. By repeating this iterative process, the normal estimation becomes increasingly accurate. Experimental results show that, compared with the original method, the improved method is better suited to normal estimation of sparse point clouds because it dynamically adjusts normals according to prior knowledge.
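As a point of reference for stage (1), the sketch below shows the standard PCA-based local plane fit for normal estimation, plus a simple reorientation against engineer-supplied reference normals. It is a baseline illustration only; the paper's improved fitting and minimum-spanning-tree redirection are not reproduced, and all function names are hypothetical.

```python
import numpy as np
from scipy.spatial import cKDTree

def estimate_normals(points, k=10):
    """Plain PCA plane fit per point: the normal is the eigenvector of the
    local covariance matrix with the smallest eigenvalue."""
    tree = cKDTree(points)
    normals = np.zeros_like(points)
    for i, p in enumerate(points):
        _, idx = tree.query(p, k=k)
        nbrs = points[idx] - points[idx].mean(axis=0)
        eigvals, eigvecs = np.linalg.eigh(nbrs.T @ nbrs)
        normals[i] = eigvecs[:, 0]  # eigenvalues are sorted ascending
    return normals

def orient_to_references(normals, ref_idx, ref_normals):
    """Flip estimated normals that disagree with engineer-supplied reference
    normals (a stand-in for the paper's MST-based redirection)."""
    for i, n_ref in zip(ref_idx, ref_normals):
        if np.dot(normals[i], n_ref) < 0:
            normals[i] = -normals[i]
    return normals
```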


2002 ◽  
Vol 2 (4) ◽  
pp. 277-284 ◽  
Author(s):  
Yutaka Ohtake ◽  
Alexander G. Belyaev

A new method for improving polygonizations of implicit surfaces with sharp features is proposed. The method is based on the observation that, given an implicit surface with sharp features, a triangle mesh whose triangles are tangent to the implicit surface at certain inner triangle points gives a better approximation of the implicit surface than the standard Marching Cubes mesh [Lorensen, W.E., and Cline, H.E., 1987, Computer Graphics (Proceedings of SIGGRAPH ’87), 21(3), pp. 163–169] (in our experiments we use VTK Marching Cubes [Schroeder, W., Martin, K., and Lorensen, W., 1998, The Visualization Toolkit: An Object-Oriented Approach to 3-D Graphics, Prentice Hall]). First, given an initial triangle mesh, its dual mesh composed of the triangle centroids is considered. Then the dual mesh is modified such that its vertices are placed on the implicit surface and the mesh dual to the modified dual mesh is considered. Finally the vertex positions of that “double dual” mesh are optimized by minimizing a quadratic energy measuring a deviation of the mesh normals from the implicit surface normals computed at the vertices of the modified dual mesh. In order to achieve an accurate approximation of fine surface features, these basic steps are combined with adaptive mesh subdivision and curvature-weighted vertex resampling. The proposed method outperforms approaches based on the mesh evolution paradigm in speed and accuracy.
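The first two steps (building the dual mesh from triangle centroids, then projecting its vertices onto the implicit surface) can be sketched as follows. This is a simplified illustration assuming the implicit function f and its gradient are available as callables; the quadratic normal-deviation optimization, the adaptive subdivision, and the curvature-weighted resampling are omitted.

```python
import numpy as np

def project_to_surface(p, f, grad_f, steps=10):
    """Move a point onto the zero level set f(x) = 0 by Newton steps
    along the gradient direction."""
    for _ in range(steps):
        g = grad_f(p)
        p = p - f(p) * g / np.dot(g, g)
    return p

def modified_dual_mesh(vertices, triangles, f, grad_f):
    """Dual vertices = triangle centroids, subsequently projected onto the
    implicit surface (the first two steps of the double-dual construction)."""
    centroids = vertices[triangles].mean(axis=1)
    return np.array([project_to_surface(c, f, grad_f) for c in centroids])
```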


2012 ◽  
Vol 546-547 ◽  
pp. 1495-1500
Author(s):  
Min Zhang ◽  
Yu Hou ◽  
Liang Wei Yao

The Apriori algorithm has several drawbacks, such as repeated scans of the database and the generation of many redundant intermediate itemsets. In this paper, we propose an improved algorithm, OApriori, that combines three techniques: (1) a pruning strategy, (2) a connection (join) strategy, and (3) a reduction of the database scanning scale. We performed extensive experiments and compared the performance of the two algorithms. The improved algorithm reduces the number of unnecessary candidate itemsets and speeds up the computation.
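For context, the classic Apriori candidate-generation step with its subset-based pruning can be sketched as below; it shows the kind of redundant candidates that a pruning strategy eliminates, but it is not the OApriori implementation itself, and the function name is hypothetical.

```python
from itertools import combinations

def generate_candidates(frequent_kminus1):
    """Join + prune steps of classic Apriori: a k-candidate is kept only if
    every one of its (k-1)-subsets is itself a frequent itemset."""
    frequent = set(frequent_kminus1)          # set of sorted tuples
    k = len(next(iter(frequent))) + 1
    candidates = set()
    for a in frequent:
        for b in frequent:
            union = tuple(sorted(set(a) | set(b)))
            if len(union) == k and all(
                tuple(sorted(s)) in frequent
                for s in combinations(union, k - 1)
            ):
                candidates.add(union)
    return candidates

# Example: frequent 2-itemsets -> pruned 3-candidates
print(generate_candidates({("a", "b"), ("a", "c"), ("b", "c")}))
```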


2020 ◽  
Vol 21 (14) ◽  
pp. 5134 ◽  
Author(s):  
Shosuke Ito ◽  
Sandra Del Bino ◽  
Tomohisa Hirobe ◽  
Kazumasa Wakamatsu

Alkaline hydrogen peroxide oxidation (AHPO) of eumelanin and pheomelanin, two major classes of melanin pigments, affords pyrrole-2,3,5-tricarboxylic acid (PTCA), pyrrole-2,3-dicarboxylic acid (PDCA) and pyrrole-2,3,4,5-tetracarboxylic acid (PTeCA) from eumelanin and thiazole-2,4,5-tricarboxylic acid (TTCA) and thiazole-4,5-dicarboxylic acid (TDCA) from pheomelanin. Quantification of these five markers by HPLC provides useful information on the quantity and structural diversity of melanins in various biological samples. HPLC analysis of these markers using the original method of 0.1 M potassium phosphate buffer (pH 2.1):methanol = 99:1 (85:15 for PTeCA) on a reversed-phase column had some problems: the column lifetime was short, and, except for the major eumelanin marker PTCA, the markers were occasionally overlapped by interfering peaks in samples containing only trace levels of these markers. These problems can be overcome by adding an ion-pair reagent for anions, such as tetra-n-butylammonium bromide (1 mM), to retard the elution of the di-, tri- and tetra-carboxylic acids. The methanol concentration was increased to 17% (30% for PTeCA), and the linearity, reproducibility, and recovery of the markers with this improved method are good to excellent. This improved HPLC method was compared to the original method using synthetic melanins, mouse hair, human hair, and human epidermal samples. In addition to PTCA, TTCA, a major marker for pheomelanin, showed excellent correlations between the two HPLC methods. The other markers showed an attenuation of the interfering peaks with the improved method. We recommend this improved HPLC method for the quantitative analysis of melanin markers following AHPO because of its simplicity, accuracy, and reproducibility.


2019 ◽  
Vol 12 (1) ◽  
pp. 1-32 ◽  
Author(s):  
Miguel de la Varga ◽  
Alexander Schaaf ◽  
Florian Wellmann

Abstract. The representation of subsurface structures is an essential aspect of a wide variety of geoscientific investigations and applications, ranging from geofluid reservoir studies, through raw material investigations, to geosequestration, as well as many branches of geoscientific research and applications in geological surveys. A wide range of methods exist to generate geological models, but the most powerful of them are only available in expensive commercial packages. We present here a fully open-source geomodeling method based on an implicit potential-field interpolation approach. The interpolation algorithm is comparable to implementations in commercial packages and is capable of constructing complex, full 3-D geological models, including fault networks, fault–surface interactions, unconformities and dome structures. The algorithm is implemented in the programming language Python, making use of an underlying library for efficient code generation (Theano) that enables direct execution on GPUs. The functionality can be separated into the core aspects required to generate 3-D geological models and additional assets for advanced scientific investigations. These assets provide the full power behind our approach, as they link to machine-learning and Bayesian inference frameworks and thus open a path to stochastic geological modeling and inversions. In addition, we provide methods to analyze model topology and to compute gravity fields on the basis of the geological models and assigned density values. In summary, we provide a basis for open scientific research using geological models, with the aim of fostering reproducible research in the field of geomodeling.
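As a rough illustration of the implicit potential-field idea (not the library's actual co-kriging implementation), the sketch below interpolates a scalar field from interface points that share per-surface field values and then classifies a voxel grid by thresholding that field. Orientation data, faults, and the Theano/GPU machinery are all omitted, and the coordinates and values are invented for the example.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical interface points: each geological surface gets one field value.
interface_pts = np.array([[0, 0, 5], [10, 0, 5], [0, 10, 5],    # horizon A
                          [0, 0, 2], [10, 0, 2], [0, 10, 2]])   # horizon B
field_values = np.array([1.0, 1.0, 1.0, 2.0, 2.0, 2.0])

# Interpolate a smooth scalar field and evaluate it on a voxel grid.
field = RBFInterpolator(interface_pts, field_values, kernel="thin_plate_spline")
xx, yy, zz = np.meshgrid(np.linspace(0, 10, 20),
                         np.linspace(0, 10, 20),
                         np.linspace(0, 10, 20), indexing="ij")
grid = np.column_stack([xx.ravel(), yy.ravel(), zz.ravel()])
scalar = field(grid).reshape(xx.shape)

# Geological units fall out of thresholding the scalar field between the
# interface values: 0 above the 1.5 iso-surface, 1 below it.
units = np.digitize(scalar, bins=[1.5])
```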


2013 ◽  
Vol 340 ◽  
pp. 867-870 ◽
Author(s):  
Quan Hou Li ◽  
Chun Yu Zhang ◽  
Yuan Feng Zhang

Visualization research is carried out here in three respects: reservoir parameter distribution, dynamic history matching with numerical simulation, and geological modeling. For reservoir parameters, prediction accuracy can be improved by comparing, analyzing, and modeling old-well data together with secondarily interpreted data. For dynamic history matching and numerical simulation, combining dynamic and static methods corrects the deficiencies of the traditional approach. For geological modeling, the characteristics of logging responses and current mathematical theory can be used to establish the models. Only by combining dynamic and static methods can an actual visualization of oil-field development be achieved.


2015 ◽  
Vol 2015 ◽  
pp. 1-12 ◽  
Author(s):  
Zhong Qu ◽  
Si-Peng Lin ◽  
Fang-Rong Ju ◽  
Ling Liu

Traditional image stitching based on SIFT feature point extraction suffers, to a certain extent, from distortion errors; the panorama becomes especially distorted when a long image sequence is composited. To create a high-quality panorama, an improved algorithm is proposed in this paper: it alters the way the reference image is selected and introduces a method to compute, for any image in the sequence, the transformation matrix that aligns it with the reference image in a common coordinate space. Additionally, the improved stitching method dynamically selects the next input image based on the number of SIFT matching points. Compared with the traditional stitching process, the improved method increases the number of matched feature points and reduces the SIFT feature detection area of the reference image. The experimental results show that the improved method not only accelerates image stitching but also reduces panoramic distortion errors, yielding a pleasing panoramic result.
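A standard SIFT-plus-RANSAC pairwise alignment, on which such a stitching pipeline builds, might look like the sketch below; the 0.75 ratio test and the 5-pixel RANSAC threshold are conventional defaults rather than the authors' settings, and the paper's reference-selection and dynamic ordering heuristics are not shown.

```python
import cv2
import numpy as np

sift = cv2.SIFT_create()
matcher = cv2.BFMatcher(cv2.NORM_L2)

def pairwise_homography(img_a, img_b):
    """Homography mapping img_b into img_a's coordinate frame, estimated from
    ratio-tested SIFT matches with RANSAC."""
    kp_a, des_a = sift.detectAndCompute(img_a, None)
    kp_b, des_b = sift.detectAndCompute(img_b, None)
    good = [m for m, n in matcher.knnMatch(des_b, des_a, k=2)
            if m.distance < 0.75 * n.distance]
    src = np.float32([kp_b[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_a[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H, len(good)

# Chaining: image k is brought into the reference frame r via the product of
# pairwise homographies H_{r<-k} = H_{r<-r+1} @ ... @ H_{k-1<-k}.
```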


Author(s):  
Hans J. Deeg

The Kwee–van Woerden (KvW) method for the determination of eclipse minimum times has been a staple in eclipsing binary research for decades, due to its simplicity and its independence from external input parameters. However, its estimates of the timing error have been known to be of low reliability. During the analysis of very precise photometry of CM Draconis eclipses from TESS space mission data, KvW's original equation for the timing error estimate produced numerical errors, which pointed to a fundamental problem in that equation. This contribution introduces an improved way to calculate the timing error with the KvW method. A code that implements this improved method, together with several further updates to the original method, is presented as well. An example application to the CM Draconis light curves from TESS is given, where we show that its timing error estimates of about 1 second are in excellent agreement with error estimates obtained by other means.
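A minimal sketch of the classical KvW idea (not the improved error estimate introduced in this contribution) is given below: the light curve is mirrored about a trial minimum time, the summed squared flux differences S(t) are evaluated at several trial times, and a parabola fitted to S(t) locates the minimum. The real method works on an evenly spaced, symmetric window of the folded branches; the interpolation shortcut here is a simplification, and the function names are hypothetical.

```python
import numpy as np

def kvw_s(time, flux, t_ref):
    """Sum of squared differences between the light curve and its mirror
    image about the trial minimum time t_ref (time assumed sorted)."""
    left = time < t_ref
    # Reflect the right branch onto the left and interpolate the left-branch
    # fluxes at the mirrored times.
    mirrored_t = 2 * t_ref - time[~left]
    mirrored_f = np.interp(mirrored_t, time[left], flux[left])
    return np.sum((flux[~left] - mirrored_f) ** 2)

def kvw_minimum(time, flux, trial_times):
    """Fit a parabola to S(t) over the trial times; its vertex approximates
    the eclipse minimum time."""
    s = np.array([kvw_s(time, flux, t) for t in trial_times])
    a, b, c = np.polyfit(trial_times, s, 2)
    return -b / (2 * a)
```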

