An efficient method to improve the quality of tetrahedron mesh with MFRC

2021
Vol 11 (1)
Author(s):
Yuzheng Ma
Monan Wang

Abstract. In this paper, we propose a novel operation, which we call MFRC (multi-face reconstruction), that reconstructs the tetrahedra within a certain region. The flip operation is an important component of existing tetrahedral mesh improvement methods; however, because a flip affects only a limited area, its improvement of mesh quality is also very limited. The proposed MFRC algorithm addresses this problem: MFRC can reconstruct the local mesh over a larger region and can find the optimal tetrahedral subdivision of the target area within acceptable time complexity. Based on the MFRC algorithm, we therefore combined other operations, including smoothing, edge removal, face removal, and vertex insertion/deletion, to develop an effective mesh quality improvement method. Numerical experiments on dozens of meshes show that the algorithm effectively improves the low-quality elements in a tetrahedral mesh while reducing running time, which is significant for the quality improvement of large-scale meshes.
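Mesh improvement methods such as the one above are driven by a per-element quality measure that flags slivers and near-degenerate tetrahedra. The abstract does not specify which measure MFRC uses, so as an illustration only, the following minimal sketch computes a common normalized volume-to-edge-length quality that equals 1 for a regular tetrahedron and approaches 0 for degenerate elements; the function name and normalization are assumptions, not from the paper.

```python
import numpy as np

def tet_quality(p0, p1, p2, p3):
    """Normalized volume/edge-length quality of a tetrahedron.

    Returns a value in (0, 1]: 1 for the regular tetrahedron,
    approaching 0 for degenerate (sliver) elements.
    """
    p0, p1, p2, p3 = map(np.asarray, (p0, p1, p2, p3))
    # Signed volume via the scalar triple product, made positive
    vol = abs(np.dot(p1 - p0, np.cross(p2 - p0, p3 - p0))) / 6.0
    edges = [p1 - p0, p2 - p0, p3 - p0, p2 - p1, p3 - p1, p3 - p2]
    # Root-mean-square edge length over the six edges
    l_rms = np.sqrt(sum(np.dot(e, e) for e in edges) / 6.0)
    # 6*sqrt(2) normalizes so a regular tetrahedron scores exactly 1
    return 6.0 * np.sqrt(2.0) * vol / l_rms**3
```

An improvement pass would evaluate this for every element and target those below a threshold for local reconstruction.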

2021
Vol 11 (12)
pp. 5543
Author(s):
Ning Xi
Yinjie Sun
Lei Xiao
Gang Mei

Mesh quality is a critical issue in numerical computing because it directly impacts both computational efficiency and accuracy. Tetrahedral meshes are widely used in engineering and scientific applications. However, large-scale and complicated application scenarios involve very large numbers of tetrahedra, and in this case improving mesh quality is computationally expensive. Laplacian mesh smoothing is a simple mesh optimization method that improves mesh quality by relocating nodes. In this paper, by exploiting the parallelism of the modern graphics processing unit (GPU), we design a parallel adaptive Laplacian smoothing algorithm for improving the quality of large-scale tetrahedral meshes. In the proposed adaptive algorithm, we use the aspect ratio as a metric to judge mesh quality after each iteration, ensuring that every smoothing step improves the mesh. The adaptive algorithm thereby avoids the shortcoming of the ordinary Laplacian algorithm, which can create invalid elements in concave regions. We conducted five groups of comparative experiments to evaluate the performance of the proposed parallel algorithm. The results demonstrate that the proposed adaptive algorithm is up to 23 times faster than the serial algorithms, and that the quality of the tetrahedral mesh is satisfactorily improved after adaptive Laplacian smoothing. Compared with the ordinary Laplacian algorithm, the proposed adaptive Laplacian algorithm is more widely applicable and can effectively handle tetrahedra of extremely poor quality. This indicates that the proposed parallel algorithm can be applied to improve mesh quality in large-scale and complicated application scenarios.
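The key idea of the adaptive variant is to accept a Laplacian move (node to centroid of its neighbors) only when it does not degrade the worst quality among the node's incident tetrahedra. The following serial Python sketch illustrates that accept/reject logic; the helper quality function, the function names, and the mesh representation are assumptions for illustration, not the paper's GPU implementation (which also uses the aspect ratio rather than this volume-based measure).

```python
import numpy as np

def _tet_quality(v):
    """Normalized volume/edge quality: 1 for a regular tet, ~0 for slivers."""
    vol = abs(np.dot(v[1] - v[0], np.cross(v[2] - v[0], v[3] - v[0]))) / 6.0
    pairs = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
    l_rms = np.sqrt(sum(np.dot(v[a] - v[b], v[a] - v[b]) for a, b in pairs) / 6.0)
    return 6.0 * np.sqrt(2.0) * vol / l_rms**3

def adaptive_laplacian_step(nodes, tets, interior):
    """One adaptive Laplacian smoothing pass (serial sketch).

    nodes    : (N, 3) float array of coordinates, modified in place
    tets     : list of 4-tuples of node indices
    interior : indices of nodes that are allowed to move
    """
    # Collect, for each movable node, its neighbor nodes and incident tets
    nbrs = {i: set() for i in interior}
    incident = {i: [] for i in interior}
    for t in tets:
        for i in t:
            if i in nbrs:
                nbrs[i].update(j for j in t if j != i)
                incident[i].append(t)
    for i in interior:
        if not nbrs[i]:
            continue
        old = nodes[i].copy()
        before = min(_tet_quality(nodes[list(t)]) for t in incident[i])
        nodes[i] = nodes[list(nbrs[i])].mean(axis=0)   # Laplacian move
        after = min(_tet_quality(nodes[list(t)]) for t in incident[i])
        if after < before:                             # adaptive check:
            nodes[i] = old                             # reject degrading moves
    return nodes
```

In the GPU version described in the abstract, the per-node loop would be mapped to threads, with a coloring or gather/scatter scheme to avoid neighboring nodes moving concurrently.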


2021
Author(s):
Wei Peng
Xinguang Wu
Yidong Bao
Chaoyang Zhang
Weixi Ji

Abstract. Hexahedral meshes are of great value in the analysis of mechanical structures, and mesh quality has an important impact on the efficiency and accuracy of the analysis. This paper presents a quality improvement method for hexahedral meshes that consists of node classification, geometry-constrained regularization of single hexahedra, and local hexahedral mesh stitching. The nodes are divided into different types, and the corresponding geometric constraints are enforced during single-hexahedron regularization to preserve the geometric shape of the original mesh. In contrast to global optimization strategies, we perform the hexahedral mesh stitching operation within a few local regions surrounding elements of undesired quality, which effectively improves mesh quality in less time. Quality improvements of hexahedral meshes generated by a variety of methods are presented to demonstrate the effectiveness of our method.


2008
Vol 33-37
pp. 833-838
Author(s):
Yoshitaka Wada
Takuji Hayashi
Masanori Kikuchi
Fei Xu

Due to increasingly complex and severe design restrictions, more effective and faster finite element analyses are in demand. There are several ways to make FE analysis efficient: parallel computing, fast iterative or direct solvers, adaptive analysis, and so on. One of the most effective approaches is the combination of adaptive analysis and a multigrid iterative solver, because adaptive analysis requires several meshes with different resolutions, and a multigrid solver uses such meshes to accelerate its computation. However, the convergence of a multigrid solver is strongly affected by the initial shape of each element. An effective mesh improvement method, a combination of mesh coarsening and refinement, is proposed here. Applying the method to an initial mesh yields a good mesh, and the improved initial mesh achieves better convergence.


2021
Author(s):
Gregory Wagner
Andre Souza
Adeline Hillier
Ali Ramadhan
Raffaele Ferrari

Parameterizations of turbulent mixing in the ocean surface boundary layer (OSBL) are key Earth System Model (ESM) components that modulate the communication of heat and carbon between the atmosphere and ocean interior. OSBL turbulence parameterizations are formulated in terms of unknown free parameters estimated from observational or synthetic data. In this work we describe the development and use of a synthetic dataset called the "LESbrary" generated by a large number of idealized, high-fidelity, limited-area large eddy simulations (LES) of OSBL turbulent mixing. We describe how the LESbrary design leverages a detailed understanding of OSBL conditions derived from observations and large scale models to span the range of realistically diverse physical scenarios. The result is a diverse library of well-characterized "synthetic observations" that can be readily assimilated for the calibration of realistic OSBL parameterizations in isolation from other ESM model components. We apply LESbrary data to calibrate free parameters, develop prior estimates of parameter uncertainty, and evaluate model errors in two OSBL parameterizations for use in predictive ESMs.


2017
Vol 10 (5)
pp. 2031-2055
Author(s):
Thomas Schwitalla
Hans-Stefan Bauer
Volker Wulfmeyer
Kirsten Warrach-Sagi

Abstract. Increasing computational resources and the demands of impact modelers, stakeholders, and society call for seasonal and climate simulations at convection-permitting resolution. So far such resolution is achieved only with a limited-area model whose results are affected by its zonal and meridional boundaries. Here, we present the setup of a latitude-belt domain that reduces disturbances originating from the western and eastern boundaries and therefore allows studying the impact of model resolution and physical parameterization. The Weather Research and Forecasting (WRF) model coupled to the NOAH land–surface model was operated during July and August 2013 at two horizontal resolutions, namely 0.03° (HIRES) and 0.12° (LOWRES). Both simulations were forced by European Centre for Medium-Range Weather Forecasts (ECMWF) operational analysis data at the northern and southern domain boundaries, and by the high-resolution Operational Sea Surface Temperature and Sea Ice Analysis (OSTIA) data at the sea surface. The simulations are compared to the operational ECMWF analysis for the representation of large-scale features. To analyze the simulated precipitation, the operational ECMWF forecast, the CPC MORPHing technique (CMORPH), and the ENSEMBLES gridded observational precipitation data set (E-OBS) were used as references. Analyzing pressure, geopotential height, wind, and temperature fields as well as precipitation revealed (1) a benefit of the higher resolution in terms of reduced monthly biases and root mean square error and an improved Pearson skill score, and (2) deficiencies in the physical parameterizations leading to notable biases in distinct regions, such as the polar Atlantic for the LOWRES simulation and the North Pacific and Inner Mongolia for both resolutions. In summary, the application of a latitude belt at convection-permitting resolution shows promising results that are beneficial for future seasonal forecasting.


2012
Vol 27 (1)
pp. 124-140
Author(s):
Bin Liu
Lian Xie

Abstract. Accurately forecasting a tropical cyclone's (TC) track and intensity remains one of the top priorities in weather forecasting. A dynamical downscaling approach based on the scale-selective data assimilation (SSDA) method is applied to demonstrate its effectiveness in TC track and intensity forecasting. The SSDA approach retains the merits of global models in representing large-scale environmental flows and of regional models in describing small-scale characteristics. The regional model is driven from the model domain interior by assimilating large-scale flows from global models, as well as from the model lateral boundaries by conventional sponge-zone relaxation. Using Hurricane Felix (2007) as a demonstration case, it is shown that, by assimilating large-scale flows from the Global Forecast System (GFS) forecasts into the regional model, the SSDA experiments perform better than both the original GFS forecasts and the control experiments, in which the regional model is driven only by lateral boundary conditions. The overall mean track forecast error for the SSDA experiments is reduced by over 40% relative to the control experiments and by about 30% relative to the GFS forecasts. In terms of TC intensity, benefiting from higher grid resolution that better represents regional and small-scale processes, both the control and SSDA runs outperform the GFS forecasts. The SSDA runs show approximately 14% less overall mean intensity forecast error than the control runs. It should be noted that, for the Felix case, the advantage of SSDA becomes more evident for forecasts with a lead time longer than 48 h.

