Density inversion of selected microgravity anomalies using L2-smoothing and minimum support focusing stabilizers

2021, Vol. 51(1), pp. 63-81
Author(s): Ivan ZVARA, Roman PAŠTEKA, Roland KARCOL

Interpretation and inversion of microgravity anomalies are among the important tasks of near-surface geophysics, mostly in the detection of cavities in engineering, environmental and archaeological applications. One of the most widely used concepts of inversion in applied gravimetry is based on the approximation of the model space by means of 2D or 3D elementary sources, with the aim of estimating their densities by solving a corresponding linear equation system. Several approaches have been published that try to obtain correct and realistic results describing the real parameters of the sources. In this contribution we analyse the properties of two additional functionals, which describe additional properties of the sought solution, namely the so-called L2-smoothing and minimum support focusing stabilizers. For the inversion itself, we used the regularized conjugate gradient method. We studied the properties of these two stabilizers on one synthetic model and one real-world dataset (microgravity data from the St. Nicholas church in Trnava). The results show that the proposed algorithm with the minimum support stabilizer can generate satisfactory model results, from which the real geometry, dimensions and physical properties of the interpreted cavities can be described.
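The two stabilizers can be illustrated with a minimal numerical sketch (an illustration under stated assumptions, not the authors' implementation): the linear system d = A·m over the elementary-source densities m is regularized either with a quadratic L2-smoothing term ||L m||² or with the minimum support functional Σ mᵢ²/(mᵢ² + e²), the latter approximated by iterative reweighting; the function name invert, the focusing parameter e and the outer-iteration count are illustrative choices.

```python
# Minimal sketch of L2-smoothing vs. minimum support (MS) stabilization
# for a linear density inversion d = A m (assumed, not the authors' code).
import numpy as np
from scipy.sparse.linalg import cg

def invert(A, d, alpha, stabilizer="l2", e=1e-3, n_outer=10):
    """Minimise ||A m - d||^2 + alpha * s(m) with a regularized CG solver."""
    n = A.shape[1]
    L = np.eye(n) - np.eye(n, k=1)            # simple first-difference (smoothing) operator
    m = np.zeros(n)
    for _ in range(n_outer):
        if stabilizer == "l2":
            R = L.T @ L                        # quadratic L2-smoothing stabilizer
        else:                                  # "ms": minimum support, reweighted
            R = np.diag(1.0 / (m**2 + e**2))   # s_MS(m) ~ ||W_e m||^2, W_e built from current m
        lhs = A.T @ A + alpha * R              # normal equations of the regularized problem
        rhs = A.T @ d
        m, _ = cg(lhs, rhs, x0=m)              # conjugate gradient solve
        if stabilizer == "l2":
            break                              # quadratic case: no reweighting needed
    return m
```

In the quadratic L2-smoothing case a single solve is enough and typically yields smooth, smeared density distributions, whereas the reweighted minimum support term progressively focuses the solution into compact bodies, which is consistent with the behaviour reported in the abstract for the cavity geometry.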

2016, Vol. 78(8-2)
Author(s): Norma Alias, Nadia Nofri Yeni Suhari, Hafizah Farhah Saipan Saipol, Abdullah Aysh Dahawi, Masyitah Mohd Saidi, et al.

This paper proposes several real-life applications of big data analytics using parallel computing software. The parallel computing software under consideration are Parallel Virtual Machine, MATLAB Distributed Computing Server and Compute Unified Device Architecture, used to simulate the big data problems. Parallel computing is able to overcome the poor runtime, speedup and efficiency of sequential computing. The mathematical models for the big data analytics are based on partial differential equations; their discretization yields large sparse matrices and the corresponding linear equation system. Iterative numerical schemes are used to solve the problems, and the computational process is summarized in a parallel algorithm. The parallel algorithm development is based on domain decomposition of the problems and on the architecture of the different parallel computing software. The parallel performance for distributed and shared memory architectures is evaluated in terms of speedup, efficiency, effectiveness and temporal performance.
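As a concrete illustration of this pipeline (an assumed sketch, not the paper's PVM/MDCS/CUDA implementations), the following Python code discretizes a 1D Poisson equation into a sparse linear system, solves it with a block Jacobi iteration whose row blocks are distributed over worker processes as a simple domain decomposition, and reports speedup and efficiency; the helper names jacobi_block and parallel_jacobi, the toy problem size and the process counts are illustrative assumptions.

```python
# Assumed sketch: PDE -> sparse linear system -> block Jacobi iteration with
# a simple domain decomposition over processes, plus parallel metrics.
import time
import numpy as np
from multiprocessing import Pool
from scipy.sparse import diags

def jacobi_block(args):
    """Jacobi update for one block of unknowns: x_i = (b_i - (A x)_i + a_ii x_i) / a_ii."""
    A_rows, b_rows, d_rows, x, rows = args
    Ax = A_rows @ x
    return (b_rows - Ax + d_rows * x[rows]) / d_rows

def parallel_jacobi(A, b, n_procs=4, n_iter=200):
    A = A.tocsr()
    d = A.diagonal()
    blocks = np.array_split(np.arange(len(b)), n_procs)      # domain decomposition
    A_blk = [A[rows, :] for rows in blocks]
    x = np.zeros(len(b))
    with Pool(n_procs) as pool:
        for _ in range(n_iter):
            tasks = [(A_blk[i], b[rows], d[rows], x, rows) for i, rows in enumerate(blocks)]
            x = np.concatenate(pool.map(jacobi_block, tasks)) # synchronous global update
    return x

if __name__ == "__main__":
    n = 2000
    A = diags([-1, 2, -1], [-1, 0, 1], shape=(n, n))          # 1D Poisson stencil
    b = np.ones(n)

    t0 = time.perf_counter(); parallel_jacobi(A, b, n_procs=1); t_seq = time.perf_counter() - t0
    t0 = time.perf_counter(); parallel_jacobi(A, b, n_procs=4); t_par = time.perf_counter() - t0

    speedup = t_seq / t_par           # S_p = T_1 / T_p
    efficiency = speedup / 4          # E_p = S_p / p
    print(f"speedup = {speedup:.2f}, efficiency = {efficiency:.2f}")
```

For such a small toy problem the inter-process communication can dominate, so the measured speedup may fall well below the ideal value p; effectiveness and temporal performance, as commonly defined from the same timings (S_p/(p·T_p) and 1/T_p), can be derived analogously.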


2021
Author(s): Maike Offer, Riccardo Scandroglio, Daniel Draebing, Michael Krautblatter

Warming of permafrost in steep rock walls decreases their mechanical stability and can trigger rockfalls and rockslides. However, the direct link between climate change and permafrost degradation is seldom quantified with precise monitoring techniques and long-term time series. Where boreholes are not possible, laboratory-calibrated Electrical Resistivity Tomography (ERT) is presumably the most accurate quantitative permafrost monitoring technique, providing a sensitive record for frozen vs. unfrozen bedrock. Recently, 4D inversions have also allowed quantification of the frozen bedrock extent and of its changes with time (Scandroglio et al., in review).

In this study we (i) evaluate the influence of the inversion parameters on the volumes and (ii) connect the volumetric changes with measured mechanical consequences.

The ERT time series was recorded between 2006 and 2019 in steep bedrock at the permafrost-affected Steintälli Ridge (3100 m asl). 205 accurately positioned drilled-in steel electrodes in 5 parallel lines across the rock ridge have been repeatedly measured with similar hardware and are compared to a laboratory temperature-resistivity (T–ρ) calibration of water-saturated samples from the field. Inversions were conducted using the open-source software BERT, for the first time with the aim of estimating permafrost volumetric changes over a decade.

(i) Here we present a sensitivity analysis of the outcomes by testing various plausible inversion set-ups. Results are computed with different input data filters, data error models, regularization parameters (λ), model roughness reweighting and time-lapse constraints. The model with the largest permafrost degradation was obtained without any time-lapse constraints, whereas constraining each model with the prior measurement results in the smallest degradation. Important changes are also connected to the data error estimation, while other settings seem to have less influence on the frozen volume. All inversions confirmed a drastic permafrost degradation in the last 13 years, with an average reduction of 3,900±600 m³ (60±10% of the starting volume), in good agreement with the measured increase in air temperature.

(ii) The average bedrock thawing rate of ~300 m³/a is expected to significantly influence the stability of the ridge. Resistivity changes are especially evident on the south-west exposed side and in the core of the ridge, and are here connected to deformations measured with a tape extensometer in order to precisely estimate the mechanical consequences of bedrock warming.

In summary, the strong degradation of permafrost in the last decade is here confirmed, since the inversion settings have only a minor influence on the volume quantification. Internal thermal dynamics need to be correlated with measured external deformation for a correct interpretation of the stability consequences. These results are a fundamental benchmark for evaluating mountain permafrost degradation in relation to climate change and demonstrate the key role of temperature-calibrated 4D ERT.

Reference:
Scandroglio, R. et al. (in review) '4D-Quantification of alpine permafrost degradation in steep rock walls using a laboratory-calibrated ERT approach', Near Surface Geophysics.
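A minimal numpy sketch (an assumption, not the authors' BERT workflow) of the volume-quantification step: cells of an inverted resistivity model are classified as frozen where their resistivity exceeds a laboratory-calibrated frozen/unfrozen threshold, the corresponding cell volumes are summed, and two time steps are differenced to obtain an average thawing rate. The threshold value, mesh and resistivities below are toy numbers.

```python
# Assumed sketch of frozen-volume quantification from calibrated ERT models.
import numpy as np

def frozen_volume(resistivity, cell_volumes, rho_frozen_threshold):
    """Sum the volume of all model cells classified as frozen bedrock."""
    frozen = resistivity >= rho_frozen_threshold   # frozen rock: higher resistivity
    return float(np.sum(cell_volumes[frozen]))

# Toy example: two inversion results (ohm*m) on the same mesh, assumed threshold.
rho_2006 = np.array([9e3, 2.5e4, 4.0e4, 1.2e4, 3.0e3])
rho_2019 = np.array([8e3, 1.1e4, 2.0e4, 6.0e3, 2.5e3])
vol      = np.array([1.5, 2.0, 1.8, 2.2, 1.7]) * 1e3   # m^3 per cell
rho_threshold = 1.0e4                                   # assumed lab-calibrated value

v0 = frozen_volume(rho_2006, vol, rho_threshold)
v1 = frozen_volume(rho_2019, vol, rho_threshold)
rate = (v0 - v1) / 13.0                                 # m^3 per year over 2006-2019
print(f"frozen volume: {v0:.0f} -> {v1:.0f} m^3, thawing rate ~{rate:.0f} m^3/a")
```

In the study itself this classification relies on the laboratory T–ρ calibration of water-saturated samples, and the sensitivity of the resulting volumes to the inversion set-up is exactly what part (i) quantifies.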


2021, Vol. 1(1), pp. 119-123
Author(s): Nurhayati Abbas, Nancy Katili, Dwi Hardianty Djoyosuroto

This research is motivated by the lack of mathematics teaching materials that enable students to learn on their own. Such teaching materials can be created by teachers, as they are the ones who know their students' characteristics. Further, learning materials are a set of materials (information, tools, or texts) that can help teachers and students carry out the learning process. The two-variable linear equation system (SPLDV) is one of the mathematics topics taught to eighth-grade students of junior high school; it contains problems related to daily life. However, most students still find this material difficult to master. Therefore, it is necessary to develop SPLDV teaching materials that can help students learn and solve problems, and that can also serve as examples for teachers in developing other materials. This research aimed to develop problem-based SPLDV teaching materials. The research method follows the Four-D Model of Thiagarajan, Semmel, and Semmel (1974), consisting of defining, designing, developing, and disseminating. The results showed that the problem-based SPLDV teaching materials could be used in learning activities, as both students and teachers responded positively after the materials had passed expert assessment. This study also suggests that teachers use this teaching material and adapt the approach to other similar topics.
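As an illustration of the kind of daily-life SPLDV problem such materials address, a short worked sketch follows; the shopping scenario, prices and variable names are illustrative assumptions, not taken from the developed materials.

```python
# Illustrative SPLDV (two-variable linear system) daily-life problem (assumed):
#   2 pencils + 3 notebooks cost 19,000 rupiah
#   4 pencils + 1 notebook  cost 13,000 rupiah
# Solve for the price of one pencil (x) and one notebook (y).
import numpy as np

A = np.array([[2.0, 3.0],
              [4.0, 1.0]])
b = np.array([19000.0, 13000.0])

x, y = np.linalg.solve(A, b)
print(f"pencil = {x:.0f} rupiah, notebook = {y:.0f} rupiah")  # pencil = 2000, notebook = 5000
```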

