Analysis of depth resolution in potential-field inversion

Geophysics ◽  
2005 ◽  
Vol 70 (6) ◽  
pp. A1-A11 ◽  
Author(s):  
Maurizio Fedi ◽  
Per Christian Hansen ◽  
Valeria Paoletti

We study the inversion of potential fields and evaluate the degree of depth resolution achievable for a given problem. To this end, we introduce a powerful new tool: the depth-resolution plot (DRP). The DRP allows a theoretical study of how much the depth resolution in a potential-field inversion is influenced by the way the problem is discretized and regularized. The DRP also allows a careful study of the influence of various kinds of ambiguities, such as those from data errors or of a purely algebraic nature. The achievable depth resolution is related to the given discretization, regularization, and data noise level. We compute the DRP by means of the singular-value decomposition (SVD) or its generalization (GSVD), depending on the particular regularization method chosen. To illustrate the use of the DRP, we assume a source volume of specified depth and horizontal extent in which the solution is piecewise constant within a 3D grid of blocks. We consider various linear regularization terms in a Tikhonov (damped least-squares) formulation, some based on using higher-order derivatives in the objective function. DRPs are illustrated for both synthetic and real data. Our analysis shows that if the algebraic ambiguity is not too large and a suitable smoothing norm is used, some depth resolution can be obtained without resorting to any subjective choice of depth weighting.
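The SVD machinery behind such an analysis can be sketched briefly. This is a hypothetical 1-D toy problem in NumPy, not the authors' 3-D block model: a Gaussian smoothing kernel stands in for the potential-field forward operator, and `lam` and the noise level are illustrative. It shows how the Tikhonov filter factors damp the poorly resolved small-singular-value components:

```python
import numpy as np

# Toy ill-posed problem: a Gaussian smoothing kernel stands in for the
# potential-field forward operator (illustrative only).
rng = np.random.default_rng(0)
n = 50
t = np.linspace(0, 1, n)
A = np.exp(-((t[:, None] - t[None, :]) ** 2) / 0.1**2)
x_true = np.sin(2 * np.pi * t)
b = A @ x_true + 1e-3 * rng.standard_normal(n)   # noisy data

# Tikhonov (damped least-squares) solution expressed through the SVD:
# components with singular values s_i << lam are filtered out.
U, s, Vt = np.linalg.svd(A)
lam = 1e-2                                       # damping parameter (illustrative)
f = s**2 / (s**2 + lam**2)                       # Tikhonov filter factors in (0, 1]
x_reg = Vt.T @ ((s / (s**2 + lam**2)) * (U.T @ b))   # filtered SVD solution
```

The filter factors `f` make the resolution trade-off explicit: components with large singular values pass almost unchanged, while noise-dominated small-singular-value components are suppressed.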

Geophysics ◽  
2014 ◽  
Vol 79 (4) ◽  
pp. A33-A38 ◽  
Author(s):  
Valeria Paoletti ◽  
Per Christian Hansen ◽  
Mads Friis Hansen ◽  
Maurizio Fedi

In potential-field inversion, careful management of singular value decomposition components is crucial for obtaining information about the source distribution with respect to depth. In principle, the depth-resolution plot provides a convenient visual tool for this analysis, but its computational complexity has hitherto prevented application to large-scale problems. To analyze depth resolution in such problems, we developed a variant, ApproxDRP, which is based on an iterative algorithm and is therefore suited to large-scale problems: it avoids matrix factorizations and the associated demands on memory and computing time. We used the ApproxDRP to study retrievable depth resolution in inversion of the gravity field of the Neapolitan Volcanic Area. Our main contribution is the combined use of the Lanczos bidiagonalization algorithm, established in the scientific computing community, and the depth-resolution plot defined in the geoscience community.
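A minimal dense-matrix sketch of Lanczos (Golub-Kahan) bidiagonalization follows. The key property, which is what makes an ApproxDRP-style approach feasible at scale, is that the iteration needs only matrix-vector products with A and Aᵀ; the toy matrix here is illustrative, and a large-scale implementation would supply those products matrix-free:

```python
import numpy as np

def golub_kahan(A, b, k):
    """k steps of Golub-Kahan (Lanczos) bidiagonalization started from b.

    Returns U (m x (k+1)) and V (n x k) with orthonormal columns and a lower
    bidiagonal B ((k+1) x k) such that A @ V = U @ B. Only products with
    A and A.T are used, so A never needs to be factorized.
    """
    m, n = A.shape
    U = np.zeros((m, k + 1))
    V = np.zeros((n, k))
    B = np.zeros((k + 1, k))
    U[:, 0] = b / np.linalg.norm(b)
    for j in range(k):
        r = A.T @ U[:, j]
        if j > 0:
            r -= B[j, j - 1] * V[:, j - 1]
        B[j, j] = np.linalg.norm(r)          # alpha_j
        V[:, j] = r / B[j, j]
        p = A @ V[:, j] - B[j, j] * U[:, j]
        B[j + 1, j] = np.linalg.norm(p)      # beta_{j+1}
        U[:, j + 1] = p / B[j + 1, j]
    return U, B, V

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 20))            # toy operator (illustrative)
b = rng.standard_normal(30)
U, B, V = golub_kahan(A, b, 5)
```

After k steps, the singular values of the small bidiagonal B approximate the dominant singular values of A, which is the information a depth-resolution analysis needs.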


Geophysics ◽  
2006 ◽  
Vol 71 (5) ◽  
pp. U67-U76 ◽  
Author(s):  
Robert J. Ferguson

The possibility of improving regularization/datuming of seismic data is investigated by treating wavefield extrapolation as an inversion problem. Weighted, damped least squares is then used to produce the regularized/datumed wavefield. Regularization/datuming is extremely costly because the Hessian must be computed, so an efficient approximation is introduced. Approximation is achieved by computing a limited number of diagonals in the operators involved. Real and synthetic data examples demonstrate the utility of this approach. For synthetic data, regularization/datuming is demonstrated for large extrapolation distances using a highly irregular recording array. Without approximation, regularization/datuming returns a regularized wavefield with reduced operator artifacts when compared to a nonregularizing method such as generalized phase shift plus interpolation (PSPI). Approximate regularization/datuming returns a regularized wavefield at approximately two orders of magnitude less cost; it is dip limited, though in a controllable way, compared to the full method. The Foothills structural data set, a freely available data set from the Rocky Mountains of Canada, demonstrates application to real data. The data have highly irregular sampling along the shot coordinate, and they suffer from significant near-surface effects. Approximate regularization/datuming returns common receiver data that are superior in appearance compared to conventional datuming.
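The weighted, damped least-squares solve at the core of this formulation can be sketched as follows. The operator `A`, weights `W`, and damping `eps` are illustrative stand-ins, not the paper's extrapolation operators, and the full Hessian is formed here for clarity, whereas the paper's speedup keeps only a limited number of its diagonals:

```python
import numpy as np

# Weighted, damped least squares: x = (A^H W A + eps I)^{-1} A^H W b.
rng = np.random.default_rng(1)
m, n = 40, 30
A = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))  # complex, phase-shift-like toy operator
x_true = rng.standard_normal(n)
b = A @ x_true                            # noise-free synthetic data
W = np.diag(rng.uniform(0.5, 1.0, m))     # data weights (e.g. per-trace reliability)
eps = 1e-6                                # damping stabilizes the Hessian

H = A.conj().T @ W @ A + eps * np.eye(n)        # damped Hessian (full here; banded in the paper)
x = np.linalg.solve(H, A.conj().T @ W @ b)      # regularized wavefield estimate
```

With noise-free data and small damping, the solve recovers the input model almost exactly; the damping matters when `A` is poorly conditioned or the data are noisy.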


Geophysics ◽  
1989 ◽  
Vol 54 (4) ◽  
pp. 497-507 ◽  
Author(s):  
Jorge W. D. Leão ◽  
João B. C. Silva

We present a new approach to perform any linear transformation of gridded potential field data using the equivalent‐layer principle. It is particularly efficient for processing areas with a large amount of data. An N × N data window is inverted using an M × M equivalent layer, with M greater than N so that the equivalent sources extend beyond the data window. Only the transformed field at the center of the data window is computed by premultiplying the equivalent source matrix by the row of the Green’s matrix (associated with the desired transformation) corresponding to the center of the data window. Since the inversion and the multiplication by the Green’s matrix are independent of the data, they are performed beforehand and just once for given values of N, M, and the depth of the equivalent layer. As a result, a grid operator for the desired transformation is obtained which is applied to the data by a procedure similar to discrete convolution. The application of this procedure in reducing synthetic anomalies to the pole and computing magnetization intensity maps shows that grid operators with N = 7 and M = 15 are sufficient to process large areas containing several interfering sources. The use of a damping factor allows the computation of meaningful maps even for unstable transformations in the presence of noise. Also, an equivalent layer larger than the data window takes into account part of the interfering sources so that a smaller damping factor is employed as compared with other damped inversion methods. Transformations of real data from the Xingú River Basin and Amazon Basin, Brazil, demonstrate the contribution of this procedure to improving a preliminary geologic interpretation with minimal a priori information.
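The precompute-once idea can be sketched in a 1-D analogue. The kernels below are toy 1/r²-style stand-ins and the transformation row is hypothetical; the paper works with N × N data windows and M × M equivalent layers in 2-D:

```python
import numpy as np

N, M = 7, 15              # data window and (larger) equivalent-source window
lam = 1e-3                # damping factor for the unstable inversion
z = 2.0                   # depth of the equivalent layer (grid units)

xd = np.arange(N) - N // 2            # data-window coordinates
xs = np.arange(M) - M // 2            # sources extend beyond the window
G = 1.0 / ((xd[:, None] - xs[None, :]) ** 2 + z**2)   # data <- layer (toy kernel)
g_t = 1.0 / (xs**2 + (z / 2) ** 2)    # hypothetical transform row at the window centre

# Precompute once (data-independent): damped inversion folded with the
# transform row yields a single 1 x N grid operator.
stencil = g_t @ np.linalg.solve(G.T @ G + lam * np.eye(M), G.T)

# Apply like a discrete convolution along a long profile of gridded data.
data = np.sin(np.linspace(0, 3 * np.pi, 200))
out = np.convolve(data, stencil[::-1], mode="valid")
```

Because the stencil depends only on N, M, the layer depth, and the damping, it is built once and then slid over arbitrarily large grids, which is where the efficiency for large areas comes from.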


ACTA IMEKO ◽  
2016 ◽  
Vol 5 (4) ◽  
pp. 49 ◽  
Author(s):  
Zsolt János Viharos ◽  
Jeno Csanaki ◽  
János Nacsa ◽  
Márton Edelényi ◽  
Csaba Péntek ◽  
...  

The paper introduces a methodology for defining production trend classes, together with results that provide trend prognosis in a given manufacturing situation. The prognosis is valid for one selected production measure (e.g., a quality dimension of one product, such as diameter, angle, surface roughness, pressure, or basis position), but the applied model also takes into account the past values of many other related production data, typically collected on the shop floor. Consequently, it is useful in batch or (customized) mass-production environments. The proposed solution can be applied to production control within the tolerance limits, proactively preventing the production process from drifting outside the given upper and lower tolerance limits.

The solution was developed and validated on real data collected on the shop floor; the paper also summarizes the validated application results of the proposed methodology.


2021 ◽  
Vol 2021 ◽  
pp. 1-17
Author(s):  
Filip Lorenz ◽  
Vit Janos ◽  
Dusan Teichmann ◽  
Michal Dorda

The article addresses the creation of a mathematical model for a real problem concerning the time coordination of periodic train connections operated on single-track lines. The individual train connections are dispatched with a predefined tact, and their arrivals at and departures from predefined railway stations (transfer nodes) need to be coordinated with one another. In addition, because the train connections are operated on single-track lines, trains that pass each other at predefined railway stations must also be coordinated. Mathematical programming methods are used to optimize the process. The article presents a mathematical model of the given task, and the proposed model is tested with real data. The computational experiments were carried out using the optimization software Xpress-IVE.
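The coordination idea can be illustrated with a deliberately tiny toy model, not the authors' Xpress-IVE formulation: two opposing periodic lines share one single-track segment, and we search departure offsets within the tact so that the trains never occupy the track simultaneously while the transfer wait at the far node is minimized. All numbers are hypothetical:

```python
import itertools

T = 60      # tact (period) in minutes
run = 25    # running time over the shared single-track segment

best = None
for a, b in itertools.product(range(T), repeat=2):   # departure offsets of the two lines
    # Opposing trains occupy the track during [a, a+run) and [b, b+run) mod T;
    # they conflict if those intervals overlap in the periodic timetable.
    gap = (b - a) % T
    if gap < run or (T - gap) % T < run:
        continue                                     # head-on conflict: infeasible
    wait = (b - (a + run)) % T                       # transfer wait at the far node
    if best is None or wait < best[0]:
        best = (wait, a, b)
```

Even this brute-force search shows the structure of the real problem: periodic (modulo-tact) constraints for meets on single track, plus an objective built from transfer waits; a mathematical-programming solver handles the same structure for whole networks.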


Author(s):  
Vladimir T. Tepkeev

The article discusses the historiographic aspect of Kalmyk-Tibetan relations in the second quarter of the 18th century. Several significant events fall within the given period, including the arrival in Kalmykia from Tibet of Shakur-lama, who was chosen as the supreme lama of the Kalmyk Buddhists; the dispatch of Kalmyk envoys to Tibet in 1729–1735 and in 1737; and the granting of the ‘khanate’ charter to the governor Tseren-Donduk by Dalai Lama VII. The article analyzes the author’s publications of the last decade, which have made a great contribution to research on the topic. Several significant new works on the historiography of Kalmyk-Tibetan relations have appeared, including theses and monographs. Among contemporary research, the works of A. A. Kurapov, A. V. Tsuryumov, B. U. Kitinov, and E. P. Bakaeva deserve mention; they introduced a great number of previously unknown sources in different languages. The further study of the history of Kalmyk-Tibetan relations remains a high-potential field in Mongolian studies.


2018 ◽  
Vol 2018 ◽  
pp. 1-12 ◽  
Author(s):  
Suleman Nasiru

The need to develop generalizations of existing statistical distributions to make them more flexible in modeling real data sets is vital in parametric statistical modeling and inference. Thus, this study develops a new class of distributions called the extended odd Fréchet family of distributions for modifying existing standard distributions. Two special models, named the extended odd Fréchet Nadarajah-Haghighi and extended odd Fréchet Weibull distributions, are proposed using the developed family. The densities and the hazard rate functions of the two special distributions exhibit different kinds of monotonic and nonmonotonic shapes. The maximum likelihood method is used to develop estimators for the parameters of the new class of distributions. The application of the special distributions is illustrated by means of a real data set. The results revealed that the special distributions developed from the new family can provide a reasonable parametric fit to the given data set compared to other existing distributions.
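The maximum-likelihood step can be illustrated on the plain two-parameter Fréchet baseline, F(x) = exp(−(x/s)^−α); the extended odd Fréchet CDF itself is not reproduced here, and the data below are synthetic:

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic Fréchet sample via inverse-CDF sampling:
# F(x) = exp(-(x/s)^-alpha)  =>  x = s * (-ln u)^(-1/alpha) for u ~ Uniform(0,1).
rng = np.random.default_rng(7)
alpha_true, s_true = 3.0, 2.0
u = rng.uniform(size=500)
sample = s_true * (-np.log(u)) ** (-1.0 / alpha_true)

def nll(theta):
    a, s = np.exp(theta)                 # log-parametrization keeps a, s > 0
    z = sample / s
    # Fréchet log-density: log(a/s) - (a+1) log z - z^(-a)
    return -np.sum(np.log(a / s) - (a + 1) * np.log(z) - z ** (-a))

res = minimize(nll, x0=np.zeros(2), method="Nelder-Mead")
alpha_hat, s_hat = np.exp(res.x)
```

The same pattern, writing the log-likelihood of the chosen family and minimizing its negative numerically, carries over to the extended models, whose densities simply have more shape parameters.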

