Probabilistic moveout analysis by time warping

Geophysics ◽  
2020 ◽  
Vol 85 (1) ◽  
pp. U1-U20
Author(s):  
Yanadet Sripanich ◽  
Sergey Fomel ◽  
Jeannot Trampert ◽  
William Burnett ◽  
Thomas Hess

Parameter estimation from reflection moveout analysis represents one of the most fundamental problems in subsurface model building. We have developed an efficient moveout inversion method based on the automatic flattening of common-midpoint (CMP) gathers using local slopes. As a by-product of this flattening process, we can also estimate the reflection traveltimes corresponding to the flattened CMP gathers. This traveltime information allows us to construct a highly overdetermined system and subsequently invert for moveout parameters, including normal-moveout velocities and quartic coefficients related to anisotropy. We use the 3D generalized moveout approximation (GMA), which can accurately capture the effects of complex anisotropy on reflection traveltimes, as the basis for our moveout inversion. Because forward traveltime computations with GMA are cheap, we use a Monte Carlo inversion scheme for improved handling of the nonlinearity between reflection traveltimes and moveout parameters. This choice also allows us to set up a probabilistic inversion workflow within a Bayesian framework, in which we obtain posterior probability distributions that contain valuable statistical information on the estimated parameters, such as uncertainty and correlations. We use synthetic and real data examples, including data from the SEAM Phase II unconventional reservoir model, to demonstrate the performance of our method and discuss insights into the problem of moveout inversion gained from analyzing the posterior probability distributions. Our results suggest that solutions to the problem of traveltime-only moveout inversion from 2D CMP gathers are relatively well constrained by the data. However, parameter estimation from 3D CMP gathers, which involves more moveout parameters and more complex anisotropic models, is generally nonunique, and there are trade-offs among the inverted parameters, especially the quartic coefficients.
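
As a concrete illustration of the Monte Carlo step described above, the sketch below runs a random-walk Metropolis sampler over a simplified quartic moveout approximation, t^2(x) = t0^2 + x^2/v^2 + A4·x^4, used here only as a stand-in for the full 3D GMA; the priors, step sizes, and noise levels are illustrative assumptions, not the paper's settings.

```python
# Minimal sketch: Metropolis sampling of moveout parameters from picked
# traveltimes, with a quartic approximation standing in for the 3D GMA.
import numpy as np

rng = np.random.default_rng(0)

def moveout_time(offsets, t0, v_nmo, a4):
    """Quartic moveout: t^2 = t0^2 + x^2/v^2 + a4*x^4 (stand-in for GMA)."""
    t2 = t0**2 + offsets**2 / v_nmo**2 + a4 * offsets**4
    return np.sqrt(np.maximum(t2, 0.0))

def log_posterior(params, offsets, t_obs, sigma):
    """Gaussian likelihood with flat positivity priors on t0 and v_nmo."""
    t0, v_nmo, a4 = params
    if t0 <= 0 or v_nmo <= 0:
        return -np.inf
    residual = t_obs - moveout_time(offsets, t0, v_nmo, a4)
    return -0.5 * np.sum((residual / sigma) ** 2)

# Synthetic "picked" traveltimes, as produced by gather flattening
offsets = np.linspace(0.0, 3.0, 30)                      # km
t_true = moveout_time(offsets, 1.0, 2.0, -2e-3)          # s
t_obs = t_true + rng.normal(0.0, 0.002, t_true.shape)

# Random-walk Metropolis over (t0, v_nmo, a4)
current = np.array([1.1, 1.8, 0.0])
logp = log_posterior(current, offsets, t_obs, 0.002)
steps = np.array([0.005, 0.01, 1e-4])
samples = []
for _ in range(20000):
    proposal = current + steps * rng.standard_normal(3)
    logp_new = log_posterior(proposal, offsets, t_obs, 0.002)
    if np.log(rng.uniform()) < logp_new - logp:
        current, logp = proposal, logp_new
    samples.append(current.copy())

post = np.array(samples[5000:])                          # discard burn-in
print("posterior mean:", post.mean(axis=0))
print("posterior std: ", post.std(axis=0))               # uncertainty estimates
```

The posterior standard deviations and cross-parameter scatter of these samples are exactly the kind of uncertainty and trade-off information the abstract refers to.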

Geophysics ◽  
2020 ◽  
pp. 1-98
Author(s):  
Bo Yu ◽  
Hui Zhou ◽  
Lingqian Wang ◽  
Wenling Liu

Bayesian statistical inversion can integrate diverse datasets to infer the posterior probability distributions of subsurface elastic properties. However, existing methods may suffer from two issues in practical applications: spatial discontinuities and the uncertainty caused by low-quality seismic traces. These limitations are evident in prestack statistical inversion, since some traces in prestack angle gathers may be missing or of low quality. We propose a prestack Bayesian statistical inversion method constrained by reflection features to alleviate these issues. Based on a Bayesian linearized inversion framework, the proposed approach integrates the prestack seismic data with reflection features. The reflection features are captured from the poststack seismic profile and represent the relationships of the reflection coefficients between different traces. With the proposed approach, we are able to achieve superior inversion results and to evaluate inversion uncertainty simultaneously, even with low-quality prestack seismic data. The results of the synthetic and field data tests confirm the theoretical and practical effects of the reflection features on improving inversion continuity and accuracy and on reducing inversion uncertainty. Moreover, this work provides a novel way to integrate information about geological structures into statistical inversion methods. Other geological information that can be linearized, exactly or approximately, can be incorporated in the same manner.
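
For readers unfamiliar with the underlying machinery, the following sketch shows the closed-form Gaussian update at the heart of Bayesian linearized inversion. The forward operator, prior, and noise covariance below are invented placeholders; the paper's reflection-feature constraints would enter as additional linearized rows in G and d.

```python
# Minimal sketch: Gaussian conjugate update for a linear forward model
# d = G m + e, with prior m ~ N(m0, Cm) and noise e ~ N(0, Cd).
import numpy as np

rng = np.random.default_rng(1)

n_model, n_data = 20, 30
G = rng.standard_normal((n_data, n_model))    # illustrative forward operator
m_true = rng.standard_normal(n_model)
d = G @ m_true + 0.1 * rng.standard_normal(n_data)

m0 = np.zeros(n_model)                        # prior mean
Cm = np.eye(n_model)                          # prior covariance
Cd = 0.01 * np.eye(n_data)                    # noise covariance

# Posterior covariance and mean in closed form
Cd_inv = np.linalg.inv(Cd)
Cpost = np.linalg.inv(G.T @ Cd_inv @ G + np.linalg.inv(Cm))
m_post = m0 + Cpost @ G.T @ Cd_inv @ (d - G @ m0)

# Posterior standard deviations quantify per-parameter uncertainty
print("posterior std:", np.sqrt(np.diag(Cpost)))
```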


2020 ◽  
Vol 126 (4) ◽  
pp. 687-699 ◽  
Author(s):  
Véronique Letort ◽  
Sylvie Sabatier ◽  
Michelle Pamelas Okoma ◽  
Marc Jaeger ◽  
Philippe de Reffye

Background and Aims: Using internal trophic pressure as a regulating variable to model the complex interaction loops between organogenesis, production of assimilates and partitioning in functional–structural models of plant growth has attracted increasing interest in recent years. However, this approach is hampered by the fact that internal trophic pressure is a non-measurable quantity that can be assessed only through model parameter estimation, for which the methodology is not straightforward, especially when the model is stochastic.

Methods: A stochastic GreenLab model of plant growth (called ‘GL4’) is developed with a feedback effect of internal trophic competition, represented by the ratio of biomass supply to demand (Q/D), on organogenesis. A methodology for its parameter estimation is presented and applied to a dataset of 15 two-year-old Coffea canephora trees. Based on the fitting results, variations in Q/D are reconstructed and analysed in relation to the estimated variations in organogenesis parameters.

Key Results: Our stochastic retroactive model was able to simulate realistically the progressive set-up of young plant architecture and the effect of branch pruning. Parameter estimation using real data for Coffea trees provided access to the internal trophic dynamics. These dynamics correlated with the organogenesis probabilities during the establishment phase.

Conclusions: The model can satisfactorily reproduce the measured data, thus opening up promising avenues for further applying this original procedure to other experimental data. The framework developed can serve as a model-based toolkit to reconstruct the hidden internal trophic dynamics of plant growth.
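
A toy sketch of the kind of supply/demand feedback loop described above may help: biomass production (Q) and organ demand (D) evolve cycle by cycle, and the ratio Q/D modulates a stochastic organogenesis probability. The functional forms and constants below are invented for illustration and are not the GL4 model's equations.

```python
# Toy sketch: Q/D trophic feedback on stochastic organogenesis.
import numpy as np

rng = np.random.default_rng(2)

n_organs = 1              # current number of growing organs
demand_per_organ = 1.0
base_prob = 0.8           # baseline branching probability
history = []

for cycle in range(25):
    Q = 2.0 * np.sqrt(n_organs)        # assumed saturating biomass production
    D = demand_per_organ * n_organs    # total demand of growing organs
    ratio = Q / D
    # Organogenesis probability rises with trophic status Q/D (illustrative)
    p_branch = min(1.0, base_prob * ratio)
    n_organs += rng.binomial(n_organs, p_branch)
    history.append((cycle, n_organs, ratio))

for cycle, n, r in history[-5:]:
    print(f"cycle {cycle}: organs={n}, Q/D={r:.2f}")
```

In the actual inversion setting, Q/D is hidden and must be reconstructed by fitting such a feedback model to measured architectures, which is the methodological contribution of the paper.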


Geophysics ◽  
1999 ◽  
Vol 64 (3) ◽  
pp. 820-837 ◽  
Author(s):  
Vladimir Grechka ◽  
Ilya Tsvankin

Orthorhombic symmetry describes several azimuthally anisotropic models typical of fractured formations, such as those containing two orthogonal crack systems or parallel vertical cracks in a VTI (transversely isotropic with a vertical symmetry axis) background. Here, we present a methodology for inverting multiazimuth P-wave reflection traveltimes for the parameters of vertically inhomogeneous orthorhombic media. Our approach is based on the general analytic representation of normal-moveout (NMO) velocity as an ellipse in the horizontal plane. A minimum of three differently oriented common-midpoint (CMP) lines (or a "wide-azimuth" 3-D survey) is needed to reconstruct the ellipse and thus obtain the NMO velocity in any azimuthal direction. Then, the orientation and the semiaxes of the NMO ellipse, which depend on both anisotropy and heterogeneity, can be inverted for the medium parameters. Our analytic and numerical study shows that for the model of a homogeneous orthorhombic layer above a dipping reflector, the exact P-wave NMO velocity is determined by the symmetry-plane orientation and five parameters: the NMO velocities from a horizontal reflector measured in the symmetry planes [Vnmo(1) and Vnmo(2)] and three anisotropic coefficients η(1,2,3) introduced by analogy with the Alkhalifah-Tsvankin parameter η for VTI media. The importance of the medium parameterization in terms of the η coefficients goes well beyond the NMO-velocity function. By generating migration impulse responses, we demonstrate that the parameters Vnmo(1,2) and η(1,2,3) are sufficient to perform all time-processing steps (normal-moveout and dip-moveout corrections, prestack and poststack time migration) in orthorhombic models. The velocities Vnmo(1,2) and the orientation of the vertical symmetry planes can be found using the azimuthally dependent NMO velocity from a horizontal reflector. Then the NMO ellipse of at least one dipping event is additionally needed to obtain the coefficients η(1,2,3) that control the dip dependence of normal moveout. We discuss the stability of the inversion procedure and specify constraints on the dip and azimuth of the reflector; for instance, for all three η coefficients to be resolved individually, the dip plane of the reflector should not coincide with either of the symmetry planes. To carry out parameter estimation in vertically inhomogeneous orthorhombic media, we apply the generalized Dix equation of Grechka, Tsvankin and Cohen, which operates with the matrices responsible for interval NMO ellipses rather than with the NMO velocities themselves. Our algorithm is designed to find the interval values of Vnmo(1,2) and η(1,2,3) using moveout from horizontal and dipping reflectors measured at different vertical times (i.e., only surface P-wave data are needed). Application to a synthetic multiazimuth P-wave dataset over a layered orthorhombic medium with depth-varying orientation of the symmetry planes verifies the accuracy of the inversion method.
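
As an illustration of the ellipse-reconstruction step, the sketch below fits the symmetric 2×2 matrix W from azimuthal NMO-velocity measurements via Vnmo^(-2)(α) = W11 cos²α + 2 W12 cosα sinα + W22 sin²α, then recovers the semiaxes and orientation by eigendecomposition. The measurement values are synthetic.

```python
# Minimal sketch: least-squares NMO-ellipse fit from >= 3 azimuths.
import numpy as np

# Synthetic "measured" NMO velocities (km/s) at several azimuths
azimuths = np.radians([0.0, 30.0, 60.0, 90.0, 120.0])
W_true = np.array([[1.0 / 2.0**2, 0.02],
                   [0.02, 1.0 / 2.5**2]])
v_meas = np.array([
    1.0 / np.sqrt(np.array([np.cos(a), np.sin(a)]) @ W_true
                  @ np.array([np.cos(a), np.sin(a)]))
    for a in azimuths
])

# Fit W from V^-2(alpha) = W11 c^2 + 2 W12 c s + W22 s^2
c, s = np.cos(azimuths), np.sin(azimuths)
A = np.column_stack([c**2, 2 * c * s, s**2])
w11, w12, w22 = np.linalg.lstsq(A, 1.0 / v_meas**2, rcond=None)[0]
W = np.array([[w11, w12], [w12, w22]])

# Eigenvectors give the ellipse orientation, eigenvalues the semiaxes
eigvals, eigvecs = np.linalg.eigh(W)
semiaxes = 1.0 / np.sqrt(eigvals)            # fast/slow NMO velocities (km/s)
orientation = np.degrees(np.arctan2(eigvecs[1, 0], eigvecs[0, 0]))
print("semiaxis velocities (km/s):", semiaxes)
print("ellipse orientation (deg):", orientation)
```

With three exact azimuths the system is determined; additional azimuths overdetermine it and stabilize the fit against picking noise, which is why wide-azimuth coverage helps.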


1997 ◽  
Vol 36 (5) ◽  
pp. 177-184
Author(s):  
Lennart Heip ◽  
Johan Van Assel ◽  
Patrick Swartenbroekx

Within the framework of an EC-funded SPRINT project, a sewer flow quality model of a typical rural Flemish catchment was set up. The applicability of such a model is demonstrated. Furthermore, a methodology for model building, data collection, and model calibration and verification is proposed. To this end, an intensive nine-month measuring campaign was undertaken, during which the hydraulic behaviour of the sewer network was continuously monitored. During both dry weather flow (DWF) and wet weather flow (WWF), a number of sewage samples were taken and analysed for BOD, COD, TKN, TP and TSS, resulting in 286 WWF and 269 DWF samples. The model was calibrated and verified with these data. Finally, a software-independent methodology for interpreting the model results is proposed.
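
As one quantitative step that such a calibration and verification methodology might include, the sketch below scores simulated against observed concentrations with the Nash-Sutcliffe efficiency, split by flow regime; the sample values are placeholders, not the campaign's data.

```python
# Minimal sketch: goodness-of-fit scoring for model calibration.
import numpy as np

def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, < 0 worse than the mean."""
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    return 1.0 - np.sum((observed - simulated)**2) / np.sum(
        (observed - observed.mean())**2)

cod_obs_dwf = [410.0, 380.0, 455.0, 420.0]   # placeholder COD samples (mg/l)
cod_sim_dwf = [400.0, 395.0, 440.0, 415.0]
print("DWF COD NSE:", round(nse(cod_obs_dwf, cod_sim_dwf), 3))
```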


2021 ◽  
Vol 9 (5) ◽  
pp. 467
Author(s):  
Mostafa Farrag ◽  
Gerald Corzo Perez ◽  
Dimitri Solomatine

Many grid-based spatial hydrological models suffer from the complexity of setting up a coherent spatial structure to calibrate such a complex, highly parameterized system. Essential aspects of model building must be taken into account: the spatial resolution, the limitations of the routing equation, the calibration of spatial parameters, and their influence on modeling results are all decisions that are often made without adequate analysis. In this research, the grid discretization level, the integration of processes, and the routing concepts are analyzed experimentally. The HBV-96 model is set up for each cell, and the cells are then integrated into an interlinked modeling system (Hapi). The Jiboa River Basin in El Salvador is used as a case study. The first concept tested is the temporal response of the model structure, which is highly linked to the runoff dynamics; by changing the runoff generation model description, we explore the responses to events. Two routing models are considered: Muskingum, which routes the runoff from each cell along the river network, and Maxbas, which routes the runoff directly to the outlet. The second concept is the spatial representation, where the model is built and tested for different spatial resolutions (500 m, 1 km, 2 km, and 4 km). The results show that the sensitivity to spatial resolution is highly linked to the routing method; routing sensitivity influenced the model performance more than the spatial discretization, and allowing for coarser discretization makes the model simpler and computationally faster. Only a slight performance improvement is gained by using different parameter values for each cell. The 2 km cell size was found to yield the smallest model error. The proposed hydrological modeling codes have been published as open source.
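
For reference, here is a minimal sketch of the classic Muskingum routing scheme mentioned above (as opposed to Maxbas, which lumps runoff directly at the outlet). The K, X, and hydrograph values are illustrative, and this is not the Hapi implementation itself.

```python
# Minimal sketch: Muskingum routing of an inflow hydrograph through one reach.
import numpy as np

def muskingum(inflow, K=2.0, X=0.2, dt=1.0):
    """Classic Muskingum: O[t] = c0*I[t] + c1*I[t-1] + c2*O[t-1]."""
    denom = 2.0 * K * (1.0 - X) + dt
    c0 = (dt - 2.0 * K * X) / denom
    c1 = (dt + 2.0 * K * X) / denom
    c2 = (2.0 * K * (1.0 - X) - dt) / denom   # note: c0 + c1 + c2 == 1
    outflow = np.zeros_like(inflow)
    outflow[0] = inflow[0]
    for t in range(1, len(inflow)):
        outflow[t] = c0 * inflow[t] + c1 * inflow[t - 1] + c2 * outflow[t - 1]
    return outflow

inflow = np.array([1.0, 4.0, 9.0, 7.0, 4.0, 2.0, 1.0, 1.0])  # m^3/s
print(muskingum(inflow))   # attenuated, delayed hydrograph
```

For numerical stability, dt should lie between 2KX and 2K(1-X), which holds for the values above; cell-to-cell application of this update along the river network is what couples routing sensitivity to grid resolution.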


Author(s):  
Richard Steinberg (Raytheon Company) ◽  
Alice Diggs (Raytheon Company) ◽  
Jade Driggs

Verification and validation (V&V) for human performance models (HPMs) can be likened to building a house with no bricks, since it is difficult to obtain metrics to validate a model while the system is still in development. HPMs are effective for performing trade-offs between human system design factors, including the number of operators needed, the role of automated tasks versus operator tasks, and the member task responsibilities required to operate a system. On a recent government contract, our team used a human performance model to provide analysis beyond traditional trade studies: we verified the contractually mandated staff size for operating the system. This task demanded that the model have sufficient fidelity to support high-confidence staffing decisions, and it required a method for verifying and validating the model and its results to ensure that they accurately reflected the real world. The situation posed a dilemma because there was no actual system from which to gather real data to validate the model. Validating human performance models is a challenge precisely because they support design decisions prior to system development; for example, crew models typically inform the design, staffing needs, and the requirements for each operator’s user interface before anything is built. This paper discusses a successful case study of how our team met these V&V challenges with the US Air Force model accreditation authority and accredited our human performance model with enough fidelity for requirements testing on an Air Force Command and Control program.
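
As a loose illustration of the staffing trade-offs such a model supports (not the accredited model itself), the toy sketch below uses Monte Carlo sampling of task arrivals and durations to estimate operator utilization for candidate crew sizes; all rates and distributions are invented.

```python
# Toy sketch: operator-utilization trade study via Monte Carlo simulation.
import numpy as np

rng = np.random.default_rng(3)

def mean_utilization(n_operators, n_tasks=500, arrival_rate=0.5, mean_dur=4.0):
    """Approximate fraction of the shift each operator spends busy."""
    busy_until = np.zeros(n_operators)   # time each operator becomes free
    busy_time = 0.0
    t = 0.0
    for _ in range(n_tasks):
        t += rng.exponential(1.0 / arrival_rate)   # next task arrival
        op = np.argmin(busy_until)                 # assign to least-busy operator
        start = max(t, busy_until[op])             # task may have to queue
        duration = rng.exponential(mean_dur)
        busy_until[op] = start + duration
        busy_time += duration
    return busy_time / (n_operators * busy_until.max())

for crew in (2, 3, 4):
    print(f"{crew} operators -> utilization ~ {mean_utilization(crew):.2f}")
```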


2015 ◽  
Vol 2015 ◽  
pp. 1-10 ◽  
Author(s):  
Marcelo H. Alencar ◽  
Adiel T. de Almeida

This paper proposes a multicriteria decision model based on MAUT (Multiattribute Utility Theory) incorporated into an RCM (Reliability Centered Maintenance) approach in order to provide a better assessment of the consequences of failure, allowing more effective maintenance planning. MAUT provides an evaluation of probability distributions over each attribute as well as trade-offs involving lotteries. The proposed model takes advantage of such evaluations and also restructures the consequence groups established in an RCM approach into five new dimensions. As a result, an overall utility index is computed for each failure mode analyzed, and these values establish the ranking of the alternatives. The decision-maker’s preferences are taken into account, so the final result for each failure mode incorporates subjective aspects based on the decision-maker’s perceptions and behavior.
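
A minimal sketch of the ranking step, assuming a simple additive utility form: each failure mode receives single-attribute utilities on five consequence dimensions, which are combined with weights into an overall utility index used for ranking. The dimension names, weights, and values below are placeholders, not the paper's actual consequence structure or elicitation.

```python
# Minimal sketch: additive multiattribute utility ranking of failure modes.
import numpy as np

dimensions = ["human", "environmental", "financial", "operational", "reputation"]
weights = np.array([0.35, 0.25, 0.15, 0.15, 0.10])   # scaling constants, sum to 1

# Rows: failure modes; columns: single-attribute utilities u_i in [0, 1]
utilities = np.array([
    [0.2, 0.6, 0.8, 0.5, 0.7],   # failure mode A
    [0.9, 0.4, 0.3, 0.6, 0.5],   # failure mode B
    [0.5, 0.8, 0.6, 0.2, 0.9],   # failure mode C
])

overall = utilities @ weights
ranking = np.argsort(overall)    # lowest overall utility = most critical first
for idx in ranking:
    print(f"failure mode {'ABC'[idx]}: U = {overall[idx]:.3f}")
```

In the full MAUT treatment, the u_i would be elicited from the decision-maker over lotteries and applied to probability distributions of consequences rather than point values, which is what distinguishes the approach from simple weighted scoring.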


2020 ◽  
Vol 0 (0) ◽  
Author(s):  
Zahra Amini Farsani ◽  
Volker J. Schmid

Co-localization analysis is a popular method for quantitative analysis in fluorescence microscopy imaging. The localization of marked proteins in the cell nucleus allows a deep insight into biological processes in the nucleus. Several metrics have been developed for measuring the co-localization of two markers; however, they depend on subjective thresholding of the background and on the assumption of linearity. We propose a robust method to estimate the bivariate distribution function of two color channels, from which we can quantify their co- or anti-colocalization. The proposed method is a combination of the Maximum Entropy Method (MEM) and a Gaussian copula, which we call the Maximum Entropy Copula (MEC). This new method can measure the spatial and nonlinear correlation of signals to determine marker colocalization in fluorescence microscopy images. The proposed method is compared with MEM for bivariate probability distributions, and the new colocalization metric is validated on simulated and real data. The results show that MEC can determine co- and anti-colocalization even in high-background settings. MEC can therefore be used as a robust tool for colocalization analysis.
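
The sketch below illustrates the Gaussian-copula half of the approach: each channel's intensities are mapped to normal scores through their marginals, and the copula correlation is estimated from the scores. For simplicity, empirical ranks stand in for the paper's MEM-based marginal estimation, and the two-channel data are synthetic.

```python
# Minimal sketch: Gaussian-copula correlation of two image channels.
import numpy as np
from scipy.stats import norm, rankdata

rng = np.random.default_rng(4)

# Synthetic two-channel intensities with nonlinear dependence
red = rng.gamma(2.0, 1.0, 5000)
green = np.sqrt(red) + 0.3 * rng.standard_normal(5000)

def normal_scores(x):
    """Empirical probability-integral transform followed by Phi^{-1}."""
    u = rankdata(x) / (len(x) + 1.0)
    return norm.ppf(u)

z_red, z_green = normal_scores(red), normal_scores(green)
rho = np.corrcoef(z_red, z_green)[0, 1]   # copula correlation
print(f"Gaussian-copula correlation: {rho:.3f}")  # > 0: colocalization, < 0: anti
```

Because the dependence is measured after transforming away the marginals, no intensity threshold is needed and monotone nonlinear relationships are captured, which is the advantage the abstract claims over threshold-based metrics.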


1998 ◽  
Vol 5 (2) ◽  
pp. 93-104 ◽  
Author(s):  
D. Harris ◽  
M. Menabde ◽  
A. Seed ◽  
G. Austin

The theory of scale similarity and breakdown coefficients is applied here to intermittent rainfall data consisting of time series and spatial rain fields. The probability distributions (pdfs) of the logarithm of the breakdown coefficients are the principal descriptor used. Rain fields are distinguished as being either multiscaling or multiaffine depending on whether the pdfs of the breakdown coefficients are scale similar or scale dependent, respectively. Parameter estimation techniques are developed that are applicable to both multiscaling and multiaffine fields. The scale parameter (width), σ, of the pdfs of the log-breakdown coefficients is a measure of the intermittency of a field. For multiaffine fields, this scale parameter is found to increase with scale in a power-law fashion, consistent with a bounded-cascade picture of rainfall modelling. The resulting power-law exponent, H, is indicative of the smoothness of the field. Some details of breakdown coefficient analysis are addressed, and a theoretical link between this analysis and moment-scaling analysis is presented. Breakdown coefficient properties of cascades are also investigated in the context of parameter estimation for modelling purposes.
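
A minimal sketch of breakdown-coefficient analysis on a 1-D series, under simplifying assumptions: at each dyadic scale, ratios of half-interval means to parent-interval means are formed, the width σ of the log-ratio distribution is measured, and a power law σ(scale) ~ scale^H is fitted across scales. The synthetic series below is a stand-in for real rainfall data.

```python
# Minimal sketch: breakdown coefficients and the scale dependence of sigma.
import numpy as np

rng = np.random.default_rng(5)
rain = rng.lognormal(0.0, 1.0, 4096)          # stand-in for a rainfall series

scales, sigmas = [], []
for level in range(1, 7):
    size = 2**level                            # parent-interval length
    parents = rain[: len(rain) // size * size].reshape(-1, size)
    parent_mean = parents.mean(axis=1)
    child_mean = parents[:, : size // 2].mean(axis=1)
    valid = (parent_mean > 0) & (child_mean > 0)
    bdc = child_mean[valid] / parent_mean[valid]   # breakdown coefficients
    scales.append(size)
    sigmas.append(np.std(np.log(bdc)))

# Power-law fit sigma ~ scale^H; a scale-independent sigma indicates
# multiscaling, a power-law trend indicates multiaffine behaviour
H, _ = np.polyfit(np.log(scales), np.log(sigmas), 1)
print("sigma per scale:", [f"{s:.3f}" for s in sigmas])
print(f"fitted exponent H = {H:.3f}")
```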

