Single-station SVD-based polarization filtering of ground roll: Perfection and investigation of limitations and pitfalls

Geophysics ◽  
2012 ◽  
Vol 77 (2) ◽  
pp. V41-V59 ◽  
Author(s):  
Olena Tiapkina ◽  
Martin Landrø ◽  
Yuriy Tyapkin ◽  
Brian Link

The advent of single-receiver-point, multicomponent geophones has necessitated that ground roll be removed in the processing flow rather than through acquisition design. Polarization filtering constitutes a wide class of processing methods for ground-roll elimination, and a number of these methods use singular value decomposition (SVD) or related transformations. We focus on a single-station SVD-based polarization filter that we consider to be one of the best in the industry. The method comprises two stages: (1) ground-roll detection and (2) ground-roll estimation and filtering. To detect the ground roll, a special attribute dependent on the singular values of a three-column matrix formed in a sliding time window is used. The ground roll is approximated and subtracted using the first two eigenimages of this matrix. To limit possible damage to the signal, the filter operates only within the record intervals where ground roll is detected and within the ground-roll frequency bandwidth. We improve the ground-roll detector to make it theoretically insensitive to ambient noise and more sensitive to the presence of ground roll. The advantage of the new detector is demonstrated on synthetic and field data sets. We estimate, theoretically and with synthetic data, the attenuation of the underlying reflections that the polarization filter can cause. We show that the underlying signal always loses almost all of its energy on the vertical component and on the horizontal component in the ground-roll propagation plane, within the ground-roll frequency bandwidth. The only signal component that can retain a significant part of its energy, if it exists, is the horizontal component orthogonal to this plane. When 2D 3C field operations are conducted, the signal particle motion can deviate from the ground-roll propagation plane because of offline reflections and can therefore retain some of its energy. In 3D 3C seismic surveys, the reflected signal always deviates from the ground-roll propagation plane on receiver lines that do not contain the source. This is confirmed with a 2.5D 3C synthetic data set. We discuss when the ability of the filter to effectively subtract the ground roll may, or may not, allow us to ignore the inevitable harm done to the underlying reflected waves.
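
As a rough illustration of the two-stage scheme described above, a minimal numpy sketch follows: it forms the three-column windowed matrix, computes an SVD-based detection attribute, and subtracts the first two eigenimages where the attribute exceeds a threshold. The window length, threshold, energy-fraction attribute, and omission of band limiting are all simplifying assumptions, not the authors' exact design.

```python
import numpy as np

def svd_polarization_filter(x, y, z, win=64, threshold=0.95):
    """x, y, z: equally sampled 3C traces (1-D arrays of equal length)."""
    out = np.stack([x, y, z], axis=1).astype(float)   # N x 3 data matrix
    for start in range(0, out.shape[0] - win + 1, win):
        M = out[start:start + win]                    # three-column window matrix
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        # Detection attribute (an assumption, not the paper's detector):
        # fraction of window energy captured by the first two eigenimages;
        # high values suggest elliptically polarized ground roll.
        attr = (s[0]**2 + s[1]**2) / np.sum(s**2)
        if attr > threshold:
            # Ground-roll estimate = first two eigenimages; subtract it.
            # (The published filter also restricts this to the ground-roll
            # frequency band, which is omitted here.)
            gr = s[0] * np.outer(U[:, 0], Vt[0]) + s[1] * np.outer(U[:, 1], Vt[1])
            out[start:start + win] -= gr
    return out[:, 0], out[:, 1], out[:, 2]
```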

2020 ◽  
Vol 223 (2) ◽  
pp. 1040-1053
Author(s):  
Tianjian Cheng ◽  
Brady R Cox ◽  
Joseph P Vantassel ◽  
Lance Manuel

SUMMARY: The horizontal-to-vertical spectral ratio (HVSR) of ambient noise is commonly used to infer a site's resonance frequency ($f_{0,\mathrm{site}}$). HVSR calculations are most commonly performed using the Fourier amplitude spectrum of a single merged horizontal component (e.g. the geometric mean component) from a three-component sensor. However, the use of a single merged horizontal component implicitly relies on the assumptions of azimuthally isotropic seismic noise and 1-D surface and subsurface conditions. These assumptions may not be justified at many sites, leading to azimuthal variability in HVSR measurements that cannot be accounted for using a single merged component. This paper proposes a new statistical method to account for azimuthal variability in the peak frequency of HVSR curves ($f_{0,\mathrm{HVSR}}$). The method uses rotated horizontal components at evenly distributed azimuthal intervals to investigate and quantify azimuthal variability. To ensure unbiased statistics for $f_{0,\mathrm{HVSR}}$, a frequency-domain window-rejection algorithm is applied at each azimuth to automatically remove contaminated time windows in which the $f_{0,\mathrm{HVSR}}$ values are statistical outliers relative to those obtained from the majority of windows at that azimuth. A weighting scheme then accounts for the different numbers of accepted time windows at each azimuth. The new method is applied to a data set of 114 HVSR measurements with significant azimuthal variability in $f_{0,\mathrm{HVSR}}$ and is shown to account for this variability reliably. The methodology is also extended to the estimation of a complete lognormal-median HVSR curve that accounts for azimuthal variability. To encourage the adoption of this statistical approach to accounting for azimuthal variability in single-station HVSR measurements, the methods presented in this paper have been incorporated into hvsrpy, an open-source Python package for HVSR processing.
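
A hedged numpy sketch of the azimuthal idea follows: rotate the horizontal pair through evenly spaced azimuths, form an HVSR curve for each, and record the peak frequency per azimuth. Spectral smoothing, the window-rejection algorithm, and the weighting scheme are omitted; the function and parameter names are illustrative, not the hvsrpy API.

```python
import numpy as np

def azimuthal_f0(ns, ew, v, dt, azimuths_deg=range(0, 180, 15)):
    """ns, ew, v: north-south, east-west, and vertical traces; dt: sample interval (s)."""
    freqs = np.fft.rfftfreq(len(v), d=dt)
    band = (freqs > 0.1) & (freqs < 20.0)      # typical HVSR band, an assumption
    V = np.abs(np.fft.rfft(v))[band]
    f0 = {}
    for az in azimuths_deg:
        theta = np.radians(az)
        # Single rotated horizontal component (not a merged geometric mean).
        h = ns * np.cos(theta) + ew * np.sin(theta)
        H = np.abs(np.fft.rfft(h))[band]
        hvsr = H / V                            # unsmoothed spectral ratio
        f0[az] = freqs[band][np.argmax(hvsr)]   # peak frequency at this azimuth
    return f0
```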


2021 ◽  
Vol 23 (4) ◽  
pp. 745-756
Author(s):  
Yi Lyu ◽  
Yijie Jiang ◽  
Qichen Zhang ◽  
Ci Chen

Remaining useful life (RUL) prediction plays a crucial role in decision-making in condition-based maintenance for preventing catastrophic field failure. For degradation-failed products, data from the performance-deterioration process are the key to lifetime estimation. Deep learning has been shown to perform excellently in RUL prediction provided that the degradation data set is sufficiently large. In some applications, however, the degradation data are insufficient, and improving prediction accuracy in this setting remains a challenging problem. To tackle this challenge, we propose a novel deep learning-based RUL prediction framework that amplifies the degradation data set. Specifically, we leverage a cycle-consistent generative adversarial network to generate synthetic data, with which the original degradation data set is amplified so that the data characteristics hidden in the sample space can be captured. A sliding-time-window strategy and a deep bidirectional long short-term memory network complete the RUL prediction framework. We show the effectiveness of the proposed method on the turbine engine data set from the National Aeronautics and Space Administration. Comparative experiments show that our method outperforms the same framework trained without the synthetically generated data.
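
The prediction side of the framework can be sketched in a few lines of PyTorch: sliding time windows over a multivariate degradation sequence feed a bidirectional LSTM that regresses RUL. The CycleGAN amplification stage is not shown, and the layer sizes, window length, and channel count are illustrative assumptions, not the authors' configuration.

```python
import torch
import torch.nn as nn

def sliding_windows(series, window):
    """series: (time, features) tensor -> (num_windows, window, features)."""
    return series.unfold(0, window, 1).transpose(1, 2)

class BiLSTMRul(nn.Module):
    def __init__(self, n_features, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True,
                            bidirectional=True)
        self.head = nn.Linear(2 * hidden, 1)   # 2*hidden: both directions

    def forward(self, x):                      # x: (batch, window, features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])        # RUL regressed from last step

# Example: 21 sensor channels (as in the NASA turbofan data), window of 30.
windows = sliding_windows(torch.randn(200, 21), window=30)
model = BiLSTMRul(n_features=21)
rul_hat = model(windows)                       # (num_windows, 1)
```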


Geophysics ◽  
2006 ◽  
Vol 71 (6) ◽  
pp. J71-J80 ◽  
Author(s):  
Maria A. Annecchione ◽  
Pierre Keating ◽  
Michel Chouteau

Airborne gravimeters based on inertial navigation system (INS) technology are capable, in theory, of providing direct observations of the horizontal components of anomalous gravity. However, their accuracy and usefulness in geophysical or geological applications are unknown. Determining the accuracy of airborne horizontal-component data is complicated by the lack of ground-surveyed control data. We determine the accuracy of airborne vector gravity data internally, using repeatedly flown line data. Multilevel wavelet analyses of the raw vector gravity data elucidate the limiting error source for the horizontal components, and we demonstrate the usefulness of the airborne horizontal-component data by performing Euler deconvolutions on real vector gravity data. The accuracy of the horizontal components is lower than that of the vertical component. Wavelet analyses of data from a test flight over Alexandria, Ontario, Canada, show that the main source of error limiting the accuracy of the horizontal components is time-dependent platform alignment error. Euler deconvolutions performed on the Timmins data set show that the horizontal components help constrain the 3D locations of regional geological features. We therefore conclude that the quality of airborne horizontal-component data is sufficient to motivate their use in resource exploration and geological applications.
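
For readers unfamiliar with the Euler deconvolution step, a minimal sketch follows, assuming field values and spatial gradients are available within a moving window: Euler's homogeneity equation is solved in a least-squares sense for the source coordinates and a background level. The structural index and variable names are placeholders, not the paper's processing parameters.

```python
import numpy as np

def euler_window(x, y, z, f, fx, fy, fz, n):
    """x, y, z: observation coordinates; f: field component; fx, fy, fz: its
    spatial gradients; n: structural index. All 1-D arrays over one window."""
    # Euler's homogeneity equation:
    #   (x - x0)*fx + (y - y0)*fy + (z - z0)*fz = n*(b - f)
    # rearranged into linear form:
    #   x0*fx + y0*fy + z0*fz + n*b = x*fx + y*fy + z*fz + n*f
    A = np.column_stack([fx, fy, fz, n * np.ones_like(f)])
    rhs = x * fx + y * fy + z * fz + n * f
    sol, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    x0, y0, z0, b = sol          # source position and background level
    return x0, y0, z0, b
```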


Author(s):  
David M. Wittman

Galilean relativity is a useful description of nature at low speeds. Galileo found that the vertical component of a projectile's velocity evolves independently of its horizontal component. In a frame that moves horizontally along with the projectile, for example, the projectile appears to go straight up and down exactly as if it had been launched vertically. The laws of motion in one dimension are independent of any motion in the other dimensions. This leads to the idea that the laws of motion (and all other laws of physics) are equally valid in any inertial frame: the principle of relativity. This principle implies that no inertial frame can be considered "really stationary" or "really moving." There is no absolute standard of velocity (contrast this with acceleration, where Newton's first law provides an absolute standard). We discuss some apparent counterexamples in everyday experience and show how everyday experience can be misleading.
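
The independence claim can be stated compactly. A minimal worked form, assuming constant gravitational acceleration g and launch velocity components (v_x, v_y):

```latex
% Horizontal and vertical motion of a projectile launched with velocity
% components (v_x, v_y) under constant gravity g evolve independently:
\[
  x(t) = v_x\, t, \qquad y(t) = v_y\, t - \tfrac{1}{2} g t^2 .
\]
% In a frame moving horizontally at v_x, the same motion reads
% x'(t) = 0 and y'(t) = v_y t - (1/2) g t^2: straight up and down,
% exactly as if the projectile had been launched vertically.
```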


Author(s):  
Raul E. Avelar ◽  
Karen Dixon ◽  
Boniphace Kutela ◽  
Sam Klump ◽  
Beth Wemple ◽  
...  

The calibration of safety performance functions (SPFs) is a mechanism included in the Highway Safety Manual (HSM) to adjust its SPFs for use in intended jurisdictions. Critically, the quality of the calibration procedure must be assessed before the calibrated SPFs are used. Multiple resources to aid practitioners in calibrating SPFs have been developed in the years following the publication of the HSM first edition. Similarly, the literature suggests multiple ways to assess the goodness-of-fit (GOF) of an SPF calibrated to a data set from a given jurisdiction. This paper uses the results of calibrating multiple intersection SPFs to a large Mississippi safety database to examine the relations among multiple GOF metrics. The goal is to develop a sensible single index that leverages the joint information from multiple GOF metrics to assess the overall quality of a calibration. A factor analysis applied to the calibration results revealed three underlying factors explaining 76% of the variability in the data. From these results, the authors developed an index and performed a sensitivity analysis. The key metrics were found to be, in descending order: the deviation of the cumulative residual (CURE) plot from the 95% confidence area, the mean absolute deviation, the modified R-squared, and the value of the calibration factor. This paper also presents comparisons between the index and alternative scoring strategies, as well as an effort to verify the results using synthetic data. The developed index is recommended for comprehensively assessing the quality of calibrated intersection SPFs.
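
A sketch of the index-construction idea, under stated assumptions: standardize the per-calibration GOF metrics, extract three latent factors, and combine the factor scores into a single score. The metric columns, variance-based factor weighting, and library choices are illustrative; the paper's actual loadings and weights are not reproduced here.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

def calibration_index(gof_table):
    """gof_table: (n_calibrations, n_metrics) array, e.g. columns for CURE
    deviation, mean absolute deviation, modified R-squared, calibration factor."""
    z = StandardScaler().fit_transform(gof_table)
    fa = FactorAnalysis(n_components=3, random_state=0)  # 3 factors, as in the paper
    scores = fa.fit_transform(z)                         # latent factor scores
    # Weight factors by the share of variance their loadings explain
    # (an assumed weighting scheme, not the paper's).
    var = np.sum(fa.components_**2, axis=1)
    weights = var / var.sum()
    return scores @ weights            # one index value per calibration
```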


Water ◽  
2021 ◽  
Vol 13 (1) ◽  
pp. 107
Author(s):  
Elahe Jamalinia ◽  
Faraz S. Tehrani ◽  
Susan C. Steele-Dunne ◽  
Philip J. Vardon

Climatic conditions and vegetation cover influence the water flux in a dike and, potentially, the dike's stability. A comprehensive numerical simulation is computationally too expensive to be used for near real-time analysis of a dike network. This study therefore investigates a random forest (RF) regressor as a data-driven surrogate for a numerical model that forecasts the temporal macro-stability of dikes. To that end, the daily inputs and outputs of a ten-year coupled numerical simulation of an idealised dike (2009–2019) are used to create a synthetic data set comprising features that can be observed from the dike surface, with the calculated factor of safety (FoS) as the target variable. The data before 2018 are split into training and testing sets to build and train the RF. The predicted FoS is strongly correlated with the numerical FoS for the test set (before 2018). However, the trained model performs less well on the evaluation set (after 2018) when further surface cracking occurs. This proof of concept shows that a data-driven surrogate can determine dike stability under conditions similar to the training data and could be used to identify vulnerable locations in a dike network for further examination.
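
A minimal sketch of the surrogate, assuming a daily table of surface-observable features with the numerical factor of safety as target; the pre-2018/post-2018 split follows the description above, while the column names and hyperparameters are assumptions, not the study's tuned values.

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

def train_fos_surrogate(df: pd.DataFrame, feature_cols, target_col="FoS"):
    """df: daily simulation inputs/outputs indexed by date (2009-2019)."""
    train = df.loc[:"2017-12-31"]      # data before 2018: train/test
    evaluate = df.loc["2018-01-01":]   # data after 2018: evaluation set
    rf = RandomForestRegressor(n_estimators=300, random_state=0)
    rf.fit(train[feature_cols], train[target_col])
    # R^2 on the held-out evaluation period gauges how the surrogate
    # degrades when conditions (e.g. surface cracking) drift from training.
    return rf, rf.score(evaluate[feature_cols], evaluate[target_col])
```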


Geophysics ◽  
2006 ◽  
Vol 71 (5) ◽  
pp. U67-U76 ◽  
Author(s):  
Robert J. Ferguson

The possibility of improving regularization/datuming of seismic data is investigated by treating wavefield extrapolation as an inversion problem. Weighted, damped least squares is then used to produce the regularized/datumed wavefield. Regularization/datuming is extremely costly because it requires computing the Hessian, so an efficient approximation is introduced in which only a limited number of diagonals of the operators involved are computed. Real and synthetic data examples demonstrate the utility of this approach. For synthetic data, regularization/datuming is demonstrated for large extrapolation distances using a highly irregular recording array. Without approximation, regularization/datuming returns a regularized wavefield with fewer operator artifacts than a nonregularizing method such as generalized phase shift plus interpolation (PSPI). Approximate regularization/datuming returns a regularized wavefield at approximately two orders of magnitude lower cost, but it is dip limited, albeit in a controllable way, compared with the full method. The Foothills structural data set, a freely available data set from the Rocky Mountains of Canada, demonstrates the application to real data. The data are highly irregularly sampled along the shot coordinate and suffer from significant near-surface effects. Approximate regularization/datuming returns common-receiver data that are superior in appearance to conventionally datumed data.
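
The inversion framing can be sketched for a single frequency slice: the regularized/datumed wavefield solves a damped least-squares problem, and the cost-saving approximation keeps only a few central diagonals of the Hessian. Operator construction and the weighting are omitted; this is a schematic under stated assumptions, not the paper's implementation.

```python
import numpy as np

def damped_lsq_extrapolate(G, d, eps=1e-2, n_diags=None):
    """G: (n_receivers, n_model) complex extrapolation operator; d: data.
    eps: damping; n_diags: if set, keep only the 2*n_diags+1 central
    diagonals of the Hessian (the cost-saving approximation)."""
    H = G.conj().T @ G                          # full Hessian (the expensive part;
                                                # a real implementation would compute
                                                # only the retained diagonals)
    if n_diags is not None:
        i, j = np.indices(H.shape)
        H = np.where(np.abs(i - j) <= n_diags, H, 0)  # banded approximation
    H = H + eps * np.eye(H.shape[0])            # damping
    return np.linalg.solve(H, G.conj().T @ d)   # regularized/datumed wavefield
```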

