Exploration of the data space via trans-dimensional sampling: the case study of seismic double difference data


Solid Earth
2021
Vol 12 (12)
pp. 2717-2733
Author(s):
Nicola Piana Agostinetti
Giulia Sgattoni

Abstract. Double-difference (DD) seismic data are widely used to define elasticity distribution in the Earth's interior and its variation in time. DD data are often pre-processed from earthquake recordings through expert opinion, whereby pairs of earthquakes are selected based on some user-defined criteria and DD data are computed from the selected pairs. We develop a novel methodology for preparing DD seismic data based on a trans-dimensional algorithm, without imposing pre-defined criteria on the selection of event pairs. We apply it to a seismic database recorded on the flank of Katla volcano (Iceland), where elasticity variations in time have been indicated. Our approach quantitatively defines the presence of changepoints that separate the seismic events in time windows. Within each time window, the DD data are consistent with the hypothesis of time-invariant elasticity in the subsurface, and DD data can be safely used in subsequent analysis. Due to the parsimonious behaviour of the trans-dimensional algorithm, only changepoints supported by the data are retrieved. Our results indicate the following: (a) retrieved changepoints are consistent with first-order variations in the data (i.e. most striking changes in the amplitude of DD data are correctly reproduced in the changepoint distribution in time); (b) changepoint locations in time correlate neither with changes in seismicity rate nor with changes in waveform similarity (measured through the cross-correlation coefficients); and (c) the changepoint distribution in time seems to be insensitive to variations in the seismic network geometry during the experiment. Our results demonstrate that trans-dimensional algorithms can be effectively applied to pre-processing of geophysical data before the application of standard routines (e.g. before using them to solve standard geophysical inverse problems).
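For readers unfamiliar with the data type, the sketch below computes the standard double-difference residual for one event pair observed at a common station (the usual definition used in double-difference methods); the pairing criteria and the trans-dimensional changepoint machinery of the paper are not reproduced here, and all numbers are illustrative.

```python
def dd_residual(t_obs_i, t_obs_j, t_cal_i, t_cal_j):
    """Double-difference residual for one event pair at one station:
    observed differential travel time minus the model-predicted one."""
    return (t_obs_i - t_obs_j) - (t_cal_i - t_cal_j)

# Illustrative travel times (s) for two events at the same station.
t_obs = {"ev1": 4.213, "ev2": 4.198}  # picked from waveforms
t_cal = {"ev1": 4.205, "ev2": 4.201}  # predicted by a velocity model

dd = dd_residual(t_obs["ev1"], t_obs["ev2"], t_cal["ev1"], t_cal["ev2"])
print(f"DD residual: {dd:+.3f} s")  # (4.213-4.198) - (4.205-4.201) = +0.011 s
```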


Author(s):  
Ricardo Marquez
Carlos F. M. Coimbra

This work presents an alternative, time-window-invariant metric for evaluating the quality of solar forecasting models. Conventional approaches use statistical quantities such as the root mean square error (RMSE) and/or correlation coefficients to evaluate model quality. The straightforward use of such statistical quantities to assess forecasting quality can be misleading because these metrics do not convey a measure of the variability of the time series underlying the solar irradiance data. In contrast, the quality metric proposed here, defined as the ratio of solar uncertainty to solar variability, compares forecasting error with solar variability directly. By making this error-to-variability comparison for different time windows, we show that the ratio is essentially a statistical invariant for each forecasting model employed, i.e., the ratio is preserved for widely different time horizons.
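A minimal sketch of this quality metric follows, assuming uncertainty is measured as the forecast RMSE over the window and variability as the standard deviation of the step changes in the observed series; these are simplified stand-ins for the authors' exact definitions, and all data are synthetic.

```python
import numpy as np

def uncertainty_to_variability(forecast, observed):
    """Ratio of solar uncertainty to solar variability for one time window.
    Uncertainty ~ forecast RMSE; variability ~ std of step changes
    (simplified stand-ins for the paper's definitions)."""
    u = np.sqrt(np.mean((forecast - observed) ** 2))  # forecast error
    v = np.std(np.diff(observed))                     # step-change variability
    return u / v

# Synthetic irradiance-like series and a noisy forecast of it.
rng = np.random.default_rng(0)
observed = 500.0 + np.cumsum(rng.normal(0.0, 10.0, 200))
forecast = observed + rng.normal(0.0, 15.0, 200)
print(f"U/V = {uncertainty_to_variability(forecast, observed):.2f}")
```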


Geophysics
2010
Vol 75 (6)
pp. V111-V118
Author(s):
Okwudili Orji
Walter Söllner
Leiv Jacob Gelius

Sea-surface profile and reflection coefficient estimates are vital input parameters to various seismic data processing applications. The common assumption of a flat sea surface when processing seismic data can lead to misinterpretations and mislocations of events. A new method of imaging the sea surface from decomposed wavefields has been developed. Wavefield separation is applied to the data acquired by a towed dual-sensor streamer containing collocated pressure and vertical particle velocity sensors to obtain upgoing and downgoing wavefields of the related sensors. Time-gated upgoing and downgoing wavefields corresponding to a given sensor are then extrapolated to the sea surface, where an imaging condition is applied so that the time-invariant shape of the sea surface can be recovered. By sliding the data time window, the temporal changes of the sea surface can be estimated correspondingly. Ray tracing and finite-difference methods were used to generate different controlled data sets for this feasibility study, to demonstrate the imaging principle and to test the image accuracy. The method was also tested on a first field-data example: a marginal-weather line from the North Sea.
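As a rough illustration of the imaging step, the sketch below applies one common stabilized deconvolution imaging condition to time-gated up- and downgoing traces assumed to have already been extrapolated to a candidate sea-surface point; the abstract does not specify the authors' exact operator, so this choice is an assumption. In the toy example, the zero-lag image value approximates the sea-surface reflection coefficient (-1).

```python
import numpy as np

def image_value(up, down, eps=1e-3):
    """Zero-lag value of a stabilized deconvolution imaging condition
    between time-gated upgoing and downgoing traces at one point.
    A nonzero image indicates a reflecting sea surface at that point."""
    U = np.fft.rfft(up)
    D = np.fft.rfft(down)
    stab = eps * np.max(np.abs(D)) ** 2          # stabilization term
    ratio = U * np.conj(D) / (np.abs(D) ** 2 + stab)
    return np.fft.irfft(ratio, n=len(up))[0]     # zero-lag sample

# Toy example: the downgoing trace is a sign-flipped copy of the upgoing one.
up = np.zeros(256); up[50] = 1.0
down = np.zeros(256); down[50] = -1.0            # free-surface-like reflection
print(f"image value: {image_value(up, down):+.3f}")  # close to -1
```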


2019
Vol 11 (3)
pp. 758
Author(s):
Laijun Zhao
Xiaoli Wang
Johan Stoeter
Yan Sun
Huiyong Li
...  

Combining conventional ground transport with a subway system for line-haul transport is a new mode of intra-city express delivery. Subway transportation can be used for the line-haul leg of intra-city express delivery services to reduce cost, improve efficiency, raise customer satisfaction, and alleviate road congestion and air pollution. To achieve this, we developed a path optimization model (POM) with time windows for intra-city express delivery that makes use of the subway system. Our model integrates the subway system with ground transportation in order to minimize the total delivery time. It considers the time-window requirements of the senders and the recipients and is constrained by the frequency of trains on the subway line. To solve the POM, we designed a genetic algorithm. The model was tested in a case study of a courier company in Shanghai, China. In addition to the basic scenario, the model was solved for four alternative scenarios. We then analyzed the influence of the number of vehicles, the frequency of trains on the subway line, and the client delivery time window on the total delivery time, client time-window satisfaction, and courier company costs. The results demonstrate that the total delivery time and the total time outside the time window decreased as the number of vehicles increased, as the train frequency along the subway line increased, and as the sender's time window widened. However, once the sender's time window widened beyond a certain threshold, the total delivery time and the total time outside the time window no longer decreased appreciably. The case study results can guide courier companies in path optimization for intra-city express delivery vehicles in combination with the subway network.
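The abstract does not give the objective in closed form; below is a minimal sketch of the kind of time-window fitness a genetic algorithm could evaluate for one candidate route. The penalty form and variable names are assumptions for illustration.

```python
def window_violation(arrival, tw_start, tw_end):
    """Time outside a client's delivery time window (0 if arrival is inside)."""
    if arrival < tw_start:
        return tw_start - arrival   # early: waiting time before the window opens
    if arrival > tw_end:
        return arrival - tw_end     # late: delay beyond the window
    return 0.0

def route_fitness(arrivals, windows, travel_time):
    """GA fitness for one route: total delivery time plus the total time
    outside the clients' time windows. The GA minimizes this over routes."""
    return travel_time + sum(window_violation(a, s, e)
                             for a, (s, e) in zip(arrivals, windows))

# Three deliveries, minutes after depot departure; one arrives 10 min late.
arrivals = [35, 70, 130]
windows = [(30, 60), (60, 90), (90, 120)]
print(route_fitness(arrivals, windows, travel_time=150))  # 150 + 10 = 160.0
```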


Geophysics
2006
Vol 71 (2)
pp. V31-V40
Author(s):
Stephen J. Arrowsmith
Leo Eisner

A fast, fully automatic technique to identify microseismic multiplets in borehole seismic data is developed. The technique may be applied in real time to either continuous data or detected-event data for a number of three-component receivers and does not require prior information such as P- or S-wave time picks. Peak crosscorrelation coefficients, evaluated in the frequency domain, are used as the basis for identifying microseismic doublets. The peak crosscorrelation coefficient at each receiver is evaluated with a weighted arithmetic average of the normalized correlation coefficients of each component. Each component is weighted by the maximum amplitude of the signal for that component to reduce the effect of noise on the calculations. The weighted average correlations are averaged over all receivers in a time window centered on a fixed lag time. The size of the time window is determined from the dominant period in the signal, and the lag time is the time that maximizes the average correlation coefficient. The technique is applied to a three-component passive seismic data set recorded at the Valhall field, North Sea. A large number of microseismic doublets are identified that can be grouped into multiplets, reducing the total number of absolute event locations by a factor of two. Seven large multiplets reflect repeated rerupturing (up to 30 times on a single fault) and significant stress release. Two major faults dominate the seismic activity, accounting for at least one-fourth of the observed events.
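The amplitude-weighted averaging of per-component correlations can be sketched as follows (computed in the time domain for brevity, whereas the authors evaluate peak correlations in the frequency domain; thresholds and multi-receiver averaging are omitted).

```python
import numpy as np

def weighted_peak_cc(event_a, event_b):
    """Weighted arithmetic average of per-component normalized
    cross-correlation peaks for one three-component receiver, with each
    component weighted by its maximum signal amplitude to damp noisy
    components."""
    peaks, weights = [], []
    for comp_a, comp_b in zip(event_a, event_b):             # e.g. Z, N, E
        a = comp_a - comp_a.mean()
        b = comp_b - comp_b.mean()
        cc = np.correlate(a, b, mode="full")
        cc /= np.linalg.norm(a) * np.linalg.norm(b) + 1e-12  # normalize to [-1, 1]
        peaks.append(cc.max())
        weights.append(np.abs(comp_a).max())                 # amplitude weight
    w = np.asarray(weights)
    return float(np.dot(w, peaks) / w.sum())

# Toy doublet: identical waveforms on one component, a noisier second one.
rng = np.random.default_rng(1)
z = rng.normal(0, 1, 400)
event_a = [z, 0.1 * z + rng.normal(0, 1, 400)]
event_b = [z.copy(), 0.1 * z + rng.normal(0, 1, 400)]
print(f"weighted peak CC: {weighted_peak_cc(event_a, event_b):.2f}")
```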


Author(s):  
Xinxiang Zhang
Stephen Arrowsmith
Sotirios Tsongas
Chris Hayward
Haoran Meng
...  

Abstract. Ground motions associated with aircraft overflights can cover a significant portion of the seismic data collected by shallowly emplaced seismometers, such as new nodal and Distributed Acoustic Sensing systems. This article describes the first published framework for automated detection of aircraft on single-channel and multichannel seismic data. The seismic data are converted to spectrograms in a sliding time window and classified as aircraft or nonaircraft in each window using a deep convolutional neural network trained with analyst-labeled data. A majority voting scheme is used to convert the output from the sequence of sliding time windows onto a decision time sequence for each channel and to combine the binary classifications on the decision time sequences across multiple channels. Precision, recall, and F-score are used to quantify the detection performance of the algorithm on nodal data using fourfold time-series cross validation. By applying our framework to data from the Sage Brush Flats nodal array in Southern California, we provide a benchmark performance and demonstrate the advantage of using an array of sensors.
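The per-channel voting step might be implemented as in the sketch below, which converts overlapping window labels into a per-time-step decision; the CNN classifier and the cross-channel combination are omitted, and the window geometry is an assumption.

```python
import numpy as np

def majority_vote(window_labels, window_len):
    """Map overlapping sliding-window classifications (0 = nonaircraft,
    1 = aircraft) onto a per-time-step decision sequence: each step takes
    the majority label among all windows that cover it."""
    n = len(window_labels) + window_len - 1      # number of time steps
    votes = np.zeros(n)
    counts = np.zeros(n)
    for i, label in enumerate(window_labels):    # window i covers steps i..i+L-1
        votes[i:i + window_len] += label
        counts[i:i + window_len] += 1
    return (votes / counts > 0.5).astype(int)

# Five windows of length 3, sliding by one step:
print(majority_vote([0, 1, 1, 1, 0], window_len=3))  # -> [0 0 1 1 1 0 0]
```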


Geophysics
2003
Vol 68 (1)
pp. 370-380
Author(s):
H. H. Hardy
Richard A. Beier
Jonathan D. Gaston

Local estimates of amplitude, frequency, and phase have been used in the past to characterize seismic data. In particular, these attributes have sometimes been successfully related to well attributes at the reservoir scale (net pay thickness, sand fraction, etc.). This paper introduces a method called SINFIT for computing local amplitude, frequency, and phase estimates of seismic traces over short time windows. The SINFIT method uses a sine-curve fitting approach. The method is shown to give more accurate and robust frequency estimates than four other common methods on a set of test traces where the true frequency components are known. The four methods compared with SINFIT are instantaneous frequency, zero-crossings, short-time Fourier analysis, and a more recent time-frequency method called AOK. In a field case with fluvial sands, an average frequency over a 30-ms time window of seismic data correlates with estimated shale volume from well logs. The SINFIT method gives an average frequency attribute that correlates more strongly with shale volume than the corresponding attributes from any of the other four methods.
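A minimal sketch of such a sine-curve fit is given below: over a short window, fit a*sin(2*pi*f*t) + b*cos(2*pi*f*t) by least squares on a grid of candidate frequencies and keep the best-fitting one. The grid search and parameterization here are assumptions, not the published SINFIT algorithm.

```python
import numpy as np

def sine_fit(trace, dt, freqs):
    """Local amplitude, frequency, and phase of a short windowed trace via
    least-squares sine fitting over a grid of candidate frequencies."""
    t = np.arange(len(trace)) * dt
    best = None
    for f in freqs:
        G = np.column_stack([np.sin(2 * np.pi * f * t),
                             np.cos(2 * np.pi * f * t)])
        coef, *_ = np.linalg.lstsq(G, trace, rcond=None)
        misfit = np.sum((trace - G @ coef) ** 2)
        if best is None or misfit < best[0]:
            best = (misfit, f, coef)
    _, f, (a, b) = best
    return np.hypot(a, b), f, np.arctan2(b, a)  # amplitude, frequency, phase

# A 30 ms window sampled at 1 ms containing a 40 Hz sinusoid.
dt = 0.001
trace = 2.0 * np.sin(2 * np.pi * 40.0 * np.arange(30) * dt + 0.5)
amp, f, ph = sine_fit(trace, dt, freqs=np.arange(10.0, 80.0, 1.0))
print(f"amp={amp:.2f}, f={f:.0f} Hz, phase={ph:.2f} rad")  # 2.00, 40 Hz, 0.50 rad
```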


Author(s):  
Xinyang Tao
Tangbin Xia
Lifeng Xi

This paper focuses on dynamic opportunistic maintenance scheduling for series systems. Based on the machine-level predictive maintenance (PdM) method, a novel TOC–VLLTW methodology is proposed that combines a theory-of-constraints (TOC) policy and a variable lead-lag time window (VLLTW) policy. The TOC policy assigns machines priorities according to their PdM durations to decrease system downtime when scheduling opportunistic maintenance. The VLLTW policy provides variable lead-lag time windows for different machines, allowing for more flexible and economical opportunistic maintenance schedules at the system level. The proposed methodology is demonstrated through a case study based on reliability information collected from a quayside container system. The results demonstrate the effectiveness of the TOC–VLLTW methodology.
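The abstract gives no formulas, but the core scheduling idea might be sketched as follows: at each system stoppage, maintain the machines whose lead-lag windows cover the opportunity, serving longer PdM durations first. Field names, window semantics, and the priority direction are assumptions for illustration.

```python
def in_window(pm_time, opportunity, lead, lag):
    """True if a stoppage at `opportunity` falls inside a machine's
    lead-lag window around its planned PdM time, i.e. its maintenance
    can be advanced or postponed to this opportunity."""
    return pm_time - lead <= opportunity <= pm_time + lag

def opportunistic_schedule(machines, opportunity):
    """Pick eligible machines and order them by PdM duration (longest
    first, a TOC-like priority intended to cut shared downtime)."""
    eligible = [m for m in machines
                if in_window(m["pm_time"], opportunity, m["lead"], m["lag"])]
    return sorted(eligible, key=lambda m: m["pdm_duration"], reverse=True)

machines = [  # times in hours; all values illustrative
    {"id": "M1", "pm_time": 120, "lead": 20, "lag": 10, "pdm_duration": 6},
    {"id": "M2", "pm_time": 150, "lead": 15, "lag": 15, "pdm_duration": 3},
    {"id": "M3", "pm_time": 105, "lead": 10, "lag": 5,  "pdm_duration": 8},
]
print([m["id"] for m in opportunistic_schedule(machines, opportunity=108)])
# -> ['M3', 'M1']
```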

