Estimating microseismic detectability of a surface-monitoring network using a downhole-monitoring array

Interpretation
2018
Vol 6 (3)
pp. SH107-SH115
Author(s):
Paweł Wandycz
Eryk Święch
Leo Eisner
Andrzej Pasternacki
Denis Anikiev
...

We have analyzed microseismic data sets recorded by surface and downhole monitoring arrays during the first hydraulic-fracturing experiment in Poland. Using the downhole-monitoring network, we were able to record and locate 844 microseismic events, including 10 perforation shots from six stages of the stimulation. Using the surface array, which was operational only during the first two stages of the stimulation, we detected two perforation shots and no microseismic events. To explain the poor detectability of the surface array, we analyzed the spectral content of the events recorded on the downhole array. We found that the detectability of the perforation shots on the surface array was consistent with the low-frequency part of the signal on the downhole recordings. Consistent with this observation, microseismic events whose low-frequency signal was weaker than that of the two detected perforation shots were not detected by the surface-monitoring array. Using the low-frequency part of the spectra of the events recorded by the downhole array, we predicted the surface-array detection threshold. We found that some events from the later stages could have been detected had the surface array been operational during that time.
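The threshold-prediction step lends itself to a short illustration. The sketch below is not the authors' code: the event traces, frequency band, and sampling rate are hypothetical placeholders. It ranks events by their mean spectral amplitude in a low-frequency band and flags those exceeding the weakest surface-detected perforation shot.

```python
# Hedged sketch: predict which downhole-recorded events would have been
# detectable at the surface by comparing their low-frequency spectral
# amplitude with that of the weakest perforation shot detected at the surface.
import numpy as np

def low_freq_amplitude(trace, fs, f_lo=2.0, f_hi=15.0):
    """Mean spectral amplitude of `trace` in the [f_lo, f_hi] Hz band (assumed band)."""
    spec = np.abs(np.fft.rfft(trace))
    freqs = np.fft.rfftfreq(trace.size, d=1.0 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return spec[band].mean()

def predict_surface_detections(events, detected_shots, fs):
    """Indices of events whose low-frequency amplitude reaches the threshold
    set by the weakest perforation shot still detected at the surface."""
    threshold = min(low_freq_amplitude(p, fs) for p in detected_shots)
    return [i for i, e in enumerate(events) if low_freq_amplitude(e, fs) >= threshold]

# Demo with synthetic traces: a strong event passes, a weak one does not.
rng = np.random.default_rng(0)
fs = 500.0                                            # Hz, assumed sampling rate
t = np.arange(0, 2.0, 1.0 / fs)
shot = np.sin(2 * np.pi * 8 * t) * np.exp(-3 * t)     # detected perforation shot
strong = 1.5 * shot + 0.1 * rng.standard_normal(t.size)
weak = 0.2 * shot + 0.1 * rng.standard_normal(t.size)
print(predict_surface_detections([strong, weak], [shot], fs))  # -> [0]
```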

Geophysics
2019
Vol 84 (4)
pp. KS143-KS153
Author(s):
Jubran Akram
Daniel Peter
David Eaton

Event detection is an essential component of microseismic data analysis. This process is typically carried out using a short-term-average/long-term-average ratio (STA/LTA) method, which is simple and computationally efficient but often yields inconsistent results for noisy data sets. We have aimed to optimize the performance of the STA/LTA method by testing different input forms of 3C waveform data and different characteristic functions (CFs), including a proposed k-mean CF. These tests are evaluated using receiver operating characteristic (ROC) analysis and are compared on synthetic and field data examples. Our analysis indicates that the STA/LTA method using a k-mean CF improves the detection sensitivity and yields more robust event detection on noisy data sets than some previous approaches. In addition, microseismic events are detected efficiently on field data examples using the same detection threshold obtained from the ROC analysis on synthetic data examples. We recommend using the Youden index from ROC analysis on a training subset, extracted from the continuous data, to further improve the detection threshold for field microseismic data.
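As an illustration of the detector and of threshold selection via the Youden index, here is a minimal sketch. It is not the authors' implementation: their k-mean CF is not reproduced, a plain energy CF stands in for it, and the window lengths and synthetic trace are arbitrary.

```python
# Generic STA/LTA detector on an energy characteristic function, plus
# threshold selection via the Youden index J = TPR - FPR on labeled data.
import numpy as np

def sta_lta(cf, nsta, nlta):
    """Classic STA/LTA ratio of a characteristic function `cf`."""
    sta = np.convolve(cf, np.ones(nsta) / nsta, mode="same")
    lta = np.convolve(cf, np.ones(nlta) / nlta, mode="same")
    lta[lta == 0] = np.finfo(float).tiny
    return sta / lta

def youden_threshold(scores, labels, candidates):
    """Pick the threshold maximizing TPR - FPR on a labeled training subset."""
    best_t, best_j = candidates[0], -np.inf
    for t in candidates:
        pred = scores >= t
        tpr = (pred & labels).sum() / max(labels.sum(), 1)
        fpr = (pred & ~labels).sum() / max((~labels).sum(), 1)
        if tpr - fpr > best_j:
            best_j, best_t = tpr - fpr, t
    return best_t

rng = np.random.default_rng(1)
x = 0.1 * rng.standard_normal(5000)
x[2000:2200] += np.sin(2 * np.pi * 0.05 * np.arange(200))  # synthetic event
ratio = sta_lta(x**2, nsta=50, nlta=500)                   # energy CF stand-in
labels = np.zeros(x.size, dtype=bool)
labels[2000:2200] = True
t = youden_threshold(ratio, labels, np.linspace(1, 10, 50))
print("threshold:", t, "event detected:", (ratio[2000:2200] >= t).any())
```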


Entropy
2021
Vol 23 (4)
pp. 459
Author(s):
Anastasios A. Tsonis
Geli Wang
Wenxu Lu
Sergey Kravtsov
Christopher Essex
...

Proxy temperature records, comprising local time series, regional averages from areas around the globe, and global averages, are analyzed using the Slow Feature Analysis (SFA) method. As explained in the paper, SFA is much more effective than traditional Fourier analysis in identifying slowly varying (low-frequency) signals in data sets of limited length. We find a striking gap from ~1000 to ~20,000 years, which separates intrinsic climatic oscillations with periods ranging from ~60 years to ~1000 years from the longer-timescale periodicities (20,000 yr and beyond) involving external forcing associated with Milankovitch cycles. The absence of natural oscillations with periods within the gap is consistent with cumulative evidence from past data analyses as well as with earlier theoretical and modeling studies.
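For readers unfamiliar with SFA, a minimal linear variant can be sketched in a few lines: time-delay embed the series, whiten it, and extract the direction whose temporal derivative has the smallest variance. This is an assumed textbook formulation, not the authors' code; the embedding dimension and test signal are arbitrary.

```python
# Minimal linear SFA sketch: the "slowest" feature is the whitened direction
# whose first difference has minimal variance.
import numpy as np

def slow_feature(x, embed_dim=20):
    # Time-delay embedding of the scalar series x.
    X = np.column_stack([x[i:len(x) - embed_dim + i + 1] for i in range(embed_dim)])
    X = X - X.mean(axis=0)
    # Whiten via eigendecomposition of the covariance matrix.
    evals, evecs = np.linalg.eigh(np.cov(X, rowvar=False))
    keep = evals > 1e-10
    Z = X @ (evecs[:, keep] / np.sqrt(evals[keep]))
    # Slowest direction = smallest eigenvalue of the derivative covariance.
    dZ = np.diff(Z, axis=0)
    dvals, dvecs = np.linalg.eigh(np.cov(dZ, rowvar=False))
    return Z @ dvecs[:, 0]  # slowest feature time series

# Demo: recover a slow oscillation buried in fast noise.
t = np.arange(4000)
x = np.sin(2 * np.pi * t / 1000) + 0.5 * np.random.default_rng(2).standard_normal(t.size)
print(slow_feature(x).shape)
```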


Genetics
1997
Vol 147 (4)
pp. 1855-1861
Author(s):
Montgomery Slatkin
Bruce Rannala

Abstract A theory is developed that provides the sampling distribution of low-frequency alleles at a single locus under the assumption that each allele is the result of a unique mutation. The number of copies of each allele is assumed to follow a linear birth-death process with sampling. If the population is of constant size, standard results from the theory of birth-death processes show that the distribution of the number of copies of each allele is logarithmic and that the joint distribution of the numbers of copies of k alleles found in a sample of size n follows the Ewens sampling distribution. If the population from which the sample was obtained was increasing in size, if there are different selective classes of alleles, or if there are differences in penetrance among alleles, the Ewens distribution no longer applies. Likelihood functions for a given set of observations are obtained under the different alternative hypotheses. These results are applied to published data from the BRCA1 locus (associated with early-onset breast cancer) and the factor VIII locus (associated with hemophilia A) in humans. In both cases, the sampling distribution of alleles allows rejection of the null hypothesis, but relatively small deviations from the null model can account for the data. In particular, roughly the same population growth rate appears consistent with both data sets.
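The constant-size null model admits a compact numerical illustration: the copy number n of each allele follows the logarithmic distribution P(n) = -q^n / (n ln(1 - q)), so its parameter can be fit by maximum likelihood. The sketch below is illustrative only; the copy-number data and parameterization are made up, not the BRCA1 or factor VIII data.

```python
# Fit the logarithmic (log-series) distribution P(n) = -q**n / (n*log(1-q))
# to observed allele copy numbers by maximum likelihood.
import numpy as np
from scipy.optimize import minimize_scalar

def log_series_loglik(q, copy_numbers):
    n = np.asarray(copy_numbers, dtype=float)
    # log P(n) = n*log(q) - log(n) - log(-log(1-q)), summed over alleles
    return np.sum(n * np.log(q) - np.log(n) - np.log(-np.log1p(-q)))

# Example: copy numbers of k = 10 alleles observed in a sample (made up).
copies = [1, 1, 2, 1, 5, 3, 1, 12, 1, 2]
res = minimize_scalar(lambda q: -log_series_loglik(q, copies),
                      bounds=(1e-6, 1 - 1e-6), method="bounded")
print("MLE of q:", res.x)
```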


Hydrology and Earth System Sciences
2018
Vol 22 (6)
pp. 3105-3124
Author(s):
Zilefac Elvis Asong
Howard Simon Wheater
Barrie Bonsal
Saman Razavi
Sopan Kurkute

Abstract. Drought is a recurring extreme climate event and among the most costly natural disasters in the world. This is particularly true over Canada, where drought is both a frequent and damaging phenomenon, with impacts on regional water resources, agriculture, industry, aquatic ecosystems, and health. However, nationwide drought assessments are currently lacking and are hampered by limited ground-based observations. This study provides a comprehensive analysis of historical droughts over the whole of Canada, including the role of large-scale teleconnections. Drought events are characterized by the Standardized Precipitation Evapotranspiration Index (SPEI) over various temporal scales (1, 3, 6, and 12 consecutive months, 6 months from April to September, and 12 months from October to September) applied to different gridded monthly data sets for the period 1950–2013. The Mann–Kendall test, rotated empirical orthogonal function analysis, the continuous wavelet transform, and wavelet coherence analysis are used, respectively, to investigate the trend, spatio-temporal patterns, periodicity, and teleconnectivity of drought events. Results indicate that southern (northern) parts of the country experienced significant trends towards drier (wetter) conditions, although substantial variability exists. Two spatially well-defined regions with different temporal evolutions of drought were identified: the Canadian Prairies and northern central Canada. The analyses also revealed the presence of a dominant periodicity of between 8 and 32 months in the Prairie region and between 8 and 40 months in the northern central region. These cycles of low-frequency variability are found to be associated principally with the Pacific–North American (PNA) pattern and the Multivariate El Niño/Southern Oscillation Index (MEI) relative to other considered large-scale climate indices. This study is the first of its kind to identify dominant periodicities in drought variability over the whole of Canada in terms of when drought events occur, their duration, and how often they occur.
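Of the methods listed, the Mann–Kendall test is the simplest to illustrate. Below is a minimal sketch in its assumed standard form, without the tie or autocorrelation corrections a production analysis of gridded SPEI series would need, applied to a synthetic drying series.

```python
# Minimal Mann-Kendall trend test: S statistic, normal approximation,
# two-sided p-value. Negative z indicates a trend towards drier conditions.
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    x = np.asarray(x, dtype=float)
    n = x.size
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2 * (1 - norm.cdf(abs(z)))  # two-sided p-value
    return z, p

rng = np.random.default_rng(3)
spei = -0.01 * np.arange(64) + rng.standard_normal(64)  # drying trend + noise
print(mann_kendall(spei))
```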


Author(s):
Pradeep Lall
Tony Thomas

Electronics in automotive underhood environments are used for a number of safety-critical functions. Reliable continued operation of electronic safety systems without catastrophic failure is important for safe operation of the vehicle. There is a need for prognostication methods that can be integrated with on-board sensors for assessment of accrued damage and impending failure. In this paper, lead-free electronic assemblies consisting of daisy-chained parts have been subjected to high-temperature vibration at 5g and 155°C. Spectrograms have been used to identify the emergence of new low-frequency components with damage progression in electronic assemblies. Principal component analysis (PCA) has been used to reduce the dimensionality of large data sets and identify patterns without the loss of features that signify damage progression and impending failure. The variance of the principal components of the instantaneous frequency has been shown to exhibit an increasing trend during initial damage progression, attaining a maximum value and then decreasing prior to failure. This characteristic behavior of the instantaneous frequency over the period of vibration can be used as a health-monitoring feature for identifying impending failures in automotive electronics. Further, damage progression has been studied using the Empirical Mode Decomposition (EMD) technique in order to decompose the signals into Intrinsic Mode Functions (IMFs). The IMFs were screened based on their kurtosis values, and a reconstructed strain signal was formulated from all IMFs with a kurtosis value greater than three. PCA of the reconstructed strain signal gave clearer patterns that can be used for prognostication of the life of the components.
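The kurtosis-based IMF screening step can be sketched compactly. The code below assumes the third-party PyEMD (EMD-signal) package and a synthetic strain signal; it is an illustration of the described step, not the authors' implementation.

```python
# Decompose a strain signal with EMD, keep only IMFs whose (non-excess)
# kurtosis exceeds 3, and rebuild the signal from them before PCA.
import numpy as np
from scipy.stats import kurtosis
from PyEMD import EMD  # third-party package: pip install EMD-signal

def reconstruct_by_kurtosis(strain, threshold=3.0):
    imfs = EMD().emd(strain)                  # Intrinsic Mode Functions
    k = kurtosis(imfs, axis=1, fisher=False)  # plain kurtosis (normal = 3)
    selected = imfs[k > threshold]
    return selected.sum(axis=0) if selected.size else np.zeros_like(strain)

t = np.linspace(0, 1, 2000)
strain = np.sin(2 * np.pi * 5 * t) + 0.3 * np.random.default_rng(4).standard_normal(t.size)
print(reconstruct_by_kurtosis(strain).shape)
```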


2021
Author(s):
Zheng Wei

The research first proposes a vocabulary-learning technique, the word part technique, and then tests its effectiveness in aiding vocabulary learning and retention. The first part of the thesis centers around the idea that the knowledge of the first 2000 words language learners already possess may give them easier access to words of other frequency levels, because the root parts of low-frequency new words share form and meaning similarities with the high-frequency known words. The research addresses the issue in two stages: quantifying the number of words that can be accessed through analysis of their word roots, and analyzing the pedagogical usefulness of the accessible words. A Comprehensive Etymological Dictionary of the English Language (Klein, 1966) was used as the source to show the possible formal and meaning connections among words. All the words in the first 2000-word list were first looked up individually, and all the cognates provided under each of these words were collected and placed under each of the high-frequency words if they met the requirement that their roots share more than one letter and/or more than one phoneme with the roots of the first 2000 known words. After the data was roughly gathered, three criteria were applied to filter it: the frequency criterion, the meaning criterion, and the form criterion. In applying the frequency criterion, words with frequency levels lower than the tenth thousand were removed from the data. In applying the meaning criterion, hints were given to show the semantic relations between the higher-frequency words and the first 2000 words. The hints were then rated on a scale measuring meaning transparency. Words rated at level 5 on the scale were considered inaccessible; words rated at levels 1, 2a, 2b, 2c, and 3a were considered easy to access. In applying the form criterion, calculations were done for each semantically accessible word to show its phonological and orthographic similarity to the known word. Words whose phonological or orthographic similarity scores were larger than 0.5 were considered phonologically or orthographically easy to access. Finally, the "Find" function of Microsoft Word was used to check the data by picking up any words that might have been missed in the first round of data gathering. The above procedures resulted in 2156 word families that can be accessed through the meaning and form relations of their root parts with the first 2000 words. Among the 2156 word families, 739 can be accessed easily, and are therefore more pedagogically useful, while 259 can be accessed only with difficulty. Twenty-one pedagogically useful form constants were selected because they give access to more unknown lower-frequency words than other form constants. In the second part of the thesis, an experiment was conducted to test the effectiveness of the word part technique in comparison with the keyword technique and self-strategy learning. The results show that, with the experienced Chinese EFL learners studied, the keyword technique is slightly inferior to the word part technique and to self-strategy learning.
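The thesis's exact similarity formulas are not given in this abstract, so the following is only an illustrative stand-in: a Dice coefficient over letter bigrams as an orthographic-similarity score, used with the 0.5 cutoff mentioned above.

```python
# Toy orthographic similarity: Dice coefficient over letter bigrams.
# A word counts as "orthographically easy to access" if its score > 0.5.
def bigrams(word):
    return [word[i:i + 2] for i in range(len(word) - 1)]

def dice_similarity(a, b):
    ba, bb = bigrams(a.lower()), bigrams(b.lower())
    if not ba or not bb:
        return 0.0
    overlap = sum(min(ba.count(g), bb.count(g)) for g in set(ba))
    return 2.0 * overlap / (len(ba) + len(bb))

print(dice_similarity("dict", "dictate"))  # ~0.67, passes the 0.5 cutoff
print(dice_similarity("dict", "predict"))  # lower, shared root but longer word
```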


Advances in Geosciences
2020
Vol 54
pp. 129-136
Author(s):
Camilla Rossi
Francesco Grigoli
Simone Cesca
Sebastian Heimann
Paolo Gasperini
...

Abstract. Geothermal systems in the Hengill volcanic area, SW Iceland, have been exploited for electrical power and heat production since the late 1960s. Today the two largest operating geothermal power plants are located at Nesjavellir and Hellisheiði. This is a complex tectonic and geothermal site, located at the triple junction between the Reykjanes Peninsula (RP), the Western Volcanic Zone (WVZ), and the South Iceland Seismic Zone (SISZ). The region is seismically highly active, with several thousand earthquakes located yearly. The origin of such earthquakes may be either natural or anthropogenic. The analysis of microseismicity can provide useful information on natural active processes in tectonic, geothermal, and volcanic environments, as well as on the physical mechanisms governing induced events. Here, we investigate the microseismicity occurring in the Hengill area using a very dense broadband seismic monitoring network deployed at Hellisheiði since November 2018, and we apply a sophisticated full-waveform-based method for detection and location. Improved locations and a first characterization indicate that it is possible to identify different types of microseismic clusters, associated with either production/injection or the tectonic setting of the geothermal area.
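The idea behind full-waveform (migration-based) detection can be illustrated with a toy delay-and-stack sketch: shift each station's characteristic function by the travel time from a trial source node and stack, so that a real event produces a coherent peak at the correct node and origin time. The geometry, travel times, and noise below are toy values, not the actual method or catalog.

```python
# Toy delay-and-stack detector over a grid of trial source locations.
import numpy as np

def stack_detector(cfs, travel_times_samples):
    """cfs: (n_stations, n_samples); travel_times_samples: (n_grid, n_stations)."""
    n_grid, n_sta = travel_times_samples.shape
    n = cfs.shape[1]
    best = np.zeros(n_grid)
    for g in range(n_grid):
        stack = np.zeros(n)
        for s in range(n_sta):
            shift = travel_times_samples[g, s]
            stack[:n - shift] += cfs[s, shift:]  # align to common origin time
        best[g] = stack.max()
    return best.argmax(), best.max()

# 3 stations, 2 trial grid nodes; plant an event whose moveout matches node 1.
rng = np.random.default_rng(5)
cfs = 0.1 * rng.random((3, 500))
tt = np.array([[10, 20, 30], [15, 25, 40]])  # samples, per node x station
for s, dt in enumerate(tt[1]):
    cfs[s, 100 + dt] += 1.0
print(stack_detector(cfs, tt))  # -> node 1 wins with a coherent stack
```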


Geoscientific Instrumentation, Methods and Data Systems
2014
Vol 3 (2)
pp. 153-177
Author(s):
P. Robert
N. Cornilleau-Wehrlin
R. Piberne
Y. de Conchy
C. Lacombe
...

Abstract. The main part of the Cluster Spatio-Temporal Analysis of Field Fluctuations (STAFF) experiment consists of triaxial search coils allowing measurement of the three magnetic components of waves from 0.1 Hz up to 4 kHz. Two data sets are produced: one by a module that filters and transmits the corresponding waveform up to either 10 or 180 Hz (STAFF-SC), and the second by the onboard Spectrum Analyser (STAFF-SA), which computes the elements of the spectral matrix for five components of the waves, 3 × B and 2 × E (from the EFW experiment), in the frequency range 8 Hz to 4 kHz. In order to explain how the output signals of the search coils are calibrated, the transfer functions of the different parts of the instrument are described, as well as the way telemetry data are transformed into physical units across the various coordinate systems, from the spinning sensors to a fixed and known frame. The instrument sensitivity is discussed. Cross-calibration inside STAFF (SC and SA) is presented. Results of cross-calibration between the STAFF search coils and the Cluster Fluxgate Magnetometer (FGM) data are discussed. It is shown that these cross-calibrations lead to agreement between both data sets at low frequency within a 2% error. By means of statistics over 10 years, it is shown that the functionalities and characteristics of both instruments have not changed during this period.
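Transfer-function calibration of the kind described, converting telemetry counts into physical units, reduces to a frequency-domain deconvolution. The sketch below uses a made-up search-coil-like response with gain rising with frequency, not the actual STAFF transfer functions or gains.

```python
# Generic sketch of transfer-function calibration: deconvolve the complex
# instrument response H(f) from a counts time series in the frequency domain.
import numpy as np

def calibrate(counts, fs, response):
    spec = np.fft.rfft(counts)
    freqs = np.fft.rfftfreq(counts.size, d=1.0 / fs)
    H = response(freqs)
    dead = np.abs(H) < 1e-12           # bands with no sensitivity (e.g. DC)
    H = np.where(dead, 1.0, H)
    spec = np.where(dead, 0.0, spec)
    return np.fft.irfft(spec / H, n=counts.size)

# Placeholder search-coil-like response (dB/dt pickup: gain rises with f).
toy_response = lambda f: 1e3 * (1j * f) / (1 + 1j * f / 1000.0)

fs = 8000.0
t = np.arange(0, 1, 1 / fs)
b_true = np.sin(2 * np.pi * 50 * t)    # "true" field, arbitrary units
counts = np.fft.irfft(np.fft.rfft(b_true)
                      * toy_response(np.fft.rfftfreq(t.size, 1 / fs)), n=t.size)
b_rec = calibrate(counts, fs, toy_response)
print(np.max(np.abs(b_rec - b_true)))  # tiny residual: response removed
```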


Interpretation
2018
Vol 6 (3)
pp. SH39-SH48
Author(s):
Wojciech Gajek
Jacek Trojanowski
Michał Malinowski
Marek Jarosiński
Marko Riedel

A precise velocity model is necessary to obtain reliable locations of microseismic events, which provide information about the effectiveness of hydraulic stimulation. Seismic anisotropy plays an important role in microseismic event location by imposing a dependency between wave velocity and propagation direction. Building an anisotropic velocity model that accounts for this effect allows for more accurate location of microseismic events. We have used downhole microseismic records from a pilot hydraulic-fracturing experiment in a Lower Paleozoic shale gas play in the Baltic Basin, Northern Poland, to obtain accurate microseismic event locations. We have developed a workflow for constructing a vertical transverse isotropy (VTI) velocity model when facing a challenging absence of horizontally polarized S-waves in perforation-shot data, which carry information about Thomsen's γ parameter and provide valuable constraints for locating microseismic events. We extract effective VP0, VS0 and ε, δ for each layer from the P- and SV-wave arrivals of perforation shots, whereas the unresolved γ is retrieved afterward from the SH-SV-wave delay time of selected microseismic events. The inverted velocity model provides more reliable locations of microseismic events, which then become an essential input for evaluating the effectiveness of the hydraulic stimulation job in its geomechanical context. We evaluate the influence of the preexisting fracture sets, and of the obliquity between the borehole trajectory and the principal horizontal stress direction, on the hydraulic treatment performance. The fracturing fluid migrates to previously fractured zones, while the growth of the microseismic volume in consecutive stages is caused by increased penetration of the overlying lithologic formations.
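The role of each Thomsen parameter is easiest to see in the weak-anisotropy phase-velocity approximations (Thomsen, 1986): γ enters only the SH velocity, which is why it stays unresolved without horizontally polarized S-waves. The layer values below are toy numbers, not the inverted model.

```python
# Thomsen weak-anisotropy phase velocities for a VTI layer.
import numpy as np

def vti_phase_velocities(theta, vp0, vs0, eps, delta, gamma):
    """theta: phase angle from the vertical symmetry axis, in radians."""
    s2, c2 = np.sin(theta) ** 2, np.cos(theta) ** 2
    vp = vp0 * (1 + delta * s2 * c2 + eps * s2 ** 2)
    vsv = vs0 * (1 + (vp0 / vs0) ** 2 * (eps - delta) * s2 * c2)
    vsh = vs0 * (1 + gamma * s2)      # gamma appears only here
    return vp, vsv, vsh

theta = np.radians(45.0)
print(vti_phase_velocities(theta, vp0=3800.0, vs0=2200.0,
                           eps=0.15, delta=0.05, gamma=0.10))
```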


Geophysics
2021
pp. 1-66
Author(s):
Guanqun Sheng
Shuangyu Yang
Xiaolong Guo
Xingong Tang

Arrival-time picking of microseismic events is a critical procedure in microseismic data processing. However, as field monitoring data contain many microseismic events with low signal-to-noise ratios (SNRs), traditional arrival-time picking methods based on the instantaneous characteristics of seismic signals cannot meet the picking accuracy and efficiency requirements of microseismic monitoring, owing to the large volume of monitoring data. Conversely, methods based on deep neural networks can significantly improve arrival-time picking accuracy and efficiency in low-SNR environments. We therefore propose a deep convolutional network that combines the U-net and DenseNet approaches to pick arrival times automatically. This novel network, called MSNet, not only retains the spatial information of any input signal or profile, following the U-net, but also extracts and integrates more essential features of events and non-events through dense blocks, thereby further improving picking accuracy and efficiency. An effective workflow is developed to verify the superiority of the proposed method. First, we describe the structure of MSNet and the workflow of the proposed picking method. Then, datasets are constructed using varied microseismic traces from field microseismic monitoring records and from finite-difference forward modeling of microseismic data to train the network. Subsequently, hyperparameter tuning is conducted to optimize MSNet. Finally, we test MSNet using modeled signals with different SNRs and field microseismic data from different monitoring areas. Comparing the picking results of the proposed method with those of the U-net and of the short-term average/long-term average (STA/LTA) method verifies the effectiveness of the proposed method. The arrival-picking results for synthetic data and field microseismic data show that the proposed network has increased adaptability and can achieve high accuracy in picking the arrival times of microseismic events.
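As rough intuition for the two named ingredients (not the published MSNet, whose exact layer configuration and weights are in the paper), here is a minimal PyTorch sketch: a DenseNet-style block in which each layer sees all previous feature maps, inside a small U-net-style encoder/decoder producing a per-sample arrival probability for a 1-D trace.

```python
# Minimal DenseNet-block + U-net-style picker sketch for 1-D traces.
import torch
import torch.nn as nn

class DenseBlock1d(nn.Module):
    def __init__(self, in_ch, growth=8, layers=3):
        super().__init__()
        self.layers = nn.ModuleList([
            nn.Sequential(nn.Conv1d(in_ch + i * growth, growth, 3, padding=1),
                          nn.ReLU())
            for i in range(layers)])
        self.out_ch = in_ch + layers * growth

    def forward(self, x):
        for layer in self.layers:
            x = torch.cat([x, layer(x)], dim=1)  # dense connectivity
        return x

class TinyUNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc = DenseBlock1d(1)
        self.down = nn.MaxPool1d(2)
        self.mid = DenseBlock1d(self.enc.out_ch)
        self.up = nn.Upsample(scale_factor=2)
        self.dec = nn.Conv1d(self.mid.out_ch + self.enc.out_ch, 1, 1)

    def forward(self, x):
        e = self.enc(x)                          # encoder features
        m = self.up(self.mid(self.down(e)))      # bottleneck, upsampled
        return torch.sigmoid(self.dec(torch.cat([m, e], dim=1)))  # P(arrival)

trace = torch.randn(1, 1, 512)   # one single-channel trace
print(TinyUNet()(trace).shape)   # -> torch.Size([1, 1, 512])
```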

