3D distributed and dispersed source array acquisition and data processing

2020 ◽  
Vol 39 (6) ◽  
pp. 392-400
Author(s):  
Constantinos Tsingas ◽  
Mohammed S. Almubarak ◽  
Woodon Jeong ◽  
Abdulrahman Al Shuhail ◽  
Zygmunt Trzesniowski

Numerous field acquisition examples and case studies have demonstrated the importance of recording, processing, and interpreting broadband land data. In most seismic acquisition surveys, three main objectives should be considered: (1) dense spatial source and receiver locations to achieve optimum subsurface illumination and wavefield sampling; (2) coverage of the full frequency spectrum, i.e., broadband acquisition; and (3) cost efficiency. Consequently, efforts have been made to improve seismic vibratory sources, giving them the ability to emit both lower (approximately 1.5 Hz) and higher (approximately 120 Hz) frequencies, and to improve receivers by utilizing single, denser, and lighter digital sensors. All these developments achieve both operational benefits (e.g., weight, optimized power consumption) and geophysical benefits (e.g., amplitude and phase response, vector fidelity, tilt detection). As part of the effort to reduce the acquisition cycle time, increase productivity, and improve seismic imaging and resolution while optimizing costs, a novel seismic acquisition survey was conducted employing 24 vibrators generating two different types of sweeps in a 3D unconstrained decentralized and dispersed source array field configuration. During this novel blended acquisition design, the crew reached a maximum of 65,000 vibrator points during 24 hours of continuous recording, which represents significantly higher productivity than a conventional seismic crew operating in the same area using a nonblended centralized source mode. Applying novel and newly developed deblending algorithms, high-resolution images were obtained. In addition, the two data sets (i.e., from the low-frequency and the medium-high-frequency sources) were merged to obtain full-bandwidth broadband seismic images. Data comparisons between the distributed blended and nonblended conventional surveys, acquired by the same crew during the same time over the same area, showed that the two data sets are very similar in the poststack and prestack domains.
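As a rough illustration of the final merging step, the sketch below combines a low-frequency trace and a mid/high-frequency trace with a cosine-tapered crossover in the frequency domain. The crossover frequency, taper width, and NumPy implementation are illustrative assumptions, not the processing actually applied to the survey data.

```python
import numpy as np

def merge_broadband(trace_lf, trace_hf, dt, f_cross=8.0, width=2.0):
    """Merge a low-frequency and a mid/high-frequency trace of equal length
    using a cosine-tapered crossover in the frequency domain. The crossover
    frequency f_cross and taper width (Hz) are illustrative values only."""
    n = len(trace_lf)
    freqs = np.fft.rfftfreq(n, d=dt)
    # Weight is 1 below the crossover band, 0 above it, smoothed in between
    w_lf = np.clip((f_cross + width / 2 - freqs) / width, 0.0, 1.0)
    w_lf = 0.5 - 0.5 * np.cos(np.pi * w_lf)
    w_hf = 1.0 - w_lf
    spec = w_lf * np.fft.rfft(trace_lf) + w_hf * np.fft.rfft(trace_hf)
    return np.fft.irfft(spec, n=n)

# Toy usage with synthetic traces standing in for the two source data sets
dt = 0.002
t = np.arange(0, 4.0, dt)
lf = np.sin(2 * np.pi * 3.0 * t)    # low-frequency source data
hf = np.sin(2 * np.pi * 40.0 * t)   # mid/high-frequency source data
broadband = merge_broadband(lf, hf, dt)
```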

Author(s):  
G. Y. Fan ◽  
J. M. Cowley

It is well known that structural information about the specimen is not always faithfully transferred through the electron microscope. Firstly, the spatial frequency spectrum is modulated by the transfer function (TF) at the focal plane. Secondly, the spectrum suffers a high-frequency cut-off imposed by the aperture (or, effectively, by damping terms such as chromatic aberration). While these effects do not fundamentally alter the imaging of crystal periodicity as long as the low-order Bragg spots lie inside the aperture (although the contrast may be reversed), they can completely change the appearance of images of amorphous materials. Because the spectrum of amorphous materials is continuous, its modulation emphasizes some components while weakening others. In particular, the cut-off of high-frequency components, which contribute to the image of an amorphous specimen just as strongly as low-frequency components, can have a fundamental effect. This can be illustrated through computer simulation. Imaging a white-noise object with an electron microscope free of TF limitation gives Fig. 1a, which is obtained by Fourier transformation of a constant-amplitude spectrum combined with computer-generated random phases.
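The simulation described above can be reproduced in outline as follows; the image size, aperture cut-off radius, and NumPy implementation are illustrative assumptions rather than the parameters used for Fig. 1a.

```python
import numpy as np

# White-noise object: inverse Fourier transform of a constant-amplitude
# spectrum with computer-generated random phases (analogue of Fig. 1a)
rng = np.random.default_rng(0)
n = 256
phases = rng.uniform(0.0, 2.0 * np.pi, size=(n, n))
spectrum = np.exp(1j * phases)                  # constant amplitude, random phase
whitenoise_image = np.fft.ifft2(spectrum).real

# Aperture (high-frequency cut-off): keep only spatial frequencies inside a
# circular cut-off; the radius of 0.15 cycles/pixel is arbitrary here
fy = np.fft.fftfreq(n)[:, None]
fx = np.fft.fftfreq(n)[None, :]
aperture = (fx**2 + fy**2) <= 0.15**2
filtered_image = np.fft.ifft2(spectrum * aperture).real
```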


Entropy ◽  
2021 ◽  
Vol 23 (4) ◽  
pp. 459
Author(s):  
Anastasios A. Tsonis ◽  
Geli Wang ◽  
Wenxu Lu ◽  
Sergey Kravtsov ◽  
Christopher Essex ◽  
...  

Proxy temperature records, comprising local time series, regional averages from areas around the globe, and global averages, are analyzed using the Slow Feature Analysis (SFA) method. As explained in the paper, SFA is much more effective than traditional Fourier analysis in identifying slowly varying (low-frequency) signals in data sets of limited length. We find a striking gap from ~1000 to ~20,000 years, which separates intrinsic climatic oscillations with periods ranging from ~60 to ~1000 years from longer time-scale periodicities (20,000+ yr) involving external forcing associated with Milankovitch cycles. The absence of natural oscillations with periods within the gap is consistent with cumulative evidence based on past data analyses, as well as with earlier theoretical and modeling studies.
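For readers unfamiliar with the method, a minimal linear-SFA sketch is shown below. It follows the generic textbook formulation (whitening followed by an eigendecomposition of the time-derivative covariance) and is not the exact pipeline used in the paper; the toy time-delay embedding is likewise an illustrative assumption.

```python
import numpy as np

def linear_sfa(x, n_components=1):
    """Minimal linear Slow Feature Analysis.
    x: array of shape (T, d) of (possibly time-embedded) observations.
    Returns the n_components slowest-varying output signals."""
    x = x - x.mean(axis=0)
    # Whiten the data
    cov = np.cov(x, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    keep = evals > 1e-12
    whiten = evecs[:, keep] / np.sqrt(evals[keep])
    z = x @ whiten
    # Slow features minimize the variance of the time derivative
    dz = np.diff(z, axis=0)
    d_evals, d_evecs = np.linalg.eigh(np.cov(dz, rowvar=False))
    return z @ d_evecs[:, :n_components]   # smallest eigenvalues = slowest features

# Toy usage: a slow trend hidden under a fast oscillation
t = np.linspace(0, 100, 4000)
series = np.sin(0.05 * t) + 0.5 * np.sin(5.0 * t)
embedded = np.column_stack([series[:-20], series[10:-10], series[20:]])
slowest = linear_sfa(embedded)
```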


Genetics ◽  
1997 ◽  
Vol 147 (4) ◽  
pp. 1855-1861 ◽  
Author(s):  
Montgomery Slatkin ◽  
Bruce Rannala

Abstract A theory is developed that provides the sampling distribution of low-frequency alleles at a single locus under the assumption that each allele is the result of a unique mutation. The number of copies of each allele is assumed to follow a linear birth-death process with sampling. If the population is of constant size, standard results from the theory of birth-death processes show that the distribution of the number of copies of each allele is logarithmic and that the joint distribution of the numbers of copies of k alleles found in a sample of size n follows the Ewens sampling distribution. If the population from which the sample was obtained was increasing in size, if there are different selective classes of alleles, or if there are differences in penetrance among alleles, the Ewens distribution no longer applies. Likelihood functions for a given set of observations are obtained under different alternative hypotheses. These results are applied to published data from the BRCA1 locus (associated with early-onset breast cancer) and the factor VIII locus (associated with hemophilia A) in humans. In both cases, the sampling distribution of alleles allows rejection of the null hypothesis, but relatively small deviations from the null model can account for the data. In particular, roughly the same population growth rate appears consistent with both data sets.
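The neutral null model referred to here is the Ewens sampling formula; a minimal sketch of evaluating it for an observed allele configuration is given below. This is the generic textbook formula with an illustrative example, not the likelihood machinery developed in the paper.

```python
from math import factorial, prod

def ewens_probability(allele_counts, theta):
    """Ewens sampling formula: probability of a sample whose k alleles have the
    given copy numbers (e.g. [5, 2, 1] for alleles seen 5, 2, and 1 times)
    under the neutral, constant-size model with mutation parameter theta."""
    n = sum(allele_counts)
    # a[j] = number of distinct alleles represented exactly j times
    a = {}
    for c in allele_counts:
        a[c] = a.get(c, 0) + 1
    rising = prod(theta + i for i in range(n))   # rising factorial theta^(n)
    num = factorial(n) * prod(theta**aj / (j**aj * factorial(aj)) for j, aj in a.items())
    return num / rising

# e.g. three alleles observed 5, 2, and 1 times in a sample of 8 chromosomes
p = ewens_probability([5, 2, 1], theta=1.0)
```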


2018 ◽  
Vol 22 (6) ◽  
pp. 3105-3124 ◽  
Author(s):  
Zilefac Elvis Asong ◽  
Howard Simon Wheater ◽  
Barrie Bonsal ◽  
Saman Razavi ◽  
Sopan Kurkute

Abstract. Drought is a recurring extreme climate event and among the most costly natural disasters in the world. This is particularly true over Canada, where drought is both a frequent and damaging phenomenon with impacts on regional water resources, agriculture, industry, aquatic ecosystems, and health. However, nationwide drought assessments are currently lacking and impacted by limited ground-based observations. This study provides a comprehensive analysis of historical droughts over the whole of Canada, including the role of large-scale teleconnections. Drought events are characterized by the Standardized Precipitation Evapotranspiration Index (SPEI) over various temporal scales (1, 3, 6, and 12 consecutive months, 6 months from April to September, and 12 months from October to September) applied to different gridded monthly data sets for the period 1950–2013. The Mann–Kendall test, rotated empirical orthogonal function, continuous wavelet transform, and wavelet coherence analyses are used, respectively, to investigate the trend, spatio-temporal patterns, periodicity, and teleconnectivity of drought events. Results indicate that southern (northern) parts of the country experienced significant trends towards drier (wetter) conditions although substantial variability exists. Two spatially well-defined regions with different temporal evolution of droughts were identified – the Canadian Prairies and northern central Canada. The analyses also revealed the presence of a dominant periodicity of between 8 and 32 months in the Prairie region and between 8 and 40 months in the northern central region. These cycles of low-frequency variability are found to be associated principally with the Pacific–North American (PNA) and Multivariate El Niño/Southern Oscillation Index (MEI) relative to other considered large-scale climate indices. This study is the first of its kind to identify dominant periodicities in drought variability over the whole of Canada in terms of when the drought events occur, their duration, and how often they occur.
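One ingredient of the trend analysis, the Mann-Kendall test, can be sketched as follows. This is the basic form without tie or autocorrelation corrections and is not necessarily the implementation used in the study.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Basic Mann-Kendall trend test: returns the S statistic and a two-sided
    p-value (no correction for ties or serial correlation)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    return s, 2.0 * (1.0 - norm.cdf(abs(z)))

# e.g. trend in a synthetic annual drought-index series
series = np.random.default_rng(1).normal(size=64).cumsum()
s_stat, p_value = mann_kendall(series)
```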


2021 ◽  
Vol 263 (3) ◽  
pp. 3436-3447
Author(s):  
Dan Lin ◽  
Andrew Eng

Assumptions made about the ground types between sound sources and receivers can significantly impact the accuracy of outdoor environmental noise prediction. A guideline is provided in ISO 9613-2, in which the ground factor ranges from 0 to 1 depending on the coverage of porous ground. For example, a ground factor of 1 is suggested for grass-covered ground. However, it is unclear whether the suggested values have been validated. The purpose of this study is to determine the sound absorption of different types of ground by measurement. Field noise measurements were made using an omnidirectional loudspeaker and two microphones on three different types of ground in a quiet neighborhood. One microphone was located 3 ft from the loudspeaker to record near-field sound levels in 1/3- and 1-octave bands every second. The other microphone was located a few hundred feet away to record far-field sound in the same fashion as the near-field microphone. The types of ground tested were concrete, grass, and grass with trees. Based on the measurement data, it was found that grass and trees absorb high-frequency sound well, and a ground factor of 1 may be used for 500 Hz and above when using the ISO 9613-2 methodology. However, at lower frequencies (the 125 Hz octave band and below), grassy ground reflects sound much like a concrete surface. Trees absorb more low-frequency sound than grass, but less than ISO 9613-2 suggests.
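A simplified sketch of the comparison implied by the two-microphone setup is given below: per-band excess attenuation is estimated by removing spherical spreading from the near/far level difference. The distances, levels, and the omission of air absorption are illustrative assumptions, not the paper's actual processing.

```python
import numpy as np

def excess_attenuation(level_near_db, level_far_db, r_near_m, r_far_m):
    """Per-band excess attenuation (ground and other effects) estimated by
    subtracting spherical spreading, 20*log10(r_far/r_near), from the measured
    near/far level difference."""
    spreading = 20.0 * np.log10(r_far_m / r_near_m)
    return level_near_db - level_far_db - spreading

# e.g. illustrative octave-band levels (dB) at 3 ft (~0.9 m) and ~300 ft (~91 m)
near = np.array([84.0, 82.0, 80.0, 79.0])   # 125, 250, 500, 1000 Hz
far = np.array([43.0, 40.0, 36.0, 33.0])
a_excess = excess_attenuation(near, far, 0.9, 91.0)
```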


2021 ◽  
pp. 106-114
Author(s):  
M.M. Zablodsky ◽  
P.B. Klendiy ◽  
O. P. Dudar ◽  
...  

The article examines the pH of the substrate during methane fermentation in the mesophilic regime and the influence of a power-frequency electromagnetic field. The aim is to investigate the influence of electromagnetic fields on the substrate pH during fermentation. Different types of microorganisms are involved in methanogenesis, and the decisive role is played by methane-forming archaea, which are the most sensitive to pH; for them the pH should remain in the range of 6.5 to 8. It is therefore necessary to examine the effect of a low-frequency electromagnetic field on substrate pH. The study was performed over 25 days on two substrates, one of which was exposed to a low-frequency electromagnetic field with a magnetic induction of 3.5 mT. The results show that the pH of the substrate exposed to the electromagnetic field remained within acceptable limits throughout methane fermentation, whereas the pH of the second substrate decreased, i.e., it became acidified. Key words: methane fermentation, substrate, pH value, electromagnetic field


2021 ◽  
Vol 40 (10) ◽  
pp. 759-767
Author(s):  
Rolf H. Baardman ◽  
Rob F. Hegge

Machine learning (ML) has proven its value in the seismic industry with successful implementations in areas of seismic interpretation such as fault and salt dome detection and velocity picking. The field of seismic processing research is also shifting toward ML applications in areas such as tomography, demultiple, and interpolation. Here, a supervised ML deblending algorithm is illustrated on a dispersed source array (DSA) data example in which both high- and low-frequency vibrators were deployed simultaneously. Training data pairs of blended and corresponding unblended data were constructed from conventional (unblended) data from another survey. From these training data, the method automatically learns a deblending operator that is then used to deblend the data from both the low- and the high-frequency vibrators of the DSA survey. The results obtained on the DSA data are encouraging and show that the ML deblending method can offer a well-performing, less user-intensive alternative to existing deblending methods.
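A schematic sketch of how such blended/unblended training pairs might be constructed by pseudo-blending conventional data is given below. The blending geometry, encoding, and the learning model itself are not specified in the abstract, so everything here is an illustrative assumption.

```python
import numpy as np

def make_training_pairs(unblended, n_pairs, max_shift, seed=None):
    """Build (blended, unblended) training pairs from conventional data by
    summing each target trace with a randomly time-shifted interfering trace."""
    rng = np.random.default_rng(seed)
    n_traces, n_samples = unblended.shape
    pairs = []
    for _ in range(n_pairs):
        i, j = rng.integers(0, n_traces, size=2)
        shift = int(rng.integers(-max_shift, max_shift + 1))
        interference = np.roll(unblended[j], shift)
        pairs.append((unblended[i] + interference, unblended[i]))  # (input, label)
    return pairs

# e.g. 1000 pairs from a (traces x samples) gather of conventional data
gather = np.random.default_rng(0).normal(size=(200, 2000))
training_pairs = make_training_pairs(gather, n_pairs=1000, max_shift=500, seed=1)
```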


Author(s):  
Pradeep Lall ◽  
Tony Thomas

Electronics in automotive underhood environments are used for a number of safety-critical functions. Reliable continued operation of electronic safety systems without catastrophic failure is important for safe operation of the vehicle. There is a need for prognostication methods that can be integrated with on-board sensors for assessment of accrued damage and impending failure. In this paper, lead-free electronic assemblies consisting of daisy-chained parts have been subjected to high-temperature vibration at 5 g and 155°C. Spectrograms have been used to identify the emergence of new low-frequency components with damage progression in electronic assemblies. Principal component analysis has been used to reduce the dimensionality of large data sets and identify patterns without the loss of features that signify damage progression and impending failure. The variance of the principal components of the instantaneous frequency has been shown to exhibit an increasing trend during initial damage progression, attaining a maximum value and decreasing prior to failure. This unique behavior of the instantaneous frequency over the period of vibration can be used as a health-monitoring feature for identifying impending failures in automotive electronics. Further, damage progression has been studied using the Empirical Mode Decomposition (EMD) technique in order to decompose the signals into Intrinsic Mode Functions (IMFs). The IMFs were investigated based on their kurtosis values, and a reconstructed strain signal was formed from all IMFs with a kurtosis value greater than three. PCA of the reconstructed strain signal gave clearer patterns that can be used for prognostication of the life of the components.
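A minimal sketch of the kurtosis-based IMF selection and PCA feature extraction described above is given below. It assumes the PyEMD, SciPy, and scikit-learn packages are available, and the windowing parameters are illustrative, not the authors' actual settings.

```python
import numpy as np
from scipy.stats import kurtosis
from PyEMD import EMD                      # pip install EMD-signal (assumed available)
from sklearn.decomposition import PCA

def reconstruct_high_kurtosis(strain, threshold=3.0):
    """Decompose a strain signal with EMD and rebuild it from the IMFs whose
    (Pearson) kurtosis exceeds the threshold (three, following the text)."""
    imfs = EMD().emd(np.asarray(strain, dtype=float))
    keep = [imf for imf in imfs if kurtosis(imf, fisher=False) > threshold]
    return np.sum(keep, axis=0) if keep else np.zeros_like(strain)

def pca_features(signal, window=256, n_components=2):
    """PCA of non-overlapping windows of the reconstructed signal; the window
    length is an arbitrary illustrative choice."""
    n_windows = len(signal) // window
    segments = signal[: n_windows * window].reshape(n_windows, window)
    return PCA(n_components=n_components).fit_transform(segments)
```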


Geophysics ◽  
2012 ◽  
Vol 77 (4) ◽  
pp. A19-A23 ◽  
Author(s):  
A. J. Berkhout

Blended source arrays are historically configured with equal source units, such as broadband vibrators (land) and broadband air-gun arrays (marine). I refer to this concept as homogeneous blending. I have proposed to extend the blending concept to inhomogeneous blending, meaning that a blended source array consists of different source units. More specifically, I proposed to replace in blended acquisition the traditional broadband sources by narrowband versions — imagine coded single air guns with different volumes or coded single narrowband vibrators with different central frequencies — together representing a dispersed source array (DSA). Similar to what we see in today’s audio systems, the DSA concept allows the design of dedicated narrowband source elements that do not suffer from the low versus high frequency compromise. In addition, the DSA concept opens the possibility to use source depths and spatial sampling intervals that are optimum for the low-, mid-, and high-frequency sources (multiscale shooting grids). DSAs are considered to be an important step in robotizing the seismic acquisition process.


1989 ◽  
Vol 7 (1) ◽  
pp. 55-84 ◽  
Author(s):  
Ronald C. Davidson ◽  
Han S. Uhm

Use is made of the Vlasov–Maxwell equations to derive an eigenvalue equation describing the extraordinary-mode stability properties of relativistic, non-neutral electron flow in high-voltage diodes. The analysis is based on well-established theoretical techniques developed in basic studies of the kinetic equilibrium and stability properties of nonneutral plasmas characterized by intense self-fields. The formal eigenvalue equation is derived for extraordinary-mode flute perturbations in a planar diode. As a specific example, perturbations are considered about a self-consistent Vlasov equilibrium expressed in terms of the electron density at the cathode (x = 0), the energy H, and the canonical momentum P_y in the y-direction (the direction of the equilibrium electron flow). As a limiting case, the planar eigenvalue equation is further simplified for low-frequency, long-wavelength perturbations with |ω − kV_d| ≪ ω_υ, where ω_c = eB_0/mc and B_0 ê_z is the applied magnetic field in the vacuum region x_b < x ≤ d. Here, the outer edge of the electron layer is located at x = x_b; ω is the complex oscillation frequency; k is the wavenumber in the y-direction; ω_υ is the characteristic betatron frequency for oscillations in the x′-orbit about the equilibrium value x′ = x_0 = x_b/2; and V_d is the average electron flow velocity in the y-direction at x = x_0. In simplifying the orbit integrals, a model is adopted in which the eigenfunction is approximated along the x′-orbit x′(t′) in the equilibrium field configuration. A detailed analysis of the resulting eigenvalue equation, derived for low-frequency, long-wavelength perturbations, is the subject of a companion paper.

