Addressing artifacts in PP-PS registration prior to performing joint impedance inversion

2020 ◽  
Vol 39 (1) ◽  
pp. 47-52
Author(s):  
Satinder Chopra ◽  
Ritesh Kumar Sharma

Multicomponent seismic data analysis enhances confidence in interpretation by providing mode-converted PS data for imaging the subsurface. Integrated interpretation of PP and PS data begins with identifying reflections that correspond to the same geologic events on both data sets. This identification is accomplished by carrying out well-log correlation through the generation of PP and PS synthetic seismograms. This approach has a few issues. One is that PS data have lower resolution than PP data, which makes it difficult to correlate equivalent reflection events on the two data sets. Even if a few consistent horizons are tracked, the horizon-matching process introduces artifacts on the PS data mapped into PP time. In this paper, we elaborate on these challenges with a data set from the Anadarko Basin in the United States and then propose a novel workflow to address them.
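The registration step this abstract describes ultimately rests on the converted-wave traveltime relation: for a flat layer, the PS event time relates to the PP time as t_PS = t_PP(1 + γ)/2, where γ = Vp/Vs. A minimal sketch of mapping PS horizon times into PP time under this relation; the constant γ = 2.0 is an illustrative assumption, not a value from the paper:

```python
import numpy as np

# Hedged sketch: map PS-time picks into PP time assuming a constant
# gamma = Vp/Vs. From the one-way legs of a converted wave,
#   t_PS = t_PP * (1 + gamma) / 2,  so  t_PP = 2 * t_PS / (1 + gamma).

def ps_to_pp_time(t_ps, gamma=2.0):
    """Convert PS two-way times to equivalent PP two-way times."""
    t_ps = np.asarray(t_ps, dtype=float)
    return 2.0 * t_ps / (1.0 + gamma)

# A PS event picked at 1.5 s maps to 1.0 s in PP time when gamma = 2.
print(ps_to_pp_time(1.5))  # 1.0
```

In practice γ varies with depth, which is one reason horizon-by-horizon matching introduces the artifacts the paper discusses.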

2020 ◽  
Vol 8 (1) ◽  
pp. T141-T149
Author(s):  
Ritesh Kumar Sharma ◽  
Satinder Chopra ◽  
Larry R. Lines

Multicomponent seismic data offer several advantages for characterizing reservoirs with the use of the vertical-component (PP) and mode-converted (PS) data. Joint impedance inversion inverts both of these data sets simultaneously; hence, it is considered superior to simultaneous impedance inversion. However, the success of joint impedance inversion depends on how accurately the PS data are mapped onto the PP time domain. Normally, this is attempted by performing well-to-seismic ties for the PP and PS data sets and matching different horizons picked on the PP and PS data. Although this seems to be a straightforward approach, there are a few issues associated with it. One of them is the lower resolution of the PS data compared with the PP data, which makes it difficult to correlate equivalent reflection events on the two data sets. Even after a few consistent horizons are tracked, the horizon-matching process introduces artifacts on the PS data when they are mapped into PP time. We evaluated these challenges using a data set from the Western Canadian Sedimentary Basin and then developed a novel workflow for addressing them. The importance of the workflow was assessed by comparing data examples generated with and without its adoption.


1998 ◽  
Vol 27 (3) ◽  
pp. 351-369 ◽  
Author(s):  
MICHAEL NOBLE ◽  
SIN YI CHEUNG ◽  
GEORGE SMITH

This article briefly reviews American and British literature on welfare dynamics and examines the concepts of welfare dependency and ‘dependency culture’ with particular reference to lone parents. Using UK benefit data sets, the welfare dynamics of lone mothers are examined to explore the extent to which they inform the debates. Evidence from Housing Benefits data shows that, even over a relatively short time period, there is significant turnover in the benefits-dependent lone-parent population, with movement in and out of income support as well as movement into other family structures. Younger lone parents and owner-occupiers tend to leave the data set, while older lone parents and council tenants are most likely to stay. Some owner-occupier lone parents may be relatively well off and on income support for a relatively short time between separation and a financial settlement being reached. They may also represent a more highly educated and highly skilled group with easier access to the labour market than renters. Any policy moves paralleling those in the United States to time-limit benefits will disproportionately affect older lone parents.


Geophysics ◽  
2017 ◽  
Vol 82 (3) ◽  
pp. R199-R217 ◽  
Author(s):  
Xintao Chai ◽  
Shangxu Wang ◽  
Genyang Tang

Seismic data are nonstationary due to subsurface anelastic attenuation and dispersion effects. These effects, also referred to as the earth's Q-filtering effects, can diminish seismic resolution. We previously developed a method of nonstationary sparse reflectivity inversion (NSRI) for resolution enhancement, which avoids the intrinsic instability associated with inverse Q filtering and generates superior Q-compensation results. Applying NSRI to data sets that contain multiples (addressing surface-related multiples only) requires a demultiple preprocessing step, because NSRI cannot distinguish primaries from multiples and will treat them as interference convolved with incorrect Q values. However, multiples contain information about subsurface properties. To use the information carried by multiples, we adapt NSRI, with the feedback model and NSRI theory, to nonstationary seismic data with surface-related multiples. Consequently, not only are the benefits of NSRI (e.g., circumventing the intrinsic instability associated with inverse Q filtering) extended, but multiples are also accounted for. Our method is limited to a 1D implementation. Theoretical and numerical analyses verify that, given a wavelet, the input Q values primarily affect the inverted reflectivities and exert little effect on the estimated multiples; i.e., multiple estimation need not consider Q-filtering effects explicitly. However, there are benefits to NSRI considering multiples: the periodicity and amplitude of the multiples imply the position of the reflectivities and the amplitude of the wavelet, so multiples help overcome the scaling and shifting ambiguities of conventional formulations in which multiples are not considered. Experiments using a 1D algorithm on a synthetic data set, the publicly available Pluto 1.5 data set, and a marine data set support these findings and reveal the stability, capabilities, and limitations of the proposed method.
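The Q-filtering effect the abstract refers to can be sketched as a constant-Q amplitude decay exp(−πft/Q) applied to a wavelet's spectrum. This is a generic illustration of the attenuation model, not the authors' NSRI algorithm; the Ricker wavelet, Q = 50, and 1 s traveltime are illustrative assumptions:

```python
import numpy as np

# Hedged sketch of the earth's Q-filtering effect: a constant-Q
# amplitude decay exp(-pi * f * t / Q) applied to a Ricker wavelet's
# spectrum at traveltime t (amplitude term only, no dispersion).

def ricker(f0, dt, n):
    """Zero-phase Ricker wavelet of peak frequency f0."""
    t = (np.arange(n) - n // 2) * dt
    a = (np.pi * f0 * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

def apply_q(trace, dt, t_travel, q):
    """Attenuate a trace's spectrum with a constant-Q amplitude filter."""
    f = np.fft.rfftfreq(len(trace), dt)
    spec = np.fft.rfft(trace) * np.exp(-np.pi * f * t_travel / q)
    return np.fft.irfft(spec, len(trace))

w = ricker(30.0, 0.002, 256)
w_att = apply_q(w, 0.002, t_travel=1.0, q=50.0)
# High frequencies are preferentially removed, so the peak amplitude drops
# and the wavelet broadens -- the resolution loss NSRI compensates for.
print(w.max(), w_att.max())
```

Inverting this filter directly amplifies high-frequency noise exponentially, which is the intrinsic instability of inverse Q filtering that the paper's inversion approach circumvents.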


Geophysics ◽  
2018 ◽  
Vol 83 (4) ◽  
pp. M41-M48 ◽  
Author(s):  
Hongwei Liu ◽  
Mustafa Naser Al-Ali

The ideal approach for continuous reservoir monitoring allows generation of fast and accurate images to cope with the massive data sets acquired for such a task. Conventionally, rigorous depth-oriented velocity-estimation methods are performed to produce sufficiently accurate velocity models. Unlike the traditional way, target-oriented imaging technology based on the common-focus-point (CFP) theory can be an alternative for continuous reservoir monitoring. The solution is based on a robust, data-driven, iterative operator-updating strategy that does not require deriving a detailed velocity model. The same focusing operator is applied to successive 3D seismic data sets for the first time to generate efficient and accurate 4D target-oriented seismic stacked images from time-lapse field seismic data sets acquired in a CO2 injection project in Saudi Arabia. Using the focusing operator, target-oriented prestack angle-domain common-image gathers (ADCIGs) could be derived to perform amplitude-versus-angle analysis. To preserve the amplitude information in the ADCIGs, an amplitude-balancing factor is applied by embedding a synthetic data set using the real acquisition geometry to remove the geometry-imprint artifact. Applying the CFP-based target-oriented imaging to the time-lapse data sets revealed changes at the reservoir level in the poststack and prestack time-lapse signals, consistent with the CO2 injection history and rock physics.


2020 ◽  
Vol 41 (4/5) ◽  
pp. 247-268 ◽  
Author(s):  
Starr Hoffman ◽  
Samantha Godbey

Purpose
This paper explores trends over time in library staffing and staffing expenditures among two- and four-year colleges and universities in the United States.

Design/methodology/approach
Researchers merged and analyzed data from 1996 to 2016 from the National Center for Education Statistics for over 3,500 libraries at postsecondary institutions. This study is primarily descriptive in nature and addresses the research questions: How do staffing trends in academic libraries over this period of time relate to Carnegie classification and institution size? How do trends in library staffing expenditures over this period of time correspond to these same variables?

Findings
Across all institutions, on average, total library staff decreased from 1998 to 2012. Numbers of librarians declined at master’s and doctoral institutions between 1998 and 2016. Numbers of students per librarian increased over time in each Carnegie and size category. Average inflation-adjusted staffing expenditures have remained steady for master's, baccalaureate and associate's institutions. Salaries as a percent of library budget decreased only among doctoral institutions and institutions with 20,000 or more students.

Originality/value
This is a valuable study of trends over time, which has been difficult without downloading and merging separate data sets from multiple government sources. As a result, few studies have taken such an approach to this data. Consequently, institutions and libraries are making decisions about resource allocation based on only a fraction of the available data. Academic libraries can use this study and the resulting data set to benchmark key staffing characteristics.
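The kind of merge-and-ratio computation the study describes (combining per-institution survey records across years, then deriving students per librarian) can be sketched in plain Python. The records, field names, and numbers below are invented for illustration and are not from the NCES data:

```python
# Hedged sketch: aggregate per-institution records by survey year and
# compute the students-per-librarian ratio. All values are invented.

records = [
    {"year": 1996, "institution": "A", "students": 12000, "librarians": 40},
    {"year": 1996, "institution": "B", "students": 3000,  "librarians": 15},
    {"year": 2016, "institution": "A", "students": 15000, "librarians": 30},
    {"year": 2016, "institution": "B", "students": 3500,  "librarians": 10},
]

def students_per_librarian(records):
    """Sum students and librarians per year, then return the ratio by year."""
    totals = {}
    for r in records:
        s, l = totals.setdefault(r["year"], [0, 0])
        totals[r["year"]] = [s + r["students"], l + r["librarians"]]
    return {year: s / l for year, (s, l) in sorted(totals.items())}

print(students_per_librarian(records))
# In this toy data the ratio rises from ~272.7 (1996) to 462.5 (2016).
```

In the real study the same aggregation would also be grouped by Carnegie classification and institution size before computing the ratios.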


2018 ◽  
Vol 40 ◽  
pp. 06021
Author(s):  
David Abraham ◽  
Tate McAlpin ◽  
Keaton Jones

The movement of bed forms (sand dunes) in large sand-bed rivers is being used to determine the transport rate of bed load. The ISSDOTv2 (Integrated Section Surface Difference Over Time, version 2) methodology uses time-sequenced differences of measured bathymetric surfaces to compute the bed-load transport rate. The method was verified using flume studies [1]. In general, the method provides very consistent and repeatable results and shows very good fidelity with most other measurement techniques. Over the last seven years we have measured, computed, and compiled what we believe to be the most extensive data set anywhere of bed-load measurements on large sand-bed rivers. Most of the measurements have been taken on the Mississippi, Missouri, Ohio, and Snake Rivers in the United States. For cases where multiple measurements were made at varying flow rates, bed-load rating curves have been produced. This paper provides references for the methodology but is intended primarily to discuss the measurements, the resulting data sets, and current and potential uses for the bed-load data.
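The surface-differencing idea behind ISSDOTv2 can be sketched schematically: difference two time-sequenced bathymetric surfaces, sum the eroded (scour) volume, and divide by the elapsed time. This is a generic illustration, not the authors' implementation; the grid, erosion depths, cell area, and omission of porosity corrections are all simplifying assumptions:

```python
import numpy as np

# Hedged sketch of the ISSDOTv2 concept: a volumetric transport rate
# from the scour volume between two bathymetric surveys. Real
# applications handle dune translation, porosity, and bias corrections.

def bedload_rate(surf_t0, surf_t1, cell_area, dt_seconds):
    """Volumetric transport rate (m^3/s) from scour between two surfaces."""
    diff = surf_t1 - surf_t0              # negative where the bed eroded
    scour_volume = -diff[diff < 0].sum() * cell_area
    return scour_volume / dt_seconds

z0 = np.zeros((4, 4))                      # bed elevation at time t0 (m)
z1 = np.zeros((4, 4))
z1[1:3, 1:3] = -0.5                        # four cells eroded by 0.5 m by t1
rate = bedload_rate(z0, z1, cell_area=25.0, dt_seconds=3600.0)
print(rate)  # 50 m^3 of scour over 1 h -> ~0.0139 m^3/s
```

Repeating this at several discharges is what yields the bed-load rating curves mentioned in the abstract.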


2019 ◽  
Vol 38 (2) ◽  
pp. 144-150 ◽  
Author(s):  
Marianne Rauch-Davies ◽  
David Langton ◽  
Michael Bradshaw ◽  
Allon Bartana ◽  
Dan Kosloff ◽  
...  

With readily available wide-azimuth, onshore, 3D seismic data, the search for attributes utilizing the azimuthal information is ongoing. Theoretically, in the presence of ordered fracturing, the seismic wavefront changes shape from spherical to nonspherical, with the propagation velocity being faster parallel to the fracturing and slower perpendicular to the fracture direction. This concept has been adopted to map fracture direction and density within unconventional reservoirs. More specifically, azimuthal variations in normal-moveout or migration velocity are often used to infer natural fracture orientation. Analyses of recent results have called into question whether azimuthal velocity variations linked to intrinsic anisotropy can actually be detected from seismic data. Using 3D orthorhombic anisotropic elastic simulation, we test whether fracture orientation and intensity can be detected from seismic data. We construct two subsurface models based on the interpreted subsurface layer structure of the Anadarko Basin in Oklahoma. In the first model, the material parameters in the layers are constant vertically transverse isotropic (VTI) in all intervals. The second model is constructed the same way as the base model for all layers above the Woodford Shale Formation; for the shale layer, orthorhombic properties are introduced, and a thicker wedge layer is added below it. Using these models, synthetic seismic data were produced by means of 3D anisotropic elastic simulation, resulting in two data sets: VTI and orthorhombic. The simulated data were depth migrated using the VTI subsurface model, and after migration the residual moveouts on the migrated gathers were analyzed. The analysis of the depth-migrated model data indicates that, for the typical thicknesses of the Woodford Shale layer in the Anadarko Basin, the observed and modeled percentages of anisotropy, and the target depth, the effect of intrinsic anisotropy is too small to be detected in real seismic data.
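The azimuthal velocity signature the abstract discusses is commonly approximated by an ellipse: V(φ)² = V_fast² cos²(φ − φ₀) + V_slow² sin²(φ − φ₀), with the fast direction along fracture strike. A minimal sketch of this standard approximation; the velocities and fracture azimuth below are illustrative, not modeled values from the paper:

```python
import numpy as np

# Hedged sketch: elliptical azimuthal variation of moveout velocity in
# the presence of ordered (vertically aligned) fracturing. Fast direction
# is parallel to fracture strike phi0; slow is perpendicular.

def azimuthal_velocity(phi, v_fast, v_slow, phi0):
    """Moveout velocity vs. source-receiver azimuth phi (radians)."""
    d = phi - phi0
    return np.sqrt(v_fast**2 * np.cos(d)**2 + v_slow**2 * np.sin(d)**2)

phi = np.radians(np.arange(0, 180, 15))
v = azimuthal_velocity(phi, v_fast=3100.0, v_slow=2900.0,
                       phi0=np.radians(30))
# The velocity maximum falls at the assumed fracture strike of 30 deg.
print(np.degrees(phi[np.argmax(v)]))  # 30.0
```

The paper's point is that, for realistic Woodford thicknesses and anisotropy percentages, the eccentricity of this ellipse is too small relative to noise and overburden effects to be resolved from real gathers.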


2005 ◽  
Vol 2005 (1) ◽  
pp. 143-147
Author(s):  
Daniel R. Norton

The annual volume of oil spilled into the marine environment by tank vessels (tank barges and tankships) is analyzed against the total annual volume of oil transported by tank vessels in order to determine whether a correlational relationship exists. U.S. Coast Guard data were used to provide the volume of oil (petroleum) spilled into the marine environment each year by tank vessels. Data from the U.S. Army Corps of Engineers and the U.S. Department of Transportation's (US DOT) National Transportation Statistics (NTS) were used for the annual volume of oil transported via tank vessels in the United States. These data are provided in the form of tonnage and ton-miles, respectively. Each data set has inherent benefits and weaknesses. For the analysis, the volume of oil transported was used as the explanatory variable (x) and the volume of oil spilled into the marine environment as the response variable (y). Both data sets were tested for correlation. A weak relationship, r = −0.38, was found using tonnage, and no further analysis was performed. A moderately strong relationship, r = 0.79, was found using ton-miles. Further analysis using regression and a plot of residuals showed the data to be satisfactory, with no sign of lurking variables, but with the year 1990 being a possible outlier.
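The analysis described (Pearson correlation between transported and spilled volumes, then least-squares residuals) can be sketched as follows. The x and y values below are invented for illustration; the study's actual coefficients were r = −0.38 for tonnage and r = 0.79 for ton-miles:

```python
import numpy as np

# Hedged sketch of the correlation-and-residuals workflow: Pearson r
# between volume transported (x) and volume spilled (y), then residuals
# from a least-squares line, which would be plotted to look for lurking
# variables and outliers. Data are invented.

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return (xc * yc).sum() / np.sqrt((xc**2).sum() * (yc**2).sum())

def regression_residuals(x, y):
    """Residuals y - (slope*x + intercept) from a degree-1 LS fit."""
    slope, intercept = np.polyfit(x, y, 1)
    return np.asarray(y, float) - (slope * np.asarray(x, float) + intercept)

x = [100, 120, 140, 160, 180]   # ton-miles (arbitrary units)
y = [1.0, 1.4, 1.5, 2.1, 2.2]   # spilled volume (arbitrary units)
print(round(pearson_r(x, y), 3))
# With an intercept in the fit, the residuals sum to (numerically) zero.
print(abs(regression_residuals(x, y).sum()) < 1e-9)
```

A residual standing well outside the band of the others would play the role of the possible 1990 outlier noted in the abstract.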


Geophysics ◽  
1999 ◽  
Vol 64 (5) ◽  
pp. 1630-1636 ◽  
Author(s):  
Ayon K. Dey ◽  
Larry R. Lines

In seismic exploration, statistical wavelet estimation and deconvolution are standard tools. Both of these processes assume randomness in the seismic reflectivity sequence. The validity of this assumption is examined by using well‐log synthetic seismograms and by using a procedure for evaluating the resulting deconvolutions. With real data, we compare our wavelet estimations with the in‐situ recording of the wavelet from a vertical seismic profile (VSP). As a result of our examination of the randomness assumption, we present a fairly simple test that can be used to evaluate the validity of a randomness assumption. From our test of seismic data in Alberta, we conclude that the assumption of reflectivity randomness is less of a problem in deconvolution than other assumptions such as phase and stationarity.
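A simple way to probe the randomness (whiteness) assumption the paper examines is to inspect the normalized autocorrelation of a reflectivity sequence: for a random sequence, all nonzero lags should be near zero. This is a generic sketch in the spirit of such a test, not the authors' exact procedure, and the white-noise stand-in for reflectivity is an assumption:

```python
import numpy as np

# Hedged sketch: whiteness check via normalized autocorrelation. A random
# reflectivity gives ~1 at lag 0 and near-zero values at nonzero lags;
# structured (e.g., cyclic) reflectivity would show persistent side lags.

def autocorr(r, max_lag):
    """Normalized autocorrelation of r for lags 0..max_lag."""
    r = np.asarray(r, float) - np.mean(r)
    denom = (r * r).sum()
    return np.array([(r[:len(r) - k] * r[k:]).sum() / denom
                     for k in range(max_lag + 1)])

rng = np.random.default_rng(0)
white = rng.standard_normal(2000)   # stands in for a random reflectivity
ac = autocorr(white, 10)
# Lag 0 is exactly 1 by construction; side lags stay small here.
print(ac[0], np.abs(ac[1:]).max())
```

Applied to well-log reflectivity, persistent nonzero lags would flag a violation of the randomness assumption underlying statistical wavelet estimation.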


Geophysics ◽  
2009 ◽  
Vol 74 (5) ◽  
pp. R59-R67 ◽  
Author(s):  
Igor B. Morozov ◽  
Jinfeng Ma

The seismic-impedance inversion problem is underconstrained inherently and does not allow the use of rigorous joint inversion. In the absence of a true inverse, a reliable solution free from subjective parameters can be obtained by defining a set of physical constraints that should be satisfied by the resulting images. A method for constructing synthetic logs is proposed that explicitly and accurately satisfies (1) the convolutional equation, (2) time-depth constraints of the seismic data, (3) a background low-frequency model from logs or seismic/geologic interpretation, and (4) spectral amplitudes and geostatistical information from spatially interpolated well logs. The resulting synthetic log sections or volumes are interpretable in standard ways. Unlike broadly used joint-inversion algorithms, the method contains no subjectively selected user parameters, utilizes the log data more completely, and assesses intermediate results. The procedure is simple and tolerant to noise, and it leads to higher-resolution images. Separating the seismic and subseismic frequency bands also simplifies data processing for acoustic-impedance (AI) inversion. For example, zero-phase deconvolution and true-amplitude processing of seismic data are not required and are included automatically in this method. The approach is applicable to 2D and 3D data sets and to multiple pre- and poststack seismic attributes. It has been tested on inversions for AI and true-amplitude reflectivity using 2D synthetic and real-data examples.
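Constraint (1) in the abstract, the convolutional equation, states that the synthetic trace is the wavelet convolved with the reflectivity, s = w ∗ r. A minimal sketch of a synthetic log/trace satisfying it exactly; the 30 Hz Ricker wavelet and the reflectivity spikes are illustrative assumptions:

```python
import numpy as np

# Hedged sketch of the convolutional equation s = w * r that the
# synthetic-log construction is required to satisfy. All values invented.

def ricker(f0, dt, n):
    """Zero-phase Ricker wavelet of peak frequency f0."""
    t = (np.arange(n) - n // 2) * dt
    a = (np.pi * f0 * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

dt = 0.002
r = np.zeros(501)                        # 1 s reflectivity series
r[[100, 220, 350]] = [0.1, -0.15, 0.08]  # three illustrative reflectors
w = ricker(30.0, dt, 81)
s = np.convolve(r, w, mode="same")       # synthetic trace, s = w * r

# The strongest loop sits at the strongest reflector (sample 220).
print(np.argmax(np.abs(s)))  # 220
```

In the paper's method this hard constraint is combined with time-depth ties, a low-frequency background model, and geostatistics from interpolated logs, rather than being traded off against a regularization term as in conventional joint inversion.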

