Marlim R3D: A realistic model for controlled-source electromagnetic simulations — Phase 2: The controlled-source electromagnetic data set

Geophysics ◽  
2019 ◽  
Vol 84 (5) ◽  
pp. E293-E299
Author(s):  
Jorlivan L. Correa ◽  
Paulo T. L. Menezes

Synthetic data generated from geoelectric earth models are a powerful tool for evaluating, a priori, the effectiveness of a controlled-source electromagnetic (CSEM) workflow. Marlim R3D (MR3D) is an open-source, complex, and realistic geoelectric model for CSEM simulations of the postsalt turbiditic reservoirs at the Brazilian offshore margin. We have developed a 3D CSEM finite-difference time-domain forward study to generate the full-azimuth CSEM data set for the MR3D earth model. To that end, we designed a full-azimuth survey with 45 towlines striking in the north–south and east–west directions over a total of 500 receivers evenly spaced at 1 km intervals along the rugged seafloor of the MR3D model. To correctly represent the thin, disconnected, and complex geometries of the studied reservoirs, we built a finely discretized mesh of [Formula: see text] cells, for a total of approximately 90 million cells. We computed the six electromagnetic field components (Ex, Ey, Ez, Hx, Hy, and Hz) at six frequencies in the range of 0.125–1.25 Hz. To mimic the noise in real CSEM data, we added multiplicative noise with a 1% standard deviation to the data. Both CSEM data sets (noise-free and noise-added), with inline and broadside geometries, are distributed for research or commercial use under the Creative Commons license on the Zenodo platform.
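
The noise model is straightforward to reproduce. The sketch below is a minimal illustration, assuming zero-mean Gaussian multiplicative noise (the abstract specifies only the 1% standard deviation, not the distribution); the array shapes and values are hypothetical.

```python
import numpy as np

def add_multiplicative_noise(field, std=0.01, seed=0):
    """Contaminate a synthetic CSEM field with multiplicative noise.

    Gaussian noise is an assumption here; the data set only states a
    multiplicative perturbation with a 1% standard deviation.
    """
    rng = np.random.default_rng(seed)
    return field * (1.0 + rng.normal(0.0, std, size=field.shape))

# Hypothetical usage: Ex holds complex inline electric-field responses,
# one row per receiver (500) and one column per frequency (6).
Ex = np.ones((500, 6), dtype=complex)
Ex_noisy = add_multiplicative_noise(Ex)
```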

2021 ◽  
Vol 40 (9) ◽  
pp. 686-692
Author(s):  
Jorlivan L. Correa ◽  
Paulo T. L. Menezes

Synthetic data provided by earth models are essential for investigating several geologic problems. Marlim R3D (MR3D) is an open-source realistic earth-modeling project for electromagnetic simulations of the postsalt reservoirs of the Brazilian offshore margin. In phase 3, we have conducted a 3D marine magnetotelluric (MMT) study with the finite-difference method to generate the synthetic magnetotelluric (MT) data set for the MR3D earth model. To that end, we upscaled the original controlled-source electromagnetic model to preserve all local-scale features, such as the thin-layer turbidite reservoirs, and to include several regional geologic features, such as the coastline, land topography, basement rocks representing the continental crust, and mantle rocks. Then, we simulated an MMT survey with 500 receivers evenly spaced at 1 km intervals along the rugged seafloor of the MR3D model. To accurately represent the MMT model within a 329 × 329 × 200 km volume, we produced a mesh of 161 × 136 × 242 cells (approximately 5.3 million cells). We computed the full MT tensor and the tipper at 25 periods in the range of 1–10,000 s. The data set, the model, and companion material are freely distributed for research or commercial use under the Creative Commons license on the Zenodo platform.
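
A quick sanity check of the stated mesh size, plus a plausible period sampling assuming logarithmic spacing (the abstract gives only the count and range of periods, not their distribution):

```python
import numpy as np

n_cells = 161 * 136 * 242                 # = 5,298,832, i.e. the stated ~5.3 million cells

# 25 periods spanning 1-10,000 s; logarithmic spacing is assumed here.
periods = np.logspace(0.0, 4.0, num=25)   # seconds
frequencies = 1.0 / periods               # Hz, convenient for forward-modeling codes
```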


2014 ◽  
Vol 7 (3) ◽  
pp. 781-797 ◽  
Author(s):  
P. Paatero ◽  
S. Eberly ◽  
S. G. Brown ◽  
G. A. Norris

Abstract. The EPA PMF (Environmental Protection Agency positive matrix factorization) version 5.0 and the underlying multilinear engine-executable ME-2 contain three methods for estimating uncertainty in factor analytic models: classical bootstrap (BS), displacement of factor elements (DISP), and bootstrap enhanced by displacement of factor elements (BS-DISP). The goal of these methods is to capture the uncertainty of PMF analyses due to random errors and rotational ambiguity. It is shown that the three methods complement each other: depending on characteristics of the data set, one method may provide better results than the other two. Results are presented using synthetic data sets, including interpretation of diagnostics, and recommendations are given for parameters to report when documenting uncertainty estimates from EPA PMF or ME-2 applications.
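
As an illustration of the classical bootstrap (BS) idea, rows of the data matrix can be resampled with replacement and the factorization refit, with the spread of the refit factor profiles serving as an uncertainty estimate. The sketch below uses scikit-learn's NMF as a freely available stand-in for EPA PMF/ME-2; it omits the measurement-uncertainty weighting and the mapping of bootstrap factors back to base factors that the BS method requires.

```python
import numpy as np
from sklearn.decomposition import NMF  # stand-in for PMF/ME-2, not the real engine

def bootstrap_factor_profiles(X, n_factors=4, n_boot=50, seed=0):
    """Classical-bootstrap sketch for a non-negative data matrix X (samples x species).

    Rows are resampled with replacement and the factorization is refit; the
    variability of the returned factor profiles approximates the BS uncertainty.
    Factor matching across bootstrap runs is omitted for brevity.
    """
    rng = np.random.default_rng(seed)
    profiles = []
    for _ in range(n_boot):
        rows = rng.integers(0, X.shape[0], size=X.shape[0])
        model = NMF(n_components=n_factors, init="nndsvda", max_iter=500)
        model.fit(X[rows])
        profiles.append(model.components_)
    return np.array(profiles)   # shape: (n_boot, n_factors, n_species)
```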


Geophysics ◽  
2006 ◽  
Vol 71 (5) ◽  
pp. C81-C92 ◽  
Author(s):  
Helene Hafslund Veire ◽  
Hilde Grude Borgos ◽  
Martin Landrø

Effects of pressure and fluid saturation can have the same degree of impact on seismic amplitudes and differential traveltimes in the reservoir interval; thus, they are often inseparable by analysis of a single stacked seismic data set. In such cases, time-lapse AVO analysis offers an opportunity to discriminate between the two effects. To make use of estimated pressure- and saturation-related changes in reservoir modeling and simulation, the uncertainty of these estimations must be quantified. One way of analyzing uncertainties is to formulate the problem in a Bayesian framework. Here, the solution of the problem is represented by a probability density function (PDF), providing estimates of the uncertainties as well as direct estimates of the properties. A stochastic model for estimation of pressure and saturation changes from time-lapse seismic AVO data is investigated within a Bayesian framework. Well-known rock-physics relationships are used to set up a prior stochastic model. PP reflection coefficient differences are used to establish a likelihood model linking reservoir variables and time-lapse seismic data. The methodology incorporates correlation between different variables of the model as well as spatial dependencies for each of the variables. In addition, possible bottlenecks causing large uncertainties in the estimations can be identified through a sensitivity analysis of the system. The method has been tested on 1D synthetic data and on field time-lapse seismic AVO data from the Gullfaks Field in the North Sea.


Author(s):  
Danlei Xu ◽  
Lan Du ◽  
Hongwei Liu ◽  
Penghui Wang

A Bayesian classifier for sparsity-promoting feature selection is developed in this paper, in which a set of nonlinear mappings of the original data is applied as a preprocessing step. The linear classification model, combined with these mappings from the original input space to a nonlinear transformation space, can not only construct a nonlinear classification boundary but also perform feature selection on the original data. A zero-mean Gaussian prior with Gamma precision and a finite approximation of the Beta process prior are used to promote sparsity in the utilization of features and of nonlinear mappings, respectively. We derive the variational Bayesian (VB) inference algorithm for the proposed linear classifier. Experimental results on a synthetic data set, a measured radar data set, a high-dimensional gene-expression data set, and several benchmark data sets demonstrate the aggressive and robust feature selection capability of our method and its classification accuracy, which is comparable with that of other existing classifiers.
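
As a loose, simplified stand-in for the construction described above (a fixed set of nonlinear mappings followed by a sparsity-promoting linear classifier), one can pair a random Fourier/RBF feature map with an L1-penalized logistic regression. This does not reproduce the paper's Gaussian/Beta-process priors or its VB inference, and the sparsity here acts on the transformed features rather than selecting original features.

```python
import numpy as np
from sklearn.kernel_approximation import RBFSampler
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))              # toy data with 10 original features
y = (X[:, 0] * X[:, 1] > 0).astype(int)     # nonlinearly separable labels

clf = make_pipeline(
    RBFSampler(gamma=0.5, n_components=100, random_state=0),      # nonlinear mapping
    LogisticRegression(penalty="l1", solver="liblinear", C=1.0),  # sparse linear model
)
clf.fit(X, y)
print(clf.score(X, y))
```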


Geophysics ◽  
2017 ◽  
Vol 82 (3) ◽  
pp. R199-R217 ◽  
Author(s):  
Xintao Chai ◽  
Shangxu Wang ◽  
Genyang Tang

Seismic data are nonstationary due to subsurface anelastic attenuation and dispersion effects. These effects, also referred to as the earth's [Formula: see text]-filtering effects, can diminish seismic resolution. We previously developed a method of nonstationary sparse reflectivity inversion (NSRI) for resolution enhancement, which avoids the intrinsic instability associated with inverse [Formula: see text] filtering and generates superior [Formula: see text] compensation results. Applying NSRI to data sets that contain multiples (addressing surface-related multiples only) requires a demultiple preprocessing step because NSRI cannot distinguish primaries from multiples and will treat them as interference convolved with incorrect [Formula: see text] values. However, multiples contain information about subsurface properties. To use the information carried by multiples, we adapt NSRI, building on the feedback model and NSRI theory, to the context of nonstationary seismic data with surface-related multiples. Consequently, not only are the benefits of NSRI (e.g., circumventing the intrinsic instability associated with inverse [Formula: see text] filtering) extended, but multiples are also taken into account. Our method is currently limited to a 1D implementation. Theoretical and numerical analyses verify that, given a wavelet, the input [Formula: see text] values primarily affect the inverted reflectivities and exert little effect on the estimated multiples; i.e., multiple estimation need not consider [Formula: see text]-filtering effects explicitly. However, there are benefits to NSRI from considering multiples: the periodicity and amplitude of the multiples imply the position of the reflectivities and the amplitude of the wavelet, and multiples assist in overcoming the scaling and shifting ambiguities of conventional problems in which multiples are not considered. Experiments using a 1D algorithm on a synthetic data set, the publicly available Pluto 1.5 data set, and a marine data set support the aforementioned findings and reveal the stability, capabilities, and limitations of the proposed method.


2010 ◽  
Vol 10 (6) ◽  
pp. 13755-13796 ◽  
Author(s):  
D. A. Hegg ◽  
S. G. Warren ◽  
T. C. Grenfell ◽  
S. J. Doherty ◽  
A. D. Clarke

Abstract. Two data sets consisting of measurements of light-absorbing aerosols (LAA) in Arctic snow, together with suites of other corresponding chemical constituents, are presented; the first was obtained in 2008 from Siberia, Greenland, and near the North Pole, and the second in 2009 from the Canadian Arctic. A preliminary differentiation of the LAA into black carbon (BC) and non-BC LAA is made. Source attribution of the light-absorbing aerosols was done using a positive matrix factorization (PMF) model. Four sources were found for each data set (crop and grass burning, boreal biomass burning, pollution, and marine). For both data sets, crop and grass burning was the main source of both LAA species, suggesting that the non-BC LAA was brown carbon. Depth profiles at most of the sites allowed assessment of the seasonal variation in the source strengths. The biomass-burning sources dominated in the spring, but pollution played a more significant (though rarely dominant) role in the fall, winter, and, for Greenland, summer. The PMF analysis is consistent with trajectory analysis and satellite fire maps.


2021 ◽  
Author(s):  
Kezia Lange ◽  
Andreas C. Meier ◽  
Michel Van Roozendael ◽  
Thomas Wagner ◽  
Thomas Ruhtz ◽  
...  

Airborne imaging DOAS and ground-based stationary and mobile DOAS measurements were conducted during the ESA-funded S5P-VAL-DE-Ruhr campaign in September 2020 in the Ruhr area. The Ruhr area is located in western Germany and is a pollution hotspot in Europe, with urban character as well as large industrial emitters. The measurements are used to validate data from the Sentinel-5P TROPOspheric Monitoring Instrument (TROPOMI), with a focus on the NO₂ tropospheric vertical column product.

Seven flights were performed with the airborne imaging DOAS instrument, AirMAP, providing continuous maps of NO₂ in the layers below the aircraft. These flights cover many S5P ground pixels within an area of about 40 km side length and were accompanied by ground-based stationary measurements and three mobile car DOAS instruments. Stationary measurements were conducted by two Pandora, two zenith-sky, and two MAX-DOAS instruments distributed over three target areas, partly as long-term measurements over a one-year period.

Airborne and ground-based measurements were compared to evaluate the representativeness of the measurements in time and space. With a resolution of about 100 × 30 m², the AirMAP data create a link between the ground-based measurements and the TROPOMI measurements, which have a resolution of 3.5 × 5.5 km², and are therefore well suited to validate TROPOMI's tropospheric NO₂ vertical column.

The measurements on the seven flight days show strong variability depending on the target area, the weekday, and the meteorological conditions. We found an overall low bias of the TROPOMI operational NO₂ data for all three target areas, but with varying magnitude on different days. The campaign data set is compared to custom TROPOMI NO₂ products that use different auxiliary data, such as albedo or a priori vertical profiles, to evaluate their influence on the TROPOMI data product. Analyzing and comparing the different data sets provides more insight into the high spatial and temporal heterogeneity of NO₂ and its impact on satellite observations and their validation.


Author(s):  
James B. Elsner ◽  
Thomas H. Jagger

Hurricane data originate from careful analysis of past storms by operational meteorologists. The data include estimates of the hurricane position and intensity at 6-hourly intervals. Information related to landfall time, local wind speeds, damages, and deaths, as well as cyclone size, is included. The data are archived by season. Some effort is needed to make the data useful for hurricane climate studies. In this chapter, we describe the data sets used throughout this book. We show you a workflow that includes importing, interpolating, smoothing, and adding attributes. We also show you how to create subsets of the data. The code in this chapter is more complicated and can take longer to run. You can skip this material on first reading and continue with model building in Chapter 7. You can return here when you have an updated version of the data that includes the most recent years. Most statistical models in this book use the best-track data. Here we describe these data and provide original source material. We also explain how to smooth and interpolate them. Interpolations are needed for regional hurricane analyses. The best-track data set contains the 6-hourly center locations and intensities of all known tropical cyclones across the North Atlantic basin, including the Gulf of Mexico and the Caribbean Sea. The data set is called HURDAT, for HURricane DATa. It is maintained by the U.S. National Oceanic and Atmospheric Administration (NOAA) at the National Hurricane Center (NHC). Center locations are given in geographic coordinates (in tenths of degrees); the intensities, representing the one-minute near-surface (∼10 m) wind speeds, are given in knots (1 kt = 0.5144 m s⁻¹); and the minimum central pressures are given in millibars (1 mb = 1 hPa). The data are provided at 6-hourly intervals starting at 00 UTC (Coordinated Universal Time). The version of the HURDAT file used here contains cyclones over the period 1851 through 2010 inclusive. Information on the history and origin of these data is found in Jarvinen et al. (1984). The file has a logical structure that makes it easy to read with a FORTRAN program. Each cyclone contains a header record, a series of data records, and a trailer record.
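
A minimal sketch of the unit conversion and track interpolation mentioned above; the raw HURDAT column layout varies by version and is not reproduced here, so the example starts from hypothetical, already-parsed 6-hourly records, and it does not reproduce the book's own code.

```python
import numpy as np

KT_TO_MS = 0.5144                      # 1 kt = 0.5144 m/s, as stated in the text

# Hypothetical 6-hourly best-track records for one cyclone.
hours   = np.array([0, 6, 12, 18, 24])                   # hours since first fix (00 UTC)
lat     = np.array([23.1, 23.8, 24.6, 25.5, 26.5])       # degrees N
lon     = np.array([-75.0, -76.1, -77.3, -78.6, -80.0])  # degrees E
wmax_kt = np.array([45, 50, 55, 65, 70])                 # 1-min sustained wind, knots

wmax_ms = wmax_kt * KT_TO_MS           # intensities converted to m/s

# Interpolate the 6-hourly track to hourly positions for regional analyses;
# linear interpolation is used for brevity, whereas the book applies smoothing.
hourly = np.arange(0, 25)
lat_h  = np.interp(hourly, hours, lat)
lon_h  = np.interp(hourly, hours, lon)
```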


Ocean Science ◽  
2006 ◽  
Vol 2 (1) ◽  
pp. 11-18 ◽  
Author(s):  
A. Henry-Edwards ◽  
M. Tomczak

Abstract. A water mass analysis method based on a constrained minimization technique is developed to derive water property changes in water mass formation regions from oceanographic station data taken at significant distance from the formation regions. The method is tested with two synthetic data sets, designed to mirror conditions in the North Atlantic at the Bermuda BATS time series station. The method requires careful definition of constraints before it produces reliable results. It is shown that an analysis of the error fields under different constraint assumptions can identify which properties vary most over the period of the observations. The method reproduces the synthetic data sets extremely well if all properties other than those that are identified as undergoing significant variations are held constant during the minimization.
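
As a generic illustration of the constrained-minimization machinery involved (not the authors' specific formulation, which inverts for property changes in the formation regions), the sketch below solves a classical water-mass mixing problem for non-negative mixing fractions that sum to one; the source-water properties and observations are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

# Columns: three source water types; rows: properties (T, S, O2). Illustrative only.
A = np.array([[18.0,  4.0, 10.0],    # temperature (deg C)
              [36.5, 34.9, 35.0],    # salinity
              [ 4.5,  6.5,  5.5]])   # dissolved oxygen (ml/l)
b = np.array([10.2, 35.3, 5.6])      # observed properties at the station

def misfit(x):
    # Plain least squares for brevity; property weighting could be added.
    return np.sum((A @ x - b) ** 2)

constraints = ({"type": "eq", "fun": lambda x: np.sum(x) - 1.0},)  # mass conservation
bounds = [(0.0, 1.0)] * 3                                          # non-negative fractions
x0 = np.full(3, 1.0 / 3.0)

result = minimize(misfit, x0, method="SLSQP", bounds=bounds, constraints=constraints)
print(result.x)   # estimated mixing fractions of the three source water types
```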


Geophysics ◽  
2012 ◽  
Vol 77 (5) ◽  
pp. WC81-WC93 ◽  
Author(s):  
Michal Malinowski ◽  
Ernst Schetselaar ◽  
Donald J. White

We applied seismic modeling to a detailed 3D geologic model of the Flin Flon mining camp (Canada) to address some imaging and interpretation issues related to a [Formula: see text] 3D survey acquired in the camp and described in a complementary paper (part 1). A 3D geologic volumetric model of the camp was created based on a compilation of geologic data constraints from drillholes, surface geologic mapping, interpretation of 2D seismic profiles, and 3D surface and grid geostatistical modeling techniques. The 3D modeling methodology was based on a hierarchical approach to account for the heterogeneous spatial distribution of geologic constraints. Elastic parameters were assigned within the model based on core sample measurements and correlation with the different lithologies. The phase-screen algorithm used for seismic modeling was validated against analytic and finite-difference solutions to ensure that it provided accurate amplitude-variation-with-offset behavior for dipping strata. Synthetic data were generated to form a zero-offset (stack) volume and also a complete prestack data set using the geometry of the real 3D survey. We found that the ability to detect a clear signature of the volcanogenic massive sulfide (VMS) ore deposits depends on the mineralization type (pyrite- versus pyrrhotite-rich ore), especially when the ore-host rock interaction is considered. In the presence of an increasing fraction of the host rhyolite rock within the model volume, the response from the lower-impedance pyrrhotite ore is masked by that of the rhyolite. Migration tests showed that poststack migration effectively enhances noisy 3D DMO data and provides results comparable to those of the more computationally expensive prestack time migration. Amplitude anomalies identified in the original 3D data, which were not predicted by our modeling, could represent potential exploration targets in an undeveloped part of the camp, assuming that our a priori earth model is sufficiently accurate.

