High-resolution geophysical characterization of shallow-water wetlands

Geophysics ◽  
2006 ◽  
Vol 71 (4) ◽  
pp. B101-B109 ◽  
Author(s):  
Nasser Mansoor ◽  
Lee Slater ◽  
Francisco Artigas ◽  
Esben Auken

We describe a procedure for rapid characterization of shallow-water, contaminated wetlands. Terrain-conductivity (TC), vertical-magnetic-gradiometry, and surface-water-chemistry data were obtained from a shallow-draft paddleboat operable in as little as [Formula: see text] of water. Measurements were taken every [Formula: see text], with data-acquisition rates exceeding [Formula: see text] of line ([Formula: see text] data points) per 8-hr field day. We applied this procedure to an urban wetland that is affected by point and nonpoint sources of pollution. We used a one-dimensional, laterally constrained inversion algorithm to invert the apparent-conductivity data set obtained from the TC survey and to create a pseudo-2D image of sediment conductivity. The continuously recorded surface-water depth and conductivity values were input as a priori information in the inversion. We used soil chemistry determined for 28 sediment samples collected from the site, as well as lithologic logs from across the wetland, to constrain interpretation of the geophysical data. The inverted sediment conductivity describes a pattern of contamination probably attributable to leachates from adjacent landfills and/or to saltwater ingress from a partial tidal connection that is not obvious in the surface-water data. Magnetic-gradiometry values and the in-phase component of an EM31 response both reflect primarily the distribution of junk metal associated with a legacy of illegal dumping. Historic aerial photographs suggest that this distribution reflects land-use history and defines the maximum previous extent of an adjacent landfill and a pattern of dumping correlated with historic roadways.
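The laterally constrained inversion described above ties the individual 1D soundings together through smoothness constraints between neighbouring models, with the measured water depth and conductivity entering as a priori information. The sketch below illustrates that idea on a linearized toy problem; the sensitivities, weights, and site/layer counts are hypothetical stand-ins for illustration, not the authors' EM31 terrain-conductivity forward model.

```python
import numpy as np

# Minimal sketch of a laterally constrained 1D inversion, assuming a
# *linearized* forward operator; all numbers below are illustrative.

n_sites, n_layers = 50, 3          # soundings along the line, layers per sounding
rng = np.random.default_rng(0)

# Hypothetical depth-weighted sensitivity of apparent conductivity to each
# layer conductivity (one apparent-conductivity datum per site).
sens = np.array([0.2, 0.5, 0.3])

# "True" model: a conductive anomaly in the middle layer (units, e.g., mS/m).
m_true = np.ones((n_sites, n_layers)) * 20.0
m_true[:, 1] += 30.0 * np.exp(-((np.arange(n_sites) - 25) / 8.0) ** 2)

# Synthetic apparent-conductivity data with noise.
d = m_true @ sens + rng.normal(0, 0.5, n_sites)

# Forward matrix G: block structure, one row per site.
G = np.zeros((n_sites, n_sites * n_layers))
for i in range(n_sites):
    G[i, i * n_layers:(i + 1) * n_layers] = sens

# Lateral constraint operator D: penalizes differences between the same
# layer at adjacent sites, which is what ties the 1D models together.
rows = []
for i in range(n_sites - 1):
    for j in range(n_layers):
        r = np.zeros(n_sites * n_layers)
        r[i * n_layers + j] = 1.0
        r[(i + 1) * n_layers + j] = -1.0
        rows.append(r)
D = np.array(rows)

# A priori information (surface-water layer): layer 0 is tied to a measured
# value through an extra set of weighted equations.
P = np.zeros((n_sites, n_sites * n_layers))
for i in range(n_sites):
    P[i, i * n_layers] = 1.0
m_water = np.full(n_sites, 20.0)

lam, mu = 1.0, 10.0                 # lateral-smoothness and prior weights
A = G.T @ G + lam * D.T @ D + mu * P.T @ P
b = G.T @ d + mu * P.T @ m_water
m_est = np.linalg.solve(A, b).reshape(n_sites, n_layers)

print("recovered middle-layer conductivities:", np.round(m_est[:, 1], 1))
```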

Paleobiology ◽  
2016 ◽  
Vol 43 (1) ◽  
pp. 68-84 ◽  
Author(s):  
Bradley Deline ◽  
William I. Ausich

Abstract. A priori choices in the detail and breadth of a study are important in addressing scientific hypotheses. In particular, choices in the number and type of characters can greatly influence the results in studies of morphological diversity. A new character suite was constructed to examine trends in the disparity of early Paleozoic crinoids. Character-based rarefaction analysis indicated that a small subset of these characters (~20% of the complete data set) could be used to capture most of the properties of the entire data set in analyses of crinoids as a whole, noncamerate crinoids, and to a lesser extent camerate crinoids. This pattern may be the result of the covariance between characters and the characterization of rare morphologies that are not represented in the primary axes in morphospace. Shifting emphasis on different body regions (oral system, calyx, periproct system, and pelma) also influenced estimates of relative disparity between subclasses of crinoids. Given these results, morphological studies should include a pilot analysis to better examine the amount and type of data needed to address specific scientific hypotheses.
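Character-based rarefaction asks how many characters are needed before estimates of morphological disparity stabilize. The sketch below shows one way such a pilot analysis could be run on a synthetic taxon-by-character matrix; the disparity proxy (sum of per-character variances) and the pairwise-distance correlation check are assumptions for illustration, not the authors' exact protocol.

```python
import numpy as np
from scipy.spatial.distance import pdist

# Sketch of a character-based rarefaction analysis on a synthetic matrix.
rng = np.random.default_rng(1)
n_taxa, n_chars = 100, 250                     # e.g. crinoid taxa x characters
chars = rng.integers(0, 3, size=(n_taxa, n_chars)).astype(float)

def disparity(matrix):
    """Sum of variances across characters (a simple disparity proxy)."""
    return matrix.var(axis=0, ddof=1).sum()

full_disparity = disparity(chars)
full_dist = pdist(chars)                       # pairwise taxon distances, full matrix

# Rarefaction: repeatedly subsample k characters and ask how well a subset
# reproduces the pairwise-distance structure of the complete data set.
for k in (25, 50, 100, 200):
    corrs = []
    for _ in range(50):
        cols = rng.choice(n_chars, size=k, replace=False)
        corrs.append(np.corrcoef(pdist(chars[:, cols]), full_dist)[0, 1])
    print(f"{k:3d} characters: mean distance correlation = {np.mean(corrs):.3f}")
```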


2016 ◽  
Vol 4 (4) ◽  
pp. T577-T589 ◽  
Author(s):  
Haitham Hamid ◽  
Adam Pidlisecky

In complex geology, the presence of steeply dipping structures can complicate impedance inversion. We have developed a structurally constrained inversion in which a computationally well-behaved objective function is minimized subject to structural constraints. This approach incorporates structural orientation, in the form of local dips, into the objective function of our inversion algorithm. Our method involves a multitrace impedance inversion and a rotation of an orthogonal system of derivative operators. The local dips used to constrain the derivative operators were estimated from the migrated seismic data. In addition to imposing structural constraints on the inversion model, the algorithm allows for the inclusion of a priori knowledge from boreholes. We investigated this algorithm on a complex synthetic 2D model as well as on a seismic field data set, and compared the results with those from single-trace inversion and laterally constrained inversion. The inversion carried out using dip information produces a higher-resolution model that is more geologically realistic than those obtained with the other methods.
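The structural constraint amounts to rotating the horizontal and vertical derivative operators by the local dip, so that model smoothness is enforced along structure rather than along the grid axes. A minimal sketch of that rotation, assuming a regular 2D impedance grid and a per-cell dip field, is given below; the weights and dip values are illustrative only and do not reproduce the authors' full multitrace inversion.

```python
import numpy as np

# Dip-rotated derivative operators for a structurally constrained
# regularization term on a regular 2D grid (illustrative sketch).

nz, nx = 60, 80                               # samples (depth/time) x traces
dip = np.deg2rad(20.0) * np.ones((nz, nx))    # local dips, e.g. picked from
                                              # the migrated image

def directional_gradients(m, dip):
    """Gradients along and across the local structural dip."""
    # Plain forward differences (last row/column replicated at the edges).
    dx = np.diff(m, axis=1, append=m[:, -1:])
    dz = np.diff(m, axis=0, append=m[-1:, :])
    # Rotate the (x, z) gradient into dip-parallel / dip-normal components.
    g_par = np.cos(dip) * dx + np.sin(dip) * dz      # along structure
    g_perp = -np.sin(dip) * dx + np.cos(dip) * dz    # across structure
    return g_par, g_perp

def structural_penalty(m, dip, w_par=10.0, w_perp=1.0):
    """Smoothness measured in the rotated frame: penalize variation along
    dip strongly and across dip weakly, so layers can follow the structure."""
    g_par, g_perp = directional_gradients(m, dip)
    return w_par * np.sum(g_par ** 2) + w_perp * np.sum(g_perp ** 2)

m = np.random.default_rng(2).normal(size=(nz, nx))
print("penalty on a random model:", structural_penalty(m, dip))
```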


2010 ◽  
Vol 3 (1) ◽  
pp. 579-597 ◽  
Author(s):  
V. F. Sofieva ◽  
J. Vira ◽  
E. Kyrölä ◽  
J. Tamminen ◽  
V. Kan ◽  
...  

Abstract. In this paper, we discuss the development of the inversion algorithm for the GOMOS (Global Ozone Monitoring by Occultation of Stars) instrument on board the Envisat satellite. The proposed algorithm accurately takes into account the wavelength-dependent modeling errors, which are mainly due to the incomplete scintillation correction in the stratosphere. Special attention is paid to the numerical efficiency of the algorithm. The developed method is tested on a large data set and its advantages are demonstrated. Its main advantage is a proper characterization of the uncertainties of the retrieved profiles of atmospheric constituents, which is of high importance for data assimilation, trend analyses, and validation.
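Accounting for wavelength-dependent modeling errors amounts to adding a correlated modeling-error term to the data covariance used in the retrieval, so that the same covariance both weights the fit and propagates into the reported profile uncertainties. The sketch below shows this for a generic linearized retrieval; the kernel, covariances, and dimensions are synthetic assumptions, not the GOMOS Level 2 processing itself.

```python
import numpy as np

# Generalized least-squares retrieval with noise + modeling-error covariance.
rng = np.random.default_rng(3)
n_wavelengths, n_constituents = 200, 3

K = rng.normal(size=(n_wavelengths, n_constituents))      # linearized kernel
x_true = np.array([5.0, 2.0, 1.0])                        # "true" densities

# Measurement-noise covariance (diagonal) plus a wavelength-correlated
# modeling-error covariance (here a simple exponential correlation).
sigma_noise = 0.1
lags = np.abs(np.subtract.outer(np.arange(n_wavelengths), np.arange(n_wavelengths)))
C_model = 0.05 ** 2 * np.exp(-lags / 10.0)
C = sigma_noise ** 2 * np.eye(n_wavelengths) + C_model

y = K @ x_true + rng.multivariate_normal(np.zeros(n_wavelengths), C)

# The full covariance weights the fit and yields the retrieval uncertainty,
# which is the point the abstract emphasizes.
Ci_K = np.linalg.solve(C, K)
posterior_cov = np.linalg.inv(K.T @ Ci_K)
x_hat = posterior_cov @ (Ci_K.T @ y)

print("retrieved:", np.round(x_hat, 2))
print("1-sigma uncertainties:", np.round(np.sqrt(np.diag(posterior_cov)), 2))
```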


Geophysics ◽  
1996 ◽  
Vol 61 (2) ◽  
pp. 538-548 ◽  
Author(s):  
Douglas J. LaBrecque ◽  
Michela Miletto ◽  
William Daily ◽  
Aberlardo Ramirez ◽  
Earle Owen

An Occam’s inversion algorithm for crosshole resistivity data that uses a finite‐element method forward solution is discussed. For the inverse algorithm, the earth is discretized into a series of parameter blocks, each containing one or more elements. The Occam’s inversion finds the smoothest 2-D model for which the Chi‐squared statistic equals an a priori value. Synthetic model data are used to show the effects of noise and noise estimates on the resulting 2-D resistivity images. Resolution of the images decreases with increasing noise. The reconstructions are underdetermined so that at low noise levels the images converge to an asymptotic image, not the true geoelectrical section. If the estimated standard deviation is too low, the algorithm cannot achieve an adequate data fit, the resulting image becomes rough, and irregular artifacts start to appear. When the estimated standard deviation is larger than the correct value, the resolution decreases substantially (the image is too smooth). The same effects are demonstrated for field data from a site near Livermore, California. However, when the correct noise values are known, the Occam’s results are independent of the discretization used. A case history of monitoring at an enhanced oil recovery site is used to illustrate problems in comparing successive images over time from a site where the noise level changes. In this case, changes in image resolution can be misinterpreted as actual geoelectrical changes. One solution to this problem is to perform smoothest, but non‐Occam’s, inversion on later data sets using parameters found from the background data set.
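Occam's inversion searches over the regularization weight for the smoothest model whose chi-squared misfit equals the a priori target. The sketch below shows the idea for a generic linear problem, using a bisection search over the trade-off parameter; the forward operator is a synthetic smoothing kernel, not the finite-element resistivity solution used in the paper.

```python
import numpy as np

# Occam-style search on a linear toy problem: pick the smoothest model
# (largest lambda) whose chi-squared equals an a priori target.
rng = np.random.default_rng(4)
n_data, n_model = 80, 120

# Smooth, underdetermined forward operator and first-difference roughness.
x = np.linspace(0, 1, n_model)
G = np.exp(-((np.linspace(0, 1, n_data)[:, None] - x[None, :]) / 0.05) ** 2)
R = np.diff(np.eye(n_model), axis=0)

m_true = np.sin(2 * np.pi * x) + (x > 0.6)
sigma = 0.05                                   # assumed data standard deviation
d = G @ m_true + rng.normal(0, sigma, n_data)

def solve(lam):
    """Smoothness-regularized least squares and its chi-squared misfit."""
    m = np.linalg.solve(G.T @ G + lam * R.T @ R, G.T @ d)
    chi2 = np.sum(((d - G @ m) / sigma) ** 2)
    return m, chi2

target = n_data                                # a priori chi-squared value

# Bisection in log10(lambda): chi-squared grows monotonically with lambda.
lo, hi = -6.0, 6.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    _, chi2 = solve(10.0 ** mid)
    if chi2 > target:
        hi = mid
    else:
        lo = mid

m_occam, chi2 = solve(10.0 ** (0.5 * (lo + hi)))
print(f"chi-squared = {chi2:.1f} (target {target})")
```

As the abstract notes, if sigma is set too low the target misfit cannot be reached and the model becomes rough; if it is set too high the model is over-smoothed.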


2010 ◽  
Vol 3 (4) ◽  
pp. 1019-1027 ◽  
Author(s):  
V. F. Sofieva ◽  
J. Vira ◽  
E. Kyrölä ◽  
J. Tamminen ◽  
V. Kan ◽  
...  

Abstract. In this paper, we discuss the development of the inversion algorithm for the GOMOS (Global Ozone Monitoring by Occultation of Stars) instrument on board the Envisat satellite. The proposed algorithm accurately takes into account the wavelength-dependent modeling errors, which are mainly due to the incomplete scintillation correction in the stratosphere. Special attention is paid to the numerical efficiency of the algorithm. The developed method is tested on a large data set and its advantages are demonstrated. Its main advantage is a proper characterization of the uncertainties of the retrieved profiles of atmospheric constituents, which is of high importance for data assimilation, trend analyses, and validation.

