Cross-Borehole ERT: Sensitivity, Model Resolution, and Field Data Quality

Author(s):  
L.M. Madsen ◽  
A.K. Kühl ◽  
L. Levy ◽  
A.V. Christiansen


Forests ◽
2019 ◽  
Vol 10 (4) ◽  
pp. 349 ◽  
Author(s):  
Adam Berland ◽  
Lara A. Roman ◽  
Jess Vogt

Street tree inventories are a critical component of urban forest management. However, inventories conducted in the field by trained professionals are expensive and time-consuming. Inventories relying on citizen scientists, or virtual surveys conducted remotely using street-level photographs, may greatly reduce the costs of street tree inventories, but there are fundamental uncertainties regarding the level of data quality that can be expected from these emerging approaches to data collection. We asked 16 volunteers to inventory street trees in suburban Chicago using Google Street View™ imagery, and we assessed data quality by comparing their virtual survey data to field data from the same locations. We also compared virtual survey data quality according to self-rated expertise by measuring agreement within expert, intermediate, and novice analyst groups. Analyst agreement was very good for the number of trees on each street segment, but markedly lower for tree diameter class and for tree identification at the genus and species levels. Interrater agreement varied by expertise, such that experts agreed with one another more often than novices did for all four variables assessed. Compared to the field data, we observed substantial variability in analyst performance for diameter class estimation and tree identification, and some intermediate analysts performed as well as experts. Our findings suggest that virtual surveys may be useful for documenting the locations of street trees within a city more efficiently than field crews and with a high level of accuracy. However, tree diameter and species identification data were less reliable across all expertise groups, especially for novice analysts. Based on this analysis, virtual street tree inventories are best suited to collecting very basic information such as tree locations, or to updating existing inventories to determine where trees have been planted or removed. We conclude with evidence-based recommendations for effective implementation of this type of approach.
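The agreement analysis described above can be made concrete with a small sketch. The abstract does not name the specific agreement statistic used, so Cohen's kappa is assumed here purely for illustration, and the genus labels below are invented placeholders, not the study's data.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two label sequences."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement if the two raters labeled independently.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical genus labels: field crew vs. one virtual analyst.
field   = ["Acer", "Quercus", "Acer", "Gleditsia", "Acer", "Tilia"]
virtual = ["Acer", "Quercus", "Tilia", "Gleditsia", "Acer", "Acer"]
print(f"kappa = {cohens_kappa(field, virtual):.2f}")  # 0.50 for these labels
```

The same function applies unchanged to any of the four assessed variables (tree counts per segment, diameter class, genus, species) once each is coded as a label sequence.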


Author(s):  
J. A. C. Meekes ◽  
J. J. G. Beckers ◽  
S. P. Wijn


2009 ◽
Vol 66 (10) ◽  
pp. 3217-3225 ◽  
Author(s):  
Mark Kelly ◽  
John C. Wyngaard ◽  
Peter P. Sullivan

Abstract. Simple rate-equation models for subfilter-scale scalar and momentum fluxes have previously been developed for application in the so-called "terra incognita" of atmospheric simulations, where the model resolution is comparable to the scale of the turbulence. The models performed well over land, but only the scalar flux model appeared to perform adequately over the ocean. Analysis of data from the Ocean Horizontal Array Turbulence Study (OHATS) reveals a need to account for the moving ocean–air interface in the subfilter stress model. The authors develop simple parameterizations for the effect of surface-induced pressure fluctuations on the subfilter stress, leading to good predictions of subfilter momentum flux both over land and in OHATS.
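As a schematic of what a "simple rate-equation model" for a subfilter flux looks like, the sketch below integrates a generic relaxation-type equation, df/dt = P − f/T, where the production P is driven by the resolved-scale gradient and T is a turbulence timescale. The functional form, the names tau_w and dCdz, and all parameter values are illustrative assumptions; this is not the authors' OHATS parameterization.

```python
import numpy as np

def integrate_flux(dCdz, tau_w=0.1, T=50.0, dt=0.5, nsteps=400):
    """Forward-Euler integration of df/dt = -tau_w * dCdz(t) - f / T.

    tau_w : assumed subfilter vertical-velocity variance (m^2 s^-2)
    dCdz  : resolved-scale scalar gradient as a function of time
    T     : relaxation (turbulence) timescale (s)
    """
    f = 0.0
    history = np.empty(nsteps)
    for k in range(nsteps):
        production = -tau_w * dCdz(k * dt)      # gradient-production term
        f += dt * (production - f / T)          # relax toward equilibrium
        history[k] = f
    return history

# Slowly varying resolved gradient; the flux tracks the quasi-equilibrium
# value -tau_w * dCdz * T once the initial transient has decayed.
flux = integrate_flux(lambda t: 0.01 * (1 + 0.5 * np.sin(2 * np.pi * t / 100)))
print(f"final flux: {flux[-1]:.4f}")
```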


2014 ◽  
Vol 6 (2) ◽  
pp. 2169-2213
Author(s):  
T. Burschil ◽  
T. Beilecke ◽  
C. M. Krawczyk

Abstract. High-resolution reflection seismic methods are an established non-destructive tool for engineering tasks. In the near surface, shear-wave reflection seismic measurements usually offer higher spatial resolution than P-wave data within the same effective signal frequency spectrum, but their data quality varies more strongly. To investigate the causes of these differences, we examined a P-wave and an SH-wave reflection seismic profile measured at the same location on the island of Föhr, applying reflection seismic processing to the field data as well as finite-difference modelling of the seismic wavefield (SOFI FD code). The simulations were adapted to the field acquisition geometry, comprising a 2 m receiver spacing and a 4 m shot spacing along the 1.5 km long P-wave and 800 m long SH-wave profiles. A Ricker wavelet source and absorbing boundary frames were first-order model parameters. The petrophysical parameters used to populate the structural models down to 400 m depth were taken from borehole data, VSP measurements, and cross-plot relations. The first simulation of the P-wave wavefield was based on a simplified hydrogeological model of the survey location containing six lithostratigraphic units. Single-shot data were compared and seismic sections created. Major features such as the direct wave, refracted waves, and reflections are imaged, but the reflector corresponding to a prominent till layer at ca. 80 m depth was missing. The P-wave input model was therefore refined to 16 units, defining a laterally more variable velocity model (vP = 1600–2300 m s⁻¹) and leading to a much better reproduction of the field data. The SH-wave model was adapted accordingly but showed only minor correlation with the field data and a higher signal-to-noise ratio. We therefore suggest that future simulations consider additional features such as intrinsic damping, thin layering, or a near-surface weathering layer. These may lead to a better understanding of the key parameters determining the data quality of near-surface seismic measurements.
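The study itself used the SOFI FD code; purely to illustrate the ingredients named above (a layered velocity model, a Ricker wavelet source, and an absorbing frame), here is a minimal, generic 2-D acoustic second-order finite-difference sketch in NumPy. All grid sizes and parameter values are illustrative assumptions, and the two-layer model only loosely echoes the vP range quoted in the abstract.

```python
import numpy as np

nx, nz = 200, 120          # grid points
dx, dt = 2.0, 4e-4         # 2 m spacing, 0.4 ms time step
nt, f0 = 1500, 40.0        # time steps, source peak frequency (Hz)

# Two-layer velocity model, loosely echoing the abstract's vP range.
v = np.full((nz, nx), 1600.0)
v[60:, :] = 2300.0
c2 = (v * dt / dx) ** 2    # CFL factor; max 0.46, below the 2-D limit 1/sqrt(2)

def ricker(t, f0):
    """Ricker wavelet: (1 - 2*(pi*f0*t)^2) * exp(-(pi*f0*t)^2)."""
    a = (np.pi * f0 * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

# Absorbing frame: exponential taper (Cerjan-style) applied every step,
# growing toward the model edges over a 20-cell-wide band.
taper = np.ones((nz, nx))
width = 20
ramp = np.exp(-(0.015 * (width - np.arange(width))) ** 2)
taper[:width, :]  *= ramp[:, None]
taper[-width:, :] *= ramp[::-1][:, None]
taper[:, :width]  *= ramp[None, :]
taper[:, -width:] *= ramp[::-1][None, :]

p_old = np.zeros((nz, nx))
p     = np.zeros((nz, nx))
src_z, src_x = nz // 3, nx // 2

for it in range(nt):
    # 5-point Laplacian; np.roll wraps the grid, but the taper suppresses
    # energy before it reaches the edges, so wraparound stays negligible.
    lap = (np.roll(p, 1, 0) + np.roll(p, -1, 0) +
           np.roll(p, 1, 1) + np.roll(p, -1, 1) - 4.0 * p)
    p_new = 2.0 * p - p_old + c2 * lap
    p_new[src_z, src_x] += ricker((it - 100) * dt, f0) * dt ** 2
    p_old, p = p * taper, p_new * taper    # time shift + absorbing frame

print("peak amplitude:", np.abs(p).max())
```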


Geophysics ◽  
2010 ◽  
Vol 75 (6) ◽  
pp. WB15-WB27 ◽  
Author(s):  
Ramesh (Neelsh) Neelamani ◽  
Christine E. Krohn ◽  
Jerry R. Krebs ◽  
Justin K. Romberg ◽  
Max Deffenbaugh ◽  
...  

The high cost of simulating densely sampled seismic forward-modeling data arises from activating sources one at a time in sequence. To increase efficiency, one could leverage recent innovations in seismic field-data acquisition and activate several (e.g., 2–6) sources simultaneously during modeling. However, such approaches would suffer from degraded data quality because of interference between the model's responses to the simultaneous sources. Two new efficient simultaneous-source modeling approaches are proposed that rely on the novel tandem use of randomness and sparsity to construct almost noise-free model responses to individual sources. In each approach, the first step is to measure the model's cumulative response with all sources activated simultaneously, using randomly scaled band-limited impulses or continuous band-limited random-noise waveforms. In the second step, the model response to each individual source is estimated from the cumulative receiver measurement by exploiting knowledge of the random source waveforms and the sparsity of the model response to individual sources in a known transform domain (e.g., the curvelet domain). The efficiency achievable by the approaches is primarily governed by the sparsity of the model response. Invoking results from the field of compressive sensing, theoretical bounds assert that the approaches need less modeling time for sparser (i.e., simpler or more structured) model responses. A simulated modeling example shows that data collected with as many as 8192 sources activated simultaneously can be separated into the 8192 individual source gathers with data quality comparable to that obtained when the sources were activated sequentially. The proposed approaches could also dramatically improve seismic field-data acquisition efficiency if the source signatures actually probing the earth can be measured accurately.
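To make the two-step idea concrete, here is a heavily simplified toy in Python: each source's "model response" is a sparse spike train, the receiver records one mixture formed by convolving each response with a known random-noise source waveform and summing, and all responses are recovered jointly by l1-regularized inversion (ISTA with soft thresholding). The problem sizes, the identity sparsity basis (standing in for curvelets), and the solver choice are all illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
S, n, m = 3, 128, 256          # sources, trace length, record length

def conv_matrix(w, n, m):
    """m x n matrix whose action is convolution with waveform w."""
    A = np.zeros((m, n))
    for j in range(n):
        seg = w[: m - j]
        A[j : j + len(seg), j] = seg
    return A

# Known random source waveforms; stacked forward operator y = A x.
waveforms = [rng.standard_normal(m) for _ in range(S)]
A = np.hstack([conv_matrix(w, n, m) for w in waveforms])   # m x (S*n)

# Ground-truth sparse responses: a few spikes spread over all sources.
x_true = np.zeros(S * n)
x_true[rng.choice(S * n, size=9, replace=False)] = rng.standard_normal(9)
y = A @ x_true                                             # one simultaneous record

# ISTA: x <- soft(x + (1/L) A^T (y - A x), lam / L)
L = np.linalg.norm(A, 2) ** 2                              # Lipschitz constant
lam = 0.05 * np.abs(A.T @ y).max()
x = np.zeros(S * n)
for _ in range(500):
    x = x + (A.T @ (y - A @ x)) / L
    x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)  # soft threshold

err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(f"relative separation error: {err:.3f}")
# x.reshape(S, n) gives the separated per-source responses.
```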


Author(s):  
Thomas Foken ◽  
Mathias Göckede ◽
Matthias Mauder ◽  
Larry Mahrt ◽  
Brian Amiro ◽  
...  
