Interpolating wide-aperture ground-penetrating radar beyond aliasing

Geophysics ◽  
2015 ◽  
Vol 80 (2) ◽  
pp. H13-H22 ◽  
Author(s):  
Saulo S. Martins ◽  
Jandyr M. Travassos

Most ground-penetrating radar data acquisition is done along fixed-offset profiles, so velocity is known only at isolated points in the survey area, at the locations of variable-offset gathers such as common midpoints. We constructed sparse, heavily aliased, variable-offset gathers from several fixed-offset, collinear profiles and interpolated those gathers to produce properly sampled counterparts, thus pushing the data beyond aliasing. The interpolation methodology estimates nonstationary, adaptive filter coefficients at all trace locations, including the positions of the missing traces, which are filled with zeroed traces. This is followed by an inversion problem that uses the previously estimated filter coefficients to insert the new, interpolated traces between the original ones. We extended this two-step strategy by using filter coefficients from a denser variable-offset gather to interpolate the missing traces on a few independently constructed gathers. We applied the methodology to synthetic and real data sets, the latter acquired in the interior of the Antarctic continent. The interpolated variable-offset data opened the door to prestack processing, making it feasible to produce a prestack time-migrated section and a 2D velocity model for the entire profile. Although our real data set was obtained in Antarctica, there is no reason the same methodology could not be applied elsewhere.
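The two-step strategy (estimate prediction-filter coefficients where data exist, then invert for the zeroed traces) can be illustrated with a toy 1D analogy. The two-tap filter and sinusoidal traces below are our own simplification, not the authors' nonstationary adaptive filters:

```python
import numpy as np

# Step 1: estimate a two-tap prediction filter a such that
# s[n] ~ a0*s[n-1] + a1*s[n-2], from a densely sampled reference trace.
n = np.arange(64)
ref = np.sin(2.0 * n)                        # dense reference trace
A = np.column_stack([ref[1:-1], ref[:-2]])   # lagged samples
a = np.linalg.lstsq(A, ref[2:], rcond=None)[0]

# Step 2: restore a decimated trace by inverting the prediction relation.
# Keeping only the even samples aliases this frequency (2.0 rad/sample is
# above the decimated Nyquist of pi/2), yet the filter resolves it.
trace = np.sin(2.0 * n + 0.7)                # independent trace to restore
N = len(trace)
rows, rhs = [], []
for i in range(2, N):                        # prediction equations
    r = np.zeros(N)
    r[i], r[i - 1], r[i - 2] = 1.0, -a[0], -a[1]
    rows.append(r)
    rhs.append(0.0)
for i in range(0, N, 2):                     # pin the known (even) samples
    r = np.zeros(N)
    r[i] = 1.0
    rows.append(r)
    rhs.append(trace[i])
recovered = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)[0]
print(np.max(np.abs(recovered - trace)))     # tiny residual on the odd samples
```

The decimated samples alone are ambiguous, but the filter carried over from the densely sampled reference singles out the correct reconstruction, which is the essence of interpolating beyond aliasing.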

Geophysics ◽  
2004 ◽  
Vol 69 (2) ◽  
pp. 599-607 ◽  
Author(s):  
Hervé Perroud ◽  
Martin Tygel

We describe a new implementation of the normal-moveout (NMO) correction that is routinely applied to common-midpoint (CMP) reflections prior to stacking. The procedure, called nonstretch NMO, automatically avoids the undesirable stretch effects that are present in conventional NMO. Under nonstretch NMO, a significant range of large offsets that normally would be muted in the case of conventional NMO can be kept and used, thereby leading to better stack and velocity determinations. We illustrate the use of nonstretch NMO by applying it to synthetic and real data sets obtained from high-resolution (HR) seismic and ground-penetrating radar (GPR) measurements.
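As a reminder of what nonstretch NMO improves upon, here is a minimal sketch of the conventional correction with a stretch mute; the function name and the 1.5 mute threshold are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def nmo_correct(trace, offset, v, dt, stretch_mute=1.5):
    """Conventional NMO correction of a single trace by time-shift
    interpolation, with a stretch mute.  The stretch factor t(x)/t0
    approximates the frequency distortion that nonstretch NMO avoids."""
    nt = len(trace)
    t0 = np.arange(nt) * dt                     # zero-offset times
    tx = np.sqrt(t0**2 + (offset / v)**2)       # hyperbolic moveout times
    out = np.interp(tx, t0, trace, left=0.0, right=0.0)
    stretch = np.divide(tx, t0, out=np.full(nt, np.inf), where=t0 > 0)
    out[stretch > stretch_mute] = 0.0           # mute heavily stretched samples
    return out

# Usage: a spike at t0 = 0.2 s recorded at 100 m offset, v = 2000 m/s
dt, v, x = 0.001, 2000.0, 100.0
t_event = np.sqrt(0.2**2 + (x / v)**2)
trace = np.zeros(500)
trace[int(round(t_event / dt))] = 1.0
corrected = nmo_correct(trace, x, v, dt)
print(np.argmax(corrected))                     # spike moves to t0 = 0.2 s
```

The muted zone grows with offset in the conventional scheme; nonstretch NMO is designed precisely to retain those large-offset samples.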


2010 ◽  
Vol 24 ◽  
pp. 23-34 ◽  
Author(s):  
S. Kadioglu ◽  
Y. K. Kadioglu

Abstract. The aim of this study is to formulate an approach to monitoring internal micro discontinuities in a hybrid 2-D/3-D image of ground-penetrating radar (GPR) data gathered on historical monument groups, and to show methodologically how rearranging the amplitude-color scale and its opacity function highlights micro fractures in the monument groups, comprising three colossal women, three men, and 24 lion statues at Mustafa Kemal ATATÜRK's mausoleum (ANITKABIR) in Ankara, Turkey. Additionally, this paper illustrates the use of petrographic research to describe the monument groups. To achieve this aim, data were acquired on the monument groups along profiles spaced 10 cm apart, using a 1.6 GHz antenna. The 3-D images were transparent 3-D volumes of the GPR data set that highlighted internal micro fractures and cavities in the statues. Choosing an appropriate amplitude-color scale and formulating the opacity of the data sets were the keys to the transparent 3-D visualizations. As a result, the internal fractures and cavities were successfully visualized in the three women, three men, and 24 lion statues. Under a polarizing microscope, micro fractures were observed particularly at the rims of the vesicles of the rocks.


2021 ◽  
Vol 13 (12) ◽  
pp. 2384
Author(s):  
Roland Filzwieser ◽  
Vujadin Ivanišević ◽  
Geert J. Verhoeven ◽  
Christian Gugl ◽  
Klaus Löcker ◽  
...  

Large parts of the urban layout of the abandoned Roman town of Bassianae (in present-day Serbia) are still discernible on the surface today due to the deliberate and targeted quarrying of the Roman foundations. In 2014, all of the town's intramural (and some extramural) areas were surveyed using aerial photography, ground-penetrating radar, and magnetometry to analyze the site's topography and to map remaining buried structures. The surveys showed a strong agreement between the digital surface model derived from the aerial photographs and the geophysical prospection data. However, many structures could only be detected by one method, underlining the benefits of a complementary archaeological prospection approach using multiple methods. This article presents the results of the extensive surveys and their comprehensive integrative interpretation, discussing Bassianae's ground plan and urban infrastructure. Starting with an overview of this Roman town's research history, we present the details of the triple prospection approach, followed by the processing, integrative analysis, and interpretation of the acquired data sets. Finally, this newly gained information is contrasted with a plan of Roman Bassianae compiled in 1935.


2018 ◽  
Vol 11 (2) ◽  
pp. 53-67
Author(s):  
Ajay Kumar ◽  
Shishir Kumar

Several initial center selection algorithms have been proposed in the literature for numerical data, but the values of categorical data are unordered, so these methods are not applicable to a categorical data set. This article investigates the initial center selection process for categorical data and then presents a new support-based initial center selection algorithm. The proposed algorithm measures the weight of the unique data points of an attribute with the help of support and then integrates these weights along the rows to obtain the support of every row. The data object with the largest support is chosen as the first initial center, and further centers are then found at the greatest distance from the previously selected centers. The quality of the proposed algorithm is compared with the random initial center selection method, Cao's method, Wu's method, and the method introduced by Khan and Ahmad. Experimental analysis on real data sets shows the effectiveness of the proposed algorithm.
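The selection procedure described above can be sketched as follows; the Hamming distance and the tie-breaking behavior are our assumptions, not necessarily the article's exact choices:

```python
from collections import Counter

def initial_centers(data, k):
    """Support-based initial center selection for categorical data.
    Support of an attribute value = its frequency in that attribute."""
    n_attr = len(data[0])
    # Per-attribute value frequencies (supports).
    supports = [Counter(row[j] for row in data) for j in range(n_attr)]
    # Row support = sum of the supports of its attribute values.
    row_support = [sum(supports[j][row[j]] for j in range(n_attr))
                   for row in data]
    # First center: the object with the largest support.
    centers = [data[max(range(len(data)), key=row_support.__getitem__)]]

    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))

    # Remaining centers: objects farthest from the chosen centers.
    while len(centers) < k:
        far = max(data, key=lambda row: min(hamming(row, c) for c in centers))
        centers.append(far)
    return centers

# Usage on a toy categorical data set
data = [("a", "x"), ("a", "x"), ("a", "y"), ("b", "z"), ("b", "z")]
print(initial_centers(data, 2))   # most-supported row, then the farthest row
```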


Geophysics ◽  
2019 ◽  
Vol 84 (2) ◽  
pp. R165-R174 ◽  
Author(s):  
Marcelo Jorge Luz Mesquita ◽  
João Carlos Ribeiro Cruz ◽  
German Garabito Callapino

Estimation of an accurate velocity macromodel is an important step in seismic imaging. We have developed an approach based on coherence measurements and finite-offset (FO) beam stacking. The algorithm is an FO common-reflection-surface tomography, which aims to determine the best layered depth-velocity model by finding the model that maximizes a semblance objective function calculated from the amplitudes in common-midpoint (CMP) gathers stacked over a predetermined aperture. We develop the subsurface velocity model with a stack of layers separated by smooth interfaces. The algorithm is applied layer by layer from the top downward in four steps per layer. First, by automatic or manual picking, we estimate the reflection times of events that describe the interfaces in a time-migrated section. Second, we convert these times to depth using the velocity model via application of Dix’s formula and the image rays to the events. Third, by using ray tracing, we calculate kinematic parameters along the central ray and build a paraxial FO traveltime approximation for the FO common-reflection-surface method. Finally, starting from CMP gathers, we calculate the semblance of the selected events using this paraxial traveltime approximation. After repeating this algorithm for all selected CMP gathers, we use the mean semblance values as an objective function for the target layer. When this coherence measure is maximized, the model is accepted and the process is completed. Otherwise, the process restarts from step two with the updated velocity model. Because the inverse problem we are solving is nonlinear, we use very fast simulated annealing to search the velocity parameters in the target layers. We test the method on synthetic and real data sets to study its use and advantages.
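The semblance coherence at the heart of the objective function can be sketched as follows; this is the standard semblance definition, with the moveout correction and event windowing simplified away:

```python
import numpy as np

def semblance(gather, window):
    """Semblance of a CMP gather over a time window: the ratio of
    stacked energy to total energy, bounded between 0 and 1.
    `gather` is (n_traces, n_samples); `window` is a slice of samples."""
    d = gather[:, window]                    # moveout-corrected amplitudes
    num = np.sum(np.sum(d, axis=0) ** 2)     # energy of the stack
    den = d.shape[0] * np.sum(d ** 2)        # total energy times fold
    return num / den if den > 0 else 0.0

# A perfectly coherent event across 8 traces gives semblance 1.0;
# incoherent noise would give a value near 1/fold.
coherent = np.tile(np.sin(np.linspace(0, 3, 50)), (8, 1))
print(semblance(coherent, slice(10, 40)))    # -> 1.0
```

Maximizing the mean of such values over all selected CMP gathers is what drives the velocity update in the algorithm described above.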


2018 ◽  
Vol 2018 ◽  
pp. 1-12 ◽  
Author(s):  
Suleman Nasiru

The need to generalize existing statistical distributions to make them more flexible in modeling real data sets is vital in parametric statistical modeling and inference. This study therefore develops a new class of distributions, called the extended odd Fréchet family of distributions, for modifying existing standard distributions. Two special models, the extended odd Fréchet Nadarajah-Haghighi and the extended odd Fréchet Weibull distributions, are proposed using the developed family. The densities and hazard rate functions of the two special distributions exhibit various monotonic and nonmonotonic shapes. The maximum likelihood method is used to develop estimators for the parameters of the new class of distributions. The application of the special distributions is illustrated by means of a real data set. The results reveal that the special distributions developed from the new family can provide a reasonable parametric fit to the given data set compared with other existing distributions.
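For orientation, a common form of the odd Fréchet construction in this literature is sketched below in our own notation; the paper's exact parameterization may differ:

```latex
% G(x) is the baseline CDF being modified; alpha, theta > 0 are the
% added shape parameters of the extended family (our notation).
F(x) \;=\; \exp\!\left\{-\left[\frac{1 - G(x)^{\alpha}}{G(x)^{\alpha}}\right]^{\theta}\right\},
\qquad x \in \operatorname{supp}(G).
```

Taking G to be the Nadarajah-Haghighi or the Weibull CDF would then produce the two special models named in the abstract.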


2019 ◽  
Vol 11 (4) ◽  
pp. 405
Author(s):  
Xuan Feng ◽  
Haoqiu Zhou ◽  
Cai Liu ◽  
Yan Zhang ◽  
Wenjing Liang ◽  
...  

The classification of subsurface targets with ground-penetrating radar (GPR) is a popular topic in geophysics. Among existing classification methods, geometrical features and polarimetric attributes of targets are primarily used. Because polarimetric attributes carry more information about the targets, polarimetric decomposition methods, such as H-Alpha decomposition, have been developed for GPR target classification in recent years. However, the classification template used in H-Alpha classification is preset based on experience from synthetic aperture radar (SAR); therefore, it may not be suitable for GPR. Moreover, many existing classification methods require excessive human operation, particularly when outliers exist in the sample (the data set containing the features of targets); therefore, they are neither efficient nor intelligent. We herein propose a new machine learning method based on sample centers, i.e., the particle center supported plane (PCSP). The sample center is defined as the point with the smallest sum of distances to all points in the same sample, which makes it a better representative of the sample, largely unaffected by outliers. In the proposed method, particle swarm optimization (PSO) is performed to obtain the sample centers, from which the new criterion for subsurface target classification is derived. We applied this algorithm to full-polarimetric GPR data measured in the laboratory and outdoors. The results indicate that, compared with the support vector machine (SVM) and classical H-Alpha classification, the new method is more efficient and its accuracy is relatively high.
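The core idea, a robust per-class center found by minimizing the total distance to the sample, followed by nearest-center assignment, can be sketched as below. The PSO hyperparameters and the toy two-class data are our assumptions; the full PCSP criterion is more elaborate than plain nearest-center classification:

```python
import numpy as np

rng = np.random.default_rng(0)

def sum_dist(p, sample):
    """Objective: total Euclidean distance from candidate center p
    to every point of the sample."""
    return np.sum(np.linalg.norm(sample - p, axis=1))

def pso_center(sample, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Particle swarm search for the sample center: the point with
    the smallest sum of distances to all points in the sample."""
    dim = sample.shape[1]
    lo, hi = sample.min(axis=0), sample.max(axis=0)
    pos = rng.uniform(lo, hi, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([sum_dist(p, sample) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([sum_dist(p, sample) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest

def classify(x, centers):
    """Assign a feature vector to the class with the nearest center."""
    return min(centers, key=lambda c: np.linalg.norm(x - centers[c]))

# Usage: two synthetic feature clusters, one contaminated by an outlier.
pipe = np.vstack([rng.normal([0, 0], 0.1, (20, 2)), [[5.0, 5.0]]])
slab = rng.normal([3, 3], 0.1, (20, 2))
centers = {"pipe": pso_center(pipe), "slab": pso_center(slab)}
print(classify(np.array([0.1, -0.1]), centers))
```

Because the center minimizes a sum of distances rather than squared distances, the single outlier barely moves it, which is the robustness property the abstract emphasizes.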


Geophysics ◽  
2016 ◽  
Vol 81 (1) ◽  
pp. WA119-WA129 ◽  
Author(s):  
Anja Rutishauser ◽  
Hansruedi Maurer ◽  
Andreas Bauder

On the basis of a large data set, comprising approximately 1200 km of profile lines acquired with different helicopter-borne ground-penetrating radar (GPR) systems over temperate glaciers in the western Swiss Alps, we have analyzed the possibilities and limitations of helicopter-borne GPR surveying for mapping the ice-bedrock interface. We considered data from three different acquisition systems: (1) a low-frequency pulsed system hanging below the helicopter (BGR), (2) a stepped-frequency system hanging below the helicopter (Radar Systemtechnik GmbH [RST]), and (3) a commercial system mounted directly on the helicopter skids (Geophysical Survey Systems Incorporated [GSSI]). The systems showed considerable differences in performance. The best results were achieved with the BGR system. On average, the RST and GSSI systems yielded comparable results, but we observed significant site-specific differences. A comparison with ground-based GPR data showed that the quality of helicopter-borne data is inferior, but the compelling advantages of airborne surveying still make helicopter-borne acquisition an attractive option. Statistical analyses of bedrock detectability revealed large differences not only between the acquisition systems but also between regions within our investigation area. The percentage of identified bedrock reflections (with respect to the overall profile length within a particular region) varied from 11.7% to 68.9%. Obvious factors for missing bedrock reflections included large bedrock depths and steeply dipping bedrock interfaces, but we also observed that internal features within the ice body may obscure bedrock reflections. In particular, we identified a conspicuous "internal reflection band" in many profiles acquired with the GSSI system. We attribute this feature to abrupt changes of the water content within the ice, but more research is required for a better understanding of its nature.


1994 ◽  
Vol 1 (2/3) ◽  
pp. 182-190 ◽  
Author(s):  
M. Eneva

Abstract. Using finite data sets and study volumes of limited size may introduce significant spurious effects when estimating the scaling properties of various physical processes. These effects are examined with an example featuring the spatial distribution of induced seismic activity in Creighton Mine (northern Ontario, Canada). The events studied in the present work occurred during a three-month period, March-May 1992, within a volume of approximate size 400 x 400 x 180 m³. Two sets of microearthquake locations are studied: Data Set 1 (14,338 events) and Data Set 2 (1,654 events). Data Set 1 includes the more accurately located events and amounts to about 30 per cent of all recorded data. Data Set 2 is the portion of the first data set formed by the most accurately located and strongest microearthquakes. The spatial distribution of events in the two data sets is examined for scaling behaviour using the method of generalized correlation integrals featuring various moments q. From these, generalized correlation dimensions are estimated using the slope method. Similar estimates are made for randomly generated point sets using the same numbers of events and the same study volumes as for the real data. Uniform and monofractal random distributions are used for these simulations. In addition, samples randomly extracted from the real data are examined in the same way. The spectra for the uniform and monofractal random generations show spurious multifractality due solely to the finite number of data points and the limited size of the study volume. Comparing these with the dimension spectra for Data Set 1 and Data Set 2 allows us to estimate the bias likely to be present in the estimates for the real data. The strong multifractality suggested by the spectrum for Data Set 2 appears to be largely spurious; the spatial distribution, while different from uniform, could originate from a monofractal process. The spatial distribution of microearthquakes in Data Set 1 is either monofractal as well, or only weakly multifractal. In similar studies, comparing results from real data with simulated point sets may help distinguish genuine from artificial multifractality, without necessarily requiring large numbers of data points.
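The q = 2 member of the generalized correlation integrals, together with the slope-method dimension estimate, can be sketched as follows; the sample size and radius range are illustrative choices, and the finite-size and edge effects the study discusses are exactly what pull such estimates away from the true dimension:

```python
import numpy as np

rng = np.random.default_rng(1)

def correlation_dimension(points, radii):
    """Estimate the correlation dimension (the q = 2 generalized
    dimension) from the slope of log C(r) versus log r, where C(r)
    is the fraction of point pairs closer than r."""
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    pairs = d[np.triu_indices(n, k=1)]       # all distinct pair distances
    C = np.array([np.mean(pairs < r) for r in radii])
    slope, _ = np.polyfit(np.log(radii), np.log(C), 1)
    return slope

# A uniform random set on the unit square has true dimension 2; the
# finite sample and bounded volume bias the estimate slightly downward.
pts = rng.random((1000, 2))
radii = np.logspace(-1.5, -0.5, 10)
print(correlation_dimension(pts, radii))
```

Running the same estimator on simulated uniform or monofractal point sets with the catalog's own event count and volume, as the study does, calibrates how much of any apparent multifractality is such an artifact.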

