Unambiguous signal recovery above the Nyquist using random‐sample‐interval imaging

Geophysics ◽  
1998 ◽  
Vol 63 (2) ◽  
pp. 763-771 ◽  
Author(s):  
R. Daniel Wisecup

Aliasing is generally understood to mean that sampling causes those frequencies above the Nyquist frequency to be irretrievably “mixed” with those below. As a result, the perceived need to prevent signal aliasing has played a major role in limiting usable signal bandwidth. Yet, the evidence of aliasing in multichannel seismic data is often paradoxical and contradictory, suggesting that aliasing may be more apparent than real. A simple, exact sample‐mapping methodology, random‐sample‐interval imaging, can be used to overcome aliasing in many of the processes used currently for the imaging of seismic data. The robust process recovers broadband signal, on both synthetic and real data, with frequencies significantly above the Nyquist limit predicted by the 1-D sampling theorem. The method appears to be applicable whenever the signal trajectory is intersected irregularly by a sampling grid of two or more dimensions. The results suggest that both spatial and temporal aliasing of signal may be resolved simultaneously by this strategy.
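The principle behind the abstract's claim can be demonstrated with a small numerical sketch (a generic illustration, not Wisecup's RSI² algorithm itself; the frequencies and sample counts below are illustrative choices): least-squares fitting of a sinusoid to randomly spaced samples identifies a frequency well above the Nyquist limit implied by the average sample interval, whereas uniform sampling at that same average rate would alias it.

```python
import numpy as np

rng = np.random.default_rng(0)

# True signal at 0.8 Hz -- above the 0.5 Hz Nyquist implied by a 1 s average interval.
f_true = 0.8
t = np.sort(rng.uniform(0.0, 64.0, size=64))   # random sample times, ~1 s mean spacing
y = np.cos(2 * np.pi * f_true * t)

def residual(f, t, y):
    """Least-squares misfit of a single sinusoid at frequency f to irregular samples."""
    A = np.column_stack([np.cos(2 * np.pi * f * t), np.sin(2 * np.pi * f * t)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return np.linalg.norm(y - A @ coef)

# Scan candidate frequencies; random spacing breaks the alias ambiguity,
# so the misfit has a unique minimum at the true 0.8 Hz.
freqs = np.arange(0.05, 1.5, 0.05)
best = freqs[np.argmin([residual(f, t, y) for f in freqs])]
```

Uniform 1 s sampling could not distinguish 0.8 Hz from its 0.2 Hz alias; the irregular grid can, which is the mechanism the abstract exploits in higher dimensions.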

Geophysics ◽  
1999 ◽  
Vol 64 (2) ◽  
pp. 632-634
Author(s):  
Gijs J. O. Vermeer

Wisecup (1997) claims that “a simple, exact sample‐mapping methodology, random‐sample‐interval imaging, can be used to overcome aliasing in many of the processes currently used for the imaging of seismic data.” Other statements are that “increased equipment costs are incurred due to the presumed (italics by GJOV) requirement for antialias filter circuitry in the recording instruments,” and “many antialias strategies currently in use may be inappropriate,” and many more such remarks. The question comes up whether this random‐sample‐interval imaging (RSI2) is really as wonderful as it is purported to be.


2020 ◽  
Vol 2020 (14) ◽  
pp. 307-1-307-6
Author(s):  
Laura Galvis ◽  
Juan M. Ramírez ◽  
Edwin Vargas ◽  
Ofelia Villarreal ◽  
William Agudelo ◽  
...  

In a 3D seismic survey, source sampling on a regular grid is commonly limited by economic costs, geological constraints, and environmental challenges. This non-uniform sampling cannot be ignored, since the lack of regularity leads to incomplete seismic data with missing 2D wavefields, while postprocessing tasks have been developed under the assumption that 3D seismic data are acquired on a regular grid. Signal recovery from incomplete data is therefore a crucial step in the seismic imaging processing flow. In this work, we propose a pre-processing step that embeds the nonuniformly acquired wavefields in a finer regular grid, such that shot gathers are stacked according to the actual spatial locations of the sources. Then, based on the 3D curvelet transform, a sparse signal recovery algorithm that incorporates an interpolation operator is employed to reconstruct the missing wavefields on a regular grid. The performance of the proposed seismic reconstruction approach is evaluated on a real data set.
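The sparse-recovery step can be sketched in one dimension (a hedged toy: a unitary Fourier transform stands in for the 3D curvelet transform, and a plain sampling mask stands in for the interpolation operator; neither simplification is from the paper). An ISTA-style iteration fills in missing samples of a signal that is sparse in the transform domain:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 128
t = np.arange(n)
x_true = np.cos(2 * np.pi * 4 * t / n) + 0.5 * np.cos(2 * np.pi * 9 * t / n)  # sparse in Fourier
mask = rng.random(n) < 0.6            # irregular acquisition: ~60% of traces observed
y = x_true * mask

# ISTA for min_c 0.5*||mask * F^(-1)c - y||^2 + (lam/2)*||c||_1,
# with F a unitary FFT standing in for the curvelet transform.
lam, step = 0.05, 1.0
c = np.zeros(n, dtype=complex)
for _ in range(300):
    x = (np.fft.ifft(c) * np.sqrt(n)).real                  # unitary inverse transform
    c = c - step * np.fft.fft(mask * (x - y)) / np.sqrt(n)  # gradient step on the data misfit
    mag = np.maximum(np.abs(c) - step * lam / 2, 0.0)       # soft threshold (sparsity)
    c = mag * np.exp(1j * np.angle(c))
x_rec = (np.fft.ifft(c) * np.sqrt(n)).real
err = np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true)
```

On this toy example the missing ~40% of samples are restored with small relative error; the curvelet dictionary, thresholding schedule, and interpolation operator of the actual method are considerably richer.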


Author(s):  
Nina Skaarup ◽  
James A. Chalmers

NOTE: This article was published in a former series of GEUS Bulletin. Please use the original series name when citing this article, for example: Skaarup, N., & Chalmers, J. A. (1998). A possible new hydrocarbon play, offshore central West Greenland. Geology of Greenland Survey Bulletin, 180, 28-30. https://doi.org/10.34194/ggub.v180.5082

The discovery of extensive seeps of crude oil onshore central West Greenland (Christiansen et al. 1992, 1994, 1995, 1996, 1997, 1998, this volume; Christiansen 1993) means that the central West Greenland area is now prospective for hydrocarbons in its own right. Analysis of the oils (Bojesen-Koefoed et al. in press) shows that their source rocks are probably nearby and, because the oils are found within the Lower Tertiary basalts, the source rocks must be below the basalts. It is therefore possible that in the offshore area oil could have migrated through the basalts and be trapped in overlying sediments.

In the offshore area to the west of Disko and Nuussuaq (Fig. 1), Whittaker (1995, 1996) interpreted a few multichannel seismic lines acquired in 1990, together with some seismic data acquired by industry in the 1970s. He described a number of large rotated fault-blocks containing structural closures at top basalt level that could indicate leads capable of trapping hydrocarbons.

In order to investigate Whittaker’s (1995, 1996) interpretation, in 1995 the Geological Survey of Greenland acquired 1960 km of new multichannel seismic data (Fig. 1) using funds provided by the Government of Greenland, Minerals Office (now Bureau of Minerals and Petroleum) and the Danish State through the Mineral Resources Administration for Greenland. The data were acquired using the Danish naval vessel Thetis, which had been adapted to accommodate seismic equipment. The data acquired in 1995 have been integrated with the older data and an interpretation has been carried out of the structure of the top basalt reflection.
This work shows a fault pattern in general agreement with that of Whittaker (1995, 1996), although there are differences in detail. In particular the largest structural closure reported by Whittaker (1995) has not been confirmed. Furthermore, one of Whittaker’s (1995) smaller leads seems to be larger than he had interpreted and may be associated with a DHI (direct hydrocarbon indicator) in the form of a ‘bright spot’.


Geophysics ◽  
2006 ◽  
Vol 71 (5) ◽  
pp. U67-U76 ◽  
Author(s):  
Robert J. Ferguson

The possibility of improving regularization/datuming of seismic data is investigated by treating wavefield extrapolation as an inversion problem. Weighted, damped least squares is then used to produce the regularized/datumed wavefield. Regularization/datuming is extremely costly because of the computation of the Hessian, so an efficient approximation is introduced in which only a limited number of diagonals in the operators involved are computed. Real and synthetic data examples demonstrate the utility of this approach. For synthetic data, regularization/datuming is demonstrated for large extrapolation distances using a highly irregular recording array. Without approximation, regularization/datuming returns a regularized wavefield with fewer operator artifacts than a nonregularizing method such as generalized phase shift plus interpolation (PSPI). Approximate regularization/datuming returns a regularized wavefield at approximately two orders of magnitude lower cost, but it is dip limited, though in a controllable way, compared to the full method. The Foothills structural data set, a freely available data set from the Rocky Mountains of Canada, demonstrates application to real data. The data have highly irregular sampling along the shot coordinate and suffer from significant near-surface effects. Approximate regularization/datuming returns common-receiver data that are superior in appearance to those from conventional datuming.
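A miniature version of the weighted, damped least-squares step can be sketched as follows (a generic Tikhonov example: the band-limited sinc operator, the jittered geometry, and the band half-width are illustrative assumptions, not the paper's extrapolation operator):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 64                                                # regular output grid
x_reg = np.arange(n, dtype=float)
t_irr = np.sort(x_reg + rng.uniform(-0.5, 0.5, n))    # jittered, irregular recording positions

# Band-limited forward operator: row i sinc-interpolates the regular grid at t_irr[i]
# (a simple stand-in for the wavefield-extrapolation operator).
A = np.sinc(t_irr[:, None] - x_reg[None, :])
signal = lambda t: np.cos(2 * np.pi * 3 * t / n)
d = signal(t_irr)                                     # irregularly sampled data

# Damped least squares: m = (A^T A + eps*I)^(-1) A^T d.
eps = 1e-2
H = A.T @ A                                           # the costly Hessian
m = np.linalg.solve(H + eps * np.eye(n), A.T @ d)
err = np.linalg.norm(m - signal(x_reg)) / np.linalg.norm(signal(x_reg))

# Cheaper approximation in the spirit of the paper: keep only a few diagonals
# of the Hessian (here a band of half-width 2) before solving.
band = np.abs(x_reg[:, None] - x_reg[None, :]) <= 2
m_band = np.linalg.solve(H * band + eps * np.eye(n), A.T @ d)
```

Truncating the Hessian to a few diagonals, as in the last two lines, is what buys the large cost reduction, at the price of some controllable loss of accuracy.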


2013 ◽  
Vol 184 (1) ◽  
pp. 162-171 ◽  
Author(s):  
J.J. Galiana-Merino ◽  
J.L. Rosa-Herranz ◽  
S. Rosa-Cintas ◽  
J.J. Martinez-Espla

2021 ◽  
Vol 17 (3) ◽  
pp. e1008256
Author(s):  
Shuonan Chen ◽  
Jackson Loper ◽  
Xiaoyin Chen ◽  
Alex Vaughan ◽  
Anthony M. Zador ◽  
...  

Modern spatial transcriptomics methods can target thousands of different types of RNA transcripts in a single slice of tissue. Many biological applications demand a high spatial density of transcripts relative to the imaging resolution, leading to partial mixing of transcript rolonies in many voxels; unfortunately, current analysis methods do not perform robustly in this highly mixed setting. Here we develop a new analysis approach, BARcode DEmixing through Non-negative Spatial Regression (BarDensr): we start with a generative model of the physical process that leads to the observed image data and then apply sparse convex optimization methods to estimate the underlying (demixed) rolony densities. We apply BarDensr to simulated and real data and find that it achieves state-of-the-art signal recovery, particularly in densely labeled regions or data with low spatial resolution. Finally, BarDensr is fast and parallelizable. We provide open-source code as well as an implementation for the ‘NeuroCAAS’ cloud platform.
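The regression at the heart of this idea can be illustrated for a single voxel (a hedged toy with made-up barcode signatures and densities, using plain projected-gradient non-negative least squares rather than BarDensr's full sparse convex program):

```python
import numpy as np

rng = np.random.default_rng(5)
F, R = 12, 3                                   # imaging frames, candidate barcodes
B = np.abs(rng.standard_normal((F, R)))        # known per-barcode frame signatures (toy)
dens_true = np.array([1.5, 0.0, 0.7])          # rolony densities mixed in one voxel
obs = B @ dens_true + 0.01 * rng.standard_normal(F)   # observed, partially mixed frames

# Projected-gradient non-negative least squares: regress the observed frames
# onto the signatures under a non-negativity constraint to demix the voxel.
step = 1.0 / np.linalg.norm(B, 2) ** 2
dens_hat = np.zeros(R)
for _ in range(5000):
    dens_hat = np.maximum(dens_hat - step * B.T @ (B @ dens_hat - obs), 0.0)
```

The non-negativity constraint is what keeps absent barcodes (the zero density above) from soaking up noise; the real method adds spatial structure and sparsity across many voxels at once.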


2020 ◽  
Author(s):  
Viraj Shah ◽  
Chinmay Hegde

Abstract We consider the problem of reconstructing a signal from under-determined modulo observations (or measurements). This observation model is inspired by a (relatively) less well-known imaging mechanism called modulo imaging, which can be used to extend the dynamic range of imaging systems; variations of this model have also been studied under the category of phase unwrapping. Signal reconstruction in the under-determined regime with modulo observations is a challenging ill-posed problem, and existing reconstruction methods cannot be used directly. In this paper, we propose a novel approach, restricted to two modulo periods, for solving this inverse problem, inspired by recent advances in algorithms for phase retrieval under sparsity constraints. We show that, given a sufficient number of measurements, our algorithm perfectly recovers the underlying signal and provides improved performance over existing algorithms. We also provide experiments validating our approach on both synthetic and real data to depict its superior performance.
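The observation model, and why the two-period assumption makes recovery a combinatorial bin-assignment problem, can be sketched for a toy 1-sparse signal (sizes, amplitudes, and the brute-force search below are illustrative; the paper's algorithm is an alternating scheme inspired by sparse phase retrieval, not this enumeration):

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(3)
n, m, R = 8, 5, 1.0                       # signal length, measurements, modulo period
A = rng.standard_normal((m, n))

x_true = np.zeros(n)
x_true[3] = 0.2                           # 1-sparse signal, small enough that A @ x stays in (-R, R)
z_true = A @ x_true
y = np.mod(z_true, R)                     # modulo observations fold negative values up by R

# Two modulo periods => each measurement carries an unknown bin bit p in {0, 1},
# with the unfolded measurement z = y - R*p. Brute-force the bin pattern and the
# 1-sparse support, keeping the combination with the smallest misfit.
best = (np.inf, None)
for p in product([0, 1], repeat=m):
    z = y - R * np.array(p)
    for j in range(n):
        a = A[:, j]
        coef = (a @ z) / (a @ a)          # least-squares fit on support {j}
        r = np.linalg.norm(z - coef * a)
        if r < best[0]:
            best = (r, (j, coef))
resid, (j_hat, c_hat) = best
```

The 2^m bin patterns are exactly the ambiguity the modulo nonlinearity introduces; sparsity is what singles out the consistent pattern, which is the structure the paper's algorithm exploits at scale.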


Author(s):  
Robert J Marks II

The literature on the recovery of signals and images is vast (e.g., [23, 110, 112, 257, 391, 439, 791, 795, 933, 934, 937, 945, 956, 1104, 1324, 1494, 1495, 1551]). In this chapter, the specific problem of recovering lost signal intervals from the remaining known portion of the signal is considered. Signal recovery is also a topic of Chapter 11 on POCS. To this point, sampling has been discrete. Bandlimited signals, we will show, can also be recovered from continuous samples. Our definition of continuous sampling is best presented by illustration. A signal, f(t), is shown in Figure 10.1a, along with some possible continuous samples. Regaining f(t) from knowledge of g_e(t) = f(t)Π(t/T) in Figure 10.1b is the extrapolation problem, which has applications in a number of fields. In optics, for example, extrapolation in the frequency domain is termed super resolution [2, 40, 367, 444, 500, 523, 641, 720, 864, 1016, 1099, 1117]. Reconstructing f(t) from its tails [i.e., g_i(t) = f(t){1 − Π(t/T)}] is the interval interpolation problem. Prediction, shown in Figure 10.1d, is the problem of recovering a signal with knowledge of that signal only for negative time. Lastly, illustrated in Figure 10.1e, is periodic continuous sampling. Here, the signal is known in sections periodically spaced at intervals of T. The duty cycle is α. Reconstruction of f(t) from these data includes a number of important reconstruction problems as special cases. (a) By keeping αT constant, we can approach the extrapolation problem by letting T go to ∞. (b) Redefine the origin in Figure 10.1e to be centered in a zero interval. Under the same assumption as (a), we can similarly approach the interpolation problem. (c) Redefine the origin as in (b). Then the interpolation problem can be solved by discarding data to make it periodically sampled. (d) Keep T constant and let α → 0. The result is reconstruction of f(t) from discrete samples as discussed in Chapter 5.
Indeed, this model has been used to derive the sampling theorem [246]. Figures 10.1b–e all illustrate continuously sampled versions of f(t).
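The interval-interpolation problem has a simple discrete analogue that can be sketched with a Papoulis-Gerchberg-style POCS iteration (a hedged toy consistent with the chapter's framing; the grid size, band limit, and gap location are illustrative choices): alternately enforce the band limit and the known samples until the unknown gap fills in.

```python
import numpy as np

rng = np.random.default_rng(4)
n, kmax = 128, 4
t = np.arange(n)

# Band-limited test signal: random low-frequency Fourier content only.
c = rng.standard_normal(2 * kmax + 1)
x_true = sum(c[2 * k - 1] * np.cos(2 * np.pi * k * t / n)
             + c[2 * k] * np.sin(2 * np.pi * k * t / n)
             for k in range(1, kmax + 1)) + c[0]
x_true /= np.max(np.abs(x_true))

gap = slice(60, 68)                   # 8 unknown samples; everything else is "known"
known = np.ones(n, bool)
known[gap] = False

def bandlimit(x):
    """Project onto the set of signals band-limited to |k| <= kmax."""
    X = np.fft.fft(x)
    X[kmax + 1:n - kmax] = 0
    return np.fft.ifft(X).real

# POCS: alternate the two projections; the gap converges to the true values.
x = np.where(known, x_true, 0.0)
for _ in range(500):
    x = bandlimit(x)
    x[known] = x_true[known]
err = np.linalg.norm(x[gap] - x_true[gap]) / np.linalg.norm(x_true[gap])
```

Convergence slows dramatically as the gap grows relative to the reciprocal bandwidth (the near-unit eigenvalues of the concentration operator), which is why the continuous versions of these problems are ill-posed in practice.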

