Simultaneous source separation using a robust Radon transform

Geophysics ◽  
2014 ◽  
Vol 79 (1) ◽  
pp. V1-V11 ◽  
Author(s):  
Amr Ibrahim ◽  
Mauricio D. Sacchi

We adopted the robust Radon transform to eliminate erratic incoherent noise that arises in common receiver gathers when simultaneous source data are acquired. The proposed robust Radon transform was posed as an inverse problem using an ℓ1 misfit that is not sensitive to erratic noise. This misfit permitted us to design Radon algorithms that are capable of eliminating incoherent noise in common receiver gathers. We also compared nonrobust and robust Radon transforms that are implemented via a quadratic (ℓ2) or a sparse (ℓ1) penalty term in the cost function. The results demonstrated the importance of incorporating a robust misfit functional in the Radon transform to cope with simultaneous source interferences. Synthetic and real data examples proved that the robust Radon transform produces more accurate data estimates than least-squares and sparse Radon transforms.
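For intuition, here is a minimal sketch of how a robust (ℓ1-misfit) Radon inversion can be realized with iteratively reweighted least squares (IRLS). The operator, sizes, and parameter values are illustrative stand-ins, not the authors' implementation; in practice L would map τ-p panels to t-x gathers.

```python
import numpy as np

# Minimal IRLS sketch for a robust (l1-misfit) Radon inversion.
# A random matrix stands in for the linear Radon operator L.
rng = np.random.default_rng(0)
n_data, n_model = 200, 100
L = rng.standard_normal((n_data, n_model)) / np.sqrt(n_model)

m_true = np.zeros(n_model)
m_true[[10, 40, 70]] = [1.0, -2.0, 1.5]           # sparse Radon panel
d = L @ m_true
d[rng.choice(n_data, 10, replace=False)] += 20.0  # erratic (blending-like) noise

def robust_radon(L, d, lam=1e-2, n_iter=10, eps=1e-3):
    """IRLS: reweight residuals so large (erratic) errors lose influence,
    approximating an l1 misfit with a quadratic model penalty."""
    m = np.zeros(L.shape[1])
    for _ in range(n_iter):
        r = L @ m - d
        w = 1.0 / np.sqrt(np.abs(r) + eps)        # l1-approximating weights
        Lw, dw = w[:, None] * L, w * d
        m = np.linalg.solve(Lw.T @ Lw + lam * np.eye(L.shape[1]), Lw.T @ dw)
    return m

m_robust = robust_radon(L, d)
m_ls = np.linalg.solve(L.T @ L + 1e-2 * np.eye(n_model), L.T @ d)
print("robust error:", np.linalg.norm(m_robust - m_true))
print("least-squares error:", np.linalg.norm(m_ls - m_true))
```

The reweighting makes each squared residual count roughly as its absolute value, which is why the erratic traces stop dominating the fit, whereas the plain least-squares solution is pulled toward them.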

2012 ◽  
Vol 588-589 ◽  
pp. 1316-1319
Author(s):  
Zhe Zheng ◽  
Li Hong Lv ◽  
Jie Jiang ◽  
Yang Zhou

A high-accuracy channel simulator plays an important role in docking experiments between the ground station and the responder beacon. This paper first introduces the data generation algorithm, covering simulation-based data generation and the principle of the linear least-squares algorithm, and then proposes a least-squares quadratic-spline method to generate highly accurate data for the channel simulator. It then describes the system design that realizes the data generation. Finally, a case study of the approximation, together with an error analysis of the data generation algorithm, is presented. The algorithm is accurate, and its source data are easy to obtain. Its core is to use data from the Satellite Tool Kit to generate distance and speed sequences, approximating the real data by least squares and fitting with a quadratic spline to obtain highly accurate data.
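As a rough illustration of the least-squares quadratic-spline idea, the sketch below fits noisy range samples (a synthetic stand-in for STK-generated ephemeris data, not the paper's actual output) with a degree-2 spline and differentiates it to recover speed. Knot placement and noise levels are assumptions.

```python
import numpy as np
from scipy.interpolate import make_lsq_spline

t = np.linspace(0.0, 10.0, 200)                      # time samples [s]
range_true = 1.0e5 + 50.0 * t + 0.8 * t**2           # smooth range [m]
range_noisy = range_true + np.random.default_rng(1).normal(0, 5.0, t.size)

k = 2                                                # quadratic spline
interior = np.linspace(1.0, 9.0, 8)                  # interior knots (assumed)
knots = np.r_[[t[0]] * (k + 1), interior, [t[-1]] * (k + 1)]
spline = make_lsq_spline(t, range_noisy, knots, k=k) # least-squares fit

print("max fit error vs. truth [m]:", np.max(np.abs(spline(t) - range_true)))
print("speed at t=5 s [m/s]:", spline.derivative()(5.0))  # d(range)/dt
```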


Geophysics ◽  
2018 ◽  
Vol 83 (1) ◽  
pp. V39-V48 ◽  
Author(s):  
Ali Gholami ◽  
Toktam Zand

The focusing power of the conventional hyperbolic Radon transform decreases for long-offset seismic data due to the nonhyperbolic behavior of moveout curves at far offsets. Furthermore, conventional Radon transforms are ineffective for processing data sets containing events of different shapes. The shifted hyperbola is a flexible three-parameter (zero-offset traveltime, slowness, and focusing-depth) function, which is capable of generating linear and hyperbolic shapes and improves the accuracy of the seismic traveltime approximation at far offsets. A Radon transform based on shifted hyperbolas thus improves the focusing of seismic events in the transform domain. We have developed a new method for the effective decomposition of seismic data using this three-parameter Radon transform. A very fast algorithm is constructed for high-resolution calculations of the new Radon transform using the recently proposed generalized Fourier slice theorem (GFST). The GFST establishes an analytic expression between the Fourier coefficients of the data and the Fourier coefficients of its Radon transform, with which very fast switching between the model and data spaces is possible by means of interpolation procedures and fast Fourier transforms. The high performance of the new algorithm is demonstrated on synthetic and real data sets for trace interpolation and linear (ground-roll) noise attenuation.
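To make the flexibility of the shifted hyperbola concrete, the sketch below uses the common de Bazelaire/Castle parameterization; the paper's exact (zero-offset traveltime, slowness, focusing-depth) notation may differ, and all parameter values are illustrative.

```python
import numpy as np

def shifted_hyperbola(x, t0, v, S):
    """Shifted-hyperbola traveltime (Castle/de Bazelaire form). With S = 1
    it reduces to the standard hyperbola sqrt(t0**2 + x**2 / v**2); other
    S values bend the far-offset asymptote, which is the extra degree of
    freedom the three-parameter Radon transform exploits."""
    tau = t0 * (1.0 - 1.0 / S)
    return tau + np.sqrt((t0 / S) ** 2 + x**2 / (S * v**2))

offsets = np.linspace(0.0, 4000.0, 5)                       # offsets [m]
print(shifted_hyperbola(offsets, t0=1.0, v=2000.0, S=1.0))  # pure hyperbola
print(shifted_hyperbola(offsets, t0=1.0, v=2000.0, S=1.4))  # nonhyperbolic moveout
```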


2021 ◽  
Vol 11 (22) ◽  
pp. 10606
Author(s):  
Óscar Gómez-Cárdenes ◽  
José G. Marichal-Hernández ◽  
Jonas Phillip Lüke ◽  
José M. Rodríguez-Ramos

The multi-scale discrete Radon transform (DRT) calculates, with linearithmic complexity, the summation of pixels along a set of discrete lines covering all possible slopes and intercepts in an image, using only integer arithmetic operations. An exact and fast inversion algorithm exists, despite being iterative. In this work, the DRT forward and backward pair is evolved into two faster algorithms: the central DRT, which computes only the central portion of the intercepts, and the periodic DRT, which computes the line integrals on the periodic extension of the input. Both produce an output of size N×4N, instead of the 3N×4N of the original algorithm. The periodic DRT is proven to have a fast inversion, whereas the central DRT does not. An interesting application of the periodic DRT is its use as a building block of the discrete curvelet transform. The central DRT can provide almost a 2× speedup over the conventional DRT, probably making it the fastest Radon transform algorithm available, at the cost of ignoring 15% of the summations in the corners.
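For reference, the sketch below shows the sums a DRT computes, using a naive O(N³) loop restricted to one slope quadrant; the multi-scale algorithm obtains the same quantities recursively in O(N² log N) with integer arithmetic, which this toy does not attempt.

```python
import numpy as np

def naive_drt(img):
    """Sum pixels along digital lines row = round(s * col) + shift, for
    slopes s in [0, 1] and a range of intercept shifts (one quadrant only)."""
    n = img.shape[0]
    out = np.zeros((n, 2 * n))                      # slopes x intercepts
    xs = np.arange(n)
    for si, s in enumerate(np.linspace(0.0, 1.0, n)):
        ys = np.rint(s * xs).astype(int)
        for b in range(2 * n):
            rows = ys + b - n                       # shifted intercept range
            ok = (rows >= 0) & (rows < n)
            out[si, b] = img[rows[ok], xs[ok]].sum()
    return out

img = np.zeros((32, 32))
img[np.rint((16 / 31) * np.arange(32)).astype(int), np.arange(32)] = 1.0  # one line
sino = naive_drt(img)
print("peak at (slope index, intercept):",
      np.unravel_index(sino.argmax(), sino.shape))
```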


Geophysics ◽  
1997 ◽  
Vol 62 (1) ◽  
pp. 362-364 ◽  
Author(s):  
Ottilie F. Cools ◽  
Gérard C. Herman ◽  
Raphic M. van der Welden ◽  
Frans B. Kets

Radon transforms can be used to decompose seismic shot records into sets of plane waves and, as such, are a useful processing tool. Haneveld and Herman (1990) discussed a fast algorithm for the numerical evaluation of both the forward and inverse 2-D Radon transforms. They showed that, by rewriting the transform as a convolution, the computation time becomes proportional to N log N instead of N² (where N denotes the number of input and output traces). In the present paper, we describe a similar method for the computation of the 3-D Radon transform for the case of rotational symmetry (see also Mallick and Frazer, 1987; McCowan and Brysk, 1989). With the aid of asymptotic techniques, the 3-D Radon transform is recast into a form similar to the 2-D Radon transform, after which similar acceleration techniques are used. We have implemented and tested the fast transform on synthetic as well as real data and found that the computation time of the fast 3-D Radon transform is indeed proportional to N log N.
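A minimal sketch of the underlying plane-wave decomposition, written as an explicit frequency-domain phase-shift-and-sum over offsets; the convolutional reorganization that yields the N log N cost is omitted here, and all sizes and event parameters are illustrative.

```python
import numpy as np

def slant_stack(d, dt, x, p):
    """Linear Radon (slant stack) via per-frequency phase shifts:
    m(tau, p) = sum_x d(tau + p*x, x), evaluated in the f domain."""
    nt = d.shape[0]
    f = np.fft.rfftfreq(nt, dt)                    # frequencies [Hz]
    D = np.fft.rfft(d, axis=0)                     # (nf, nx)
    phase = np.exp(2j * np.pi
                   * f[:, None, None] * p[None, :, None] * x[None, None, :])
    M = (phase * D[:, None, :]).sum(axis=-1)       # (nf, np)
    return np.fft.irfft(M, n=nt, axis=0)

nt, dt = 256, 0.004
x = np.linspace(0.0, 1000.0, 48)                   # offsets [m]
p = np.linspace(-0.5e-3, 0.5e-3, 64)               # slownesses [s/m]
d = np.zeros((nt, x.size))
d[(0.1 / dt + 0.3e-3 / dt * x).astype(int), np.arange(x.size)] = 1.0  # linear event
m = slant_stack(d, dt, x, p)
print("max energy near p =", p[np.argmax(np.max(np.abs(m), axis=0))], "s/m")
```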


2018 ◽  
Author(s):  
Ricardo Guedes ◽  
Vasco Furtado ◽  
Tarcísio Pequeno ◽  
Joel Rodrigues

The article investigates policies for helping emergency-center authorities dispatch resources with the aim of reducing response time, reducing the number of unattended calls, attending priority calls, and lowering the cost of vehicle displacement. The Pareto set is shown to be an appropriate way to represent dispatch policies, since it naturally fits the challenges of multi-objective optimization. By means of the concept of Pareto dominance, a set of objectives can be ordered in a way that guides the dispatch of resources. Instead of manually trying to identify the best dispatching strategy, a multi-objective evolutionary algorithm coupled with an emergency call simulator automatically uncovers the best approximation of the optimal Pareto set, which indicates the importance of each objective and, consequently, the order in which calls are attended. The validation scenario is a large metropolis in Brazil, using one year of real data from 911 calls. Comparisons are made with traditional policies proposed in the literature, as well as with other innovative policies inspired by domains such as computer science and operational research. The results show that ranking the calls from a Pareto set discovered by the evolutionary method is a good option: it has the second-lowest waiting time, serves almost 100% of priority calls, is the second most economical, and is second in the number of calls attended. That is, it is a strategy in which all four dimensions are considered without major impairment to any of them.
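The Pareto machinery the article relies on is compact enough to sketch: a candidate outcome dominates another if it is no worse in every objective and strictly better in at least one. The objective vectors below (waiting time, unattended calls, unserved priority calls, displacement cost, all minimized) are made up for illustration.

```python
from typing import List, Tuple

def dominates(a: Tuple[float, ...], b: Tuple[float, ...]) -> bool:
    """a dominates b: no worse everywhere, strictly better somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points: List[Tuple[float, ...]]) -> List[Tuple[float, ...]]:
    """Keep only the non-dominated candidates."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

candidates = [
    (4.0, 2, 0, 10.0),   # slower, but serves all priority calls
    (3.0, 5, 1, 8.0),
    (5.0, 2, 0, 12.0),   # dominated by the first candidate
    (2.5, 6, 2, 6.0),
]
print(pareto_front(candidates))
```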


2021 ◽  
Vol 11 (11) ◽  
pp. 5043
Author(s):  
Xi Chen ◽  
Bo Kang ◽  
Jefrey Lijffijt ◽  
Tijl De Bie

Many real-world problems can be formalized as predicting links in a partially observed network. Examples include Facebook friendship suggestions, the prediction of protein–protein interactions, and the identification of hidden relationships in a crime network. Several link prediction algorithms, notably those recently introduced using network embedding, are capable of doing this by relying on the observed part of the network alone. Often, whether two nodes are linked can be queried, albeit at a substantial cost (e.g., by questionnaires, wet lab experiments, or undercover work). Such additional information can improve the link prediction accuracy, but owing to the cost, the queries must be made with due consideration. Thus, we argue that an active learning approach is of great potential interest, and we developed ALPINE (Active Link Prediction usIng Network Embedding), a framework that identifies the most useful link status to query by estimating the improvement in link prediction accuracy that would be gained by querying it. We propose several query strategies for use in combination with ALPINE, inspired by the optimal experimental design and active learning literature. Experimental results on real data not only show that ALPINE is scalable and boosts link prediction accuracy with far fewer queries, but also shed light on the relative merits of the strategies, providing actionable guidance for practitioners.
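To show the shape of such an active querying loop, here is an illustrative stand-in: ALPINE's actual criterion estimates the gain in prediction accuracy, whereas the rule below is plain uncertainty sampling (query the pair whose predicted link probability is closest to 0.5), a simpler strategy from the active learning literature. All data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)
n_pairs = 1000
true_links = rng.random(n_pairs) < 0.1             # hidden ground truth
# Crude stand-in for a link predictor's probability estimates.
probs = np.clip(true_links * 0.6 + rng.random(n_pairs) * 0.4, 0.0, 1.0)

observed = {}                                      # pair index -> link status
for _ in range(20):                                # query budget
    unknown = [i for i in range(n_pairs) if i not in observed]
    i = min(unknown, key=lambda j: abs(probs[j] - 0.5))   # most uncertain pair
    observed[i] = bool(true_links[i])              # the costly query (oracle here)
    probs[i] = 1.0 if observed[i] else 0.0         # fold the answer back in

print(f"queried {len(observed)} pairs, "
      f"{sum(observed.values())} turned out to be links")
```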


Geophysics ◽  
2012 ◽  
Vol 77 (6) ◽  
pp. S131-S143 ◽  
Author(s):  
Alexander Klokov ◽  
Sergey Fomel

Common-reflection angle migration can produce migrated gathers either in the scattering-angle domain or in the dip-angle domain. The latter reveals a clear distinction between reflection and diffraction events. We derived analytical expressions for events in the dip-angle domain and found that the shape difference can be used for reflection/diffraction separation. We defined reflection and diffraction models in the Radon space. The Radon transform allowed us to isolate diffractions from reflections and noise. The separation procedure can be performed after either time migration or depth migration. Synthetic and real data examples confirmed the validity of this technique.


Geophysics ◽  
2012 ◽  
Vol 77 (3) ◽  
pp. A9-A12 ◽  
Author(s):  
Kees Wapenaar ◽  
Joost van der Neut ◽  
Jan Thorbecke

Deblending of simultaneous-source data is usually considered an underdetermined inverse problem, which can be solved by an iterative procedure under additional constraints such as sparsity and coherency. By exploiting the fact that seismic data are spatially band-limited, deblending of densely sampled sources can instead be carried out as a direct inversion process without imposing these constraints. We applied the method to numerically modeled data, and it suppressed the crosstalk well when the blended data consisted of responses to adjacent, densely sampled sources.
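A generic toy of the stated principle, not the authors' operator: if the deblended data live in a low-wavenumber (spatially band-limited) subspace, the blending operator restricted to that subspace becomes a small, well-posed system that can be inverted directly, with no sparsity iterations. All shapes, delays, and sizes below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
n_src, n_coef, n_exp = 64, 9, 16    # dense sources, band limit, blended experiments

x = np.arange(n_src) / n_src
k = np.arange(-(n_coef // 2), n_coef // 2 + 1)
F = np.exp(2j * np.pi * np.outer(x, k))   # band-limited synthesis (n_src x n_coef)

c_true = rng.standard_normal(n_coef) + 1j * rng.standard_normal(n_coef)
d_true = F @ c_true                        # dense source gather, one frequency

# Each blended experiment fires all sources with random phase (time) delays.
Gamma = np.exp(-2j * np.pi * rng.random((n_exp, n_src)))
b = Gamma @ d_true                         # blended measurements

A = Gamma @ F                              # blending restricted to the subspace
c_hat = np.linalg.lstsq(A, b, rcond=None)[0]   # direct (pseudo-)inversion
print("relative error:",
      np.linalg.norm(F @ c_hat - d_true) / np.linalg.norm(d_true))
```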


2019 ◽  
Vol 44 (4) ◽  
pp. 407-426
Author(s):  
Jedrzej Musial ◽  
Emmanuel Kieffer ◽  
Mateusz Guzek ◽  
Gregoire Danoy ◽  
Shyam S. Wagle ◽  
...  

Cloud computing has become one of the major computing paradigms. Not only has the number of offered cloud services grown exponentially, but many different providers also compete by proposing very similar services. This situation should eventually benefit customers, but given that these services differ slightly in functional and non-functional properties (e.g., performance, reliability, security), consumers may be confused and unable to make an optimal choice. The emergence of cloud service brokers addresses these issues. A broker gathers information about services from providers and about the needs and requirements of customers, with the final goal of finding the best match. In this paper, we formalize and study a novel problem that arises in the area of cloud brokering. In its simplest form, brokering is a trivial assignment problem, but in more complex and realistic cases this no longer holds. The novelty of the presented problem lies in considering services that can be sold in bundles. Bundling is a common business practice in which a set of services is sold together for a lower price than the sum of the prices of the individual services. This work introduces a multi-criteria optimization problem that can help customers determine the best IT solutions according to several criteria. The Cloud Brokering with Bundles (CBB) problem models the different IT packages (or bundles) found on the market while minimizing (or maximizing) the different criteria. A proof of complexity is given for the single-objective case, and experiments have been conducted for a special case with two criteria: the first being the cost and the second artificially generated. We also designed and developed a benchmark generator based on real data gathered from 19 cloud providers. The problem is solved using an exact optimizer relying on a dichotomic search method. The results show that the dichotomic search can be successfully applied to small instances corresponding to typical cloud-brokering use cases and returns results within seconds. For larger problem instances, solving times are not prohibitive, and solutions for large corporate clients could be obtained within minutes.
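A toy sketch of the bi-objective dichotomic search idea on a tiny bundle-selection instance: supported Pareto points are found by recursively solving weighted-sum single-objective problems between pairs of known points. The bundles, service names, and costs below are made up, and the single-objective subproblems are brute-forced, which is viable only at this scale (the paper uses an exact optimizer).

```python
from itertools import combinations

bundles = {
    "A": ({"vm", "storage"}, (10.0, 2.0)),
    "B": ({"storage", "db"}, (6.0, 8.0)),
    "C": ({"vm"}, (3.0, 9.0)),
    "D": ({"db"}, (4.0, 3.0)),
}
required = {"vm", "storage", "db"}

def best(w1, w2):
    """Exact single-objective solve: min w1*c1 + w2*c2 over feasible subsets."""
    feasible = []
    names = list(bundles)
    for r in range(1, len(names) + 1):
        for combo in combinations(names, r):
            covered = set().union(*(bundles[n][0] for n in combo))
            if required <= covered:
                c1 = sum(bundles[n][1][0] for n in combo)
                c2 = sum(bundles[n][1][1] for n in combo)
                feasible.append(((c1, c2), combo))
    return min(feasible, key=lambda f: w1 * f[0][0] + w2 * f[0][1])

def dichotomic(p, q, found):
    """Search between Pareto points p and q with weights normal to segment pq."""
    w1, w2 = p[0][1] - q[0][1], q[0][0] - p[0][0]
    r = best(w1, w2)
    if r[0] not in (p[0], q[0]):                   # a new supported point
        found.append(r)
        dichotomic(p, r, found)
        dichotomic(r, q, found)

left, right = best(1.0, 0.0), best(0.0, 1.0)       # extreme points per objective
points = [left, right]
dichotomic(left, right, points)
print(sorted(points))
```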

