Imaging with surface-related multiples to overcome large acquisition gaps

Author(s):  
Aparajita Nath ◽  
Dirk J Verschuur

Abstract
To get the best result for seismic imaging using primary reflections, data with densely spaced sources and receivers are preferred. However, dense acquisition can be hindered by various obstacles, such as platforms or complex topography. Areas with large data gaps may deter exploration or monitoring, as conventional imaging strategies would either provide poor seismic images or turn out to be very expensive. Surface-related multiples travel along different paths than primaries, illuminating a wider subsurface area, which makes them valuable for data with large gaps. We propose different strategies for using surface-related multiples to get around the problem of imaging in the presence of a large data gap. Conventional least-squares imaging methods that incorporate surface-related multiples do so by re-injecting the measured wavefield in the forward-modelling process, which keeps them sensitive to missing data. We introduce a ‘non-linear’ inversion approach in which the surface multiples are modelled from the original source field. This makes the method less dependent on the receiver geometry, thereby effectively exploiting the information from surface multiples in cases of limited illumination. However, such an approach is sensitive to knowledge of the source properties. Therefore, we propose a ‘hybrid’ method that combines the non-linear imaging method with the conventional ‘linear’ multiple imaging method, which further improves the imaging result. We test the methods on numerical as well as field data. The results show that, by exploiting the surface multiples to the maximum extent, artefacts caused by incomplete data in images from linear imaging methods are substantially removed.
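As a hypothetical illustration (not the authors' implementation), the contrast between the linear and non-linear multiple models can be sketched with a toy matrix formalism, where `X` stands in for a primary impulse-response operator, `s` for the source field, and `r` for the surface reflection coefficient; all names and the first-order truncation are assumptions made for this sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8                               # number of surface positions (toy size)
X = 0.1 * rng.normal(size=(n, n))   # toy primary impulse-response operator
s = np.ones(n)                      # source field
r = -1.0                            # surface reflection coefficient

p = X @ s                           # primaries
d_full = p + r * (X @ p)            # data with first-order surface multiples

# Simulate a large acquisition gap: half the receivers are missing.
mask = np.ones(n)
mask[n // 2:] = 0.0
d_gap = mask * d_full

# 'Linear' modelling re-injects the *measured* wavefield, so missing
# receivers corrupt the predicted multiples:
m_linear = r * (X @ d_gap)

# 'Non-linear' modelling builds multiples from the source field instead,
# independent of the receiver geometry:
m_nonlinear = r * (X @ (X @ s))

true_multiples = r * (X @ p)
err_lin = np.linalg.norm(m_linear - true_multiples)
err_nl = np.linalg.norm(m_nonlinear - true_multiples)
```

In this toy setting the non-linear prediction reproduces the true first-order multiples exactly (since `p = X @ s`), while the linear prediction degrades as soon as receivers are missing, which is the sensitivity the abstract describes.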

2006 ◽  
Vol 39 (2) ◽  
pp. 262-266 ◽  
Author(s):  
R. J. Davies

Synchrotron sources offer high-brilliance X-ray beams which are ideal for spatially and time-resolved studies. Large amounts of wide- and small-angle X-ray scattering data can now be generated rapidly, for example, during routine scanning experiments. Consequently, the analysis of the large data sets produced has become a complex and pressing issue. Even relatively simple analyses become difficult when a single data set can contain many thousands of individual diffraction patterns. This article reports on a new software application for the automated analysis of scattering intensity profiles. It is capable of batch-processing thousands of individual data files without user intervention. Diffraction data can be fitted using a combination of background functions and non-linear peak functions. To complement the batch-wise operation mode, the software includes several specialist algorithms to ensure that the results obtained are reliable. These include peak-tracking, artefact removal, function elimination and spread-estimate fitting. Furthermore, as well as non-linear fitting, the software can calculate integrated intensities and selected orientation parameters.
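A minimal sketch of the kind of per-pattern step such a batch processor repeats: estimate a linear background from the profile's edges, subtract it, and report the integrated intensity and peak centroid. This is an illustrative stand-in, not the actual algorithm of the software described above; the function name and the edge-window heuristic are assumptions:

```python
import numpy as np

def integrate_peak(x, y, bg_window=10):
    """Toy peak analysis: fit a straight-line background through the
    first/last bg_window points, subtract it, and return the integrated
    intensity and intensity-weighted centroid of the remaining peak."""
    xe = np.r_[x[:bg_window], x[-bg_window:]]
    ye = np.r_[y[:bg_window], y[-bg_window:]]
    slope, intercept = np.polyfit(xe, ye, 1)
    signal = y - (slope * x + intercept)
    dx = x[1] - x[0]                          # uniform sampling assumed
    area = signal.sum() * dx                  # integrated intensity
    centroid = (x * signal).sum() * dx / area # peak position
    return area, centroid

# Batch mode: apply the same analysis to many profiles without intervention.
x = np.linspace(0.0, 10.0, 200)
profiles = [np.exp(-(x - c) ** 2 / 0.5) + 0.1 * x + 0.2
            for c in (4.0, 5.0, 6.0)]
results = [integrate_peak(x, p) for p in profiles]
```

Each synthetic profile is a Gaussian peak on a sloping background; the recovered centroids track the peak positions across the batch, which is the essence of unattended peak-tracking over thousands of files.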


2003 ◽  
Vol 51 (4) ◽  
pp. 285-293 ◽  
Author(s):  
A. Abubakar ◽  
P.M. van den Berg ◽  
J.T. Fokkema

2021 ◽  
Vol 2 (43) ◽  
pp. 54-61
Author(s):  
Dmitriy A. Burynin ◽  
Aleksandr A. Smirnov

Portable spectroradiometers and hyperspectral cameras are increasingly being used to quickly assess the physiological state of plants. The operation of these devices is based on the registration of reflection, or reflection and transmission, spectra. (Research purpose) The research purpose is to analyze the technical means and methods of non-invasive monitoring of the plant state based on the registration of the reflection spectra of leaves. (Materials and methods) The article presents a review of the work on the application of hyperspectral imaging methods. The authors classified and analyzed materials on spectroscopic radiometers and hyperspectral cameras, and outlined the prospects for implementation. The authors applied the methods of a systematic approach to the research problem. (Results and discussion) Hyperspectral imaging methods serve as an effective means of monitoring plants. It is possible to determine the pigment composition of plants, detect a lack of nutrition, and detect biotic stress through hyperspectral imaging. The article presents methods of application of portable spectroradiometers and hyperspectral cameras. With the help of these devices it is possible to carry out measurements with high spectral resolution. The difficulty of accurately detecting the content of pigments in the leaves lies in the mutual overlap of their light-absorption regions. The main drawback of spectroradiometers is that they measure at only one point on a single leaf. The article presents the difficulties encountered in interpreting the results obtained by the hyperspectral camera. The background reflectivity of the soil, the geometry of the vegetation cover, and uneven lighting can introduce errors into the measurements. (Conclusions) The article presents the disadvantages of the hyperspectral imaging method when using only the reflection spectrum.
In order to increase the accuracy of the determination of pigments and stresses of various origins, it is necessary to develop a portable device that combines the methods of recording reflection and fluorescence.
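As an illustration of the reflection-spectrum approach the review discusses, a vegetation index can be computed from band-averaged reflectance around the chlorophyll absorption (red) and the near-infrared plateau. The toy sigmoid "red edge" spectrum and the band centres below are assumptions for the sketch, not data from the article:

```python
import numpy as np

def band_mean(wavelengths, reflectance, center, width=10.0):
    """Average reflectance within +/- width nm of a band centre."""
    sel = np.abs(wavelengths - center) <= width
    return reflectance[sel].mean()

# Toy leaf reflectance spectrum: low in the red (chlorophyll absorption),
# high on the near-infrared plateau, with a sigmoid red edge near 720 nm.
wl = np.arange(400.0, 901.0, 1.0)
refl = 0.05 + 0.45 / (1.0 + np.exp(-(wl - 720.0) / 15.0))

red = band_mean(wl, refl, 670.0)   # chlorophyll absorption band
nir = band_mean(wl, refl, 800.0)   # near-infrared plateau
ndvi = (nir - red) / (nir + red)   # classic vegetation index
```

A healthy-leaf-like spectrum yields a high NDVI; pigment loss raises red reflectance and pulls the index down, which is the kind of signal these portable instruments exploit. The overlap of pigment absorption bands mentioned above is exactly why single indices like this are ambiguous and why combining reflection with fluorescence is proposed.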


1999 ◽  
pp. 51-52
Author(s):  
K. I. Marchenkov ◽  
I. W. Roxburgh ◽  
S. V. Vorontsov

Author(s):  
Diego Liberati

In many fields of research, as well as in everyday life, one often has to face a huge amount of data without an immediate grasp of an underlying simple structure, which often exists. A typical example is the growing field of bio-informatics, where new technologies, like so-called micro-arrays, provide thousands of gene-expression data on a single cell in a simple and fast integrated way. On the other hand, the everyday consumer is involved in a logically similar process when the data associated with his loyalty card contribute to the large database of many customers, whose underlying consumption trends are of interest to the distribution market. After collecting so many variables (say gene expressions, or goods) for so many records (say patients, or customers), possibly with the help of wrapping or warehousing approaches in order to mediate among different repositories, the problem arises of reconstructing a synthetic mathematical model capturing the most important relations between variables. To this purpose, two critical problems must be solved: (1) to select the most salient variables, in order to reduce the dimensionality of the problem and so simplify the understanding of the solution; and (2) to extract underlying rules implying conjunctions and/or disjunctions between such variables, in order to gain a first idea of their possibly non-linear relations, as a first step towards designing a representative model whose variables will be the selected ones. Once the candidate variables are selected, a mathematical model of the dynamics of the underlying generating framework is still to be produced. A first hypothesis of linearity may be investigated, but it is usually only a very rough approximation when the values of the variables are not close to the operating point around which the linear approximation is computed.
On the other hand, building a non-linear model is far from easy: the structure of the non-linearity needs to be known a priori, which is usually not the case. A typical approach consists in exploiting a priori knowledge to define a tentative structure, then refining and modifying it on the training subset of the data, finally retaining the structure that best fits a cross-validation on the testing subset. The problem is even more complex when the collected data exhibit hybrid dynamics, i.e. their evolution in time is a sequence of smooth behaviors and abrupt changes.
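A minimal sketch of the two steps listed above, salient-variable selection followed by rule extraction, on synthetic data. Ranking variables by absolute correlation with the outcome and thresholding at zero are illustrative choices for this sketch, not the chapter's specific algorithms:

```python
import numpy as np

rng = np.random.default_rng(1)
n_records, n_vars = 200, 50
X = rng.normal(size=(n_records, n_vars))     # e.g. gene expressions per patient
# The outcome depends on only two of the fifty variables.
y = (X[:, 3] + X[:, 17] > 0).astype(int)

# Step 1: select the most salient variables by absolute correlation
# with the outcome, reducing the dimensionality from 50 to 2.
corr = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(n_vars)])
salient = np.argsort(corr)[-2:]              # keep the top two

# Step 2: extract a simple disjunctive rule on the selected variables
# by thresholding each at zero and OR-ing the conditions.
rule = (X[:, salient[0]] > 0) | (X[:, salient[1]] > 0)
accuracy = (rule.astype(int) == y).mean()
```

Even this crude rule recovers most of the outcome's structure, illustrating how selection plus conjunctive/disjunctive rules give a first, interpretable approximation before any non-linear dynamical model is attempted.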

