2D Deconvolution Using Adaptive Kernel

Proceedings ◽  
2019 ◽  
Vol 33 (1) ◽  
pp. 6
Author(s):  
Dirk Nille ◽  
Udo von Toussaint

An analysis tool using Adaptive Kernels to solve ill-posed inverse problems on a 2D model space is introduced. It is applicable to linear and non-linear forward models, for example in tomography and image reconstruction. While an optimisation based on a Gaussian Approximation is possible, it becomes intractable for more than a few hundred kernel functions, because the determinant of the Hessian of the system has to be evaluated. The SVD typically used for 1D problems fails with increasing problem size. Alternatively, Stochastic Trace Estimation can be used, giving a reasonable approximation. An alternative to searching for the MAP solution is to integrate using Markov Chain Monte Carlo, which avoids evaluating the determinant of the Hessian altogether and also allows treating problems where a linear approximation is not justified.
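
The abstract gives no implementation details; the following is a minimal sketch of Hutchinson-style stochastic trace estimation, the standard form of the technique mentioned above. For log-determinants, the same probing idea is applied to tr(log H), e.g. via Lanczos quadrature. All names and parameters here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def hutchinson_trace(matvec, n, num_probes=64, seed=0):
    """Estimate tr(A) for a linear operator available only through matvec(v) = A @ v.

    Rademacher probe vectors satisfy E[z^T A z] = tr(A); averaging over
    many probes reduces the variance of the estimate.
    """
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(num_probes):
        z = rng.choice([-1.0, 1.0], size=n)
        total += z @ matvec(z)
    return total / num_probes

# Sanity check against an explicit symmetric positive definite "Hessian".
B = np.random.default_rng(1).normal(size=(200, 200))
H = B @ B.T
print(np.trace(H), hutchinson_trace(lambda v: H @ v, 200, num_probes=500))
```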

2007 ◽  
Vol 10-12 ◽  
pp. 737-741 ◽  
Author(s):  
S.C. Wang ◽  
J. Han ◽  
Jian Feng Li ◽  
Zhi Nong Li

Because of the deficiency of fixed kernels in bilinear time-frequency distributions (TFDs), namely that for each fixed mapping the resulting time-frequency representation is satisfactory only for a limited class of signals, a new adaptive kernel function named the radial parabola kernel (RPK) is proposed. The RPK adopts an optimization method to filter cross-terms adaptively according to the signal distribution, obtains good time-frequency resolution, and offers an improved TFD for a large class of signals. Compared with traditional fixed-kernel functions, such as the Wigner-Ville distribution, the Choi-Williams distribution, and the cone-kernel distribution, the superiority of the RPK function is obvious. Finally, the RPK function is applied to the analysis of bearing vibration signals, and the results show that the RPK function is an effective method for signal analysis.
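
The RPK itself is not specified in the abstract; as background, here is a minimal sketch of the fixed-kernel baseline it is compared against, a discrete pseudo Wigner-Ville distribution (assumed conventions; x should be an analytic signal).

```python
import numpy as np
from scipy.signal import hilbert, chirp

def wigner_ville(x):
    """Discrete pseudo Wigner-Ville distribution of an analytic signal x.

    W[n, k] is the DFT over lag m of the instantaneous autocorrelation
    x[n + m] * conj(x[n - m]); lags are truncated near the signal edges.
    """
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        max_lag = min(n, N - 1 - n)
        r = np.zeros(N, dtype=complex)
        for m in range(-max_lag, max_lag + 1):
            r[m % N] = x[n + m] * np.conj(x[n - m])
        W[n] = np.fft.fft(r).real  # hermitian in m, so the DFT is real
    return W

# Example: a linear chirp made analytic with the Hilbert transform.
t = np.linspace(0, 1, 256)
x = hilbert(chirp(t, f0=10, f1=100, t1=1))
tfd = wigner_ville(x)
```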


Geophysics ◽  
2012 ◽  
Vol 77 (2) ◽  
pp. R117-R127 ◽  
Author(s):  
Antoine Guitton ◽  
Gboyega Ayeni ◽  
Esteban Díaz

The waveform inversion problem is inherently ill-posed. Traditionally, regularization schemes are used to address this issue. For waveform inversion, where the model is expected to have many details reflecting the physical properties of the Earth, regularization and data fitting can work in opposite directions: the former smooths the model while the latter adds detail to it. We propose constraining estimated velocity fields by reparameterizing the model. This technique, also called model-space preconditioning, is based on directional Laplacian filters: it preserves most of the details of the velocity model while smoothing the solution along known geological dips. Preconditioning also yields faster convergence at early iterations. The Laplacian filters have the property of smoothing or annihilating local planar events according to a local dip field. By construction, these filters can be inverted and used in a preconditioned waveform inversion strategy to yield geologically meaningful models. We illustrate with 2D synthetic and field data examples how preconditioning with nonstationary directional Laplacian filters outperforms traditional waveform inversion when sparse data are inverted and when sharp velocity contrasts are present. Adding geological information through preconditioning could benefit full-waveform inversion of real data whenever irregular geometry, coherent noise, and a lack of low frequencies are present.
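
The paper's nonstationary directional Laplacian filters are not reproduced here; the toy sketch below only illustrates the underlying idea of a preconditioner that smooths a model along a geological dip, with the dip assumed constant for simplicity.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d, rotate

def smooth_along_dip(model, dip_deg, sigma=3.0):
    """Toy stationary preconditioner: smooth a 2D model along one dip.

    Rotate so the dip is horizontal, smooth along rows, rotate back.
    The paper's filters are nonstationary (dip varies per sample) and
    invertible by construction; this fixed-dip version is illustrative only.
    """
    r = rotate(model, dip_deg, reshape=False, mode='nearest')
    r = gaussian_filter1d(r, sigma=sigma, axis=1)
    return rotate(r, -dip_deg, reshape=False, mode='nearest')

# Example: a random model smoothed along a 30-degree dip.
m = np.random.default_rng(0).normal(size=(128, 128))
print(smooth_along_dip(m, 30.0).std())  # variance drops along the dip
```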


Geophysics ◽  
2003 ◽  
Vol 68 (2) ◽  
pp. 577-588 ◽  
Author(s):  
Sergey Fomel ◽  
Jon F. Claerbout

Constraining ill‐posed inverse problems often requires regularized optimization. We consider two alternative approaches to regularization. The first approach involves a column operator and an extension of the data space. It requires a regularization operator which enhances the undesirable features of the model. The second approach constructs a row operator and expands the model space. It employs a preconditioning operator which enforces a desirable behavior (such as smoothness) of the model. In large‐scale problems, when iterative optimization is incomplete, the second method is preferable, because it often leads to faster convergence. We propose a method for constructing preconditioning operators by multidimensional recursive filtering. The recursive filters are constructed by imposing helical boundary conditions. Several examples with synthetic and real data demonstrate an order of magnitude efficiency gain achieved by applying the proposed technique to data interpolation problems.
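
In assumed notation (F the forward operator, D the regularization operator, P a preconditioner with P roughly the inverse of D, and ε a trade-off parameter), the two formulations contrasted above can be written as:

```latex
% Column-operator (data-space) regularization: fit data, penalize D m
\min_{\mathbf{m}} \;\|\mathbf{F}\mathbf{m}-\mathbf{d}\|^{2}
  + \epsilon^{2}\,\|\mathbf{D}\mathbf{m}\|^{2}
\;\Longleftrightarrow\;
\begin{bmatrix}\mathbf{F}\\ \epsilon\,\mathbf{D}\end{bmatrix}\mathbf{m}
  \approx \begin{bmatrix}\mathbf{d}\\ \mathbf{0}\end{bmatrix}

% Row-operator preconditioning: expand the model space, m = P p
\mathbf{m}=\mathbf{P}\mathbf{p},\qquad
\min_{\mathbf{p}} \;\|\mathbf{F}\mathbf{P}\mathbf{p}-\mathbf{d}\|^{2}
  + \epsilon^{2}\,\|\mathbf{p}\|^{2},
\qquad \mathbf{P}\approx\mathbf{D}^{-1}
```

The helical recursive filters described in the abstract supply the invertible P; with incomplete iteration, the preconditioned form tends to fit the data first, which is why it converges faster in large-scale problems.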


Geophysics ◽  
2021 ◽  
pp. 1-41
Author(s):  
Nasser Kazemi ◽  
Mauricio D. Sacchi

The conventional Radon transform suffers from a lack of resolution when data kinematics and amplitudes differ from those of the Radon basis functions. In addition, a limited data aperture, missing traces, aliasing, a finite number of scanned ray parameters, noise, residual statics, and amplitude variation with offset (AVO) reduce the de-correlation power of the Radon basis functions. Posing Radon transform estimation as an inverse problem, searching for a sparse model that fits the data, improves the performance of the algorithm. However, because it averages along the offset axis, the conventional Radon transform cannot preserve AVO. Accordingly, we modify the Radon basis functions by extending the model domain along the offset direction. Extending the model space helps in fitting the data; however, computing the offset-extended Radon transform is an under-determined and ill-posed problem. To alleviate this shortcoming, we add model-domain sparsity and smoothing constraints to yield a stable solution. We develop an algorithm that uses offset-extended Radon basis functions and promotes sparsity in offset-stacked Radon images, in conjunction with a smoothing constraint along the offset axis. Because the inverted model is sparse and fits the data, muting common-offset Radon panels based on ray parameter/curvature is sufficient for separating primaries from multiples. We successfully apply the algorithm to suppress multiples in the presence of strong AVO on synthetic data and a real data example from the Gulf of Mexico, Mississippi Canyon. The results show that extending the Radon model space is necessary for improving the separation and suppression of multiples in the presence of strong AVO.
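
The offset-extended transform is specific to the paper; the sketch below shows only the conventional building block it extends, a time-domain linear Radon forward/adjoint pair with sparsity promoted by ISTA. All parameter choices are illustrative assumptions.

```python
import numpy as np

def shift(a, k):
    """Shift a 1D trace by k samples with zero fill (adjoint of shift by -k)."""
    out = np.zeros_like(a)
    n = a.shape[0]
    if 0 <= k < n:
        out[k:] = a[:n - k]
    elif -n < k < 0:
        out[:n + k] = a[-k:]
    return out

def radon_fwd(m, h, p, dt):
    """Linear Radon synthesis d(t, h) = sum_p m(t - p*h, p), integer shifts."""
    d = np.zeros((m.shape[0], len(h)))
    for ih, hh in enumerate(h):
        for ip, pp in enumerate(p):
            d[:, ih] += shift(m[:, ip], int(round(pp * hh / dt)))
    return d

def radon_adj(d, h, p, dt):
    """Adjoint of radon_fwd."""
    m = np.zeros((d.shape[0], len(p)))
    for ih, hh in enumerate(h):
        for ip, pp in enumerate(p):
            m[:, ip] += shift(d[:, ih], -int(round(pp * hh / dt)))
    return m

def sparse_radon(d, h, p, dt, lam=0.1, niter=100):
    """ISTA for min_m 0.5*||radon_fwd(m) - d||^2 + lam*||m||_1."""
    # Crude Lipschitz estimate by power iteration on A^T A.
    x = np.random.default_rng(0).normal(size=(d.shape[0], len(p)))
    for _ in range(10):
        x = radon_adj(radon_fwd(x, h, p, dt), h, p, dt)
        L = np.linalg.norm(x)
        x /= L
    m = np.zeros((d.shape[0], len(p)))
    for _ in range(niter):
        g = radon_adj(d - radon_fwd(m, h, p, dt), h, p, dt)
        m = m + g / L
        m = np.sign(m) * np.maximum(np.abs(m) - lam / L, 0.0)  # soft threshold
    return m
```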


1999 ◽  
Vol 11 (5) ◽  
pp. 1035-1068 ◽  
Author(s):  
David J. C. MacKay

I examine two approximate methods for computational implementation of Bayesian hierarchical models, that is, models that include unknown hyperparameters such as regularization constants and noise levels. In the evidence framework, the model parameters are integrated over, and the resulting evidence is maximized over the hyperparameters. The optimized hyperparameters are used to define a gaussian approximation to the posterior distribution. In the alternative MAP method, the true posterior probability is found by integrating over the hyperparameters. The true posterior is then maximized over the model parameters, and a gaussian approximation is made. The similarities of the two approaches and their relative merits are discussed, and comparisons are made with the ideal hierarchical Bayesian solution. In moderately ill-posed problems, integration over hyperparameters yields a probability distribution with a skew peak, which causes significant biases to arise in the MAP method. In contrast, the evidence framework is shown to introduce negligible predictive error under straightforward conditions. General lessons are drawn concerning inference in many dimensions.
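
In assumed notation (w the model parameters, α the hyperparameters, D the data), the two approximations compared above are:

```latex
% Evidence framework: integrate over w, then maximize over alpha
\hat{\alpha}=\arg\max_{\alpha} P(D\mid\alpha)
           =\arg\max_{\alpha}\int P(D\mid w)\,P(w\mid\alpha)\,dw

% MAP method: integrate over alpha, then maximize over w
\hat{w}=\arg\max_{w} P(w\mid D),\qquad
P(w\mid D)\propto P(D\mid w)\int P(w\mid\alpha)\,P(\alpha)\,d\alpha
```

In each case, a gaussian approximation is then made around the respective optimum.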


Mathematics ◽  
2020 ◽  
Vol 8 (10) ◽  
pp. 1846
Author(s):  
Xin Liu ◽  
Bangxin Zhao ◽  
Wenqing He

Simultaneous feature selection and classification have been explored in the literature as extensions of support vector machine (SVM) techniques, by adding penalty terms directly to the loss function. However, it is the kernel function that controls the performance of an SVM, and imbalance in the data will degrade its performance. In this paper, we examine a new method for simultaneous feature selection and binary classification. Instead of penalizing the standard SVM loss function, a penalty is added directly to a data-adaptive kernel function to control the performance of the SVM: the kernel function of the SVM is first conformally transformed, and an SVM classifier is then refitted based on the sparse features selected. Both convex and non-convex penalties, such as the least absolute shrinkage and selection operator (LASSO), the smoothly clipped absolute deviation (SCAD), and the minimax concave penalty (MCP), are explored, and the oracle property of the estimator is established accordingly. An iterative optimization procedure is applied, as no analytic form of the estimated coefficients is available. Numerical comparisons show that the proposed method outperforms its competitors when the data are imbalanced and performs similarly when the data are balanced. The method can easily be applied to medical images from different platforms.
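
The data-adaptive conformal kernel penalty is specific to the paper; as a simpler stand-in for the two-stage idea (sparse feature selection, then an SVM refit on the selected features), here is a scikit-learn sketch on invented, imbalanced toy data.

```python
import numpy as np
from sklearn.svm import LinearSVC, SVC
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Toy imbalanced data; class_weight='balanced' counters the imbalance the paper notes.
X, y = make_classification(n_samples=600, n_features=50, n_informative=5,
                           weights=[0.9, 0.1], random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, stratify=y, random_state=0)

# Stage 1: an L1-penalized linear SVM zeroes out uninformative coefficients.
selector = LinearSVC(penalty='l1', dual=False, C=0.1,
                     class_weight='balanced', max_iter=5000).fit(Xtr, ytr)
keep = np.flatnonzero(np.abs(selector.coef_.ravel()) > 1e-8)

# Stage 2: refit a kernel SVM on the selected features only.
clf = SVC(kernel='rbf', class_weight='balanced').fit(Xtr[:, keep], ytr)
print(f"{keep.size} features kept, test accuracy {clf.score(Xte[:, keep], yte):.3f}")
```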


Author(s):  
B. Roy Frieden

Despite the skill and determination of electro-optical system designers, the images acquired using their best designs often suffer from blur and noise. The aim of an "image enhancer" such as myself is to improve these poor images, usually by digital means, so that they better resemble the true "optical object" input to the system. This problem is notoriously "ill-posed": any direct inversion of the image data suffers strongly from the presence of even a small amount of noise in the data. In fact, the fluctuations engendered in neighboring output values tend to be strongly negatively correlated, so that the output spatially oscillates up and down, with large amplitude, about the true object. What can be done about this situation? As we shall see, various concepts taken from statistical communication theory have proven to be of real use in attacking this problem. We offer below a brief summary of these concepts.
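
A tiny numerical illustration of the noise amplification described above (all numbers invented): direct division by the transfer function blows up, while a Wiener-style regularized inverse, one of the statistical tools alluded to, stays stable.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 256
obj = np.zeros(n)
obj[100:140] = 1.0                                # true "optical object"
otf = np.fft.fftshift(np.exp(-np.linspace(-4, 4, n) ** 2))  # toy blur OTF

img = np.fft.ifft(np.fft.fft(obj) * otf).real     # blurred image
img += rng.normal(scale=1e-3, size=n)             # a *small* amount of noise

naive = np.fft.ifft(np.fft.fft(img) / otf).real   # direct inversion: oscillates wildly
wiener = np.fft.ifft(np.fft.fft(img) * otf.conj()
                     / (np.abs(otf) ** 2 + 1e-4)).real  # regularized inverse

print("naive max error: ", np.abs(naive - obj).max())
print("wiener max error:", np.abs(wiener - obj).max())
```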


Author(s):  
Melen McBride

Ethnogeriatrics is an evolving specialty in geriatric care that focuses on health and aging issues in the context of culture for older adults from diverse ethnic backgrounds. This article is an introduction to ethnogeriatrics for healthcare professionals, including speech-language pathologists (SLPs). It reviews significant factors that contributed to the development of ethnogeriatrics, defines some of its key concepts, introduces cohort analysis as a teaching and clinical tool, and presents applications for speech-language pathology, with recommendations for the use of cohort analysis in practice, teaching, and research.


2011 ◽  
Vol 21 (2) ◽  
pp. 44-54
Author(s):  
Kerry Callahan Mandulak

Spectral moment analysis (SMA) is an acoustic analysis tool that shows promise for enhancing our understanding of normal and disordered speech production. It can augment auditory-perceptual analysis used to investigate differences across speakers and groups and can provide unique information regarding specific aspects of the speech signal. The purpose of this paper is to illustrate the utility of SMA as a clinical measure for both clinical speech production assessment and research applications documenting speech outcome measurements. Although acoustic analysis has become more readily available and accessible, clinicians need training with, and exposure to, acoustic analysis methods in order to integrate them into traditional methods used to assess speech production.
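
For readers unfamiliar with the measure, here is a minimal sketch of how the four spectral moments are typically computed from a windowed frame (assumed conventions; SMA practice varies in pre-emphasis and windowing).

```python
import numpy as np

def spectral_moments(frame, sr):
    """First four spectral moments of a windowed signal frame.

    Treats the magnitude spectrum as a probability distribution over
    frequency and returns centroid (Hz), standard deviation (Hz),
    skewness, and excess kurtosis.
    """
    spec = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sr)
    p = spec / spec.sum()                               # normalize
    m1 = np.sum(freqs * p)                              # centroid
    m2 = np.sum((freqs - m1) ** 2 * p)                  # variance
    m3 = np.sum((freqs - m1) ** 3 * p) / m2 ** 1.5      # skewness
    m4 = np.sum((freqs - m1) ** 4 * p) / m2 ** 2 - 3    # excess kurtosis
    return m1, np.sqrt(m2), m3, m4
```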

