Tsallis Entropy, Likelihood, and the Robust Seismic Inversion

Entropy ◽  
2020 ◽  
Vol 22 (4) ◽  
pp. 464 ◽  
Author(s):  
Igo Pedro de Lima ◽  
Sérgio Luiz E. F. da Silva ◽  
Gilberto Corso ◽  
João M. de Araújo

The nonextensive statistical mechanics proposed by Tsallis has been successfully used to model and analyse many complex phenomena. Here, we study the role of generalized Tsallis statistics in inverse problem theory. Most inverse problems are formulated as an optimisation problem that aims to estimate the physical parameters of a system from indirect and partial observations. In the conventional approach, the misfit function to be minimised is the least-squares distance between the observed data and the modelled data (the residuals, or errors), in which the residuals are assumed to follow a Gaussian distribution. However, in many real situations the errors are typically non-Gaussian, and this technique therefore tends to fail. This problem motivated us to study misfit functions based on non-Gaussian statistics. In this work, we derive a misfit function based on the q-Gaussian distribution associated with the maximum entropy principle in the Tsallis formalism. We tested our method on a typical geophysical inverse problem, post-stack inversion (PSI), in which the physical parameter to be estimated is the Earth’s reflectivity. Our results show that PSI based on Tsallis statistics outperforms conventional PSI, especially in the non-Gaussian noisy-data case.
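The misfit the abstract refers to follows from the q-Gaussian negative log-likelihood: for residuals r_i and 1 < q < 3, one common parameterisation is phi_q = (1/(q-1)) * sum_i ln[1 + (q-1) * r_i^2 / (3-q)], which recovers the least-squares misfit as q -> 1. Below is a minimal sketch of this idea on a toy convolutional post-stack model; the Ricker wavelet, noise level, and q = 1.8 are illustrative assumptions, not the authors' setup.

```python
# Minimal sketch of a q-Gaussian misfit for post-stack inversion.
# The convolutional forward model and all numerical values are assumptions.
import numpy as np
from scipy.optimize import minimize

def q_misfit(r, q):
    # Negative log-likelihood of residuals r under a zero-mean q-Gaussian
    # (1 < q < 3); reduces to the least-squares misfit 0.5 * sum(r**2) as q -> 1.
    if abs(q - 1.0) < 1e-8:
        return 0.5 * np.sum(r ** 2)
    return np.sum(np.log1p((q - 1.0) * r ** 2 / (3.0 - q))) / (q - 1.0)

# Hypothetical convolutional post-stack model: trace = wavelet * reflectivity.
dt, f0 = 0.004, 25.0                      # sample interval (s), Ricker peak frequency (Hz)
t = np.arange(-20, 21) * dt
a = (np.pi * f0 * t) ** 2
wavelet = (1.0 - 2.0 * a) * np.exp(-a)    # Ricker wavelet
forward = lambda m: np.convolve(m, wavelet, mode="same")

rng = np.random.default_rng(0)
true_refl = np.zeros(120)
true_refl[[30, 60, 90]] = [1.0, -0.7, 0.5]
# Heavy-tailed (non-Gaussian) noise, the regime where least squares tends to fail:
data = forward(true_refl) + 0.05 * rng.standard_t(df=2, size=120)

result = minimize(lambda m: q_misfit(data - forward(m), q=1.8),
                  x0=np.zeros(120), method="L-BFGS-B")
print("max abs reflectivity error:", np.abs(result.x - true_refl).max())
```

Setting q = 1 in the same driver runs the conventional least-squares PSI, which makes the comparison in the abstract easy to reproduce on this toy model.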

Author(s):  
Kazimierz Sobczyk ◽  
Jerzy Trębicki

Abstract The statistical moments of the response of nonlinear stochastic systems are governed by an infinite hierarchy of equations. In order to evaluate the most important low-order moments, suitable closure schemes are necessary. In this paper a non-Gaussian closure procedure is presented, based on the maximum entropy principle recently developed by the authors in the context of stochastic nonlinear dynamics.
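As a sketch of the idea (assumptions, not the authors' code): approximate the unknown response density by the maximum-entropy form p(x) proportional to exp(-(lam1*x + lam2*x^2 + lam3*x^3 + lam4*x^4)), choose the multipliers so that the first four moments of p match those supplied by the moment equations, and evaluate from p the higher-order moment that closes the hierarchy. The target moments, truncated support, and solver below are illustrative.

```python
# Minimal sketch of a maximum-entropy (non-Gaussian) moment closure.
import numpy as np
from scipy.optimize import least_squares

x = np.linspace(-6.0, 6.0, 2001)          # truncated support for numerical quadrature
dx = x[1] - x[0]

def maxent_pdf(lams):
    # p(x) ~ exp(-(lam1*x + lam2*x^2 + lam3*x^3 + lam4*x^4)); the clip only
    # guards against overflow while the solver explores.
    poly = lams[0] * x + lams[1] * x**2 + lams[2] * x**3 + lams[3] * x**4
    w = np.exp(-np.clip(poly, -700.0, 700.0))
    return w / (w.sum() * dx)

def residuals(lams, targets):
    # Mismatch between the moments of the maxent density and the targets.
    p = maxent_pdf(lams)
    return [np.sum(x**k * p) * dx - mk for k, mk in enumerate(targets, start=1)]

# Low-order moments assumed to come from the system's moment equations
# (illustrative values: zero mean, unit m2, mild skew, sub-Gaussian m4):
targets = [0.0, 1.0, 0.1, 2.8]
fit = least_squares(residuals, x0=[0.0, 0.5, 0.0, 0.01],
                    bounds=([-5, -5, -5, 0.0], [5, 5, 5, 5]), args=(targets,))
p = maxent_pdf(fit.x)

# The closure step: higher-order moments are read off the fitted density.
m5 = np.sum(x**5 * p) * dx
print("m5 implied by the maximum-entropy closure:", m5)
```

A Gaussian closure corresponds to keeping only the quadratic term; letting lam3 and lam4 vary is what makes the scheme non-Gaussian.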


1990 ◽  
Vol 27 (2) ◽  
pp. 303-313 ◽  
Author(s):  
Claudine Robert

The maximum entropy principle is used to model uncertainty by a maximum entropy distribution, subject to appropriate linear constraints. We give an entropy concentration theorem (whose proof is based on large-deviation techniques) that provides a mathematical justification of this statistical modelling principle. We then indicate how it can be used in artificial intelligence, and how relevant prior knowledge is supplied by some classical descriptive statistical methods. Furthermore, the maximum entropy principle yields a natural link between descriptive methods and certain statistical structures.
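A concrete instance of the principle is Jaynes' classic dice problem (an illustration, not an example from this paper): among all distributions on {1, ..., 6} with mean 4.5, the entropy maximiser has the exponential-family form p_i proportional to exp(lam * i), with lam fixed by the linear constraint. The concentration theorem then says that the overwhelming majority of frequency vectors satisfying the constraint lie close to this distribution. A minimal sketch:

```python
# Maximum entropy on {1,...,6} subject to one linear constraint (mean = 4.5).
import numpy as np
from scipy.optimize import brentq

vals = np.arange(1, 7)

def mean_given_lam(lam):
    # Mean of the exponentially tilted distribution p_i ~ exp(lam * i).
    w = np.exp(lam * vals)
    p = w / w.sum()
    return p @ vals

# The mean is monotone in lam, so a scalar root-find pins it down.
lam = brentq(lambda l: mean_given_lam(l) - 4.5, -5.0, 5.0)
w = np.exp(lam * vals)
p = w / w.sum()
print("maxent distribution:", np.round(p, 4))
print("entropy:", -(p * np.log(p)).sum())
```

The result tilts mass toward the higher faces, which is exactly the distribution the concentration theorem singles out as typical among all those compatible with the constraint.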

