The Joint Probability Distribution of Two Structure Factors and Related Conditional Distributions and Expectation Values

1972 ◽  
pp. 125-157
Author(s):  
Herbert A. Hauptman ◽  
Carmelo Giacovazzo

The title of this chapter may seem a little strange; it relates Fourier syntheses, an algebraic method for calculating electron densities, to the joint probability distribution functions of structure factors, which are devoted to the probabilistic estimation of s.i.s and s.s.s. We will see that the two topics are closely related and that optimization of the Fourier syntheses requires prior knowledge and use of joint probability distributions. The distributions used in Chapters 4 to 6 are able to estimate s.i.s or s.s.s by exploiting the information contained in the experimental diffraction moduli of the target structure (the structure one wants to phase). Important tools for such distributions are the theories of neighbourhoods and of representations, which allow us to arrange, for each invariant or seminvariant Φ, the set of amplitudes in a sequence of shells, each contained within the subsequent one, with the property that any s.i. or s.s. may be estimated via the magnitudes constituting any shell. The resulting conditional distributions were of the type

P(Φ | {R}),     (7.1)

where {R} represents the chosen phasing shell of observed magnitudes. The more information is contained within the set of observed moduli {R}, the better the Φ estimate will be. By definition, conditional distributions (7.1) cannot change during the phasing process, because the prior information (i.e. the observed moduli) does not change; equation (7.1) maintains the same algebraic form throughout. However, during any phasing process, various model structures progressively become available, with different degrees of correlation with the target structure. Such models are a source of supplementary information (e.g. the current model phases) which, in principle, can be exploited during the phasing procedure. If this observation is accepted, the method of joint probability distributions, as described so far, should be suitably modified. Symbolically, we should aim to derive conditional distributions

P(Φ | {R}, {R_p}),     (7.2)

rather than (7.1), where {R_p} represents a suitable subset of the amplitudes of the model structure factors. Such an approach modifies the traditional phasing strategy described in the preceding chapters; indeed, the set {R_p} will change during the phasing process in step with the model, and this continuously modifies the probabilities (7.2).
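As a concrete, if elementary, instance of a distribution of type (7.1), the classical Cochran formula for a triplet invariant can be evaluated numerically. The sketch below is only illustrative and is not taken from this chapter; the normalized magnitudes E1, E2, E3 and the atom count N are invented for demonstration.

```python
# A minimal sketch (not from this chapter): numerical evaluation of a
# Cochran-type conditional distribution P(Phi | {R}) for a triplet invariant,
# as an elementary instance of equation (7.1).  The equal-atom concentration
# parameter G = 2*|E1*E2*E3|/sqrt(N) and the form
# P(Phi) = exp(G*cos(Phi)) / (2*pi*I0(G)) are Cochran's classical result;
# the numerical values of E1, E2, E3 and N are invented for demonstration.
import numpy as np
from scipy.special import i0  # modified Bessel function of order zero

def triplet_distribution(E1, E2, E3, N, phi):
    """P(Phi | |E1|, |E2|, |E3|) for a triplet invariant, equal-atom case."""
    G = 2.0 * abs(E1 * E2 * E3) / np.sqrt(N)
    return np.exp(G * np.cos(phi)) / (2.0 * np.pi * i0(G))

phi = np.linspace(-np.pi, np.pi, 721)
p = triplet_distribution(E1=2.1, E2=1.8, E3=2.4, N=50, phi=phi)

dphi = phi[1] - phi[0]
print("most probable Phi:", phi[np.argmax(p)])                      # ~ 0
print("P(|Phi| < 45 deg):", p[np.abs(phi) < np.pi / 4].sum() * dphi)
```

Larger magnitudes (and a smaller number of atoms) sharpen the distribution about Φ = 0, which is the sense in which the amplitudes collected in {R} control the reliability of the Φ estimate.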


1999 ◽  
Vol 55 (2) ◽  
pp. 322-331 ◽  
Author(s):  
Carmelo Giacovazzo ◽  
Dritan Siliqi ◽  
Angela Altomare ◽  
Giovanni Luca Cascarano ◽  
Rosanna Rizzi ◽  
...  

The joint probability distribution function method has been developed in P1¯ for reflections with rational indices. The positional atomic parameters are considered to be the primitive random variables, uniformly distributed in the interval (0, 1), while the reflection indices are kept fixed. Owing to the rationality of the indices, distributions like P(F_{p_1}, F_{p_2}) are found to be useful for phasing purposes, where p_1 and p_2 are any pair of vectorial indices. A variety of conditional distributions such as P(|F_{p_1}| | |F_{p_2}|), P(|F_{p_1}| | F_{p_2}) and P(\varphi_{p_1} | |F_{p_1}|, F_{p_2}) are derived, which are able to estimate the modulus and phase of F_{p_1} given the modulus and/or phase of F_{p_2}. The method has been generalized to handle the joint probability distribution of any set of structure factors, i.e. the distributions P(F_1, F_2, …, F_{n+1}), P(|F_1| | F_2, …, F_{n+1}) and P(\varphi_1 | |F_1|, F_2, …, F_{n+1}) have been obtained. Some practical tests prove the efficiency of the method.
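A hedged numerical illustration of the underlying statistical model (not of the paper's analytical results): treating the positional parameters as uniform random variables on (0, 1) with the rational indices kept fixed, one can sample random structures and inspect the joint behaviour of two structure factors empirically. The one-dimensional structure without symmetry, the unit scattering factors and the specific indices below are assumptions made for the example (the paper itself works in P1¯).

```python
# A minimal sketch (not the paper's derivation): Monte Carlo illustration of
# the statistical model assumed in the paper, with positional parameters drawn
# uniformly on (0, 1) and the rational reflection indices kept fixed.  A
# one-dimensional structure without symmetry and unit scattering factors are
# assumed for simplicity; the indices p1, p2 and the atom count N are
# arbitrary choices for the example.
import numpy as np

rng = np.random.default_rng(0)
N = 20                       # atoms per random structure
n_struct = 100_000           # number of Monte Carlo structures
p1, p2 = 0.5, 1.0            # fixed rational indices

x = rng.uniform(0.0, 1.0, size=(n_struct, N))     # primitive random variables
F1 = np.exp(2j * np.pi * p1 * x).sum(axis=1)      # F_{p1}, one value per structure
F2 = np.exp(2j * np.pi * p2 * x).sum(axis=1)      # F_{p2}, one value per structure

# For a rational (non-integer) index the structure factor is not centred at zero.
print("mean of F_p1:", F1.mean())                 # close to 2i*N/pi, not zero
print("empirical corr(|F_p1|, |F_p2|):",
      np.corrcoef(np.abs(F1), np.abs(F2))[0, 1])
```

The printed statistics give an empirical feel for the statistical dependence that the analytical distributions P(F_{p_1}, F_{p_2}) and their conditionals describe exactly.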


2019 ◽  
Vol 23 ◽  
pp. 271-309
Author(s):  
Joseph Muré

Models are often defined through conditional rather than joint distributions, but it can be difficult to check whether the conditional distributions are compatible, i.e. whether there exists a joint probability distribution which generates them. When they are compatible, a Gibbs sampler can be used to sample from this joint distribution. When they are not, the Gibbs sampling algorithm may still be applied, resulting in a “pseudo-Gibbs sampler”. We show its stationary probability distribution to be the optimal compromise between the conditional distributions, in the sense that it minimizes a mean squared misfit between them and its own conditional distributions. This allows us to perform Objective Bayesian analysis of correlation parameters in Kriging models by using univariate conditional Jeffreys-rule posterior distributions instead of the widely used multivariate Jeffreys-rule posterior. This strategy makes the full-Bayesian procedure tractable. Numerical examples show it has near-optimal frequentist performance in terms of prediction interval coverage.
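For readers unfamiliar with the construction, the following is a minimal, self-contained sketch of a pseudo-Gibbs sampler in the sense used above: Gibbs-style alternating draws from two full conditionals that are deliberately incompatible. The Gaussian conditional specifications are invented for illustration and are unrelated to the paper's Kriging application.

```python
# A minimal sketch (not the paper's Kriging application): a pseudo-Gibbs
# sampler, i.e. Gibbs-style alternating draws from two full conditionals that
# are deliberately incompatible -- no bivariate distribution has both
# N(0.9*y, 1) and N(0.3*x, 1) as its full conditionals, because the cross
# terms 0.9*x*y and 0.3*x*y in the two log-densities cannot match.
import numpy as np

rng = np.random.default_rng(1)

def sample_x_given_y(y):
    return rng.normal(0.9 * y, 1.0)     # X | Y ~ N(0.9*y, 1)

def sample_y_given_x(x):
    return rng.normal(0.3 * x, 1.0)     # Y | X ~ N(0.3*x, 1)

n_iter, burn_in = 50_000, 1_000
x, y = 0.0, 0.0
draws = np.empty((n_iter, 2))
for t in range(n_iter):
    x = sample_x_given_y(y)             # update x from the first conditional
    y = sample_y_given_x(x)             # update y from the second conditional
    draws[t] = x, y

samples = draws[burn_in:]
print("empirical mean:", samples.mean(axis=0))
print("empirical covariance:\n", np.cov(samples, rowvar=False))
```

The alternating recursion is contractive (0.9 × 0.3 = 0.27 < 1), so the chain settles into a stationary distribution even though no joint law has exactly these two conditionals; the paper identifies that stationary distribution as the optimal compromise between them.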


1999 ◽  
Vol 55 (3) ◽  
pp. 512-524
Author(s):  
Carmelo Giacovazzo ◽  
Dritan Siliqi ◽  
Cristina Fernández-Castaño

The method of the joint probability distribution functions of structure factors has been extended to reflections with rational indices. The most general case, space group P1, has been considered. The positional parameters are the primitive random variables of our probabilistic approach, while the reflection indices are kept fixed. Quite general joint probability distributions have been considered, from which conditional distributions have been derived; these proved applicable to the accurate estimation of the real and imaginary parts of a structure factor, given prior information on other structure factors. The method is also discussed in relation to Hilbert-transform techniques.
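As a rough numerical counterpart of such conditional estimates (a brute-force Monte Carlo stand-in, not the paper's analytical derivation), one can condition on an "observed" F_{p_2} by rejection and average the real and imaginary parts of F_{p_1} over the retained random structures. The one-dimensional P1 toy model, unit scattering factors, indices and conditioning tolerance are all assumptions introduced for the example.

```python
# A minimal sketch (not the paper's analytical treatment): a brute-force Monte
# Carlo stand-in for a conditional estimate of the real and imaginary parts of
# F_{p1} given prior information on F_{p2}.  A one-dimensional P1 toy structure
# with unit scattering factors is assumed; the indices, the atom count and the
# conditioning tolerance are invented for the example.
import numpy as np

rng = np.random.default_rng(2)
N, n_struct = 20, 200_000
p1, p2 = 0.5, 1.5                               # fixed rational indices

x = rng.uniform(0.0, 1.0, size=(n_struct, N))   # uniform positional parameters
F1 = np.exp(2j * np.pi * p1 * x).sum(axis=1)
F2 = np.exp(2j * np.pi * p2 * x).sum(axis=1)

# Condition by rejection: keep only structures whose F_{p2} lies close to a
# chosen reference value, then average F_{p1} over the retained structures.
F2_obs = 4.0 + 8.0j                             # illustrative "known" value
mask = np.abs(F2 - F2_obs) < 1.0
print("retained structures:", int(mask.sum()))
print("E[Re F_p1 | F_p2] ~", F1.real[mask].mean())
print("E[Im F_p1 | F_p2] ~", F1.imag[mask].mean())
print("unconditional mean of F_p1:", F1.mean())
```

The distributions derived in the paper serve the same purpose analytically, without the sampling noise and the arbitrary tolerance used here.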

