Burr, Lomax, Pareto, and Logistic Distributions from Ultrasound Speckle

2020 ◽  
Vol 42 (4-5) ◽  
pp. 203-212 ◽  
Author(s):  
Kevin J. Parker ◽  
Sedigheh S. Poul

After 100 years of theoretical treatment of speckle patterns from coherent illumination, there remain some open questions about the nature of ultrasound speckle from soft vascularized tissues. A recent hypothesis is that the fractal branching vasculature is responsible for the dominant echo pattern from organs such as the liver. In that case, an analysis of cylindrical scattering structures arranged across a power-law distribution of sizes is warranted. Using a simple model of echo strength and basic transformation rules from probability, we derive the first-order statistics of speckle considering the amplitude, the intensity, and the natural log of amplitude. The results are given by long-tailed distributions that have been studied in the statistics literature for other fields. Examples are given from simulations and animal studies, and the theoretical fit to these preliminary data supports the overall framework as a plausible model for characterizing ultrasound speckle statistics.
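The key ingredient is the change-of-variables rule: if scatterer size follows a power law and echo amplitude is a power function of size, the amplitude is itself long-tailed. The sketch below illustrates this with illustrative exponents and a Lomax fit via SciPy; it is not the authors' simulation or their fitted parameters.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical power-law distribution of cylindrical scatterer radii
# (Pareto with shape b); the exponent is illustrative only.
b = 2.5
radii = stats.pareto.rvs(b, size=50_000, random_state=rng)

# Toy echo-strength rule: amplitude grows as a power of radius.
k = 1.5
amplitude = radii ** k

# The transformed variable remains power-law tailed; a Lomax (Pareto II)
# fit to the amplitudes illustrates long-tailed speckle statistics.
shape, loc, scale = stats.lomax.fit(amplitude, floc=0)
print(f"Lomax fit: shape={shape:.2f}, scale={scale:.2f}")

# Intensity (amplitude squared) and log-amplitude follow from the same
# change-of-variables rules.
intensity = amplitude ** 2
log_amp = np.log(amplitude)
print(f"mean intensity={intensity.mean():.2f}, mean log-amplitude={log_amp.mean():.2f}")
```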

1995 ◽  
Vol 407 ◽  
Author(s):  
Mariela Araujo ◽  
Orlando Gonzalez

We present a simple model to explain anomalous relaxation in random porous media. The model, based on the properties of random walks on a disordered structure, is able to describe essential features of the relaxation process in terms of a one-body picture, in which the many-body effects are approximated by geometrical restrictions on the particles' diffusion. Disorder is considered as a random variable (quenched and annealed) taken from a power-law distribution μ ξ^(μ−1). Quantities relevant to relaxation phenomena, such as the characteristic function and the particle density, are calculated. Different regimes are observed as a function of the disorder parameter μ. For μ > 1 the relaxation is of exponential or Debye type, and it turns into a stretched exponential as μ decreases. We compare numerical predictions (based on Monte Carlo simulations) with experimental data from porous rocks obtained by Nuclear Magnetic Resonance, and with numerical data from other disordered systems.
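A minimal sketch of just the disorder-averaging ingredient: draw site variables ξ from μ ξ^(μ−1) on (0, 1) by inverse-transform sampling and average an exponential relaxation over them. This is only an illustration of how the disorder parameter μ changes the decay shape, not the authors' random-walk model with geometric restrictions.

```python
import numpy as np

rng = np.random.default_rng(1)

def relaxation(mu, t, n_sites=20_000):
    """Ensemble relaxation for site rates xi drawn from p(xi) = mu * xi**(mu-1)
    on (0, 1); inverse-transform sampling gives xi = U**(1/mu)."""
    xi = rng.random(n_sites) ** (1.0 / mu)
    # Each site relaxes exponentially with its own rate; the observable is the average.
    return np.exp(-np.outer(t, xi)).mean(axis=1)

t = np.linspace(0.1, 50.0, 200)
for mu in (0.5, 2.0):                       # small mu: broad disorder; large mu: near-Debye
    m = relaxation(mu, t)
    print(f"mu={mu}: M(t=10) = {m[np.searchsorted(t, 10.0)]:.3f}")
```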


1996 ◽  
Vol 33 (7) ◽  
pp. 141-145 ◽  
Author(s):  
Muwaffaq M. Saqqar ◽  
M. B. Pescod

The performance of anoxic and facultative ponds in Jordan was investigated for 12 months. Calculated values of the first-order kinetic rate constant for CBOD removal (KCBOD) differed between ponds in the same month, at the same temperature. It is evident that factors other than temperature must influence values of KCBOD. The KCBOD values determined were generally lower than those reported in the literature; the maximum value found was only 0.16 per day. A pond was emptied after 18 months of operation and sediment was found randomly distributed over the pond area, with a depth ranging from 2 to 6 cm (averaging ≈ 4 cm). A simple model has been established to estimate sediment depth (Hs, in cm) in terms of the operating time in months (t).
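For orientation, a minimal sketch of a first-order, completely mixed pond balance and a linear sediment-accumulation placeholder. The rate constant, retention time, and accumulation rate below are illustrative only; the abstract does not report the fitted coefficients of the authors' sediment-depth model.

```python
# First-order CBOD removal in a completely mixed pond: Ce = Ci / (1 + K * t).
def effluent_cbod(influent_mg_l: float, k_per_day: float, retention_days: float) -> float:
    return influent_mg_l / (1.0 + k_per_day * retention_days)

# Hypothetical linear sediment accumulation; 0.22 cm/month is only a placeholder
# chosen to be consistent with ~4 cm after 18 months, not the paper's fitted model.
def sediment_depth_cm(months: float, rate_cm_per_month: float = 0.22) -> float:
    return rate_cm_per_month * months

print(effluent_cbod(300.0, 0.16, 10.0))   # ~115 mg/L with the paper's maximum K value
print(sediment_depth_cm(18))              # ~4 cm after 18 months of operation
```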


1975 ◽  
Vol 229 (4) ◽  
pp. 1110-1115 ◽  
Author(s):  
MW Bradbury ◽  
CS Patlak ◽  
WH Oldendorf

The amount of radioactivity in brain was estimated at different times after intracarotid injection in the pentobarbital-anesthetized rat. Fifteen seconds after injection, six radiolabeled solutes minimally metabolized by brain (3H2O, isopropanol, nicotine, antipyrine, 3-O-methylglucose, and codeine) left the brain according to first-order kinetics. Two solutes metabolized by brain, lactic acid and heroin, behaved in a more complex fashion. The behavior of the six nonmetabolized solutes was interpreted satisfactorily by a simple model in which the brain is treated as a single compartment. From the model, uptake at 15 s as a percentage of the dose is linearly related to permeability when uptake is low, i.e., 30% or less. At higher uptakes, blood flow becomes increasingly important. The efflux rate is similarly related to permeability and blood flow, but it additionally depends inversely on brain space. The exchanges of 3H2O, isopropanol, and nicotine were determined almost solely by blood flow and brain space. Movements of codeine and 3-O-methylglucose depended primarily on permeability, and those of antipyrine on both factors.
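A minimal sketch of the standard single-compartment relations this kind of analysis rests on, assuming the Crone-Renkin form for single-pass extraction and a one-compartment washout constant. The formulation and the numbers are illustrative, not the paper's equations or measured values.

```python
import math

def extraction_fraction(ps_ml_g_min: float, flow_ml_g_min: float) -> float:
    """Crone-Renkin single-pass extraction: E = 1 - exp(-PS/F).
    At low PS (low uptake), E ~ PS/F, so uptake is roughly linear in permeability."""
    return 1.0 - math.exp(-ps_ml_g_min / flow_ml_g_min)

def efflux_rate_constant(ps_ml_g_min: float, flow_ml_g_min: float, brain_space_ml_g: float) -> float:
    """One-compartment efflux: k = E * F / V, so washout rises with permeability and
    blood flow and falls inversely with the brain distribution space."""
    return extraction_fraction(ps_ml_g_min, flow_ml_g_min) * flow_ml_g_min / brain_space_ml_g

# Illustrative numbers only (not from the paper):
print(extraction_fraction(0.2, 1.0))          # permeability-limited (low-uptake) tracer
print(efflux_rate_constant(5.0, 1.0, 0.8))    # flow-limited tracer with a large brain space
```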


2006 ◽  
Vol 95 (5) ◽  
pp. 3146-3153 ◽  
Author(s):  
N. Sinha ◽  
J.T.G. Brown ◽  
R.H.S. Carpenter

Saccades represent decisions, and the study of their latency has led to a neurally plausible model of the underlying mechanisms, LATER (Linear Approach to Threshold with Ergodic Rate), that successfully predicts reaction-time behavior in simple decision tasks with fixed instructions. However, if the instructions abruptly change, we have a more complex situation, known as task switching. Psychologists' explanations of task switching have so far tended to be qualitative rather than quantitative, and have not been intended to relate particularly clearly to existing models of decision making or to likely neural implementations. Here, we investigated task switching using a novel saccadic task: we presented the instructions by stimulus elements identical to those of the task itself, allowing us to compare decisions about instructions with decisions in the actual task. Our results support a relatively simple model consisting of two distinct LATER processes in series: the first detects the instruction, the second implements it.
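A LATER unit produces a latency by dividing a fixed distance to threshold by a rise rate drawn from a normal distribution; two units in series simply add their latencies. The sketch below uses made-up parameter values to illustrate that structure; it is not the authors' fitted model.

```python
import numpy as np

rng = np.random.default_rng(2)

def later_latencies(mu_rate, sd_rate, distance=1.0, n=10_000):
    """LATER: a decision signal rises linearly to threshold at a rate drawn from a
    normal distribution; latency = distance / rate. Non-positive rates (trials that
    never reach threshold) are simply discarded in this sketch."""
    rates = rng.normal(mu_rate, sd_rate, n)
    rates = rates[rates > 0]
    return distance / rates

# Two LATER stages in series: one that detects the instruction, one that implements it.
instruction_stage = later_latencies(mu_rate=8.0, sd_rate=2.0)
task_stage = later_latencies(mu_rate=6.0, sd_rate=1.5)
n = min(len(instruction_stage), len(task_stage))
total = instruction_stage[:n] + task_stage[:n]
print(f"median combined latency: {np.median(total):.3f} (arbitrary time units)")
```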


Author(s):  
Yoshinori Shigeta ◽  
◽  
Kiyoshi Akama ◽  
Hiroshi Mabuchi ◽  
Hidekatsu Koike ◽  
...  

We present a way to convert constraint handling rules (CHRs) to equivalent transformation rules (ETRs) and demonstrate the correctness of the conversion in equivalent transformation (ET) theory. In the ET computation model, computation is regarded as a sequence of equivalent transformations of a description; a description is transformed successively by ETRs. Extensively used in the domain of first-order terms, the ET computation model has also been applied to knowledge processing in such data domains as RDF, UML, and XML. A CHR is a multiheaded guarded rule that rewrites constraints into simpler ones until they are solved. CHRs and ETRs are similar in syntax, but they have completely different theoretical bases for the correctness of their computation: CHRs are based on the logical equivalence of logical formulas, while ETRs are based on the set equivalence of descriptions. We convert CHRs to rules used in the ET model and demonstrate that the converted rules are correct ETRs, i.e., that they preserve the meanings of descriptions. We discuss correspondences and differences between CHRs and ETRs at the level of their theories, giving examples of correct ETRs that cannot be represented as CHRs.
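To make "multiheaded guarded rule" concrete, here is a toy Python rendering of CHR's classic antisymmetry simplification rule, leq(X, Y), leq(Y, X) <=> X = Y, applied repeatedly to a small constraint store. It only mimics the flavor of rewriting constraints into simpler ones and is not the authors' CHR-to-ETR conversion.

```python
def antisymmetry_step(leqs: set[tuple[str, str]], eqs: set[tuple[str, str]]):
    """One application of the two-headed rule leq(X, Y), leq(Y, X) <=> X = Y."""
    for (x, y) in leqs:
        if (y, x) in leqs and x != y:            # guard: both heads present, distinct variables
            leqs = leqs - {(x, y), (y, x)}       # remove both head constraints
            eqs = eqs | {tuple(sorted((x, y)))}  # add the simpler equality constraint
            return leqs, eqs, True
    return leqs, eqs, False

leqs, eqs = {("a", "b"), ("b", "a"), ("b", "c")}, set()
changed = True
while changed:                                   # rewrite until no rule applies
    leqs, eqs, changed = antisymmetry_step(leqs, eqs)
print(leqs, eqs)   # {('b', 'c')} {('a', 'b')}
```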


1974 ◽  
Vol 39 (1) ◽  
pp. 139-150 ◽  
Author(s):  
Neil D. Jones ◽  
Alan L. Selman

H. Scholz [11] defined the spectrum of a formula φ of first-order logic with equality to be the set of all natural numbers n for which φ has a model of cardinality n. He then asked for a characterization of spectra. Only partial progress has been made. Computational aspects of this problem have been worked on by Günter Asser [1], A. Mostowski [9], and J. H. Bennett [2]. It is known that spectra include the Grzegorczyk class and are properly included in . However, no progress has been made toward establishing whether spectra properly include , or whether spectra are closed under complementation.

A possible connection with automata theory arises from the fact that contains just those sets which are accepted by deterministic linear-bounded Turing machines (Ritchie [10]). Another resemblance lies in the fact that the same two problems (closure under complement, and proper inclusion of ) have remained open for the class of context-sensitive languages for several years.

In this paper we show that these similarities are not accidental: spectra and context-sensitive languages are closely related, and their open questions are merely special cases of a family of open questions which relate to the difference (if any) between deterministic and nondeterministic time- or space-bounded Turing machines. In particular, we show that spectra are just those sets which are acceptable by nondeterministic Turing machines in time 2^(cx), where c is a constant and x is the length of the input. Combining this result with results of Bennett [2], Ritchie [10], Kuroda [7], and Cook [3], we obtain the "hierarchy" of classes of sets shown in Figure 1. It is of interest to note that in all of these cases the amount of unrestricted read/write memory appears to be too small to allow diagonalization within the larger classes.
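As a concrete illustration of the definition of a spectrum (not of the paper's Turing-machine constructions), the sketch below brute-forces the spectrum of a toy formula, "f is a fixed-point-free involution", whose models exist exactly at even cardinalities; the formula and helper name are chosen for illustration.

```python
from itertools import product

def has_fpf_involution(n: int) -> bool:
    """Brute-force model search: is there a unary function f on a domain of size n
    satisfying  ∀x (f(f(x)) = x ∧ f(x) ≠ x)?  The spectrum of this toy formula is
    the set of even natural numbers."""
    if n == 0:
        return False
    for f in product(range(n), repeat=n):        # all functions f: [n] -> [n]
        if all(f[f[x]] == x and f[x] != x for x in range(n)):
            return True
    return False

spectrum = [n for n in range(1, 8) if has_fpf_involution(n)]
print(spectrum)   # [2, 4, 6]
```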

