A Phenomenological Theory of Socioeconomic Systems with Spatial Interactions

1982 ◽  
Vol 14 (7) ◽  
pp. 869-888 ◽  
Author(s):  
P F Lesse

This paper deals with a class of models which describe spatial interactions and are based on Jaynes's principle. The variables entering these models can be partitioned into four groups: (a) probability density distributions (for example, relative traffic flows), (b) expected values (average cost of travel), (c) their duals (Lagrange multipliers, the traffic impedance coefficient), and (d) operators transforming probabilities into expected values. The paper presents several dual formulations replacing the problem of maximizing entropy in terms of the group (a) variables by equivalent extremum problems involving groups (b)-(d). These problems form the basis of a phenomenological theory. The theory makes it possible to derive useful relationships between groups (b) and (c). Two topics are discussed: (1) practical application of the theory (with examples), and (2) the relationship between socioeconomic modelling and statistical mechanics.
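The Jaynes-type setup behind such models can be sketched in standard form (notation mine, not the paper's): maximizing entropy over the flow probabilities subject to a cost constraint yields the familiar exponential solution, and the dual is an unconstrained extremum problem in the multiplier alone.

```latex
% Primal: maximize entropy subject to normalization and an expected-cost constraint
\max_{p}\; H(p) = -\sum_i p_i \ln p_i
\quad \text{s.t.} \quad \sum_i p_i = 1, \qquad \sum_i p_i c_i = \bar{c}.

% Solution via Lagrange multipliers (\beta plays the role of an impedance coefficient)
p_i = \frac{e^{-\beta c_i}}{Z(\beta)}, \qquad Z(\beta) = \sum_i e^{-\beta c_i}.

% Dual: an equivalent extremum problem involving only the (b)-(c) variables
\min_{\beta}\; \big[\ln Z(\beta) + \beta\,\bar{c}\big].
```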

2019 ◽  
Vol 65 (3) ◽  
pp. 101-112 ◽  
Author(s):  
M. Rogalska ◽  
J. Żelazna-Pawlicka

Abstract. The paper evaluates the relationship between the choice of probability density function and the construction price, as well as the price of the building's life cycle, relative to the deterministic cost estimate in terms of the minimum, mean, and maximum. The deterministic cost estimates were made based on the minimum, mean, and maximum prices: labor rates, indirect costs, profit, and the cost of equipment and materials. The net construction prices obtained were assigned different probability density distributions based on the minimum, mean, and maximum values. Twelve kinds of probability distributions were used: triangular, normal, lognormal, beta-PERT, gamma, beta, exponential, Laplace, Cauchy, Gumbel, Rayleigh, and uniform. The results of calculations with event probabilities from 5 to 95% were subjected to comparative statistical analysis. Dependencies were determined between the results of calculations for which different probability density distributions of the price factors were assumed. Based on the t-test, specific distributions were assigned to a given price level in six groups. It was shown that each of the distributions analyzed is suitable for use; however, the choice has consequences for the final result. The lowest final price is obtained using the gamma distribution; the highest is obtained with the beta, beta-PERT, normal, and uniform distributions.
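The comparison hinges on assigning different distributions to the same three-point (minimum, mean, maximum) estimate and reading off quantiles. A minimal sketch of that idea, with assumed illustrative numbers (not the paper's data) and two of the twelve distributions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical three-point estimate for one price factor (min, mean, max)
lo, mean, hi = 80.0, 100.0, 130.0
n = 100_000

# Triangular: numpy parameterizes by (left, mode, right); the mean is used as the mode here
tri = rng.triangular(lo, mean, hi, n)

# Gamma: moment-match a two-parameter gamma to the same mean, with a spread
# derived from the range (std ~ (hi - lo)/6, a common three-point heuristic)
std = (hi - lo) / 6.0
shape = (mean / std) ** 2
scale = std ** 2 / mean
gam = rng.gamma(shape, scale, n)

# Compare quantiles across the 5%..95% band used in the paper's analysis
for q in (0.05, 0.5, 0.95):
    print(f"q={q:.2f}  triangular={np.quantile(tri, q):7.2f}  gamma={np.quantile(gam, q):7.2f}")
```

Repeating this for each distribution family and each price factor gives the quantile spreads that the statistical comparison is based on.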


Genetics ◽  
1996 ◽  
Vol 142 (4) ◽  
pp. 1357-1362
Author(s):  
François Rousset

Abstract Expected values of Wright's F-statistics are functions of probabilities of identity in state. These values may be quite different under an infinite allele model and under stepwise mutation processes such as those occurring at microsatellite loci. However, a relationship between the probability of identity in state in stepwise mutation models and the distribution of coalescence times can be deduced from the relationship between probabilities of identity by descent and the distribution of coalescence times. The values of FIS and FST can be computed using this property. Examination of the conditional probability of identity in state given some coalescence time, and of the distribution of coalescence times, is also useful for explaining the properties of FIS and FST at high-mutation-rate loci, as shown here in an island model of population structure.
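The central relationship can be written schematically as follows (a sketch in standard notation, not the paper's exact equations):

```latex
% F-statistics as functions of probabilities of identity in state,
% where Q_0 and Q_1 are identities within and between subpopulations:
F_{ST} = \frac{Q_0 - Q_1}{1 - Q_1}.

% Identity in state obtained from the distribution of coalescence times T_j:
Q_j = \sum_{t=1}^{\infty} \Pr(T_j = t)\, \pi_t,
% with \pi_t the conditional probability of identity in state given
% coalescence t generations ago; e.g. under the infinite allele model
% \pi_t = (1-\mu)^{2t}, while stepwise mutation models give a different \pi_t.
```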


Author(s):  
Zhenyu Liu ◽  
Shien Zhou ◽  
Chan Qiu ◽  
Jianrong Tan

The performance of mechanical products is closely related to errors in their key features. It is essential to predict the final assembly variation through assembly variation analysis to ensure product performance. Rigid–flexible hybrid construction is a common type of mechanical product. Existing methods of variation analysis, in which rigid and flexible parts are calculated separately, struggle to meet the requirements of such complicated mechanical products. Another methodology linearly superposes rigid and flexible errors, which cannot reveal the quantitative relationship between product assembly variation and part manufacturing error. Therefore, an assembly variation analysis method for complicated products based on a rigid–flexible vector loop is proposed in this article. First, the shapes of part surfaces and sidelines are estimated according to the tolerance types. Probability density distributions of discrete feature points on the surface are calculated from the tolerance field size with statistical methods. Second, the flexible part surface is discretized into a set of multi-segment vectors to build the vector-loop model. Each vector can be orthogonally decomposed into components representing position information and error size. Combining the multi-segment vector set of the flexible part with the traditional rigid-part vectors, a uniform vector-loop model is constructed to represent rigid–flexible complicated products. Probability density distributions of discrete feature points on the part surfaces are taken as inputs to calculate the assembly variation of the product's key features. Compared with existing methods, this method applies to assembly variation prediction for complicated products that consist of both rigid and flexible parts. The impact of each rigid and flexible part's manufacturing error on product assembly variation can be determined, which provides a foundation for part tolerance optimization design. Finally, an assembly example of a phased array antenna verifies the effectiveness of the proposed method.
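The propagation step can be illustrated with a generic planar vector loop (a minimal Monte Carlo sketch with assumed toleranced lengths, not the authors' rigid–flexible model): each loop vector carries a nominal component plus a random error along its direction, and summing the loop yields the distribution of the key feature.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Hypothetical planar vector loop: three nominal vectors (length [mm], angle [rad])
# whose lengths carry independent manufacturing errors (std in mm)
nominal = [(100.0, 0.0), (50.0, np.pi / 2), (30.0, np.pi)]
stds = [0.05, 0.08, 0.03]

# Each vector decomposes into a nominal position component plus an error
# component along its direction; summing the loop gives the key-feature point.
x = np.zeros(n)
y = np.zeros(n)
for (length, angle), s in zip(nominal, stds):
    L = length + rng.normal(0.0, s, n)
    x += L * np.cos(angle)
    y += L * np.sin(angle)

# Resulting variation of the key feature (assembly variation)
print("mean:", x.mean(), y.mean())
print("std :", x.std(), y.std())
```

The per-part standard deviations show directly how much each manufacturing error contributes to the assembly variation, which is the quantitative link the vector-loop formulation provides.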


2018 ◽  
Author(s):  
Uwe Berger ◽  
Gerd Baumgarten ◽  
Jens Fiedler ◽  
Franz-Josef Lübken

Abstract. In this paper we present a new description of the statistical probability density distributions (pdfs) of polar mesospheric clouds (PMC) and noctilucent clouds (NLC). The analysis is based on observations of maximum backscatter, ice mass density, ice particle radius, and ice particle number density measured by the ALOMAR RMR lidar for all NLC seasons from 2002 to 2016. From this data set we derive a new class of pdfs describing the statistics of PMC/NLC events that differs from previous statistical methods, which used the approach of an exponential distribution commonly named the g-distribution. The new analysis successfully describes the probability statistics of the ALOMAR lidar data, and the former g-function description turns out to be a special case of the new approach. In general, the new statistical function can be applied to many different PMC parameters, e.g. maximum backscatter, integrated backscatter, ice mass density, ice water content, ice particle radius, ice particle number density, or albedo measured by satellites. As a main advantage, the new method makes it possible to connect different observational PMC distributions from lidar and satellite data, and also to compare them with distributions from ice model studies. In particular, the statistical distributions of different ice parameters can be compared with each other on a common basis, which facilitates, for example, trend analysis of PMC/NLC.
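For reference, the exponential g-distribution that the new pdf class generalizes has the standard one-parameter form (a sketch in my notation, not the authors' exact parameterization):

```latex
% Exponential (g-) distribution of, e.g., maximum backscatter \beta_{\max}
g(\beta_{\max}) = \frac{1}{\beta_0}\, \exp\!\left(-\frac{\beta_{\max}}{\beta_0}\right),
\qquad \beta_{\max} \ge 0,
% with a single scale parameter \beta_0 equal to the mean of the distribution.
```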


2002 ◽  
Vol 55 (2) ◽  
pp. 213-224 ◽  
Author(s):  
Phil Belcher

This paper discusses issues relating to the practical application of the collision avoidance regulations (COLREGS) from the sociological viewpoint that rules are always contingent and defeasible, and that no rule can exhaustively specify the conditions of its use. It is proposed that, owing to this inherent nature of rules, the only way to manage collision risks at sea successfully is to separate opposing traffic flows physically, so as to remove the interpretative and mutually coordinating factors from the COLREGS.


Author(s):  
Kotchapong Sumanonta ◽  
Pasist Suwanapingkarl ◽  
Pisit Liutanakul

This article presents a novel model for the equivalent circuit of a photovoltaic module. The circuit consists of the following important parameters: a single diode, a series resistance (Rs), and a parallel resistance (Rp) that can be adjusted directly according to the ambient temperature and the irradiance. The single diode in the circuit is directly related to the ideality factor (m), which represents the relationship between the materials and significant structures of the PV module, such as monocrystalline, multicrystalline, and thin-film technology. In particular, the proposed simplified model can calculate I-V curves faster and more accurately than previous models, which shows that it is better suited to practical application. In addition, the results of the proposed model are validated against the datasheet, practical data in the laboratory (indoor test), and onsite data (outdoor test). The absolute error of the model is less than 0.1%, which is acceptable.
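The standard single-diode equation the abstract refers to is implicit in the current; a minimal sketch of computing an I-V curve from it, with assumed illustrative parameter values (not the article's fitted model), solved by bisection:

```python
import numpy as np

# Illustrative single-diode parameters (assumed values, not from the article)
Iph = 8.0       # photocurrent [A]
I0 = 1e-7      # diode saturation current [A]
m = 1.3        # ideality factor
Ns = 60        # cells in series
Vt = 0.02585   # thermal voltage at 25 degC [V]
Rs = 0.3       # series resistance [ohm]
Rp = 300.0     # parallel resistance [ohm]

def residual(I, V):
    # Implicit single-diode equation: residual(I, V) = 0 at the operating point
    return (Iph - I0 * (np.exp((V + I * Rs) / (m * Ns * Vt)) - 1.0)
            - (V + I * Rs) / Rp - I)

def current(V, lo=-2.0, hi=10.0, iters=60):
    # Bisection on I; the residual is monotonically decreasing in I
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if residual(mid, V) > 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

for v in np.linspace(0.0, 35.0, 8):
    print(f"V={v:5.1f} V  I={current(v):6.3f} A")
```

Because the residual is strictly decreasing in I, the bisection always converges, which keeps the sketch simple compared with Newton-type solvers.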


2015 ◽  
Vol 12 (1) ◽  
pp. 33-48 ◽  
Author(s):  
Balázs Kósa ◽  
Márton Balassi ◽  
Péter Englert ◽  
Attila Kiss

In our paper we compare two centrality measures of networks, betweenness and Linerank. Betweenness is widely used; however, its computation is expensive for large networks. Calculating Linerank remains manageable even for graphs of a billion nodes, and it was offered as a substitute for betweenness in [12]. To the best of our knowledge, the relationship between these measures has never been seriously examined. We calculate Pearson's and Spearman's correlation coefficients for both the node and edge variants of these measures. For edges the correlation tends to be rather low. Our tests with the Girvan-Newman algorithm [16] also underline that edge betweenness cannot be substituted with edge Linerank. The results for the node variants are more promising: the correlation coefficients are close to 1. Notwithstanding, in the practical application in which the robustness of social and web graphs is examined, node betweenness still outperforms node Linerank. We also clarify how Linerank should be computed on undirected graphs.
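The comparison rests on Pearson's and Spearman's coefficients; a minimal self-contained sketch of computing both for two hypothetical centrality score vectors (the scores below are made up for illustration):

```python
import numpy as np

def pearson(x, y):
    # Pearson correlation: normalized inner product of the centered vectors
    x = np.asarray(x, float); y = np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

def spearman(x, y):
    # Spearman correlation: Pearson on the ranks
    # (ties are not handled in this minimal sketch)
    def ranks(v):
        order = np.argsort(v)
        r = np.empty(len(v), float)
        r[order] = np.arange(1, len(v) + 1)
        return r
    return pearson(ranks(np.asarray(x, float)), ranks(np.asarray(y, float)))

# Hypothetical betweenness and Linerank scores for six nodes
betw = [0.00, 0.10, 0.45, 0.44, 0.12, 0.01]
line = [0.02, 0.12, 0.40, 0.50, 0.09, 0.03]

print("Pearson :", round(pearson(betw, line), 3))
print("Spearman:", round(spearman(betw, line), 3))
```

In practice the two score vectors would come from a betweenness computation and a Linerank computation on the same graph; the sketch only shows the correlation step.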


2020 ◽  
Vol 4 ◽  
pp. 85
Author(s):  
G. A. Lalazissis ◽  
C. P. Panos

A recently proposed semiphenomenological density distribution for neutrons and protons in nuclei is discussed. This density was derived using the separation energies of the last neutron or proton. A comparison is made with the symmetrised Fermi density distribution with parameters determined by fitting electron scattering experimental data, and with a Fermi density whose parameters come from a recent analysis of pionic atoms. Theoretical expressions for the rms radii of the neutron, proton, and matter distributions are proposed, which give the average trend of the variation of these quantities as functions of N, Z, and A respectively. To facilitate the use of the new density, all the parameters needed in a practical application are tabulated for a series of nuclei. Some applications of the new density are also discussed.
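For reference, the symmetrised Fermi density used in the comparison has the standard two-parameter form (notation mine):

```latex
% Symmetrised Fermi density with half-density radius R and diffuseness a
\rho(r) = \rho_0\, \frac{\sinh(R/a)}{\cosh(r/a) + \cosh(R/a)},
% which reduces to the usual Fermi (Woods-Saxon-type) form when R \gg a.
```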

