probabilistic function
Recently Published Documents


TOTAL DOCUMENTS

22
(FIVE YEARS 8)

H-INDEX

5
(FIVE YEARS 1)

2021 ◽  
Vol 18 (5) ◽  
pp. 10-27
Author(s):  
E. D. Solozhentsev

The paper analyzes the state and management of a country's economy. A tuple of event-driven optimal management has been developed as a method of artificial intelligence. The characteristics of event-driven quality management of associative and structurally complex systems and processes are given. The events and probabilities involved in managing the economy and the state are considered. A measure of invalidation has been introduced for parameters. A method for synthesizing the probability of an event from expert information is presented. The necessity of orthogonalizing the logical function and of the transition to the probabilistic function is substantiated. The effect of repeated initiating events is considered. One-dimensional optimization of the system on a logical model, instead of arithmetic multiparameter optimization, is presented. Schemes for managing the development of the economy and its exit from stagnation are given. The tools for event-driven quality management of systems and processes are described. The shortcomings of the existing economic theory, and the possibility of eliminating them, are analyzed.
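The transition from a logical function to a probabilistic function rests on orthogonalization: rewriting a logical expression so its terms are mutually exclusive, after which event probabilities simply add. A minimal sketch (not taken from the paper) for two initiating events:

```python
# Sketch: the logical function Y = z1 OR z2 is orthogonalized to
# Y = z1 OR (NOT z1 AND z2). The two terms are mutually exclusive,
# so the probabilistic function is the plain sum of their probabilities:
# P(Y) = p1 + (1 - p1) * p2.

def event_probability(p1: float, p2: float) -> float:
    """Probability of Y = z1 OR z2 via the orthogonalized logical model."""
    return p1 + (1.0 - p1) * p2

# Two independent initiating events with probabilities 0.2 and 0.3:
print(round(event_probability(0.2, 0.3), 4))  # 0.44
```

Without orthogonalization, naively adding p1 + p2 = 0.5 would double-count the case where both events occur.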


2021 ◽  
Vol 5 (1) ◽  
pp. 445-456
Author(s):  
U. Usman ◽  
M. Waziri ◽  
Faruk Manu ◽  
Y. Zakari ◽  
H. G. Dikko

This research reports on the performance of two re-sampling methods (bootstrap and jackknife) in assessing the relationship and significance of socio-economic factors (age, gender, marital status and settlement) and modes of HIV/AIDS transmission to the spread of HIV/AIDS. A logistic regression model, a form of probabilistic function for a binary response, was used to relate the socio-economic factors to HIV/AIDS spread. The statistical predictive model was used to project the likely response of HIV/AIDS spread in a larger population using 10,000 bootstrap re-sampled observations and jackknife re-sampled observations. From the analysis obtained with the two re-sampling methods, we conclude that HIV transmission in Kebbi state is higher among married couples than among single individuals and is concentrated more in rural areas.
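The bootstrap idea used above can be illustrated with a short sketch (toy data, not the study's): re-sampling with replacement to estimate the sampling variability of a proportion, the same mechanism the authors apply to their logistic-regression estimates.

```python
import random

# Toy sketch of bootstrap re-sampling: estimate a percentile confidence
# interval for a proportion from 10,000 re-samples. The data here are
# hypothetical, not the study's.

random.seed(42)
data = [1] * 30 + [0] * 70  # 30% "positive" responses in a toy sample

def bootstrap_props(sample, n_resamples=10_000):
    """Proportion of positives in each bootstrap re-sample."""
    n = len(sample)
    return [sum(random.choices(sample, k=n)) / n for _ in range(n_resamples)]

props = sorted(bootstrap_props(data))
lo, hi = props[250], props[-251]  # approximate 95% percentile interval
print(f"95% CI for the proportion: ({lo:.2f}, {hi:.2f})")
```

The jackknife differs only in the re-sampling scheme: it leaves out one observation at a time instead of drawing with replacement.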


2021 ◽  
Author(s):  
Haotian Teng ◽  
Ye Yuan ◽  
Ziv Bar-Joseph

ABSTRACT

Motivation: Recent advancements in fluorescence in situ hybridization (FISH) techniques make it possible to concurrently obtain information on the location and gene expression of single cells. A key question in the initial analysis of such spatial transcriptomics data is the assignment of cell types. To date, most studies have used methods that rely only on the expression levels of the genes in each cell for such assignments. To fully utilize the data and to improve the ability to identify novel sub-types, we developed a new method, FICT, which combines both expression and neighborhood information when assigning cell types.

Results: FICT optimizes a probabilistic function that we formalize and for which we provide learning and inference algorithms. We used FICT to analyze both simulated and several real spatial transcriptomics datasets. As we show, FICT can accurately identify cell types and sub-types, improving on expression-only methods and on other methods proposed for clustering spatial transcriptomics data. Some of the spatial sub-types identified by FICT provide novel hypotheses about new functions for excitatory and inhibitory neurons.

Availability: FICT is available at: https://github.com/haotianteng/

Contact: [email protected]
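The core idea of combining expression and neighborhood information can be sketched in a few lines. This is a toy illustration of the general approach, not FICT's actual probabilistic model: each candidate type is scored by an expression log-likelihood plus a term rewarding agreement with neighboring cells.

```python
from collections import Counter

# Toy sketch (hypothetical names and weights, not FICT's model): assign a
# cell type by combining the expression log-likelihood of each candidate
# type with a bonus for each neighbor already carrying that type.

def assign_type(expr_loglik, neighbor_types, weight=1.0):
    """expr_loglik: {type: log-likelihood of the cell's expression};
    neighbor_types: types currently assigned to the cell's neighbors."""
    counts = Counter(neighbor_types)
    scores = {t: ll + weight * counts.get(t, 0)
              for t, ll in expr_loglik.items()}
    return max(scores, key=scores.get)

# A cell whose expression slightly favors "excitatory", embedded in a
# neighborhood dominated by "inhibitory" cells:
print(assign_type({"excitatory": -1.0, "inhibitory": -1.5},
                  ["inhibitory", "inhibitory", "excitatory"]))
```

Here the neighborhood term flips the expression-only call, which is exactly the kind of case where spatial information improves on expression-only assignment.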


2020 ◽  
Author(s):  
Scott Belding ◽  
Katherine H Anderson ◽  
Lisa Myers ◽  
Alan W. Black ◽  
Vincent M. Brown

Entropy ◽  
2020 ◽  
Vol 22 (1) ◽  
pp. 105
Author(s):  
Jorge M. Silva ◽  
Eduardo Pinho ◽  
Sérgio Matos ◽  
Diogo Pratas

Sources that generate symbolic sequences of an algorithmic nature may differ in statistical complexity because they create structures that follow algorithmic schemes, rather than generating symbols from a probabilistic function assuming independence. In the case of Turing machines, this means that machines with the same algorithmic complexity can create tapes with different statistical complexity. In this paper, we use a compression-based approach to measure the global and local statistical complexity of specific Turing machine tapes with the same number of states and the same alphabet. Both measures are estimated using the best-order Markov model. For the global measure, we use the Normalized Compression (NC), while, for the local measures, we define and use normal and dynamic complexity profiles to quantify and localize regions of lower and higher statistical complexity. We assessed the validity of our methodology on synthetic and real genomic data, showing that it is tolerant to increasing rates of edits and block permutations. Regarding the analysis of the tapes, we localize patterns of higher statistical complexity in two regions, for different numbers of machine states. We show that these patterns are generated by a decrease of the tape’s amplitude, given the setting of small rule cycles. Additionally, we performed a comparison with a measure that uses both algorithmic and statistical approaches (BDM) for the analysis of the tapes. Naturally, BDM is efficient given the algorithmic nature of the tapes. However, for a higher number of states, BDM is progressively approximated by our methodology. Finally, we provide a simple algorithm to increase the statistical complexity of a Turing machine tape while retaining the same algorithmic complexity. We supply a publicly available implementation of the algorithm in the C++ language under the GPLv3 license. All results can be reproduced in full with scripts provided at the repository.
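The Normalized Compression measure divides the compressed size of a sequence by its maximum information content, |x| · log₂|Σ| bits for alphabet Σ. A minimal sketch, substituting a general-purpose compressor for the paper's best-order Markov model:

```python
import math
import random
import zlib

# Sketch of the Normalized Compression (NC) idea: compressed size in bits
# over the maximal information content of the sequence. A general-purpose
# compressor (zlib) stands in for the paper's Markov-model estimator.

def normalized_compression(seq: bytes, alphabet_size: int) -> float:
    compressed_bits = 8 * len(zlib.compress(seq, 9))
    max_bits = len(seq) * math.log2(alphabet_size)
    return compressed_bits / max_bits

random.seed(0)
regular = b"AB" * 2048                                     # periodic tape
noisy = bytes(random.choice(b"AB") for _ in range(4096))   # random tape

print(normalized_compression(regular, 2))  # low: statistically simple
print(normalized_compression(noisy, 2))    # near (or above) 1: incompressible
```

Values near 0 indicate strong statistical structure; values around 1 indicate a sequence near the incompressibility limit for its alphabet (compressor overhead can push the estimate slightly above 1).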


Author(s):  
Jiacheng Luo ◽  
Li Yu ◽  
Pengzhou Li ◽  
Lei Sun

The coolant inside the supercritical water-cooled pressure tube operates beyond the critical thermodynamic point of water, and the structural integrity of the pressure tube is of great importance to the safety of the reactor. Under the accident load, the temperature difference across the pressure tube wall causes relatively large thermal stress. Owing to the resulting high tensile stress, coupled with the internal high-pressure load, defects in the inner surface of the pressure tube may propagate rapidly, even through the wall thickness. This paper investigates the structural integrity of the supercritical water-cooled pressure tube using both deterministic and probabilistic methods of fracture mechanics, obtaining the stress intensity factor and the probabilistic function. The fracture mechanics analysis shows that the integrity of the supercritical pressure tube can be maintained under the accident load.
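The deterministic and probabilistic sides of such an analysis can be sketched together: a stress intensity factor of the standard form K_I = Y·σ·√(πa) for a surface crack, and a Monte Carlo estimate of the failure probability P(K_I > K_IC) under scatter in crack depth and applied stress. All numbers below are hypothetical, not the paper's:

```python
import math
import random

# Illustrative fracture-mechanics sketch (generic formulas, hypothetical
# inputs; not the paper's model or data).

def k_i(sigma_mpa: float, a_m: float, y: float = 1.12) -> float:
    """Stress intensity factor K_I = Y * sigma * sqrt(pi * a), MPa*sqrt(m)."""
    return y * sigma_mpa * math.sqrt(math.pi * a_m)

def failure_probability(n=100_000, k_ic=60.0):
    """Monte Carlo estimate of P(K_I > K_IC) with random stress and crack depth."""
    random.seed(1)
    fails = 0
    for _ in range(n):
        sigma = random.gauss(300.0, 30.0)  # MPa, hypothetical accident stress
        a = abs(random.gauss(2e-3, 5e-4))  # m, hypothetical crack depth
        if k_i(sigma, a) > k_ic:
            fails += 1
    return fails / n

print(f"P(failure) ≈ {failure_probability():.5f}")
```

With these toy inputs the typical K_I sits well below the toughness K_IC, so the estimated failure probability is essentially zero, which is the shape of the "integrity maintained" conclusion the paper reaches for its own loads and material data.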


Author(s):  
Alberto Gianinetti

Entropy has been defined as a probabilistic function of energy spreading and sharing, and most often this description provides a straightforward way to conceptualize entropy. It is shown that, more generally, the spreading and sharing of energy is a common outcome of a physical function that levels down available energy. The latter, as formulated by a mathematical term called the “Boltzmann factor”, originates from the equilibration of forces at the microscopic level and is effected by the net levelling force that results as the statistical outcome of all the microscopic forces and that always pushes the system towards dynamic equilibrium. This net levelling force is the reason that work can be done at a macroscopic level, and its derivation from the microscopic world explains why it is linked to equilibration and therefore to entropy increase.
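The Boltzmann factor can be made concrete with a minimal numerical sketch: occupation probabilities p_i ∝ exp(−E_i/kT). Raising the temperature levels the distribution, spreading energy across states; lowering it concentrates probability in the ground state.

```python
import math

# Minimal sketch of the Boltzmann factor: p_i proportional to exp(-E_i/kT),
# normalized by the partition function Z.

def boltzmann(energies, kT):
    weights = [math.exp(-e / kT) for e in energies]
    z = sum(weights)  # partition function
    return [w / z for w in weights]

levels = [0.0, 1.0, 2.0]  # arbitrary energy units

# Low temperature: probability concentrates in the ground state.
print([round(p, 3) for p in boltzmann(levels, kT=0.5)])
# High temperature: the distribution is nearly uniform (energy "levelled").
print([round(p, 3) for p in boltzmann(levels, kT=50.0)])
```

The levelling behavior at high kT is the spreading-and-sharing outcome the abstract describes: the net statistical push is always toward the equilibrium distribution.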

