joint probability
Recently Published Documents


TOTAL DOCUMENTS

1269
(FIVE YEARS 312)

H-INDEX

49
(FIVE YEARS 7)

2022 ◽  
Author(s):  
Bradford D. Loucas ◽  
Igor Shuryak ◽  
Stephen R. Kunkel ◽  
Michael N. Cornforth

The relationship between certain chromosomal aberration (CA) types and cell lethality is well established. On that basis, we used multi-fluor in situ hybridization (mFISH) to tally the number of mitotic human lymphocytes exposed to graded doses of gamma rays that carried either lethal or nonlethal CA types. Although a number of nonlethal complex exchanges were observed, the cells containing them were seldom deemed viable, owing to coincident lethal chromosome damage. We considered two model variants for describing the dose responses. The first assumes independent linear-quadratic (LQ) dose-response shapes for the yields of both lethal and nonlethal CAs. The second (simplified) variant assumes that the mean number of nonlethal CAs per cell is proportional to the mean number of lethal CAs per cell, so that the dose responses of the two aberration types differ only by a multiplicative proportionality constant. Using these models, we assembled dose-response curves for the frequency of aberration-bearing cells that would be expected to survive, in the form of a joint probability distribution for cells containing ≥1 nonlethal CA but zero lethal CAs. The simplified second variant turned out to be marginally better supported than the first, and the joint probability distribution based on it yielded a crescent-shaped dose response reminiscent of those observed for mutagenesis and transformation in cells "at risk" (i.e., not corrected for survival). Among the implications of these findings is the suggestion that similarly shaped curves form the basis for deriving metrics associated with radiation risk models.
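The crescent shape follows directly from the model structure. A minimal sketch, assuming Poisson-distributed aberration counts per cell and illustrative LQ coefficients (the paper's fitted values are not given here):

```python
import numpy as np

# Hypothetical LQ coefficients and proportionality constant; the paper's
# fitted values are not reproduced here, so these are illustrative only.
ALPHA, BETA, C = 0.3, 0.05, 0.2  # per Gy, per Gy^2, dimensionless

def joint_prob_viable_aberrant(dose):
    """P(>=1 nonlethal CA and 0 lethal CAs), assuming independent Poisson counts.

    Lethal CAs per cell: Poisson with LQ mean alpha*D + beta*D^2.
    Nonlethal CAs per cell: Poisson with mean C times the lethal mean
    (the simplified second model variant).
    """
    lam_lethal = ALPHA * dose + BETA * dose**2
    lam_nonlethal = C * lam_lethal
    return (1.0 - np.exp(-lam_nonlethal)) * np.exp(-lam_lethal)

doses = np.linspace(0.0, 10.0, 101)  # Gy
p = joint_prob_viable_aberrant(doses)
print(f"peak at {doses[p.argmax()]:.1f} Gy, P = {p.max():.3f}")
```

The probability rises at low dose (more cells carry some aberration) and falls at high dose (lethal damage dominates), tracing the crescent.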


2022 ◽  
Vol 2022 ◽  
pp. 1-12
Author(s):  
Fei Zhou

With the increasing abundance of online teaching resources, network-based recommendation technology has matured considerably, yet recommendation algorithms for teaching resources still differ greatly in effectiveness. Existing algorithms either give insufficient weight to students' personality characteristics, failing to distinguish student users by personality and pushing the same resources to everyone, or model the student's personality too coarsely to meet individualized learning needs. To address this problem, we build a cognitive diagnosis model for student users with the TDINA model and propose a convolution-based joint probability matrix factorization method (CUPMF) for recommending teaching resources, which combines students' answer history, cognitive ability, knowledge mastery, and forgetting-effect factors. A CNN is used to deeply mine the test-question resources among the teaching resources; its output undergoes a nonlinear transformation and is integrated into the joint probability matrix factorization model to predict students' performance on the resources. Finally, the students' knowledge-mastery matrix obtained from the TDINA model is combined with these predictions to recommend appropriate teaching resources, improving learning efficiency and helping students improve their performance.
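The abstract does not give enough detail to reproduce CUPMF's CNN and TDINA components, but the joint probability matrix factorization core it builds on can be sketched. A minimal sketch, assuming Gaussian priors on the latent factors and a toy student-by-resource score matrix (all names and values are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy observed score matrix R (students x test-question resources); NaN = unanswered.
R = np.array([[1.0, 0.0, np.nan],
              [0.0, np.nan, 1.0],
              [1.0, 1.0, 0.0]])
mask = ~np.isnan(R)

K, LAM, LR, EPOCHS = 4, 0.1, 0.05, 500  # latent dim, L2 weight, step size, iterations
U = 0.1 * rng.standard_normal((R.shape[0], K))  # student latent factors
V = 0.1 * rng.standard_normal((R.shape[1], K))  # resource latent factors

for _ in range(EPOCHS):
    # Prediction error on observed entries only (MAP estimate under
    # Gaussian likelihood and Gaussian priors on U and V).
    E = np.where(mask, np.nan_to_num(R) - U @ V.T, 0.0)
    U += LR * (E @ V - LAM * U)
    V += LR * (E.T @ U - LAM * V)

pred = U @ V.T  # predicted performance on all resources, answered or not
```

In the full method, the CNN-derived question features and the TDINA knowledge-mastery matrix would condition these factors rather than leaving them free.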


2022 ◽  
Author(s):  
C. Arrighi ◽  
M. Tanganelli ◽  
M. T. Cristofaro ◽  
V. Cardinali ◽  
A. Marra ◽  
...  

Abstract: Natural hazards pose a significant threat to historical cities, which have an authentic and universal value for mankind. This study aims at codifying a multi-risk workflow for seismic and flood hazards, for site-scale applications in historical cities, which provides the Average Annual Loss for buildings within a coherent multi-exposure and multi-vulnerability framework. The proposed methodology includes a multi-risk correlation and joint probability analysis to identify the role of urban development in re-shaping risk components in historical contexts. The workflow is unified by exposure modelling, which adopts the same assumptions and parameters throughout. Seismic vulnerability is modelled through an empirical approach by assigning to each building a vulnerability value depending on the European Macroseismic Scale (EMS-98) and modifiers available in the literature. Flood vulnerability is modelled by means of stage-damage curves developed for the study area and validated against ex-post damage claims. The method is applied to the city centre of Florence (Italy), listed as a UNESCO World Heritage Site since 1982. Direct multi-hazard, multi-vulnerability losses are modelled for four probabilistic scenarios. A multi-risk of 3.15 M€/year is estimated for the current situation. If local mitigation measures such as floodproofing of basements and installation of steel tie rods are adopted, the multi-risk reduces to 1.55 M€/year. The analysis of multi-risk correlation and joint probability distribution shows that the historical evolution of the city centre, from the Roman castrum through rebuilding in the Middle Ages, the late 19th century, and the post-WWII period, has significantly affected multi-risk in the area. Three identified portions of the study area with different multi-risk spatial probability distributions highlight that the urban development of the historical city influenced both the flood hazard and the seismic vulnerability. The presented multi-risk workflow could be applied to other historical cities and further extended to other natural hazards.
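The Average Annual Loss figures quoted above are, in the standard formulation, the integral of loss over annual exceedance probability. A minimal sketch with hypothetical scenario losses and return periods (the paper's four probabilistic scenarios are not reproduced here):

```python
import numpy as np

# Hypothetical scenario losses (M euro) at four return periods (years);
# the paper's actual scenario values are not reproduced here.
return_periods = np.array([30.0, 100.0, 200.0, 500.0])
losses = np.array([20.0, 120.0, 260.0, 540.0])

# AAL is the integral of loss over annual exceedance probability p = 1/T,
# approximated here by the trapezoidal rule over the scenario set.
p = 1.0 / return_periods                 # decreasing from frequent to rare
aal = np.sum(0.5 * (losses[1:] + losses[:-1]) * -np.diff(p))
print(f"AAL ~ {aal:.2f} M euro/year")
```

With only a few scenarios the trapezoidal sum truncates the tails, so the scenario set should bracket both frequent low-loss and rare high-loss events.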


Author(s):  
Reza Seifi Majdar ◽  
Hassan Ghassemian

Unlabeled samples and the transformation matrix are the two main parts of unsupervised and semi-supervised feature extraction (FE) algorithms. In this manuscript, a semi-supervised FE method, locality preserving projection in the probabilistic framework (LPPPF), is proposed to find a sufficient number of reliable and unmixed unlabeled samples from all classes and to construct an optimal projection matrix. The LPPPF has two main steps. In the first step, a number of reliable unlabeled samples are selected based on the training samples, spectral features, and spatial information in the probabilistic framework. The spectral and spatial probability distribution functions are calculated for each unlabeled sample, and the spectral features and spatial information are integrated through a joint probability distribution function. Finally, a sufficient number of unlabeled samples with the highest joint probability are selected. In the second step, the selected unlabeled samples are used to construct the transformation matrix based on their spectral and spatial information. The adjacency graph is improved by using new weights based on spectral and spatial information. The method is evaluated on three data sets: Indian Pines, Pavia University, and Kennedy Space Center (KSC), and compared with several recent and well-known supervised, semi-supervised, and unsupervised FE methods. Various experiments demonstrate the efficiency of LPPPF in comparison with the other FE methods; LPPPF also performs well with limited training samples.
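A minimal sketch of the first step, assuming Gaussian spectral and spatial class models multiplied into a joint probability and used to rank unlabeled pixels; the actual distribution functions and weighting in LPPPF may differ:

```python
import numpy as np
from scipy.stats import multivariate_normal

def select_unlabeled(X_unlab, xy_unlab, class_means, class_covs,
                     xy_means, xy_covs, k):
    """Score each unlabeled pixel by its best joint spectral-spatial likelihood.

    Spectral and spatial terms are modeled here as per-class Gaussians and
    multiplied into a joint pdf (a simplifying independence assumption);
    the k highest-scoring samples are returned as the most reliable ones.
    """
    n_classes = len(class_means)
    joint = np.zeros((X_unlab.shape[0], n_classes))
    for c in range(n_classes):
        p_spec = multivariate_normal.pdf(X_unlab, class_means[c], class_covs[c])
        p_spat = multivariate_normal.pdf(xy_unlab, xy_means[c], xy_covs[c])
        joint[:, c] = p_spec * p_spat
    scores = joint.max(axis=1)          # likelihood under the most likely class
    keep = np.argsort(scores)[-k:]      # k most reliable unlabeled samples
    return keep, joint[keep].argmax(axis=1)  # indices and tentative class labels
```

The returned samples would then feed the second step, where the adjacency-graph weights for the projection matrix are built.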


2021 ◽  
Author(s):  
Samu Mäntyniemi ◽  
Inari Helle ◽  
Ilpo Kojola

Assessment of the Finnish wolf population relies on multiple sources of information. This paper describes how Bayesian inference is used to pool the information contained in different kinds of data sets (point observations, non-invasive genetics, known mortalities) for the estimation of the number of territories occupied by family packs and pairs. The output of the assessment model is a joint probability distribution, which describes current knowledge about the number of wolves within each territory. The joint distribution can be used to derive probability distributions for the total number of wolves in all territories and for the pack status within each territory. Most of the data comprise point observations and DNA samples provided by volunteers and research personnel. The new method reduces the role of expert judgement in the assessment process, providing increased transparency and repeatability.
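Once the model produces a joint posterior over per-territory counts, the derived quantities mentioned above follow by summarizing posterior draws. A minimal Monte Carlo sketch with hypothetical, independently drawn samples (the real joint posterior would also capture dependence between territories):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical posterior draws: rows = joint posterior samples,
# columns = territories, entries = number of wolves in that territory.
draws = rng.poisson(lam=[6, 4, 7, 5, 8], size=(10_000, 5))

total = draws.sum(axis=1)  # total wolves across all territories, per draw
print("posterior mean total:", total.mean())
print("90% credible interval:", np.percentile(total, [5, 95]))

# Pack status for one territory, e.g. P(territory 3 holds >= 3 wolves):
print("P(pack in territory 3):", (draws[:, 2] >= 3).mean())
```

Summing within each joint draw, rather than summing independent marginals, is what preserves any between-territory dependence the model has learned.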


Author(s):  
Rafael Guzmán-Cabrera ◽  
Iván A. Hernández-Robles ◽  
Xiomara González Ramírez ◽  
José Rafael Guzmán Sepúlveda

Probabilistic approaches are frequently used to describe irregular activity data to assist the design and development of devices. Unfortunately, useful estimates are not always feasible because of large noise in the modeled data, as occurs when estimating the potential of sea waves for electricity generation. In this work we propose a simple methodology based on joint probability models that allows extreme values, collected from measurements as pairs of independent points, to be discriminated while preserving the essential statistics of the measurements. The outcome of the proposed methodology is an equivalent data series in which large-amplitude fluctuations are suppressed and which can therefore be used for design purposes. To evaluate the proposed method, we used year-long databases of hourly measurements of wave height and period collected at maritime buoys located in the Gulf of Mexico. These measurements are used to obtain a fluctuation-reduced representation of the energy potential of the waves that can be useful, for instance, for the design of electric generators.
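The abstract does not specify the joint probability model, so the following is only one plausible reading: fit a joint density to the (height, period) pairs and replace the lowest-density extreme pairs, suppressing large-amplitude fluctuations while leaving the bulk statistics intact. All values are hypothetical:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)

# Hypothetical hourly buoy records: significant wave height (m) and period (s).
height = np.abs(rng.normal(1.5, 0.6, 8760))
period = np.abs(rng.normal(7.0, 1.5, 8760))

# Joint (height, period) density; pairs falling in the lowest-density 1%
# are treated as extremes and replaced by the medians, so the equivalent
# series keeps the essential statistics without the large fluctuations.
kde = gaussian_kde(np.vstack([height, period]))
dens = kde(np.vstack([height, period]))
extreme = dens < np.quantile(dens, 0.01)
height_eq = np.where(extreme, np.median(height), height)
period_eq = np.where(extreme, np.median(period), period)
```

Judging the joint density, rather than each variable separately, is what lets an unremarkable height paired with an anomalous period still register as an extreme pair.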


2021 ◽  
Author(s):  
Na Li ◽  
Shenglian Guo ◽  
Feng Xiong ◽  
Jun Wang ◽  
Yuzuo Xie

Abstract: The coincidence of floods in the mainstream and its tributaries may lead to large flooding in the downstream confluence area, so flood coincidence risk analysis is very important for flood prevention and disaster reduction. In this study, a multiple regression model was used to establish the functional relationship among flood magnitudes in the mainstream and its tributaries. The mixed von Mises distribution and the Pearson Type III distribution were selected to fit the probability distributions of the annual maximum flood occurrence dates and magnitudes, respectively. The joint distributions of the annual maximum flood occurrence dates and magnitudes were then established using copula functions. The Fuhe River in the Poyang Lake region was selected as a case study. The joint probability, co-occurrence probability, and conditional probability of flood magnitudes were quantitatively estimated and compared with the predicted flood coincidence risks. The results show that the selected marginal and joint distributions fit the observed flood data very well. The coincidence of flood occurrence dates in the upper mainstream and its tributaries mainly occurs from May to early July. The conditional probability is found to be the most consistent with the predicted flood coincidence risks in the mainstream and its tributaries, and is more reliable and rational in practice.
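Given fitted marginal distributions, the joint and conditional exceedance probabilities compared in the study follow directly from the copula. A minimal sketch using a Gumbel copula (the copula family and parameter here are illustrative, not the paper's fitted choice):

```python
import numpy as np

def gumbel_copula(u, v, theta):
    """Gumbel copula C(u, v) = exp(-[(-ln u)^theta + (-ln v)^theta]^(1/theta))."""
    return np.exp(-((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta))

def joint_exceedance(u, v, theta):
    """P(X > x, Y > y) from the marginal non-exceedance probabilities u, v."""
    return 1.0 - u - v + gumbel_copula(u, v, theta)

THETA = 2.5        # hypothetical dependence parameter (theta >= 1)
u, v = 0.95, 0.95  # e.g. 20-year flood levels on mainstream and tributary

p_joint = joint_exceedance(u, v, THETA)
p_cond = p_joint / (1.0 - u)  # P(tributary exceeds | mainstream exceeds)
print(f"joint: {p_joint:.4f}, conditional: {p_cond:.4f}")
```

With positive dependence the conditional probability greatly exceeds the marginal 0.05, which is exactly why coincidence risk cannot be read off the marginals alone.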


2021 ◽  
Vol 13 (24) ◽  
pp. 5103
Author(s):  
Jeongeun Won ◽  
Jiyu Seo ◽  
Jeonghoon Lee ◽  
Okjeong Lee ◽  
Sangdan Kim

Since vegetation is closely related to a variety of hydrological factors, the condition of vegetation during a drought is strongly affected by moisture supply and by moisture demand from the atmosphere. Because the feedback between vegetation and climate during drought is very complex, a joint probability distribution is needed to describe and investigate their interrelationship; that is, the interaction between vegetation and climate must be understood in terms of joint probability. In this study, the possibility of drought stress in vegetation under the various conditions that arise during drought was investigated by dividing drought into two aspects: atmospheric moisture supply and moisture demand. Meteorological drought indices describing these different aspects of drought and vegetation-related drought indices describing the state of vegetation were estimated using satellite remote-sensing data over parts of Far East Asia centered on South Korea. Bivariate joint probability modeling of the vegetation drought index and the meteorological drought index was performed using copulas. The relationship between the two indices was found to have regional characteristics and to vary seasonally. From the copula-based model, the conditional probability distribution of vegetation drought stress could be quantified under meteorological drought scenarios arising from different causes. By mapping the resulting vulnerability of vegetation to meteorological drought across the study area, it was possible to see spatially how vegetation responds differently depending on season and meteorological cause. This probabilistic mapping of vegetation vulnerability to the various aspects of meteorological drought may provide useful information for establishing mitigation strategies for ecological drought.
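With a copula C linking the two drought-index marginals, the conditional probability of vegetation drought stress given a meteorological drought scenario is C(u_veg, u_met)/u_met. A minimal sketch with a Clayton copula and hypothetical values (the paper's fitted family and parameters are not given here):

```python
import numpy as np

def clayton_copula(u, v, theta):
    """Clayton copula C(u, v) = (u^-theta + v^-theta - 1)^(-1/theta), theta > 0."""
    # np.maximum guards the generator against negative arguments near the tails.
    return np.maximum(u ** -theta + v ** -theta - 1.0, 1e-12) ** (-1.0 / theta)

THETA = 1.8  # hypothetical dependence between vegetation and meteorological indices

u_veg = 0.2  # marginal probability of the vegetation index below its stress level
u_met = 0.1  # marginal probability of the meteorological drought scenario

# P(vegetation stress | meteorological drought) = C(u_veg, u_met) / u_met
p_stress = clayton_copula(u_veg, u_met, THETA) / u_met
print(f"P(vegetation stress | met. drought) = {p_stress:.3f}")
```

Repeating this per pixel and season, with locally fitted parameters, is what produces the vulnerability maps described above.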


Mathematics ◽  
2021 ◽  
Vol 9 (24) ◽  
pp. 3256
Author(s):  
Rui Pang ◽  
Laifu Song

Because rockfill strength and seismic ground motion are the dominant factors affecting the slope stability of rockfill dams, it is very important to accurately characterize the distribution of rockfill strength parameters, to develop a stochastic ground-motion model suitable for rockfill dam engineering, and to couple strength parameters with seismic ground motion effectively in order to evaluate the dynamic reliability of the three-dimensional (3D) slope stability of rockfill dams. In this study, a joint probability distribution model for rockfill strength based on the copula function and a stochastic ground-motion model based on the improved Clough-Penzien spectral model were built. The strength parameters and the seismic ground motion were coupled using the GF-discrepancy method, a method for analyzing the dynamic reliability of the 3D slope stability of rockfill dams was proposed based on the generalized probability density evolution method (GPDEM), and the effectiveness of the proposed method was verified. Moreover, the effect of different joint distribution models on the dynamic reliability of slope stability was revealed, the effect of the copula function type was analysed, and the differences in dynamic reliability under parameter randomness, seismic ground-motion randomness, and the coupled randomness of both were systematically determined. The results were as follows: traditional joint distribution models ignore the correlated, non-normal distribution characteristics of rockfill strength parameters, which leads to excessively low calculated failure probabilities and overestimates of slope-stability reliability; in practice, the optimal copula function should be selected to build the joint probability distribution model, and seismic ground-motion randomness must be addressed in addition to parameter randomness.
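A minimal sketch of the kind of copula-based joint strength model described: two correlated, non-normal marginals (say, friction angle and cohesion) coupled through a Gaussian copula and pushed through a toy limit-state function. The marginals, correlation, and limit state are all illustrative assumptions, not the paper's:

```python
import numpy as np
from scipy.stats import norm, lognorm

rng = np.random.default_rng(7)

RHO = -0.4   # hypothetical correlation between friction angle and cohesion
N = 100_000

# Gaussian copula: correlated standard normals -> uniforms -> target marginals.
L = np.linalg.cholesky(np.array([[1.0, RHO], [RHO, 1.0]]))
z = rng.standard_normal((N, 2)) @ L.T
u = norm.cdf(z)

phi = norm.ppf(u[:, 0], loc=38.0, scale=3.0)    # friction angle (deg), normal
coh = lognorm.ppf(u[:, 1], s=0.3, scale=60.0)   # cohesion (kPa), lognormal

# Hypothetical limit state: failure when the safety margin g goes negative.
g = 0.8 * phi + 0.05 * coh - 32.0
print("P_f ~", (g < 0).mean())
```

Sampling from normal marginals with the same means and variances instead would miss the skew of the lognormal cohesion, illustrating how the choice of joint model shifts the computed failure probability.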


2021 ◽  
Author(s):  
Mohamed Atia ◽  
Ahmed Abdelkhalek ◽  
Anjan Sarkar ◽  
Matt Keys ◽  
Mahesh Patel ◽  
...  

Abstract: Managing a large fleet of offshore structures is a dynamic process that aims to minimise risks to personnel, the environment, and the business, as well as the associated Operations Expenditure. Through the collaborative efforts of ADNOC Offshore and Kent, formerly Atkins Oil & Gas (Atkins, 2020), revised structural evaluation and integrity approaches have yielded significant cost savings. The considerable savings came from eliminating the requirement to install many new offshore structures and from reducing subsea inspection effort. The approach for evaluating the offshore assets' structural performance was developed by adopting target probability-of-failure figures subject to each asset's consequence of failure. Accordingly, structural reliability analyses were conducted for each structure, considering structure-specific environmental hazard curves and failure surfaces. By mapping the evaluated structural probability of failure to the HSE Likelihood in ADNOC's corporate risk matrix, each structure was precisely placed on the risk matrix. Furthermore, inspection intervals and Topsides, Splash Zone, and Subsea Level I, II, and III inspections were mapped to each risk evaluation on the risk matrix. The optimisation approach of adopting structure-specific reliability analysis mapped to ADNOC's corporate risk matrix yielded considerable cost benefits while providing a more accurate representation of each asset's risk. As a result of implementing the developed process, approximately 41% of the assets received a lower risk evaluation than under the legacy approach and showed extra structural capacity that can be utilised for future expansions, eliminating the requirement to install new assets. As the process expanded to include asset inspections, subsea inspection requirements fell by approximately 43%, reflecting a considerable decrease in operating costs. A major contribution to the risk improvement is attributed to consideration of the prevailing storm approach directions, the joint probability of wave and current magnitudes and directions, and the relative alignment of each structure. The developed approaches provide a framework that allows continuous updating of the risk assessment and enables executives and management to make risk-based decisions supported by a consistent measure of structural risk. This has been translated into Structural Passports (summary reports) that clearly demonstrate each asset's current risk and recommend mitigation measures where required.
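The structure-specific reliability analyses described above amount, in outline, to convolving an environmental hazard curve with a failure surface (fragility). A minimal sketch for a single hazard variable; the hazard model, fragility curve, and numbers are illustrative assumptions and do not reflect ADNOC's actual models:

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.stats import genextreme, norm

# Hypothetical annual-maximum wave-height hazard (GEV) and a lognormal
# fragility curve in wave height.
h = np.linspace(0.1, 25.0, 2000)                         # wave height grid (m)
hazard_pdf = genextreme.pdf(h, c=-0.1, loc=6.0, scale=1.5)
fragility = norm.cdf((np.log(h) - np.log(14.0)) / 0.25)  # P(failure | h)

# Annual P_f = integral over h of P(failure | h) * p(h), then compared
# against the consequence-dependent target probability of failure.
p_f = trapezoid(fragility * hazard_pdf, h)
print(f"annual probability of failure ~ {p_f:.2e}")
```

The fleet approach described in the abstract extends this to joint wave-current magnitudes and directions and to structure orientation, which is where much of the reported risk improvement comes from.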

