Probabilistic Inference Hybrid IT Value Model Using Bayesian Network

Author(s):
I Gusti Bagus Baskara Nugraha
Imaniar Ramadhani
Jaka Sembiring
...

In this study, we propose a probabilistic inference model for a hybrid IT value model using a Bayesian Network (BN) that represents the uncertain relationships among the 13 variables of the model: performance, market, innovation, IT support, core competence, capabilities, knowledge, human resources, IT development, IT resources, capital, labor, and IT spending. The relationships between variables, including their structure, nature, and direction, are determined using a probabilistic approach. We derive a probabilistic graphical model and measure the relationships between variables. The results of this study show that, under the probabilistic approach with a Bayesian Network, capabilities and core competence are the most important variables for producing high performance output.
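
Inference of this kind can be reproduced with any discrete-BN library. Below is a minimal sketch in Python with pgmpy, assuming a three-node fragment of such a network (capabilities and core competence as parents of performance) with invented probabilities; it illustrates the inference mechanics only, not the paper's structure or parameterisation.

```python
# Minimal sketch (not the authors' model): query a three-node BN fragment.
# All structure and probability values below are illustrative assumptions.
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

model = BayesianNetwork([("capabilities", "performance"),
                         ("core_competence", "performance")])

# Prior beliefs over the parent variables (states: 0 = low, 1 = high).
cpd_cap = TabularCPD("capabilities", 2, [[0.5], [0.5]])
cpd_core = TabularCPD("core_competence", 2, [[0.5], [0.5]])

# P(performance | capabilities, core_competence); columns enumerate
# the parent state combinations.
cpd_perf = TabularCPD(
    "performance", 2,
    [[0.9, 0.6, 0.5, 0.1],   # P(performance = low  | parents)
     [0.1, 0.4, 0.5, 0.9]],  # P(performance = high | parents)
    evidence=["capabilities", "core_competence"],
    evidence_card=[2, 2])

model.add_cpds(cpd_cap, cpd_core, cpd_perf)

# Posterior probability of high performance given high capabilities.
posterior = VariableElimination(model).query(
    ["performance"], evidence={"capabilities": 1})
print(posterior)
```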

Author(s):  
Andrey Chukhray ◽  
Olena Havrylenko

The subject of research in this article is the process of intelligent computer-based training in engineering skills. The aim is to model the process of teaching engineering skills in intelligent computer training programs through dynamic Bayesian networks. Objectives: to propose an approach to modeling the process of teaching engineering skills; to assess the student's competence level by considering algorithm-development skills in engineering tasks and the ability to implement the algorithms; to create a dynamic Bayesian network structure for the learning process; to select values for the conditional probability tables; to solve the problems of filtering, forecasting, and retrospective analysis; and to simulate the developed dynamic Bayesian network in the GeNIe 2.0 environment. The methods used are probability theory and inference methods in Bayesian networks. The following results are obtained: a dynamic Bayesian network for the educational process based on the solution of engineering problems is developed. Mathematical calculations for probabilistic inference problems such as filtering, forecasting, and smoothing are considered. Solving the filtering problem makes it possible to assess the student's current competence level after observing the latest evidence about the development of the algorithm and its numerical calculations for the task. The probability distribution of the learning-process model is predicted, and the number of additional iterations required to reach the required competence level is estimated. The retrospective analysis yields a smoothed assessment of the competence level attained after completion of the previous task instance, once new evidence from the two checkpoints has been incorporated. Solving the described probabilistic inference problems makes it possible to provide correct information about the learning process to intelligent computer training systems, supporting proper feedback and tracking of the student's competence level. The developed probabilistic-inference kernel can be used as the basis of a decision-making model for an automated training process. The scientific novelty lies in applying dynamic Bayesian networks to a new class of problems: the simulation of engineering-skills training in the process of performing algorithmic tasks.
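
The filtering step referred to here is the standard forward recursion of a dynamic Bayesian network. A minimal sketch, assuming a single binary competence variable observed through pass/fail task outcomes; the transition and observation probabilities are invented for illustration, not taken from the paper:

```python
import numpy as np

# One binary latent variable "competent" evolving over task attempts,
# observed through pass/fail outcomes. All numbers are assumptions.
T = np.array([[0.8, 0.2],   # P(competent_t | not competent_{t-1}) in row 0
              [0.0, 1.0]])  # once competent, stays competent

O = np.array([[0.7, 0.3],   # P(fail | state), P(pass | state): not competent
              [0.1, 0.9]])  # same likelihoods when competent

belief = np.array([0.9, 0.1])   # prior: probably not yet competent
observations = [1, 0, 1, 1]     # pass, fail, pass, pass

for obs in observations:
    belief = T.T @ belief        # predict: propagate through transition model
    belief *= O[:, obs]          # update: weight by observation likelihood
    belief /= belief.sum()       # normalise
    print(f"P(competent) = {belief[1]:.3f}")
```

Forecasting repeats the predict step without updates, and smoothing adds a backward pass; both reuse the same two matrices.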


2021
Author(s):
Sophie Mentzel
Merete Grung
Knut Erik Tollefsen
Marianne Stenrod
Karina Petersen
...

Conventional environmental risk assessment of chemicals is based on a calculated risk quotient, representing the ratio of exposure to effects of the chemical, in combination with assessment factors to account for uncertainty. Probabilistic risk assessment approaches can offer more transparency by using probability distributions for exposure and/or effects to account for variability and uncertainty. In this study, a probabilistic approach using Bayesian network (BN) modelling is explored as an alternative to traditional risk calculation. BNs can serve as meta-models that link information from several sources and offer a transparent way of incorporating the characterization of uncertainty required for environmental risk assessment. To this end, a BN has been developed and parameterised for the pesticides azoxystrobin, metribuzin, and imidacloprid. We illustrate the development from deterministic (traditional) risk calculation, via intermediate versions, to fully probabilistic risk characterisation, using azoxystrobin as an example. We also demonstrate seasonal risk calculation for the three pesticides.
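
The step from a deterministic risk quotient to a probabilistic one can be illustrated with a simple Monte Carlo calculation, in which exposure and effect concentrations are drawn from distributions rather than fixed. A sketch under assumed lognormal distributions; the parameter values are invented, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Assumed lognormal distributions for exposure and effect concentration
# (e.g. in micrograms per litre); parameters are illustrative only.
exposure = rng.lognormal(mean=np.log(0.5), sigma=0.8, size=n)
effect = rng.lognormal(mean=np.log(5.0), sigma=0.6, size=n)

rq = exposure / effect                        # distribution of risk quotients
print(f"P(RQ > 1) = {(rq > 1).mean():.4f}")   # probability of risk
print(f"median RQ = {np.median(rq):.3f}")

# Deterministic counterpart: point estimates plus an assessment factor
# (here 100) applied to the effect concentration.
rq_det = 0.5 / (5.0 / 100)
print(f"deterministic RQ = {rq_det:.2f}")
```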


Author(s):  
Ahmad Bashir
Latifur Khan
Mamoun Awad

A Bayesian network is a graphical model that encodes probabilistic relationships among the variables of a system. The basic components of a Bayesian network are a set of nodes, each representing a unique variable in the system; edges, which graphically indicate the relationships between nodes; and associated probability values. Using these probabilities, termed conditional probabilities, together with the network structure, we can reason about the system and calculate unknown probabilities. Furthermore, Bayesian networks have distinct advantages compared to other methods, such as neural networks, decision trees, and rule bases, which we shall discuss in this paper.
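
The kind of reasoning described, computing an unknown probability from known conditional probabilities, reduces to the law of total probability plus Bayes' rule. A minimal worked example with invented numbers for a two-node network (Cause → Effect):

```python
# Two-node Bayesian network: Cause -> Effect, with invented probabilities.
p_cause = 0.2                       # P(Cause)
p_effect_given_cause = 0.9          # P(Effect | Cause)
p_effect_given_not_cause = 0.1      # P(Effect | not Cause)

# Marginal probability of the effect (law of total probability).
p_effect = (p_effect_given_cause * p_cause
            + p_effect_given_not_cause * (1 - p_cause))

# Diagnostic reasoning: probability of the cause given the effect (Bayes' rule).
p_cause_given_effect = p_effect_given_cause * p_cause / p_effect
print(f"P(Cause | Effect) = {p_cause_given_effect:.3f}")   # about 0.692
```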


Author(s):  
H R Williams ◽  
R S Trask ◽  
I P Bond

Design and certification of novel self-healing aerospace structures were explored by reviewing the suitability of conventional deterministic certification approaches. A sandwich structure with a vascular-network self-healing system was used as a case study. A novel probabilistic approach using a Monte Carlo method to generate an overall probability of structural failure yields notable new insights into the design of self-healing systems, including a drive for a faster healing time of less than two flight hours. In the case study considered, a mature self-healing system could be expected to reduce the probability of structural failure (compared with a conventional damage-tolerant construction) by almost an order of magnitude. In a risk-based framework this could be traded against simplified maintenance activity (to save cost) and/or increased allowable stress (to allow a lighter structure). The first estimate of the increase in design allowable stresses permitted by a self-healing system is around 8 per cent, with the self-healing system itself much lighter than previously envisaged. It is thought that these methods and conclusions could have wider application to self-healing and conventional high-performance composite structures.
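
The Monte Carlo approach described, estimating an overall probability of structural failure, can be sketched as repeated random sampling of damage, load, and healing events. The model below is a deliberately simplified illustration with invented rates; the paper's actual damage, load, and healing models are far more detailed:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000                 # simulated flight histories

# Invented illustrative rates and times, not the paper's data.
p_damage = 1e-3               # probability a flight inflicts critical damage
heal_time = rng.uniform(0.5, 4.0, n)     # hours to complete healing
load_arrival = rng.exponential(10.0, n)  # hours until a limit-load event
                                         # strikes the damaged structure

damaged = rng.random(n) < p_damage
# Structure fails if a limit-load event arrives before healing completes.
failed = damaged & (load_arrival < heal_time)
print(f"P(failure per flight) = {failed.mean():.2e}")

# Faster healing shrinks the vulnerable window, which is why such an
# analysis drives toward healing in under two flight hours.
```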


1999
Vol 38 (01)
pp. 37-42
Author(s):
G. C. Sakellaropoulos
G. C. Nikiforidis

Abstract: The assessment of a head-injured patient's prognosis is a task that involves the evaluation of diverse sources of information. In this study we propose an analytical approach, using a Bayesian Network (BN), to combining the available evidence. The BN's structure and parameters are derived by learning techniques applied to a database (600 records) of seven clinical and laboratory findings. The BN produces quantitative estimations of the prognosis after 24 hours for head-injured patients in the outpatients department. Alternative models are compared and their performance is tested against the success rate of an expert neurosurgeon.
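
Learning a BN's structure and parameters from patient records, as described here, is supported by standard libraries. A minimal sketch with pgmpy on synthetic binary data; the variable names and data are invented placeholders, not the study's clinical findings:

```python
import numpy as np
import pandas as pd
from pgmpy.estimators import HillClimbSearch, BicScore, MaximumLikelihoodEstimator
from pgmpy.models import BayesianNetwork

rng = np.random.default_rng(1)
# Synthetic stand-in for a 600-record database of binary findings.
pupil = rng.integers(0, 2, 600)
motor = rng.integers(0, 2, 600)
outcome = ((pupil + motor + rng.integers(0, 2, 600)) >= 2).astype(int)
data = pd.DataFrame({"pupil_reaction": pupil,
                     "motor_response": motor,
                     "outcome_24h": outcome})

# Structure learning by hill-climbing search under a BIC score...
structure = HillClimbSearch(data).estimate(scoring_method=BicScore(data))
model = BayesianNetwork(structure.edges())
model.add_nodes_from(data.columns)       # keep isolated variables too
# ...then parameter learning by maximum likelihood.
model.fit(data, estimator=MaximumLikelihoodEstimator)
print(model.edges())
```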


Author(s):  
Edmund Jones ◽  
Vanessa Didelez

In one procedure for finding the maximal prime decomposition of a Bayesian network or undirected graphical model, the first step is to create a minimal triangulation of the network. A common and straightforward way to do this is to create a triangulation that is not necessarily minimal and then thin it by removing excess edges. We show that the algorithm for thinning proposed in several previous publications is incorrect. A different version of this algorithm is available in the R package gRbase, but its correctness has not previously been proved. We prove that this version is correct and provide a simpler version, also with a proof. We compare the speed of the two corrected algorithms in three ways and find that asymptotically their speeds are the same; neither algorithm is consistently faster than the other, and in a computer experiment the algorithm used by gRbase is faster when the original graph is large, dense, and undirected, but usually slightly slower when it is directed.
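
The thinning idea can be sketched as follows: in a triangulated graph, a fill-in edge {u, v} can be removed while preserving triangulation when the common neighbours of u and v form a complete subgraph. The sketch below is a naive restatement of that criterion in Python with networkx; it is not a reproduction of the corrected gRbase algorithm or its complexity guarantees:

```python
import itertools
import networkx as nx

def thin_triangulation(H: nx.Graph, G: nx.Graph) -> nx.Graph:
    """Remove excess edges of triangulation H not present in original G.

    Naive sketch: repeatedly delete a fill-in edge whose endpoints'
    common neighbours form a clique, which keeps H triangulated.
    """
    H = H.copy()
    changed = True
    while changed:
        changed = False
        fill_edges = [e for e in H.edges() if not G.has_edge(*e)]
        for u, v in fill_edges:
            common = set(H[u]) & set(H[v])
            if all(H.has_edge(a, b)
                   for a, b in itertools.combinations(common, 2)):
                H.remove_edge(u, v)
                changed = True
    return H

# Example: a 4-cycle plus both chords; thinning drops the redundant one.
G = nx.cycle_graph(4)                              # 0-1-2-3-0, not chordal
H = G.copy()
H.add_edges_from([(0, 2), (1, 3)])                 # over-triangulated
print(sorted(thin_triangulation(H, G).edges()))    # one chord remains
```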


2011
Vol 20 (03)
pp. 375-400
Author(s):
Inés del Campo
Javier Echanobe
Koldo Basterretxea
Guillermo Bosque

This paper presents a scalable architecture suitable for the implementation of high-speed fuzzy inference systems on reconfigurable hardware. The main features of the proposed architecture, based on the Takagi–Sugeno inference model, are scalability, high performance, and flexibility. A scalable fuzzy inference system (FIS) must be efficient and practical when applied to complex situations, such as multidimensional problems with a large number of membership functions and a large rule base. Several current application areas of fuzzy computation require such enhanced capabilities to deal with real-time problems (e.g., robotics, automotive control, etc.). Scalability and high performance of the proposed solution have been achieved by exploiting the inherent parallelism of the inference model, while flexibility has been obtained by applying hardware/software codesign techniques to reconfigurable hardware. Latest-generation reconfigurable technologies, particularly field-programmable gate arrays (FPGAs), make it possible to implement the whole embedded FIS (e.g., processor core, memory blocks, peripherals, and specific hardware for fuzzy inference) on a single chip, with consequent savings in size, cost, and power consumption. As a prototyping example, we implemented a complex fuzzy controller for a vehicle semi-active suspension system, composed of four three-input FISs, on a single FPGA of the Xilinx Virtex-5 device family.
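
The Takagi–Sugeno inference that the architecture parallelises can be stated compactly: each rule fires with a strength given by the product of its membership degrees, and the output is the firing-strength-weighted average of the rules' linear consequents. A minimal software sketch with invented membership functions and consequents; the hardware design itself is of course beyond a few lines of Python:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def ts_inference(x1, x2):
    # Two inputs, two fuzzy sets each -> four rules (illustrative values).
    m1 = [tri(x1, -1, 0, 1), tri(x1, 0, 1, 2)]   # memberships of input 1
    m2 = [tri(x2, -1, 0, 1), tri(x2, 0, 1, 2)]   # memberships of input 2

    # First-order Takagi-Sugeno consequents: y = p*x1 + q*x2 + r per rule.
    consequents = [(1.0, 0.5, 0.0), (0.2, 1.0, 0.1),
                   (0.8, 0.3, -0.2), (0.5, 0.5, 0.3)]

    weights, outputs = [], []
    for i in range(2):
        for j in range(2):
            w = m1[i] * m2[j]                    # rule firing strength
            p, q, r = consequents[2 * i + j]
            weights.append(w)
            outputs.append(p * x1 + q * x2 + r)  # linear rule output

    # Weighted-average defuzzification; each rule evaluates independently,
    # which is the parallelism an FPGA implementation can exploit.
    return np.dot(weights, outputs) / (np.sum(weights) + 1e-12)

print(ts_inference(0.4, 0.7))
```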


2017
Author(s):
Nan Hua
Harianto Tjong
Hanjun Shin
Ke Gong
Xianghong Jasmine Zhou
...

Abstract: Hi-C technologies are widely used to investigate the spatial organization of genomes. However, the structural variability of the genome is a great challenge to interpreting ensemble-averaged Hi-C data, particularly for long-range/interchromosomal interactions. We pioneered a probabilistic approach for generating a population of distinct diploid 3D genome structures consistent with all the chromatin-chromatin interaction probabilities from Hi-C experiments. Each structure in the population is a physical model of the genome in 3D. Analysis of these models yields new insights into the causes and the functional properties of the genome’s organization in space and time. We provide a user-friendly software package, called PGS, that runs on local machines and high-performance computing platforms. PGS takes a genome-wide Hi-C contact frequency matrix and produces an ensemble of 3D genome structures entirely consistent with the input. The software automatically generates an analysis report, and also provides tools to extract and analyze the 3D coordinates of specific domains.
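
The core consistency requirement, that contact frequencies computed over the structure population match the input Hi-C matrix, can be illustrated generically. The sketch below is not the PGS API; it uses an invented random-walk ensemble and a placeholder Hi-C matrix purely to show the underlying check:

```python
import numpy as np

rng = np.random.default_rng(7)
n_structures, n_loci = 1000, 50
contact_radius = 1.0            # contact distance threshold (assumed)

# Invented ensemble: random-walk "chromosomes" as a stand-in for PGS output.
steps = rng.normal(scale=0.5, size=(n_structures, n_loci, 3))
coords = np.cumsum(steps, axis=1)         # shape: (structure, locus, xyz)

# Pairwise distances per structure, then ensemble contact frequency.
diff = coords[:, :, None, :] - coords[:, None, :, :]
dist = np.linalg.norm(diff, axis=-1)      # (structure, locus, locus)
model_freq = (dist < contact_radius).mean(axis=0)

# Consistency check against an input Hi-C contact frequency matrix
# (here a noisy placeholder); PGS optimises structures until these agree.
hic_freq = model_freq + rng.normal(scale=0.01, size=model_freq.shape)
print("max |model - Hi-C| =", np.abs(model_freq - hic_freq).max())
```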

