Analyzing Risk through Probabilistic Modeling in Operations Research - Advances in Logistics, Operations, and Management Science
Latest Publications

TOTAL DOCUMENTS: 16 (five years: 0)
H-INDEX: 2 (five years: 0)
Published by: IGI Global
ISBN: 9781466694583, 9781466694590

Author(s):  
Poornima Balakrishna ◽  
Sherry Smith Borener ◽  
Ian Crook ◽  
Alan Durston ◽  
Mindy J. Robinson

When making policy, procedural, or technological changes to a complex system that has safety implications, a key question decision makers must answer is: What are the risks to the users of the system that will result from making these changes to the system? This chapter illustrates a method to explore different facets of this question using mathematical modeling and probabilistic risk assessment techniques, with the objective of assessing the safety impact of changes to the National Airspace System that follow from the Federal Aviation Administration's next generation air traffic modernization program. The authors describe the development of an Integrated Safety Assessment Model as a structured approach to evaluating current and emerging risks in National Airspace System operations. This process addresses the previously stated risk question by combining fault tree and event sequence diagram modeling techniques, hazard identification and analysis methods, opinions from subject matter experts, and concepts from business intelligence.
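The fault-tree and event-sequence-diagram combination described above can be sketched in a few lines. This is a minimal illustrative example, not the ISAM model: the event names, probabilities, and tree structure below are invented, and basic events are assumed independent.

```python
def or_gate(probs):
    """P(at least one event occurs) = 1 - prod(1 - p_i), assuming independence."""
    out = 1.0
    for p in probs:
        out *= (1.0 - p)
    return 1.0 - out

def and_gate(probs):
    """P(all events occur) = prod(p_i), assuming independence."""
    out = 1.0
    for p in probs:
        out *= p
    return out

# Hypothetical fault tree: the initiating event occurs if the surveillance
# feed fails AND (controller workload is excessive OR conflict alerting fails).
p_surveillance = 1e-4
p_workload     = 2e-3
p_alert        = 5e-4
p_initiating = and_gate([p_surveillance, or_gate([p_workload, p_alert])])

# Event sequence diagram: given the initiating event, a pivotal event
# (e.g. successful recovery) splits the sequence into end states.
p_recovery = 0.95
p_accident = p_initiating * (1.0 - p_recovery)
print(f"initiating event probability: {p_initiating:.3e}")
print(f"accident end-state probability: {p_accident:.3e}")
```

In a full assessment the gate probabilities would come from hazard analysis and subject-matter-expert elicitation rather than point guesses like these.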


Author(s):  
Dariusz Jacek Jakóbczak

The proposed method, called Probabilistic Nodes Combination (PNC), performs 2D curve interpolation and extrapolation using a set of key points (knots or nodes). Nodes can be treated as characteristic points of the data for modeling and analysis. The data model is built by choosing a probability distribution function and a nodes combination. PNC modeling via nodes combination and the parameter γ as a probability distribution function enables value anticipation in risk analysis and decision making. A two-dimensional curve is interpolated and extrapolated via nodes combination and various discrete or continuous probability distribution functions: polynomial, sine, cosine, tangent, cotangent, logarithm, exponent, arc sine, arc cosine, arc tangent, arc cotangent, or power function. The novelty of the paper consists of two generalizations: a generalization of the previous MHR method with various nodes combinations, and a generalization of linear interpolation with different (non-basic) probability distribution functions and nodes combinations.
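The core idea, blending neighboring nodes with a chosen distribution-like function rather than a linear weight, can be sketched as follows. This is a simplified illustration of the general principle, not the published PNC formulas; the node coordinates and weighting functions are example choices.

```python
import math

def interpolate(nodes, F, samples_per_span=10):
    """Interpolate between consecutive (x, y) nodes, weighting the ordinate
    with a function F: [0, 1] -> [0, 1] instead of the linear weight a."""
    pts = []
    for (x1, y1), (x2, y2) in zip(nodes, nodes[1:]):
        for k in range(samples_per_span + 1):
            a = k / samples_per_span           # parameter in [0, 1]
            w = F(a)                           # distribution function value
            pts.append((x1 + a * (x2 - x1),    # abscissa moves linearly
                        (1 - w) * y1 + w * y2))  # ordinate blended via F
    return pts

nodes = [(0.0, 1.0), (1.0, 3.0), (2.0, 2.0)]

linear = interpolate(nodes, lambda a: a)                      # classical case
sine   = interpolate(nodes, lambda a: math.sin(a * math.pi / 2))
power  = interpolate(nodes, lambda a: a ** 2)
```

Because F(0) = 0 and F(1) = 1 for each weighting function above, every interpolated curve still passes through the given nodes; only the shape between nodes changes.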


Author(s):  
Hao Peng

Condition-Based Maintenance (CBM) is one type of preventive maintenance policy. CBM has attracted considerable attention from both academia and industry due to the development of advanced sensor technology and measurement devices. Proper implementation of CBM can reduce the frequency of random failures and the expected cost of maintenance over the lifecycle of a system. In this chapter, a brief overview of different maintenance strategies is first provided for readers who are unfamiliar with maintenance optimization models. Several elementary CBM models are then introduced to give readers a general idea of the optimization models in this field.
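A minimal CBM policy can be simulated in a few lines. The sketch below uses invented parameters, not a model from the chapter: degradation accumulates randomly each period; crossing a control limit triggers a cheap preventive replacement, while reaching the failure level triggers a costly corrective one.

```python
import random

def simulate_cost_rate(limit, failure_level=10.0, c_prev=1.0, c_corr=5.0,
                       periods=100_000, seed=42):
    """Estimate the long-run maintenance cost per period for a given limit."""
    rng = random.Random(seed)
    level, total_cost = 0.0, 0.0
    for _ in range(periods):
        level += rng.expovariate(2.0)     # random degradation increment
        if level >= failure_level:        # failure: corrective replacement
            total_cost += c_corr
            level = 0.0
        elif level >= limit:              # high wear detected at inspection:
            total_cost += c_prev          # preventive replacement
            level = 0.0
    return total_cost / periods

# A tighter limit gives more frequent but cheaper interventions; the CBM
# optimization problem is choosing the limit that minimizes the cost rate.
for limit in (4.0, 6.0, 8.0):
    print(limit, round(simulate_cost_rate(limit), 4))
```

The elementary models in the chapter formalize exactly this trade-off analytically rather than by simulation.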


Author(s):  
Katarzyna Beata Rostek

The issue of risk has accompanied human activity for thousands of years. Over this time, people have learned to define, describe, and deal with risk in ways that mitigate its effects as much as possible. That does not mean that they have succeeded, or ever will succeed, in eliminating risk. Risk was, is, and will remain a key concern for organizations and business entities, so activities aimed at preventing its occurrence and limiting its consequences are required. For such actions to be effective, it is necessary to understand risk and how to handle it. The chapter therefore focuses on methodical and practical aspects of risk management and of responding to risk when it materializes.


Author(s):  
Franco Caron

The capability to elaborate a reliable estimate at completion from the early stages of project execution is a prerequisite for effective project control. The non-repetitive and uncertain nature of projects and the involvement of multiple stakeholders make it necessary to exploit all available knowledge sources in order to provide a reliable forecast. Therefore, drawing on a set of case studies, this chapter proposes a Bayesian approach to support the elaboration of the estimate at completion in industrial fields where projects are characterized by uncertainty and complexity. The Bayesian approach integrates experts' opinions, data records from past projects, and data on the current performance of the ongoing project. Data from past projects are selected through a similarity analysis. The proposed approach shows higher accuracy than the basic formulas typical of the Earned Value Management (EVM) methodology.
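The flavor of such a Bayesian update can be sketched with a deliberately simplified model (my own simplification, not the authors' exact formulation): treat the Cost Performance Index (CPI) as normally distributed with known variance, use similar past projects as the prior, update with the ongoing project's observations, and compare the resulting estimate at completion (EAC) with the basic EVM formula EAC = BAC / CPI. All numbers are illustrative.

```python
def posterior_mean(prior_mean, prior_var, observations, obs_var):
    """Conjugate normal update with known observation variance."""
    n = len(observations)
    xbar = sum(observations) / n
    precision = 1.0 / prior_var + n / obs_var
    return (prior_mean / prior_var + n * xbar / obs_var) / precision

BAC = 1_000.0                              # budget at completion (illustrative)
past_cpi_mean, past_cpi_var = 0.92, 0.01   # prior from similar past projects
current_cpi = [1.05, 1.02, 0.99]           # early, possibly optimistic, data

cpi_bayes = posterior_mean(past_cpi_mean, past_cpi_var, current_cpi, 0.04)
cpi_evm = sum(current_cpi) / len(current_cpi)

print(f"EVM   EAC = {BAC / cpi_evm:8.1f}")    # relies on current data only
print(f"Bayes EAC = {BAC / cpi_bayes:8.1f}")  # shrunk toward past projects
```

Early in a project, when few observations are available, the posterior stays close to the prior built from similar past projects, which is precisely why the Bayesian forecast is more stable than the raw EVM formula.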


Author(s):  
William P. Fox

This chapter discusses the use of mathematical modeling with technology for risk assessment in the broad area of operations research. The authors present modeling as a process and illustrate suggested steps in that process. The chapter reviews some of the main modeling texts and provides a brief discussion of their processes. Many illustrative examples are provided to show the breadth of mathematical modeling, covering such topics as discrete dynamical systems, game theory, multi-attribute decision making, data envelopment analysis with linear programming, and integer programming. The authors discuss the importance of sensitivity analysis, as applicable. Several scenarios are used as illustrative examples of the process.


Author(s):  
Paulo Afonso ◽  
Victor J. Jiménez

One measure of an organization's performance is the cost of its products, as this may indicate the level of efficiency and help define the firm's business strategy. Nevertheless, companies face an increasingly fast-changing environment in which variability in processes, products, technology, and prices, among other variables, affects organizational performance. In such a changing environment, product and service costs may themselves change over time, and deterministic costing models may be inappropriate. This chapter therefore proposes a cost-calculation model that considers the variability of endogenous and exogenous cost variables, combining the logic of a two-stage costing model with Monte Carlo simulation. The proposed model may allow, to some extent, prediction of the risk associated with cost variability and may support the steps needed to manage that risk, whether through process rationalization or through cost management.
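The combination of a two-stage costing logic with Monte Carlo simulation can be sketched as follows. The cost structure, distributions, and figures are invented for illustration, not taken from the chapter.

```python
import random

def unit_cost(rng):
    # Stage 1: resources -> activities (driver shares are uncertain)
    labour = rng.normalvariate(50_000, 5_000)     # exogenous wage variability
    machine = rng.normalvariate(30_000, 2_000)
    setup_share = min(max(rng.normalvariate(0.3, 0.05), 0.0), 1.0)
    activity_setup = setup_share * labour
    activity_run = (1 - setup_share) * labour + machine
    # Stage 2: activities -> product (production volume is uncertain too)
    volume = max(rng.normalvariate(10_000, 1_500), 1.0)
    return (activity_setup + activity_run) / volume

rng = random.Random(7)
costs = sorted(unit_cost(rng) for _ in range(20_000))
mean = sum(costs) / len(costs)
p95 = costs[int(0.95 * len(costs))]
print(f"expected unit cost ~ {mean:.2f}, 95th percentile ~ {p95:.2f}")
```

Instead of a single deterministic unit cost, the simulation yields a distribution, from which cost risk (here, the 95th percentile) can be read directly.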


Author(s):  
Nan Hu ◽  
Haojie Cheng

As large banks increasingly aim to select the customers of highest benefit, it is important for them to know not only whether but also when a customer will default. Survival analysis has been used to estimate the over-time risk of default or early payoff, two major risks for banks; its major benefit is that it easily handles censoring and competing risks. The ROC curve, as a statistical tool, has been applied to evaluate credit-scoring systems. Traditional ROC analysis allows banks to evaluate whether a credit-scoring system correctly classifies customers based on their cross-sectional default status, but it fails when a credit-scoring system must be assessed at a series of future time points, especially in the presence of censoring or competing risks. Time-dependent ROC analysis was introduced by Hu and Zhou to evaluate credit-scoring systems in a time-varying fashion, and it allows credit-scoring systems to be assessed for predicting default by any time within the study period.
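A single point of a time-dependent ROC curve can be sketched naively. This is not the Hu-Zhou estimator: at horizon t, "cases" are customers who defaulted by t and "controls" are those still event-free at t, while customers censored before t are simply dropped here; the proper estimator handles them without this bias. The scores and times below are invented.

```python
def roc_point(scores, times, events, t, cutoff):
    """Naive (sensitivity, 1 - specificity) at horizon t for a score cutoff."""
    tp = fp = pos = neg = 0
    for s, time, d in zip(scores, times, events):
        if time <= t and d == 1:          # defaulted by t: case
            pos += 1
            tp += s >= cutoff
        elif time > t:                    # event-free at t: control
            neg += 1
            fp += s >= cutoff
        # censored before t: skipped in this naive sketch
    return tp / pos, fp / neg

scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1]   # higher = riskier
times  = [  6,  12,  30,  18,  36,   9,  40,  48]   # months to event/censoring
events = [  1,   1,   0,   1,   1,   0,   0,   0]   # 1 = default observed

sens, one_minus_spec = roc_point(scores, times, events, t=24, cutoff=0.5)
```

Sweeping the cutoff traces the ROC curve at horizon t; repeating this over a grid of horizons gives the time-varying assessment the chapter describes.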


Author(s):  
Erio Castagnoli ◽  
Gino Favero ◽  
Paola Modesti

Internal consistency—or “coherence”—of a price system is the basis of several key concepts in many fields, such as subjective probability (in Probability Theory), no-arbitrage pricing, and risk measures (in Mathematical Finance). Furthermore, Actuarial Mathematics uses coherence to describe the analytical form of risk premia, and an analogous approach has recently been proposed for firms' valuation. Technically, it amounts to a characterisation of functionals with particular properties (a typical goal in Functional Analysis), which translates into a numerical representation of preferences along the traditional guidelines of Decision Theory, whose analogies with Mathematical Finance are numerous and striking. This chapter explores these connections.
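To make one of these functional properties concrete, the sketch below checks a single coherence axiom, subadditivity rho(X + Y) <= rho(X) + rho(Y), empirically for Expected Shortfall on simulated loss scenarios. The risk measure, data, and parameters are a standard textbook example chosen by me, not material from the chapter.

```python
import random

def expected_shortfall(losses, alpha=0.95):
    """Average of the worst (1 - alpha) fraction of losses."""
    srt = sorted(losses, reverse=True)
    k = max(1, int(round(len(srt) * (1 - alpha))))
    return sum(srt[:k]) / k

rng = random.Random(0)
X = [rng.gauss(0, 1) for _ in range(10_000)]
Y = [rng.gauss(0, 2) for _ in range(10_000)]
XY = [x + y for x, y in zip(X, Y)]

# Subadditivity: diversification never increases a coherent risk measure.
assert expected_shortfall(XY) <= expected_shortfall(X) + expected_shortfall(Y)
```

Expected Shortfall satisfies all four coherence axioms (monotonicity, translation invariance, positive homogeneity, subadditivity), whereas Value at Risk famously fails subadditivity; this is the kind of functional characterization the chapter develops.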


Author(s):  
Bibhas Chandra Giri

In the traditional inventory management literature, it is quite common to assume that the probability distribution of stochastic demand is completely known to the decision maker. In reality, however, there is ample evidence that the demand distribution is often not known with certainty. To cope with this practical situation, it is therefore necessary to investigate inventory models with incomplete information. This chapter studies a simple single-period newsboy problem in which the decision maker is risk-averse and the demand information is not perfectly known to him/her. We derive a forecast cost for the period based on the sample observations used to set the value of an unknown parameter of the distribution, and we analyze the significance of risk aversion for the optimal decisions. From a numerical study, we observe that the expected forecast cost increases when less information about demand is available, and that a risk-averse inventory manager incurs a higher cost than a risk-neutral manager.
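As a point of reference for the risk-averse analysis, the classic risk-neutral newsboy solution with fully known demand can be sketched in a few lines. This is the standard textbook benchmark, not the chapter's model: with underage cost cu and overage cost co, the optimal order quantity q* satisfies F(q*) = cu / (cu + co). The normal demand parameters below are illustrative.

```python
from statistics import NormalDist

def newsvendor_q(cu, co, mean, sd):
    """Optimal order quantity for normally distributed demand."""
    critical_fractile = cu / (cu + co)
    return NormalDist(mean, sd).inv_cdf(critical_fractile)

# Example: selling price 10, unit cost 6, salvage value 2 -> cu = 4, co = 4,
# so the critical fractile is 0.5 and the order equals the mean demand.
q = newsvendor_q(cu=4.0, co=4.0, mean=100.0, sd=20.0)
print(round(q, 1))
```

The chapter departs from this benchmark in two ways: the distribution parameter must be estimated from a sample, and the manager's risk aversion pulls the decision away from the critical-fractile quantity.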

