Sources of Uncertainty and Subjective Prices

Author(s): Veronica Cappelli, Simone Cerreia-Vioglio, Fabio Maccheroni, Massimo Marinacci, Stefania Minardi

Abstract We develop a general framework to study source-dependent preferences in economic contexts. We behaviorally identify two key features. First, we drop the assumption of uniform uncertainty attitudes and allow attitudes to depend on the source. Second, we introduce subjective prices to compare outcomes across different sources. Our model evaluates profiles source-wise by computing source-dependent certainty equivalents; these are converted into the unit of account of a common source and then aggregated into a single evaluation. By viewing time and location as instances of sources, we show that subjective discount factors and subjective exchange rates are emblematic examples of subjective prices. Finally, we use the model to explore the implications for optimal portfolio allocation and home bias.
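The source-wise evaluation described in the abstract can be sketched numerically. This is a toy illustration, not the paper's model: it assumes hypothetical CRRA utilities u(x) = x^(1-γ)/(1-γ) with a source-specific γ ≠ 1, and a given subjective price for each source.

```python
import numpy as np

def certainty_equivalent(outcomes, probs, gamma):
    # CRRA utility u(x) = x**(1 - gamma) / (1 - gamma); gamma != 1 is a
    # hypothetical choice standing in for a source-dependent attitude.
    eu = probs @ (outcomes ** (1 - gamma) / (1 - gamma))
    return (eu * (1 - gamma)) ** (1 / (1 - gamma))

def evaluate_profile(sources):
    # Source-wise evaluation: compute each source's certainty equivalent,
    # convert it to the common unit of account via the source's subjective
    # price, then aggregate into a single number.
    total = 0.0
    for s in sources:
        ce = certainty_equivalent(np.asarray(s["outcomes"], dtype=float),
                                  np.asarray(s["probs"], dtype=float),
                                  s["gamma"])
        total += s["price"] * ce
    return total

# Two hypothetical sources: "home" in the unit of account (price 1), and a
# "foreign" source converted at a subjective exchange rate of 0.5.
sources = [
    {"outcomes": [2.0, 4.0], "probs": [0.5, 0.5], "gamma": 0.0, "price": 1.0},
    {"outcomes": [10.0], "probs": [1.0], "gamma": 0.5, "price": 0.5},
]
value = evaluate_profile(sources)
```

A subjective discount factor plays exactly the role of `price` here when the sources are time periods rather than locations.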

Author(s): Gabrielle Gauthier Melançon, Philippe Grangier, Eric Prescott-Gagnon, Emmanuel Sabourin, Louis-Martin Rousseau

Despite advanced supply chain planning and execution systems, manufacturers and distributors tend to observe service levels below their targets owing to different sources of uncertainty and risk. These risks, such as drastic changes in demand, machine failures, or improperly configured systems, can lead to planning or execution issues in the supply chain. It is too expensive to have planners continually track every situation at a granular level to ensure that no deviations or configuration problems occur. We present a machine learning system that predicts service-level failures a few weeks in advance and alerts the planners. The system includes a user interface that explains the alerts and helps identify fixes. We conducted this research in cooperation with Michelin. Through experiments carried out over four phases, we confirmed that machine learning can help predict service-level failures. In our last experiment, planners used these predictions to make adjustments to tires for which failures were predicted, improving the service level by 10 percentage points. Additionally, the system enabled planners to identify recurrent issues in their supply chain, such as safety-stock computation problems, that affected overall supply chain efficiency. The proposed system showcases the importance of reducing silos in supply chain management.
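A minimal sketch of the kind of early-warning classifier such a system rests on. The features (weeks of safety-stock coverage, demand surprise, open machine alerts), the toy data, and the plain logistic regression are all hypothetical stand-ins; the abstract does not disclose Michelin's actual model or feature set.

```python
import numpy as np

# Toy training data: each row = (weeks of safety-stock coverage,
# demand surprise in %, open machine alerts); label 1 = service-level failure.
X = np.array([[4.0,  5.0, 0], [0.5, 40.0, 2], [3.0, 10.0, 0],
              [0.2, 55.0, 1], [5.0,  2.0, 0], [0.8, 35.0, 3]])
y = np.array([0, 1, 0, 1, 0, 1])

# Standardize features, then fit a logistic regression by gradient descent.
mu, sd = X.mean(axis=0), X.std(axis=0)
Z = (X - mu) / sd
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(2000):
    p = 1 / (1 + np.exp(-(Z @ w + b)))   # predicted failure probability
    g = p - y                             # gradient of the log loss
    w -= 0.1 * Z.T @ g / len(y)
    b -= 0.1 * g.mean()

def alert(features, threshold=0.5):
    """Flag an order line for planner review when predicted failure risk is high."""
    z = (np.array(features, dtype=float) - mu) / sd
    risk = 1 / (1 + np.exp(-(z @ w + b)))
    return bool(risk > threshold)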


2020, Vol 22 (2), pp. 126-132
Author(s): Yueh-Sheng Chen, Tin-Yun Liao, Tzu-Chun Hsu, Wan-Ting Hsu, ...

BACKGROUND: To determine the temporal trends of incidence and outcome for different sources of sepsis using a nationwide administrative database. METHODS: From 2002 to 2012, Taiwan's entire health insurance claims data on emergency-treated and hospital-treated sepsis were analysed for incidence and mortality trends. Patients with sepsis and the sources of sepsis were identified using a set of validated International Classification of Diseases, ninth revision, clinical modification (ICD-9-CM) codes. The 30-day all-cause mortality was verified through the linked death certificate database. RESULTS: A total of 1 259 578 episodes of sepsis were identified during the 11-year study period. Lower respiratory tract infection was the most common source of sepsis and carried the highest mortality rate. The incidence of genitourinary tract infection grew fastest. Sepsis mortality declined at different rates for each source of sepsis. Co-infections in patients with sepsis were associated with a higher mortality rate. CONCLUSION: The temporal trends of sepsis incidence and mortality varied among sources of sepsis, with the lower respiratory tract imposing the highest burden on patients with sepsis. Furthermore, the source of sepsis and the presence of co-infection are independent predictors of mortality. Our results support source-specific preventive and treatment strategies for future sepsis management.


2016, Vol 20 (5), pp. 1809-1825
Author(s): Antoine Thiboult, François Anctil, Marie-Amélie Boucher

Abstract. Seeking more accuracy and reliability, the hydrometeorological community has developed several tools to decipher the different sources of uncertainty in the relevant modeling processes. Among them, the ensemble Kalman filter (EnKF), multimodel approaches and meteorological ensemble forecasting have proved capable of improving upon deterministic hydrological forecasts. This study aims to untangle the sources of uncertainty by studying the combination of these tools and assessing their respective contributions to overall forecast quality. Each component captures a certain aspect of the total uncertainty and improves the forecast at a different stage of the forecasting process by different means. Their combination outperforms any of the tools used alone. The EnKF is shown to contribute largely to ensemble accuracy and dispersion, indicating that initial-condition uncertainty is dominant. However, it fails to maintain the required dispersion throughout the entire forecast horizon and needs to be supported by a multimodel approach to account for structural uncertainty. Moreover, the multimodel approach improves general forecasting performance and avoids the model selection pitfall, since the models differ strongly in their abilities. Finally, the use of probabilistic meteorological forcing was found to contribute mostly to long-lead-time reliability. Particular attention must be paid to how the tools are combined, especially when tuning the EnKF, so that the same errors are not accounted for twice.
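The EnKF component the abstract credits with correcting initial conditions can be illustrated with a minimal stochastic-EnKF analysis step (the perturbed-observations variant). The two-variable state, observation operator H, and error covariance R below are toy assumptions, not the study's hydrological setup.

```python
import numpy as np

def enkf_update(ensemble, y, H, R, rng):
    """Stochastic EnKF analysis: nudge each forecast member toward a perturbed
    observation, with a gain built from the ensemble forecast covariance."""
    n, m = ensemble.shape                                  # state dim, members
    A = ensemble - ensemble.mean(axis=1, keepdims=True)    # ensemble anomalies
    Pf = A @ A.T / (m - 1)                                 # forecast covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)         # Kalman gain
    noise = rng.multivariate_normal(np.zeros(len(y)), R, size=m).T
    return ensemble + K @ (y[:, None] + noise - H @ ensemble)

rng = np.random.default_rng(0)
fc = np.array([[1.0, 2.0, 3.0, 4.0],      # observed state variable
               [0.5, 1.0, 1.5, 2.0]])     # unobserved state variable
H = np.array([[1.0, 0.0]])                # we observe only the first variable
R = np.array([[0.01]])                    # small observation error variance
an = enkf_update(fc, np.array([10.0]), H, R, rng)
```

The unobserved variable is corrected too, through its ensemble cross-covariance with the observed one; that is what lets the filter update internal hydrological states from streamflow-like observations.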


2019, Vol 29 (8), pp. 1344-1378
Author(s): Tomer Libal, Marco Volpe

One of the main issues in proof certification is that different theorem provers, even when designed for the same logic, tend to use different proof formalisms and to produce outputs in different formats. The ProofCert project promotes the use of a common specification language and of a small, trusted kernel in order to check proofs coming from different sources and for different logics. Building on that idea and using a classical focused sequent calculus as a kernel, we propose here a general framework for checking modal proofs. We present an implementation of the framework in a Prolog-like language and show how it can be specialized in a simple and modular way to cover different proof formalisms, such as labelled systems, tableaux, sequent calculi and nested sequent calculi. We illustrate the method for the logic K by providing several examples and discuss how to extend the approach further.


2018, Vol 40, pp. 06026
Author(s): Antje Bornschein

Dam break wave simulation provides data for emergency management, so the calculation results should be as accurate as possible. The modeler, however, has to deal with different sources of uncertainty. The paper presents dam break calculations for three different dams in order to assess the uncertainty due to the chosen model (1D or 2D), different terrain models and different Manning's n values. The comparison of the results focuses on the maximum discharge, the maximum water level and the flood wave arrival time.


Web Mining, 2011, pp. 276-306
Author(s): Honghua Dai

Web usage mining has been used effectively as an approach to automatic personalization and as a way to overcome the deficiencies of traditional approaches such as collaborative filtering. Despite their success, such systems, like more traditional ones, do not take into account semantic knowledge about the underlying domain. Without such knowledge, personalization systems cannot recommend different types of complex objects based on their underlying properties and attributes, nor can they automatically explain or reason about the user models or recommendations. The integration of semantic knowledge is, in fact, the primary challenge for the next generation of personalization systems. In this chapter we provide an overview of approaches for incorporating semantic knowledge into Web usage mining and personalization processes. In particular, we discuss the issues and requirements for successfully integrating semantic knowledge from different sources, such as the content and the structure of Web sites, for personalization. Finally, we present a general framework for fully integrating domain ontologies with Web usage mining and personalization processes at different stages, including the preprocessing and pattern discovery phases, as well as the final stage, in which the discovered patterns are used for personalization.


2005, Vol 62 (9), pp. 3303-3319
Author(s): Jean-Louis Dufresne, Richard Fournier, Christophe Hourdin, Frédéric Hourdin

Abstract The net exchange formulation (NEF) is an alternative to the usual radiative transfer formulation. It was proposed by two authors in 1967, but until now this formulation has been used in only a very few atmospheric studies. The aim of this paper is to present the NEF and its main advantages, and to illustrate them in the case of planet Mars. In the NEF, radiative fluxes are no longer the basic quantities; the basic variables are the net exchange rates between each pair of atmospheric layers (i, j). The NEF offers a meaningful matrix representation of radiative exchanges, allows identification of the dominant contributions to the local heating rates, and provides a general framework for developing approximations that satisfy the reciprocity of radiative transfer as well as the first and second principles of thermodynamics. This may be very useful for developing fast radiative codes for GCMs. A radiative code developed along those lines is presented for a GCM of Mars. It is shown that computing the most important optical exchange factors at each time step and the other exchange factors only a few times a day strongly reduces the computation time without any significant loss of precision. With this solution, the computation time increases proportionally to the number N of vertical layers and no longer to its square N². Some specific points are also investigated, such as numerical instabilities that may appear in the high atmosphere and errors that may be introduced if reflection at the surface is treated inappropriately.
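The matrix representation the abstract describes can be sketched directly. This minimal example assumes the convention that psi[i, j] is the net radiative power layer i gains from layer j, so reciprocity makes the matrix antisymmetric and the row sums give the layer heating rates; the 3-layer numbers are invented for illustration.

```python
import numpy as np

def heating_rates(psi):
    """Heating rate of each layer from a net exchange matrix.

    psi[i, j] is the net power layer i gains from layer j. Reciprocity of
    radiative transfer forces psi[j, i] = -psi[i, j], which automatically
    guarantees energy conservation among the layers (first principle):
    the heating rates sum to zero.
    """
    assert np.allclose(psi, -psi.T), "net exchange matrix must be antisymmetric"
    return psi.sum(axis=1)

# Toy 3-layer atmosphere: layer 0 heats layers 1 and 2; layer 1 heats layer 2.
psi = np.array([[ 0.0, -2.0, -1.0],
                [ 2.0,  0.0, -0.5],
                [ 1.0,  0.5,  0.0]])
rates = heating_rates(psi)
```

The speedup reported in the abstract amounts to recomputing only the dominant entries of psi at every time step and refreshing the rest a few times a day, which brings the cost down from O(N²) toward O(N).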

