Holistic geoethical slope portfolio risk assessment

2020 ◽  
pp. SP508-2019-157
Author(s):  
Franco Oboni ◽  
César Henri Oboni

Abstract. Landslides of natural and man-made slopes represent hazardous geomorphological processes that contribute to highly variable risks. Their consequences generally include loss of life and damage to infrastructural, environmental and cultural assets. Prioritizing and mitigating slope risks in a sustainable manner, while considering climate change, is related to geoethics, as any misallocation of resources will likely lead to increased risk to the public. Until recently there was little recognition of the causes and global impacts of human actions. Today, threat-denying humans can be identified as acting inappropriately and ultimately unethically. Sustainable risk management and ethical issues should be discussed simultaneously to avoid the ‘discipline silo trap’ and hazardous omissions. This contribution discusses slope risk management at various scales, i.e. how to ensure better allotment of mitigative funds while complying with sustainability goals and geoethical requirements. In 1987, the World Commission on Environment and Development published a report (also known as the Brundtland Report (Brundtland 1987, World Commission on Environment and Development Report)) that defined sustainable development as meeting the needs of the present without compromising the ability of future generations to meet their own needs. The three case histories discussed in this contribution show how sustainability and ethics can be fostered by using rational, repeatable, transparent quantitative risk assessment applicable at the local scale as well as on a large scale.
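
The allocation logic the abstract argues for, spending mitigation funds where they buy the most risk reduction, can be illustrated with a minimal sketch. The slope inventory, failure probabilities, consequences and mitigation costs below are entirely hypothetical, and the ranking criterion (risk reduction per dollar) is a generic stand-in for the quantitative assessment the authors describe.

```python
# Hypothetical slope portfolio: annual probability of failure, consequence of a
# failure (USD), mitigation cost, and the residual probability after mitigation.
slopes = [
    {"id": "S1", "p_fail": 0.02, "consequence": 5.0e6, "cost": 4.0e5, "p_fail_after": 0.005},
    {"id": "S2", "p_fail": 0.10, "consequence": 8.0e5, "cost": 2.5e5, "p_fail_after": 0.02},
    {"id": "S3", "p_fail": 0.01, "consequence": 2.0e7, "cost": 1.2e6, "p_fail_after": 0.002},
]

for s in slopes:
    # Annualized risk before and after mitigation, and risk reduction per dollar spent.
    risk_before = s["p_fail"] * s["consequence"]
    risk_after = s["p_fail_after"] * s["consequence"]
    s["benefit_per_dollar"] = (risk_before - risk_after) / s["cost"]

# Fund the slopes offering the largest risk reduction per dollar first.
for s in sorted(slopes, key=lambda s: s["benefit_per_dollar"], reverse=True):
    print(s["id"], round(s["benefit_per_dollar"], 2))
```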

2021 ◽  
Vol 11 (11) ◽  
pp. 5208
Author(s):  
Jianpo Liu ◽  
Hongxu Shi ◽  
Ren Wang ◽  
Yingtao Si ◽  
Dengcheng Wei ◽  
...  

The spatial and temporal distribution of tunnel failure is very complex due to geologic heterogeneity and variability in both mining processes and tunnel arrangement in deep metal mines. In this paper, a quantitative risk assessment for deep tunnel failure was performed using a normal cloud model at the Ashele copper mine, China, considering evaluation indexes of geological conditions, the mining process, and microseismic data. The weights of the evaluation indexes were determined with an entropy weight method to reveal the primary parameters controlling tunnel failure. Additionally, the damage levels of the tunnel were quantitatively assigned by computing the degree of membership of each damage level, based on the expectation normalization method. The maximum membership principle, comprehensive evaluation value, and fuzzy entropy were considered to determine the tunnel damage levels and the risk of their occurrence. The application of this method at the Ashele copper mine demonstrates that it meets the requirements of risk assessment for deep tunnel failure and can provide a basis for large-scale regional tunnel failure control in deep metal mines.
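
The entropy weight step referenced above can be sketched in a few lines; the index matrix below (rows as tunnel sections, columns as hypothetical geological, mining and microseismic indexes) is illustrative only, and the normal cloud model and membership computation are not reproduced here.

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: rows are samples (tunnel sections), columns are
    evaluation indexes. Indexes with more dispersion receive higher weight."""
    X = np.asarray(X, dtype=float)
    P = X / X.sum(axis=0, keepdims=True)  # column-wise proportions (non-negative data assumed)
    m = X.shape[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        logP = np.where(P > 0, np.log(P), 0.0)
    e = -(P * logP).sum(axis=0) / np.log(m)  # Shannon entropy per index, scaled to [0, 1]
    d = 1.0 - e                              # degree of divergence
    return d / d.sum()                       # normalized entropy weights

# Hypothetical index matrix: geological condition score, mining disturbance
# score, and cumulative microseismic energy for three tunnel sections.
X = [[0.6, 0.3, 1.2e4],
     [0.8, 0.5, 3.4e4],
     [0.4, 0.7, 0.9e4]]
print(entropy_weights(X))
```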


Author(s):  
Paulo Gabriel Santos Campos de Siqueira ◽  
Alexandre Calumbi Antunes de Oliveira ◽  
Heitor Oliveira Duarte ◽  
Márcio das Chagas Moura

We have developed a probabilistic model to quantify the risk of a COVID-19 explosion in Brazil, the epicenter of COVID-19 in Latin America. By explosion, we mean an excessive number of new infections that would overload the public health system. We made predictions from July 12th to Oct 10th, 2020 for various containment strategies, including business as usual, stay at home (SAH) for the young and the elderly, flight restrictions among regions, gradual resumption of business, and the compulsory wearing of masks. The predictions indicate that if a SAH strategy were sustained, there would be a negligible risk of explosion and the public health system would not be overloaded. Among the other containment strategies, the scenario that combines the gradual resumption of business with the mandatory wearing of masks would be the most effective, reducing the risk to the considerable category. Should this strategy be applied together with investment in more Intensive Care Unit beds, the risk could be reduced to negligible levels. A sensitivity analysis confirmed that risks would be negligible if SAH measures were adopted thoroughly.
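
A toy version of this kind of scenario comparison is sketched below: a deterministic SIR model with sampled uncertainty in the reproduction number, where each containment strategy is represented by a contact-reduction factor and "explosion" is read as peak ICU demand exceeding bed capacity. All parameters (population, ICU rate, bed count, reduction factors) are hypothetical, and the authors' actual model is considerably richer.

```python
import numpy as np

rng = np.random.default_rng(42)

def peak_icu_demand(r_eff, pop=2.1e8, icu_rate=0.005, gamma=1 / 7, days=180):
    """Peak ICU demand from a deterministic SIR run at effective reproduction number r_eff."""
    beta = r_eff * gamma
    s, i = 1.0 - 1e-5, 1e-5
    peak_i = i
    for _ in range(days):
        new_inf = beta * s * i
        s, i = s - new_inf, i + new_inf - gamma * i
        peak_i = max(peak_i, i)
    # Crude proxy: a fraction icu_rate of the infections at the peak need a bed simultaneously.
    return peak_i * pop * icu_rate

strategies = {"business as usual": 1.0, "masks + gradual resumption": 0.7, "stay at home": 0.5}
icu_beds = 30_000  # hypothetical capacity
for name, factor in strategies.items():
    peaks = [peak_icu_demand(max(rng.normal(2.5, 0.3), 0.0) * factor) for _ in range(2000)]
    print(f"{name}: P(ICU overload) ~ {np.mean(np.array(peaks) > icu_beds):.2f}")
```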


2016 ◽  
Vol 16 (11) ◽  
pp. 2357-2371 ◽  
Author(s):  
Patric Kellermann ◽  
Christine Schönberger ◽  
Annegret H. Thieken

Abstract. Experience has shown that river floods can significantly hamper the reliability of railway networks and cause extensive structural damage and disruption. As a result, the national railway operator in Austria had to cope with financial losses of more than EUR 100 million due to flooding in recent years. Comprehensive information on potential flood risk hot spots as well as on expected flood damage in Austria is therefore needed for strategic flood risk management. In view of this, the flood damage model RAIL (RAilway Infrastructure Loss) was applied to estimate (1) the expected structural flood damage and (2) the resulting repair costs of railway infrastructure due to a 30-, 100- and 300-year flood in the Austrian Mur River catchment. The results were then used to calculate the expected annual damage of the railway subnetwork and subsequently analysed in terms of their sensitivity to key model assumptions. Additionally, the impact of risk aversion on the estimates was investigated, and the overall results were briefly discussed against the background of climate change and possibly resulting changes in flood risk. The findings indicate that the RAIL model is capable of supporting decision-making in risk management by providing comprehensive risk information on the catchment level. It is furthermore demonstrated that an increased risk aversion of the railway operator has a marked influence on flood damage estimates for the study area and, hence, should be considered with regard to the development of risk management strategies.
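
The expected annual damage figure mentioned above follows a standard construction: integrate scenario damages over their annual exceedance probabilities. The sketch below uses the three return periods from the study but hypothetical repair-cost figures, and is not the RAIL model itself.

```python
def expected_annual_damage(return_periods, damages):
    """Approximate expected annual damage (EAD) by trapezoidal integration of
    damage over annual exceedance probability (1 / return period)."""
    pairs = sorted(zip((1.0 / t for t in return_periods), damages))  # ascending probability
    ead = 0.0
    for (p0, d0), (p1, d1) in zip(pairs, pairs[1:]):
        ead += 0.5 * (d0 + d1) * (p1 - p0)
    return ead

# Hypothetical repair-cost estimates (EUR) for the 300-, 100- and 30-year floods.
print(expected_annual_damage([300, 100, 30], [60e6, 35e6, 12e6]))
```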


2020 ◽  
Vol 54 ◽  
pp. 239-263
Author(s):  
Barbara Stańdo-Kawecka

During the work on the draft of the 1997 Code of the Execution of Penalties (CEP), much attention was paid to the principle of treating sentenced persons, and particularly those serving prison sentences, as subjects. In the Polish penological literature, two dimensions of that principle were indicated. The first referred to strengthening the sentenced person’s position in relation to enforcement authorities by means of precise regulations concerning his/her legal status and effective mechanisms for the protection of his/her rights. The second dimension meant the abandonment of forced rehabilitation and providing sentenced persons with the ability to decide freely whether they wanted to participate in correctional interventions. Undoubtedly, the 1997 CEP strengthened the legal status of the sentenced person. As regards the abandonment of forced rehabilitation, the legislator chose a compromise solution according to which participation in correctional interventions was, as a rule, voluntary, but in some cases mandatory. As in other countries, in Poland the idea of protecting the public against crime has played an increasingly important role in criminal policy over the last decade. In a criminal justice system focused on risk management, the treatment of sentenced persons as subjects requires providing them with reliable information on the possible consequences of their decisions concerning participation in offered correctional activities. Additionally, it requires providing them with adequate access to empirically proven correctional programmes, as well as introducing a transparent system of risk assessment and monitoring during the execution of the imposed penalty or penal measure.


2013 ◽  
Vol 76 (3) ◽  
pp. 376-385 ◽  
Author(s):  
YUHUAN CHEN ◽  
SHERRI B. DENNIS ◽  
EMMA HARTNETT ◽  
GREG PAOLI ◽  
RÉGIS POUILLOT ◽  
...  

Stakeholders in the system of food safety, in particular federal agencies, need evidence-based, transparent, and rigorous approaches to estimate and compare the risk of foodborne illness from microbial and chemical hazards and the public health impact of interventions. FDA-iRISK (referred to here as iRISK), a Web-based quantitative risk assessment system, was developed to meet this need. The modeling tool enables users to assess, compare, and rank the risks posed by multiple food-hazard pairs at all stages of the food supply system, from primary production, through manufacturing and processing, to retail distribution and, ultimately, to the consumer. Using standard data entry templates, built-in mathematical functions, and Monte Carlo simulation techniques, iRISK integrates data and assumptions from seven components: the food, the hazard, the population of consumers, process models describing the introduction and fate of the hazard up to the point of consumption, consumption patterns, dose-response curves, and health effects. Beyond risk ranking, iRISK enables users to estimate and compare the impact of interventions and control measures on public health risk. iRISK provides estimates of the impact of proposed interventions in various ways, including changes in the mean risk of illness and burden of disease metrics, such as losses in disability-adjusted life years. Case studies for Listeria monocytogenes and Salmonella were developed to demonstrate the application of iRISK for the estimation of risks and the impact of interventions for microbial hazards. iRISK was made available to the public at http://irisk.foodrisk.org in October 2012.
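
The per-serving risk chain that iRISK assembles (hazard level at consumption, serving size, dose-response, health metric) can be mimicked with a short Monte Carlo sketch. The distributions and the exponential dose-response parameter below are purely illustrative assumptions, not values from iRISK's templates or from the Listeria monocytogenes and Salmonella case studies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inputs for a single food-hazard pair.
n_sim = 100_000
servings_per_year = 5e8
serving_size_g = 25.0
log10_conc = rng.normal(loc=-1.0, scale=1.5, size=n_sim)  # log10 CFU/g at consumption
dose = (10.0 ** log10_conc) * serving_size_g              # CFU ingested per serving
r = 1e-9                                                  # exponential dose-response parameter
p_ill = 1.0 - np.exp(-r * dose)                           # probability of illness per serving
print("mean risk per serving:", p_ill.mean())
print("expected illnesses per year:", p_ill.mean() * servings_per_year)
```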


Author(s):  
Marco Bastos ◽  
Dan Mercea

In this article, we review our study of 13 493 bot-like Twitter accounts that tweeted during the UK European Union membership referendum debate and disappeared from the platform after the ballot. We discuss the methodological challenges and lessons learned from a study that emerged in a period of increasing weaponization of social media and mounting concerns about information warfare. We address the challenges and shortcomings involved in bot detection, the extent to which disinformation campaigns on social media are effective, valid metrics for user exposure, activation and engagement in the context of disinformation campaigns, unsupervised and supervised posting protocols, along with infrastructure and ethical issues associated with social sciences research based on large-scale social media data. We argue for improving researchers' access to data associated with contentious issues and suggest that social media platforms should offer public application programming interfaces to allow researchers access to content generated on their networks. We conclude with reflections on the relevance of this research agenda to public policy. This article is part of a discussion meeting issue ‘The growing ubiquity of algorithms in society: implications, impacts and innovations'.


2014 ◽  
Vol 16 (4) ◽  
pp. 567-574 ◽  

It is timely to consider the ethical and social questions raised by progress in pharmacogenomics, given the current importance of pharmacogenomics for the avoidance of predictable side effects of drugs and for the correct choice of medications in certain cancers. It has been proposed that the entire population be genotyped for drug-metabolizing enzyme polymorphisms, as a measure that would prevent many untoward and dangerous drug reactions. Pharmacologic treatment targeting based on the genomics of disease can be expected to increase greatly in the coming years. Policy and ethical issues exist concerning consent for large-scale pharmacogenomic data collection, public versus corporate ownership of genomic research results, testing the efficacy and safety of drugs used for rare genomic indications, and the accessibility of treatments based on costly research that is applicable to relatively few patients. In major psychiatric disorders and intellectual deficiency, rare and de novo deletions or duplications of chromosomal segments (copy number variants) are, in the aggregate, common causes of increased risk. This implies that the policy problems of pharmacogenomics will be particularly important for psychiatric disorders.


Author(s):  
David Mangold ◽  
W. Kent Muhlbauer ◽  
Jim Ponder ◽  
Tony Alfano

Risk management of pipelines is a complex challenge due to the dynamic environment of the real world coupled with a wide range of system types installed over many decades. Various methods of risk assessment are currently being used in industry, many of which utilize relative scoring. These assessments are often not designed for the new integrity management program (IMP) requirements and are under direct challenge by regulators. SemGroup had historically used relative risk assessment methodologies to help support risk management decision-making. While the formality offered by these early methods provided benefits, it was recognized that, in order to more effectively manage risk and better meet the United States IMP objectives, a more effective risk assessment would be needed. A rapid and inexpensive migration to a better risk assessment platform was sought. The platform needed to be applicable not only to pipeline miles, but also to station facilities and all related components. The risk results had to be readily understandable and scalable, capturing risks from ‘trap to trap’ in addition to risks accompanying each segment. The solution appeared in the form of a quantitative risk assessment (QRA) that was ‘physics based’ rather than classical statistics based. This paper will outline the steps involved in this transition process and show how quantitative risk assessment may be efficiently implemented to better guide integrity decision-making, illustrated with a case study from SemGroup.
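
A common way to frame a ‘physics based’ probability-of-failure estimate of the kind referenced here is exposure reduced by mitigation and resistance, segment by segment, multiplied by consequence to obtain risk. The sketch below uses that structure with entirely hypothetical segments and numbers and is not SemGroup's or the authors' actual model.

```python
from dataclasses import dataclass

@dataclass
class Segment:
    """Hypothetical per-segment inputs for a physics-based PoF estimate."""
    name: str
    exposure: float     # unmitigated threat events per mile-year (e.g. third-party strikes)
    mitigation: float   # fraction of exposure events stopped by mitigations (0-1)
    resistance: float   # fraction of reaching events the component survives (0-1)
    length_mi: float
    consequence: float  # expected cost per failure (USD), illustrative

def segment_risk(seg: Segment) -> float:
    """Risk (USD/yr) = expected failures per year x consequence per failure."""
    pof_per_mile = seg.exposure * (1 - seg.mitigation) * (1 - seg.resistance)
    return pof_per_mile * seg.length_mi * seg.consequence

segments = [
    Segment("line A, trap 1 to trap 2", 0.05, 0.90, 0.95, 12.0, 2.5e6),
    Segment("pump station facility", 0.20, 0.85, 0.90, 0.1, 8.0e6),
]
for seg in sorted(segments, key=segment_risk, reverse=True):
    print(f"{seg.name}: {segment_risk(seg):,.0f} USD/yr")
```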

