Development and application of efficient methods for the forward propagation of epistemic uncertainty and sensitivity analysis within complex broad-scale flood risk system models

This article is one of a selection of papers published in this Special Issue on Hydrotechnical Engineering.

2010 ◽  
Vol 37 (7) ◽  
pp. 955-967 ◽  
Author(s):  
B. P. Gouldby ◽  
P. B. Sayers ◽  
M. C. Panzeri ◽  
J. E. Lanyon

Increasingly, an understanding of flood risk across regions and nations, and an ability to explore how these risks might change over time, is seen as a prerequisite to effective and efficient flood risk management. In response, specific flood risk analysis methods have been developed that are both accurate and fast to run. Although widely acknowledged as desirable, quantifying the uncertainty associated with the assessed flood probability, consequence, or risk has not previously been possible. To help overcome this deficiency, an efficient method for the propagation of epistemic uncertainties through large-scale flood risk system models has been developed and trialed for three pilot catchments. The approach is allied to an efficient sensitivity analysis that isolates the influence of individual uncertainties on the output quantity of risk, so that future research, development, and data-gathering efforts can be focused accordingly.
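The abstract does not spell out the numerical scheme, but the general idea of forward-propagating epistemic uncertainty and then attributing the output spread to individual inputs can be illustrated with a toy model. The sketch below is a hedged illustration only: the sampling distributions, the four-event risk function, and the use of Spearman rank correlation as a sensitivity index are assumptions made for the example, not the authors' method.

```python
# A minimal sketch, not the authors' method: forward Monte Carlo propagation of
# epistemic uncertainties through a toy flood-risk model, followed by a crude
# rank-correlation sensitivity ranking. All distributions, parameter values,
# and the risk function itself are illustrative assumptions.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n = 10_000

# Epistemic uncertainties, here represented by assumed sampling distributions.
crest_error = rng.normal(0.0, 0.15, n)          # defence crest-level error (m)
depth_damage_scale = rng.uniform(0.8, 1.2, n)   # scaling of the depth-damage curve (-)
water_level_error = rng.normal(0.0, 0.10, n)    # hydraulic model error on water level (m)

def annual_risk(crest_err, dd_scale, wl_err):
    """Toy risk model: expected annual damage aggregated over discrete events."""
    return_periods = np.array([10.0, 50.0, 100.0, 500.0])    # years
    water_levels = np.array([2.0, 2.6, 3.0, 3.5]) + wl_err   # event water levels (m)
    crest = 2.8 + crest_err                                   # effective defence crest (m)
    overtopping = np.maximum(water_levels - crest, 0.0)       # depth above crest (m)
    damages = dd_scale * 5.0e6 * overtopping                  # damage per event (toy curve)
    return np.sum(damages / return_periods)                   # expected annual damage

risk = np.array([annual_risk(a, b, c)
                 for a, b, c in zip(crest_error, depth_damage_scale, water_level_error)])

print(f"EAD mean {risk.mean():,.0f}; 5-95% range "
      f"{np.percentile(risk, 5):,.0f} to {np.percentile(risk, 95):,.0f}")

# Crude sensitivity index: rank correlation between each input and the output risk.
for name, sample in [("crest error", crest_error),
                     ("depth-damage scale", depth_damage_scale),
                     ("water-level error", water_level_error)]:
    rho, _ = spearmanr(sample, risk)
    print(f"{name:>20s}: Spearman rho = {rho:+.2f}")
```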

Author(s):  
Michalis I. Vousdoukas ◽  
Dimitrios Bouziotas ◽  
Alessio Giardino ◽  
Laurens M. Bouwer ◽  
Evangelos Voukouvalas ◽  
...  

Abstract. An upscaling of flood risk assessment frameworks beyond regional and national scales has taken place during recent years, with a number of large-scale models emerging as tools for hotspot identification, support for international policy-making, and harmonization of climate change adaptation strategies. There is, however, limited insight into the scaling effects and structural limitations of flood risk models and, therefore, the underlying uncertainty. In light of this, we examine key sources of epistemic uncertainty in the Coastal Flood Risk (CFR) modelling chain: (i) the inclusion and interaction of different hydraulic components leading to extreme sea level (ESL); (ii) inundation modelling; (iii) the underlying uncertainty in the Digital Elevation Model (DEM); (iv) flood defence information; (v) the assumptions behind the use of depth-damage functions that express vulnerability; and (vi) different climate change projections. The impact of these uncertainties on estimated Expected Annual Damage (EAD) for present and future climates is evaluated in a dual case study in Faro, Portugal, and in the Iberian Peninsula. The ranking of the uncertainty factors varies among the different case studies, baseline CFR estimates, and their absolute and relative changes. We find that uncertainty from ESL contributions, and in particular the way waves are treated, can be higher than the uncertainty of the two greenhouse gas emission projections and six climate models that are used. Of comparable importance is the quality of information on coastal protection levels and DEM information. In the absence of large-extent datasets with sufficient resolution and accuracy, the latter two factors are the main bottlenecks in terms of large-scale CFR assessment quality.
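As an illustration of how the Expected Annual Damage figure at the heart of such a modelling chain is typically assembled, the sketch below integrates a toy depth-damage curve over annual exceedance probability and perturbs one epistemic factor (DEM vertical error). Every number, the damage curve, and the error band are assumptions made for the example and are not taken from the paper.

```python
# Minimal sketch (illustrative, not the authors' CFR chain): expected annual
# damage (EAD) as the integral of damage over exceedance probability, and how
# an uncertain factor such as DEM vertical error shifts the estimate.
import numpy as np

return_periods = np.array([2, 5, 10, 25, 50, 100, 250, 1000], dtype=float)  # years
esl = np.array([1.2, 1.5, 1.7, 2.0, 2.2, 2.4, 2.7, 3.1])   # extreme sea levels (m), assumed

def ead(esl_levels, dem_bias=0.0, defence_crest=1.8, exposure=2.0e8):
    """EAD from a piecewise-linear damage-probability curve (toy model)."""
    inundation = np.maximum(esl_levels - dem_bias - defence_crest, 0.0)  # flood depth (m)
    damage_fraction = np.clip(inundation / 3.0, 0.0, 1.0)                # toy depth-damage function
    damages = damage_fraction * exposure                                  # damage per event
    exceedance = 1.0 / return_periods                                     # annual exceedance probability
    order = np.argsort(exceedance)
    # Integrate damage over annual exceedance probability (trapezoidal rule).
    return np.trapz(damages[order], exceedance[order])

print(f"baseline EAD: {ead(esl):,.0f}")
for bias in (-0.5, 0.0, 0.5):             # assumed DEM vertical error band (m)
    print(f"DEM bias {bias:+.1f} m -> EAD {ead(esl, dem_bias=bias):,.0f}")
```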


2013 ◽  
Vol 5 (2) ◽  
pp. 69-83 ◽  
Author(s):  
Ruth Fassinger ◽  
Susan L. Morrow

Various research methods can be appropriate for social justice aims. Quantitative, qualitative, and mixed-method approaches offer different kinds of strengths in advancing a social justice agenda. This article recaptures and expands upon the ideas presented by the authors of this special issue, recommending best practices in research for social justice in the following areas: (a) cultural competence and the role of the researcher(s); (b) formulating the focus of the research; (c) selection of the underlying paradigm and research method/design; (d) the research team: formation, process, and issues of power; (e) power and relationship with research participants; and (f) data gathering, analysis, and reporting.


1999 ◽  
Vol 03 (01) ◽  
pp. 111-131 ◽  
Author(s):  
YONG-TAE PARK ◽  
CHUL-HYUN KIM ◽  
JI-HYO LEE

In spite of the recent extension of our knowledge of technological innovation, little inquiry has been made into the distinctive characteristics of R&D firms versus non-R&D firms, or of product-innovative firms versus process-innovative firms. The main objective of this empirical study, grounded in a large-scale innovation survey of Korean manufacturing firms, is therefore to contrast these pairs of firm types. The results were mixed: some hypotheses were confirmed, while others were discordant with expectation. By and large, R&D firms and product-innovative firms seem to share a similar propensity, whereas non-R&D firms and process-innovative firms are alike in character. However, there were some unexpected findings that merit attention and are worthy of in-depth examination. Although the study is subject to limitations in its research design and data gathering, the results carry some important policy implications. Furthermore, comparative analyses between different types of innovation need to be addressed more extensively in future research.


ReCALL ◽  
2000 ◽  
Vol 12 (1) ◽  
pp. 1-4
Author(s):  
THIERRY CHANIER

This special issue offers a selection of papers presented at the 1999 annual EUROCALL conference, held in September 1999 in Besançon, France. Although CALL has a deep-rooted tradition in France, EUROCALL'99 was the first large-scale international CALL conference to take place in the country. Initiated by the European Association for Computer Assisted Language Learning and the French-speaking CALL journal ALSIC (2000), the conference attracted more than 370 full participants from 30 countries.


Author(s):  
Jason Matthew Aughenbaugh ◽  
Scott Duncan ◽  
Christiaan J. J. Paredis ◽  
Bert Bras

There is growing acceptance in the design community that two types of uncertainty exist: inherent variability and uncertainty that results from a lack of knowledge, which is variously referred to as imprecision, incertitude, irreducible uncertainty, and epistemic uncertainty. There is much less agreement on the appropriate means of representing and computing with these types of uncertainty. Probability bounds analysis (PBA) is a method that represents uncertainty using upper and lower cumulative probability distributions. These structures, called probability boxes or simply p-boxes, capture both variability and imprecision. PBA includes algorithms for efficiently computing with these structures under certain conditions. This paper explores the advantages and limitations of PBA in comparison to traditional decision analysis with sensitivity analysis, in the context of environmentally benign design and manufacture. The example of the selection of an oil filter involves multiple objectives and multiple uncertain parameters. These parameters are known with varying levels of uncertainty, and different assumptions about the dependencies between variables are made. As such, the example problem provides a rich context for exploring the applicability of PBA and sensitivity analysis to making engineering decisions under uncertainty. The results reveal specific advantages and limitations of both methods; the appropriate choice of analysis depends on the exact decision scenario.
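To make the notion of a p-box concrete, the sketch below builds the bounding CDFs for a quantity whose variability is normal but whose mean is known only to within an interval, and reads off bounds on a simple decision quantity. The interval, the distribution, and the threshold are assumptions for illustration; this is not the paper's implementation or its oil-filter example.

```python
# Minimal sketch (illustrative, not the paper's implementation): a p-box as a
# pair of bounding CDFs. Here the imprecision is an interval on the mean of a
# normally distributed quantity; the p-box envelopes every CDF consistent with
# that interval and yields bounds, rather than a single value, for a decision
# quantity such as P(lifetime cost <= budget).
import numpy as np
from scipy.stats import norm

x = np.linspace(0, 20, 401)            # support grid (e.g., lifetime cost)
mean_lo, mean_hi, sd = 8.0, 11.0, 2.0  # epistemic interval on the mean; fixed variability

# Bounding CDFs: shifting the mean right lowers the CDF everywhere, and vice versa.
cdf_upper = norm.cdf(x, loc=mean_lo, scale=sd)   # optimistic bound
cdf_lower = norm.cdf(x, loc=mean_hi, scale=sd)   # pessimistic bound

budget = 10.0
i = np.searchsorted(x, budget)
print(f"P(cost <= {budget}) lies between {cdf_lower[i]:.2f} and {cdf_upper[i]:.2f}")
```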


2019 ◽  
Vol 32 (8) ◽  
pp. 2353-2366 ◽  
Author(s):  
Gloria Agyemang ◽  
Brendan O’Dwyer ◽  
Jeffrey Unerman

Purpose – The purpose of this paper is to offer a retrospective and prospective analysis of the themes explored in the 2006 Accounting, Auditing and Accountability Journal special issue on non-governmental organisation (NGO) accountability.
Design/methodology/approach – The paper is a reflective review essay.
Findings – The paper outlines how a number of themes in the 2006 special issue addressing downward accountability, hierarchical accountability and management control have been subsequently developed in a selection of papers from the accounting literature. The development of these themes leads to several suggestions for future research in NGO accountability.
Originality/value – The paper offers a systematic, original perspective on recent developments in certain areas of the field of NGO accountability.


2018 ◽  
Vol 18 (8) ◽  
pp. 2127-2142 ◽  
Author(s):  
Michalis I. Vousdoukas ◽  
Dimitrios Bouziotas ◽  
Alessio Giardino ◽  
Laurens M. Bouwer ◽  
Lorenzo Mentaschi ◽  
...  

Abstract. An upscaling of flood risk assessment frameworks beyond regional and national scales has taken place during recent years, with a number of large-scale models emerging as tools for hotspot identification, support for international policymaking, and harmonization of climate change adaptation strategies. There is, however, limited insight into the scaling effects and structural limitations of flood risk models and, therefore, the underlying uncertainty. In light of this, we examine key sources of epistemic uncertainty in the coastal flood risk (CFR) modelling chain: (i) the inclusion and interaction of different hydraulic components leading to extreme sea level (ESL), (ii) the underlying uncertainty in the digital elevation model (DEM), (iii) flood defence information, (iv) the assumptions behind the use of depth–damage functions that express vulnerability, and (v) different climate change projections. The impact of these uncertainties on estimated expected annual damage (EAD) for present and future climates is evaluated in a dual case study in Faro, Portugal, and on the Iberian Peninsula. The ranking of the uncertainty factors varies among the different case studies, baseline CFR estimates, and their absolute and relative changes. We find that uncertainty from ESL contributions, and in particular the way waves are treated, can be higher than the uncertainty of the two greenhouse gas emission projections and six climate models that are used. Of comparable importance is the quality of information on coastal protection levels and DEM information. In the absence of large datasets with sufficient resolution and accuracy, the latter two factors are the main bottlenecks in terms of large-scale CFR assessment quality.
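A hedged illustration of the factor-ranking idea described above: perturb each element of the CFR chain one at a time over an assumed low-high range and compare the spread each induces in a toy EAD estimate. The factors, ranges, and toy model below are placeholders invented for the example, not the paper's experimental design.

```python
# Minimal sketch (illustrative factors and ranges, not the paper's experiment):
# one-at-a-time perturbation of CFR modelling-chain factors, ranked by the
# spread each induces in a toy expected-annual-damage (EAD) estimate.
import numpy as np

def toy_ead(wave_setup=0.3, dem_bias=0.0, protection=2.0, dd_scale=1.0):
    """Toy EAD: damages over a few return periods, driven by total water level."""
    return_periods = np.array([10.0, 50.0, 100.0, 500.0])      # years
    esl = np.array([1.8, 2.2, 2.5, 2.9]) + wave_setup          # extreme sea levels (m)
    depth = np.maximum(esl - protection - dem_bias, 0.0)       # inundation depth (m)
    damage = dd_scale * np.clip(depth / 3.0, 0, 1) * 1.5e8     # damage per event
    return float(np.sum(damage / return_periods))

baseline = toy_ead()
factors = {                      # assumed low/high settings for each factor
    "wave treatment":     dict(low=dict(wave_setup=0.0), high=dict(wave_setup=0.6)),
    "DEM accuracy":       dict(low=dict(dem_bias=-0.3),  high=dict(dem_bias=0.3)),
    "coastal protection": dict(low=dict(protection=1.5), high=dict(protection=2.5)),
    "depth-damage curve": dict(low=dict(dd_scale=0.7),   high=dict(dd_scale=1.3)),
}

spreads = {name: abs(toy_ead(**s["high"]) - toy_ead(**s["low"]))
           for name, s in factors.items()}
for name, spread in sorted(spreads.items(), key=lambda kv: -kv[1]):
    print(f"{name:>20s}: EAD spread = {spread:,.0f} (baseline {baseline:,.0f})")
```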


Author(s):  
Jeffrey Melby ◽  
Abigail Stehno ◽  
Thomas C. Massey ◽  
Shubhra Misra ◽  
Norberto Nadal-Caraballo ◽  
...  

Large-scale flood risk computation has undergone a metamorphosis since Hurricane Katrina. Improved characterization of risk has resulted from greater computational capability: supercomputer capacity combined with coupled regional hydrodynamic models, improved local hydrodynamic models, improved joint probability models, inclusion of the most important uncertainties, metamodels, and increased capacity for stochastic simulation. Improvements in our understanding of, and ability to model, the coupled hydrodynamics of surge and waves have been well documented, as have improvements in the joint probability method with optimal sampling (JPM-OS) for generating synthetic tropical cyclones (TCs) that correctly span the practical hazard probability space. However, maintaining the coupled physics and multivariate probability integrity through the entire flood risk computation, while incorporating epistemic uncertainty, has received relatively little attention. This paper addresses the latter topic within the context of the Sabine Pass to Galveston Bay, TX Pre-Construction, Engineering and Design, Hurricane Coastal Storm Surge and Wave Hazard Assessment.

Recorded presentation from the vICCE (YouTube link): https://youtu.be/qYFTO6l7UME
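The joint probability method assembles a hazard curve by weighting the response of each synthetic storm by its annual rate, and epistemic uncertainty is commonly folded in as an error term on the response. The sketch below is a minimal, assumed illustration of that assembly, not the JPM-OS implementation used in the study: the storm rates, surge responses, and error standard deviation are invented for the example.

```python
# Minimal sketch (illustrative, not the study's JPM-OS implementation): a
# hazard curve assembled from rate-weighted synthetic storm responses, with an
# additive epistemic-uncertainty term folded in by Monte Carlo sampling.
import numpy as np

rng = np.random.default_rng(1)

# Synthetic tropical cyclones: annual rate and peak surge response at a site (assumed).
annual_rates = np.array([0.05, 0.03, 0.02, 0.01, 0.004, 0.001])   # storms per year
peak_surge = np.array([1.5, 2.0, 2.5, 3.0, 3.6, 4.3])             # metres

sigma_eps = 0.25                       # epistemic error on the response (m), assumed
thresholds = np.arange(1.0, 5.01, 0.25)

def exceedance_rate(thresholds, n_samples=20_000):
    """Annual exceedance rate lambda(s), integrating the response error per storm."""
    rates = np.zeros_like(thresholds)
    for rate, surge in zip(annual_rates, peak_surge):
        noisy = surge + rng.normal(0.0, sigma_eps, n_samples)
        p_exceed = (noisy[:, None] > thresholds[None, :]).mean(axis=0)
        rates += rate * p_exceed
    return rates

lam = exceedance_rate(thresholds)
aep = 1.0 - np.exp(-lam)   # annual exceedance probability under a Poisson arrival model
for s, p in zip(thresholds, aep):
    print(f"surge > {s:.2f} m : AEP = {p:.4f}")
```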


2020 ◽  
Vol 3 ◽  
pp. 205920432096506
Author(s):  
Psyche Loui

Music therapy is an evidence-based practice, but the needs and constraints of various stakeholders pose challenges to providing the highest standards of evidence for each clinical application. First, what is the best path from clinical need to a multi-site, widely adopted intervention for a given disease or disorder? Second, how can we show policy makers that what we do matters for public health: what evidence do we have, and what evidence do we need? This article reviews the multiple forms of evidence for music-based interventions in the context of neurological disorders, from large-scale randomized controlled trials (RCTs) to smaller-scale experimental studies, and makes the case that evidence at multiple levels continues to be necessary for informing the selection of active ingredients in effective musical interventions. The article reviews some of the existing literature on music-based interventions for neurodegenerative disorders, with a particular focus on the neural structures and networks targeted by specific therapies for disorders including Alzheimer's disease, Parkinson's disease, and aphasia. This is followed by a focused discussion of principles gleaned from studies in cognitive and clinical neuroscience that may inform the active ingredients of music-based interventions. Therapies driven by a deeper understanding of the musical elements that target specific disease mechanisms are more likely to succeed and to be widely adopted. The article closes with some recommendations for future research.


2011 ◽  
Vol 33 (4) ◽  
pp. 1364-1370 ◽  
Author(s):  
Flávio Zanette ◽  
Liege da Silva Oliveira ◽  
Luiz Antonio Biasi

Araucaria angustifolia is an endangered conifer species of South America that has been overexploited for timber. Incentivizing the planting of Araucaria angustifolia is essential and may play a key role in the conservation of this species and the ecosystems that depend on it. Hence, techniques that allow the production of seedlings with attributes that may entice farmers to plant A. angustifolia trees are very important. Grafting may permit the selection of female trees and the production of precocious plants that yield high-quality seeds. The aim of this study was to determine the best season of the year for grafting. Three-year-old seedlings were used as rootstock, and orthotropic branches of young plants were used for scion collection. The grafting technique used was the bark patch. Grafting was carried out at the beginning of each season in 2007 and 2008, for a total of 160 grafted plants. Grafting carried out at the beginning of autumn had a 50% success rate; success was negligible for all remaining seasons. In conclusion, bark patch grafting is a viable technique for the production of A. angustifolia seedlings. Future research should be carried out to produce grafted seedlings at a large scale.

