Attention strategies for learning under reducible and irreducible uncertainty

2020 ◽  
Vol 20 (11) ◽  
pp. 1493
Author(s):  
Marcus Watson ◽  
Mazyar Fallah ◽  
Thilo Womelsdorf


2001 ◽  
Vol 17 (suppl) ◽  
pp. S69-S75 ◽  
Author(s):  
N. Ole Nielsen

The promotion of human health must be embedded in the wider pursuit of ecosystem health. Interventions will be impaired if ecosystem-linked determinants of health are not taken into account. In the extreme case, if ecosystems lose their capacity for renewal, society will lose life support services. Essential features of ecosystem health are the capacity to maintain integrity and to achieve reasonable and sustainable human goals. An ecosystem approach to research and management must be transdisciplinary and assure participation of stakeholders. These requisites provide a means for science to better deal with the complexity of ecosystems, and for policy-makers and managers to establish and achieve reasonable societal goals. The ecosystem approach can determine links between human health and activities or events which disturb ecosystem state and function. Examples are: landscape disturbance in agriculture, mining, forestry, urbanization, and natural disasters. An understanding of these links can provide guidance for management interventions and policy options that promote human health. An ecosystem approach to management must be adaptive because of irreducible uncertainty in ecosystem function.


PeerJ ◽  
2017 ◽  
Vol 5 ◽  
pp. e3014 ◽  
Author(s):  
Peter Caley ◽  
Geoffrey R. Hosack ◽  
Simon C. Barry

Wildlife collision data are ubiquitous, though challenging to use for ecological inference due to typically irreducible uncertainty in the sampling process. We illustrate a new approach that is useful for generating inference from predator data arising from wildlife collisions. By simply conditioning on a second prey species sampled via the same collision process, and by using a biologically realistic numerical response function, we can produce a coherent numerical response relationship between predator and prey. This relationship can then be used to make inference on the population size of the predator species, including the probability of extinction. The statistical conditioning enables us to account for unmeasured variation in factors influencing runway strike incidence at individual airports and to make valid comparisons. A practical application of the approach for testing hypotheses about the distribution and abundance of a predator species is illustrated using the hypothesized red fox incursion into Tasmania, Australia. We estimate that, conditional on the numerical response between fox and lagomorph runway strikes on mainland Australia, the predictive probability of observing no fox runway strikes in Tasmania after observing 15 lagomorph strikes is 0.001. We conclude there is enough evidence to safely reject the null hypothesis that a widespread red fox population exists in Tasmania at a density consistent with prey availability. The method is novel and has potential wider application.
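
To make the conditioning step concrete, the sketch below computes a predictive probability of observing zero fox strikes given an observed number of lagomorph strikes under a simple Poisson model. The mainland strike ratio `fox_per_lagomorph` is an assumed illustrative value, not the ratio or the numerical response fitted in the paper.

```python
import numpy as np
from scipy import stats

# Hypothetical mainland strike ratio: fox strikes per lagomorph strike.
# The paper fits a biologically motivated numerical response with uncertainty;
# a fixed ratio is used here purely for illustration.
fox_per_lagomorph = 0.45          # assumed value, not from the paper
tasmanian_lagomorph_strikes = 15  # observed lagomorph strikes in Tasmania

# Expected fox strikes in Tasmania if foxes were present at a density
# consistent with prey availability, conditioned on lagomorph strikes.
expected_fox_strikes = fox_per_lagomorph * tasmanian_lagomorph_strikes

# Predictive probability of seeing zero fox strikes under a Poisson model.
p_zero = stats.poisson.pmf(0, expected_fox_strikes)
print(f"P(no fox strikes | {tasmanian_lagomorph_strikes} lagomorph strikes) = {p_zero:.4f}")
```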


Geophysics ◽  
2021 ◽  
pp. 1-66
Author(s):  
Alberto Ardid ◽  
David Dempsey ◽  
Edward Bertrand ◽  
Fabian Sepulveda ◽  
Flora Solon ◽  
...  

In geothermal exploration, magnetotelluric (MT) data and inversion models are commonly used to image shallow conductors, typically associated with an electrically conductive clay cap that overlies the main reservoir. However, these inversion models suffer from non-uniqueness and uncertainty, and the inclusion of useful geological information is still limited. We develop a Bayesian inversion method that integrates the electrical resistivity distribution from MT surveys with borehole methylene blue (MeB) data, an indicator of conductive clay content. MeB data are used to inform structural priors for the MT Bayesian inversion, which focuses on inferring, with uncertainty, the shallow conductor boundary in geothermal fields. By incorporating borehole information, our inversion reduces non-uniqueness and explicitly represents the irreducible uncertainty as estimated depth intervals for the conductor boundary. We use Markov chain Monte Carlo (McMC) and a one-dimensional three-layer resistivity model to accelerate the Bayesian inversion of the MT signal beneath each station. Inferred conductor boundary distributions are then interpolated to construct pseudo-2D/3D models of the uncertain conductor geometry. We compared our approach against deterministic MT inversion software on synthetic and field examples and found good performance in estimating the depth to the bottom of the conductor, a valuable target in geothermal reservoir exploration.
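
The following is a minimal single-station sketch of the kind of Bayesian 1D inversion described above: a three-layer resistivity model, an MeB-informed Gaussian prior on the conductor's top and bottom depths, and random-walk Metropolis sampling. All parameter values, priors, and the synthetic data are assumptions for illustration, not the authors' configuration.

```python
import numpy as np

MU0 = 4e-7 * np.pi

def mt1d_app_res(freqs, rhos, thicks):
    """Apparent resistivity of a 1D layered earth (standard impedance recursion)."""
    out = np.empty(len(freqs))
    for i, f in enumerate(freqs):
        w = 2 * np.pi * f
        Z = np.sqrt(1j * w * MU0 * rhos[-1])             # basement half-space
        for rho, h in zip(rhos[-2::-1], thicks[::-1]):   # stack layers upward
            k = np.sqrt(1j * w * MU0 / rho)
            Zl = 1j * w * MU0 / k
            t = np.tanh(k * h)
            Z = Zl * (Z + Zl * t) / (Zl + Z * t)
        out[i] = abs(Z) ** 2 / (w * MU0)
    return out

rng = np.random.default_rng(0)
freqs = np.logspace(-2, 2, 25)
# Synthetic "true" model: resistive cover, conductive clay cap, resistive reservoir.
true_rhos, true_z = np.array([100.0, 5.0, 50.0]), np.array([300.0, 900.0])
obs = mt1d_app_res(freqs, true_rhos, np.diff([0, *true_z])) * \
      rng.lognormal(0.0, 0.05, freqs.size)               # 5% synthetic noise

# MeB-informed structural prior on conductor top/bottom depths (assumed values).
prior_mean, prior_sd = np.array([350.0, 850.0]), np.array([100.0, 200.0])

def log_post(z):
    """Log posterior over the two boundary depths (resistivities held fixed here)."""
    if not (0 < z[0] < z[1]):
        return -np.inf
    pred = mt1d_app_res(freqs, true_rhos, np.diff([0, *z]))
    misfit = -0.5 * np.sum((np.log(obs) - np.log(pred)) ** 2 / 0.05 ** 2)
    prior = -0.5 * np.sum((z - prior_mean) ** 2 / prior_sd ** 2)
    return misfit + prior

z, lp = prior_mean.copy(), log_post(prior_mean)
samples = []
for _ in range(5000):                                     # random-walk Metropolis
    z_new = z + rng.normal(0, 25.0, 2)
    lp_new = log_post(z_new)
    if np.log(rng.uniform()) < lp_new - lp:
        z, lp = z_new, lp_new
    samples.append(z.copy())

samples = np.array(samples[1000:])                        # discard burn-in
print("posterior conductor top/bottom depth (m):", samples.mean(0), samples.std(0))
```

The posterior spread on the two depths is the per-station analogue of the depth intervals described in the abstract; repeating the sampling beneath each station and interpolating the resulting boundary distributions gives the pseudo-2D/3D picture.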


Author(s):  
David J. Bartholomew

In many quarters God and chance are still seen as mutually exclusive alternatives. It is common to hear that ascribing anything to “chance” rules out God’s action. Recent scientific developments have tended to reinforce that distinction. Quantum theory introduced an irreducible uncertainty at the atomic level by requiring that certain microscopic physical events were unpredictable in principle. This was followed by the biologists’ claim that mutations, on which evolution depends, were effectively random and hence that evolutionary development was undirected. The problem this posed to Christian apologists was put most forcibly by Jacques Monod when he asserted “Pure chance,… at the root of the stupendous edifice of evolution alone is the source of every innovation.” Several attempts have been made to include chance within a theistic account. One, advocated by the intelligent design movement, is to contend that some biological structures are too complex to have originated in the way that evolutionary theory supposes and therefore that they must be attributed to God. Another is to suppose that God acts in an undetectable way at the quantum level without destroying the random appearance of what goes on there. A third approach is to contend that chance is real and hence is a means by which God works. A key step in this argument is the recognition that chance and order are not mutually exclusive. Reality operates at a number of different levels of aggregation so that what is attributable to chance at one level emerges as near certainty at a higher level. Further arguments, based on what is known as the anthropic principle, are also used to judge whether or not chance is sufficient to account for existence. These are critically evaluated.


1981 ◽  
Vol 6 ◽  
Author(s):  
R.B. Lyon

The potential impact of the post-closure phase of a nuclear fuel waste disposal project is radiation dose to man. Radiation dose is estimated as the end product of a total systems analysis. Field and laboratory research must be assimilated in a form that can be accepted by the total systems analysis procedure. A central focus of this assimilation must be the consideration of uncertainties in the analysis and data used. Irreducible uncertainty arises because of the wide variability in natural systems and the unprecedented extrapolation into the distant future. The SYVAC computer program provides a framework for assimilating the results of field and laboratory research with a systematic treatment of uncertainty. A SYVAC assessment of the post-closure performance of a Canadian nuclear waste disposal facility is presented, with particular illustrations of the interface between the assessment models and data, and the field and laboratory research.
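
As an illustration of the systematic treatment of uncertainty that such assessments rely on, the sketch below samples uncertain parameters from assumed distributions and propagates each sample through a toy dose expression. The parameter names, distributions, and dose model are hypothetical and are not taken from SYVAC.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # number of Monte Carlo samples

# Assumed parameter distributions (illustrative only, not SYVAC inputs).
leach_rate  = rng.lognormal(mean=np.log(1e-6), sigma=1.0, size=n)  # 1/yr
travel_time = rng.lognormal(mean=np.log(5e4),  sigma=0.7, size=n)  # yr
decay_const = np.log(2) / 2.1e5                                    # 1/yr, assumed nuclide
dose_factor = 1e-4                                                 # Sv per unit release, assumed

# Toy dose estimate: release attenuated by radioactive decay during transport.
dose = dose_factor * leach_rate * np.exp(-decay_const * travel_time)

print(f"median dose estimate : {np.median(dose):.2e} Sv/yr")
print(f"95th percentile dose : {np.percentile(dose, 95):.2e} Sv/yr")
```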


1999 ◽  
Vol 92 (3) ◽  
pp. 325-347 ◽  
Author(s):  
Ian Almond

The Other resembles God. To welcome the Other absolutely is to preserve the Other as a state of irreducible uncertainty, to suspend the desire to ascertain exactly who or what the Other is, to suppress the wish to name, and to avoid assimilating or incorporating the Other into a reassuringly familiar vocabulary. The object of this paper is a tentative comparison between Eckhartian gelâzenheit and Derridean openness. I will compare the Derridean response to the uncertainty of the infinitely Other with that of the German Dominican preacher Meister Eckhart (1260–1329) in order to examine their respective terms of “emptiness” and “openness” and to try to understand how Eckhart's idea of description or conception as doing violence upon the Other is, in part, adopted and, in part, rejected by Jacques Derrida. As some of the most interesting aspects of Derrida's understanding of otherness can be discerned in his early work on Levinas, I will first examine Derrida's initial skepticism toward the idea of a nonviolent phenomenology, in contrast to his more recent reappraisal of his relation to Levinas and the “welcome of the Other.”


2017 ◽  
Vol 30 (19) ◽  
pp. 7585-7598 ◽  
Author(s):  
Karen A. McKinnon ◽  
Andrew Poppick ◽  
Etienne Dunn-Sigouin ◽  
Clara Deser

Estimates of the climate response to anthropogenic forcing contain irreducible uncertainty due to the presence of internal variability. Accurate quantification of this uncertainty is critical for both contextualizing historical trends and determining the spread of climate projections. The contribution of internal variability to uncertainty in trends can be estimated in models as the spread across an initial condition ensemble. However, internal variability simulated by a model may be inconsistent with observations due to model biases. Here, statistical resampling methods are applied to observations in order to quantify uncertainty in historical 50-yr (1966–2015) winter near-surface air temperature trends over North America related to incomplete sampling of internal variability. This estimate is compared with the simulated trend uncertainty in the NCAR CESM1 Large Ensemble (LENS). The comparison suggests that uncertainty in trends due to internal variability is largely overestimated in LENS, which has an average amplification of variability of 32% across North America. The amplification of variability is greatest in the western United States and Alaska. The observationally derived estimate of trend uncertainty is combined with the forced signal from LENS to produce an “Observational Large Ensemble” (OLENS). The members of OLENS indicate the range of observationally constrained, spatially consistent temperature trends that could have been observed over the past 50 years if a different sequence of internal variability had unfolded. The smaller trend uncertainty in OLENS suggests that it is easier to detect the historical climate change signal in observations than in any given member of LENS.
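
A simplified, single-location sketch of the resampling idea is shown below: detrend a synthetic 50-yr series, block-resample the residuals to preserve year-to-year persistence, re-add the trend, and refit to quantify the spread due to internal variability. The series, block length, and noise model are assumptions; the published method resamples observations in a spatially consistent way across North America.

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1966, 2016)
t = years - years.mean()

# Synthetic "observed" series: assumed forced warming plus red-noise variability.
forced = 0.03 * t                               # assumed forced trend, degC/yr
noise = np.zeros(years.size)
for i in range(1, years.size):
    noise[i] = 0.5 * noise[i - 1] + rng.normal(0, 0.8)
obs = forced + noise

# Estimate and remove the trend, then block-bootstrap the residuals.
slope, intercept = np.polyfit(t, obs, 1)
resid = obs - (slope * t + intercept)

block, n_boot = 5, 2000                         # assumed block length
trends = np.empty(n_boot)
for b in range(n_boot):
    starts = rng.integers(0, years.size - block, size=years.size // block)
    resampled = np.concatenate([resid[s:s + block] for s in starts])
    synthetic = slope * t + intercept + resampled[:years.size]
    trends[b] = np.polyfit(t, synthetic, 1)[0]

print(f"fitted trend            : {slope:.3f} degC/yr")
print(f"internal-variability sd : {trends.std():.3f} degC/yr")
```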


2021 ◽  
Vol 21 (1) ◽  
pp. 49-59
Author(s):  
Grzegorz M. Malinowski

The primary purpose of this article is to introduce the topic of scientific uncertainty into the wider context of economics and management. Scientific uncertainty is one of the manifestations of irreducible uncertainty, and reflection on it should enable better decision making. An entity whose operations rest on current scientific research, knowledge that depreciates over time and ultimately leads to erroneous decisions, is referred to as the “loser”. The text estimates the potential scale of this problem, supplemented by an outline of the sociological difficulties identified in the analysis of how scientific statements are constructed. The article ends with a sketch of an answer to the question “how to act in the context of scientific uncertainty?”.

