Limitations in the Use of Past Datasets for Future Hazard Analysis 

Author(s):  
John Clague

<p>Frequency-magnitude relations derived from historic and prehistoric datasets underpin many natural hazard risk assessments. For example, probabilistic estimates of seismic risk rely on instrumented records of past earthquakes, in some cases supplemented by prehistoric seismicity inferred from proxy geologic evidence. Yet, several problems in these datasets compromise the reliability of derived frequency-magnitude relations. In this presentation, I briefly discuss these problems. First, historic records of past events are temporally biased. Using seismicity as an example, earthquake catalogues are complete only for the past several decades, the period during which seismic networks have been sufficiently extensive to capture all events. During the first half of the twentieth century, small and even moderate earthquakes went unrecorded, and farther back in time, knowledge of even large earthquakes is limited to eyewitness accounts. Prior to the last century, there is only limited knowledge of rare but large events with long average return periods. Yet, low social and political tolerance for risk requires knowledge of events with return periods of hundreds to thousands of years. Temporal biases of this type result in huge uncertainties about the future occurrence of events with long return periods.</p><p>A second limitation, which applies particularly to prehistoric events, is the large uncertainty in the times and magnitudes of events inferred from geologic proxy data. The example I use in this talk is the large, debris-flow-prone Cheekye River fan in southwestern British Columbia. Relatively small debris flows have happened on the fan in the historic period, and there is geologic evidence for several much larger prehistoric events during the Holocene. A new residential subdivision has been proposed for the apex of the fan, requiring that geologists estimate the sizes of debris flows with return periods up to 10,000 years. The Cheekye fan has been better studied than any other fan in western Canada, yet there are very large uncertainties in the sizes and times of events that are more than 100 years old. Event times are imprecise because radiocarbon ages carry inherent uncertainties of several decades to centuries. Furthermore, the geologic record of past events is incomplete. The frequency-magnitude curve for debris flows on Cheekye fan is ‘better than nothing’, but the very low societal tolerance for risk in Canada means that decisions about development on the fan likely will be based on worst-case scenarios of long return-period events that are poorly grounded in science.</p><p>A third limitation that I highlight in my presentation pertains to weather-related hazards (floods, severe storms, and many landslides). An assumption made when using frequency-magnitude relations to evaluate hazard and risk is that the past can be applied to the near future. This assumption is invalid for weather-related hazards because climate is changing. Climate non-stationarity implies, for example, that historic hydrometric data, upon which flood frequency analyses were based in the past century, may be of limited use in planning for future extreme floods.</p>
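The temporal-bias problem described in this abstract can be made concrete with a minimal sketch (all numbers hypothetical, not from the Cheekye study): empirical return periods estimated from a short event catalogue, where the rarest event is constrained by a single occurrence and its return period is therefore almost unconstrained.

```python
# Minimal sketch with hypothetical data: empirical return periods from a
# short event catalogue, illustrating why a brief record poorly constrains
# rare, large events.

def return_periods(magnitudes, record_years):
    """Empirical return period for exceeding each observed magnitude:
    T(m) = record length / number of events with magnitude >= m."""
    out = {}
    for m in sorted(set(magnitudes)):
        n_exceed = sum(1 for x in magnitudes if x >= m)
        out[m] = record_years / n_exceed
    return out

# A 100-year catalogue: many small events, one large one.
catalog = [4.0] * 20 + [5.0] * 5 + [6.5]
periods = return_periods(catalog, record_years=100)
# The single M6.5 event implies T = 100 y, but with n = 1 the true return
# period could plausibly lie anywhere from decades to millennia.
```

With 20 observations the small-event rate is well constrained; the single largest event is not, which is exactly the gap that paleoseismic or paleoflood proxy evidence is asked to fill.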

2014 ◽  
Vol 11 (7) ◽  
pp. 7551-7584 ◽  
Author(s):  
B. Arheimer ◽  
G. Lindström

Abstract. There is an ongoing discussion whether floods are more frequent nowadays than in the past and whether they will increase in a future climate. To explore this for Sweden, we merged observed time series from 69 sites across the country (450 000 km2) for the past century with high-resolution dynamic scenario modelling of the upcoming century. The results show that changes in daily annual high flows in Sweden oscillate between decades, but there is no significant trend over the past 100 years. A small tendency for high flows to decrease by 0.3–0.4% per decade in magnitude and 10-year flood frequency was noted, but it was not statistically significant. Temperature was found to be the strongest climate driver of river high flows, as these are mainly related to snow melt in Sweden. In the future there will also be oscillations between decades, but these were difficult to estimate because climate projections were not in phase with observations. In the long term, however, daily annual high flows may decrease by on average 1% per decade, mainly due to lower peaks from snow melt in the spring (–2% per decade) caused by higher temperatures and a shorter snow season. By contrast, autumn flows may increase by 3% per decade due to more intensive rainfall. This indicates a shift in flood-generating processes in the future, with more influence of rain-generated floods. This should be considered in reference data for design variables when adapting to climate change. Uncertainties related to the study are discussed in the paper, both for the observed data and for the complex model chain of climate impact assessments in hydrology.
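A per-decade trend of the kind reported here can be estimated with a least-squares slope over a series of annual maxima. The sketch below uses synthetic data only (it is not the authors' model chain or the Swedish hydrometric record) and expresses the fitted slope as a percentage of the mean flow per decade.

```python
# Hypothetical sketch: per-decade trend in annual maximum flows via a
# least-squares slope, expressed as % of the mean flow per decade.
# Synthetic data; not the observations or models used in the study above.

def decadal_trend_percent(years, annual_max):
    """Least-squares slope of annual maxima vs. year, as % of mean per decade."""
    n = len(years)
    ybar = sum(years) / n
    qbar = sum(annual_max) / n
    slope = (sum((y - ybar) * (q - qbar) for y, q in zip(years, annual_max))
             / sum((y - ybar) ** 2 for y in years))
    return 10.0 * slope / qbar * 100.0  # per-year slope -> % per decade

# 100 synthetic years declining by 0.35 units/year around a ~983 mean:
years = list(range(1900, 2000))
flows = [1000.0 - 0.35 * (y - 1900) for y in years]
trend = decadal_trend_percent(years, flows)  # about -0.36 % per decade
```

On noisy real data such a small trend is typically not statistically significant, which is why the abstract pairs the point estimate with a significance test.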


The Holocene ◽  
2021 ◽  
pp. 095968362110605
Author(s):  
Scott St. George ◽  
Joseph Zeleznik ◽  
Judith Avila ◽  
Matthew Schlauderaff

Over the past century, the Red River of the North has been the least stationary river in the continental United States. In Canada, historical and paleoenvironmental evidence indicates severe floods were common during the early 1800s, with the record 1826 CE flood having an estimated peak discharge 50% higher than that of the second-most severe flood ever observed. Unfortunately, the recorded history of flooding upstream in the United States does not begin until seven decades after this event. If 1826 was an equally exceptional flood on the American reach of the river, then current flood-frequency curves for the river significantly underestimate the risks posed by future flooding. Alternatively, if the American stretch did not produce a major flood in 1826, then the recent spate of flooding over the past two decades is exceptional within the context of the past 200 years. Communities in the Fargo-Moorhead metropolitan area are building a 58-km-long, $2.75 billion (USD) diversion channel that would redirect floodwaters westward around the two cities before returning them to the main channel. Because this and other infrastructure in North Dakota and Minnesota is intended to provide protection against low-probability, high-magnitude floods, new paleoflood investigations in the region would help local, state, and federal policy-makers better understand the true flood threats posed by the Red River of the North.


2021 ◽  
Author(s):  
Daniel Hamill ◽  
Gabrielle David

Streamflow influences the distribution and organization of high water marks along rivers and streams in a landscape. The federal definition of the ordinary high water mark (OHWM) relies on physical and vegetative field indicators that are used to identify the inundation extents of ordinary high water levels, without any reference to the relationship between streamflow and the regulatory definition. Streamflow is the amount, or volume, of water that moves through a stream per unit time. This study explores regional characteristics and relationships between field-delineated OHWMs and frequency-magnitude streamflow metrics derived from a flood frequency analysis. The elevation of the OHWM corresponds to representative constant-level discharges with national average return periods of 6.9 years using partial duration series and 2.8 years using annual maximum flood frequency approaches. The range of OHWM return periods is 0.5 to 9.08 years for peaks-over-threshold and 1.05 to 11.01 years for annual maximum flood frequency methods, respectively, consistent with the range found in national studies of return periods associated with bankfull streamflow. Hydraulic models produced a statistically significant relationship between the OHWM and bankfull discharge, which reinforces the close relationship between the scientific concept of bankfull and the OHWM in most stream systems.
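One common form of the annual-maximum flood frequency analysis mentioned above assigns empirical return periods to ranked annual peaks via Weibull plotting positions, T = (n + 1) / rank. The sketch below is a generic illustration with made-up discharges, not the study's data or method.

```python
# Hypothetical sketch: annual-maximum flood frequency via Weibull plotting
# positions, T = (n + 1) / rank. Synthetic peaks; not data from the study.

def weibull_return_periods(annual_peaks):
    """Return (discharge, return period) pairs, largest peak = rank 1."""
    ranked = sorted(annual_peaks, reverse=True)
    n = len(ranked)
    return [(q, (n + 1) / rank) for rank, q in enumerate(ranked, start=1)]

# Nine synthetic years of annual peak discharge (m3/s):
peaks = [120.0, 95.0, 210.0, 150.0, 80.0, 175.0, 60.0, 130.0, 100.0]
for q, t in weibull_return_periods(peaks):
    print(f"Q = {q:6.1f}  T = {t:5.2f} y")
```

Interpolating along such a curve at T ≈ 2.8 y would give the discharge analogous to the national-average OHWM level reported in the abstract; a peaks-over-threshold series is built the same way but from all exceedances of a threshold rather than one peak per year.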


2021 ◽  
Author(s):  
Katharina Schroeer ◽  
Cornelia Schwierz ◽  
Simona Trefalt ◽  
Alessandro Hering ◽  
Urs Germann

<p>Hailstorms and associated hailstone sizes are a tricky atmospheric hazard to assess, because the processes leading to severe convective weather are complex and the spatiotemporal scales of the impacts are often small. The high natural variability of hail requires expensive high-resolution, area-covering measurements to establish robust statistics. Weather radars help to achieve this, but despite growing data archives, records usually do not yet extend to climatological time scales (≥30 y), and reference ground observations to calibrate hail algorithms are still fragmentary. Consequently, substantial uncertainties remain regarding the long-term hazard of hail. Nevertheless, stakeholders require estimates of return periods for preventive regulations or as input to downstream impact models, e.g., in the insurance and engineering sectors.</p><p>In the project “Hail climatology Switzerland”, MeteoSwiss partnered with three federal offices and the insurance and engineering sectors to establish a common national reference for the occurrence of hail in Switzerland. The deliverables include return period maps of extreme hail events. However, the definition of such extremes varies across sectors. For example, stakeholders in damage prevention require impact probabilities of the largest hailstone onto an average rooftop, whereas reinsurance stakeholders are interested in nation-wide worst-case events. Here we report on the approaches we took in deriving the frequencies of severe hail, considering the different stakeholder demands as well as the challenges and uncertainties we encountered.</p><p>Using newly reprocessed gridded radar hail data, we assess frequencies of observed hail occurrence in Switzerland over 19 years (2002–2020). We further developed a probabilistic hazard model using stochastic resampling of hailstorms, driven by large-scale environmental boundary conditions. To take a storm-object perspective on extremes, we isolate more than 40,000 individual hailstorm footprints. This allows us to consider local storm properties such as the distributions of hailstone sizes by storm area and duration. In addition, we identify region-dependent extreme storm properties, which is specifically relevant in the Alpine region, where high and complex topography creates sharp climatic gradients and results from other regions are often not easily transferable.</p><p>Results show that observed storm tracks vary strongly between years, and hail footprints vary substantially by storm type. Comparing our results, obtained from the longest radar-based hail record so far, we find that the spatial patterns of hail agree well with existing hazard maps derived, among other sources, from damage claims. However, we also find that frequencies of local extreme hailstone sizes may have been underestimated in the past. This is further corroborated by a regionally aggregated comparison of the radar record with historical records of very large hail in Switzerland over the past century.</p>


2020 ◽  
Vol 4 (4) ◽  
pp. 365-381
Author(s):  
Ny Anjara Fifi Ravelomanantsoa ◽  
Sarah Guth ◽  
Angelo Andrianiaina ◽  
Santino Andry ◽  
Anecia Gentles ◽  
...  

Seven zoonoses — human infections of animal origin — have emerged from the Coronaviridae family in the past century, including three viruses responsible for significant human mortality (SARS-CoV, MERS-CoV, and SARS-CoV-2) in the past twenty years alone. These three viruses, in addition to two older CoV zoonoses (HCoV-229E and HCoV-NL63), are believed to be originally derived from wild bat reservoir species. We review the molecular biology of the bat-derived Alpha- and Betacoronavirus genera, highlighting features that contribute to their potential for cross-species emergence, including the use of well-conserved mammalian host cell machinery for cell entry and a unique capacity for adaptation to novel host environments after host switching. The adaptive capacity of coronaviruses largely results from their large genomes, which reduce the risk of deleterious mutational errors and facilitate range-expanding recombination events by offering heightened redundancy in essential genetic material. Large CoV genomes are made possible by the unique proofreading capacity encoded by their RNA-dependent polymerase complex. We find that bat-borne SARS-related coronaviruses in the subgenus Sarbecovirus, the source clade for SARS-CoV and SARS-CoV-2, present a particularly acute pandemic threat, due to the extraordinary viral genetic diversity represented among several sympatric species of their horseshoe bat hosts. To date, Sarbecovirus surveillance has been almost entirely restricted to China. More vigorous field research efforts tracking the circulation of Sarbecoviruses specifically, and Betacoronaviruses more generally, are needed across a broader global range if we are to avoid future repeats of the COVID-19 pandemic.


VASA ◽  
2018 ◽  
Vol 47 (3) ◽  
pp. 165-176 ◽  
Author(s):  
Katrin Gebauer ◽  
Holger Reinecke

Abstract. Low-density lipoprotein cholesterol (LDL-C) has been proven to be a causal factor of atherosclerosis and, along with other triggers like inflammation, the most frequent cause of peripheral arterial disease (PAD). Moreover, a linear correlation between LDL-C concentration and cardiovascular outcome in high-risk patients was established during the past century. After the development of statins, numerous randomized trials demonstrated the benefit of LDL-C reduction and the resulting decrease in cardiovascular events, including mortality. Over the past decades it became evident that more intense LDL-C lowering, either by the use of highly potent statins or by the additional application of a cholesterol absorption inhibitor, produced an even more profound cardiovascular risk reduction. Proprotein convertase subtilisin/kexin type 9 (PCSK9), a serine protease that acts on the LDL receptor cycle, leading to its degradation and thereby preventing continued LDL-C clearance from the blood, is the target of newly developed monoclonal antibodies that achieve LDL-C reductions far below the target levels set by recent ESC/EAS guidelines on the management of dyslipidaemias. Large randomized outcome trials that included subjects with PAD have so far demonstrated significant and even more intense cardiovascular risk reduction through further LDL-C lowering on top of high-intensity statin medication. Another approach to LDL-C reduction is a small interfering RNA that silences the translation of PCSK9 intracellularly. Moreover, PCSK9 concentrations are elevated in cells involved in plaque composition, so intracellular PCSK9 inhibition, and therefore prevention or reversal of plaques, may provide this mechanism of action with additional beneficial effects on cells involved in plaque formation. Thus, the simultaneous application of statins and PCSK9 inhibitors promises to reduce the cardiovascular event burden through both LDL-C reduction and the pleiotropic effects of the two agents.


1901 ◽  
Vol 51 (1309supp) ◽  
pp. 20976-20977
Author(s):  
W. M. Flinders Petrie

Author(s):  
Matthew Bagot

One of the central questions in international relations today is how we should conceive of state sovereignty. The notion of sovereignty—’supreme authority within a territory’, as Daniel Philpott defines it—emerged after the Treaty of Westphalia in 1648, which settled the late medieval crisis of pluralism. But recent changes in the international order, such as technological advances that have spurred globalization and the emerging norm of the Responsibility to Protect, have cast the notion of sovereignty in an unclear light. The purpose of this paper is to contribute to the current debate on sovereignty by exploring two schools of thought on the matter: first, three Catholic scholars from the past century—Luigi Sturzo, Jacques Maritain, and John Courtney Murray, S.J.—taken as representative of the Catholic tradition; second, a number of contemporary political theorists of cosmopolitan democracy. The paper argues that there is a confluence between the Catholic thinkers and the cosmopolitan democrats in their understanding of state sovereignty and that, taken together, the two schools have much to contribute not only to our current understanding of sovereignty, but also to the future of global governance.


Author(s):  
Seva Gunitsky

Over the past century, democracy spread around the world in turbulent bursts of change, sweeping across national borders in dramatic cascades of revolution and reform. This book offers a new global-oriented explanation for this wavelike spread and retreat—not only of democracy but also of its twentieth-century rivals, fascism and communism. The book argues that waves of regime change are driven by the aftermath of cataclysmic disruptions to the international system. These hegemonic shocks, marked by the sudden rise and fall of great powers, have been essential and often-neglected drivers of domestic transformations. Though rare and fleeting, they not only repeatedly alter the global hierarchy of powerful states but also create unique and powerful opportunities for sweeping national reforms—by triggering military impositions, swiftly changing the incentives of domestic actors, or transforming the basis of political legitimacy itself. As a result, the evolution of modern regimes cannot be fully understood without examining the consequences of clashes between great powers, which repeatedly—and often unsuccessfully—sought to cajole, inspire, and intimidate other states into joining their camps.

