aleatory variability
Recently Published Documents


TOTAL DOCUMENTS: 30 (five years: 14)
H-INDEX: 11 (five years: 2)

2021 ◽ pp. 875529302110560
Author(s): Yousef Bozorgnia ◽ Norman A Abrahamson ◽ Sean K Ahdi ◽ Timothy D Ancheta ◽ Linda Al Atik ◽ ...

This article summarizes the Next Generation Attenuation (NGA) Subduction (NGA-Sub) project, a major research program to develop a database and ground motion models (GMMs) for subduction regions. A comprehensive database of subduction earthquakes recorded worldwide was developed. The database includes a total of 214,020 individual records from 1,880 subduction events, which is by far the largest database of all the NGA programs. As part of the NGA-Sub program, four GMMs were developed. Three of them are global subduction GMMs with adjustment factors for up to seven worldwide regions: Alaska, Cascadia, Central America and Mexico, Japan, New Zealand, South America, and Taiwan. The fourth GMM is a new Japan-specific model. The GMMs provide median predictions, and the associated aleatory variability, of RotD50 horizontal components of peak ground acceleration, peak ground velocity, and 5%-damped pseudo-spectral acceleration (PSA) at oscillator periods ranging from 0.01 to 10 s. Three GMMs also quantified “within-model” epistemic uncertainty of the median prediction, which is important in regions with sparse ground motion data, such as Cascadia. In addition, a damping scaling model was developed to scale the predicted 5%-damped PSA of horizontal components to other damping ratios ranging from 0.5% to 30%. The NGA-Sub flatfile, which was used for the development of the NGA-Sub GMMs, and the NGA-Sub GMMs coded on various software platforms, have been posted for public use.
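For readers unfamiliar with the RotD50 measure used throughout the NGA-Sub GMMs, the sketch below illustrates its definition for peak ground acceleration: the two as-recorded horizontal traces are rotated through all non-redundant azimuths and the median of the resulting peaks is taken. This is a minimal illustration under that definition, not NGA-Sub project code; the function name and the synthetic traces are placeholders.

```python
import numpy as np

def rotd50_pga(acc_x, acc_y, angles_deg=np.arange(0.0, 180.0, 1.0)):
    """Median-over-rotation-angle (RotD50) peak ground acceleration.

    acc_x, acc_y : as-recorded orthogonal horizontal acceleration traces
    (same length and sample rate). For each rotation angle the two traces
    are combined into one rotated component, its peak absolute value is
    taken, and RotD50 is the 50th percentile of those peaks over angles.
    """
    acc_x = np.asarray(acc_x, dtype=float)
    acc_y = np.asarray(acc_y, dtype=float)
    theta = np.radians(angles_deg)
    # Rotated components for all angles at once: shape (n_angles, n_samples)
    rotated = np.cos(theta)[:, None] * acc_x + np.sin(theta)[:, None] * acc_y
    peaks = np.max(np.abs(rotated), axis=1)   # peak of each rotated trace
    return float(np.median(peaks))

# Toy usage with synthetic traces (not real records)
rng = np.random.default_rng(0)
ax, ay = rng.normal(size=2000), rng.normal(size=2000)
print(rotd50_pga(ax, ay))
```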


2021 ◽ Vol 21 (11) ◽ pp. 3509-3517
Author(s): Warner Marzocchi ◽ Jacopo Selva ◽ Thomas H. Jordan

The main purpose of this article is to emphasize the importance of clarifying the probabilistic framework adopted for volcanic hazard and eruption forecasting. Eruption forecasting and volcanic hazard analysis seek to quantify the deep uncertainties that pervade the modeling of pre-, syn-, and post-eruptive processes. These uncertainties can be differentiated into three fundamental types: (1) the natural variability of volcanic systems, usually represented as stochastic processes with parameterized distributions (aleatory variability); (2) the uncertainty in our knowledge of how volcanic systems operate and evolve, often represented as subjective probabilities based on expert opinion (epistemic uncertainty); and (3) the possibility that our forecasts are wrong owing to behaviors of volcanic processes about which we are completely ignorant and, hence, cannot quantify in terms of probabilities (ontological error). Here we put forward a probabilistic framework for hazard analysis recently proposed by Marzocchi and Jordan (2014), which unifies the treatment of all three types of uncertainty. Within this framework, an eruption-forecasting or volcanic-hazard model is said to be complete only if it (a) fully characterizes the epistemic uncertainties in the model's representation of aleatory variability and (b) can be unconditionally tested (in principle) against observations to identify ontological errors. Unconditional testability, which is the key to model validation, hinges on an experimental concept that characterizes hazard events in terms of exchangeable data sequences with well-defined frequencies. We illustrate the application of this unified probabilistic framework by describing experimental concepts for the forecasting of tephra fall from Campi Flegrei. Ultimately, this example may serve as a guide for the application of the same probabilistic framework to other natural hazards.
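The notion of unconditional testability can be made concrete with a simple frequency-consistency check: if an experimental concept defines exchangeable forecasting windows with a stated exceedance probability, the observed number of exceedances can be compared against that forecast. The sketch below is a generic illustration of this idea, not a procedure taken from Marzocchi and Jordan (2014); the probability, window count, and observed count are hypothetical.

```python
from scipy.stats import binom

def consistency_pvalue(n_observed, n_windows, forecast_prob):
    """Two-sided tail probability of a count at least as extreme as
    n_observed, assuming the forecast exceedance probability is correct
    and the forecasting windows are exchangeable. A very small value is
    evidence that the forecast is wrong (a candidate ontological error)."""
    lower = binom.cdf(n_observed, n_windows, forecast_prob)      # P(X <= k)
    upper = binom.sf(n_observed - 1, n_windows, forecast_prob)   # P(X >= k)
    return 2.0 * min(lower, upper, 0.5)

# Hypothetical example: a tephra-load exceedance forecast of 5% per window,
# with 7 exceedances observed over 40 comparable windows
print(consistency_pvalue(7, 40, 0.05))
```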


2021 ◽ pp. 875529302110348
Author(s): Grace A Parker ◽ Jonathan P Stewart ◽ David M Boore ◽ Gail M Atkinson ◽ Behzad Hassani

We develop semi-empirical ground motion models (GMMs) for peak ground acceleration, peak ground velocity, and 5%-damped pseudo-spectral accelerations for periods from 0.01 to 10 s, for the median orientation-independent horizontal component of subduction earthquake ground motion. The GMMs are applicable to interface and intraslab subduction earthquakes in Japan, Taiwan, Mexico, Central America, South America, Alaska, the Aleutian Islands, and Cascadia. The GMMs are developed using a combination of data inspection, data regression with respect to physics-informed functions, ground-motion simulations, and geometrical constraints for certain model components. The GMMs capture observed differences in source and path effects for interface and intraslab events, conditioned on moment magnitude, rupture distance, and hypocentral depth. Site effect and aleatory variability models are shared between event types. Regionalized GMM components include the model constant (that controls ground motion amplitude), anelastic attenuation, magnitude-scaling break point, linear site response, and sediment depth terms. We develop models for the aleatory between-event variability (τ), within-event variability (φ), single-station within-event variability (φ_SS), and site-to-site variability (φ_S2S). Ergodic analyses should use the median GMM and aleatory variability computed using the between-event and within-event variability models. An analysis incorporating non-ergodic site response should use the median GMM at the reference shear-wave velocity condition, a site-specific site response model, and aleatory variability computed using the between-event and single-station within-event variability models. Epistemic uncertainty in the median model is represented by standard deviations on the regional model constants, which facilitates scaled-backbone representations of model uncertainty in hazard analyses.
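The abstract's prescription for combining the variability components can be written compactly: an ergodic analysis pairs the median GMM with sigma = sqrt(τ² + φ²), whereas an analysis with non-ergodic site response pairs the reference-condition median and a site-specific response model with sigma = sqrt(τ² + φ_SS²). A minimal sketch of that combination follows; the numerical values are illustrative, not coefficients of the published model.

```python
import numpy as np

def total_sigma(tau, phi=None, phi_ss=None, ergodic=True):
    """Total aleatory standard deviation (natural-log units).

    Ergodic analysis:          sigma = sqrt(tau**2 + phi**2)
    Non-ergodic site response: sigma = sqrt(tau**2 + phi_ss**2),
    used together with the reference-condition median GMM and a
    site-specific site-response model.
    """
    if ergodic:
        return np.sqrt(tau ** 2 + phi ** 2)
    return np.sqrt(tau ** 2 + phi_ss ** 2)

# Illustrative values only
print(total_sigma(0.45, phi=0.65))                      # ergodic
print(total_sigma(0.45, phi_ss=0.50, ergodic=False))    # non-ergodic site response
```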


Author(s): Elaine K. Young ◽ Eric Cowgill ◽ Katherine M. Scharer ◽ Emery O. Anderson-Merritt ◽ Amanda Keen-Zebert ◽ ...

The geologic slip rate on the Mojave section of the San Andreas fault is poorly constrained, despite its importance for understanding earthquake hazard, apparent discrepancies between geologic and geodetic slip rates along this fault section, and long-term fault interactions in southern California. Here, we use surficial geologic mapping, excavations, and radiocarbon and luminescence dating to quantify the displacements and ages of late Holocene landforms offset by the fault at three sites. At the Ranch Center site, the slip rate is determined using the base of a fan marking incision and deflection of an ephemeral channel. At the adjacent Key Slide site, the margin of a landslide deposited on indigenous fire hearths provides a minimum rate. At the X-12 site, the slip rate is determined from a channel that incised into a broad fan surface, and is deflected and beheaded by the fault. We use maximum–minimum bounds on both the displacement and age of each offset feature to calculate slip rate for each site independently. Overlap of the three independent rate ranges yields a rate of 33–39 mm/yr over the last 3 ka, under the assumption that the sites share a common history, given their proximity. Considered in sequence, site-level epistemic uncertainties in the data permit but do not require a rate increase since ∼1200 cal B.P. Modest rate changes can be explained by aleatory variability in earthquake timing and magnitude; larger changes could suggest a shared regional variation with the Garlock and other faults. The new late Holocene slip rates are consistent with geodetic model estimates that include a viscoelastic crust and earthquake cycle effects. The geologic slip rates also provide average slip over dozens of earthquake cycles, a key constraint for long-term earthquake rupture forecasts.
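The rate calculation described above reduces to interval arithmetic: each site's slip-rate range is bounded by minimum displacement over maximum age and maximum displacement over minimum age, and the three independent ranges are then intersected. The sketch below shows that logic with hypothetical displacement and age bounds, not the paper's measurements; note that metres per thousand years are numerically equal to mm/yr.

```python
def rate_bounds(disp_min_m, disp_max_m, age_min_ka, age_max_ka):
    """Slip-rate bounds (mm/yr) from bounds on offset (m) and age (ka).
    Since 1 m / 1 ka = 1 mm/yr, no unit-conversion factor is needed."""
    return disp_min_m / age_max_ka, disp_max_m / age_min_ka

def overlap(ranges):
    """Intersection of (min, max) rate ranges from independent sites."""
    lo = max(r[0] for r in ranges)
    hi = min(r[1] for r in ranges)
    if lo > hi:
        raise ValueError("rate ranges do not overlap")
    return lo, hi

# Hypothetical site bounds (illustrative numbers only)
sites = [rate_bounds(90, 120, 2.6, 3.2),
         rate_bounds(100, 135, 2.8, 3.5),
         rate_bounds(85, 130, 2.4, 3.4)]
print(overlap(sites))
```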


2021
Author(s): Meng Zhang ◽ Hua Pan

The lognormal distribution is commonly used to characterize the aleatory variability of ground-motion prediction equations (GMPEs) in probabilistic seismic hazard analysis (PSHA). However, this approach often leads to results without actual physical meaning at low exceedance probabilities. In this paper, we discuss how to calculate PSHA at a low exceedance probability. Peak ground acceleration records from the NGA-West2 database and 15,493 residuals calculated using the Campbell-Bozorgnia NGA-West2 GMPE were used to analyze the tail shape of the residuals. The results showed that the generalized Pareto distribution (GPD) captured the characteristics of the residuals in the tail better than the lognormal distribution. Further study showed that the tail shapes of the residual distributions varied significantly with magnitude, owing to heteroscedasticity with respect to magnitude; the distribution of residuals for larger magnitudes had a smaller upper limit on the right side. Moreover, in the tail, the residuals in the three magnitude ranges considered in this study were more consistent with GPDs with different parameters than with the lognormal distribution or with a single GPD fitted to all residuals, leading to a bounded PSHA hazard curve. Therefore, the lognormal distribution is representative up to a chosen threshold, and GPDs fitted to the residuals in the three magnitude ranges better characterize the tail for the PSHA calculation.
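The peaks-over-threshold approach underlying this kind of tail analysis can be sketched as follows: excesses of the residuals above a high threshold are fitted with a generalized Pareto distribution, whose shape parameter controls whether the tail is bounded (negative shape) or heavy. The residuals below are synthetic stand-ins for the NGA-West2 residuals, and the 95% threshold is an illustrative choice, not the paper's.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(42)
residuals = rng.normal(0.0, 0.7, size=15000)   # synthetic stand-in for GMPE residuals

# Peaks-over-threshold: keep the excesses above a high empirical quantile
threshold = np.quantile(residuals, 0.95)
excess = residuals[residuals > threshold] - threshold
frac_exceed = (residuals > threshold).mean()

# Fit a generalized Pareto distribution to the excesses (location fixed at 0)
shape, loc, scale = genpareto.fit(excess, floc=0.0)

# Tail probability of a rare residual level under the fitted GPD;
# a negative fitted shape implies a finite upper bound on the residuals
level = threshold + 2.0
p_tail = frac_exceed * genpareto.sf(level - threshold, shape, loc=loc, scale=scale)
print(shape, scale, p_tail)
```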


2021 ◽ pp. 875529302199383
Author(s): Sanaz Rezaeian ◽ Peter M Powers ◽ Allison M Shumway ◽ Mark D Petersen ◽ Nicolas Luco ◽ ...

The United States Geological Survey (USGS) National Seismic Hazard Model (NSHM) is the scientific foundation of seismic design regulations in the United States and is regularly updated to consider the best available science and data. The 2018 update of the conterminous US NSHM includes major changes to the underlying ground motion models (GMMs). Most of the changes are motivated by the new multi-period response spectra requirements of seismic design regulations that use hazard results for 22 spectral periods and 8 site classes. In the central and eastern United States (CEUS), the 2018 NSHM incorporates 31 new GMMs for hard-rock site conditions (VS30 = 3000 m/s), including the Next Generation Attenuation (NGA)-East GMMs. New aleatory variability and site-effect models, both specific to the CEUS, are applied to all median hard-rock GMMs. This article documents the changes to the USGS GMM selection criteria and provides details on the new CEUS GMMs used in the 2018 NSHM update. The median GMMs, their weights, epistemic uncertainty, and aleatory variability are compared with those considered in prior NSHMs. This article further provides implementation details on the CEUS site-effect model, which allows conversion of hard-rock ground motions to other site conditions in the CEUS for the first time in NSHMs. Compared with the 2014 NSHM hard-rock ground motions, the weighted average of median GMMs increases for large magnitude events at middle to large distance range, epistemic uncertainty increases in almost all situations, but aleatory variability is not significantly different. Finally, the total effect on hazard is demonstrated for an assumed earthquake source model in the CEUS, which shows an increased ring of ground motions in the vicinity of the New Madrid seismic zone and decreased ground motions near the East Tennessee seismic zone.
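Schematically, the way the new components enter a comparison like the one above is: each median GMM is evaluated at the hard-rock reference condition, logic-tree weights combine the medians, and the CEUS site-effect model supplies an amplification factor that converts the result to the target site class. The sketch below uses made-up numbers, assumes the weighted average is taken in natural-log space, and is not the USGS implementation.

```python
import numpy as np

def weighted_median_gm(ln_medians, weights, ln_site_factor=0.0):
    """Logic-tree-weighted median ground motion (taken here as a weighted
    average in natural-log space, an assumed convention for illustration).

    ln_medians     : natural-log medians from each GMM at the hard-rock
                     reference condition
    weights        : logic-tree weights (must sum to 1)
    ln_site_factor : natural-log amplification from a site-effect model
                     converting hard rock to the target site class
    """
    w = np.asarray(weights, dtype=float)
    assert np.isclose(w.sum(), 1.0)
    return float(np.exp(np.dot(w, np.asarray(ln_medians, dtype=float)) + ln_site_factor))

# Made-up numbers: three GMM medians (g), their weights, and a site factor
# of 1.4 for a hypothetical softer site class
print(weighted_median_gm(np.log([0.12, 0.15, 0.10]), [0.5, 0.3, 0.2],
                         ln_site_factor=np.log(1.4)))
```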


2021
Author(s): Molly Gallahue ◽ Leah Salditch ◽ Madeleine Lucas ◽ James Neely ◽ Susan Hough ◽ ...

Probabilistic seismic hazard assessments, which forecast levels of earthquake shaking that should be exceeded with only a certain probability over a given period of time, are important for earthquake hazard mitigation. They rely on assumptions about when and where earthquakes will occur, their size, and the resulting shaking as a function of distance, as described by ground-motion models (GMMs) that cover broad geologic regions. Seismic hazard maps derived from these assessments are used to develop building codes.

To explore the robustness of the maps' shaking forecasts, we consider how well they hindcast past shaking. We have compiled the California Historical Intensity Mapping Project (CHIMP) dataset of the maximum observed seismic intensity of shaking from the largest Californian earthquakes over the past 162 years. Previous comparisons, based on several metrics, between CHIMP and the maps computed for a constant VS30 (shear-wave velocity in the top 30 m of soil) of 760 m/s suggested that current maps overpredict shaking.

The differences between the VS30 at the CHIMP sites and the reference value of 760 m/s could amplify or deamplify the ground motions relative to the mapped values. We evaluate whether the VS30 at the CHIMP sites could cause a bias in the models. By comparison with the intensity data in CHIMP, we find that using site-specific VS30 does not improve map performance, because the site corrections cause only minor differences from the original 2018 USGS hazard maps at the short periods (high frequencies) relevant to peak ground acceleration and hence MMI. The minimal differences reflect the fact that the nonlinear deamplification due to increased soil damping largely offsets the linear amplification due to low VS30. The net effects will be larger for the longer periods relevant to tall buildings, where net amplification occurs.

Possible reasons for the discrepancy include limitations of the dataset, a bias in the hazard models, an overestimation of the aleatory variability of the ground motion, or seismicity throughout the historical period that has been lower than the long-term average, perhaps by chance due to the variability of earthquake recurrence. Resolving this discrepancy, which is also observed in Italy and Japan, could improve the performance of seismic hazard maps and thus earthquake safety for California and, by extension, worldwide. We also explore whether new nonergodic GMMs, with reduced aleatory variability, perform better against the historical data than the ergodic GMMs presently used.
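The offsetting effect described above (linear amplification for low VS30 versus nonlinear deamplification at strong shaking) can be seen in the generic functional form used by many modern site-response terms, ln F = c ln(VS30/Vref) + f2 ln((PGAr + f3)/f3). The coefficients in the sketch below are placeholders chosen only to illustrate the cancellation; they are not the values of the 2018 USGS model or of any published GMM.

```python
import numpy as np

def ln_site_amp(vs30, pga_rock_g, c=-0.60, v_ref=760.0, f2=-0.30, f3=0.10):
    """Generic linear + nonlinear site-amplification form (natural log):

        ln F = c * ln(vs30 / v_ref) + f2 * ln((pga_rock_g + f3) / f3)

    All coefficients here are illustrative placeholders, not a published
    model. For vs30 < v_ref the first (linear) term amplifies; the second
    (nonlinear soil-damping) term is negative and grows in magnitude with
    the strength of the reference-rock shaking, partially cancelling the
    amplification.
    """
    return c * np.log(vs30 / v_ref) + f2 * np.log((pga_rock_g + f3) / f3)

# Amplification at a soft site (vs30 = 300 m/s) for weak to strong rock shaking
for pga in (0.05, 0.2, 0.6):
    print(pga, np.exp(ln_site_amp(vs30=300.0, pga_rock_g=pga)))
```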


2021 ◽ pp. 875529302098801
Author(s): Mark D Petersen ◽ Allison M Shumway ◽ Peter M Powers ◽ Charles S Mueller ◽ Morgan P Moschetti ◽ ...

The 2018 US Geological Survey National Seismic Hazard Model (NSHM) incorporates new data and updated science to improve the underlying earthquake and ground motion forecasts for the conterminous United States. The NSHM considers many new data and component input models: (1) new earthquakes between 2013 and 2017 and updated earthquake magnitudes for some earlier earthquakes; (2) two updated smoothed seismicity models to forecast earthquake rates; (3) two suites of new central and eastern US (CEUS) ground motion models (GMMs) to translate ground shaking for various earthquake sizes and source-to-site distances considered in the model; (4) two CEUS GMMs for aleatory variability; (5) two CEUS site-effect models that modify ground shaking based on alternative shallow site conditions; (6) more advanced western US (WUS) lithologic and structural information to assess basin site effects for selected urban regions; and (7) a more comprehensive range of outputs (22 periods and 8 site classes) than in previous versions of the NSHMs. Each of these new datasets and models produces changes in the probabilistic ground shaking levels that are spatially and statistically analyzed. Recent earthquakes or changes to some older earthquake magnitudes and locations mostly result in probabilistic ground shaking levels that are similar to previous models, but local changes can reach up to +80% and −60% compared to the 2014 model. Newly developed CEUS models for GMMs, aleatory variability, and site effects cause overall changes up to ±64%. The addition of the WUS basin amplifications causes changes of up to +60% at longer periods for sites overlying deep soft soils. Across the conterminous United States, the hazard changes in the model are mainly caused by new GMMs in the CEUS, by sedimentary basin effects for long periods (≥1 s) in the WUS, and by seismicity changes for short (0.2 s) and long (1 s) periods for both areas.


2020 ◽ Vol 92 (1) ◽ pp. 238-245
Author(s): Christopher Brooks ◽ John Douglas

The aleatory-variability component (standard deviation) of a ground-motion model has a large influence on the results of a probabilistic seismic hazard assessment. Kappa (κ), a measure of high-frequency attenuation, has site- and record-specific components that have been suggested as reasons for the heteroscedastic aleatory variability observed in earthquake ground motions. Specifically, kappa has been proposed as a reason why ground motions from small earthquakes are more variable than those from large earthquakes, which is modeled by magnitude-dependent within-event standard deviations in ground-motion prediction equations (GMPEs). In this study, we use ground motions simulated with the stochastic method to examine the influence of the site-specific component of kappa on the aleatory variability of earthquake ground motions and to test the hypothesis that it could be a cause of the observed heteroscedasticity in this variability. We consider simulations with both fixed and continuous stress-drop distributions, together with the site-specific component of kappa, and demonstrate that variation in the stress-drop parameter contributes minimally to the magnitude dependency, unlike the site-specific component of kappa, which causes significant magnitude dependency. Variation in the site-specific component of kappa is, therefore, proposed to be at least partially responsible for the magnitude dependency captured in the aleatory-variability components of some recent GMPEs. It is found, however, that the expected impact of the site-specific component of kappa on aleatory variability is much greater than modeled in these GMPEs, which suggests that there could be a mitigating effect that is not captured within the simulations (e.g., correlated inputs to the simulations).
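The mechanism investigated here can be illustrated with a stripped-down version of the stochastic-method spectrum: a Brune omega-squared source multiplied by the site kappa filter exp(-π κ0 f). The crude RMS proxy below ignores duration, geometric spreading, Q, and random-vibration theory, so only relative comparisons are meaningful; it is an assumption-laden sketch showing that the same spread in κ0 produces a larger spread in high-frequency ground motion for a small event than for a large one. All numerical values are illustrative.

```python
import numpy as np

def accel_spectrum(freq, m0_dyne_cm, stress_bar, kappa0_s, beta_km_s=3.5):
    """Brune omega-squared acceleration source spectrum times a site kappa
    filter. Geometric spreading, Q, crustal amplification, and constant
    factors are omitted, so only relative comparisons are meaningful."""
    fc = 4.9e6 * beta_km_s * (stress_bar / m0_dyne_cm) ** (1.0 / 3.0)  # corner frequency (Hz)
    source = (2.0 * np.pi * freq) ** 2 * m0_dyne_cm / (1.0 + (freq / fc) ** 2)
    return source * np.exp(-np.pi * kappa0_s * freq)

def rms_proxy(mw, kappa0, stress_bar=100.0):
    """Crude RMS-acceleration proxy: square root of the integrated squared spectrum."""
    m0 = 10.0 ** (1.5 * mw + 16.05)        # Hanks-Kanamori moment (dyne-cm)
    f = np.linspace(0.1, 50.0, 5000)
    a = accel_spectrum(f, m0, stress_bar, kappa0)
    return np.sqrt(np.sum(a ** 2) * (f[1] - f[0]))

# ln-spread in the proxy produced by the same range of site kappa (0.02-0.06 s)
for mw in (4.0, 7.0):
    print(mw, np.log(rms_proxy(mw, 0.02) / rms_proxy(mw, 0.06)))
```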

