Workflows for Construction of Spatio-Temporal Probabilistic Maps for Volcanic Hazard Assessment

2022 ◽  
Vol 9 ◽  
Author(s):  
Renette Jones-Ivey ◽  
Abani Patra ◽  
Marcus Bursik

Probabilistic hazard assessments for overland pyroclastic flows or atmospheric ash clouds, produced under the short timelines of an evolving crisis, require the best available science, unhampered by complicated and slow manual workflows. Although deterministic mathematical models are available, their parameters and initial conditions are usually known only within a prescribed range of uncertainty. Constructing a probabilistic hazard assessment requires accurate outputs and propagation of the inherent input uncertainty to the quantities of interest, so that the necessary probabilities can be estimated from numerous runs of the underlying deterministic model. Characterizing the uncertainty in system states due to parametric and input uncertainty simultaneously requires ensemble-based methods that explore the full parameter and input spaces. Complex tasks, such as running thousands of instances of a deterministic model under parameter and input uncertainty, require high-performance computing infrastructure and skilled personnel that may not be readily available to the policy makers responsible for making informed risk-mitigation decisions. For efficiency, the programming tasks required to execute ensemble simulations need to run in parallel, leading to the twin computational challenges of managing large amounts of data and performing CPU-intensive processing. The resulting flow of work involves complex sequences of tasks, interactions, and exchanges of data, so automatic management of these workflows is essential. Here we discuss a computing infrastructure, methodology, and tools that enable scientists and other members of the volcanology research community to develop workflows for the construction of probabilistic hazard maps using remotely accessed computing through a web portal.
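The core ensemble step described above, running a deterministic model many times over sampled inputs and reducing the runs to exceedance probabilities, can be sketched as follows. The toy flow model, parameter ranges, and threshold are all hypothetical stand-ins for illustration, not the actual workflow components:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the deterministic hazard model: returns flow
# depth at a set of map points for a given (volume, friction) input pair.
def toy_flow_model(volume, friction, x):
    return volume * np.exp(-friction * x)

x = np.linspace(0.0, 10.0, 50)   # map points (km downslope)
n_runs = 2000                    # ensemble size

# Sample inputs from their prescribed uncertainty ranges.
volumes = rng.uniform(1.0, 5.0, n_runs)
frictions = rng.uniform(0.2, 0.8, n_runs)

# Run the ensemble, then estimate the probability that flow depth
# exceeds a critical threshold at each map point.
depths = np.array([toy_flow_model(v, f, x) for v, f in zip(volumes, frictions)])
p_exceed = (depths > 1.0).mean(axis=0)   # probabilistic hazard map values
```

In a real workflow each ensemble member would be an independent HPC job; the reduction to `p_exceed` is the same regardless of how the runs are scheduled.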

Author(s):  
Hanz Richter ◽  
Kedar B. Karnik

The problem of controlling the rectilinear motion of an open container without exceeding a prescribed liquid level and other constraints is considered using a recently developed constrained sliding mode control design methodology based on invariant cylinders. A conventional sliding mode regulator is designed first to address nominal performance in the sliding mode. A robustly invariant cylinder is then constructed and used to describe the set of safe initial conditions from which the closed-loop controller can be operated without constraint violation. Simulations of a typical transfer illustrate the usefulness of the method in an industrial setting. Experimental results corresponding to a high-speed transfer validate the theory.
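As a rough illustration of the first design step only (a conventional sliding mode regulator, without the invariant-cylinder constraint handling), a minimal simulation for a double-integrator plant might look like this; the plant, gains, and initial condition are illustrative assumptions, not the paper's container model:

```python
import numpy as np

# Minimal conventional sliding-mode regulator for a double integrator
# (position x1, velocity x2). The switching control drives the state onto
# the sliding surface s = c*x1 + x2 = 0, where dynamics reduce to
# x1' = -c*x1 and the state decays to the origin.
c, k, dt = 1.0, 2.0, 1e-3      # surface slope, switching gain, time step
x1, x2 = 1.0, 0.0              # initial condition

for _ in range(20_000):        # 20 s of simulated time
    s = c * x1 + x2            # sliding surface value
    u = -k * np.sign(s)        # discontinuous switching control
    x1 += dt * x2              # explicit Euler integration of the plant
    x2 += dt * u
# After reaching the surface, x1 decays exponentially toward zero
# (small residual chattering of order k*dt remains in x2).
```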


Author(s):  
Antoine Ferrand ◽  
Marc Bellenoue ◽  
Yves Bertin ◽  
Radu Cirligeanu ◽  
Patrick Marconi ◽  
...  

To decrease fuel consumption, a new flight mode is being considered for twin-engine helicopters, in which one engine is put into sleeping mode (the gas generator is kept at a stabilized, sub-idle speed by an electric motor, with no combustion) while the remaining engine operates at nominal load. Because only one engine is then producing power, the restart of the engine in sleeping mode is critical for safety. This efficient new flight mode has raised interest in modeling the restart of a turboshaft engine. In this context, the initial conditions of the simulations are better known than for a ground start: the air flow through the gas generator is constant, the fuel and oil system states are known, and the casing temperatures are equal to ambient. During the restart phase, the gas generator is kept at constant speed until light-up is detected by a rise in inter-turbine temperature; the starter torque then increases, accelerating the engine toward idle speed. In this paper, the modeling of the acceleration of the gas generator from light-up to idle and above-idle speeds is presented. Details of the light-up process are not addressed here. The study is based on the high-fidelity aero-thermodynamic restart model currently being developed for a 2000-horsepower, free-turbine turboshaft. Here, the term high-fidelity refers not only to the modeling of the flow-path components but also to the inclusion of all subsystems, secondary air flows, and controls with a high level of detail. The physical phenomena governing the acceleration of the turboshaft engine following a restart, mainly the transient evolution of the combustion efficiency and the power loss to heat soakage, are discussed, and modeling solutions are presented.
The results of the simulations are compared to engine test data, highlighting that the studied phenomena have an impact on the acceleration of the turboshaft engine and that the model correctly predicts acceleration trends.


2015 ◽  
Vol 57 ◽  
Author(s):  
Andre Kristofer Pattantyus ◽  
Steven Businger

Deterministic model forecasts do not convey to end users the uncertainty the models possess as a result of physics parameterizations, simplifications in the model representation of physical processes, and errors in initial conditions. When only a single deterministic forecast is available, this leaves an unquantified level of uncertainty in the forecast value. Increasing computational power and parallel software architectures allow multiple simulations to be carried out simultaneously, yielding useful measures of model uncertainty derived from ensemble results. The Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) dispersion model has the ability to generate ensemble forecasts. A meteorological ensemble was formed to create probabilistic forecast products and an ensemble-mean forecast for volcanic emissions from the Kilauea volcano that impact the state of Hawai’i. The probabilistic forecast products show uncertainty in pollutant concentrations that is especially useful for decision-making regarding public health. Initial comparison of the ensemble-mean forecasts with observations and a single model forecast shows improvements in event timing for both sulfur dioxide and sulfate aerosol forecasts.


2021 ◽  
Author(s):  
Cora Fontana ◽  
Eleonora Cianci ◽  
Massimiliano Moscatelli

School education constitutes one of the strategic functions to be recovered after an earthquake. The structural improvement of school buildings, together with the strengthening of administrators’ capacity to react positively following an earthquake, are key factors contributing to the reduction of social vulnerability. Nevertheless, in Italy, risk-reduction policies for the school sector are not yet consolidated in institutional agendas. Observing the last major Italian earthquakes, what remains predominant is the degree of damage to school buildings, with consequent interruption of the system’s functionality. Among the causes are the vulnerability of the building stock and the lack of risk-mitigation policies capable of building a community resilient to future earthquakes. Resilience is a relevant paradigm for addressing how to strengthen the school sector’s capacity to ensure the physical safety of buildings and to guarantee the maintenance of the school function, looking at pre- and post-event phases.

The paper proposes a set of indicators and a methodology for a preliminary assessment of the educational sector’s seismic resilience in terms of initial conditions. The method has been tested on a first case study: the Calabria Region, Southern Italy. The results show clear spatial differences in the educational sector’s seismic resilience. Except for some large urban areas, the less resilient areas are grouped mainly in the southern part of the region, while the most resilient ones are located mostly in the central-northern sector. The ambition is to identify a repeatable approach, useful as a guideline for school seismic-prevention policies.


2021 ◽  
Author(s):  
Valeria Lupiano ◽  
Claudia Calidonna ◽  
Paolo Catelan ◽  
Francesco Chidichimo ◽  
Gino Mirocle Crisci ◽  
...  

Lahars are among the world’s most destructive natural phenomena in terms of casualties (Manville et al., 2013). Lahars originate as mixtures of water and volcanic deposits, frequently triggered by heavy rainfall; they are erosive floods capable of growing along their path to more than 10 times their initial volume, moving at up to 100 km/h on steep slopes and travelling extreme distances of up to hundreds of kilometers.

Besides early-warning tools, protective measures have been adopted in volcanic territories by constructing retaining dams and embankments at key positions to contain or divert possible lahars (Leung et al., 2003). This solution can have a strong environmental impact, both from the works themselves and from the continuous accumulation of volcanic deposits, which may push the system far from equilibrium and trigger more disastrous events.

The growing frequency of lahars in the Vascún Valley area, Tungurahua Volcano, Ecuador, possibly related to climate change, has recently produced smaller (shorter accumulation periods) and therefore less dangerous events.

Momentary ponds form along rivers in volcanic areas when channels become blocked by landslides of volcanic deposits originated by pyroclastic flows and lahars. The most frequent cause of a breakout of such natural ponds is the overflow of water across the newly formed dam, with subsequent erosion and rapid downcutting into the loose rock debris.

Dam collapse can occur by sliding of the volcanic deposit or by its overturning. By eroding the blockage and flowing down the river channel, the initial surge of water incorporates a dangerous volume of sediment. This produces lahars with potentially devastating effects for settlements in their path (Leung et al., 2003).

The use of simulation tools (based on the cellular automata model LLUNPIY) and field data (including the necessary subsoil surveys) makes it possible to identify points where easily collapsible backfill dams can produce momentary ponds.

Small temporary dams with similar (but controlled) behavior can be designed and built at low cost from local backfill, allowing the outflow of streams produced by ordinary rainfall events; this is achieved by properly dimensioning a discharge channel at the dam base (Lupiano et al., 2020). In this way, small lahars can be triggered by minor rainfall events and lahar detachment can be anticipated for major events, avoiding simultaneous confluence with other lahars (Lupiano et al., 2020).

REFERENCES

Leung, M.F., Santos, J.R., Haimes, Y.Y. (2003). Risk modeling, assessment, and management of lahar flow threat. Risk Analysis, 23(6), 1323-1335.

Lupiano, V., Chidichimo, F., Machado, G., Catelan, P., Molina, L., Calidonna, C.R., Straface, S., Crisci, G.M., and Di Gregorio, S. (2020). From examination of natural events to a proposal for risk mitigation of lahars by a cellular-automata methodology: a case study for Vascún valley, Ecuador. Nat. Hazards Earth Syst. Sci., 20, 1-20.

Manville, V., Major, J.J. and Fagents, S.A. (2013). Modeling lahar behavior and hazards. In Fagents, S.A., Gregg, T.K.P., and Lopes, R.M.C. (eds.) Modeling Volcanic Processes: The Physics and Mathematics of Volcanism. Cambridge: Cambridge University Press, pp. 300-330.


2021 ◽  
Author(s):  
Stéphanie Leroux ◽  
Jean-Michel Brankart ◽  
Aurélie Albert ◽  
Jean-Marc Molines ◽  
Laurent Brodeau ◽  
...  

In this contribution, we investigate the predictability properties of the ocean dynamics using an ensemble of medium-range numerical forecasts. This question is particularly relevant for ocean dynamics at small scales (< 30 km), where sub-mesoscale dynamics is responsible for the fast evolution of ocean properties. Relatively little is known about the predictability properties of a high-resolution model, and hence about the accuracy and resolution needed from the observation system used to generate the initial conditions.

A kilometric-scale regional configuration of NEMO for the Western Mediterranean (MEDWEST60, at 1/60° horizontal resolution) has been developed, using boundary conditions from a larger North Atlantic configuration at the same resolution (eNATL60). This deterministic model has then been transformed into a probabilistic model by introducing innovative stochastic parameterizations of model uncertainties resulting from unresolved processes. The purpose here is primarily to generate ensembles of model states to initialize predictability experiments. The stochastic parameterization is also applied to assess the possible impact of irreducible model uncertainties on the skill of the forecast. A set of three ensemble experiments (20 members, 2 months) is performed: one with the deterministic model initiated with perturbed initial conditions, and two with the stochastic model, for two different amplitudes of model uncertainty. In all three experiments, the spread of the ensemble is shown to emerge from the small scales (10 km wavelength) and progressively upscale to the largest structures. After two months, the ensemble variance saturates over most of the spectrum (except at the largest scales), whereas the small scales (< 30 km) are fully decorrelated between the different members. These ensemble simulations are thus appropriate to provide a statistical description of the dependence between initial accuracy and forecast accuracy over the full range of potentially useful forecast time-lags (typically, between 1 and 20 days).

The predictability properties are statistically assessed using a cross-validation algorithm (i.e., using alternately each ensemble member as the reference truth and the remaining 19 members as the ensemble forecast) together with specific scores to characterize the initial and forecast accuracy. From the joint distribution of initial and final scores, it is then possible to quantify the probability distribution of the forecast score given the initial score, or reciprocally to derive conditions on the initial accuracy to obtain a target forecast skill. In this contribution, the misfit between ensemble members is quantified in terms of overall accuracy (CRPS score), geographical position of the ocean structures (location score), and spatial spectral decorrelation of the Sea Surface Height 2-D fields (spectral score). For example, our results show that, in the region and period of interest, the initial location accuracy required (necessary condition) with a perfect (deterministic) model to obtain a forecast location accuracy of 10 km with 95% confidence is about 8 km for a 1-day forecast, 4 km for a 5-day forecast, and 1.5 km for a 10-day forecast; this requirement cannot be met for a 15-day or longer forecast.
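The ensemble CRPS and the leave-one-out cross-validation described above can be sketched in a few lines. The Gaussian toy ensemble below stands in for the actual SSH fields (an assumption for illustration); the estimator is the standard ensemble form E|X − y| − ½·E|X − X′|:

```python
import numpy as np

rng = np.random.default_rng(2)

def crps_ensemble(members, truth):
    """CRPS of an ensemble forecast for a scalar truth, using the
    standard ensemble estimator E|X - y| - 0.5 * E|X - X'|."""
    m = np.asarray(members, dtype=float)
    term1 = np.abs(m - truth).mean()
    term2 = np.abs(m[:, None] - m[None, :]).mean()
    return term1 - 0.5 * term2

# Cross-validation in the spirit described above: each member in turn
# serves as the reference truth; the remaining 19 form the forecast.
ensemble = rng.normal(0.0, 1.0, size=20)   # toy scalar values at one point
scores = [crps_ensemble(np.delete(ensemble, i), ensemble[i])
          for i in range(len(ensemble))]
mean_crps = float(np.mean(scores))         # lower = more accurate ensemble
```

In practice the score would be computed field-wide and per forecast lag, then binned against the initial score to build the joint distribution the authors describe.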


2010 ◽  
Vol 1 (1) ◽  
pp. 36-47 ◽  
Author(s):  
Atilla Altinok ◽  
Didier Gonze ◽  
Francis Lévi ◽  
Albert Goldbeter

We consider an automaton model that progresses spontaneously through the four successive phases of the cell cycle: G1, S (DNA replication), G2 and M (mitosis). Each phase is characterized by a mean duration τ and a variability V. As soon as the prescribed duration of a given phase has passed, the transition to the next phase of the cell cycle occurs. The time at which the transition takes place varies in a random manner according to a distribution of durations of the cell cycle phases. Upon completion of the M phase, the cell divides into two cells, which immediately enter a new cycle in G1. The duration of each phase is reinitialized for the two newborn cells. At each time step in any phase of the cycle, the cell has a certain probability to be marked for exiting the cycle and dying at the nearest G1/S or G2/M transition. To allow for homeostasis, which corresponds to maintenance of the total cell number, we assume that cell death counterbalances cell replication at mitosis. In studying the dynamics of this automaton model, we examine the effect of factors such as the mean durations of the cell cycle phases and their variability, the type of distribution of the durations, the number of cells, the regulation of the cell population size and the independence of steady-state proportions of cells in each phase with respect to initial conditions. We apply the stochastic automaton model for the cell cycle to the progressive desynchronization of cell populations and to their entrainment by the circadian clock. A simple deterministic model leads to the same steady-state proportions of cells in the four phases of the cell cycle.
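One ingredient of such an automaton, drawing each phase duration from a distribution with mean τ and variability V, can be sketched as follows; the normal distribution and the particular mean durations are illustrative assumptions, not the paper's calibrated values:

```python
import numpy as np

rng = np.random.default_rng(3)

# One cell traverses G1 -> S -> G2 -> M, each phase duration drawn from a
# normal distribution with mean tau and coefficient of variation V
# (truncated at zero), after which the cell divides.
phases = ["G1", "S", "G2", "M"]
tau = {"G1": 10.0, "S": 8.0, "G2": 4.0, "M": 1.0}   # mean durations (h)
V = 0.15                                            # variability

def cycle_length():
    """Total duration of one cell cycle: the sum of four random phases."""
    return sum(max(0.0, rng.normal(tau[p], V * tau[p])) for p in phases)

lengths = [cycle_length() for _ in range(5000)]
mean_len = float(np.mean(lengths))   # close to 10 + 8 + 4 + 1 = 23 h
```

Because each daughter cell redraws its phase durations independently, a population simulated this way desynchronizes over successive generations, which is the behavior the abstract studies.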


1998 ◽  
Vol 88 (10) ◽  
pp. 1000-1012 ◽  
Author(s):  
X.-M. Xu ◽  
M. S. Ridout

A stochastic model that simulates the spread of disease over space and time was developed to study the effects of initial epidemic conditions (number of initial inocula and their spatial pattern), sporulation rate, and spore dispersal gradient on the spatio-temporal dynamics of plant disease epidemics. The spatial spread of disease was simulated using a half-Cauchy distribution with median dispersal distance μ (units of distance). The rate of temporal increase in disease incidence (βI, per day) was influenced jointly by μ and by the sporulation rate λ (spores per lesion per day). The relationship between βI and μ was nonlinear: the increase in βI with increasing μ was greatest when μ was small (i.e., when the dispersal gradient was steep). The rate of temporal increase in disease severity of diseased plants (βS) was affected mainly by λ: βS increased directly with increasing λ. Intraclass correlation (κt), the correlation of disease status of plants within quadrats, increased initially with disease incidence, reached a peak, and then declined as disease incidence approached 1.0. This relationship was well described by a power-law model that is consistent with the binary form of the variance power law. The amplitude of the model relating κt to disease incidence was affected mainly by μ: κt decreased with increasing μ. The shape of the curve was affected mainly by initial conditions, especially the spatial pattern of the initial inocula. Generally, the relationship of spatial autocorrelation (ρt,k), the correlation of disease status of plants at various distances apart, to disease incidence and distance was well described by a four-parameter power-law model. ρt,k increased with disease incidence to a maximum and then declined at higher values of disease incidence, in agreement with a power-law relationship. 
The amplitude of ρt,k was determined mainly by initial conditions and by μ: ρt,k decreased with increasing μ and was lower for regular patterns of initial inocula. The shape of the ρt,k curve was affected mainly by initial conditions, especially the spatial pattern of the initial inocula. At any level of disease incidence, autocorrelation declined exponentially with spatial lag; the degree of this decline was determined mainly by μ: it was steeper with decreasing μ.
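The half-Cauchy dispersal kernel with median distance μ used above can be sampled by inverse-CDF transformation, since its CDF is (2/π)·arctan(x/μ); the sketch below is illustrative, not the authors' simulation code:

```python
import numpy as np

rng = np.random.default_rng(4)

def half_cauchy(mu, size):
    """Dispersal distances from a half-Cauchy distribution whose median
    equals the scale parameter mu (inverse-CDF sampling: x = mu*tan(pi*u/2)).
    Smaller mu means a steeper dispersal gradient."""
    u = rng.uniform(0.0, 1.0, size)
    return mu * np.tan(np.pi * u / 2.0)

d = half_cauchy(mu=2.0, size=100_000)
median_d = float(np.median(d))   # close to 2.0: the median distance is mu
```

The heavy tail of this kernel is what lets occasional spores jump far from the source even when the median dispersal distance is short.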


2003 ◽  
Vol 31 (3) ◽  
pp. 233-244 ◽  
Author(s):  
Antonio Campo ◽  
Francisco Alhama

Evaluation of spatio-temporal temperatures and total heat transfer rates in simple bodies (large plate, long cylinder and sphere) has traditionally been explained in undergraduate heat transfer courses by the Heisler/Gröber or the Boelter/Gröber charts. These charts pose some restrictions with respect to the applicable times. Additionally, the charts do not provide information about the time-dependent heat fluxes at the surface. Conversely, evaluation of spatio-temporal temperatures, time-dependent heat fluxes at the surface and total heat transfer rates can be easily done for the entire time domain with the network simulation method (NSM) in conjunction with the commercial code PSPICE. NSM relies on the existing physical analogy between the unsteady transport of electric current and the unsteady transport of unidirectional heat by conduction. This analogy has been named the RC analogy in the specialized literature. The code PSPICE simulates the electric circuits for a specific body together with the imposed boundary and initial conditions, and produces numerical results for the quantities of interest, such as the spatio-temporal temperature distributions, the time-dependent heat-flux distributions at the surface, and the total heat transfer.
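A minimal illustration of the RC analogy is to discretize one-dimensional conduction in a plate as a resistor-capacitor ladder and step it explicitly in time, the same network a circuit simulator such as PSPICE would integrate; the node count, Fourier number, and boundary values below are arbitrary assumptions for the sketch:

```python
import numpy as np

# 1-D transient conduction in a plate as an RC ladder: each node carries a
# thermal capacitance, each link a thermal resistance. Explicit stepping
# with grid Fourier number Fo = alpha*dt/dx^2 < 0.5 is stable.
n = 20                   # nodes from the insulated midplane to the surface
Fo = 0.4                 # grid Fourier number

T = np.full(n, 100.0)    # uniform initial temperature (deg C)
T_surface = 0.0          # surface suddenly held at 0 deg C

for _ in range(5000):
    Tn = T.copy()
    # interior nodes: standard finite-difference / RC-ladder update
    Tn[1:-1] = T[1:-1] + Fo * (T[:-2] - 2 * T[1:-1] + T[2:])
    Tn[0] = T[0] + 2 * Fo * (T[1] - T[0])               # insulated midplane
    Tn[-1] = T[-1] + Fo * (T[-2] - 2 * T[-1] + T_surface)  # fixed surface
    T = Tn
# After enough steps, the whole plate relaxes toward the surface temperature.
```

The surface heat flux at any step is recoverable from the last resistor link, which is exactly the extra information the classical charts do not provide.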

