Probabilistic, high-resolution tsunami predictions in northern Cascadia by exploiting sequential design for efficient emulation

2021, Vol 21 (12), pp. 3789-3807
Author(s): Dimitra M. Salmanidou, Joakim Beck, Peter Pazak, Serge Guillas

Abstract. The potential of a full-margin rupture along the Cascadia subduction zone poses a significant threat to a populous region of North America. Previous probabilistic tsunami hazard assessment studies produced hazard curves based on simulated tsunami waves either at low resolution, at high resolution only for a local area or a limited range of scenarios, or at the high computational cost of generating hundreds of scenarios at high resolution. We use the graphics processing unit (GPU)-accelerated tsunami simulator VOLNA-OP2 with a detailed representation of topographic and bathymetric features. To overcome the large computational burden, we replace the simulator with a Gaussian process emulator at each output location. The emulators are statistical approximations of the simulator's behaviour. We train the emulators on a set of input–output pairs and use them to generate approximate output values over a six-dimensional scenario parameter space (e.g. uplift/subsidence ratio and maximum uplift) that represents the seabed deformation. We implement an advanced sequential design algorithm for the optimal selection of only 60 simulations. The low cost of emulation provides additional flexibility in the shape of the deformation, which we illustrate here by considering two families, buried rupture and splay faulting, of 2000 potential scenarios. This approach allows for the first emulation-accelerated computation of probabilistic tsunami hazard in the region of the city of Victoria, British Columbia.
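The emulation workflow described above (train a Gaussian process on a small set of simulator runs, then predict cheaply over thousands of scenarios) can be sketched in a few lines. This is a minimal illustration assuming scikit-learn is available; the six-dimensional inputs, the placeholder outputs, and the Matérn kernel are assumptions for demonstration, not the authors' actual VOLNA-OP2 training data or sequential design.

```python
# Sketch: Gaussian process emulation of a tsunami simulator at one output location.
# The 6-D inputs and outputs are synthetic stand-ins for the 60 training runs
# mentioned in the abstract; the kernel choice is illustrative.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

# 60 training scenarios in a 6-D parameter space (e.g. uplift/subsidence ratio,
# maximum uplift, ...), each paired with a simulated maximum wave height.
X_train = rng.uniform(0.0, 1.0, size=(60, 6))
y_train = np.sin(X_train @ rng.uniform(0.5, 2.0, 6))  # placeholder simulator output

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
gp.fit(X_train, y_train)

# Cheap predictions (with uncertainty) for thousands of candidate scenarios.
X_new = rng.uniform(0.0, 1.0, size=(2000, 6))
mean, std = gp.predict(X_new, return_std=True)
print(mean[:5], std[:5])
```

In the study itself, the 60 training runs are chosen by a sequential design algorithm rather than sampled at random, and one emulator is fitted per output location; the sketch only shows the basic train-and-predict cycle.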


2021, Vol 13 (11), pp. 2107
Author(s): Shiyu Wu, Zhichao Xu, Feng Wang, Dongkai Yang, Gongjian Guo

Global Navigation Satellite System Reflectometry Bistatic Synthetic Aperture Radar (GNSS-R BSAR) is becoming increasingly important in remote sensing because of its low power, low mass, low cost, and real-time global coverage capability. The Back Projection Algorithm (BPA) is usually selected as the GNSS-R BSAR imaging algorithm because it can process echo signals from complex geometric configurations. However, its huge computational cost is a challenge for its application in GNSS-R BSAR. Graphics processing units (GPUs) provide an efficient computing platform for GNSS-R BSAR processing. In this paper, a solution that accelerates the BPA of GNSS-R BSAR using a GPU is proposed to improve imaging efficiency, together with a matching pre-processing program that synchronizes the direct and echo signals to improve imaging quality. To process the hundreds of gigabytes of data collected over a long synthetic aperture in fixed-station mode, a stream processing structure is used to work within the limited GPU memory. To improve imaging efficiency, the imaging task is divided into pre-processing and the BPA, which are performed on the Central Processing Unit (CPU) and the GPU, respectively, and a pixel-oriented parallel processing method in back projection is adopted to avoid the memory access conflicts caused by the large data volume. The improved BPA with a long synthetic aperture time is verified through simulations and experiments on the GPS-L5 signal. The results show that the proposed accelerating solution takes approximately 128.04 s to produce a 600 m × 600 m image with an 1800 s synthetic aperture time, about 156 times faster than the pure-CPU framework, while retaining the same imaging quality as the existing processing solution.
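To make the pixel-oriented idea concrete, the sketch below shows a plain NumPy (CPU) back projection in which every image pixel independently accumulates phase-corrected echo samples over all pulses; this per-pixel independence is what maps naturally onto one GPU thread per pixel without write conflicts. The geometry, sampling rate, and synthetic echoes are illustrative assumptions, not the paper's implementation or data.

```python
# Sketch: pixel-oriented back projection (the per-pixel accumulation that the
# paper parallelises on the GPU). Pure NumPy with toy geometry and synthetic
# range-compressed echoes; all constants and shapes are illustrative.
import numpy as np

c = 3e8                        # speed of light, m/s
fs = 20.46e6                   # range sampling rate (assumed)
fc = 1176.45e6                 # GPS L5 carrier frequency, Hz
n_pulses, n_samples = 1800, 4096
echoes = (np.random.randn(n_pulses, n_samples)
          + 1j * np.random.randn(n_pulses, n_samples))  # synthetic echoes

rx_pos = np.zeros((n_pulses, 3))                        # fixed receiver (assumed)
tx_pos = np.column_stack([np.linspace(-2e4, 2e4, n_pulses),
                          np.full(n_pulses, 2.0e7),
                          np.full(n_pulses, 2.0e7)])    # toy GNSS transmitter track

# 600 m x 600 m image grid (coarsely sampled here); pixels are independent.
xs = np.linspace(-300.0, 300.0, 100)
ys = np.linspace(-300.0, 300.0, 100)
image = np.zeros((ys.size, xs.size), dtype=complex)

for iy, y in enumerate(ys):
    for ix, x in enumerate(xs):
        pix = np.array([x, y, 0.0])
        # Bistatic range: transmitter -> pixel -> receiver, for every pulse.
        r = (np.linalg.norm(tx_pos - pix, axis=1)
             + np.linalg.norm(rx_pos - pix, axis=1))
        idx = np.round(r / c * fs).astype(int) % n_samples  # nearest range bin
        phase = np.exp(1j * 2 * np.pi * fc * r / c)          # phase compensation
        image[iy, ix] = np.sum(echoes[np.arange(n_pulses), idx] * phase)
```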


2021, Vol 9
Author(s): Viviane Souty, Audrey Gailler

Probabilistic Tsunami Hazard Assessment (PTHA) is a fundamental framework for producing time-independent forecasts of tsunami hazard at the coast, taking into account local to distant tsunamigenic earthquake sources. If high-resolution bathymetry and topography data at the shoreline are available, local tsunami inundation models can be computed to identify the highest-risk areas and derive evidence-based evacuation plans to improve community safety. We propose a fast, high-resolution Seismic-PTHA (S-PTHA) approach to estimate the tsunami hazard at the coastal level, using the Bay of Cannes as a test site. The S-PTHA process is first sped up by treating the seismic and tsunami hazards separately, which allows quick updates either of the seismic rates, by adding new earthquakes, or of the tsunami hazard, by adding new tsunami scenarios. Furthermore, significant tsunamis are selected by extrapolating, via Green's law, the tsunami amplitudes collected offshore from low-resolution simulations to a priori nearshore amplitudes. This saves almost 85% of the computation time for the high-resolution simulations. The S-PTHA performed in the Bay of Cannes exhibits maximum expected tsunami waves that do not exceed 1 m over a 2500-year period, except in some particular places such as the Old Port of Cannes. However, the probability of experiencing wave heights of 30 cm over the same period exceeds 50% along the main beach of Cannes, and these results need to be considered in risk mitigation plans given the area's strong tourist appeal, especially in summer.
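The screening step relies on the classical Green's law shoaling relation, which scales a wave amplitude from depth h1 to depth h2 as A2 = A1 * (h1/h2)^(1/4). The snippet below is a minimal sketch of how such a filter could look; the depths, amplitudes, and selection threshold are placeholder values, not the study's settings.

```python
# Sketch: using Green's law to extrapolate an offshore tsunami amplitude to the
# nearshore and keep only scenarios worth re-running at high resolution.
# Depths, amplitudes and the 0.1 m threshold are illustrative assumptions.
def greens_law_amplitude(a_offshore, h_offshore, h_nearshore):
    """Shoaling estimate: amplitude grows as (h_offshore / h_nearshore) ** 0.25."""
    return a_offshore * (h_offshore / h_nearshore) ** 0.25

scenarios = [
    {"id": 1, "a_offshore": 0.02, "h_offshore": 50.0},   # metres
    {"id": 2, "a_offshore": 0.15, "h_offshore": 80.0},
    {"id": 3, "a_offshore": 0.01, "h_offshore": 30.0},
]
h_nearshore = 5.0   # a priori nearshore depth (assumed)
threshold = 0.10    # minimum nearshore amplitude to trigger a high-res run (assumed)

selected = [s["id"] for s in scenarios
            if greens_law_amplitude(s["a_offshore"], s["h_offshore"], h_nearshore) >= threshold]
print(selected)     # only these scenarios go on to high-resolution simulation
```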


2021
Author(s): Gareth Davies, Rikki Weber, Kaya Wilson, Phil Cummins

Offshore Probabilistic Tsunami Hazard Assessments (offshore PTHAs) provide large-scale analyses of earthquake-tsunami frequencies and uncertainties in the deep ocean, but do not provide high-resolution onshore tsunami hazard information as required for many risk-management applications. To understand the implications of an offshore PTHA for the onshore hazard at any site, in principle the tsunami inundation should be simulated locally for every scenario in the offshore PTHA. In practice this is rarely feasible due to the computational expense of inundation models, and the large number of scenarios in offshore PTHAs. Monte-Carlo methods offer a practical and rigorous alternative for approximating the onshore hazard, using a random subset of scenarios. The resulting Monte-Carlo errors can be quantified and controlled, enabling high-resolution onshore PTHAs to be implemented at a fraction of the computational cost. This study develops novel Monte-Carlo sampling approaches for offshore-to-onshore PTHA. Modelled offshore PTHA wave heights are used to preferentially sample scenarios that have large offshore waves near an onshore site of interest. By appropriately weighting the scenarios, the Monte-Carlo errors are reduced without introducing any bias. The techniques are applied to a high-resolution onshore PTHA for the island of Tongatapu in Tonga. In this region, the new approaches lead to efficiency improvements equivalent to using 4-18 times more random scenarios, as compared with stratified-sampling by magnitude, which is commonly used for onshore PTHA. The greatest efficiency improvements are for rare, large tsunamis, and for calculations that represent epistemic uncertainties in the tsunami hazard. To facilitate the control of Monte-Carlo errors in practical applications, this study also provides analytical techniques for estimating the errors both before and after inundation simulations are conducted. Before inundation simulation, this enables a proposed Monte-Carlo sampling scheme to be checked, and potentially improved, at minimal computational cost. After inundation simulation, it enables the remaining Monte-Carlo errors to be quantified at onshore sites, without additional inundation simulations. In combination these techniques enable offshore PTHAs to be rigorously transformed into onshore PTHAs, with full characterisation of epistemic uncertainties, while controlling Monte-Carlo errors.
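The key sampling idea is to draw scenarios with probability tilted toward large modelled offshore waves near the site, while re-weighting each draw by its occurrence rate divided by its sampling probability so that hazard estimates remain unbiased. The importance-sampling sketch below illustrates this; the rates, offshore wave heights, and the particular choice of sampling weights are synthetic assumptions, not the study's offshore PTHA database or its actual sampling scheme.

```python
# Sketch: importance sampling of offshore PTHA scenarios for an onshore site.
# Scenarios with large offshore waves are preferentially sampled; weighting by
# rate / (n * sampling probability) keeps the exceedance-rate estimate unbiased.
# All numbers below are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(1)
n_scenarios = 10_000
rates = rng.exponential(1e-4, n_scenarios)              # occurrence rates (1/yr)
offshore_hmax = rng.lognormal(-1.0, 1.0, n_scenarios)   # offshore wave height near site (m)

# Sampling probability proportional to rate * offshore wave height (one simple choice).
q = rates * offshore_hmax
q /= q.sum()

n_sample = 300
picked = rng.choice(n_scenarios, size=n_sample, replace=True, p=q)
weights = rates[picked] / (n_sample * q[picked])         # importance weights

# Pretend each sampled scenario was run through an inundation model; here the
# onshore height is proxied by the offshore one just to illustrate the estimator.
onshore_h = offshore_hmax[picked]
threshold = 1.0
exceedance_rate = np.sum(weights * (onshore_h > threshold))
print(f"estimated rate of onshore height > {threshold} m: {exceedance_rate:.2e} /yr")
```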


2014, Vol 7 (8), pp. 8233-8270
Author(s): K. Lengfeld, M. Clemens, H. Münster, F. Ament

Abstract. This publication aims to prove that a network of low-cost local area weather radars (LAWRs) is a reliable and scientifically valuable complement to nationwide radar networks. A network of four LAWRs has been installed in northern Germany within the framework of the project Precipitation and Attenuation Estimates from a High-Resolution Weather Radar Network (PATTERN), observing precipitation with a temporal resolution of 30 s, an azimuthal resolution of 1°, and a spatial resolution of 60 m. The network covers an area of 60 km × 80 km. In this paper, the algorithms used to obtain undisturbed precipitation fields from raw reflectivity data are described and their performance is analysed. In order to correct for background noise in the reflectivity measurements operationally, the noise level estimated from the measured reflectivity field is combined with the noise levels from the last 10 time steps. For the detection of non-meteorological echoes, two different kinds of clutter filters are applied: single-radar algorithms and network-based algorithms that take advantage of the network's unique combination of high temporal and spatial resolution. Overall, the network-based clutter filter works best, with a detection rate of up to 70%, followed by the classic TDBZ filter, which uses the texture of the logarithmic reflectivity field. A comparison of a reflectivity field from the PATTERN network with the product of a C-band radar operated by the German Meteorological Service indicates good spatial agreement between both systems in the geographical position of the rain event as well as in the reflectivity maxima. A long-term study shows good agreement between the X-band radars of the network and the C-band radar, but especially at the borders of precipitation events the standard deviation within a range gate of the C-band radar, with its range resolution of 1 km, is up to 3 dBZ. Therefore, a network of high-resolution, low-cost LAWRs can give valuable information on the small-scale structure of rain events in areas of special interest, e.g. urban regions, in addition to the nationwide radar networks.
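Two of the processing steps described above lend themselves to a short illustration: a running noise level that blends the current field's estimate with the previous 10 time steps, and a TDBZ-style filter that flags gates where the texture of the logarithmic reflectivity field is high. The sketch below uses a simple per-gate squared-difference proxy for that texture; the percentile-based noise proxy, the thresholds, and the field dimensions are assumptions for illustration, not the PATTERN algorithms or settings.

```python
# Sketch: (1) running noise-level estimate combining the current field with the
# last 10 time steps, and (2) a simplified TDBZ-style texture filter for
# non-meteorological echoes. Thresholds and sizes are illustrative assumptions.
import numpy as np
from collections import deque

noise_history = deque(maxlen=10)   # noise levels from the last 10 time steps

def update_noise_level(reflectivity_dbz):
    """Blend the current field's noise estimate with recent history."""
    current = np.percentile(reflectivity_dbz, 5)   # crude per-field noise proxy (assumed)
    noise_history.append(current)
    return np.mean(noise_history)

def tdbz_clutter_mask(reflectivity_dbz, threshold=45.0):
    """Flag gates where the squared gate-to-gate difference of the dBZ field
    (a per-gate proxy for the TDBZ texture) exceeds a threshold."""
    diff2 = np.diff(reflectivity_dbz, axis=1) ** 2
    texture = np.empty_like(reflectivity_dbz)
    texture[:, :-1] = diff2
    texture[:, -1] = diff2[:, -1]
    return texture > threshold

field = np.random.gamma(2.0, 5.0, size=(360, 1333))   # synthetic 1-deg x 60-m field
print(update_noise_level(field), tdbz_clutter_mask(field).mean())
```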


2009, Vol 20, pp. 25-32
Author(s): J. Van Baelen, Y. Pointin, W. Wobrock, A. Flossmann, G. Peters, ...

Abstract. This paper describes an innovative project which has just been launched at the "Laboratoire de Météorologie Physique" (LaMP) in Clermont-Ferrand in collaboration with the "Meteorologische Institut" in Hamburg, in which a low-cost, high-resolution X-band precipitation radar is combined with supporting measurements and a bin-microphysical cloud-resolving model in order to develop adapted Z–R relationships for accurate rain rate estimates over a local area such as a small catchment basin, an urban complex or even an agricultural domain. In particular, K-band micro rain radars, which can retrieve vertical profiles of the drop size distribution and the associated reflectivity, will be used to perform direct comparisons with X-band radar volume samples, while a network of rain gauges provides the ground truth against which our rain estimates will be compared. Thus, the experimental suite of instrumentation should provide a detailed characterization of the various rain regimes and their associated Z–R relationships. Furthermore, we will make use of the hilly environment of the radar to test novel attenuation-based methods for estimating rainfall rates. A second important aspect of this work is to use the detailed cloud modeling available at LaMP. Simulations of precipitating clouds in a highly resolved 3-D dynamics model allow the spectra of rain drops and precipitating ice particles to be predicted. Radar reflectivity determined from these model studies will be compared with the observations in order to better understand which raindrop size spectrum shape factor should be applied in the radar algorithms as a function of the type of precipitating cloud. Likewise, these comparisons between the modeled and the observed reflectivity will also give us the opportunity to further improve our model microphysics and the parameterizations for mesoscale models.
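For context, the Z–R conversion the project seeks to calibrate is usually written as a power law, Z = a R^b, so that R = (Z/a)^(1/b). The sketch below uses the classic Marshall–Palmer coefficients (a = 200, b = 1.6) purely as placeholder defaults; deriving regime-specific coefficients is precisely what the project aims to do.

```python
# Sketch: converting radar reflectivity to rain rate with a power-law Z-R relation.
# Z = a * R**b  =>  R = (Z / a) ** (1 / b). The coefficients here are the classic
# Marshall-Palmer defaults, standing in for the regime-specific fits the project targets.
import numpy as np

def rain_rate_from_dbz(dbz, a=200.0, b=1.6):
    """Return rain rate in mm/h from reflectivity in dBZ."""
    z_linear = 10.0 ** (dbz / 10.0)   # dBZ -> linear reflectivity (mm^6 m^-3)
    return (z_linear / a) ** (1.0 / b)

for dbz in (20.0, 35.0, 50.0):
    print(f"{dbz:.0f} dBZ -> {rain_rate_from_dbz(dbz):.1f} mm/h")
```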

