What are the main uncertainties in estimating earthquake risk?

Author(s):  
D. Vere-Jones

The general heading of seismic risk covers a variety of problems, of which at least the following may be distinguished:

1. risk analyses for a particular structure on a particular site;
2. the determination of large-scale zoning schemes;
3. microzoning;
4. risk analyses for earthquake insurance;
5. risk analyses for civil defence and hazard reduction programmes.

This paper is concerned chiefly with the first of these, although a few remarks concerning the others appear in the final section. Its intention is to identify the major uncertainties that enter into the estimation of seismic risk (in the sense of 1 above) and to place an order-of-magnitude estimate on the errors each is likely to cause. Some earlier papers with a similar theme are by Cornell and Vanmarke (1969), Donovan and Bernstein (1978), McGuire and Shedlock (1981), and McGuire and Barnhard (1981), among others. Evidently there is a considerable element of personal judgement in such an attempt; the principal aim is to draw attention to the effects of aspects which are difficult to bring into explicit consideration and are consequently in danger of being ignored or otherwise brushed under the carpet.

2020, Vol. 36 (1_suppl), pp. 345-371
Author(s):  
Anirudh Rao ◽  
Debashish Dutta ◽  
Pratim Kalita ◽  
Nick Ackerley ◽  
Vitor Silva ◽  
...  

This study presents a comprehensive open probabilistic seismic risk model for India. The proposed model comprises a nationwide residential and non-residential building exposure model, a selection of analytical seismic vulnerability functions tailored to Indian building classes, and an open implementation of an existing probabilistic seismic hazard model for India. The vulnerability of the building exposure is combined with the seismic hazard using the stochastic (Monte Carlo) event-based calculator of the OpenQuake engine to estimate probabilistic seismic risk metrics, such as average annual economic losses and loss exceedance probability curves, at the national, state, district, and subdistrict levels. The risk model and the underlying datasets, along with the risk metrics calculated at different scales, are intended to be used as tools to quantitatively assess earthquake risk across India and compare it with other countries, to develop risk-informed building design guidelines, to support more careful land-use planning, to optimize earthquake insurance pricing, and to enhance general earthquake risk awareness and preparedness.
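The event-based workflow described in the abstract can be illustrated with a toy Monte Carlo loss calculation. All numbers and functional forms below are made up for illustration; this is not the OpenQuake engine's API, only the general shape of a stochastic event-based risk metric computation:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical stochastic event set: 10,000 years of simulated events.
years = 10_000
n_events = rng.poisson(0.2 * years)           # mean 0.2 damaging events/year
magnitudes = rng.uniform(5.0, 8.0, n_events)
event_years = rng.integers(0, years, n_events)

# Toy vulnerability function: mean loss ratio rises with shaking intensity
# (a crude stand-in for a GMPE plus a vulnerability curve).
intensity = 0.05 * np.exp(0.9 * (magnitudes - 5.0))
loss_ratio = np.clip(intensity, 0.0, 1.0)

exposed_value = 1e9   # total replacement cost of the exposed portfolio
event_losses = loss_ratio * exposed_value

# Aggregate event losses into annual losses, then derive risk metrics.
annual_losses = np.bincount(event_years, weights=event_losses, minlength=years)
aal = annual_losses.mean()                    # average annual loss
exceed_prob = (annual_losses >= 1e8).mean()   # P(annual loss >= 100M)
print(f"AAL: {aal:.3e}, P(loss >= 1e8): {exceed_prob:.4f}")
```

Sweeping the threshold in the last step over a grid of loss values yields the full loss exceedance probability curve.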


2019, Vol. 623, A139
Author(s):  
Adam Pluta ◽  
Niclas Mrotzek ◽  
Angelos Vourlidas ◽  
Volker Bothmer ◽  
Neel Savani

Context. We use forward modelling on multi-viewpoint coronagraph observations to estimate the three-dimensional morphology, initial speed, and deprojected masses of coronal mass ejections (CMEs). The CME structure is described via the Graduated Cylindrical Shell (GCS) model, which enables the measurement of CME parameters in a consistent and comparable manner.
Aims. This is the first large-scale use of the GCS model to estimate CME masses, so we discuss inherent peculiarities and implications for the mass determination, with a special focus on CME events emerging from close to the observer's central meridian. Further, we analyse which CME characteristics are best suited to estimating the CME mass in a timely manner, so that it can be made available to CME arrival predictions.
Methods. We apply the method to a set of 122 bright events observed simultaneously from two vantage points with the COR2 coronagraphs on board the twin NASA STEREO spacecraft. The events occurred between January 2007 and December 2013 and are compiled in an online catalogue within the EU FP7 project HELCATS. We statistically analyse the derived CME parameters, their mutual connection, and their relation to the solar cycle.
Results. We show that the derived morphology of intense disk events is still systematically overestimated by up to a factor of 2 with stereoscopic modelling, which is the same order of magnitude as for observations from only one vantage point. The overestimation is very likely a combination of projection effects and the increased complexity of separating CME shocks and streamers from CME fronts for such events. We further show that CME mass determination of disk events can overestimate the mass by about a factor of 10 or more in the case of overlapping bright structures.
Conclusions. We conclude that for stereoscopic measurements of disk events, the measurement of the initial CME speed is the most reliable. We further suggest that our presented CME speed-mass correlation is best suited to estimating the CME mass early from coronagraph observations.
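As a rough illustration of how a speed-mass correlation could support early mass estimates, the sketch below fits a power law to synthetic data. The coefficients, scatter, and sample are all invented; the paper's actual correlation would be used in practice:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in data: CME initial speeds (km/s) and masses (g).
# The exponent 1.5 and the scatter are made-up illustrative values.
speed = rng.uniform(300, 2000, 100)
mass = 1e14 * (speed / 1000) ** 1.5 * rng.lognormal(0.0, 0.3, 100)

# Fit log10(mass) = a * log10(speed) + b by least squares.
a, b = np.polyfit(np.log10(speed), np.log10(mass), 1)

def estimate_mass(v_kms):
    """Early mass estimate from the initial speed via the fitted power law."""
    return 10 ** (a * np.log10(v_kms) + b)

print(f"slope={a:.2f}, intercept={b:.2f}, mass(800 km/s)={estimate_mass(800.0):.2e} g")
```

Because the initial speed is measurable as soon as the CME front is tracked in coronagraph images, such a relation gives a mass estimate well before a full brightness-based mass determination is available.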


Author(s):  
D. J. Scott

This paper is about insuring buildings and contents against the risk of earthquake. Problems are encountered in the assessment and underwriting of earthquake risk, and there are lessons for insurers from the Edgecumbe and Mexico earthquakes. Discussion of these matters leads to the conclusions and recommendations that: (a) the exposure to seismic risk must be properly assessed, costed, and spread more equitably amongst the community; and (b) all parties likely to be involved in an earthquake should establish closer relationships now, to improve disaster planning to cope with a great earthquake.


2016, Vol. 32 (1), pp. 285-301
Author(s):  
In Ho Cho ◽  
Keith Porter

Large-scale earthquake risk assessment necessitates models of the seismic performance of building classes. This work addresses how to depict a class with only a few (index) buildings whose designs span the attributes that most affect the seismic behavior of the class. We propose a general numerical moment matching (MM) technique to represent those seismic attributes with index buildings, which can then be individually analyzed by second-generation performance-based earthquake engineering methods (PBEE-2). The results are probabilistically combined to model the behavior of the class as a whole. The model thereby honors the joint distribution of variable features within the class and propagates all the other uncertainties that PBEE-2 already recognizes. Importantly, we can rigorously reflect and propagate the uncertain attributes of buildings within a class, notably without resorting to standard distributions. MM enables PBEE-2 to rigorously support seismic risk assessment of a building class.
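A minimal sketch of the moment-matching idea, under invented assumptions: three hypothetical index buildings and a made-up attribute distribution for the class. The weights are chosen so that the weighted index-building attributes reproduce the first two moments of the class distribution, and class-level loss is then the weighted combination of per-building results:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical attribute distribution for the class (e.g., a design parameter).
class_attr = rng.lognormal(mean=0.0, sigma=0.3, size=5000)

# Three index buildings spanning the attribute range (illustrative values).
index_attr = np.array([0.7, 1.0, 1.5])

# Match normalization plus the first two moments:
#   sum(w) = 1,  sum(w * x_i) = E[X],  sum(w * x_i^2) = E[X^2]
A = np.vstack([np.ones_like(index_attr), index_attr, index_attr**2])
b = np.array([1.0, class_attr.mean(), (class_attr**2).mean()])
weights = np.linalg.solve(A, b)

# Class-level result = weighted combination of per-building PBEE-2 outputs.
per_building_loss = np.array([0.02, 0.05, 0.11])  # made-up mean loss ratios
class_loss = weights @ per_building_loss
print(weights, class_loss)
```

With three index buildings the 3x3 system is exactly solvable; more index buildings or higher moments would turn this into a constrained least-squares problem.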


2000, Vol. 179, pp. 205-208
Author(s):  
Pavel Ambrož ◽  
Alfred Schroll

Abstract. Precise measurements of the heliographic position of solar filaments were used to determine the proper motion of solar filaments on a time-scale of days. The filaments tend to exhibit a shaking or waving of the external structure and a general movement of the whole filament body, coinciding with the transport of magnetic flux in the photosphere. The velocity scatter of the individual measured points is about one order of magnitude higher than the accuracy of the measurements.


2019, Vol. 7 (2A)
Author(s):  
Camilo Fuentes Serrano ◽  
Juan Reinaldo Estevez Alvares ◽  
Alfredo Montero Alvarez ◽  
Ivan Pupo Gonzales ◽  
Zahily Herrero Fernandez ◽  
...  

A method for the determination of Cr, Fe, Co, Ni, Cu, Zn, Hg, and Pb in waters by Energy Dispersive X-Ray Fluorescence (EDXRF) was implemented, using a radioisotopic source of 238Pu. For preconcentration, a procedure was employed that included a coprecipitation step with ammonium pyrrolidinedithiocarbamate (APDC) as the chelating agent, separation of the phases by filtration, measurement of the filter by EDXRF, and quantification by a thin-layer absolute method. Sensitivity curves for the K and L lines were obtained. The sensitivity for most elements was greater by an order of magnitude when measuring with a source of 238Pu instead of 109Cd, which means a considerable decrease in measurement times. The influence of concentration on the precipitation efficiency was evaluated for each element. In all cases the recoveries are close to 100%, so it can be affirmed that the method of determination of the studied elements is quantitative. Metrological parameters of the method, such as trueness, precision, detection limit, and uncertainty, were calculated. A procedure to calculate the uncertainty of the method was elaborated; the most significant source of uncertainty for the thin-layer EDXRF method is associated with the determination of the instrumental sensitivities. The error associated with the determination, expressed as expanded uncertainty (in %), varied from 15.4% for low element concentrations (2.5-5 μg/L) to 5.4% for the higher concentration range (20-25 μg/L).
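The uncertainty budget mentioned above can be sketched as a standard GUM-style combination of independent components in quadrature. The component values below are hypothetical, chosen only to land in the reported 5-15% range, with the instrumental sensitivity as the dominant term, as the abstract states:

```python
import math

# Hypothetical relative standard uncertainties (as fractions) for one element.
u_sensitivity = 0.06   # instrumental sensitivity calibration (dominant term)
u_precision   = 0.03   # measurement repeatability
u_recovery    = 0.02   # coprecipitation recovery
u_volume      = 0.01   # volumetric operations

# Combine independent components in quadrature.
u_combined = math.sqrt(u_sensitivity**2 + u_precision**2
                       + u_recovery**2 + u_volume**2)

# Expanded uncertainty with coverage factor k = 2 (~95% confidence level).
U = 2 * u_combined
print(f"Expanded uncertainty: {U * 100:.1f}%")
```

At low concentrations the repeatability term grows relative to the signal, which is consistent with the expanded uncertainty rising from 5.4% to 15.4% across the concentration ranges reported.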


2018, Vol. 68 (12), pp. 2857-2859
Author(s):  
Cristina Mihaela Ghiciuc ◽  
Andreea Silvana Szalontay ◽  
Luminita Radulescu ◽  
Sebastian Cozma ◽  
Catalina Elena Lupusoru ◽  
...  

There is increasing interest in the analysis of salivary biomarkers for medical practice. The objective of this article was to identify the specificity and sensitivity of quantification methods used in biosensors or portable devices for the determination of salivary cortisol and salivary α-amylase. There are no biosensors or portable devices for salivary amylase and cortisol that are used on a large scale in clinical studies. Such devices would be useful for more real-time assessment in future psychological research.


2019, Vol. 22 (5), pp. 346-354
Author(s):  
Yan A. Ivanenkov ◽  
Renat S. Yamidanov ◽  
Ilya A. Osterman ◽  
Petr V. Sergiev ◽  
Vladimir A. Aladinskiy ◽  
...  

Aim and Objective: Antibiotic resistance is a serious constraint on the development of new effective antibacterials. Therefore, the discovery of new antibacterials remains one of the main challenges in modern medicinal chemistry. This study was undertaken to identify novel molecules with antibacterial activity.
Materials and Methods: Using our unique double-reporter system, an in-house large-scale HTS campaign was conducted to identify the antibacterial potency of small-molecule compounds. The construct allows us to visually assess the underlying mechanism of action. After the initial HTS and rescreen procedure, a luciferase assay, C14-test, determination of the MIC value, and PrestoBlue test were carried out.
Results: The HTS rounds and rescreen campaign revealed antibacterial activity in a series of N-substituted triazolo-azetidines and their isosteric derivatives that had not been reported previously. The primary hit molecule demonstrated a MIC value of 12.5 µg/mL against E. coli ΔtolC, with signs of translation blockage and no SOS response. Translation inhibition (26%, luciferase assay) was achieved only at high concentrations, up to 160 µg/mL, while no activity was found using the C14-test. The compound did not demonstrate cytotoxicity in the PrestoBlue assay against a panel of eukaryotic cells. Within a series of direct structural analogues bearing the same or a bioisosteric scaffold, compound 2 was found to have improved antibacterial potency (MIC = 6.25 µg/mL), close to that of erythromycin (MIC = 2.5-5 µg/mL) against the same strain. In contrast to the parent hit, this compound was more active and selective, and provided a robust IP position.
Conclusion: The N-substituted triazolo-azetidine scaffold may be used as a versatile starting point for the development of novel active and selective antibacterial compounds.


2021
Author(s):  
Parsoa Khorsand ◽  
Fereydoun Hormozdiari

Abstract. Large-scale catalogs of common genetic variants (including indels and structural variants) are being created using data from second- and third-generation whole-genome sequencing technologies. However, genotyping these variants in newly sequenced samples is a nontrivial task that requires extensive computational resources. Furthermore, current approaches are mostly limited to specific types of variants and are generally prone to various errors and ambiguities when genotyping complex events. We propose an ultra-efficient approach for genotyping any type of structural variation that is not limited by the shortcomings and complexities of current mapping-based approaches. Our method, Nebula, utilizes changes in the count of k-mers to predict the genotype of structural variants. We have shown that Nebula is not only an order of magnitude faster than mapping-based approaches for genotyping structural variants, but also has accuracy comparable to state-of-the-art approaches. Furthermore, Nebula is a generic framework not limited to any specific type of event. Nebula is publicly available at https://github.com/Parsoa/Nebula.
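A toy illustration of the k-mer-count idea behind such mapping-free genotyping (not Nebula's actual algorithm): a deletion removes the k-mers that span the deleted sequence, so depletion of those k-mers in a sample's reads signals the variant allele. The sequences below are invented examples:

```python
from collections import Counter

def kmer_counts(seq, k=4):
    """Count all overlapping k-mers in a sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

# Hypothetical alleles: the alternate allele deletes the TTTT run.
reference = "ACGTACGTTTTTACGTACGT"
deletion  = "ACGTACGTACGTACGT"

ref_kmers = kmer_counts(reference)
alt_kmers = kmer_counts(deletion)

# k-mers present only on the reference allele act as signature k-mers:
# low counts of these in a sample's reads suggest the deletion genotype.
signature = {km for km in ref_kmers if km not in alt_kmers}
print(sorted(signature))  # ['CGTT', 'GTTT', 'TTAC', 'TTTA', 'TTTT']
```

Counting a small set of signature k-mers in raw reads avoids read mapping entirely, which is what makes this style of genotyping so fast.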

