An Operational Earthquake Forecasting Experiment for Israel: Preliminary Results

2021 ◽  
Vol 9 ◽  
Author(s):  
Giuseppe Falcone ◽  
Ilaria Spassiani ◽  
Yosef Ashkenazy ◽  
Avi Shapira ◽  
Rami Hofstetter ◽  
...  

Operational Earthquake Forecasting (OEF) aims to deliver timely and reliable forecasts that may help to mitigate seismic risk during earthquake sequences. In this paper, we build the first OEF system for the State of Israel and evaluate its reliability. This first version of the OEF system is composed of a single forecasting model, based on a stochastic clustering Epidemic Type Earthquake Sequence (ETES) model. For every day of the forecasting period, January 1, 2016 to November 15, 2020, the OEF-Israel system produces a weekly forecast for target earthquakes with local magnitudes greater than 4.0 and 5.5 in the entire State of Israel. Specifically, it provides space-time-dependent seismic maps of the weekly probabilities, obtained using a fixed set of model parameters estimated by maximum likelihood over a learning period of about 32 years (1983–2015). Following the guidance of the Collaboratory for the Study of Earthquake Predictability (CSEP), we also perform the N- and S-statistical tests to verify the reliability of the forecasts. Results show that the OEF system forecasts a number of events comparable to the observed one and captures the spatial distribution of the real catalog quite well, with the exception of two target events that occurred in low-seismicity regions.
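To make the CSEP-style consistency checks concrete, here is a minimal sketch of the Poisson N-test (in the spirit of Zechar et al., 2010), which asks whether the observed number of target events is consistent with the forecast number; the counts in the example are illustrative, not results from the OEF-Israel experiment.

```python
# Minimal sketch of a CSEP-style Poisson N-test. `n_forecast` is the expected
# number of target events over the test window; `n_observed` is the number
# actually recorded. Example values are illustrative only.
from scipy.stats import poisson

def n_test(n_forecast: float, n_observed: int, alpha: float = 0.05):
    """Two one-sided Poisson consistency probabilities for a rate forecast."""
    # delta1: chance of seeing at least n_observed events (flags underprediction)
    delta1 = 1.0 - poisson.cdf(n_observed - 1, n_forecast)
    # delta2: chance of seeing at most n_observed events (flags overprediction)
    delta2 = poisson.cdf(n_observed, n_forecast)
    consistent = min(delta1, delta2) >= alpha / 2.0  # effective two-sided test
    return delta1, delta2, consistent

# e.g. a weekly forecast expecting 2.4 target events when 3 were observed
print(n_test(2.4, 3))
```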

2020 ◽  
Vol 110 (4) ◽  
pp. 1736-1751 ◽  
Author(s):  
Simone Mancini ◽  
Margarita Segou ◽  
Maximilian Jonas Werner ◽  
Tom Parsons

Operational earthquake forecasting protocols commonly use statistical models for their recognized ease of implementation and robustness in describing the short-term spatiotemporal patterns of triggered seismicity. However, recent advances in physics-based aftershock forecasting reveal performance comparable to the standard statistical counterparts, with significantly improved predictive skill when fault and stress-field heterogeneities are considered. Here, we perform a pseudoprospective forecasting experiment during the first month of the 2019 Ridgecrest (California) earthquake sequence. We develop seven Coulomb rate-and-state models that couple static stress-change estimates with continuum mechanics expressed by the rate-and-state friction laws. Our model parameterization supports a gradually increasing complexity; we start from a preliminary model implementation with simplified slip distributions and spatially homogeneous receiver faults and reach an enhanced one featuring optimized fault constitutive parameters, finite-fault slip models, secondary triggering effects, and spatially heterogeneous planes informed by pre-existing ruptures. The data-rich environment of southern California allows us to test whether incorporating data collected in near-real time during an unfolding earthquake sequence boosts our predictive power. We assess the absolute and relative performance of the forecasts by means of statistical tests used within the Collaboratory for the Study of Earthquake Predictability and compare their skill against a standard benchmark epidemic-type aftershock sequence (ETAS) model for the short term (24 hr after the two Ridgecrest mainshocks) and the intermediate term (one month). Stress-based forecasts anticipate heightened rates along the whole near-fault region and increased expected seismicity rates on the central Garlock fault. Our comparative model evaluation not only supports that faulting heterogeneities coupled with secondary triggering effects are the most critical success components behind physics-based forecasts, but also underlines the importance of model updates that incorporate aftershock data available in near-real time, which reach better performance than the standard ETAS benchmark. We explore the physical basis behind our results by investigating the localized shutdown of pre-existing normal faults in the Ridgecrest near-source area.
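The Coulomb rate-and-state models above rest on the Dieterich (1994) relation between a static stress step and the transient seismicity-rate response. The sketch below implements that kernel in its simplest form; the constitutive parameters (A·sigma, stressing rate, background rate) are illustrative assumptions, not the values calibrated for Ridgecrest.

```python
# Dieterich (1994) rate-and-state response to a static Coulomb stress step,
# the kernel underlying Coulomb rate-and-state (CRS) aftershock forecasts.
# All parameter values are illustrative assumptions.
import numpy as np

def seismicity_rate(t, dcff, a_sigma=0.04, stressing_rate=1e-4, r_bg=1.0):
    """Seismicity rate at times t (days) after a stress step dcff (MPa).

    a_sigma        : constitutive parameter A*sigma (MPa)
    stressing_rate : background stressing rate (MPa/day); t_a = a_sigma / rate
    r_bg           : background seismicity rate (events/day)
    """
    t_a = a_sigma / stressing_rate  # aftershock relaxation time (days)
    gamma = (np.exp(-dcff / a_sigma) - 1.0) * np.exp(-t / t_a) + 1.0
    return r_bg / gamma

t = np.logspace(-2, 2, 5)             # 0.01 to 100 days after the mainshock
print(seismicity_rate(t, dcff=0.1))   # positive step: transient rate increase
print(seismicity_rate(t, dcff=-0.1))  # negative step: stress shadow / shutdown
```

The sign of the stress change produces both effects discussed above: promoted rates on positively stressed faults and the localized shutdown of faults placed in a stress shadow.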


2021 ◽  
Author(s):  
Yavor Kamer ◽  
Shyam Nandan ◽  
Stefan Hiemer ◽  
Guy Ouillon ◽  
Didier Sornette

Nature is scary. You can be sitting at home and the next thing you know you are trapped under the rubble of your own house or sucked into a sinkhole. For millions of years we have been the figures in this precarious scene, and we have found our own ways of dealing with the anxiety. It is natural that we create and consume prophecies, conspiracies and false predictions. Information technologies amplify not only our rational but also our irrational deeds. Social media algorithms, tuned to maximize attention, make sure that misinformation spreads much faster than its counterpart.

What can we do to minimize the adverse effects of misinformation, especially in the case of earthquakes? One option could be to designate one authoritative institute, set up a big surveillance network, and cancel or ban every source of misinformation before it spreads. This might have worked a few centuries ago, but not in this day and age. Instead we propose a more inclusive option: embrace all voices and channel them into an actual, prospective earthquake prediction platform (Kamer et al. 2020). The platform is powered by a global state-of-the-art statistical earthquake forecasting model that provides near real-time earthquake occurrence probabilities anywhere on the globe (Nandan et al. 2020). Using this model as a benchmark in statistical metrics specifically tailored to the prediction problem, we are able to distill all these voices and quantify the essence of predictive skill. This approach has several advantages. Rather than trying to silence or denounce, we listen to and evaluate each claim and report the predictive skill of the source. We engage the public and allow them to take part in a scientific experiment that will increase their risk awareness. We effectively demonstrate that anybody with an internet-connected device can make an earthquake prediction, but that it is not so trivial to achieve skillful predictive performance.

Here we present initial results from the global earthquake prediction experiment that we have been conducting on www.richterx.com for the past two years, yielding more than 10,000 predictions. These results will hopefully demystify the act of predicting an earthquake in the eyes of the public, so that the next time someone forwards a prediction message it arouses more scrutiny than panic or distaste.

Nandan, S., Kamer, Y., Ouillon, G., Hiemer, S., Sornette, D. (2020). Global models for short-term earthquake forecasting and predictive skill assessment. European Physical Journal ST. doi: 10.1140/epjst/e2020-000259-3
Kamer, Y., Nandan, S., Ouillon, G., Hiemer, S., Sornette, D. (2020). Democratizing earthquake predictability research: introducing the RichterX platform. European Physical Journal ST. doi: 10.1140/epjst/e2020-000260-2
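To illustrate how a prediction can be scored against a benchmark model, here is a hedged sketch of a log probability gain for a binary space-time-magnitude prediction; it conveys the idea of reference-based skill, not the exact metric of the RichterX platform (for that, see Kamer et al., 2020).

```python
# Sketch of reference-based skill scoring for a binary earthquake prediction.
# p_ref : benchmark model's probability of at least one qualifying event in
#         the predicted space-time-magnitude window (assumed given).
# p_pred: the predictor's stated probability for the same window.
import math

def log_probability_gain(p_ref: float, p_pred: float, event_occurred: bool) -> float:
    """Log-likelihood gain of the prediction over the reference forecast."""
    if event_occurred:
        return math.log(p_pred / p_ref)
    return math.log((1.0 - p_pred) / (1.0 - p_ref))

# A bold prediction (0.9) in a window the benchmark deems unlikely (0.05)
# scores well only if the event actually happens; otherwise it loses heavily.
print(log_probability_gain(0.05, 0.9, True))   # ~ +2.89
print(log_probability_gain(0.05, 0.9, False))  # ~ -2.25
```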


2012 ◽  
Vol 2 (1) ◽  
pp. 3
Author(s):  
David Alan Rhoades ◽  
Paul G. Somerville ◽  
Felipe Dimer de Oliveira ◽  
Hong Kie Thio

The Every Earthquake a Precursor According to Scale (EEPAS) long-range earthquake forecasting model has been shown to be informative in several seismically active regions, including New Zealand, California and Japan. In previous applications of the model, the tectonic setting of earthquakes has been ignored. Here we distinguish crustal, plate interface, and slab earthquakes and apply the model to earthquakes with magnitude M ≥ 4 in the Japan region from 1926 onwards. The target magnitude range is M ≥ 6; the fitting period is 1966-1995; and the testing period is 1996-2005. In forecasting major slab earthquakes, it is optimal to use only slab and interface events as precursors; for major interface events, only interface events; and for major crustal events, only crustal events. For the smoothed-seismicity component of the EEPAS model, it is optimal to use slab and interface events for earthquakes in the slab, interface events only for earthquakes on the interface, and crustal and interface events for crustal earthquakes. The optimal model parameters indicate that the precursor areas for slab earthquakes are relatively small compared to those for earthquakes in other tectonic categories, and that the precursor times and precursory earthquake magnitudes for crustal earthquakes are relatively large. The optimal models fit the learning data sets better than the raw EEPAS model, with an average information gain per earthquake of about 0.4. The average information gain is similar in the testing period, although it is higher for crustal earthquakes and lower for slab and interface earthquakes than in the learning period. These results show that earthquake interactions are stronger between earthquakes of similar tectonic types, and that distinguishing tectonic types improves forecasts by enhancing the depth resolution where tectonic categories of earthquakes are vertically separated. However, when depth resolution is ignored, the model formed by aggregating the optimal forecasts for each tectonic category performs no better than the raw EEPAS model.
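For readers unfamiliar with the metric, the sketch below shows one common way the average information gain per earthquake is computed: the difference in log-likelihood between the optimized and reference models, normalized by the number of target events. The numbers are invented for illustration.

```python
# Average information gain per earthquake between two forecasting models,
# defined here as the per-event difference in log-likelihood. The
# log-likelihoods and event count below are invented for illustration.
import math

def mean_information_gain(loglik_model: float, loglik_ref: float, n_events: int) -> float:
    """Information gain per earthquake, in natural-log units."""
    return (loglik_model - loglik_ref) / n_events

ig = mean_information_gain(-820.0, -860.0, 100)  # gain of 0.4 per event
# A gain of 0.4 means each target earthquake is on average exp(0.4) ~ 1.5
# times more probable under the optimized model than under the reference.
print(ig, math.exp(ig))
```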


2019 ◽  
Vol 66 (3) ◽  
pp. 363-388
Author(s):  
Serkan Aras ◽  
Manel Hamdi

When the literature on applications of neural networks is examined, a substantial open issue is how large the training data set should be when modelling a time series with neural networks. The aim of this paper is to determine the size of the training data used to construct a forecasting model via a multiple-breakpoint test, and to compare its performance with two common approaches, namely using all available data and using just the last two years of data. Furthermore, the importance of the selection of the final neural network model is investigated in detail. The results obtained from daily crude oil prices indicate that the data since the last structural change lead to simpler neural network architectures and have an advantage in reaching more accurate forecasts in terms of MAE. In addition, the statistical tests show that there is a statistically significant interaction between data size and stopping rule.
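The sketch below illustrates the paper's central idea: locate the most recent structural break in the series and train the network only on the data after it. The PELT detector from the `ruptures` package stands in for the multiple-breakpoint test used in the paper, and the synthetic price series and all hyperparameters are invented for illustration.

```python
# Train a small neural network only on data since the last structural break.
# ruptures' PELT detector is a stand-in for the paper's multiple-breakpoint
# test; the synthetic series and all hyperparameters are illustrative.
import numpy as np
import ruptures as rpt
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
prices = np.concatenate([rng.normal(40, 2, 500),   # regime 1
                         rng.normal(70, 3, 500)])  # regime 2 after a break

# 1. Detect breakpoints and keep only the segment after the last real break
#    (predict() returns breakpoints ending with the series length itself).
breaks = rpt.Pelt(model="l2").fit(prices.reshape(-1, 1)).predict(pen=50)
last_break = breaks[-2] if len(breaks) > 1 else 0
recent = prices[last_break:]

# 2. One-step-ahead forecasting with a small MLP on lagged values.
lags = 5
X = np.column_stack([recent[i:len(recent) - lags + i] for i in range(lags)])
y = recent[lags:]
split = int(0.8 * len(y))
mlp = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
mlp.fit(X[:split], y[:split])
print("MAE:", mean_absolute_error(y[split:], mlp.predict(X[split:])))
```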


1986 ◽  
Vol 8 (3) ◽  
pp. 149-163 ◽  
Author(s):  
Daniel M. Landers ◽  
Stephen H. Boutcher ◽  
Min Q. Wang

In the past 7 years JSP has evolved to become a respected sport psychology journal. The journal has been uncompromising in the strong research posture it has taken. It is currently the only journal entirely devoted to sport psychology that uses a single set of criteria for evaluating the scientific merit of submitted manuscripts. Over this period the submitted manuscripts have shown an increase in the number of female principal authors, as well as authors affiliated with departments other than physical education. Survey studies were the most common submissions, but lately there has been a greater emphasis on field experimental studies. Some potential problem areas are noted in subject selection and choice of statistical tests. An examination of research areas revealed that in recent years "motivation" was the most frequently submitted topic; other research areas appeared to vary in terms of their publishability. The common methodological problems associated with the rejection of these types of manuscripts are discussed.


2018 ◽  
Vol 7 (4) ◽  
pp. 700-701
Author(s):  
Brijesh Sathian ◽  
Edwin R Van Teijlingen

There is an urgent need for an earthquake forecasting model for Nepal in the current scenario. It can be developed by the scientists of Nepal with the help of experienced international scientists, and would help the Nepalese to take timely and necessary precautions. We would argue that, above all, we need to use earthquake prediction knowledge to improve disaster preparedness among local communities, service providers (hospitals, non-governmental organizations, police, etc.), government policy-makers and international agencies. On the whole, both seismology and public health are most successful when focusing on prevention rather than on prediction per se. J Epidemiol. 2017;7(4):700-701.


Author(s):  
Mridu Sinha ◽  
Shashi Bala Arya ◽  
Shashi Saxena ◽  
Nitant Sood

Background: Induction of labour (IOL) is a deliberate iatrogenic intervention to terminate pregnancy and achieve vaginal delivery when a valid indication exists. It should be carefully supervised, as it is a challenge to the clinician, the mother and the fetus. The aim of this study was to identify the common indications for IOL in a tertiary care teaching centre and its feto-maternal outcome.
Methods: An institution-based retrospective observational study was conducted to describe the prevalence of labour induction and the factors associated with its outcome over one year, from January 2018 to December 2018, at SRMS IMS, Bareilly. Logistic regression analysis was employed to assess the relative effect of determinants, and statistical tests were used to examine the associations.
Results: Most of the patients were younger primigravidas. Idiopathic oligohydramnios and postdatism were the commonest indications for induction of labour, and Misoprost was the drug most commonly used. Although the majority had vaginal deliveries, a change to the combined method was significantly associated with an increased likelihood of LSCS. Similarly, the combined method was associated with an increased rate of maternal cervico-vaginal tears/lacerations. However, there was no association with post-partum hemorrhage, meconium-stained liquor or fetal distress.
Conclusions: Common indications for induction of labour were oligohydramnios and postdatism. Misoprost can be safely used for induction of labour without any increased risk of LSCS or fetal/neonatal complications.
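As a purely illustrative sketch of the kind of logistic regression analysis described in the Methods, the snippet below models the odds of LSCS as a function of induction method; the data, variable coding, and effect size are all invented, and the study's actual covariates are not reproduced here.

```python
# Illustrative logistic regression: odds of LSCS vs. induction method.
# All data, variable names, and the assumed effect size are invented.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 300
combined = rng.integers(0, 2, n)              # 0 = single agent, 1 = combined method
logit = -1.0 + 0.9 * combined                 # assumed log-odds relationship
lscs = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))  # 1 = LSCS, 0 = vaginal

X = sm.add_constant(combined.astype(float))   # intercept + method indicator
fit = sm.Logit(lscs, X).fit(disp=0)
print(fit.params)                             # fitted log-odds coefficients
print(np.exp(fit.params[1]))                  # estimated OR for combined method
```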


2019 ◽  
Vol 219 (3) ◽  
pp. 2148-2164
Author(s):  
A M Lombardi

Operational earthquake forecasting (OEF) is a procedure aimed at informing communities about how seismic hazard changes with time. This can help them live with seismicity and mitigate the risk of destructive earthquakes. A successful short-term prediction scheme has not yet been produced, but the search for one should not be abandoned. This requires more research on seismogenic processes and, specifically, the inclusion of any available information about earthquakes in models, to improve forecasts of future events at any spatio-temporal-magnitude scale. Until now, the short- and long-term perspectives on forecasting earthquake occurrence have followed separate paths, involving different data and distinct models; yet they are not so different and share common features, being parts of the same physical process. Research on earthquake predictability can help in the search for a common path between the different forecast perspectives. This study aims to improve the modelling of long-term features of seismicity inside the epidemic type aftershock sequence (ETAS) model, widely used for short-term forecasting and OEF procedures. Specifically, a more comprehensive estimation of the background seismicity rate inside the ETAS model is attempted by merging different types of data (instrumental seismological, historical, geological), such that information on faults and on long-term seismicity complements the instrumental data on which ETAS models are generally set up. The main finding is that long-term historical seismicity and geological fault data improve the pseudo-prospective forecasts of independent seismicity. The study is divided into three parts. The first consists of model formulation and parameter estimation on recent seismicity of Italy; specifically, two versions of the ETAS model are compared: a 'standard', previously published formulation based only on instrumental seismicity, and a new version that integrates different types of data for the background seismicity estimation. Second, a pseudo-prospective test is performed on independent seismicity, both to test the reliability of the formulated models and to compare them, in order to identify the best version. Finally, a prospective forecast is made to point out differences and similarities between the two models in predicting future seismicity. This study must be considered in the context of its limitations; nevertheless, it demonstrates the usefulness of a more sophisticated estimation of the background rate within short-term earthquake modelling.
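To make explicit which term the different data types inform, here is a minimal sketch of the temporal ETAS conditional intensity: the background rate mu is the quantity the study re-estimates by merging instrumental, historical, and geological information. Parameter values are illustrative, not the fitted Italian values.

```python
# Temporal ETAS conditional intensity: a constant background rate mu plus
# Omori-Utsu aftershock contributions from past events. Parameter values
# are illustrative, not those estimated for Italy in the study.
import numpy as np

def etas_intensity(t, history, mu=0.2, K=0.05, alpha=1.2, c=0.01, p=1.1, m0=3.0):
    """lambda(t | history) in events/day.

    history: iterable of (t_i, m_i) with t_i < t and m_i >= m0.
    mu     : background rate, the term improved by historical and fault data.
    """
    rate = mu
    for t_i, m_i in history:
        rate += K * np.exp(alpha * (m_i - m0)) * (t - t_i + c) ** (-p)
    return rate

events = [(0.0, 5.8), (0.5, 4.2), (2.0, 4.6)]  # (day, magnitude)
print(etas_intensity(3.0, events))             # intensity three days in
```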


2021 ◽  
Author(s):  
Robert Shcherbakov

Earthquakes trigger subsequent earthquakes. They form clusters and swarms in space and in time. This is a direct manifestation of non-Poisson behavior in the occurrence of earthquakes, where earthquake magnitudes and the time intervals between successive events are not independent and are influenced by past seismicity. As a result, the distribution of the number of earthquakes is no longer strictly Poisson, and the statistics of the largest events deviate from the generalized extreme value (GEV) distribution. In statistical seismology, the occurrence of earthquakes is typically approximated by a stochastic marked point process. Among the different models, the epidemic type aftershock sequence (ETAS) model is the most successful in reproducing several key aspects of seismicity. Recent analysis suggests that the ETAS model generates sequences of events that are not Poisson. This becomes important when ETAS-based models are used for earthquake forecasting (Shcherbakov et al., Nature Comms., 2019). In this work, I consider a Bayesian framework combined with the ETAS model to constrain the magnitudes of the largest expected aftershocks during a future forecasting time interval. This includes MCMC sampling of the posterior distribution of the ETAS parameters and computation of the Bayesian predictive distribution for the magnitudes of the largest expected events. To validate the forecasts, the statistical tests developed by the Collaboratory for the Study of Earthquake Predictability (CSEP) are reformulated for the Bayesian framework. In addition, I define and compute a Bayesian p-value to evaluate the consistency of the forecasted extreme earthquakes during each forecasting time interval. The Bayesian p-value gives the probability that the largest forecasted earthquake is more extreme than the observed one. The suggested approach is applied to the 2019 Ridgecrest earthquake sequence to retrospectively forecast the occurrence of the largest aftershocks (Shcherbakov, JGR, 2021). The results indicate that the Bayesian approach combined with the ETAS model outperforms the approach based on the Poisson assumption, which uses the extreme value distribution and the Omori law.
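A minimal sketch of the Bayesian p-value described above: given predictive samples of the largest aftershock magnitude (one per posterior draw of the ETAS parameters), it is the fraction of samples exceeding the observed maximum. The MCMC step is stubbed here with synthetic draws, so the numbers are purely illustrative.

```python
# Bayesian p-value for the largest expected aftershock: the probability,
# under the predictive distribution, that the largest forecasted event is
# more extreme than the observed one. Predictive samples are synthetic
# stand-ins for draws produced by an actual ETAS MCMC sampler.
import numpy as np

rng = np.random.default_rng(42)
m_max_predictive = rng.gumbel(loc=4.8, scale=0.4, size=10_000)  # stubbed samples

def bayesian_p_value(m_max_samples: np.ndarray, m_max_observed: float) -> float:
    """P(largest forecasted magnitude > largest observed magnitude)."""
    return float(np.mean(m_max_samples > m_max_observed))

# Values close to 0 or 1 flag inconsistency between forecast and observation.
print(bayesian_p_value(m_max_predictive, 5.5))
```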

