Global Earthquake Forecasting System (GEFS): The challenges ahead

2021 · Vol 230 (1) · pp. 473–490
Author(s): A. Mignan, G. Ouillon, D. Sornette, F. Freund

Abstract We conclude this special issue on the Global Earthquake Forecasting System (GEFS) by briefly reviewing and analyzing the claims of non-seismic precursors made in the present volume, and by reflecting on current limitations and future directions. We find that most studies presented in this special volume, taken individually, do not provide strong enough evidence of non-seismic precursors to large earthquakes. The majority of the presented results are hampered by the fact that the task at hand is susceptible to biases in data selection and to overfitting. The most encouraging results are obtained for ground-based geoelectric signals, although the probability gain is likely small compared to an earthquake-clustering baseline. The only systematic search on satellite data available so far, that of the DEMETER mission, did not find a robust precursory pattern. The conclusion we can draw is that the overall absence of convincing evidence is likely due to a failure to systematically apply robust statistical methods and to integrate scientific knowledge from different fields. Most authors are specialists in their own fields, whereas the study of earthquake precursors requires a systems approach combined with knowledge of many specific characteristics of seismicity. Relating non-seismic precursors to earthquakes remains a challenging multidisciplinary field of investigation. The plausibility of these precursors, predicted by models of lithosphere-atmosphere-ionosphere coupling, together with the suggestive evidence collected here, calls for further investigation. The primary goal of the GEFS is thus to build a global database of candidate signals, which could potentially improve earthquake predictability (if the weak signals observed are real and false positives are sufficiently uncorrelated between different data sources). Such a stacking of disparate and voluminous data will require big-data storage and machine learning pipelines, which have become feasible only recently. This special issue compiled an eclectic list of non-seismic precursor candidates, which is in itself a valuable source of information for seismologists, geophysicists and other scientists who may not be familiar with such investigations. It also forms the foundation for a coherent, multidisciplinary collaboration on earthquake prediction.
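The probability gain mentioned above is a simple, concrete quantity. As a rough illustration, here is a minimal Python sketch (our own toy construction, not code from any GEFS paper) of the gain of a precursor-based alarm strategy over a time-invariant baseline; against an earthquake-clustering baseline such as ETAS, the denominator would instead be the clustering model's event probability, and the gain would typically be smaller.

```python
# Probability gain of a candidate precursor relative to a baseline forecast.
# Minimal illustrative sketch; the function name and toy numbers are ours.

def probability_gain(hits, total_events, alarm_duration, total_duration):
    """Gain G = (hit rate) / (fraction of time covered by alarms).
    G > 1 means the precursor outperforms a time-invariant baseline."""
    hit_rate = hits / total_events
    alarm_fraction = alarm_duration / total_duration
    return hit_rate / alarm_fraction

# Toy example: 12 of 20 target earthquakes fall inside alarm windows
# that together cover 10% of the observation period.
print(probability_gain(hits=12, total_events=20,
                       alarm_duration=0.1, total_duration=1.0))  # 6.0
```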

2021
Author(s): Yavor Kamer, Shyam Nandan, Stefan Hiemer, Guy Ouillon, Didier Sornette

Nature is scary. You can be sitting at home and the next thing you know you are trapped under the rubble of your own house or sucked into a sinkhole. For millions of years we have been the figurines of this precarious scene, and we have found our own ways of dealing with the anxiety. It is natural that we create and consume prophecies, conspiracies and false predictions. Information technologies amplify not only our rational but also our irrational deeds. Social media algorithms, tuned to maximize attention, ensure that misinformation spreads much faster than its truthful counterpart.

What can we do to minimize the adverse effects of misinformation, especially in the case of earthquakes? One option could be to designate one authoritative institute, set up a big surveillance network, and cancel or ban every source of misinformation before it spreads. This might have worked a few centuries ago, but not in this day and age. Instead we propose a more inclusive option: embrace all voices and channel them into an actual, prospective earthquake prediction platform (Kamer et al. 2020). The platform is powered by a global state-of-the-art statistical earthquake forecasting model that provides near real-time earthquake occurrence probabilities anywhere on the globe (Nandan et al. 2020). Using this model as a benchmark in statistical metrics specifically tailored to the prediction problem, we are able to distill all these voices and quantify the essence of predictive skill. This approach has several advantages. Rather than trying to silence or denounce, we listen to and evaluate each claim and report the predictive skill of its source. We engage the public and allow them to take part in a scientific experiment that increases their risk awareness. We effectively demonstrate that anybody with an internet-connected device can make an earthquake prediction, but that it is not so trivial to achieve skillful predictive performance.

Here we present initial results from the global earthquake prediction experiment that we have been conducting on www.richterx.com for the past two years, yielding more than 10,000 predictions. These results will hopefully demystify the act of predicting an earthquake in the eyes of the public, so that the next time someone forwards a prediction message it arouses more scrutiny than panic or distaste.

Nandan, S., Kamer, Y., Ouillon, G., Hiemer, S., Sornette, D. (2020). Global models for short-term earthquake forecasting and predictive skill assessment. European Physical Journal ST. doi: 10.1140/epjst/e2020-000259-3
Kamer, Y., Nandan, S., Ouillon, G., Hiemer, S., Sornette, D. (2020). Democratizing earthquake predictability research: introducing the RichterX platform. European Physical Journal ST. doi: 10.1140/epjst/e2020-000260-2
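As a back-of-the-envelope illustration of how predictions can be scored against a reference model, the sketch below accumulates a log-likelihood ratio between a predictor's implied probability and the reference model's probability for each predicted window. This is a generic scoring rule of our own choosing, not necessarily the exact RichterX metric.

```python
import math

# Hedged sketch of scoring binary earthquake predictions against a
# reference forecasting model. All names and numbers are illustrative.

def skill(predictions, p_pred=0.9):
    """predictions: list of (p_ref, occurred) pairs, where p_ref is the
    reference model's probability for the predicted window and occurred
    is True if a qualifying earthquake happened."""
    score = 0.0
    for p_ref, occurred in predictions:
        if occurred:
            score += math.log(p_pred / p_ref)
        else:
            score += math.log((1 - p_pred) / (1 - p_ref))
    return score  # > 0: the predictor beats the reference model

print(skill([(0.02, False), (0.05, True), (0.01, False)]))
```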


2005 · Vol 12 (6) · pp. 965–977
Author(s): J. R. Holliday, K. Z. Nanjo, K. F. Tiampo, J. B. Rundle, D. L. Turcotte

Abstract. No proven method is currently available for the reliable short-term prediction of earthquakes (minutes to months). However, it is possible to make probabilistic hazard assessments of earthquake risk. In this paper we discuss a new approach to earthquake forecasting based on a pattern informatics (PI) method which quantifies temporal variations in seismicity. The output, which is based on an association of small earthquakes with future large earthquakes, is a map of areas in a seismogenic region ("hotspots") where earthquakes are forecast to occur in a future 10-year time span. This approach has been successfully applied to California, to Japan, and on a worldwide basis. Because a sharp decision threshold is used, these forecasts are binary: an earthquake is forecast either to occur or not to occur. The standard approach to the evaluation of a binary forecast is the relative (or receiver) operating characteristic (ROC) diagram, which is a more restrictive test and less subject to bias than maximum-likelihood tests. To test our PI method, we made two types of retrospective forecasts for California. The first uses the PI method and the second is a relative intensity (RI) forecast based on the hypothesis that future large earthquakes will occur where most smaller earthquakes have occurred in the recent past. While both retrospective forecasts are for the ten-year period 1 January 2000 to 31 December 2009, we performed an interim analysis five years into the forecast. The PI method outperforms the RI method under most circumstances.
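The ROC evaluation described above can be reproduced in a few lines. The sketch below (our illustration, not the authors' code) sweeps a decision threshold over gridded forecast scores, such as PI or RI hotspot intensities, and traces hit rate against false-alarm rate.

```python
import numpy as np

# Minimal ROC sketch for gridded binary forecasts such as hotspot maps.

def roc_curve(scores, events):
    """scores: 1-D array of forecast scores per grid cell.
    events: boolean array, True where a target earthquake occurred."""
    order = np.argsort(scores)[::-1]      # cells in descending score order
    hits = np.cumsum(events[order])       # true positives at each threshold
    falses = np.cumsum(~events[order])    # false positives at each threshold
    return falses / (~events).sum(), hits / events.sum()

rng = np.random.default_rng(0)
scores = rng.random(1000)
events = rng.random(1000) < 0.02 * (1 + 4 * scores)  # toy map with skill
fpr, tpr = roc_curve(scores, events)
print(np.trapz(tpr, fpr))  # area under the ROC curve; 0.5 = no skill
```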


2014 · Vol 85 (5) · pp. 961–969
Author(s): W. Marzocchi, A. M. Lombardi, E. Casarotti

2019 · Vol 219 (3) · pp. 2148–2164
Author(s): A. M. Lombardi

SUMMARY Operational earthquake forecasting (OEF) is a procedure aimed at informing communities about how seismic hazard changes with time. This can help them live with seismicity and mitigate the risk of destructive earthquakes. A successful short-term prediction scheme has not yet been produced, but the search for one should not be abandoned. This requires more research on seismogenic processes and, specifically, the inclusion of all available information about earthquakes in models, to improve forecasts of future events at any spatio-temporal-magnitude scale. Short- and long-term forecasting of earthquake occurrence have, up to now, followed separate paths, involving different data and distinct models. Yet they are not so different and share common features, being parts of the same physical process. Research on earthquake predictability can help identify a common path for the different forecast perspectives. This study aims to improve the modelling of long-term features of seismicity within the epidemic-type aftershock sequence (ETAS) model, which is widely used for short-term forecasting and OEF procedures. Specifically, a more comprehensive estimation of the background seismicity rate inside the ETAS model is attempted by merging different types of data (instrumental seismological, historical, geological), such that information on faults and on long-term seismicity complements the instrumental data on which ETAS models are generally built. The main finding is that long-term historical seismicity and geological fault data improve pseudo-prospective forecasts of independent seismicity. The study is divided into three parts. The first consists of model formulation and parameter estimation on recent seismicity of Italy. Specifically, two versions of the ETAS model are compared: a 'standard', previously published formulation based only on instrumental seismicity, and a new version integrating different types of data for background seismicity estimation. Second, a pseudo-prospective test is performed on independent seismicity, both to assess the reliability of the formulated models and to compare them in order to identify the best version. Finally, a prospective forecast is made to point out differences and similarities between the two models in predicting future seismicity. This study must be considered in the context of its limitations; nonetheless, it demonstrates the usefulness of a more sophisticated estimation of the background rate within short-term earthquake modelling.
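For readers unfamiliar with ETAS, the temporal conditional intensity underlying the model has the form lambda(t) = mu + sum over past events of K * exp(alpha*(m_i - M0)) * (t - t_i + c)^(-p); the background term mu is precisely the quantity this study enriches with historical and geological data. Below is a minimal sketch with illustrative parameter values, not the fitted values from the paper.

```python
import numpy as np

# Sketch of the temporal ETAS conditional intensity. The background rate
# mu is the term that can be informed by historical/geological data.

def etas_rate(t, catalog, mu=0.2, K=0.05, alpha=1.8, c=0.01, p=1.1, m0=3.0):
    """catalog: list of (t_i, m_i) past events. Returns lambda(t),
    the expected event rate at time t (events per unit time)."""
    rate = mu  # background seismicity rate
    for t_i, m_i in catalog:
        if t_i < t:
            # Omori-Utsu decay scaled by productivity of each past event
            rate += K * np.exp(alpha * (m_i - m0)) * (t - t_i + c) ** -p
    return rate

print(etas_rate(10.0, [(2.0, 5.5), (9.5, 4.0)]))  # toy catalog, events/day
```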


2016
Author(s): S. Rémy, A. Veira, R. Paugam, M. Sofiev, J. W. Kaiser, ...

Abstract. The Global Fire Assimilation System (GFAS) assimilates fire radiative power (FRP) observations from satellite-based sensors to produce daily estimates of biomass burning emissions. It has been extended to include information about injection heights provided by two distinct algorithms, which also use meteorological information from the operational weather forecasts of ECMWF. Injection heights are provided by the semi-empirical IS4FIRES parameterization and an analytical one-dimensional plume rise model (PRM). The two algorithms provide injection-height estimates for each satellite pixel. Similarly to how FRP observations are processed in GFAS, these estimates are then gridded, averaged and assimilated, using a simple observation operator, so as to fill the observational gaps. A global database of daily biomass burning emissions and injection heights at 0.1° resolution has been produced for 2003–2015. The database is being extended in near-real-time with the operational GFAS service of the Copernicus Atmosphere Monitoring Service (CAMS). The two injection-height datasets were compared against a new dataset of satellite-based plume height observations. The IS4FIRES parameterization showed better overall agreement with the observations, while the PRM was better at capturing the variability of injection heights and at estimating the injection heights of large fires. The results from both algorithms also differ depending on vegetation type. A positive trend with time in median injection heights from the PRM was noted, less marked for the IS4FIRES parameterization; this is driven by a negative trend in the number of small fires, especially in regions such as South America. The use of biomass burning emission heights from GFAS in atmospheric composition forecasts was assessed in two case studies: the South AMerican Biomass Burning Analysis (SAMBBA) campaign, which took place in September 2012 in Brazil, and a series of large fire events in the western U.S. in August 2013. For these case studies, forecasts of biomass burning aerosol species by the Composition-Integrated Forecasting System (C-IFS) of CAMS better reproduced the observed vertical distribution when using PRM injection heights from GFAS.
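The gridding-and-averaging step described above is conceptually simple. The sketch below (our illustration; GFAS additionally applies an observation operator and assimilation to fill gaps) bins pixel-level injection-height estimates onto a regular 0.1° grid and averages per cell.

```python
import numpy as np

# Toy gridding step: average per-pixel injection heights on a 0.1 deg grid.

def grid_average(lons, lats, heights, res=0.1):
    nx, ny = int(360 / res), int(180 / res)
    total = np.zeros((ny, nx))
    count = np.zeros((ny, nx))
    ix = ((np.asarray(lons) + 180) / res).astype(int) % nx
    iy = ((np.asarray(lats) + 90) / res).astype(int) % ny
    np.add.at(total, (iy, ix), heights)   # accumulate heights per cell
    np.add.at(count, (iy, ix), 1)         # count pixels per cell
    with np.errstate(invalid="ignore", divide="ignore"):
        return np.where(count > 0, total / count, np.nan)

# Two pixels falling in the same 0.1 deg cell are averaged:
grid = grid_average([-60.02, -60.04], [-10.01, -10.02], [1500.0, 2100.0])
```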


Author(s): A. Bhatia, S. Pasari, A. Mehta

Abstract. Earthquakes are among the most devastating natural calamities: they take thousands of lives, leave millions more homeless, and deprive them of basic necessities. Earthquake forecasting can greatly reduce the death toll and economic loss suffered by an affected region. This study presents an earthquake forecasting system based on artificial neural networks (ANN). Two different techniques are used, the first focusing on the accuracy of a multilayer perceptron evaluated with different inputs and different sets of hyperparameters. The limited earthquake data in the first experiment led us to explore another technique, known as nowcasting of earthquakes. The nowcasting technique determines the current progression of the earthquake cycle of higher-magnitude earthquakes by counting the smaller earthquake events in the same region. To implement the nowcasting method, a long short-term memory (LSTM) neural network architecture is used, because such networks are among the most recent and promising developments in time-series analysis. Results of the different experiments are discussed along with their implications.
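A minimal sketch of the nowcasting setup described above: an LSTM consumes a sequence of small-earthquake counts and outputs the estimated progression of the large-earthquake cycle. The architecture, tensor shapes, and use of PyTorch are our illustrative assumptions, not the authors' exact configuration.

```python
import torch
import torch.nn as nn

# Illustrative LSTM for earthquake nowcasting: small-event counts in,
# estimated cycle progression (a value in [0, 1]) out.

class NowcastLSTM(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):  # x: (batch, steps, 1) counts of small events
        out, _ = self.lstm(x)
        return torch.sigmoid(self.head(out[:, -1]))  # progression in [0, 1]

model = NowcastLSTM()
counts = torch.randint(0, 20, (8, 60, 1)).float()  # 8 windows of 60 steps
print(model(counts).shape)                          # torch.Size([8, 1])
```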


Author(s): Lungfa Collins Wuyep, Umar Afegbua Kadiri, Isogun Adeyemi Monday, Nanshin Emmanuel Nansak, Lumi Zakka, ...

Despite the doubt cast by some quarters on the possibility of earthquake forecasting, more and more countries, even at the highest governmental levels, realize that doing nothing is an ostrich position in the face of the real difficulties associated with creating a working forecasting system. Nigeria was long believed to be aseismic; however, the seismic record of Nigeria from 1933 to 2021 demonstrates the contrary, as numerous earthquakes have been recorded in Nigeria over the years. With the development of observation techniques and theoretical knowledge of geochemistry, geochemical observation of fault gases has again become a research focus in recent years. Rn, Hg, H2, and other species are used for geochemical observations. 222Rn has a half-life of 3.825 days; for a magnitude 5.0 earthquake, precursory radon anomalies are expected to be detectable at distances of no more than about 142 km. Mercury and other elements serve as important tracers for earthquake prediction and play an important role in revealing the relationship between fluids in fault zones and the occurrence of earthquakes; their range for a magnitude 5.0 earthquake is limited to about 200 km. Hydrogen concentrations have been monitored for precursory variations in many fault systems, using either discrete sampling with laboratory analysis or continuous monitoring of ground gas with hydrogen-sensitive fuel cells. Precursory changes in groundwater chemistry are often attributed to the mixing of fluids from two or more chemically distinct aquifers; the physical mechanism responsible for this mixing is, however, not well established.
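The 142 km figure quoted above is consistent with the widely used Dobrovolsky strain-radius relation, R = 10^(0.43 M) km (Dobrovolsky et al., 1979). The short sketch below checks this and the decay of a 222Rn anomaly over time; it is our illustrative calculation, not the paper's.

```python
import math

# Dobrovolsky precursor radius and 222Rn decay, for a quick sanity check.

def dobrovolsky_radius_km(magnitude):
    """Strain radius R = 10**(0.43*M) km within which precursory
    phenomena are conventionally considered detectable."""
    return 10 ** (0.43 * magnitude)

def radon_fraction_remaining(days, half_life_days=3.825):
    """Fraction of an initial 222Rn anomaly left after 'days' days."""
    return 0.5 ** (days / half_life_days)

print(dobrovolsky_radius_km(5.0))      # ~141 km, matching the ~142 km above
print(radon_fraction_remaining(7.65))  # 0.25 after two half-lives
```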


2022
Author(s): Marcus Herrmann, Ester Piegari, Warner Marzocchi

Abstract The magnitude–frequency distribution (MFD) of earthquakes is typically modeled with the (tapered) Gutenberg–Richter relation. The main parameter of this relation, the b-value, controls the relative rate of small and large earthquakes. Resolving spatiotemporal variations of the b-value is critical to understanding the earthquake occurrence process and improving earthquake forecasting; however, this variation is not well understood. Here we present unexpected MFD variability using a high-resolution earthquake catalog of the 2016–2017 central Italy sequence. Isolating seismicity clusters reveals that the MFD differs in nearby clusters, varies or remains constant in time depending on the cluster, and features an unexpected b-value increase in the cluster where the largest event will occur. These findings suggest a strong influence of the heterogeneity and complexity of tectonic structures on the MFD. They raise the question of the appropriate spatiotemporal scale for resolving the b-value, which poses a serious obstacle to interpreting and using the MFD in earthquake forecasting.
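The b-value variations discussed above are usually quantified with the maximum-likelihood estimator of Aki (1965), with Utsu's half-bin correction when magnitudes are binned. A minimal sketch on synthetic Gutenberg–Richter data:

```python
import numpy as np

# Maximum-likelihood b-value estimate on a toy Gutenberg-Richter sample.

def b_value(mags, mc, dm=0.0):
    """mags: magnitudes; mc: completeness magnitude; dm: magnitude
    binning width (0 for continuous magnitudes, e.g. 0.1 if binned)."""
    mags = np.asarray(mags)
    mags = mags[mags >= mc]
    return np.log10(np.e) / (mags.mean() - (mc - dm / 2))

# Synthetic catalog with true b = 1: magnitudes above mc are
# exponentially distributed with rate b * ln(10).
rng = np.random.default_rng(1)
toy = 3.0 + rng.exponential(scale=1 / np.log(10), size=5000)
print(b_value(toy, mc=3.0))  # close to 1.0
```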

