New Data Assimilation and Fusion Strategy, towards Timely Earthquake Prediction and Warning

2018 ◽  
Vol 1 (3) ◽  
Author(s):  
Pierre-Richard Cornely


The Future of the Field of Earthquake Forecasting

2021 ◽  
Author(s):  
Yavor Kamer ◽  
Shyam Nandan ◽  
Stefan Hiemer ◽  
Guy Ouillon ◽  
Didier Sornette

<p>Nature is scary. You can be sitting at home and the next thing you know you are trapped under the rubble of your own house or sucked into a sinkhole. For millions of years we have been the figures in this precarious scene, and we have found our own ways of dealing with the anxiety. It is natural that we create and consume prophecies, conspiracies and false predictions. Information technologies amplify not only our rational but also our irrational deeds. Social media algorithms, tuned to maximize attention, make sure that misinformation spreads much faster than accurate information.</p><p>What can we do to minimize the adverse effects of misinformation, especially in the case of earthquakes? One option would be to designate a single authoritative institute, set up a large surveillance network and cancel or ban every source of misinformation before it spreads. This might have worked a few centuries ago, but not in this day and age. Instead we propose a more inclusive option: embrace all voices and channel them into an actual, prospective earthquake prediction platform (Kamer et al. 2020). The platform is powered by a global state-of-the-art statistical earthquake forecasting model that provides near real-time earthquake occurrence probabilities anywhere on the globe (Nandan et al. 2020). Using this model as a benchmark in statistical metrics specifically tailored to the prediction problem, we are able to distill all these voices and quantify the essence of predictive skill. This approach has several advantages. Rather than trying to silence or denounce, we listen to and evaluate each claim and report the predictive skill of the source. We engage the public and allow them to take part in a scientific experiment that will increase their risk awareness. We effectively demonstrate that anybody with an internet-connected device can make an earthquake prediction, but that it is not so trivial to achieve skillful predictive performance.</p><p>Here we present initial results from the global earthquake prediction experiment that we have been conducting on www.richterx.com for the past two years, yielding more than 10,000 predictions. These results will hopefully demystify the act of predicting an earthquake in the eyes of the public, so that the next time someone forwards a prediction message it arouses more scrutiny than panic or distaste.<br><br>Nandan, S., Kamer, Y., Ouillon, G., Hiemer, S., Sornette, D. (2020). <em>Global models for short-term earthquake forecasting and predictive skill assessment</em>. European Physical Journal ST. doi: 10.1140/epjst/e2020-000259-3<br>Kamer, Y., Nandan, S., Ouillon, G., Hiemer, S., Sornette, D. (2020). <em>Democratizing earthquake predictability research: introducing the RichterX platform.</em> European Physical Journal ST. doi: 10.1140/epjst/e2020-000260-2</p>


2013 ◽  
Vol 13 (10) ◽  
pp. 2605-2618 ◽  
Author(s):  
Q. Li ◽  
G.-M. Xu

Abstract. We find a possible correlation between the precursory pattern of tidal triggering of earthquakes and crustal heterogeneities, which is of particular importance to researchers in earthquake prediction and earthquake hazard prevention. We investigated the connection between tidal variations and earthquake occurrence in the Liyang, Wunansha, Cangshan, Wenan, Luquan and Yaoan regions of China. Most of the regions show a higher correlation with tidal triggering in the several years preceding large or destructive earthquakes than at other times, indicating that tidal triggering may be inherently related to the nucleation of destructive earthquakes during this period. In addition, the analysis indicates that the Liyang, Cangshan and Luquan regions, with stronger heterogeneity, show statistically significant tidal triggering preceding large or destructive earthquakes, while the Wunansha, Wenan and Yaoan regions, with relatively weak heterogeneity, do not, signifying that the precursory pattern of tidal triggering in these six regions is possibly related to the heterogeneity of the crustal rocks. These results suggest that attempts to identify potentially hazardous areas, or to make medium- to long-term earthquake forecasts, by means of the precursory pattern of tidal triggering must take the crustal heterogeneity of those areas into consideration in order to increase prediction efficiency; ignoring the influence of crustal heterogeneity on tidal triggering might greatly decrease it.
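Tidal-triggering correlations of the kind analyzed above are commonly quantified with Schuster's test, which asks whether earthquake origin times cluster at a preferred tidal phase. A minimal sketch follows; the phase values here are synthetic stand-ins, whereas real studies compute each event's phase from a tidal-stress model:

```python
import numpy as np

def schuster_p_value(phases_rad):
    """Schuster test: p-value for the null hypothesis that event
    phases are uniformly distributed on [0, 2*pi). Small p suggests
    a preferred tidal phase, i.e. a triggering signal."""
    n = len(phases_rad)
    d2 = np.sum(np.cos(phases_rad)) ** 2 + np.sum(np.sin(phases_rad)) ** 2
    return np.exp(-d2 / n)

rng = np.random.default_rng(42)
uniform = rng.uniform(0, 2 * np.pi, 500)             # no phase preference
clustered = rng.normal(0.0, 0.3, 500) % (2 * np.pi)  # events near phase 0

print(schuster_p_value(uniform))    # large p: consistent with no triggering
print(schuster_p_value(clustered))  # tiny p: strong phase preference
```

The test statistic is the squared length of the vector sum of unit phasors; under the null it follows an exponential distribution, which gives the closed-form p-value above.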


Atmosphere ◽  
2020 ◽  
Vol 11 (9) ◽  
pp. 968
Author(s):  
Qingfu Liu ◽  
Xuejin Zhang ◽  
Mingjing Tong ◽  
Zhan Zhang ◽  
Bin Liu ◽  
...  

This paper describes the vortex initialization (VI) currently used in NCEP operational hurricane models (HWRF and HMON, and possibly HAFS in the future). The VI corrects the background fields for hurricane models: it consists of vortex relocation and size and intensity corrections. The VI creates an improved background field for data assimilation and thereby produces an improved analysis for the operational hurricane forecast. The background field after VI can be used either as an initial field (as in the HMON model, without data assimilation) or as a background field for data assimilation (as in the HWRF model).
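As a toy illustration of the relocation step only (not NCEP's actual algorithm, which operates on full model vortices with size and intensity corrections), one can shift a background field so that its vortex center coincides with the observed storm position:

```python
import numpy as np

def relocate_vortex(field, observed_center):
    """Shift a 2D background field so that its minimum (a crude proxy
    for the vortex center, e.g. of sea-level pressure anomaly) lands
    on the observed storm position. Toy version using a periodic shift."""
    current = np.unravel_index(np.argmin(field), field.shape)
    di = observed_center[0] - current[0]
    dj = observed_center[1] - current[1]
    return np.roll(field, shift=(di, dj), axis=(0, 1))

# Synthetic pressure-anomaly field with a vortex (minimum) at (20, 30).
y, x = np.mgrid[0:64, 0:64]
background = -np.exp(-((y - 20) ** 2 + (x - 30) ** 2) / 50.0)

# Suppose the observed storm center is at (40, 25).
relocated = relocate_vortex(background, observed_center=(40, 25))
# The field minimum now sits at the observed center, (40, 25).
```

In an operational setting the vortex component would first be separated from the environmental flow before shifting; the periodic `np.roll` is used here only to keep the sketch self-contained.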


Geosciences ◽  
2018 ◽  
Vol 8 (12) ◽  
pp. 489 ◽  
Author(s):  
Jürgen Helmert ◽  
Aynur Şensoy Şorman ◽  
Rodolfo Alvarado Montero ◽  
Carlo De Michele ◽  
Patricia de Rosnay ◽  
...  

The European Cooperation in Science and Technology (COST) Action ES1404 "HarmoSnow", entitled "A European network for a harmonized monitoring of snow for the benefit of climate change scenarios, hydrology and numerical weather prediction" (2014-2018), aimed to coordinate efforts in Europe to harmonize approaches to validation and methodologies of snow measurement practices, instrumentation, algorithms and data assimilation (DA) techniques. One of the key objectives of the action was to "Advance the application of snow DA in numerical weather prediction (NWP) and hydrological models and show its benefit for weather and hydrological forecasting as well as other applications." This paper reviews approaches used for the assimilation of snow measurements, such as remotely sensed and in situ observations, into hydrological, land surface, meteorological and climate models, based on a COST HarmoSnow survey exploring common practices in the use of snow observation data in different modeling environments. The aim is to assess the current situation and understand the diversity of usage of snow observations in DA, forcing, monitoring, validation, or verification within NWP, hydrology, snow and climate models. Based on the responses from the community to the questionnaire and on a literature review, the status and requirements for the future evolution of conventional snow observations from national networks and satellite products, for data assimilation and model validation, are derived, and suggestions are formulated towards standardized and improved usage of snow observation data in snow DA. Results of the survey showed that there is a good fit between the snow macro-physical variables required for snow DA and those provided by the measurement networks, instruments, and techniques. Data availability and the resources needed to integrate the data into the model environment are identified as the current barriers and limitations to the use of new or upcoming snow data sources. Broadening the resources to integrate enhanced snow data would support future plans to make use of them in all model environments.


2019 ◽  
Vol 28 (01) ◽  
pp. 055-055

Albers DJ, Levine ME, Stuart A, Mamykina L, Gluckman B, Hripcsak G. Mechanistic machine learning: how data assimilation leverages physiological knowledge using Bayesian inference to forecast the future, infer the present, and phenotype. J Am Med Inform Assoc 2018;25(10):1392-401. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6188514/

Oktay O, Ferrante E, Kamnitsas K, Heinrich M, Bai W, Caballero J, Cook SA, de Marvao A, Dawes T, O'Regan DP, Kainz B, Glocker B, Rueckert D. Anatomically Constrained Neural Networks (ACNNs): application to cardiac image enhancement and segmentation. IEEE Trans Med Imaging 2018;37(2):384-95. https://spiral.imperial.ac.uk:8443/handle/10044/1/50440

Lee J, Sun J, Wang F, Wang S, Jun CH, Jiang X. Privacy-preserving patient similarity learning in a federated environment: development and analysis. JMIR Med Inform 2018;6(2):e20. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5924379/


2017 ◽  
Vol 122 (10) ◽  
pp. 7924-7937 ◽  
Author(s):  
Iyan E. Mulia ◽  
Daisuke Inazu ◽  
Takuji Waseda ◽  
Aditya Riadi Gusman

2006 ◽  
Vol 1 (3) ◽  
pp. 415-415
Author(s):  
Kazuki Koketsu ◽  

Tatsuo Usami, now professor emeritus at the University of Tokyo, published a paper entitled "Earthquake Studies and the Earthquake Prediction System in Japan" in the March 1974 issue of Technocrat. I was impressed by Professor Usami's comprehensive review and healthy criticism of earthquake prediction in Japan, which appears fresh even today. He gave an overview of the 1923 Kanto earthquake and of Programs 1 and 2 of the earthquake prediction project in Japan. The motivation and research of the project in its early stage are well summarized in the paper. The Tokai earthquake hypothesis [1] was proposed during Program 3, so the budget for the project at national universities was approximately tripled in Program 4 and increased to about 12 billion yen in Program 7 (Table 1). The 1995 Kobe (Hyogo-ken Nanbu) earthquake occurred during Program 7, killing 6,434 people and completely destroying 104,906 houses [2]. Since this unexpected earthquake was as destructive as the 1923 Kanto earthquake, the earthquake prediction project was reformed in New Program 1 (Table 1). The Headquarters for Earthquake Research Promotion was established, shifting the emphasis from empirical short-term prediction to long-term earthquake forecasting and the prediction of strong ground motion [3]. Dr. Hiroe Miyake and I reviewed this situation in a preceding article [4], taking over from Professor Usami's paper the mission of writing a recent history of Japanese seismology. References: [1] K. Ishibashi, "Did the rupture zone of the 1707 Hoei earthquake not extend to deep Suruga Bay?," Rep. Subcomm. Tokai Distr., Coord. Comm. Earthq. Predict., Geogr. Surv. Inst., pp. 69-78, 1977 (in Japanese). [2] K. Koketsu, "Chronological table of damaging earthquakes in Japan," in Chronological Scientific Tables 2007, Maruzen, pp. 698-729, 2006 (in Japanese). [3] N. Hirata, "Past, current and future of Japanese national program for earthquake prediction research," Earth Planets and Space, 56, pp. xliii-l, 2004. [4] K. Koketsu and H. Miyake, "Earthquake Observation and Strong Motion Seismology in Japan from 1975 to 2005," Journal of Disaster Research, Vol.1, No.3, pp. 407-414, 2006. Kazuki Koketsu Professor, University of Tokyo


2011 ◽  
Vol 18 (1) ◽  
pp. 49-70 ◽  
Author(s):  
M. J. Werner ◽  
K. Ide ◽  
D. Sornette

Abstract. Data assimilation is routinely employed in meteorology, engineering and computer sciences to optimally combine noisy observations with prior model information for obtaining better estimates of a state, and thus better forecasts, than achieved by ignoring data uncertainties. Earthquake forecasting, too, suffers from measurement errors and partial model information and may thus gain significantly from data assimilation. We present perhaps the first fully implementable data assimilation method for earthquake forecasts generated by a point-process model of seismicity. We test the method on a synthetic and pedagogical example of a renewal process observed in noise, which is relevant for the seismic gap hypothesis, models of characteristic earthquakes and recurrence statistics of large quakes inferred from paleoseismic data records. To address the non-Gaussian statistics of earthquakes, we use sequential Monte Carlo methods, a set of flexible simulation-based methods for recursively estimating arbitrary posterior distributions. We perform extensive numerical simulations to demonstrate the feasibility and benefits of forecasting earthquakes based on data assimilation.
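A minimal sequential Monte Carlo (bootstrap particle filter) sketch of the setting the abstract describes: a renewal process whose event times are observed in noise. All distributions and parameters below are illustrative assumptions, not those of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Renewal process: gamma-distributed interevent times (the prior model).
SHAPE, SCALE = 4.0, 25.0      # mean interval = 100 time units
OBS_STD = 5.0                 # Gaussian measurement noise on event times

# Simulate true event times and their noisy observations.
true_times = np.cumsum(rng.gamma(SHAPE, SCALE, size=10))
observed = true_times + rng.normal(0, OBS_STD, size=true_times.size)

# Bootstrap particle filter: each particle is a candidate true time
# of the most recent event.
N = 2000
particles = np.zeros(N)       # last true event taken to be at t = 0
estimates = []
for obs in observed:
    # Predict: propagate each particle by a prior interevent-time draw.
    particles = particles + rng.gamma(SHAPE, SCALE, size=N)
    # Update: weight by the Gaussian likelihood of the noisy observation.
    w = np.exp(-0.5 * ((obs - particles) / OBS_STD) ** 2)
    w /= w.sum()
    estimates.append(np.sum(w * particles))   # posterior-mean estimate
    # Resample to avoid weight degeneracy.
    particles = rng.choice(particles, size=N, p=w)

estimates = np.array(estimates)
```

Because the likelihood (standard deviation 5) is much sharper than the renewal prior (standard deviation 50), the posterior means track the observations closely while still being regularized by the prior; the non-Gaussian renewal statistics are handled naturally by the particle representation, which is the point of using sequential Monte Carlo here.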

