The early postseismic phase of the Tohoku-Oki earthquake (2011) from kinematic solutions: implications for subduction interface dynamics

Author(s):  
Axel Periollat ◽  
Mathilde Radiguet ◽  
Jérôme Weiss ◽  
Cédric Twardzik ◽  
Lou Marill ◽  
...  

Earthquakes are usually followed by a postseismic phase during which the stresses induced by the earthquake are relaxed. This phase combines different processes, among which aseismic slip on the fault zone (called afterslip), viscoelastic deformation of the surrounding material, poroelastic relaxation, and aftershocks. However, little work has been done on the transition from the co- to the postseismic phase and on the physical processes involved.

We study the 2011 Mw 9.0 Tohoku-Oki earthquake, one of the largest and most instrumented recent earthquakes, using GEONET GPS data. We focus on the period from a few minutes to the first month following the mainshock, a period dominated by afterslip.

Based on the method developed by Twardzik et al. (2019), we process 30-s kinematic position time series and use them to characterize the fast displacement rates that typically occur during the early stages of the postseismic phase. We precisely quantify the co-seismic offset of the mainshock, without including early afterslip, and we also characterize the co-seismic offset of the Mw 7.9 Ibaraki-Oki aftershock, which occurred 30 minutes after the mainshock. We analyze the spatial distribution of the co-seismic offsets for both earthquakes. We also use the signal induced by the postseismic phase over different time windows to investigate the spatio-temporal evolution of the postseismic slip. We determine the redistribution of stresses to estimate the regional influence of the mainshock and aftershock on postseismic slip.

From a detailed characterization of the first month of postseismic kinematic time series, we find that the best-fitting law is an Omori-like decay: the displacement rate is of the type v0/(t+c)^p, with spatial variations of the initial velocity v0 and of the time constant c. We find a consistent estimate of the p-value close to 0.7 over most of the studied area, apart from a small region close to the aftershock location where higher p values (p ~ 1) are observed. This p value of 0.7 shows that the evolution of the Tohoku-Oki early afterslip is not logarithmic. We discuss the implications of these observations in terms of subduction interface dynamics and rheology. We also discuss the different time scales involved in the relaxation, and how this model, established for the early postseismic phase over one month, performs over longer time scales (by comparison with daily time series lasting several years).

Twardzik, C., M. Vergnolle, A. Sladen, and A. Avallone (2019), doi.org/10.1038/s41598-019-39038-z

Keywords: Early Postseismic, Afterslip, GPS, Kinematic, Omori Law
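The Omori-like rate decay described above is straightforward to sketch numerically. The snippet below is a minimal illustration, fitting v(t) = v0/(t+c)^p to purely synthetic data with scipy; the "true" parameter values are assumptions for the demo, not the study's estimates.

```python
import numpy as np
from scipy.optimize import curve_fit

def omori_rate(t, v0, c, p):
    """Omori-like decay of the postseismic displacement rate."""
    return v0 / (t + c) ** p

# Synthetic displacement-rate series over the first month (t in days);
# parameters (50.0, 0.5, 0.7) are illustrative only.
rng = np.random.default_rng(0)
t = np.linspace(0.01, 30.0, 500)
v = omori_rate(t, 50.0, 0.5, 0.7) * (1 + 0.05 * rng.standard_normal(t.size))

# Bounded least-squares fit; p0 is a rough initial guess
popt, _ = curve_fit(omori_rate, t, v, p0=[10.0, 1.0, 1.0],
                    bounds=([0.0, 0.0, 0.0], [np.inf, np.inf, 3.0]))
v0_hat, c_hat, p_hat = popt
```

A recovered p close to 0.7 rather than 1 is exactly the kind of diagnostic used to argue that early afterslip is not logarithmic in time (a logarithmic displacement corresponds to p = 1).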

2020 ◽  
Vol 33 (12) ◽  
pp. 5155-5172
Author(s):  
Quentin Jamet ◽  
William K. Dewar ◽  
Nicolas Wienders ◽  
Bruno Deremble ◽  
Sally Close ◽  
...  

Abstract. Mechanisms driving the North Atlantic meridional overturning circulation (AMOC) variability at low frequency are of central interest for accurate climate predictions. Although the subpolar gyre region has been identified as a preferred place for generating climate time-scale signals, their southward propagation remains under consideration, complicating the interpretation of the observed time series provided by the Rapid Climate Change–Meridional Overturning Circulation and Heatflux Array–Western Boundary Time Series (RAPID–MOCHA–WBTS) program. In this study, we aim at disentangling the respective contributions of the local atmospheric forcing and of signals of remote origin to the subtropical low-frequency AMOC variability. To this end, we analyze a set of four ensembles of a regional (20°S–55°N), eddy-resolving (1/12°) North Atlantic oceanic configuration, in which surface forcing and open boundary conditions are alternately permuted from fully varying (realistic) to yearly repeating signals. Their analysis reveals the predominance of the local, atmospherically forced signal at interannual time scales (2–10 years), whereas signals imposed by the boundaries are responsible for the decadal (10–30 years) part of the spectrum. Due to this marked time-scale separation, we show that, although the intergyre region exhibits peculiarities, most of the subtropical AMOC variability can be understood as a linear superposition of these two signals. Finally, we find that the decadal-scale, boundary-forced AMOC variability has both northern and southern origins, although the former dominates over the latter, including at the site of the RAPID array (26.5°N).


2021 ◽  
Vol 73 (1) ◽  
Author(s):  
Magnus D. Hammer ◽  
Grace A. Cox ◽  
William J. Brown ◽  
Ciarán D. Beggan ◽  
Christopher C. Finlay

Abstract. We present geomagnetic main field and secular variation time series, at 300 equal-area distributed locations and at 490 km altitude, derived from magnetic field measurements collected by the three Swarm satellites. These Geomagnetic Virtual Observatory (GVO) series provide a convenient means to globally monitor and analyze long-term variations of the geomagnetic field from low-Earth orbit. The series are obtained by robust fits of local Cartesian potential field models to along-track and East–West sums and differences of Swarm satellite data collected within a radius of 700 km of the GVO locations during either 1-monthly or 4-monthly time windows. We describe two GVO data products: (1) 'Observed Field' GVO time series, where all observed sources contribute to the estimated values, without any data selection or correction, and (2) 'Core Field' GVO time series, where additional data selection is carried out, then de-noising schemes and epoch-by-epoch spherical harmonic analysis are applied to reduce contamination by magnetospheric and ionospheric signals. Secular variation series are provided as annual differences of the Core Field GVOs. We present examples of the resulting Swarm GVO series, assessing their quality through comparisons with ground observatories and geomagnetic field models. In benchmark comparisons with six high-quality mid-to-low latitude ground observatories we find the secular variation of the Core Field GVO field intensities, calculated using annual differences, agrees to an rms of 1.8 nT/yr and 1.2 nT/yr for the 1-monthly and 4-monthly versions, respectively. Regular sampling in space and time, and the availability of data error estimates, make the GVO series well suited for users wishing to perform data assimilation studies of core dynamics, or to study long-period magnetospheric and ionospheric signals and their induced counterparts.
The Swarm GVO time series will be regularly updated, approximately every four months, allowing ready access to the latest secular variation data from the Swarm satellites.
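The annual-difference construction of the secular variation series is simple to sketch. The calculation below uses a synthetic 1-monthly field series (not Swarm data) to show why differencing over exactly one year removes a stationary annual cycle, such as an ionospheric signal, while retaining the secular trend.

```python
import numpy as np

def annual_differences(b, samples_per_year=12):
    """Secular variation estimate: first differences over a one-year baseline.
    b is a 1-monthly field series (nT); the result is in nT/yr and is one
    year shorter than the input."""
    return b[samples_per_year:] - b[:-samples_per_year]

# Illustrative series: a 24 nT/yr secular trend plus a 5 nT annual cycle
months = np.arange(48)
b = 24.0 * months / 12.0 + 5.0 * np.sin(2.0 * np.pi * months / 12.0)
sv = annual_differences(b)   # the annual cycle cancels, leaving ~24 nT/yr
```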


Author(s):  
Jia-Rong Yeh ◽  
Chung-Kang Peng ◽  
Norden E. Huang

Multi-scale entropy (MSE) was developed as a measure of the complexity of time series, and it has been applied widely in recent years. The MSE algorithm is based on the assumption that biological systems possess the ability to adapt and function in an ever-changing environment; because these systems need to operate across multiple temporal and spatial scales, their complexity is also multi-scale and hierarchical. Here, we present a systematic approach that applies the empirical mode decomposition algorithm, which can detrend time series on various time scales, prior to analysing a signal's complexity by measuring the irregularity of its dynamics on multiple time scales. Simulated time series of fractional Gaussian noise and human heartbeat time series were used to study the performance of this new approach. We show that our method can successfully quantify the fractal properties of the simulated time series and can accurately distinguish modulations in human heartbeat time series in health and disease.
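The coarse-graining and sample-entropy core of the MSE algorithm can be sketched compactly. The code below is a simplified illustration only; the paper's EMD-based detrending step is omitted (a library such as PyEMD would supply it), and it simply shows that an irregular signal scores a higher sample entropy than a regular one.

```python
import numpy as np

def coarse_grain(x, scale):
    """MSE coarse-graining: non-overlapping averages of length `scale`."""
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def sample_entropy(x, m=2, r_factor=0.15):
    """SampEn(m, r): negative log of the conditional probability that
    sequences matching for m points (within tolerance r) also match
    for m + 1 points."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def matches(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(templ[:, None, :] - templ[None, :, :]), axis=2)
        return np.sum(d <= r) - len(templ)   # drop self-matches

    b, a = matches(m), matches(m + 1)
    return np.inf if a == 0 else -np.log(a / b)
```

On a coarse-grained series, `sample_entropy(coarse_grain(x, s))` gives the MSE value at scale s.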


Author(s):  
NA LI ◽  
MARTIN CRANE ◽  
HEATHER J. RUSKIN

SenseCam is an effective memory-aid device that can automatically record images and other data throughout the wearer's day. The main issue is that, while SenseCam produces a sizeable collection of images over the recording period, the vast quantity of captured data contains a large percentage of routine events, which are of little interest to review. In this article, the aim is to detect "Significant Events" for the wearers. We use several time series analysis methods, such as Detrended Fluctuation Analysis (DFA), eigenvalue dynamics, and wavelet correlations, to analyse the multiple time series generated by the SenseCam. We show that Detrended Fluctuation Analysis exposes a strong long-range correlation relationship in SenseCam collections. The Maximum Overlap Discrete Wavelet Transform (MODWT) was used to calculate equal-time correlation matrices over different time scales, and we then explore the granularity of the largest eigenvalue and changes in the ratio of the sub-dominant eigenvalue spectrum dynamics over sliding time windows. By examination of the eigenspectrum, we show that these approaches enable detection of major events in the SenseCam recordings, with MODWT also providing useful insight into the details of major events. We suggest that some wavelet scales (e.g., 8–16 minutes) have the potential to identify distinct events or activities.
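Of the methods listed, DFA is compact enough to sketch. The implementation below (order-1 detrending, an illustrative sketch rather than the authors' code) estimates the scaling exponent alpha, which is about 0.5 for uncorrelated noise and approaches 1 for strongly long-range-correlated series, the regime the article reports for SenseCam collections.

```python
import numpy as np

def dfa(x, scales):
    """Order-1 Detrended Fluctuation Analysis.
    Returns the fluctuation function F(s) per scale and the scaling
    exponent alpha from a log-log fit of F(s) against s."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    fs = []
    for s in scales:
        n = len(y) // s
        segments = y[:n * s].reshape(n, s)     # non-overlapping windows
        t = np.arange(s)
        resid = [seg - np.polyval(np.polyfit(t, seg, 1), t)
                 for seg in segments]          # remove local linear trend
        fs.append(np.sqrt(np.mean(np.square(resid))))
    alpha = np.polyfit(np.log(scales), np.log(fs), 1)[0]
    return np.array(fs), alpha
```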


2012 ◽  
Vol 29 (4) ◽  
pp. 613-628 ◽  
Author(s):  
Steven L. Morey ◽  
Dmitry S. Dukhovskoy

Abstract Statistical analysis methods are developed to quantify the impacts of multiple forcing variables on the hydrographic variability within an estuary instrumented with an enduring observational system. The methods are applied to characterize the salinity variability within Apalachicola Bay, a shallow multiple-inlet estuary along the northeastern Gulf of Mexico coast. The 13-yr multivariate time series collected by the National Estuary Research Reserve at three locations within the bay are analyzed to determine how the estuary responds to variations in external forcing mechanisms, such as freshwater discharge, precipitation, tides, and local winds at multiple time scales. The analysis methods are used to characterize the estuarine variability under differing flow regimes of the Apalachicola River, a managed waterway, with particular focus on extreme events and scales of variability that are critical to local ecosystems. Multivariate statistical models are applied that describe the salinity response to winds from multiple directions, river flow, and precipitation at daily, weekly, and monthly time scales to understand the response of the estuary under different climate regimes. Results show that the salinity is particularly sensitive to river discharge and wind magnitude and direction, with local precipitation being largely unimportant. Applying statistical analyses with conditional sampling quantifies how the likelihoods of high-salinity and long-duration high-salinity events, conditions of critical importance to estuarine organisms, change given the state of the river flow. Intraday salinity range is shown to be negatively correlated with the salinity, and correlated with river discharge rate.
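The conditional-sampling idea, estimating the likelihood of high-salinity events given the state of the river flow, can be sketched on synthetic records. All numbers below are assumptions for illustration, not Apalachicola Bay observations.

```python
import numpy as np

def conditional_exceedance(salinity, discharge, threshold, flow_quantile=0.25):
    """P(salinity > threshold | low flow) vs P(salinity > threshold | high flow),
    with 'low' and 'high' defined by a discharge-quantile split."""
    q_low = np.quantile(discharge, flow_quantile)
    q_high = np.quantile(discharge, 1.0 - flow_quantile)
    p_low_flow = np.mean(salinity[discharge <= q_low] > threshold)
    p_high_flow = np.mean(salinity[discharge >= q_high] > threshold)
    return p_low_flow, p_high_flow

# Illustrative synthetic records: salinity decreases with discharge, plus noise
rng = np.random.default_rng(3)
discharge = rng.gamma(2.0, 300.0, size=5000)                           # m^3/s
salinity = 35.0 - 0.02 * discharge + 2.0 * rng.standard_normal(5000)   # psu
p_low, p_high = conditional_exceedance(salinity, discharge, threshold=30.0)
```

The contrast between `p_low` and `p_high` quantifies how much more likely high-salinity conditions become under a low-flow regime, the kind of statement the study makes for managed river releases.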


2021 ◽  
Vol 11 (12) ◽  
pp. 5615
Author(s):  
Łukasz Sobolewski ◽  
Wiesław Miczulski

Ensuring the best possible stability of UTC(k) (the local time scale) and its compliance with the UTC scale (Coordinated Universal Time) forces the prediction of the [UTC-UTC(k)] deviations. This article presents the results of work on two methods of constructing time series (TS) for a neural network (NN) that increase the accuracy of UTC(k) prediction. In the first method, two prepared TSs are based on the deviations determined according to the UTC scale with a 5-day interval. In order to improve the accuracy of predicting the deviations, the PCHIP interpolating function is applied in subsequent TSs, yielding TS elements with a 1-day interval. The improvement in prediction accuracy for these TS has been limited by a prediction horizon that is too long. The introduction by the BIPM in 2012 of the additional UTC Rapid scale makes it possible to shorten the prediction horizon, and the building of two TSs has been proposed according to the second method. Each of them consists of two subsets: the first subset is based on deviations determined according to the UTC scale, the second on the UTC Rapid scale. Testing the proposed TS for predicting deviations of the Polish Timescale by means of a GMDH-type NN shows that the best accuracy in predicting the deviations has been achieved for TS built according to the second method.
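The first method's densification step, interpolating 5-day [UTC-UTC(k)] deviations to a 1-day interval with PCHIP, can be sketched with scipy. The deviation values below are synthetic placeholders, not BIPM Circular T data.

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

# Synthetic [UTC-UTC(k)] deviations (ns) at a 5-day interval, indexed by MJD;
# the smooth trend-plus-wiggle shape is an assumption for illustration.
mjd_5d = np.arange(59000, 59061, 5, dtype=float)     # 13 epochs
dev_5d = 2.0 * np.sin(mjd_5d / 20.0) + 0.05 * (mjd_5d - 59000.0)

# PCHIP is shape-preserving: it passes through every knot without overshoot
pchip = PchipInterpolator(mjd_5d, dev_5d)
mjd_1d = np.arange(59000, 59061, 1, dtype=float)
dev_1d = pchip(mjd_1d)                               # 1-day-interval TS elements
```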


2020 ◽  
Vol 3 (1) ◽  
pp. 37
Author(s):  
Toyi Maniki Diphagwe ◽  
Bernard Moeketsi Hlalele ◽  
Dibuseng Priscilla Mpakathi

The 2019/20 Australian bushfires burned over 46 million acres of land, killed 34 people and left 3500 individuals homeless. The majority of deaths and destroyed buildings were in New South Wales, while the Northern Territory accounted for approximately one-third of the burned area. Many of the buildings that were lost were farm buildings, adding to the challenge of agricultural recovery that is already complex because of ash-covered farmland accompanied by historic levels of drought. The current research therefore aimed at characterising veldfire risk in the study area using the Keetch-Byram Drought Index (KBDI). A 39-year time series was obtained from an online NASA database. Homogeneity and stationarity were checked using the non-parametric Pettitt's test and the Dickey-Fuller test, respectively, for data quality. Major results revealed a non-significant two-tailed Mann-Kendall trend test (p-value = 0.789 > 0.05 significance level). A suitable probability distribution was fitted to the annual KBDI time series, where both Kolmogorov-Smirnov and Chi-square tests revealed Gamma (1) as a suitably fitting probability distribution. Return-level computation from the Gamma (1) distribution using XLSTAT software resulted in a cumulative 40-year return period of moderate to high fire risk potential. With this low probability and 40-year return level, the study found the area less prone to fire risks detrimental to animal and crop production. More agribusiness investments can safely be executed in the Northern Territory without high risk aversion.
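The return-level computation from a fitted Gamma distribution amounts to evaluating the quantile at 1 − 1/T. The sketch below uses scipy rather than XLSTAT, on synthetic annual KBDI values (the gamma parameters are assumptions, not the study's fit).

```python
import numpy as np
from scipy import stats

# Synthetic annual KBDI series; the real study uses a 39-year NASA record
rng = np.random.default_rng(4)
kbdi_annual = rng.gamma(4.0, 60.0, size=39)

# Fit a Gamma distribution with the location fixed at zero
shape, loc, scale = stats.gamma.fit(kbdi_annual, floc=0.0)

def return_level(T):
    """KBDI value exceeded on average once every T years:
    the (1 - 1/T) quantile of the fitted distribution."""
    return stats.gamma.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)

rl_40 = return_level(40.0)   # 40-year return level, as in the study
```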


2015 ◽  
Vol 12 (8) ◽  
pp. 7437-7467 ◽  
Author(s):  
J. E. Reynolds ◽  
S. Halldin ◽  
C. Y. Xu ◽  
J. Seibert ◽  
A. Kauffeldt

Abstract. Concentration times in small and medium-sized watersheds (~ 100–1000 km²) are commonly less than 24 h. Flood-forecasting models then require data at sub-daily time scales, but time series of input and runoff data of sufficient length are often only available at the daily time scale, especially in developing countries. This has led to a search for time-scale relationships to infer parameter values at the time scales where they are needed from the time scales where they are available. In this study, time-scale dependencies in the HBV-light conceptual hydrological model were assessed within the generalized likelihood uncertainty estimation (GLUE) approach. It was hypothesised that the existence of such dependencies is a result of the numerical method or time-stepping scheme used in the models rather than a real time-scale dependence in the data. Inferred parameter values showed a clear dependence on time scale when the explicit Euler method was used for modelling at the same time steps as the time scale of the input data (1–24 h). However, the dependence almost fully disappeared when the explicit Euler method was used for modelling in 1 h time steps internally, irrespective of the time scale of the input data. In other words, when an adequate time-stepping scheme was implemented, parameter sets inferred at one time scale (e.g., daily) could be used directly for runoff simulations at other time scales (e.g., 3 or 6 h) without any time scaling, and this approach resulted in only a small (if any) decrease in model performance, in terms of Nash–Sutcliffe and volume-error efficiencies. The overall results of this study indicated that as soon as sub-daily driving data can be secured, flood forecasting in watersheds with sub-daily concentration times is possible with model-parameter values inferred from long time series of daily data, as long as an appropriate numerical method is used.
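The study's central point, that fixing the internal time step decouples model behaviour from the input time scale, can be illustrated on a toy linear-reservoir model (not HBV-light itself; all parameter values here are assumptions for the sketch).

```python
import numpy as np

def linear_reservoir(rain, dt_hours, k=30.0, substep_hours=None):
    """Explicit-Euler linear reservoir dS/dt = r - S/k, outflow q = S/k.
    With substep_hours set, the model takes fixed internal steps regardless
    of the input time scale, as advocated in the study."""
    sub = substep_hours if substep_hours is not None else dt_hours
    n_sub = int(round(dt_hours / sub))
    s, q = 0.0, []
    for r in rain:
        for _ in range(n_sub):
            s += sub * (r - s / k)     # one explicit Euler sub-step
        q.append(s / k)                # record outflow once per input interval
    return np.array(q)

# A rain pulse sampled at daily resolution (mm/h), purely illustrative
rain = np.zeros(100)
rain[10:20] = 2.0

q_big_steps = linear_reservoir(rain, dt_hours=24.0)                      # 24 h Euler steps
q_sub_steps = linear_reservoir(rain, dt_hours=24.0, substep_hours=1.0)   # 1 h internal steps
```

The two runs use the same parameter k, yet the coarse-stepped run distorts the recession; recalibrating k would "absorb" that numerical error, which is exactly the apparent time-scale dependence the internal sub-stepping removes.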

