Revisiting the deformation transients before the 2011 Tohoku-Oki Megathrust Earthquake with GPS

Author(s):  
Anne Socquet ◽  
Lou Marill ◽  
David Marsan ◽  
Baptiste Rousset ◽  
Mathilde Radiguet ◽  
...  

The precursory activity leading up to the Tohoku-Oki earthquake of 2011 has been suggested to feature both long- and short-term episodes of decoupling, pointing to a particularly complex slow slip history. Analysis of the F3 solution of the Japanese GPS network suggested that accelerated slip occurred in the deeper part of the seismogenic zone during the 10 years preceding the earthquake (Heki & Mitsui, EPSL 2013; Mavrommatis et al., GRL 2014; Yokota & Koketsu, Nat. Com. 2015). During the two months preceding the earthquake, no anomaly in the GPS position time series has been revealed so far, although several anomalous geophysical signals have been reported: an extended foreshock crisis near the future hypocenter (Kato et al., Science 2012), a synchronized increase of intermediate-depth background seismicity (Bouchon et al., Nat. Geosc. 2016), a signal in ocean-bottom pressure gauges and on-land strainmeter time series (Ito et al., Tectonoph. 2013), and large-scale gravity anomalies that suggest deep-seated slab deformation processes (Panet et al., Nat. Geosc. 2018; Wang & Burgmann, GRL 2019).

We present novel results based on an independent analysis of the Japanese GPS data set. We perform a full reprocessing of the raw data with a double-difference approach and a systematic analysis of the resulting time series, including noise characterization and network filtering, and make a robust assessment of long- and short-term tectonic aseismic transients preceding the Tohoku-Oki earthquake. Accelerated slip on the lower part of the seismogenic zone over the last decade is confirmed, not only below the epicenter of the Tohoku-Oki earthquake but also further south, offshore the Boso peninsula, a worrying sign of ongoing slow decoupling east of Tokyo. At shorter time scales, first results appear compatible with a slow slip close to the epicenter initiating ~2 months before the mainshock.
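The abstract does not detail the filtering step; as an illustration, network (common-mode) filtering of GPS position residuals is often implemented as a robust daily stack across stations. The sketch below is a minimal NumPy version under that assumption; the array names are ours, not the authors'.

```python
import numpy as np

def common_mode_filter(residuals):
    """Remove the network common mode from GPS position residuals.

    residuals: array of shape (n_days, n_stations); detrended daily
    positions for one component (e.g. east), NaN where data are missing.
    Returns the filtered residuals and the common-mode time series.
    """
    # The daily median across stations is a robust common-mode estimate.
    common_mode = np.nanmedian(residuals, axis=1)
    return residuals - common_mode[:, None], common_mode

# Example with synthetic data: a shared daily disturbance plus noise.
rng = np.random.default_rng(0)
n_days, n_stations = 3650, 50
shared = np.cumsum(rng.normal(0, 0.2, n_days))      # common-mode wander (mm)
noise = rng.normal(0, 1.0, (n_days, n_stations))    # station-specific noise
filtered, cm = common_mode_filter(shared[:, None] + noise)
print(np.std(shared[:, None] + noise), np.std(filtered))  # variance reduced
```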

2021 ◽  
Vol 17 (12) ◽  
pp. 155014772110612
Author(s):  
Zhengqiang Ge ◽  
Xinyu Liu ◽  
Qiang Li ◽  
Yu Li ◽  
Dong Guo

To protect user privacy and prevent the disclosure of user preferences from enabling malicious entrapment, we combine a recommendation algorithm with a privacy protection mechanism. In this article, we present a privacy-preserving recommendation algorithm, PrivItem2Vec, and the concept of the recommended Internet of Things, a recommendation framework consisting of user information, devices, and items. The recommended Internet of Things uses a bidirectional long short-term memory network built on item2vec, which improves the algorithm's time-series modeling and recommendation accuracy. In addition, we reconstructed the data set in conjunction with the Paillier algorithm: the data on the server are encrypted and embedded, which reduces their readability and ensures their security to a certain extent. Experiments show that our algorithm is superior to related work in terms of recommendation accuracy and efficiency.
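The exact encryption pipeline is not given in the abstract; below is a minimal sketch of how embedding values could be protected with the Paillier cryptosystem, using the third-party python-paillier (`phe`) package. The embedding vector and key size are illustrative.

```python
from phe import paillier  # pip install phe (python-paillier)

# Key pair generated on the client; the server sees only the public key.
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# Hypothetical item2vec embedding values to be stored server-side.
embedding = [0.12, -0.53, 0.88]
encrypted = [public_key.encrypt(x) for x in embedding]

# Paillier is additively homomorphic: the server can sum encrypted
# values (or scale them by plaintext constants) without decrypting.
encrypted_sum = encrypted[0] + encrypted[1] + encrypted[2]
print(private_key.decrypt(encrypted_sum))  # ~0.47, recovered client-side
```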


2019 ◽  
Vol 52 (13-14) ◽  
pp. 2283-2313 ◽  
Author(s):  
Will Jennings ◽  
Clare Saunders

This article argues that the agenda-setting power of protest must be understood in dynamic terms. Specifically, it develops and tests a dynamic theory of media reaction to protest which posits that features of street demonstrations—such as their size, violence, societal conflict, and the presence of a “trigger”—lead protest issues to be reported and sustained in the media agenda over time. We conduct a unique empirical analysis of media coverage of protest issues, based upon a data set of 48 large-scale street demonstrations in nine countries. Time-series cross-sectional analysis is used to estimate the dynamic effects of demonstration features on media coverage of the protest issue. The findings show that violence can increase media attention in the short term, while larger protest size sustains it over the longer term. The agenda-setting power of protest is thus structured in time.
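The article's exact specification is not reproduced here; dynamic time-series cross-sectional models of this kind are often written with a lagged dependent variable and standard errors clustered by unit. A minimal sketch with statsmodels on a synthetic panel follows; all variable names and coefficients are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical panel: weekly media coverage for 48 demonstrations.
rng = np.random.default_rng(0)
rows = []
for demo in range(48):
    size = rng.uniform(1, 100)          # thousands of participants
    violence = rng.integers(0, 2)       # violent demonstration indicator
    cov = 0.0
    for week in range(12):
        cov = 0.6 * cov + 0.02 * size + 1.5 * violence + rng.normal()
        rows.append(dict(demo_id=demo, week=week, coverage=cov,
                         size=size, violence=violence))
df = pd.DataFrame(rows).sort_values(["demo_id", "week"])
df["coverage_lag"] = df.groupby("demo_id")["coverage"].shift(1)

# Dynamic TSCS model: current coverage depends on its own lag
# (persistence) plus demonstration features; errors clustered by demo.
sub = df.dropna(subset=["coverage_lag"])
res = smf.ols("coverage ~ coverage_lag + size + violence", data=sub).fit(
    cov_type="cluster", cov_kwds={"groups": sub["demo_id"]})
print(res.params)
```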


Author(s):  
Young-Rae Cho ◽  
Aidong Zhang

High-throughput techniques involve large-scale detection of protein-protein interactions. This interaction data set, viewed from the genome-scale perspective, is structured into an interactome network. Since interaction evidence represents functional linkage, various graph-theoretic computational approaches have been applied to interactome networks for functional characterization. However, the data are generally unreliable, and typical genome-wide interactome networks have a complex connectivity. In this paper, the authors explore the systematic analysis of protein interactome networks and propose a k-round signal flow simulation algorithm to measure interaction reliability from the connection patterns of the interactome networks. This algorithm quantitatively characterizes functional links between proteins by simulating the propagation of information signals through complex connections. In this regard, the algorithm efficiently estimates the strength of alternative paths for each interaction. The authors also present an algorithm for mining the complex interactome network structure. The algorithm restructures the network by hierarchical ordering of nodes, and this structure re-formatting process reveals hub proteins in the interactome networks. This paper demonstrates that two rounds of simulation accurately score interaction reliability in terms of ontological correlation and functional consistency. Finally, the authors validate that the selected structural hubs represent functional core proteins.
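The authors' exact formulation is not given in the abstract; the sketch below illustrates the general flavor of a k-round signal flow simulation: each protein repeatedly distributes a unit signal evenly to its neighbours, and the flow accumulated between adjacent proteins after k rounds scores the interaction via its alternative paths. All details are our assumptions.

```python
import numpy as np

def signal_flow_reliability(adj, k=2):
    """Score each interaction by simulating k rounds of signal flow.

    adj: symmetric 0/1 adjacency matrix of the interactome network.
    Each node sends its signal to its neighbours, split evenly; after
    k rounds, the accumulated flow between two adjacent proteins
    reflects the strength of alternative paths connecting them.
    """
    deg = adj.sum(axis=1, keepdims=True)
    # Row-stochastic propagation matrix (zero rows for isolated nodes).
    P = np.divide(adj, deg, out=np.zeros_like(adj, dtype=float),
                  where=deg > 0)
    flow = np.eye(len(adj))
    total = np.zeros_like(P)
    for _ in range(k):
        flow = flow @ P        # propagate all signals one round
        total += flow
    return total * adj         # keep scores for observed interactions only

adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 1],
                [1, 1, 0, 1],
                [0, 1, 1, 0]], dtype=float)
print(signal_flow_reliability(adj, k=2))
```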


Ocean Science ◽  
2015 ◽  
Vol 11 (6) ◽  
pp. 953-963 ◽  
Author(s):  
K. Bentel ◽  
F. W. Landerer ◽  
C. Boening

Abstract. The Atlantic Meridional Overturning Circulation (AMOC) is a key mechanism for large-scale northward heat transport and thus plays an important role for global climate. Relatively warm water is transported northward in the upper layers of the North Atlantic Ocean and, after cooling at subpolar latitudes, sinks down and is transported back south in the deeper limb of the AMOC. The utility of in situ ocean bottom pressure (OBP) observations to infer AMOC changes at single latitudes has been characterized in the recent literature using output from ocean models. We extend the analysis and examine the utility of space-based observations of time-variable gravity and the inversion for ocean bottom pressure to monitor AMOC changes and variability between 20 and 60° N. Consistent with previous results, we find a strong correlation between the AMOC signal and OBP variations, mainly along the western slope of the Atlantic Basin. We then use synthetic OBP data – smoothed and filtered to resemble the resolution of the GRACE (Gravity Recovery and Climate Experiment) gravity mission, but without errors – and reconstruct geostrophic AMOC transport. Due to the coarse resolution of GRACE-like OBP fields, we find that leakage of signal across the steep slopes of the ocean basin is a significant challenge at certain latitudes. Transport signal rms is of a similar order of magnitude as error rms for the reconstructed time series. However, the interannual AMOC anomaly time series can be recovered from 20 years of monthly GRACE-like OBP fields with errors less than 1 sverdrup in many locations.
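For context, this type of reconstruction rests on the zonally integrated geostrophic balance, rho * f * v = dp/dx, so a layer's meridional transport anomaly follows from the bottom-pressure difference between the basin's eastern and western boundaries. A minimal sketch under that assumption, with illustrative numbers:

```python
import numpy as np

RHO = 1027.0        # seawater density, kg m^-3
OMEGA = 7.2921e-5   # Earth's rotation rate, s^-1

def transport_anomaly_sv(p_east, p_west, lat_deg, layer_thickness):
    """Geostrophic meridional transport anomaly (sverdrups) of a layer,
    from boundary ocean-bottom-pressure anomalies (Pa) at one latitude."""
    f = 2 * OMEGA * np.sin(np.radians(lat_deg))   # Coriolis parameter
    # Zonal integral of rho*f*v = dp/dx gives a layer transport anomaly
    # of (p_east - p_west) * H / (rho * f); 1 Sv = 1e6 m^3 s^-1.
    return (p_east - p_west) * layer_thickness / (RHO * f) / 1e6

# Illustrative monthly OBP anomalies (Pa) at 40° N for a 3000 m layer.
p_w = np.array([120.0, -80.0, 40.0])   # western-slope OBP anomaly
p_e = np.array([10.0, 5.0, -15.0])     # eastern-boundary OBP anomaly
print(transport_anomaly_sv(p_e, p_w, lat_deg=40.0, layer_thickness=3000.0))
```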


2012 ◽  
Vol 586 ◽  
pp. 322-327 ◽  
Author(s):  
Jian Xiong Long ◽  
Lin Ming Yu ◽  
Shi Chen

In the systematic analysis of battery state of charge, the observed data have a particular property: they are of limited length in the time direction t (referred to as vertical), while the number of samples N (called horizontal) can grow without bound. Such a data set is called a short-sequence, multi-sample time series. By studying the characteristics of this kind of time series, a new system identification method is proposed, and the identifiability of the underlying process is demonstrated. Practical simulations yield satisfactory results. The approach also offers a useful reference for time-series identification problems with the same structure in other areas.
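The identification method itself is not described in the abstract; the sketch below illustrates the underlying idea of stacking many short records into one least-squares problem, so the parameter estimate converges as the number of samples N grows even though each record is short in t. The ARX(1) model form and all names are our assumptions.

```python
import numpy as np

def identify_arx1(samples):
    """Estimate a, b in y[t] = a*y[t-1] + b*u[t-1] + noise by stacking
    regressors from many short records into one least-squares problem."""
    rows, targets = [], []
    for y, u in samples:                   # each record is short in t
        for t in range(1, len(y)):
            rows.append([y[t - 1], u[t - 1]])
            targets.append(y[t])
    theta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets),
                                rcond=None)
    return theta                           # [a_hat, b_hat]

# Synthetic check: N = 500 records of only 5 time steps each.
rng = np.random.default_rng(1)
a_true, b_true = 0.9, 0.5
samples = []
for _ in range(500):
    u = rng.normal(size=5)
    y = np.zeros(5)
    for t in range(1, 5):
        y[t] = a_true * y[t - 1] + b_true * u[t - 1] + 0.01 * rng.normal()
    samples.append((y, u))
print(identify_arx1(samples))   # ~ [0.9, 0.5]
```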


2000 ◽  
Vol 10 (12) ◽  
pp. 2767-2780 ◽  
Author(s):  
LIANGYUE CAO ◽  
ALISTAIR MEES

We have found numerical evidence of deterministic properties in a multichannel physiological time series, a segment of the Santa Fe data set B, which was recorded from a patient with sleep apnea. We used recently developed methods for finding good time-delay embeddings for multichannel data, together with nonlinear deterministic prediction. We show that determining good embeddings for this multivariate time series gives good short-term predictions, and convincing free-run (simulation) behavior, using only a simple local-linear approximation method. We also discovered relations between the different channels, which are heart rate, respiration, and blood-oxygen saturation.
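As an illustration of the two ingredients named above, the sketch below builds a time-delay embedding and makes a one-step forecast with a local-linear (nearest-neighbour) model; the dimension, delay, and neighbour count are arbitrary choices of ours, not the authors'.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Time-delay embedding: map a scalar series to vectors
    [x[t], x[t+tau], ..., x[t+(dim-1)*tau]]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

def local_linear_predict(x, dim=3, tau=1, k=10):
    """Predict the next value from the k nearest neighbours of the
    current embedded state, via a local least-squares (affine) map."""
    emb = delay_embed(x, dim, tau)
    states, nxt = emb[:-1], x[(dim - 1) * tau + 1:]
    query = emb[-1]
    idx = np.argsort(np.linalg.norm(states - query, axis=1))[:k]
    A = np.column_stack([states[idx], np.ones(k)])   # affine local model
    coef, *_ = np.linalg.lstsq(A, nxt[idx], rcond=None)
    return np.append(query, 1.0) @ coef

x = np.sin(0.3 * np.arange(500)) \
    + 0.01 * np.random.default_rng(2).normal(size=500)
print(local_linear_predict(x), np.sin(0.3 * 500))    # forecast vs truth
```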


Author(s):  
Dorothea Metzen ◽  
Erhan Genç ◽  
Stephan Getzmann ◽  
Mauro F. Larra ◽  
Edmund Wascher ◽  
...  

Abstract. EEG resting-state alpha asymmetry is one of the most widely investigated forms of functional hemispheric asymmetries in both basic and clinical neuroscience. However, studies yield inconsistent results. One crucial prerequisite for reproducible results is the reliability of the index of interest. A body of research suggests moderate-to-good reliability of EEG resting-state alpha asymmetry, but unfortunately the sample sizes in these studies are typically small. This study presents the first large-scale short-term reliability study of frontal and parietal EEG resting-state alpha asymmetry. We used the Dortmund Vital Study data set containing 370 participants. In each participant, the EEG resting state was recorded eight times, twice with eyes open and twice with eyes closed on each of two different EEG systems. We found good reliability of EEG alpha power and alpha asymmetry on both systems across electrode pairs. We also found that alpha power asymmetry reliability is higher in the eyes-closed condition than in the eyes-open condition. The frontomedial electrode pair showed weaker reliability than the frontolateral and parietal electrode pairs. Interestingly, we found no population-level alpha asymmetry in frontal electrodes, one of the most investigated electrode sites in alpha asymmetry research. In conclusion, our results suggest that while EEG alpha asymmetry is an overall reliable measure, frontal alpha asymmetry should be assessed using multiple electrode pairs.
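The asymmetry index conventionally used in this literature is the difference of log alpha power between homologous right and left electrodes. A minimal sketch with SciPy, using synthetic stand-ins for an F3/F4 pair (the channel names and parameters are illustrative):

```python
import numpy as np
from scipy.signal import welch

def alpha_power(sig, fs, band=(8.0, 13.0)):
    """Alpha-band power of one EEG channel via Welch's method."""
    freqs, psd = welch(sig, fs=fs, nperseg=int(2 * fs))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.trapz(psd[mask], freqs[mask])

def alpha_asymmetry(left, right, fs):
    """Conventional asymmetry index: ln(right) - ln(left) alpha power.
    Positive values indicate relatively more right-hemispheric alpha."""
    return np.log(alpha_power(right, fs)) - np.log(alpha_power(left, fs))

# Illustrative synthetic F3/F4 signals: 250 Hz sampling, 60 s.
rng = np.random.default_rng(3)
fs = 250
t = np.arange(fs * 60) / fs
f3 = np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size)
f4 = 1.2 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size)
print(alpha_asymmetry(f3, f4, fs))   # > 0: more alpha over F4 (right)
```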


2021 ◽  
Author(s):  
Theresa Schellander-Gorgas ◽  
Frank Kreienkamp ◽  
Philip Lorenz ◽  
Christoph Matulla ◽  
Janos Tordai

EPISODES is an empirical statistical downscaling (ESD) method initiated and developed by the German Weather Service (DWD). Having achieved good evaluation scores for Germany, the methodology is now also being set up and adapted for Austria at ZAMG, and hence for an alpine territory with complex topography.

ESD methods are computationally inexpensive compared to dynamical downscaling models. Due to this advantage, ESD can be applied within a short time frame and in a demand-based manner. It enables, e.g., processing ensembles of downscaled climate projections, which can be assessed either as a stand-alone data set or to enhance ensembles based on dynamical methods. This helps improve the robustness of climatological statements for the purposes of climate impact research.

Preconditions for achieving high-quality results with EPISODES are long-term, temporally consistent observation data sets and the most realistic possible reproduction of the relevant large-scale weather conditions by the GCMs. Given these requirements, EPISODES produces high-quality, multivariate, and spatially/temporally consistent synthetic time series on regular grids or at station locations. The output is provided at daily time steps and, at most, at the resolution of the underlying observation data.

The EPISODES method consists of two main steps. In the first stage, univariate time series are produced on a coarse grid using the analogue method and linear regression: the coarse-scale atmospheric conditions of each day, as described by the GCM projections, are matched to a selection of the most similar daily weather situations in the observed past. From this selection, new values are determined by linear regression for each day.

The second stage of the EPISODES method works like a weather generator. Short-term anomalies based on the first-stage results on the one hand, and on observations on the other, are matched by selecting the most similar day simultaneously for all meteorological parameters and coarse grid points used. Combined with the high-resolution climatological background of the observations and the climatological shift described by the GCM projections, this short-term variability yields synthetic daily values for each target grid point. The approach provides the desired characteristics of the downscaled climate projections, such as multivariate and spatio-temporal consistency.

Recent EPISODES evaluation results for daily precipitation and daily mean temperature are presented for the Austrian federal territory. The performance of the EPISODES ensemble is also discussed in relation to existing ensembles based on dynamical methods that have already been widely used in climate impact studies in Austria: EURO-CORDEX and ÖKS15.
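A toy sketch of the first stage only (analogue selection plus a simple regression over the analogue set); the arrays, the distance-based regression, and the analogue count are our own assumptions for illustration, not DWD's implementation.

```python
import numpy as np

def analogue_regression(gcm_day, obs_fields, obs_local, n_analogues=35):
    """Illustrative EPISODES-like first stage: find the observed days
    whose large-scale fields best match one GCM day, then predict the
    local value by linear regression within that analogue set.

    gcm_day:    flattened large-scale field for the target day
    obs_fields: (n_days, n_features) archive of observed fields
    obs_local:  (n_days,) local observations (e.g. daily temperature)
    """
    # Euclidean distance in the large-scale predictor space.
    dist = np.linalg.norm(obs_fields - gcm_day, axis=1)
    idx = np.argsort(dist)[:n_analogues]       # most similar past days
    # Regress the local value on similarity (distance); the intercept
    # is the predicted value at zero distance, i.e. a perfect analogue.
    A = np.column_stack([dist[idx], np.ones(n_analogues)])
    coef, *_ = np.linalg.lstsq(A, obs_local[idx], rcond=None)
    return coef[1]

rng = np.random.default_rng(4)
obs_fields = rng.normal(size=(10000, 50))
obs_local = obs_fields[:, 0] * 2.0 + rng.normal(0, 0.1, 10000)
print(analogue_regression(obs_fields[0], obs_fields, obs_local))
```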


2019 ◽  
Vol 119 (8) ◽  
pp. 1748-1763 ◽  
Author(s):  
Mengdi Li ◽  
Eugene Chng ◽  
Alain Yee Loong Chong ◽  
Simon See

Purpose
Emoji has become an essential component of digital communication, and its importance is attested to by its sustained popularity and widespread use. However, research on emojis remains rare due to the lack of data at a large scale. The purpose of this paper is to systematically analyse and compare the usage of emojis in a cross-cultural manner.

Design/methodology/approach
This research conducted an empirical analysis using a large-scale, cross-regional emoji usage data set from Twitter, a platform where the 140-character limit has made the inclusion of emojis within tweets essential. The textual data set covers a period of only two months, but the 673m tweets authored by more than 2,081,542 unique users constitute a sufficiently large sample to yield significant results.

Findings
This research discovered that the categories and frequencies of emojis communicated by users provide a rich source of data for understanding cultural differences between Twitter users across a large range of demographics. The research subsequently demonstrated that the preferential use of emojis complies with Hofstede’s Cultural Dimensions Model, in which different demographic and cultural groups within countries show significantly different uses of emojis to communicate emotions.

Originality/value
This study provides a robust example of how to strategically conduct research using large-scale emoji data to pursue research questions that were previously difficult to address. To the best of the authors’ knowledge, the present study pioneers the first systematic analysis and comparison of emoji usage on Twitter across different cultures; it is the largest-scale study of emoji usage to date.
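Extracting emoji frequencies from tweet text can be sketched with a regular expression over the main emoji code-point blocks; a production study would rely on the full Unicode emoji list, so the pattern below is deliberately approximate and the sample tweets are invented.

```python
import re
from collections import Counter

# Approximate pattern covering the main emoji code-point blocks; this
# is a sketch, not a complete Unicode emoji definition.
EMOJI_RE = re.compile(
    "[\U0001F300-\U0001F5FF\U0001F600-\U0001F64F"
    "\U0001F680-\U0001F6FF\U00002600-\U000027BF]"
)

def emoji_frequencies(tweets):
    """Count emoji occurrences over a collection of tweet texts."""
    counts = Counter()
    for text in tweets:
        counts.update(EMOJI_RE.findall(text))
    return counts

tweets = ["Good morning ☀️😀", "So happy 😀😀🎉", "On my way 🚀"]
print(emoji_frequencies(tweets).most_common(3))   # 😀 leads this sample
```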


2021 ◽  
Vol 13 (24) ◽  
pp. 5000
Author(s):  
Felix Reuß ◽  
Isabella Greimeister-Pfeil ◽  
Mariette Vreugdenhil ◽  
Wolfgang Wagner

To ensure future food security, improved agricultural management approaches are required. For many of those applications, precise knowledge of the distribution of crop types is essential. Various machine and deep learning models have been used for automated crop classification using microwave remote sensing time series. However, the application of these approaches on a large spatial and temporal scale is barely investigated. In this study, the performance of two frequently used algorithms, Long Short-Term Memory (LSTM) networks and Random Forest (RF), for crop classification based on Sentinel-1 time series and meteorological data on a large spatial and temporal scale is assessed. For data from Austria, the Netherlands, and France and the years 2015–2019, scenarios with different spatial and temporal scales were defined. To quantify the complexity of these scenarios, the Fisher Discriminant measurement F1 (FDR1) was used. The results demonstrate that both classifiers achieve similar results for simple classification tasks with low FDR1 values. With increasing FDR1 values, however, LSTM networks outperform RF. This suggests that the ability of LSTM networks to learn long-term dependencies and identify the relation between radar time series and meteorological data becomes increasingly important for more complex applications. Thus, the study underlines the importance of deep learning models, including LSTM networks, for large-scale applications.
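A minimal Keras sketch of an LSTM classifier of the kind compared in the study; the input shapes (time steps, features) and class count are illustrative stand-ins, and random arrays replace the Sentinel-1 and meteorological inputs.

```python
import numpy as np
import tensorflow as tf

# Illustrative shapes: Sentinel-1 backscatter (VV, VH) plus two
# meteorological variables at 60 time steps per parcel; 10 crop classes.
n_steps, n_features, n_classes = 60, 4, 10

model = tf.keras.Sequential([
    tf.keras.Input(shape=(n_steps, n_features)),
    tf.keras.layers.LSTM(64),                    # learns temporal dependencies
    tf.keras.layers.Dense(n_classes, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Random stand-in data; real inputs would be per-parcel time series.
x = np.random.rand(256, n_steps, n_features).astype("float32")
y = np.random.randint(0, n_classes, 256)
model.fit(x, y, epochs=1, batch_size=32, verbose=0)
print(model.predict(x[:2], verbose=0).shape)     # (2, 10) class probabilities
```

A Random Forest baseline, by contrast, would take the flattened time series as a fixed-length feature vector, which is one reason its relative performance drops as temporal dependencies become more important.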

