reference event
Recently Published Documents

2021 ◽  
Author(s):  
Steven Gibbons

Correlation detectors are now used routinely in seismology to detect occurrences of signals bearing close resemblance to a reference waveform. They facilitate the detection of low-amplitude signals in significant background noise that may elude energy detectors, and they associate a detected signal with a source location. Many seismologists use the fully normalized correlation coefficient $C$ between the template and the incoming data to declare a detection. This contrasts with other fields with a longer tradition of matched-filter detection, where the theoretically optimal statistic $C^2$ is typical. We perform a systematic comparison between the detection statistics $C$ and $C|C|$, the latter having the same dynamic range as $C^2$ while still differentiating between correlation and anti-correlation. Using a database of short waveform segments, each containing the signal on a 3-component seismometer from one of 51 closely spaced explosions, we attempt to detect P- and S-phase arrivals for all events using short waveform templates from each explosion as a reference event. We present empirical statistics of both $C$ and $C|C|$ traces and demonstrate that $C|C|$ confidently detects a higher proportion of the signals than $C$ without evidently increasing the likelihood of erroneous triggers. We recall from elementary statistics that $C^2$, also called the coefficient of determination, represents the fraction of the variance of one variable that can be explained by another variable. This means that the fraction of a segment of our incoming data that could be explained by our signal template decreases almost linearly with $C|C|$ but diminishes more rapidly as $C$ decreases. In most situations, replacing $C$ with $C|C|$ in operational correlation detectors may improve detection sensitivity without hurting the performance gain obtained through network stacking.
It may also allow a better comparison between single-template correlation detectors and higher-order multiple-template subspace detectors which, by definition, already apply an optimal detection statistic.
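The two statistics can be illustrated with a short sketch. This is not the authors' code; `detection_statistics` is a hypothetical helper that slides a template along a data stream and returns the fully normalized correlation $C$ and the signed square $C|C|$, which, unlike $C^2$, preserves the sign distinguishing correlation from anti-correlation:

```python
import numpy as np

def detection_statistics(template, data):
    """Slide a waveform template along a 1-D data stream and return,
    for each offset, the fully normalized correlation coefficient C
    and the signed square C|C| (same dynamic range as C**2, but keeps
    the sign, so anti-correlation is not mistaken for correlation)."""
    m = len(template)
    t = template - template.mean()
    t_norm = np.sqrt(np.sum(t * t))
    n_offsets = len(data) - m + 1
    C = np.empty(n_offsets)
    for k in range(n_offsets):
        seg = data[k:k + m]
        s = seg - seg.mean()
        denom = t_norm * np.sqrt(np.sum(s * s))
        C[k] = np.dot(t, s) / denom if denom > 0 else 0.0
    return C, C * np.abs(C)
```

An exact copy of the template embedded in the stream yields a peak of 1 in both traces, while a polarity-reversed copy yields -1 in both, whereas plain $C^2$ would report +1 for the reversed copy as well.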


2019 ◽  
Vol 109 (5) ◽  
pp. 1653-1660 ◽  
Author(s):  
Ana C. Aguiar ◽  
Stephen C. Myers

Abstract We adapt the relative polarity method from Shelly et al. (2016) to compute focal mechanisms for microearthquakes associated with the 2014 hydroshearing stimulation at the Newberry volcano geothermal site. We focus the analysis on events relocated by Aguiar and Myers (2018), who report that six event clusters predominantly comprise the 2014 sequence. Data quality allows focal mechanism analysis for four of the six event clusters. We use Hardebeck and Shearer (2002, 2003; hereafter HASH) to compute focal mechanisms based on first‐motion polarities and S/P amplitude ratios. We manually determine P‐ and S‐wave polarities for a well‐recorded reference event in each cluster, then use waveform cross correlation to determine whether recordings of other events in the cluster are the same or reversed polarity at each network station. Most waveform polarities are consistent with the affiliated reference event, indicating similar focal mechanisms within each cluster. The deeper clusters are east–west‐striking normal faults, whereas the shallower clusters, close to the top of the open‐hole section of the borehole, are strike slip with east–west motion. Regional studies and prestimulation borehole breakouts find the maximum stress direction is vertical and maximum horizontal stress is approximately north–south. Fault geometry and focal mechanisms of microseismicity during the stimulation suggest that increased pressure from fluid injection predominantly caused changes in horizontal stress, consistent with predictions from numerical studies of stress change caused by fluid injection. At shallow depths, where previous studies suggest the difference between vertical and horizontal stress is lowest, injection appears to have rotated the direction of maximum stress from vertical to horizontal, resulting in strike‐slip motion. 
At greater depth, vertical stress continued to be the dominant direction during the stimulation, but fault orientation indicates either reactivation of pre‐existing fractures or rotation of the direction of maximum horizontal stress from approximately north–south to east–west.
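The relative-polarity idea, deciding whether a second event's recording at a station matches or reverses the reference event's polarity, can be sketched as follows. This is a simplified illustration, not the HASH or Shelly et al. implementation, and `relative_polarity` is a hypothetical helper:

```python
import numpy as np

def relative_polarity(reference, other):
    """Cross-correlate two event waveforms recorded at the same station
    and report whether the second has the same (+1) or reversed (-1)
    polarity relative to the reference, taken from the sign of the
    cross-correlation at the lag of strongest absolute correlation."""
    r = reference - reference.mean()
    o = other - other.mean()
    cc = np.correlate(r, o, mode="full")
    k = np.argmax(np.abs(cc))  # lag with the strongest |correlation|
    return 1 if cc[k] > 0 else -1
```

Applied per station, consistent +1 results across a cluster support the inference that its events share a similar focal mechanism.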


ABEI Journal ◽  
2018 ◽  
Vol 20 (1) ◽  
Author(s):  
Eda Nagayama

This article proposes fictional narrative as a form of postmemory representation, conjoining the imaginary, the appropriation of a position of alterity, and the repertoire of the Holocaust as a global reference event. Malinski (2000), by the Irish writer Síofra O'Donovan, is read as an articulation between resonance and effacement of the Holocaust. Albeit unintentionally, the novel may also support the historical revision currently taking place in Poland, which intends to cast Poles strictly as Holocaust victims and impotent bystanders, denying any effective participation in the genocide.
Keywords: Postmemory; Holocaust; affiliative postmemory; Irish literature; bystanding.


2018 ◽  
Vol 40 (3) ◽  
pp. 1246
Author(s):  
M. Pirli ◽  
J. Schweitzer

The Tripoli Seismic Array (TRISAR) is a small-aperture array designed to monitor and locate seismicity in the area of Greece. In this study, its detection capabilities are discussed for regional and teleseismic events. A reference event list is compiled, consisting of events of mb > 5.0 at regional and teleseismic distances (Δ > 6°), according to the ISC On-line Bulletin. TRISAR automatically detected approximately 25% of these events over the entire investigated distance range. Although TRISAR slowness-vector residuals are rather large, as expected for an array of such small aperture, the benefits of using such a system for reporting regional and teleseismic activity are obvious.
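The compilation of a reference event list and the resulting detection rate can be sketched as follows. A hypothetical bulletin of dicts stands in for the ISC On-line Bulletin, and `detection_rate` is an illustrative helper, not the authors' processing code:

```python
def detection_rate(bulletin, detected_ids, min_mb=5.0, min_dist_deg=6.0):
    """Compile a reference event list (magnitude above min_mb, epicentral
    distance beyond min_dist_deg, as in the study) from a bulletin of
    event dicts, then return the fraction of reference events whose id
    appears among the automatic detections."""
    reference = [ev for ev in bulletin
                 if ev["mb"] > min_mb and ev["dist_deg"] > min_dist_deg]
    hits = sum(1 for ev in reference if ev["id"] in detected_ids)
    return hits / len(reference) if reference else 0.0
```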


2018 ◽  
Vol 31 (12) ◽  
pp. 4827-4845 ◽  
Author(s):  
Nikolaos Christidis ◽  
Andrew Ciavarella ◽  
Peter A. Stott

Attribution analyses of extreme events estimate changes in the likelihood of their occurrence due to human climatic influences by comparing simulations with and without anthropogenic forcings. Classes of events are commonly considered that only share one or more key characteristics with the observed event. Here we test the sensitivity of attribution assessments to such event definition differences, using the warm and wet winter of 2015/16 in the United Kingdom as a case study. A large number of simulations from coupled models and an atmospheric model are employed. In the most basic case, warm and wet events are defined relative to climatological temperature and rainfall thresholds. Several other classes of events are investigated that, in addition to threshold exceedance, also account for the effect of observed sea surface temperature (SST) anomalies, the circulation flow, or modes of variability present during the reference event. Human influence is estimated to increase the likelihood of warm winters in the United Kingdom by a factor of 3 or more for events occurring under any atmospheric and oceanic conditions, but also for events with a similar circulation or oceanic state to 2015/16. The likelihood of wet winters is found to increase by at least a factor of 1.5 in the general case, but results from the atmospheric model, conditioned on observed SST anomalies, are more uncertain, indicating that decreases in the likelihood are also possible. The robustness of attribution assessments based on atmospheric models is highly dependent on the representation of SSTs without the effect of human influence.
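The central quantity in such attribution assessments, the probability (or risk) ratio between the world with all forcings and the counterfactual natural world, can be estimated from ensemble exceedance counts. The sketch below uses illustrative numbers, not results from the paper:

```python
def probability_ratio(all_exceed, all_total, nat_exceed, nat_total):
    """Estimate PR = P_ALL / P_NAT, where each probability is the
    fraction of ensemble members exceeding the event-defining threshold
    in simulations with all forcings (ALL) and with natural forcings
    only (NAT).  PR > 1 means human influence increased the likelihood."""
    p_all = all_exceed / all_total
    p_nat = nat_exceed / nat_total
    return p_all / p_nat
```

For example, 30 exceedances in 100 forced members against 10 in 100 natural members gives a probability ratio of 3, the kind of factor-of-3 increase reported for warm UK winters.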


2017 ◽  
Vol 98 (6) ◽  
pp. 1139-1151 ◽  
Author(s):  
Sophie C. Lewis ◽  
Andrew D. King ◽  
Sarah E. Perkins-Kirkpatrick

Abstract The term “new normal” has been used in scientific literature and public commentary to contextualize contemporary climate events as an indicator of a changing climate due to enhanced greenhouse warming. A new normal has been used broadly but tends to be descriptive and ambiguously defined. Here we review previous studies conceptualizing this idea of a new climatological normal and argue that this term should be used cautiously and with explicit definition in order to avoid confusion. We provide a formal definition of a new climate normal relative to present based around record-breaking contemporary events and explore the timing of when such extremes become statistically normal in future model simulations. Applying this method to the record-breaking global-average 2015 temperatures as a reference event and a suite of climate models, we determine that 2015 global annual-average temperatures will be the new normal by 2040 in all emissions scenarios. At the regional level, a new normal can be delayed through aggressive greenhouse gas emissions reductions. Using this specific case study to investigate a climatological new normal, our approach demonstrates the greater value of the concept of a climatological new normal for understanding and communicating climate change when the term is explicitly defined. This approach moves us one step closer to understanding how current extremes will change in the future in a warming world.
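One simple way to operationalize when a record becomes "statistically normal" is to find the first year from which the ensemble-median anomaly permanently exceeds the record value. The sketch below assumes this criterion for illustration only; the paper's exact definition may differ:

```python
def new_normal_year(years, ensemble_median, record_value):
    """Return the first year from which the ensemble-median anomaly
    exceeds the record-breaking reference value in every subsequent
    year (i.e. the exceedance is permanent), or None if that never
    happens within the simulated period."""
    for i, yr in enumerate(years):
        if all(m > record_value for m in ensemble_median[i:]):
            return yr
    return None
```

With a steadily warming median, the returned year marks the transition after which the reference event is no longer exceptional; stronger mitigation pathways shift that year later.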


2017 ◽  
Vol 9 (1) ◽  
Author(s):  
Francesco Stoppa ◽  
Claudia Principe ◽  
Mariangela Schiazza ◽  
Yu Liu ◽  
Paola Giosa ◽  
...  

Abstract Vesuvius is a high-risk volcano, and the 1631 Plinian eruption is a reference event for the next episode of explosive unrest. A complete stratigraphic and petrographic description of the 1631 pyroclastics is given in this study. During the 1631 eruption, a phonolite was erupted first, followed by a tephritic phonolite and finally a phonolitic tephrite, indicating a layered magma chamber. We suggest that phonolitic basanite is a good candidate for the primitive parental melt of the 1631 eruption. The composition of apatite from the 1631 pyroclastics is different from those of CO

