Detection capability of the Italian network for teleseismic events

1994 ◽  
Vol 37 (3) ◽  
Author(s):  
R. Di Maro ◽  
A. Marchetti

The future GSE experiment is based on a global seismic monitoring system designed to monitor compliance with a nuclear test ban treaty. Every country participating in the test will transmit data to the International Data Center. Because of the high data quality required, we conducted this study to determine the set of stations to be used in the experiment. The Italian telemetered seismological network can detect all events of at least magnitude 2.5 whose epicenters lie inside the network itself. For external events the situation is different: the detection capability is conditioned not only by the noise level at the station, but also by the relative position of epicenter and station. The ING bulletin (January 1991-June 1992) was the data set for the present work. Comparing these data with the National Earthquake Information Center (NEIC) bulletin, we established which stations are most reliable in detecting teleseismic events and, moreover, how distance and back-azimuth influence event detection. Furthermore, we investigated the reliability of the automatic acquisition system with respect to teleseismic event detection.
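
As a rough illustration of the bulletin-comparison procedure described above, the sketch below matches NEIC events against ING picks for one station and computes the geometry (epicentral distance and back-azimuth) that the study relates to detectability. The data layouts and thresholds are illustrative assumptions, not the paper's actual processing.

```python
# Illustrative sketch only: per-station detection rate from two bulletins,
# plus the distance/back-azimuth geometry. All inputs are assumptions.
from math import radians, degrees, sin, cos, atan2, acos

def distance_backazimuth(sta_lat, sta_lon, ev_lat, ev_lon):
    """Great-circle distance and back-azimuth (degrees), event seen from station."""
    p1, l1, p2, l2 = map(radians, (sta_lat, sta_lon, ev_lat, ev_lon))
    dl = l2 - l1
    dist = acos(max(-1.0, min(1.0, sin(p1)*sin(p2) + cos(p1)*cos(p2)*cos(dl))))
    baz = atan2(sin(dl)*cos(p2), cos(p1)*sin(p2) - sin(p1)*cos(p2)*cos(dl))
    return degrees(dist), degrees(baz) % 360.0

def detection_rate(neic_events, ing_picks, station, p_travel=0.0, max_dt=60.0):
    """Fraction of NEIC events with a matching ING pick at `station`.

    neic_events: [(origin_time_s, lat, lon)]; ing_picks: {station: [pick_time_s]}.
    p_travel is a placeholder: a real comparison would centre the window on the
    P arrival predicted by a travel-time model, not on the origin time.
    """
    picks = ing_picks.get(station, [])
    hits = sum(any(abs(p - (t0 + p_travel)) < max_dt for p in picks)
               for t0, _, _ in neic_events)
    return hits / len(neic_events) if neic_events else 0.0
```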

1994 ◽  
Vol 37 (3) ◽  
Author(s):  
F. Ringdal

The UN Conference on Disarmament's Group of Scientific Experts (GSE) was established in 1976 to consider international cooperative measures to detect and identify seismic events. Over the years, the GSE has developed and tested several concepts for an International Seismic Monitoring System (ISMS) for the purpose of assisting in the verification of a potential comprehensive test ban treaty. The GSE is now planning its third global technical test (GSETT-3) in order to test new and revised concepts for an ISMS. GSETT-3 will be an unprecedented global effort to conduct an operationally realistic test of rapid collection, distribution and processing of seismic data. A global network of seismograph stations will provide data to an International Data Center, where the data will be processed and results made available to participants. The full-scale phase of GSETT-3 is scheduled to begin in January 1995.


2021 ◽  
pp. 1-11
Author(s):  
Yanan Huang ◽  
Yuji Miao ◽  
Zhenjing Da

Methods for multi-modal English event detection from a single data source, and for transfer-learning-based isomorphic event detection across different English data sources, still leave room for improvement. To improve detection efficiency, this paper builds on a transfer learning algorithm and proposes both multi-modal event detection for a single data source and isomorphic event detection across data sources based on transfer learning. By stacking multiple classification models, the approach lets the individual features merge with one another, and it performs adversarial training driven by the disagreement between two classifiers so that the distributions of the different source data become more similar. In addition, to validate the proposed algorithm, a multi-source English event detection data set was collected. Finally, this data set is used to evaluate the proposed method against the current mainstream transfer learning methods. Experimental analysis, convergence analysis, visualization, and parameter evaluation demonstrate the effectiveness of the proposed algorithm.
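
The two-classifier adversarial step described above resembles maximum-classifier-discrepancy training; the PyTorch sketch below shows that general pattern. All shapes, modules, and hyper-parameters are illustrative assumptions, not the paper's architecture.

```python
# Hedged sketch, in the spirit of maximum classifier discrepancy: two heads
# disagree on target data, and the extractor learns to remove the disagreement.
import torch
import torch.nn as nn

feat = nn.Sequential(nn.Linear(768, 256), nn.ReLU())   # shared feature extractor
c1, c2 = nn.Linear(256, 2), nn.Linear(256, 2)          # two event/non-event heads
ce = nn.CrossEntropyLoss()
opt_f = torch.optim.Adam(feat.parameters(), lr=1e-4)
opt_c = torch.optim.Adam([*c1.parameters(), *c2.parameters()], lr=1e-4)

def discrepancy(f):
    """L1 gap between the two classifiers' predictive distributions."""
    return (torch.softmax(c1(f), 1) - torch.softmax(c2(f), 1)).abs().mean()

def train_step(x_src, y_src, x_tgt):
    # 1) supervised step on labelled source-domain data
    opt_f.zero_grad(); opt_c.zero_grad()
    f = feat(x_src)
    (ce(c1(f), y_src) + ce(c2(f), y_src)).backward()
    opt_f.step(); opt_c.step()
    # 2) classifiers maximise their disagreement on target-domain data
    opt_c.zero_grad()
    (-discrepancy(feat(x_tgt).detach())).backward()
    opt_c.step()
    # 3) extractor minimises that disagreement, pulling the source and
    #    target feature distributions towards each other
    opt_f.zero_grad()
    discrepancy(feat(x_tgt)).backward()
    opt_f.step()
```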


2021 ◽  
Author(s):  
Hansi Hettiarachchi ◽  
Mariam Adedoyin-Olowe ◽  
Jagdev Bhogal ◽  
Mohamed Medhat Gaber

Abstract Social media is becoming a primary medium to discuss what is happening around the world. Therefore, the data generated by social media platforms contain rich information which describes the ongoing events. Further, the timeliness associated with these data can facilitate immediate insights. However, considering the dynamic nature and high volume of data production in social media data streams, it is impractical to filter the events manually, so automated event detection mechanisms are invaluable to the community. Apart from a few notable exceptions, most previous research on automated event detection has focused only on statistical and syntactic features of the data and lacked the underlying semantics, which are important for effective information retrieval from text since they represent the connections between words and their meanings. In this paper, we propose a novel method termed Embed2Detect for event detection in social media by combining word embeddings with hierarchical agglomerative clustering. The adoption of word embeddings gives Embed2Detect the ability to incorporate powerful semantic features into event detection and to overcome a major limitation of previous approaches. We evaluated our method on two recent real social media data sets, representing the sports and political domains, and compared the results to several state-of-the-art methods. The results show that Embed2Detect is capable of effective and efficient event detection and outperforms recent event detection methods. For the sports data set, Embed2Detect achieved an F-measure 27% higher than the best-performing baseline; for the political data set, the increase was 29%.
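
A minimal sketch of the general recipe (word embeddings plus hierarchical agglomerative clustering) follows; the vocabulary, the random stand-in vectors, and the distance threshold are toy assumptions, not Embed2Detect's actual pipeline or parameters.

```python
# Toy sketch: cluster word vectors hierarchically; co-occurring clusters of
# trending words then act as candidate event descriptions.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

words = ["goal", "penalty", "referee", "election", "ballot", "vote"]
vecs = np.random.default_rng(0).normal(size=(len(words), 50))  # stand-in embeddings

# average-linkage agglomerative clustering on cosine distances
Z = linkage(pdist(vecs, metric="cosine"), method="average")
labels = fcluster(Z, t=0.7, criterion="distance")  # cut the dendrogram
for lab in sorted(set(labels)):
    print(lab, [w for w, l in zip(words, labels) if l == lab])
```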


1987 ◽  
Vol 77 (4) ◽  
pp. 1437-1445
Author(s):  
M. Baer ◽  
U. Kradolfer

Abstract An automatic detection algorithm has been developed which is capable of timing P-phase arrivals of both local and teleseismic earthquakes while rejecting noise bursts and transient events. For each signal trace, the envelope function is calculated and passed through a nonlinear amplifier. The resulting signal is then subjected to a statistical analysis to yield the arrival time, the first motion, and a measure of the reliability to be placed on the P-arrival pick. An incorporated dynamic threshold makes the algorithm very sensitive; thus, even weak signals are timed precisely. During an extended performance evaluation on a data set comprising 789 P phases of local events and 1857 P phases of teleseismic events picked by an analyst, the automatic picker selected 66 per cent of the local phases and 90 per cent of the teleseismic phases. The accuracy of the automatic picks was “ideal” (i.e., could not be improved by the analyst) for 60 per cent of the local events and 63 per cent of the teleseismic events.
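
The sketch below illustrates the envelope-plus-dynamic-threshold idea in a much-simplified form; the actual Baer-Kradolfer characteristic function and threshold logic are considerably more elaborate (ObsPy ships an implementation as obspy.signal.trigger.pk_baer).

```python
# Simplified sketch of an envelope-based P picker with an adaptive threshold.
import numpy as np

def pick_p(trace, dt, thr=10.0):
    """Return the first sample index where the characteristic function exceeds
    `thr` standard deviations above the running noise mean, or None."""
    deriv = np.gradient(trace, dt)
    # squared envelope: signal energy plus derivative energy, scaled so the
    # two terms contribute comparably over the whole trace
    env2 = trace**2 + (np.sum(trace**2) / max(np.sum(deriv**2), 1e-12)) * deriv**2
    s1 = s2 = 0.0
    for i, e in enumerate(env2):
        if i > 1:
            mean = s1 / i
            std = max(s2 / i - mean**2, 0.0) ** 0.5
            if std > 0.0 and (e - mean) / std > thr:
                return i              # dynamic threshold exceeded: P candidate
        s1 += e                       # noise statistics grow with the trace,
        s2 += e * e                   # so the threshold adapts to the noise level
    return None
```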


Author(s):  
Jeffrey Hanson ◽  
Ronan Le Bras ◽  
Douglas Brumbaugh ◽  
Jerry Guern ◽  
Paul Dysart ◽  
...  

2021 ◽  
Vol 2021 ◽  
pp. 1-8
Author(s):  
Wentao Yu ◽  
Xiaohui Huang ◽  
Qingjun Yuan ◽  
Mianzhu Yi ◽  
Sen An ◽  
...  

Detecting information security events from multimodal data can help analyze how events in the security field evolve. A Tree-LSTM network augmented with a self-attention mechanism is used to build a sentence-vectorization model (SAtt-LSTM: Tree-LSTM with self-attention); candidate event sentences are then classified from the SAtt-LSTM representations to obtain their event types. Treating event detection as sentence classification avoids the error cascades of pipeline-based methods, as well as the inability of CNN- or RNN-based joint-learning methods to fully exploit the syntactic information in candidate event sentences. To verify the effectiveness and superiority of the method, experiments were conducted on the DuEE data set. The results show that this model performs better than methods that use chain-structured LSTM, CNN, or Tree-LSTM alone.
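
As a hedged sketch of the classification head only: self-attention pooling over the node states produced by an encoder (a plain stand-in here for the paper's Tree-LSTM), followed by a linear event-type classifier. The dimensions and number of event types are assumptions.

```python
# Self-attention pooling over encoder node states -> sentence vector -> logits.
import torch
import torch.nn as nn

class SAttClassifier(nn.Module):
    def __init__(self, hidden=256, n_types=10):
        super().__init__()
        self.score = nn.Linear(hidden, 1)      # one attention score per node
        self.out = nn.Linear(hidden, n_types)  # event-type logits

    def forward(self, node_states):            # (batch, nodes, hidden)
        a = torch.softmax(self.score(node_states), dim=1)  # attention weights
        sent = (a * node_states).sum(dim=1)    # weighted sum = sentence vector
        return self.out(sent)

logits = SAttClassifier()(torch.randn(4, 12, 256))  # 4 sentences, 12 nodes each
```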


2001 ◽  
Vol 44 (1) ◽  
Author(s):  
M. Cocco ◽  
F. Ardizzoni ◽  
R. M. Azzara ◽  
L. Dall'Olio ◽  
A. Delladio ◽  
...  

Broadband seismograms recorded at a borehole three-component (high dynamic range) seismic station in the Po Valley (Northern Italy) were analyzed to study the velocity structure of the shallow sedimentary layers as well as the local site effects in soft sediments. The broadband borehole seismometer was installed at a depth of 135 m, just below the Quaternary basement, while a second digital broadband seismometer was installed at the Earth's surface at the same site. The velocity structure of the shallower layers was determined both by means of cross-hole and up-hole measurements and by inverting seismic data recorded during a seismic exploration experiment. Velocity discontinuities correlate quite well with the stratigraphy of the site. Our aim was to record local earthquakes as well as regional and teleseismic events. The analyzed data set includes local, regional and teleseismic events, most of which were recorded during the seismic sequence that started on October 15, 1996, near Reggio Emilia, 80 km away from the borehole site. The orientation of the borehole sensor was determined using the recordings of a teleseismic event and of some local earthquakes. The noise reduction for the borehole sensor is two decades in power spectral density at frequencies above 1.0 Hz. We studied the site amplification of the shallow alluvial layers by applying the spectral ratio method. We analyzed the spectral ratios of noise recorded by the surface and borehole seismometers as well as those from local earthquakes, and we compared these observations with a theoretical model of the site response computed by the Haskell-Thomson method.
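
A minimal sketch of the spectral-ratio computation described above, assuming equal-length, instrument-corrected surface and borehole windows sampled at fs Hz; the tapering and smoothing choices are illustrative, not the study's actual processing.

```python
# Surface/borehole amplitude spectral ratio as a site-amplification estimate.
import numpy as np

def spectral_ratio(surface, borehole, fs, smooth=5):
    """Frequencies (Hz) and smoothed surface/borehole amplitude ratio."""
    n = len(surface)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    a_sur = np.abs(np.fft.rfft(surface * np.hanning(n)))   # tapered spectra
    a_bor = np.abs(np.fft.rfft(borehole * np.hanning(n)))
    k = np.ones(smooth) / smooth                           # moving-average smoothing
    ratio = np.convolve(a_sur, k, "same") / np.maximum(
        np.convolve(a_bor, k, "same"), 1e-12)
    return freqs, ratio
```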


2021 ◽  
Vol 27 (3) ◽  
pp. 8-34
Author(s):  
Tatyana Cherkashina

The article presents the experience of converting non-targeted administrative data into research data, using as an example data on the income and property of deputies of local legislative bodies of the Russian Federation for 2019, collected as part of anticorruption operations. This particular empirical fragment was selected for a pilot study of administrative data, which includes assessing the possibility of integrating scattered fragments of information into a single database, assessing the quality of the data, and assessing their relevance for solving research problems, particularly the analysis of high-income strata and the apparent trends towards individualization of private property. The system of indicators for assessing data quality includes their timeliness, availability, interpretability, reliability, comparability, coherence, errors of representation and measurement, and relevance. In the case of the data set in question, measurement errors are more common than representation errors. Overall, the article emphasizes that introducing new non-target data into circulation requires their preliminary testing, while data quality assessment becomes distributed both in time and between different subjects. The transition from created data to «obtained» data shifts the function of evaluating data quality from the researcher-creator to the researcher-user. And though in this case data quality is partly ensured by the legal support for their production, the transformation of administrative data into research data involves assessing a variety of quality dimensions, from availability to uniformity and accuracy.

