Algorithms of ionospheric anomalies detection in “Aurora” system of operational data analysis

2018 ◽  
Vol 62 ◽  
pp. 02002
Author(s):  
Yuryi Polozov ◽  
Nadezhda Fetisova

Algorithms for ionospheric data processing are presented in the paper. The algorithms are implemented in the real-time mode of ionospheric parameter analysis. They are a component of the “Aurora” software system for geophysical data analysis. The algorithms allow us to estimate the state of the ionosphere in the region of the Kamchatka Peninsula and to detect ionospheric anomalies. Assessment of the algorithms' efficiency has shown that they can be used to detect ionospheric anomalies that may occur on the eve of magnetic storms. The research is supported by the Russian Science Foundation Grant (Project No. 14-11-00194).

2013 ◽  
Vol 53 (A) ◽  
pp. 807-810
Author(s):  
I. I. Yashin ◽  
N. V. Ampilogov ◽  
I.I. Astapov ◽  
N.S. Barbashina ◽  
V.V. Borog ◽  
...  

Muon diagnostics is a technique for remote monitoring of active processes in the heliosphere and the magnetosphere of the Earth based on the analysis of angular variations of muon flux simultaneously detected from all directions of the upper hemisphere. To carry out muon diagnostics, special detectors – muon hodoscopes – which can detect muons from any direction with good angular resolution in real-time mode are required. We discuss approaches to data analysis and the results of studies of various extra-terrestrial processes detected by means of the wide aperture URAGAN muon hodoscope.


2019 ◽  
Vol 127 ◽  
pp. 01003 ◽  
Author(s):  
Yuryi Polozov ◽  
Nadezhda Fetisova

The paper presents the results of detecting ionospheric anomalies in online mode from ionosonde data at Paratunka station, Kamchatka Peninsula (IKIR FEB RAS). The developed algorithms have been implemented in the Aurora system for online geophysical data analysis (http://lsaoperanalysis.ikir.ru:9180/lsaoperanalysis.html). The algorithms allow us to detect sudden anomalous changes of varying intensity in the dynamics of ionospheric parameters, as well as to estimate their characteristics. The efficiency of the system and the possibility of its application in space weather forecasting tasks have been demonstrated using examples of events that occurred in 2019.


2021 ◽  
Author(s):  
Carolina Nunes ◽  
Jasper Anckaert ◽  
Fanny De Vloed ◽  
Jolien De Wyn ◽  
Kaat Durinck ◽  
...  

Biomedical researchers are moving towards high-throughput screening, as this allows for automation, better reproducibility, and faster results. High-throughput screening experiments encompass drug, drug combination, genetic perturbagen, or combined genetic and chemical perturbagen screens. These experiments are conducted either as real-time assays monitored over time or as endpoint assays. The data analysis consists of data cleaning and structuring, as well as further data processing and visualisation, which, due to the amount of data, can easily become laborious, time-consuming, and error-prone. Several tools have therefore been developed to aid researchers in this data analysis, but they focus on specific experimental set-ups and are unable to process data from several time points or from genetic-chemical perturbagen screens together. To meet these needs, we developed HTSplotter, available as a web tool and Python module, which performs automatic data analysis and visualisation of either endpoint or real-time assays from different high-throughput screening experiments: drug, drug combination, genetic perturbagen, and genetic-chemical perturbagen screens. HTSplotter implements an algorithm based on conditional statements to identify the experiment type and controls. After appropriate data normalization, HTSplotter executes downstream analyses such as dose-response relationships and drug synergism by the Bliss independence method. All results are exported as a text file, and plots are saved in a PDF file. The main advantage of HTSplotter over other available tools is the automatic analysis of genetic-chemical perturbagen screens and of real-time assays where results are plotted over time. In conclusion, HTSplotter allows for automatic end-to-end data processing, analysis, and visualisation of various high-throughput in vitro cell culture screens, offering major improvements in versatility, convenience, and time over existing tools.
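The Bliss independence model mentioned above has a compact closed form: for two agents with fractional inhibitions e_A and e_B, the expected combined inhibition under independence is e_A + e_B − e_A·e_B, and the observed excess over that expectation indicates synergy or antagonism. A minimal sketch of that calculation (illustrative only, not HTSplotter's actual code):

```python
def bliss_excess(e_a, e_b, e_ab):
    """Bliss independence score for a drug pair.

    e_a, e_b : fractional inhibitions (0..1) of each single agent
    e_ab     : observed fractional inhibition of the combination
    Returns observed minus expected inhibition:
    positive values suggest synergy, negative values antagonism.
    """
    expected = e_a + e_b - e_a * e_b  # independent-action expectation
    return e_ab - expected


# Example: 30% and 40% single-agent inhibition predict 58% combined;
# an observed 70% therefore shows a positive (synergistic) excess.
print(bliss_excess(0.3, 0.4, 0.70))
```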


2018 ◽  
Vol 14 (05) ◽  
pp. 19
Author(s):  
Haohao Yuan

The system designed is a ship collision-avoidance system based on satellite positioning technology, spread-spectrum communication technology, and a wireless sensor network. The design comprises three main parts: an information-collecting terminal, a data-processing terminal, and a mobile data terminal. The CC2530 is selected as the master chip for the information-collecting terminal, and a GPS module with the UBLOX NEO-6M satellite positioning function is used to obtain latitude, longitude, heading, and other information. The AS62-T30 wireless communication module realizes data interaction between ships, and a 0.96-inch OLED display module shows the current location of the ship, thus implementing GPS positioning data reception, data analysis, information display, data integration and transmission, and other functions. For the software of the data-processing terminal, QT5 is selected as the development environment, and the QtSql module is used for database access to process and store the data packets sent by the information-collecting terminal. The system provides many functions, including real-time data analysis and alarms, real-time location annotation, track query, route planning, and weather forecasting.
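The NEO-6M module reports position as NMEA 0183 sentences, so obtaining latitude and longitude amounts to parsing fields such as those in a GGA sentence and converting the ddmm.mmmm encoding to decimal degrees. A minimal sketch of that conversion (an assumption for illustration; the paper does not detail its parsing code):

```python
def nmea_to_decimal(value, hemisphere):
    """Convert an NMEA coordinate field (ddmm.mmmm) to decimal degrees."""
    degrees, minutes = divmod(float(value), 100.0)
    decimal = degrees + minutes / 60.0
    # South and West hemispheres are negative by convention
    return -decimal if hemisphere in ("S", "W") else decimal


def parse_gpgga(sentence):
    """Extract (latitude, longitude) from a $GPGGA sentence."""
    fields = sentence.split(",")
    lat = nmea_to_decimal(fields[2], fields[3])
    lon = nmea_to_decimal(fields[4], fields[5])
    return lat, lon


# Example sentence: 48°07.038' N, 11°31.000' E
lat, lon = parse_gpgga(
    "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"
)
```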


2020 ◽  
pp. 3-8
Author(s):  
Jala Aghazada

A data warehouse (DW) is the basis of systems for operational data analysis (OLAP, Online Analytical Processing). Data extracted from different sources is transformed and loaded into the DW. The proper organization of this process, called ETL (Extract, Transform, Load), is of great importance for the creation of a DW and for analytical data processing. Forms of organization, methods of implementation, and the modeling of ETL processes are considered in this paper.
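The three ETL stages described here can be illustrated end to end in a few lines, with SQLite standing in for the warehouse (the data and table are hypothetical, chosen only to show the stage boundaries):

```python
import csv
import io
import sqlite3

# Extract: read raw rows from a source system (an in-memory CSV here)
raw = "date,amount\n2020-01-02,100\n2020-01-03,250\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: cast types and reshape records for the warehouse schema
records = [(r["date"], int(r["amount"])) for r in rows]

# Load: insert into the warehouse table, then query it analytically
dw = sqlite3.connect(":memory:")
dw.execute("CREATE TABLE sales (day TEXT, amount INTEGER)")
dw.executemany("INSERT INTO sales VALUES (?, ?)", records)
total = dw.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)
```

Real ETL pipelines add staging areas, incremental loads, and error handling around exactly these three steps.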


2013 ◽  
Vol 336-338 ◽  
pp. 295-302 ◽  
Author(s):  
Yang Ming Xie ◽  
Qing Li ◽  
Guo Qing Jiang

In order to thoroughly reflect the underground deformation of a rock mass, this article presents a sensor system developed for landslide study and derives a reliable fitting formula from the experimental data. We first briefly introduce the fundamental principles and measuring methods of the instrument, then describe the whole monitoring process, and finally, in the data processing, obtain an effective fitting formula by analyzing the basic steps of the Levenberg-Marquardt algorithm and using it to fit the experimental data. The experiment demonstrates that real-time underground displacement measurement is practical and can be applied to analyze the early deformation of a rock mass and to warn of unstable situations.
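The Levenberg-Marquardt algorithm the authors use is a damped Gauss-Newton iteration: each step solves the normal equations with an adaptive damping term that shrinks when a step reduces the residual and grows when it does not. A minimal illustrative implementation is sketched below, fitting a hypothetical exponential model y = a·exp(b·x) (the paper's actual fitting formula and data are not reproduced here):

```python
import numpy as np


def lev_marq(x, y, p, n_iter=50, lam=1e-3):
    """Minimal Levenberg-Marquardt fit of y ~ a*exp(b*x); p = [a, b]."""
    def resid(p):
        a, b = p
        return a * np.exp(b * x) - y

    def jac(p):
        a, b = p
        e = np.exp(b * x)
        # Partial derivatives of the residual w.r.t. a and b
        return np.column_stack([e, a * x * e])

    cost = np.sum(resid(p) ** 2)
    for _ in range(n_iter):
        r, J = resid(p), jac(p)
        # Damped normal equations: (J^T J + lam*I) delta = -J^T r
        delta = np.linalg.solve(J.T @ J + lam * np.eye(len(p)), -J.T @ r)
        p_new = p + delta
        cost_new = np.sum(resid(p_new) ** 2)
        if cost_new < cost:   # step reduced the residual: accept, damp less
            p, cost, lam = p_new, cost_new, lam / 10
        else:                 # step failed: reject, damp more
            lam *= 10
    return p


# Recover a=2.0, b=1.5 from noiseless synthetic data
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(1.5 * x)
a, b = lev_marq(x, y, np.array([1.0, 1.0]))
```

In practice one would use a library routine such as SciPy's `least_squares` rather than hand-rolling the iteration; the sketch only makes the damping logic explicit.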


2021 ◽  
Vol 315 ◽  
pp. 03027
Author(s):  
Yuri Ignatov ◽  
Oleg Tailakov ◽  
Evgeniy Saltymakov ◽  
Daniil Gorodilov

In modern times, the development of geology and geophysics is associated with complex experiments. The results of these experiments are large arrays of numerical data, which require processing and further analysis. Processing these data manually would be a very difficult and routine task. For such studies, specialized tools are important, as they significantly speed up processing and visualize geophysical data in real time. Software has been developed to automate the processing of geophysical data obtained from electrical exploration surveys. The designed postprocessor performs data correction and visualization of geological and geophysical profiles. The user interface of the program provides researchers with the ability to process the initial geophysical data interactively.


2016 ◽  
Vol 12 (S325) ◽  
pp. 73-82 ◽  
Author(s):  
Pierre Dubath ◽  
Nikolaos Apostolakos ◽  
Andrea Bonchi ◽  
Andrey Belikov ◽  
Massimo Brescia ◽  
...  

Euclid is a Europe-led cosmology space mission dedicated to a visible and near-infrared survey of the entire extra-galactic sky. Its purpose is to deepen our knowledge of the dark content of our Universe. After an overview of the Euclid mission and science, this contribution describes how the community is getting organized to face the data analysis challenges, both in software development and in operational data processing matters. It ends with a more specific account of some of the main contributions of the Swiss Science Data Center (SDC-CH).

