Evaluating Data Journeys: Climategate, Synthetic Data and the Benchmarking of Methods for Climate Data Processing

2020 ◽  
pp. 191-206
Author(s):  
Wendy S. Parker


2017 ◽  
Vol 8 (2) ◽  
pp. 88-105 ◽  
Author(s):  
Gunasekaran Manogaran ◽  
Daphne Lopez

Ambient intelligence is an emerging platform that combines advances in sensors and sensor networks, pervasive computing, and artificial intelligence to capture real-time climate data. This continuously generates several exabytes of unstructured sensor data, which is why it is often called big climate data. Researchers are now trying to use big climate data to monitor and predict climate change and associated diseases. Traditional data processing techniques and tools are not capable of handling such huge amounts of climate data, so there is a need to develop an advanced big data architecture for processing real-time climate data. The purpose of this paper is to propose a big-data-based surveillance system that analyzes spatial climate big data and continuously monitors the correlation between climate change and dengue. The proposed disease surveillance system has been implemented with the help of Apache Hadoop MapReduce and its supporting tools.
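The continuous-monitoring idea lends itself to a MapReduce formulation: each mapper emits per-record sufficient statistics, and a reducer combines them into a correlation coefficient. The sketch below is a minimal single-process illustration of that pattern, not the paper's Hadoop implementation; the region keys, rainfall figures, and case counts are invented placeholders.

```python
from collections import defaultdict
import math

# Hypothetical illustration (not the paper's code): Pearson correlation between
# a climate variable (here, rainfall) and dengue case counts, computed from
# per-record sufficient statistics so mappers never hold the whole series.

records = [  # (region, rainfall_mm, dengue_cases) -- toy data
    ("north", 120.0, 30), ("north", 150.0, 45), ("north", 90.0, 18),
    ("south", 200.0, 60), ("south", 80.0, 20), ("south", 140.0, 41),
]

def mapper(record):
    region, x, y = record
    # Emit associative sufficient statistics; any combiner/reducer can sum them.
    return region, (1.0, x, y, x * x, y * y, x * y)

def reducer(values):
    n = sx = sy = sxx = syy = sxy = 0.0
    for c, x, y, xx, yy, xy in values:
        n += c; sx += x; sy += y; sxx += xx; syy += yy; sxy += xy
    num = n * sxy - sx * sy
    den = math.sqrt(n * sxx - sx * sx) * math.sqrt(n * syy - sy * sy)
    return num / den

grouped = defaultdict(list)            # stands in for the shuffle phase
for rec in records:
    key, stats = mapper(rec)
    grouped[key].append(stats)

correlations = {k: reducer(v) for k, v in grouped.items()}
```

Because the statistics are sums, they can be merged incrementally as new sensor batches arrive, which is what makes the approach suitable for a streaming surveillance setting.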


Geophysics ◽  
2020 ◽  
Vol 85 (6) ◽  
pp. G129-G141
Author(s):  
Diego Takahashi ◽  
Vanderlei C. Oliveira Jr. ◽  
Valéria C. F. Barbosa

We have developed an efficient and very fast equivalent-layer technique for gravity data processing by modifying an iterative method grounded on an excess-mass constraint that does not require the solution of linear systems. Taking advantage of the symmetric block-Toeplitz Toeplitz-block (BTTB) structure of the sensitivity matrix that arises when regular grids of observation points and equivalent sources (point masses) are used to set up a fictitious equivalent layer, we develop an algorithm that greatly reduces the computational complexity and RAM required to estimate a 2D mass distribution over the equivalent layer. The symmetric BTTB matrix is fully defined by the elements of the first column of the sensitivity matrix, which, in turn, can be embedded into a symmetric block-circulant with circulant-block (BCCB) matrix. Likewise, only the first column of the BCCB matrix is needed to reconstruct the full sensitivity matrix. From the first column of the BCCB matrix, its eigenvalues can be calculated using the 2D fast Fourier transform (2D FFT), which can then be used to readily compute the matrix-vector product of the forward modeling in the fast equivalent-layer technique. As a result, our method is efficient for processing very large data sets. Tests with synthetic data demonstrate the ability of our method to satisfactorily upward- and downward-continue gravity data. Our results show very small border effects and noise amplification compared with those produced by the classic approach in the Fourier domain. In addition, they show that, whereas the running time of our method is [Formula: see text] s for processing [Formula: see text] observations, the fast equivalent-layer technique used [Formula: see text] s with [Formula: see text]. A test with field data from the Carajás Province, Brazil, illustrates the low computational cost of our method to process a large data set composed of [Formula: see text] observations.
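The circulant-embedding trick at the heart of this abstract is easiest to see in one dimension. The sketch below collapses the 2D BTTB/BCCB machinery to a plain symmetric Toeplitz matrix: its first column is embedded into a circulant of twice the size, whose eigenvalues are the FFT of that column, so the matrix-vector product costs O(n log n) instead of O(n²). All names are illustrative, not the authors' code.

```python
import numpy as np

# 1D sketch of the circulant-embedding idea (the paper uses 2D blocks and the
# 2D FFT; this simplification keeps only the structural point).

rng = np.random.default_rng(0)
n = 5
col = rng.normal(size=n)            # first column defines the symmetric Toeplitz T
x = rng.normal(size=n)

# Dense reference: T[i, j] = col[|i - j|]
T = col[np.abs(np.subtract.outer(np.arange(n), np.arange(n)))]
y_dense = T @ x

# FFT route: embed T's first column into a circulant of size 2n. The FFT of
# that embedded column gives the circulant's eigenvalues, so the product is
# a pointwise multiply in the Fourier domain.
v = np.concatenate([col, [0.0], col[-1:0:-1]])      # length 2n
x_pad = np.concatenate([x, np.zeros(n)])
y_fft = np.fft.ifft(np.fft.fft(v) * np.fft.fft(x_pad)).real[:n]

assert np.allclose(y_dense, y_fft)
```

Only the first column (length n) is ever stored, which mirrors the paper's point that the full sensitivity matrix never needs to be formed.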


2018 ◽  
Vol 44 ◽  
pp. 00083 ◽  
Author(s):  
Leszek Kuchar ◽  
Slawomir Iwanski

In this paper, a new validation test for the spatial weather generator SWGEN, which produces multisite daily time series of solar radiation, temperature and precipitation, is presented. The method was tested by comparing statistics of 1000 years of generated data with a long, 35-year series of observed weather parameters from 24 meteorological stations in south-west Poland. The evaluation showed that the means (sums) and variances of the generated data were comparable with the observed climatic data aggregated by month, season and year.
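The comparison described above reduces to checking aggregated moments of the synthetic series against the observed record. A minimal sketch of that check, with random placeholder data standing in for SWGEN output and the Polish station observations:

```python
import numpy as np

# Hedged sketch (not SWGEN itself): compare monthly means and variances of a
# long synthetic series against a shorter observed record. Data are random
# placeholders, not real station values.

rng = np.random.default_rng(1)
obs = rng.normal(loc=10.0, scale=3.0, size=(35, 12))    # 35 observed years x 12 months
syn = rng.normal(loc=10.0, scale=3.0, size=(1000, 12))  # 1000 generated years

# A well-behaved generator keeps monthly means within sampling error of the
# observations and variance ratios close to 1.
mean_diff = np.abs(syn.mean(axis=0) - obs.mean(axis=0))
var_ratio = syn.var(axis=0, ddof=1) / obs.var(axis=0, ddof=1)
```

The same reduction works for seasonal and annual aggregates by reshaping the month axis before taking moments.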


Energies ◽  
2021 ◽  
Vol 15 (1) ◽  
pp. 33
Author(s):  
Iker Elorza ◽  
Iker Arrizabalaga ◽  
Aritz Zubizarreta ◽  
Héctor Martín-Aguilar ◽  
Aron Pujana-Arrese ◽  
...  

Modern wind turbines depend on their blade pitch systems for start-ups, shutdowns, and power control. Pitch system failures have, therefore, a considerable impact on their operation and integrity. Hydraulic pitch systems are very common, due to their flexibility, maintainability, and cost; hence, the relevance of diagnostic algorithms specifically targeted at them. We propose one such algorithm based on sensor data available to the vast majority of turbine controllers, which we process to fit a model of the hydraulic pitch system to obtain significant indicators of the presence of the critical failure modes. This algorithm differs from state-of-the-art, model-based algorithms in that it does not numerically time-integrate the model equations in parallel with the physical turbine, which is demanding in terms of in situ computation (or, alternatively, data transmission) and is highly susceptible to drift. Our algorithm requires only a modest amount of local sensor data processing, which can be asynchronous and intermittent, to produce negligible quantities of data to be transmitted for remote storage and analysis. In order to validate our algorithm, we use synthetic data generated with state-of-the-art aeroelastic and hydraulic simulation software. The results suggest that a diagnosis of the critical wind turbine hydraulic pitch system failure modes based on our algorithm is viable.
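The distinguishing claim is that the model is fitted to logged sensor data rather than time-integrated alongside the turbine. A heavily simplified sketch of that fitting step, with an invented one-parameter leakage model standing in for the authors' hydraulic pitch model (the signals, units, and coefficient are all assumptions):

```python
import numpy as np

# Hypothetical illustration of the general idea (not the authors' algorithm):
# fit a simple hydraulic relation to batched pitch-system logs with ordinary
# least squares; drift of the fitted coefficient across batches would then
# serve as a failure indicator, with no real-time model integration needed.

rng = np.random.default_rng(2)
t = np.linspace(0.0, 10.0, 500)
pressure = 150.0 + 20.0 * np.sin(0.8 * t)                  # bar, synthetic log
leak_true = 0.05                                           # assumed leakage coefficient
flow = leak_true * pressure + rng.normal(0, 0.1, t.size)   # leakage flow + sensor noise

# Linear least squares: flow ~ k * pressure. The fit can run asynchronously
# on intermittent batches, and only k_hat needs to be transmitted.
k_hat = float(np.linalg.lstsq(pressure[:, None], flow, rcond=None)[0][0])
```

This matches the abstract's point that only negligible quantities of data (here, one scalar per batch) leave the turbine for remote storage and analysis.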


Geophysics ◽  
1967 ◽  
Vol 32 (2) ◽  
pp. 207-224 ◽  
Author(s):  
John D. Marr ◽  
Edward F. Zagst

The more recent developments in common‐depth‐point techniques to attenuate multiple reflections have resulted in an exploration capability comparable to the development of the seismic reflection method. The combination of new concepts in digital seismic data processing with CDP techniques is creating unforeseen exploration horizons with vastly improved seismic data. Major improvements in multiple reflection and reverberation attenuation are now attainable with appropriate CDP geometry and special CDP stacking procedures. Further major improvements are clearly evident in the very near future with the use of multichannel digital filtering‐stacking techniques and the application of deconvolution as the first step in seismic data processing. CDP techniques are briefly reviewed and evaluated with real and experimental data. Synthetic data are used to illustrate that all seismic reflection data should be deconvolved as the first processing step.
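The statistical benefit underlying CDP stacking can be sketched in a few lines: coherent primary energy adds linearly across the fold, while incoherent noise and residual multiples add only as the square root of the fold. The gather below is synthetic and purely illustrative.

```python
import numpy as np

# Illustrative sketch only: stacking a fold of noisy traces that share the
# same primary signal improves signal-to-noise by roughly sqrt(fold).

rng = np.random.default_rng(3)
n_samples, fold = 1000, 24
primary = np.sin(2 * np.pi * 30 * np.linspace(0, 0.5, n_samples))  # 30 Hz primary
traces = primary + rng.normal(0, 2.0, size=(fold, n_samples))      # noisy CDP gather

stack = traces.mean(axis=0)
snr_single = primary.std() / (traces[0] - primary).std()
snr_stack = primary.std() / (stack - primary).std()
# Expect roughly sqrt(24) ~ 4.9x improvement over a single trace.
```

Multiples that survive moveout correction are only partially incoherent across offsets, which is why the paper pairs stacking with deconvolution rather than relying on either alone.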


2014 ◽  
Vol 1041 ◽  
pp. 129-134 ◽  
Author(s):  
Daniela Štaffenová ◽  
Radoslav Ponechal ◽  
Pavol Ďurica ◽  
Marek Cangár

This article describes part of an experimental measurement campaign carried out with an independent weather station. Measurements from a chosen period (one week in March 2014) are processed, plotted, interpreted and compared with the results of a real meteorological station operated by SHMÚ. Suitable climatic units are derived. The influence of differences in the measured data on the calculated thermal-technical parameters of envelope constructions is analysed.


2018 ◽  
Vol 11 (5) ◽  
pp. 3021-3029
Author(s):  
Stefanie Kremser ◽  
Jordis S. Tradowsky ◽  
Henning W. Rust ◽  
Greg E. Bodeker

Abstract. Upper-air measurements of essential climate variables (ECVs), such as temperature, are crucial for climate monitoring and climate change detection. Because of the internal variability of the climate system, many decades of measurements are typically required to robustly detect any trend in the climate data record. It is imperative for the records to be temporally homogeneous over many decades to confidently estimate any trend. Historically, records of upper-air measurements were primarily made for short-term weather forecasts and as such are seldom suitable for studying long-term climate change as they lack the required continuity and homogeneity. Recognizing this, the Global Climate Observing System (GCOS) Reference Upper-Air Network (GRUAN) has been established to provide reference-quality measurements of climate variables, such as temperature, pressure, and humidity, together with well-characterized and traceable estimates of the measurement uncertainty. To ensure that GRUAN data products are suitable to detect climate change, a scientifically robust instrument replacement strategy must always be adopted whenever there is a change in instrumentation. By fully characterizing any systematic differences between the old and new measurement system, a temporally homogeneous data series can be created. One strategy is to operate both the old and new instruments in tandem for some overlap period to characterize any inter-instrument biases. However, this strategy can be prohibitively expensive at measurement sites operated by national weather services or research institutes. An alternative strategy that has been proposed is to alternate between the old and new instruments, so-called interlacing, and then statistically derive the systematic biases between the two instruments. Here we investigate the feasibility of such an approach specifically for radiosondes, i.e. flying the old and new instruments on alternating days. Synthetic data sets are used to explore the applicability of this statistical approach to radiosonde change management.
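The interlacing idea can be illustrated with a toy synthetic experiment: two sonde types flown on alternating days sample the same slowly varying atmosphere plus instrument-specific offsets, and the offset difference is recoverable without any tandem flights. This is a hedged sketch under invented numbers, not GRUAN's actual analysis.

```python
import numpy as np

# Toy interlacing experiment: estimate the inter-instrument bias from an
# alternate-day flight schedule. Bias, noise level, and seasonal cycle are
# synthetic assumptions for illustration only.

rng = np.random.default_rng(4)
days = np.arange(365)
truth = 220.0 + 10.0 * np.sin(2 * np.pi * days / 365)   # K, smooth seasonal cycle
noise = rng.normal(0, 0.5, days.size)

bias_old, bias_new = 0.0, 0.8                           # K, new sonde reads warm
sonde_bias = np.where(days % 2 == 0, bias_old, bias_new)  # alternate-day schedule
obs = truth + sonde_bias + noise

# Difference each flight from the mean of its two neighbours: the smooth
# atmospheric signal largely cancels, leaving the inter-instrument offset.
d = obs[1:-1] - 0.5 * (obs[:-2] + obs[2:])
bias_est = float(np.mean(d[(days[1:-1] % 2) == 1]))     # offsets on "new" days
```

In this idealized case the estimate recovers the 0.8 K offset to within the averaging noise; real atmospheric variability between consecutive days is far larger, which is exactly the feasibility question the paper addresses.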

