Time-Series Mapping of PM10 Concentration Using Multi-Gaussian Space-Time Kriging: A Case Study in the Seoul Metropolitan Area, Korea

2016 ◽  
Vol 2016 ◽  
pp. 1-10 ◽  
Author(s):  
No-Wook Park

This paper presents space-time kriging within a multi-Gaussian framework for time-series mapping of the concentration of particulate matter less than 10 μm in aerodynamic diameter (PM10). To account for the spatiotemporal autocorrelation structure of the monitoring data and to model the uncertainty attached to the prediction, conventional multi-Gaussian kriging is extended to the space-time domain. The multi-Gaussian space-time kriging presented in this paper is based on decomposing the PM10 concentrations into deterministic trend and stochastic residual components. The deterministic trend component is modeled and regionalized using temporal elementary functions. For the residual component, which is the main target of space-time kriging, spatiotemporal autocorrelation information is modeled and used for space-time mapping of the residual. Conditional cumulative distribution functions (ccdfs) are constructed from the trend and residual components and the space-time kriging variance. The PM10 concentration estimate and conditional variance are then obtained empirically from the ccdfs at all locations in the study area. A case study using the monthly PM10 concentrations from 2007 to 2011 in the Seoul metropolitan area, Korea, illustrates the applicability of the presented method, which generated time-series PM10 concentration maps together with supporting information for their interpretation, and led to better prediction performance than conventional spatial kriging.
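To make the ccdf construction step concrete, the sketch below shows how, under a multi-Gaussian assumption, a Gaussian ccdf in normal-score space (defined by the space-time kriging mean and variance of the residuals) can be back-transformed and recombined with the trend to yield an estimate and a conditional variance. The function names, the discretization into quantiles, and the interpolation choices are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of multi-Gaussian ccdf post-processing, assuming the trend
# has already been removed and the residuals transformed to normal scores.
import numpy as np
from scipy.stats import norm
from scipy.interpolate import interp1d

def backtransform_table(residuals):
    """Build the inverse normal-score transform from observed residuals."""
    r_sorted = np.sort(residuals)
    # Empirical plotting positions mapped to standard-normal quantiles
    p = (np.arange(1, r_sorted.size + 1) - 0.5) / r_sorted.size
    z = norm.ppf(p)
    return interp1d(z, r_sorted, bounds_error=False,
                    fill_value=(r_sorted[0], r_sorted[-1]))

def ccdf_estimate_and_variance(z_mean, z_var, trend, inv_nscore, n_q=99):
    """Estimate and conditional variance from the Gaussian ccdf.

    z_mean, z_var : space-time kriging mean and variance of the residual
                    in normal-score space at one location/time.
    trend         : regionalized deterministic trend value there.
    """
    # Under the multi-Gaussian assumption the ccdf in normal-score space
    # is N(z_mean, z_var); discretize it with n_q quantiles.
    p = np.arange(1, n_q + 1) / (n_q + 1)
    z_quantiles = norm.ppf(p, loc=z_mean, scale=np.sqrt(z_var))
    # Back-transform each quantile and restore the trend component.
    x_quantiles = inv_nscore(z_quantiles) + trend
    # E-type estimate and conditional variance from the discretized ccdf.
    return x_quantiles.mean(), x_quantiles.var()
```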


2021 ◽  
Author(s):  
Caroline Moy

This thesis considers the conventional SARIMA model and the EVT-GARCH model for forecasting electricity prices. However, we find that these models do not adequately capture the important characteristics of the electricity price data. A new model, the EVT-SARIMA model, is developed for forecasting electricity prices and is found to be the best at modelling their nature. A time series of half-hourly electricity price data from the Hayward node in New Zealand is transformed into a daily average price series, and appropriate models are fitted to the resulting series for estimation and forecasting. The new EVT-SARIMA model is used to simulate 1000 time series of daily electricity prices over a 90-day period to consider strategies for managing the risk associated with price volatility. The effects of different financial instruments on the cumulative distribution functions of predicted revenue obtained using our model are considered. Results suggest that different contracts have different effects on the predicted revenue. However, all contracts reduce variability in the predicted revenue values and thus should be used by a risk manager to narrow the range of probable revenue values. The quantity traded and the choice of contracts depend on the objectives of the risk manager.
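The risk-management step described above can be illustrated with a short simulation: given many simulated daily price paths, compare the distribution of 90-day revenue with and without a fixed-price contract. The sketch below uses a lognormal stand-in price simulator (not the thesis's EVT-SARIMA model), and the volume, strike, and hedged fraction are hypothetical values for illustration only.

```python
# Illustrative sketch: effect of a fixed-price contract on the spread of
# the predicted-revenue distribution. Prices here are lognormal stand-ins;
# the thesis would use EVT-SARIMA paths. All parameter values are assumed.
import numpy as np

rng = np.random.default_rng(42)
n_sims, n_days = 1000, 90
volume_mwh = 100.0          # assumed constant daily volume sold
strike = 85.0               # assumed fixed contract price ($/MWh)
hedged_fraction = 0.5       # share of volume sold at the fixed price

prices = rng.lognormal(mean=np.log(80.0), sigma=0.4, size=(n_sims, n_days))

spot_revenue = (prices * volume_mwh).sum(axis=1)
# With the contract, the hedged share earns the strike instead of spot.
hedged_revenue = ((1 - hedged_fraction) * prices * volume_mwh
                  + hedged_fraction * strike * volume_mwh).sum(axis=1)

# Compare the spread of the two revenue distributions (their CDFs).
for name, rev in [("unhedged", spot_revenue), ("hedged", hedged_revenue)]:
    p5, p95 = np.percentile(rev, [5, 95])
    print(f"{name}: 5th-95th percentile revenue range = {p95 - p5:,.0f}")
```

As in the thesis's findings, the hedged revenue distribution shows a narrower percentile range, at the cost of giving up some upside when spot prices are high.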


2017 ◽  
Vol 10 (2) ◽  
pp. 681-693 ◽  
Author(s):  
Klaus Gierens ◽  
Kostas Eleftheratos

Abstract. In the present study we explore the capability of the intercalibrated HIRS brightness temperature data at channel 12 (the HIRS water vapour channel; T12) to reproduce ice supersaturation in the upper troposphere during the period 1979–2014. Focus is given to the transition from the HIRS 2 to the HIRS 3 instrument in the year 1999, which involved a shift of the central wavelength in channel 12 from 6.7 to 6.5 µm. It is shown that this shift produced a discontinuity in the time series of low T12 values (< 235 K) and associated cases of high upper-tropospheric humidity with respect to ice (UTHi > 70 %) in the year 1999, which prevented us from maintaining a continuous, long-term time series of ice saturation throughout the whole record (1979–2014). We show that additional corrections to the low T12 values are required in order to bring HIRS 3 levels down to HIRS 2 levels. The new corrections are based on the cumulative distribution functions of T12 from the NOAA 14 and 15 satellites (that is, from the period when the transition from HIRS 2 to HIRS 3 occurred). By applying these corrections to the low T12 values we show that the discontinuity in the time series caused by the transition from HIRS 2 to HIRS 3 is no longer apparent when calculating extreme UTHi cases. We arrive at a new time series for values found at the low tail of the T12 distribution, which can be further exploited for analyses of ice saturation and supersaturation cases. The validity of the new method with respect to typical intercalibration methods, such as regression-based methods, is presented and discussed.
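A correction built from the cumulative distribution functions of two overlapping records is, in essence, empirical quantile mapping. The sketch below shows one plausible form of such a CDF-matching correction: low HIRS/3 T12 values are mapped onto the HIRS/2 distribution using overlapping NOAA-14/NOAA-15 samples. The 235 K threshold comes from the abstract; the function signature and implementation details are assumptions, not the authors' code.

```python
# Sketch of a CDF-matching (quantile-mapping) correction in the spirit of
# the method described above. `ref_hirs2_overlap` and `ref_hirs3_overlap`
# stand for T12 samples from the NOAA-14/NOAA-15 overlap period.
import numpy as np

def cdf_match(t12_hirs3, ref_hirs2_overlap, ref_hirs3_overlap, t_max=235.0):
    """Correct low HIRS/3 T12 values via empirical quantile mapping."""
    out = np.asarray(t12_hirs3, dtype=float).copy()
    low = out < t_max                      # only the low tail is corrected
    # Empirical non-exceedance probability of each value within the
    # HIRS/3 overlap sample ...
    h3 = np.sort(ref_hirs3_overlap)
    p = np.searchsorted(h3, out[low], side="right") / h3.size
    # ... mapped to the same quantile of the HIRS/2 overlap sample.
    out[low] = np.quantile(ref_hirs2_overlap, np.clip(p, 0.0, 1.0))
    return out
```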


2020 ◽  
Vol 95 ◽  
pp. 103857 ◽
Author(s):  
Asma Belhadi ◽  
Youcef Djenouri ◽  
Kjetil Nørvåg ◽  
Heri Ramampiaro ◽  
Florent Masseglia ◽  
...  

2020 ◽  
Vol 12 (21) ◽  
pp. 3505 ◽
Author(s):  
Muhammad Fulki Fadhillah ◽  
Arief Rizqiyanto Achmad ◽  
Chang-Wook Lee

The aims of this research were to map and analyze the risk of land subsidence in the Seoul Metropolitan Area, South Korea, using satellite interferometric synthetic aperture radar (InSAR) time-series data and three ensemble machine-learning models: Bagging, LogitBoost, and Multiclass Classifier. Of the types of infrastructure present in the Seoul Metropolitan Area, subway lines may be vulnerable to land subsidence. In this study, we analyzed Persistent Scatterer InSAR time-series data using the Stanford Method for Persistent Scatterers (StaMPS) algorithm to generate a deformation time-series map. Subsidence occurred at four locations, with deformation rates that ranged from 6 to 12 mm/year. Subsidence inventory maps were prepared using deformation time-series data from Sentinel-1. Additionally, 10 potential subsidence-related factors were selected and subjected to Geographic Information System analysis. The relationship between each factor and subsidence occurrence was analyzed using the frequency ratio. Land subsidence susceptibility maps were generated using the Bagging, Multiclass Classifier, and LogitBoost models, and map validation was carried out using the area under the curve (AUC) method. Of the three models, Bagging produced the largest AUC (0.883), with LogitBoost and Multiclass Classifier producing AUCs of 0.871 and 0.856, respectively.
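The frequency ratio used above relates how often subsidence occurs within a class of a conditioning factor to how common that class is overall; values above 1 indicate a positive association. A minimal sketch follows, assuming rasters flattened to 1-D arrays; the variable names and the toy data are illustrative, not from the study.

```python
# Frequency ratio per class of one conditioning factor:
# FR(c) = (% of subsidence pixels in class c) / (% of all pixels in class c)
import numpy as np

def frequency_ratio(factor_class, subsidence):
    """Compute FR for each class label in `factor_class`."""
    fr = {}
    n_total = factor_class.size
    n_sub = subsidence.sum()
    for c in np.unique(factor_class):
        in_class = factor_class == c
        pct_sub = (subsidence & in_class).sum() / n_sub
        pct_all = in_class.sum() / n_total
        fr[c] = pct_sub / pct_all if pct_all > 0 else np.nan
    return fr

# Toy 10-pixel raster: slope reclassified into 3 classes plus an
# inventory flagging observed subsidence pixels.
factor_class = np.array([1, 1, 2, 2, 2, 3, 3, 3, 3, 3])
subsidence   = np.array([0, 0, 1, 1, 0, 0, 0, 0, 0, 1], dtype=bool)
print(frequency_ratio(factor_class, subsidence))  # class 2 has FR > 1
```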


Water ◽  
2019 ◽  
Vol 11 (5) ◽  
pp. 1062 ◽  
Author(s):  
Anqi Wang ◽  
Dimitri P. Solomatine

Currently, practically no modeling study is expected to be carried out without some form of Sensitivity Analysis (SA). At the same time, there is a large number of methods, and it is not always easy for practitioners to choose one. The aim of this paper is to briefly review the main classes of SA methods and to present the results of a practical comparative analysis of applying them. Six different global SA methods, Sobol, eFAST (extended Fourier Amplitude Sensitivity Test), Morris, LH-OAT, RSA (Regionalized Sensitivity Analysis), and PAWN, are tested on three conceptual rainfall-runoff models of varying complexity (GR4J, Hymod, and HBV) applied to the case study of the Bagmati basin (Nepal). The methods are compared with respect to effectiveness, efficiency, and convergence, and a practical framework for selecting and using SA methods is presented. The results show that, first of all, all six SA methods are effective. The Morris and LH-OAT methods are the most efficient in computing sensitivity indices and rankings. eFAST performs better than Sobol and can thus be seen as a viable alternative to it. The PAWN and RSA methods have issues of instability, which we think are due to the way the Cumulative Distribution Functions (CDFs) are built and to the use of the Kolmogorov–Smirnov statistic to compute sensitivity indices. All methods require a sufficient number of runs to reach convergence, and differences in efficiency between methods are an inevitable consequence of differences in their underlying principles. For SA of hydrological models, it is recommended to apply the presented practical framework, assuming the use of several methods, and to explicitly take into account the constraints of effectiveness, efficiency (including convergence), ease of use, and availability of software.
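To give a feel for the CDF-based approach whose stability the abstract questions, the sketch below implements the basic PAWN idea: the sensitivity of an output to parameter x_i is summarized by Kolmogorov-Smirnov distances between the unconditional output CDF and output CDFs conditional on fixed values of x_i. The toy model, sample sizes, and number of conditioning points are illustrative assumptions, not the paper's setup.

```python
# Minimal PAWN sketch: max KS distance between the unconditional output
# CDF and CDFs conditional on fixing one parameter at several values.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)

def model(x):                      # toy stand-in for a rainfall-runoff model
    return x[:, 0] ** 2 + 0.1 * x[:, 1]

n_unc, n_cond, n_pts, dim = 2000, 500, 10, 2
y_unc = model(rng.uniform(size=(n_unc, dim)))   # unconditional sample

def pawn_index(i):
    """Max KS statistic over conditioning points for parameter i."""
    ks = []
    for xi in np.linspace(0.05, 0.95, n_pts):   # conditioning values
        x = rng.uniform(size=(n_cond, dim))
        x[:, i] = xi                            # fix parameter i
        ks.append(ks_2samp(y_unc, model(x)).statistic)
    return max(ks)

print([round(pawn_index(i), 2) for i in range(dim)])  # x0 dominates
```

Because each index rests on empirical CDFs from finite conditional samples, its value fluctuates with the sampling, which is consistent with the instability the paper attributes to how the CDFs are built.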

