Relevance of Input Data Time Series for Tax Revenue Forecasting

2015 ◽  
Vol 25 ◽  
pp. 518-529
Author(s):  
Ondřej Bayer
2015 ◽  
Vol 12 (8) ◽  
pp. 7437-7467 ◽  
Author(s):  
J. E. Reynolds ◽  
S. Halldin ◽  
C. Y. Xu ◽  
J. Seibert ◽  
A. Kauffeldt

Abstract. Concentration times in small and medium-sized watersheds (~ 100–1000 km²) are commonly less than 24 h. Flood-forecasting models then require data at sub-daily time scales, but time series of input and runoff data of sufficient length are often only available at the daily time scale, especially in developing countries. This has led to a search for time-scale relationships to infer parameter values at the time scales where they are needed from the time scales where they are available. In this study, time-scale dependencies in the HBV-light conceptual hydrological model were assessed within the generalized likelihood uncertainty estimation (GLUE) approach. It was hypothesised that the existence of such dependencies is a result of the numerical method or time-stepping scheme used in the models rather than a real time-scale dependence in the data. Inferred parameter values showed a clear dependence on time scale when the explicit Euler method was used for modelling at the same time steps as the time scale of the input data (1–24 h). However, the dependence almost fully disappeared when the explicit Euler method was used for modelling in 1 h time steps internally, irrespective of the time scale of the input data. In other words, when an adequate time-stepping scheme was implemented, parameter sets inferred at one time scale (e.g., daily) could be used directly for runoff simulations at other time scales (e.g., 3 or 6 h) without any time scaling, and this approach resulted in only a small (if any) decrease in model performance, in terms of Nash–Sutcliffe and volume-error efficiencies. The overall results of this study indicated that as soon as sub-daily driving data can be secured, flood forecasting in watersheds with sub-daily concentration times is possible with model-parameter values inferred from long time series of daily data, as long as an appropriate numerical method is used.
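The time-stepping effect the abstract describes can be illustrated with a toy model. The following is a minimal sketch (a single linear reservoir, not the authors' HBV-light code), assuming a hypothetical outflow coefficient and constant inflow: advancing one daily step with the explicit Euler method yields a very different storage than 24 internal 1 h sub-steps with the same parameter value, which is exactly the kind of scheme-induced time-scale dependence at issue.

```python
def euler_reservoir(storage, inflow, k, dt_hours, substeps=1):
    """Advance a linear reservoir dS/dt = P - k*S with explicit Euler steps."""
    h = dt_hours / substeps
    for _ in range(substeps):
        storage += h * (inflow - k * storage)
    return storage

k = 0.05            # outflow coefficient in 1/h (hypothetical value)
s0, p = 100.0, 2.0  # initial storage (mm) and constant inflow (mm/h)

coarse = euler_reservoir(s0, p, k, 24, substeps=1)   # one daily Euler step
fine = euler_reservoir(s0, p, k, 24, substeps=24)    # 1 h internal sub-steps
# Same parameter, same forcing, clearly different storages after one day.
```

Running the internal integration at a fixed fine step, regardless of the input-data time scale, is what removes the apparent parameter dependence.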


Author(s):  
K. Anders ◽  
L. Winiwarter ◽  
H. Mara ◽  
R. C. Lindenbergh ◽  
S. E. Vos ◽  
...  

Abstract. Near-continuously acquired terrestrial laser scanning (TLS) data contains valuable information on natural surface dynamics. An important step in geographic analyses is to detect the different types of changes that can be observed in a scene. For this, spatiotemporal segmentation is a time-series-based method of surface change analysis that removes the need to select analysis periods, providing so-called 4D objects-by-change (4D-OBCs). This involves higher computational effort than pairwise change detection, and this effort scales with (i) the temporal density of the input data and (ii) the (variable) spatial extent of the delineated changes. These two factors determine the cost and number of Dynamic Time Warping (DTW) distance calculations to be performed for deriving the metric of time-series similarity. We investigate how a reduction of the spatial and temporal resolution of the input data influences the delineation of twelve erosion and accumulation forms, using an hourly five-month TLS time series of a sandy beach. We compare the spatial extent of 4D-OBCs obtained at reduced spatial (1.0 m to 15.0 m in 0.5 m steps) and temporal (2 h to 96 h in 2 h steps) resolution to the result from the highest-resolution data. Many change delineations achieve acceptable performance, with deviations of ±10 % to ±100 % in delineated object area depending on the spatial extent of the respective change form. We suggest a locally adaptive approach to identify poor performance at certain resolution levels for integration in a hierarchical approach, in which the spatial delineation could then be performed at high accuracy for specific target changes in a second iteration. This will allow more efficient 3D change analysis towards near-real-time, online TLS-based observation of natural surface changes.
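The role of Dynamic Time Warping in the computational cost can be sketched with the textbook dynamic-programming formulation (the standard algorithm, not the authors' implementation; the series values below are made up). The DP table has len(a)·len(b) cells, so doubling the temporal density of both input series roughly quadruples the work per distance calculation, which is why reducing temporal resolution pays off so directly.

```python
def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic-programming DTW distance."""
    n, m = len(a), len(b)
    inf = float("inf")
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

hourly = [0.0, 0.1, 0.4, 0.9, 1.0, 0.8]
lagged = [0.0, 0.0, 0.1, 0.4, 0.9, 1.0]   # same shape, shifted by one step
# The warping path absorbs the lag, so the distance stays small despite
# the temporal offset -- the property that makes DTW useful for comparing
# change histories that evolve at slightly different times.
```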


2020 ◽  
Vol 1 (1) ◽  
Author(s):  
Rahmanta Ginting

This research analyses the effect of net domestic product (NDP) and the SBI rate on tax revenue in Indonesia, with NDP and SBI as the independent variables and tax revenue as the dependent variable. The data are a time series covering 1981–2010, estimated by ordinary least squares (OLS) using a multiple linear regression model. The results show that net domestic product has a positive and significant effect on tax revenue in Indonesia at the 99% confidence level, while SBI has a negative and significant effect at the 90% level.
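The estimation approach, a multiple linear regression fitted by OLS, can be sketched as follows. The numbers here are hypothetical values chosen only to match the reported signs of the effects (revenue rising with NDP, falling with the SBI rate), not the actual 1981–2010 Indonesian series.

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [A[i][:] + [b[i]] for i in range(n)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def ols(rows, y):
    """OLS estimates for y = b0 + b1*x1 + ... via the normal equations."""
    X = [[1.0] + list(r) for r in rows]   # prepend the intercept column
    p = len(X[0])
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(p)] for i in range(p)]
    Xty = [sum(X[k][i] * y[k] for k in range(len(X))) for i in range(p)]
    return solve(XtX, Xty)

# Hypothetical data consistent only with the reported signs of the effects:
ndp = [100, 120, 140, 160, 180, 200]   # net domestic product (made up)
sbi = [12, 11, 10, 9, 8, 12]           # SBI rate, % (made up)
tax = [3 + 0.1 * n - 0.5 * s for n, s in zip(ndp, sbi)]
b0, b_ndp, b_sbi = ols(list(zip(ndp, sbi)), tax)
```

With exactly linear data the fit recovers the generating coefficients; on real data the same machinery yields the estimates whose significance the paper tests.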


2021 ◽  
Vol 6 (3) ◽  
pp. 6100-6107
Author(s):  
Nicolo Bargellesi ◽  
Alessandro Beghi ◽  
Mirco Rampazzo ◽  
Gian Antonio Susto

2021 ◽  
Author(s):  
Iuliia Burdun ◽  
Michel Bechtold ◽  
Viacheslav Komisarenko ◽  
Annalea Lohila ◽  
Elyn Humphreys ◽  
...  

Fluctuations of water table depth (WTD) affect many processes in peatlands, such as vegetation development and emissions of greenhouse gases. Here, we present the OPtical TRApezoid Model (OPTRAM) as a new method for satellite-based monitoring of the temporal variation of WTD in peatlands. OPTRAM is based on the response of short-wave infrared reflectance to the vegetation water status. For five northern peatlands with long-term in-situ WTD records, and with diverse vegetation cover and hydrological regimes, we generate a suite of OPTRAM index time series using (a) different procedures to parametrise OPTRAM (peatland-specific manual vs. globally applicable automatic parametrisation in Google Earth Engine) and (b) different satellite input data (Landsat vs. Sentinel-2). The results based on the manual parametrisation of OPTRAM indicate a high correlation with in-situ WTD time series for the pixels with the most suitable vegetation for OPTRAM application (mean Pearson correlation of 0.7 across sites), and we will present the performance differences when moving from a manual to an automatic procedure. Furthermore, for the overlap period of Landsat and Sentinel-2, which have different ranges and widths of the short-wave infrared bands used for OPTRAM calculation, the impact of the satellite input data on OPTRAM will be analysed. Eventually, the challenge of merging different satellite missions in the derivation of OPTRAM time series will be explored as an important step towards a global application of OPTRAM for the monitoring of WTD dynamics in northern peatlands.
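A minimal sketch of the trapezoid idea behind OPTRAM, following the commonly cited transformed-reflectance formulation: short-wave infrared (SWIR) reflectance is transformed and then normalised between dry and wet edges that vary linearly with NDVI. The edge parameters below are entirely hypothetical; in practice they are fitted per site (manually or automatically), which is precisely the parametrisation choice the abstract compares.

```python
def str_transform(swir):
    """Short-wave infrared transformed reflectance, STR = (1 - R)^2 / (2R)."""
    return (1.0 - swir) ** 2 / (2.0 * swir)

def optram_index(swir, ndvi, i_d, s_d, i_w, s_w):
    """Normalised moisture index: 0 at the dry edge, 1 at the wet edge."""
    str_dry = i_d + s_d * ndvi    # dry edge, linear in NDVI
    str_wet = i_w + s_w * ndvi    # wet edge, linear in NDVI
    return (str_transform(swir) - str_dry) / (str_wet - str_dry)

# One pixel with entirely hypothetical edge parameters:
w = optram_index(swir=0.18, ndvi=0.6, i_d=0.5, s_d=1.0, i_w=4.0, s_w=3.0)
```

A time series of this index per pixel is what gets correlated against the in-situ WTD records.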


2016 ◽  
Vol 43 (12) ◽  
pp. 1034-1043 ◽  
Author(s):  
Fatemeh Zahedi Tajrishi ◽  
Alireza Mirza Goltabar Roshan ◽  
Mehran Zeynalian ◽  
Javad Vaseghi Amiri

This study presents a methodology that uses a new combination of two compressed damage indices as input data for an artificial neural network (ANN) ensemble to detect multiple damages in the braces of cold-formed steel shear walls. To identify efficient input data for the ANN, three main groups of damage indices are first considered: modal-parameter-based damage indices, frequency response function (FRF)-based damage indices, and time-series-based damage indices. Furthermore, the principal component analysis (PCA) technique is applied to reduce the dimensions of the FRF- and time-series-based input patterns. Through a sensitivity study, two suitable damage indices, PCA-compressed time-series data and PCA-compressed FRFs, are identified and then combined to produce new, efficient input data for a hierarchy of ANN ensembles. The numerical results show that the ANN-ensemble-based damage detection approach with the proposed combination of two damage indices is effective and reliable.
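The PCA compression step can be sketched generically (standard PCA via power iteration on the covariance matrix, not the authors' code): each high-dimensional FRF or time-series sample is projected onto its leading principal components, and only the resulting scores are fed to the network. The toy feature vectors below are made up so that all variance lies along one known direction.

```python
def pca_first_component(samples, iters=200):
    """Leading principal component via power iteration on the covariance."""
    n, d = len(samples), len(samples[0])
    means = [sum(s[j] for s in samples) / n for j in range(d)]
    X = [[s[j] - means[j] for j in range(d)] for s in samples]  # centre data
    cov = [[sum(X[i][a] * X[i][b] for i in range(n)) / (n - 1)
            for b in range(d)] for a in range(d)]
    v = [1.0] * d
    for _ in range(iters):                      # power iteration
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    scores = [sum(X[i][j] * v[j] for j in range(d)) for i in range(n)]
    return v, scores

# Toy "feature vectors": all variance lies along the direction (1, 2, 0),
# so each sample compresses to a single score with no information loss.
samples = [[1.0, 2.0, 0.0], [2.0, 4.0, 0.0], [3.0, 6.0, 0.0], [4.0, 8.0, 0.0]]
component, scores = pca_first_component(samples)
```

The compressed scores play the role of the damage indices that the ANN ensemble consumes.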


2009 ◽  
Vol 26 (4) ◽  
pp. 806-817 ◽  
Author(s):  
M. G. G. Foreman ◽  
J. Y. Cherniawsky ◽  
V. A. Ballantyne

Abstract New computer software that permits more versatility in the harmonic analysis of tidal time series is described and tested. Specific improvements to traditional methods include the analysis of randomly sampled and/or multiyear data; more accurate nodal correction, inference, and astronomical argument adjustments through direct incorporation in the least squares matrix; multiconstituent inferences from a single reference constituent; correlation matrices and error estimates that facilitate decisions on the selection of constituents for the analysis; and a single program that analyzes one- or two-dimensional time series. This new methodology is evaluated through comparisons with results from old techniques and then applied to two problems that could not have been accurately solved with older software. They are (i) the analysis of ocean station temperature time series spanning 25 yr, and (ii) the analysis of satellite altimetry from a ground track whose proximity to land has led to significant data dropout. This new software is free as part of the Institute of Ocean Sciences (IOS) Tidal Package and can be downloaded, along with sample input data and an explanatory readme file.
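The core of such harmonic analysis is a plain least-squares fit, which is also why irregular sampling poses no fundamental difficulty. Below is a minimal sketch for a single constituent (the general technique, not the IOS Tidal Package itself), fitted to a synthetic, irregularly sampled series; the 12.42 h period is M2-like and chosen purely for illustration.

```python
import math

def solve(A, b):
    """Gaussian elimination with partial pivoting for the normal equations."""
    n = len(A)
    M = [A[i][:] + [b[i]] for i in range(n)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def harmonic_fit(times, values, omega):
    """Least-squares fit of mean + A*cos(omega*t) + B*sin(omega*t)."""
    rows = [[1.0, math.cos(omega * t), math.sin(omega * t)] for t in times]
    AtA = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    Atb = [sum(rows[k][i] * values[k] for k in range(len(rows))) for i in range(3)]
    return solve(AtA, Atb)

# Synthetic, irregularly sampled "tide" with one constituent of 12.42 h period:
omega = 2 * math.pi / 12.42
times = [0.0, 1.3, 2.9, 4.1, 6.7, 8.2, 10.5, 13.4, 17.8, 21.0, 25.3, 30.1]
values = [1.5 + 0.8 * math.cos(omega * t) + 0.3 * math.sin(omega * t) for t in times]
mean, a_cos, b_sin = harmonic_fit(times, values, omega)
amplitude = (a_cos ** 2 + b_sin ** 2) ** 0.5   # recovered constituent amplitude
```

The same design-matrix formulation extends to many constituents at once, and, as the abstract notes, to nodal corrections and inferences folded directly into the least-squares matrix.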


Geophysics ◽  
1984 ◽  
Vol 49 (5) ◽  
pp. 521-524 ◽  
Author(s):  
John Halpenny

Data from automatic recording systems often require editing and filtering before they are suitable for computer analysis. The procedure described in this paper produces edited values at regular intervals from input data containing random noise, data gaps, and sudden steps or resets. It uses a Kalman filter with a fixed delay time to estimate the most probable data value at any time, based on information both before and after the time point. Isolated portions of a bad record can be recognized and removed, and steps or offsets are identified and measured. An example is shown of clean output produced from input which suffers from a variety of instrumental problems.
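A much-simplified sketch of the filtering idea, assuming a 1-D random-walk state model and hypothetical noise parameters (the paper's filter uses a fixed delay and so also exploits information after each time point, which this forward-only sketch does not): observations whose innovation is improbably large relative to the predicted variance are flagged as bad rather than assimilated.

```python
def kalman_edit(obs, q=0.01, r=0.04, gate=3.0):
    """Scalar Kalman filter that rejects observations failing a 3-sigma gate."""
    x, p = obs[0], 1.0          # initial state estimate and its variance
    cleaned, flags = [], []
    for z in obs:
        p += q                  # predict: random-walk process noise
        innov = z - x           # innovation (observation minus prediction)
        s = p + r               # innovation variance
        if innov * innov > gate * gate * s:
            flags.append(True)  # spike or step candidate: reject observation
            cleaned.append(x)   # keep the filter's estimate instead
        else:
            k = p / s           # Kalman gain
            x += k * innov
            p *= 1.0 - k
            flags.append(False)
            cleaned.append(x)
    return cleaned, flags

# A smooth record corrupted by one instrumental spike:
series = [1.00, 1.02, 0.99, 9.00, 1.01, 1.03, 1.00]
values, flags = kalman_edit(series)
```

Distinguishing an isolated spike from a genuine step (which should update the state rather than be rejected) is exactly where the fixed-delay, two-sided estimate of the original method earns its keep.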

