Analisis Indeks Harga Saham Gabungan (IHSG) Menggunakan Metode Wavelet Thresholding

Author(s):  
Zahra Awaliya Fauziah ◽  
Junaidi ◽  
Lilies Handayani

Stocks are a type of long-term investment in the capital market. The stock movement indicator most often used by investors is the Indonesia Composite Index (ICI, or IHSG). ICI data form a time series, so they can be analyzed with forecasting methods. One such method is wavelet thresholding, which can handle stationary, non-stationary, and nonlinear time series data and produces smooth estimates. Wavelet thresholding involves choosing a wavelet filter, a threshold parameter, and a thresholding function. In this study, the MSE was computed for several wavelet filters (Haar, Daubechies, and Coiflet) at decomposition levels 1 to 7, using the soft thresholding function with the minimax and SURE threshold selection rules. The data used are the 2018 IHSG series, comprising 240 observations. Based on the analysis, the best result was obtained with the order-2 Coiflet filter at level 2 combined with the SURE threshold, giving an MSE of 0.0094.
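A minimal sketch of this kind of wavelet-threshold smoothing in Python with the PyWavelets package, assuming a hypothetical IHSG series. The coif2 filter, level-2 decomposition, and soft thresholding mirror the configuration reported above, while the universal threshold used here is only a stand-in for the paper's minimax and SURE rules.

```python
import numpy as np
import pywt

def wavelet_denoise(series, wavelet="coif2", level=2):
    # Multilevel discrete wavelet decomposition of the series.
    coeffs = pywt.wavedec(series, wavelet, level=level)
    # Estimate the noise scale from the finest detail coefficients (MAD estimator).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(len(series)))  # universal threshold (stand-in)
    # Soft-threshold every detail level; leave the approximation untouched.
    denoised = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[: len(series)]

# Synthetic data standing in for the 240 IHSG observations.
t = np.linspace(0, 1, 240)
ihsg = 6000 + 200 * np.sin(4 * np.pi * t) + np.random.normal(0, 20, 240)
smooth = wavelet_denoise(ihsg)
mse = np.mean((smooth - ihsg) ** 2)
```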

Open Physics ◽  
2021 ◽  
Vol 19 (1) ◽  
pp. 360-374
Author(s):  
Yuan Pei ◽  
Lei Zhenglin ◽  
Zeng Qinghui ◽  
Wu Yixiao ◽  
Lu Yanli ◽  
...  

Abstract The load of a refrigerated showcase is nonlinear, unstable time series data, so traditional forecasting methods are not well suited to it. Deep learning algorithms are therefore introduced to predict the showcase load. Based on the combined CEEMD–IPSO–LSTM algorithm, this paper builds a refrigerated display cabinet load forecasting model. Comparison with the forecasts of other models shows that the CEEMD–IPSO–LSTM model has the highest load forecasting accuracy, with a coefficient of determination of 0.9105, which is clearly superior. The model constructed in this paper can predict showcase loads and can serve as a reference for energy saving and consumption reduction in display cabinets.
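A minimal sketch of the decompose-forecast-recombine structure such a model uses, assuming the PyEMD package (pip install EMD-signal). CEEMDAN is used here as a stand-in for the paper's CEEMD step, and a naive persistence forecast replaces the IPSO-tuned LSTM that the paper applies to each component.

```python
import numpy as np
from PyEMD import CEEMDAN

def forecast_component(component, horizon):
    # Placeholder for the per-component forecaster (the paper tunes an LSTM
    # with improved particle swarm optimization for this step).
    return np.full(horizon, component[-1])

def decompose_and_forecast(load, horizon=24):
    imfs = CEEMDAN()(load)                      # intrinsic mode functions + residue
    forecasts = [forecast_component(imf, horizon) for imf in imfs]
    return np.sum(forecasts, axis=0)            # ensemble: sum the component forecasts

# Synthetic showcase-load series standing in for the measured data.
load = 5 + np.sin(np.linspace(0, 20 * np.pi, 960)) + np.random.normal(0, 0.2, 960)
pred = decompose_and_forecast(load)
```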


Water ◽  
2021 ◽  
Vol 13 (4) ◽  
pp. 416
Author(s):  
Bwalya Malama ◽  
Devin Pritchard-Peterson ◽  
John J. Jasbinsek ◽  
Christopher Surfleet

We report the results of field and laboratory investigations of stream-aquifer interactions in a watershed along the California coast to assess the impact of groundwater pumping for irrigation on stream flows. The methods used include subsurface sediment sampling using direct-push drilling, laboratory permeability and particle size analyses of sediment, piezometer installation and instrumentation, stream discharge and stage monitoring, pumping tests for aquifer characterization, resistivity surveys, and long-term passive monitoring of stream stage and groundwater levels. Spectral analysis of the long-term water level data was used to assess correlation between the stream and groundwater level time series. The investigations revealed the presence of a thin, low-permeability silt-clay aquitard unit between the main aquifer and the stream. This suggested a three-layer conceptual model of the subsurface comprising unconfined and confined aquifers separated by an aquitard layer, which was broadly confirmed by resistivity surveys and pumping tests, the latter of which indicated leakage across the aquitard. The aquitard was determined to be 2–3 orders of magnitude less permeable than the aquifer, which is indicative of weak stream-aquifer connectivity, as confirmed by spectral analysis of the stream-aquifer water level time series. The results illustrate the importance of site-specific investigations and suggest that even in systems where the stream is not in direct hydraulic contact with the producing aquifer, long-term stream depletion can occur due to leakage across low-permeability units. This has implications for the management of stream flows, groundwater abstraction, and water resources during prolonged periods of drought.
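A minimal sketch of the kind of spectral analysis described above: magnitude-squared coherence between stream stage and groundwater level, assuming two equally sampled, hypothetical hourly records. Low coherence across most frequencies would be consistent with weak stream-aquifer connectivity.

```python
import numpy as np
from scipy.signal import coherence

fs = 24.0  # samples per day for hourly records
t = np.arange(0, 365 * 24) / 24.0
stream_stage = np.sin(2 * np.pi * t / 365) + 0.1 * np.random.randn(t.size)
gw_level = 0.3 * np.sin(2 * np.pi * t / 365 + 0.5) + 0.2 * np.random.randn(t.size)

# Welch-averaged coherence; nperseg trades frequency resolution against variance.
f, Cxy = coherence(stream_stage, gw_level, fs=fs, nperseg=2048)
```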


2007 ◽  
pp. 88
Author(s):  
Wataru Suzuki ◽  
Yanfei Zhou

This article represents the first step in filling a large gap in knowledge concerning why Public Assistance (PA) use recently rose so fast in Japan. Specifically, we address this problem not only by performing a Blanchard and Quah decomposition on long-term monthly time series data (1960:04-2006:10), but also by analyzing prefecture-level longitudinal data. Two interesting findings emerge from the time series analysis. The first is that permanent shocks impose a continuously positive impact on the PA rate and are the main driving factor behind the recent increase in welfare use. The second is that the impact of a temporary shock lasts for a long time: the welfare use rate is quite rigid, and even if the PA rate rises due to a temporary shock, it takes about 8 or 9 years to regain its normal level. On the other hand, estimations on the prefecture-level longitudinal data indicate that the Financial Capability Index (FCI) of the local government and the minimum wage both have negative effects on the PA rate. We also find that the rapid aging of Japan's population acts in practice as a permanent shock, which makes it the most prominent contributor to surging welfare use.
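A minimal sketch of a Blanchard-Quah style decomposition, assuming a hypothetical bivariate system of the differenced PA rate and a cyclical indicator. statsmodels estimates the reduced-form VAR, and the long-run restriction that only the permanent shock has a lasting effect is imposed through a Cholesky factorization of the long-run covariance matrix; this is the textbook computation, not the authors' exact specification.

```python
import numpy as np
from statsmodels.tsa.api import VAR

def blanchard_quah(data, lags=12):
    res = VAR(data).fit(lags)
    k = data.shape[1]
    a1 = res.coefs.sum(axis=0)                       # A(1): sum of lag coefficient matrices
    c1 = np.linalg.inv(np.eye(k) - a1)               # long-run multiplier (I - A(1))^-1
    sigma = np.asarray(res.sigma_u)                  # reduced-form residual covariance
    lr_cov = c1 @ sigma @ c1.T                       # long-run covariance
    # Contemporaneous impact matrix of the structural (permanent, temporary) shocks.
    b0 = np.linalg.solve(c1, np.linalg.cholesky(lr_cov))
    return res, b0

# Hypothetical monthly data: column 0 = d(PA rate), column 1 = cyclical indicator.
data = np.random.randn(560, 2)
res, b0 = blanchard_quah(data)
```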


2017 ◽  
Author(s):  
Easton R White

Long-term time series are necessary to better understand population dynamics, assess species' conservation status, and make management decisions. However, population data are often expensive to collect, requiring considerable time and resources. When is a population time series long enough to address a question of interest? We determine the minimum time series length required to detect significant increases or decreases in population abundance. To address this question, we use simulation methods and examine 878 populations of vertebrate species. Here we show that 15–20 years of continuous monitoring are required to achieve a high level of statistical power. For both the simulations and the empirical time series, the minimum time required depends on trend strength, population variability, and temporal autocorrelation. These results point to the importance of sampling populations over long periods of time. We argue that statistical power needs to be considered in monitoring program design and evaluation. Time series shorter than 15–20 years are likely underpowered and potentially misleading.
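A minimal sketch of the power-analysis logic described above: simulate populations with a known decline, AR(1) process noise, and a log-linear trend test, then ask how often the trend is detected at each monitoring length. All parameter values are illustrative, not those of the study.

```python
import numpy as np
from scipy import stats

def detection_power(years, r=-0.03, sd=0.2, phi=0.4, n_sims=1000, alpha=0.05):
    detected = 0
    for _ in range(n_sims):
        eps = np.zeros(years)
        for t in range(1, years):                      # AR(1) process noise
            eps[t] = phi * eps[t - 1] + np.random.normal(0, sd)
        log_n = np.log(1000) + r * np.arange(years) + eps
        trend = stats.linregress(np.arange(years), log_n)
        detected += (trend.pvalue < alpha) and (trend.slope < 0)
    return detected / n_sims

# Power to detect the decline for monitoring programs of 5 to 30 years.
power_by_length = {yrs: detection_power(yrs) for yrs in range(5, 31, 5)}
```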


Media Ekonomi ◽  
2017 ◽  
Vol 20 (1) ◽  
pp. 83
Author(s):  
Jumadin Lapopo

Poverty is a problem in all developing countries, including Indonesia. Among government programs, poverty has become the center of attention in policy at both the regional and national levels. In view of this phenomenon, Islam offers a solution for reducing poverty through zakat. This study analyzes the effect of ZIS and Zakat Fitrah on poverty in Indonesia from 1998 to 2010. The data used are secondary time series data; the dependent variable is poverty and the independent variables are ZIS and Zakat Fitrah. The analysis uses a multiple regression model with classical assumption tests, estimated in EViews 4. The study concludes that the ZIS variable has a significant, though very small, effect on poverty reduction in Indonesia, while the Zakat Fitrah variable does not significantly affect poverty reduction because Zakat Fitrah is by nature for consumption rather than for long-term needs. The results can help zakat managers develop better management and distribution systems so that the main purpose of zakat, reducing poverty, can be achieved.
Keywords: Poverty, Zakat Fitrah, ZIS.
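As a rough illustration of the model described above, the following sketch estimates a comparable multiple regression with statsmodels in place of EViews 4. The annual 1998-2010 series are synthetic placeholders, not the study's data, and the variable names are assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "year": range(1998, 2011),
    "zis": rng.uniform(0.1, 3.0, 13),           # collected ZIS (placeholder units)
    "zakat_fitrah": rng.uniform(0.1, 1.0, 13),  # collected Zakat Fitrah (placeholder)
})
# Synthetic poverty series generated only so the regression runs end to end.
df["poverty"] = 40 - 1.5 * df["zis"] - 0.2 * df["zakat_fitrah"] + rng.normal(0, 1, 13)

model = smf.ols("poverty ~ zis + zakat_fitrah", data=df).fit()
print(model.summary())  # coefficients, t-tests, and diagnostics for the classical assumptions
```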


2012 ◽  
Vol 2012 ◽  
pp. 1-15 ◽  
Author(s):  
Jia Chaolong ◽  
Xu Weixiang ◽  
Wang Futian ◽  
Wang Hanning

The combination of linear and nonlinear methods is widely used in the prediction of time series data. This paper analyzes track irregularity time series data using grey incidence degree models and data transformation methods, seeking the underlying relationships within the time series. GM(1,1) is based on a first-order, single-variable linear differential equation; after an adaptive improvement and error correction, it is used to predict the long-term trend of track irregularity at a fixed measuring point, while the stochastic linear AR model, the Kalman filtering model, and an artificial neural network model are applied to predict the short-term trend of track irregularity over a unit section. Both the long-term and short-term predictions show that the model is effective and achieves the expected accuracy.
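A minimal sketch of the textbook GM(1,1) grey model that the paper adapts for the long-term trend at a fixed measuring point, without the adaptive improvement and error correction described above. The input series is synthetic.

```python
import numpy as np

def gm11_forecast(x0, steps=5):
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                 # 1-AGO accumulation
    z1 = 0.5 * (x1[1:] + x1[:-1])                      # background values
    B = np.column_stack([-z1, np.ones(len(z1))])
    Y = x0[1:]
    a, b = np.linalg.lstsq(B, Y, rcond=None)[0]        # least squares for a, b
    k = np.arange(len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a  # whitened-equation solution
    x0_hat = np.diff(x1_hat, prepend=0.0)              # inverse AGO
    x0_hat[0] = x0[0]
    return x0_hat[len(x0):]                            # forecasts beyond the sample

# Hypothetical track-irregularity amplitudes at one measuring point.
irregularity = [2.1, 2.3, 2.2, 2.5, 2.6, 2.8, 2.7, 3.0]
print(gm11_forecast(irregularity, steps=3))
```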


2018 ◽  
Vol 7 (4.15) ◽  
pp. 25 ◽  
Author(s):  
Said Jadid Abdulkadir ◽  
Hitham Alhussian ◽  
Muhammad Nazmi ◽  
Asim A Elsheikh

Forecasting time-series data is imperative, especially when planning requires modelling under uncertain knowledge of future events. Recurrent neural network models have been applied in industry and outperform standard artificial neural networks in forecasting, but they fail in long-term time-series forecasting due to the vanishing gradient problem. This study offers a robust solution for long-term forecasting using a special recurrent neural network architecture, the Long Short-Term Memory (LSTM) model, which overcomes the vanishing gradient problem. LSTM is specifically designed to avoid the long-term dependency problem as its default behavior. Empirical analysis is performed using quantitative forecasting metrics and comparative model performance on the forecasted outputs. An evaluation shows that the LSTM model provides better forecasts of the Standard & Poor's 500 Index (S&P 500), in terms of error metrics, than other forecasting models.
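A minimal sketch of a univariate LSTM forecaster of the kind described above, using tf.keras. The window length, layer size, and training settings are illustrative, and the random-walk series stands in for S&P 500 closing prices.

```python
import numpy as np
import tensorflow as tf

def make_windows(series, window=20):
    # Sliding windows of past values as inputs, next value as target.
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X[..., np.newaxis], y          # shape (samples, timesteps, features)

series = np.cumsum(np.random.normal(0, 1, 2000)) + 2000.0   # synthetic index level
series = (series - series.mean()) / series.std()            # simple scaling
X, y = make_windows(series)

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(20, 1)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=64, verbose=0)

one_step_ahead = model.predict(X[-1:])    # forecast for the next point
```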


2020 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Yanhui Chen ◽  
Bin Liu ◽  
Tianzi Wang

Purpose – This paper applies grey wave forecasting within a decomposition–ensemble forecasting method to model the complex and non-linear features of time series data. The aim is to test the advantages of the grey wave forecasting method in predicting time series with periodic fluctuations.
Design/methodology/approach – The decomposition–ensemble method combines empirical mode decomposition (EMD), component reconstruction, and grey wave forecasting. EMD is first used to decompose the time series into intrinsic mode function (IMF) components. The permutation entropy and mean of each IMF are then checked to guide component reconstruction, and each component is predicted with either the grey wave forecasting model or ARMA, according to its characteristics.
Findings – In the empirical analysis, the China Container Freight Index (CCFI) is used to check prediction performance. Over two different time periods, the results show that the proposed method outperforms random walk and ARMA in multi-step-ahead prediction.
Originality/value – The decomposition–ensemble method based on EMD and the grey wave forecasting model expands the application area of grey system theory and graphic forecasting methods. Grey wave forecasting performs better for data sets with periodic fluctuations, and forecasting the CCFI assists practitioners in the shipping industry in decision-making.
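A minimal sketch of the permutation-entropy check used above to decide which IMF components to group for reconstruction, following the standard Bandt-Pompe definition. The order and delay values are illustrative.

```python
import math
import numpy as np

def permutation_entropy(x, order=3, delay=1):
    x = np.asarray(x)
    n = len(x) - (order - 1) * delay
    # Ordinal pattern (rank order) of each embedded vector.
    patterns = [tuple(np.argsort(x[i:i + order * delay:delay])) for i in range(n)]
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)) / np.log2(math.factorial(order)))  # normalised to [0, 1]

# High-frequency, noise-like IMFs score near 1; slow trend components score near 0.
noise_like = np.random.randn(500)
trend_like = np.linspace(0, 1, 500)
print(permutation_entropy(noise_like), permutation_entropy(trend_like))
```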

