OPTIMISING THE SMOOTHNESS AND ACCURACY OF MOVING AVERAGE FOR STOCK PRICE DATA

2018 ◽  
Vol 24 (3) ◽  
pp. 984-1003 ◽  
Author(s):  
Aistis RAUDYS ◽  
Židrina PABARŠKAITĖ

Smoothing a time series removes noise. Moving averages are used in finance to smooth stock price series and forecast trend direction. We propose an optimised custom moving average that is the most suitable for stock time series smoothing. Suitability criteria are defined by smoothness and accuracy. Previous research focused on only one of the two criteria in isolation. We define this as a multi-criteria Pareto optimisation problem and compare the proposed method to the five most popular moving average methods on synthetic and real-world stock data. The comparison was performed on unseen data. The new method outperforms the other methods in 99.5% of cases on synthetic data and in 91% of cases on real-world data. The method allows better time series smoothing at the same level of accuracy as traditional methods, or better accuracy at the same smoothness. Weights optimised on one stock are very similar to weights optimised for any other stock and can be used interchangeably. Traders can use the new method to detect trends earlier and increase the profitability of their strategies. The concept is also applicable to sensors, weather forecasting, and traffic prediction, where both the smoothness and accuracy of the filtered signal are important.
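As a rough illustration of the two criteria, a weighted moving average can be scored by its accuracy (e.g. mean squared error against the raw prices it aligns with) and by its roughness (e.g. the variance of first differences). The weight vectors, window length, and scoring functions below are illustrative assumptions, not the authors' optimisation procedure.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def weighted_moving_average(prices, weights):
    """Causal weighted moving average; weights[-1] applies to the most recent price."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    windows = sliding_window_view(prices, len(w))   # shape: (N - window + 1, window)
    return windows @ w

def accuracy(prices, smoothed, window):
    """Lower is better: mean squared error against the raw prices the output aligns with."""
    return np.mean((prices[window - 1:] - smoothed) ** 2)

def smoothness(smoothed):
    """Lower is better: roughness measured as the variance of first differences."""
    return np.var(np.diff(smoothed))

# Example: a plain SMA versus a hypothetical weighting that emphasises recent prices.
rng = np.random.default_rng(0)
prices = 100 + np.cumsum(rng.normal(0, 1, 500))
window = 10
candidates = {
    "SMA": np.ones(window),
    "recent-heavy": np.linspace(0.5, 2.0, window),  # hypothetical weights
}
for name, w in candidates.items():
    s = weighted_moving_average(prices, w)
    print(name, accuracy(prices, s, window), smoothness(s))
```

A Pareto-style comparison then keeps the weight vectors for which no other candidate is both more accurate and smoother.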

Entropy ◽  
2021 ◽  
Vol 23 (8) ◽  
pp. 969
Author(s):  
Miguel C. Soriano ◽  
Luciano Zunino

Time-delayed interactions naturally appear in a multitude of real-world systems due to the finite propagation speed of physical quantities. Often, the time scales of the interactions are unknown to an external observer and need to be inferred from time series of observed data. In this work, we explore the properties of several ordinal-based quantifiers for the identification of time-delays from time series. To that end, we generate artificial time series of stochastic and deterministic time-delay models. We find that the presence of a nonlinearity in the generating model has consequences for the distribution of ordinal patterns and, consequently, for the delay-identification capabilities of the quantifiers. Here, we put forward a novel ordinal-based quantifier that is particularly sensitive to nonlinearities in the generating model and compare it with previously defined quantifiers. We conclude from our analysis of artificially generated data that the proper identification of the presence of a time-delay and of its precise value benefits from the complementary use of ordinal-based quantifiers and the standard autocorrelation function. We further validate these tools with a practical example on real-world data originating from the North Atlantic Oscillation weather phenomenon.
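A minimal sketch of the general idea, not the authors' specific quantifier: count ordinal patterns of embedding dimension 3 formed by samples separated by a candidate lag, compute the permutation entropy for each lag, and look for the lag at which the entropy departs most from the uniform-pattern case. The embedding dimension, scan range, and toy delay model are illustrative assumptions.

```python
import numpy as np
from math import factorial
from itertools import permutations

def ordinal_pattern_distribution(x, dim=3, lag=1):
    """Relative frequencies of ordinal patterns of length `dim` built from samples spaced `lag` apart."""
    counts = {p: 0 for p in permutations(range(dim))}
    n = len(x) - (dim - 1) * lag
    for i in range(n):
        window = x[i:i + (dim - 1) * lag + 1:lag]
        counts[tuple(np.argsort(window))] += 1
    return np.array(list(counts.values())) / n

def permutation_entropy(x, dim=3, lag=1):
    """Normalised Shannon entropy of the ordinal pattern distribution (1 = uniform patterns)."""
    p = ordinal_pattern_distribution(x, dim, lag)
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(factorial(dim))

# Toy stochastic delayed process x(t) = 0.9 * x(t - tau) + noise, with tau = 15.
tau_true, N = 15, 5000
rng = np.random.default_rng(1)
x = np.zeros(N)
for t in range(tau_true, N):
    x[t] = 0.9 * x[t - tau_true] + rng.normal()

# Scanning the candidate lag: the entropy typically dips when the lag matches the true delay.
for lag in range(1, 31):
    print(lag, round(permutation_entropy(x, dim=3, lag=lag), 4))
```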


Author(s):  
Vivek Vijay ◽  
Parmod Kumar Paul

A trading band, based on historical movements of a security price, suggests buy or sell patterns. The Bollinger band is one of the most popular bands, built from a moving average and the volatility of the security. The authors define a new trading band, the Optimal Band, to forecast buy or sell signals. This optimal band uses a linear function of the local and absolute extrema of a given financial time series, and the parameters of this linear function are estimated with a simple linear optimisation technique. The authors then define different states using various upper and lower values of the Bollinger band and the optimal band, and Markov and Hidden Markov Models are used to forecast the future states of the given time series. The authors apply these techniques to the closing price of the Bombay Stock Exchange and to intra-day price series of crude oil and the Nifty index.
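For reference, the classic Bollinger band (not the authors' Optimal Band, whose construction is only outlined in the abstract) places upper and lower envelopes k standard deviations around a simple moving average; the window length and k below are the conventional defaults.

```python
import pandas as pd

def bollinger_bands(close: pd.Series, window: int = 20, k: float = 2.0) -> pd.DataFrame:
    """Classic Bollinger bands: SMA +/- k rolling standard deviations."""
    sma = close.rolling(window).mean()
    std = close.rolling(window).std()
    return pd.DataFrame({
        "middle": sma,
        "upper": sma + k * std,
        "lower": sma - k * std,
    })

# A price crossing below the lower band is commonly read as an oversold (buy) signal,
# and a cross above the upper band as an overbought (sell) signal.
```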


Entropy ◽  
2021 ◽  
Vol 23 (10) ◽  
pp. 1305
Author(s):  
Shenghan Zhou ◽  
Bang Chen ◽  
Houxiang Liu ◽  
Xinpeng Ji ◽  
Chaofan Wei ◽  
...  

Smart transportation is an important part of smart urban areas, and travel characteristics analysis and traffic prediction modeling are the two key technical measures for building smart transportation systems. Although online car-hailing has developed rapidly and has a large number of users, most studies of travel characteristics focus not on online car-hailing but on taxis, buses, metros, and other traditional means of transportation. The traditional univariate hybrid time-series traffic prediction model based on the autoregressive integrated moving average (ARIMA) ignores other explanatory variables. To fill the research gap in online car-hailing travel characteristics analysis and to overcome the shortcomings of the univariate ARIMA-based hybrid model, we analyzed online car-hailing travel characteristics from operational data sets along multiple dimensions, such as district, time, traffic jams, weather, air quality, and temperature. This paper proposes a traffic prediction method suitable for multivariate hybrid time-series modeling: the maximal information coefficient (MIC) is used for feature selection, and an autoregressive integrated moving average with explanatory variables (ARIMAX) is fused with a long short-term memory (LSTM) network for data regression. The effectiveness of the proposed multivariate hybrid time-series traffic prediction model was verified on the online car-hailing operational data sets.
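One common way to realise such a fusion (an assumption here, since the abstract does not specify the exact architecture) is to let ARIMAX model the linear dependence on the selected explanatory variables and let an LSTM model the residuals. The sketch below uses statsmodels' SARIMAX for the ARIMAX part and scikit-learn's mutual information score as a stand-in for MIC; the column names and hybrid scheme are illustrative assumptions.

```python
import numpy as np
import pandas as pd
from sklearn.feature_selection import mutual_info_regression
from statsmodels.tsa.statespace.sarimax import SARIMAX

def select_features(y: pd.Series, X: pd.DataFrame, top_k: int = 3) -> list:
    """Stand-in for MIC-based selection: keep the top_k explanatory variables
    by mutual information with the traffic demand series."""
    scores = mutual_info_regression(X.values, y.values)
    return list(X.columns[np.argsort(scores)[::-1][:top_k]])

def fit_arimax(y: pd.Series, X: pd.DataFrame, order=(1, 1, 1)):
    """Linear part of the hybrid model: ARIMAX on the selected exogenous variables."""
    return SARIMAX(y, exog=X, order=order).fit(disp=False)

# Hypothetical usage, assuming `demand` is a series of hourly ride counts and
# `exog` holds weather, air-quality, temperature and congestion columns:
# selected  = select_features(demand, exog)
# arimax    = fit_arimax(demand, exog[selected])
# residuals = demand - arimax.fittedvalues
# An LSTM would then be trained on `residuals` (plus the selected features), and the
# final forecast is the ARIMAX forecast plus the LSTM's residual forecast.
```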


Author(s):  
Zejian Li ◽  
Yongchuan Tang ◽  
Wei Li ◽  
Yongxing He

Unsupervised disentangled representation learning is one of the foundational methods for learning interpretable factors in data. Existing learning methods are based on the assumption that disentangled factors are mutually independent and incorporate this assumption through the evidence lower bound. However, our experiments reveal that factors in real-world data tend to be pairwise independent. Accordingly, we propose a new method based on a pairwise independence assumption to learn the disentangled representation. The evidence lower bound implicitly encourages mutual independence of the latent codes, which is too strong for our assumption, so we introduce another lower bound in our method. Extensive experiments show that the proposed method achieves competitive performance compared with other state-of-the-art methods.
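For context, a standard formulation rather than the paper's new bound: VAE-based disentanglement methods maximise the evidence lower bound, and the aggregate-posterior KL term it contains can be decomposed so that a total-correlation term penalises statistical dependence among all latent codes jointly, which is what "mutual independence" refers to above.

```latex
\log p_\theta(x) \;\ge\;
\mathbb{E}_{q_\phi(z\mid x)}\!\left[\log p_\theta(x\mid z)\right]
\;-\;\mathrm{KL}\!\left(q_\phi(z\mid x)\,\|\,p(z)\right),
\qquad
\mathrm{TC}(z) \;=\; \mathrm{KL}\!\left(q(z)\,\Big\|\,\textstyle\prod_j q(z_j)\right).
```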


2015 ◽  
Vol 26 ◽  
pp. vii99 ◽  
Author(s):  
Yu Uneno ◽  
Kei Taneishi ◽  
Masashi Kanai ◽  
Akiko Tamon ◽  
Kazuya Okamoto ◽  
...  

2019 ◽  
Vol 10 (1) ◽  
pp. 17
Author(s):  
Isnaini Nuzula Agustin

Abstract
An efficient market is a market in which the prices of all traded securities reflect all available information. The weak form of the Efficient Market Hypothesis states that current security prices already incorporate all information contained in past price movements, so past prices cannot be used to predict future prices or returns. The objective of this research is to examine the weak form of the Efficient Market Hypothesis (EMH) for the Indonesia Sharia Stock Index (ISSI) over the period January 3rd, 2017 to February 8th, 2019. To examine the EMH, several appropriate tests are applied: the run test, the autocorrelation test, an autoregressive integrated moving average (ARIMA) model, and a paired-sample t-test. The results show that the ISSI is not weak-form efficient during the period of the study. Moreover, according to the time series modelling results, the fitted model is ARIMA(1,1,1) with an accuracy level of 78%. This shows that the ARIMA model forecasts the ISSI index successfully and accurately, implying that historical stock index data still describe the behaviour of the index in the future. Thus, technical analysis remains a feasible guide for investors conducting transactions in the capital market.

Abstrak
An efficient market is a market in which all traded security prices reflect all available information. The weak-form efficient market theory states that past price changes are unrelated to current security prices, so they cannot be used to predict a security's price or return. This study tests the weak-form efficient market hypothesis on the Indonesia Sharia Stock Index (ISSI), using data from January 3rd, 2017 to February 8th, 2019. In the first stage of the research, a run test and an autocorrelation test are performed to see whether weak-form efficiency holds for the ISSI. An ARIMA time series model is then built to find a suitable technique for predicting the ISSI stock index. The run test and autocorrelation test show that the weak-form efficient market hypothesis is not supported. In the ARIMA modelling, the appropriate model is ARIMA(1,1,1), yielding an accuracy level of 78%. This shows that the ARIMA model can predict the ISSI stock price index successfully and accurately. Therefore, technical analysis can still be used by investors as a guide for trading transactions in the capital market.
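A minimal sketch of the kind of test pipeline described above (the exact settings and data source are assumptions): test the daily returns for autocorrelation and fit an ARIMA(1,1,1) to the index series with statsmodels.

```python
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox

def weak_form_checks(index_series: pd.Series):
    """Autocorrelation test on daily returns plus an ARIMA(1,1,1) fit on the index level."""
    returns = index_series.pct_change().dropna()

    # Ljung-Box test: small p-values indicate significant autocorrelation,
    # i.e. evidence against weak-form efficiency.
    lb = acorr_ljungbox(returns, lags=[10], return_df=True)

    # ARIMA(1,1,1): d=1 differences the index series once before fitting AR and MA terms.
    fit = ARIMA(index_series, order=(1, 1, 1)).fit()
    return lb, fit

# Hypothetical usage, assuming `issi` is a pd.Series of daily ISSI closing values:
# lb, fit = weak_form_checks(issi)
# print(lb)
# print(fit.summary())
```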


2021 ◽  
Author(s):  
Prasanta Pal ◽  
Shataneek Banerjee ◽  
Amardip Ghosh ◽  
David R. Vago ◽  
Judson Brewer

Knowingly or unknowingly, digital data is an integral part of our day-to-day lives; there is probably not a single day when we do not encounter some form of it. Data originates from diverse sources in various formats, and a time series is a special kind of data that captures the time evolution of a system under observation. However, capturing this temporal information in data analysis is a highly non-trivial challenge. The Discrete Fourier Transform (DFT) is one of the most widely used methods for capturing the essence of time-series data. While this nearly 200-year-old mathematical transform has survived the test of time, real-world data sources violate some of the intrinsic properties the DFT presumes. Ad hoc noise and outliers fundamentally alter the true frequency-domain signature of the signal of interest, so the frequency-domain representation gets corrupted as well. We demonstrate that applying traditional digital filters as-is often fails to reveal an accurate description of the pristine time-series characteristics of the system under study. In this work, we analyze the issues the DFT has with real-world data and propose a method to address them, drawing on insights from modern data-science techniques and in particular our previous work, SOCKS. Our results reveal that a dramatic improvement is possible by re-imagining the DFT in the context of real-world data with appropriate curation protocols. We argue that our proposed transformation, DFT21, would revolutionize the digital world in terms of accuracy, reliability, and information retrievability from raw data.
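A toy illustration of the underlying problem (not the DFT21 or SOCKS method, whose details are not given in the abstract): a single outlier added to a clean sinusoid smears energy across the whole spectrum, while a simple curation step such as median-filtering the outlier away restores a clean peak.

```python
import numpy as np
from scipy.signal import medfilt

# Clean 5 Hz sinusoid sampled at 100 Hz, corrupted by one large outlier.
fs, f0 = 100.0, 5.0
t = np.arange(0, 10, 1 / fs)
clean = np.sin(2 * np.pi * f0 * t)
corrupted = clean.copy()
corrupted[300] += 50.0                      # single spike / outlier

freqs = np.fft.rfftfreq(len(t), 1 / fs)

def dominant_peak(x):
    spectrum = np.abs(np.fft.rfft(x))
    return freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin

curated = medfilt(corrupted, kernel_size=5)     # crude outlier suppression

print("clean    :", dominant_peak(clean))       # ~5 Hz
print("corrupted:", dominant_peak(corrupted))   # peak survives, but the noise floor rises everywhere
print("curated  :", dominant_peak(curated))     # ~5 Hz again, with a much cleaner spectrum
```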


2020 ◽  
Vol 34 (04) ◽  
pp. 5101-5108
Author(s):  
Xiao Ma ◽  
Peter Karkus ◽  
David Hsu ◽  
Wee Sun Lee

Recurrent neural networks (RNNs) have been extraordinarily successful for prediction with sequential data. To tackle highly variable and multi-modal real-world data, we introduce Particle Filter Recurrent Neural Networks (PF-RNNs), a new RNN family that explicitly models uncertainty in its internal structure: while an RNN relies on a long, deterministic latent state vector, a PF-RNN maintains a latent state distribution, approximated as a set of particles. For effective learning, we provide a fully differentiable particle filter algorithm that updates the PF-RNN latent state distribution according to Bayes' rule. Experiments demonstrate that the proposed PF-RNNs outperform the corresponding standard gated RNNs on a synthetic robot localization dataset and on 10 real-world sequence prediction datasets for text classification, stock price prediction, etc.
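A highly simplified sketch of the particle-filter idea inside an RNN cell (soft resampling is omitted, and the dimensions, noise scale, and observation model are assumptions; the paper's actual cells use learned gated transitions and a differentiable soft-resampling step):

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class ParticleRNNCell(nn.Module):
    """Toy PF-RNN-style cell: K latent particles with weights, propagated by a GRU
    transition and reweighted by a learned observation model."""

    def __init__(self, input_dim, hidden_dim, num_particles=16):
        super().__init__()
        self.K = num_particles
        self.transition = nn.GRUCell(input_dim, hidden_dim)
        self.obs_logit = nn.Linear(hidden_dim + input_dim, 1)

    def forward(self, x, particles, log_weights):
        # x: (B, input_dim); particles: (B, K, H); log_weights: (B, K)
        B, K, H = particles.shape
        x_rep = x.unsqueeze(1).expand(B, K, -1).reshape(B * K, -1)

        # Stochastic transition: propagate each particle through the GRU and add noise.
        h = self.transition(x_rep, particles.reshape(B * K, H))
        h = (h + 0.1 * torch.randn_like(h)).reshape(B, K, H)

        # Bayes-style reweighting with a learned (unnormalised) observation likelihood.
        obs_in = torch.cat([h, x.unsqueeze(1).expand(B, K, -1)], dim=-1)
        log_weights = F.log_softmax(log_weights + self.obs_logit(obs_in).squeeze(-1), dim=-1)
        return h, log_weights

    def mean_state(self, particles, log_weights):
        # Weighted mean of the particles, used as the cell output.
        return (log_weights.exp().unsqueeze(-1) * particles).sum(dim=1)

# Usage sketch: start with zero particles and uniform weights, then iterate over a sequence.
# cell = ParticleRNNCell(input_dim=8, hidden_dim=32)
# particles = torch.zeros(4, cell.K, 32)
# log_w = torch.full((4, cell.K), -math.log(cell.K))
# particles, log_w = cell(torch.randn(4, 8), particles, log_w)
```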

