Fractal analysis of the economic sustainability of enterprise

2019 ◽  
Vol 65 ◽  
pp. 06005
Author(s):  
Andriy Matviychuk ◽  
Oleksandr Novoseletskyy ◽  
Serhii Vashchaiev ◽  
Halyna Velykoivanenko ◽  
Igor Zubenko

The article deals with a method of fractal analysis: the time series of economic sustainability of an industrial enterprise was investigated for trend stability by estimating the depth of the long-term memory of the time series and constructing a phase portrait. In the approach used, the “depth of the long memory” is estimated in terms of fuzzy sets. An approach to estimating the index of economic stability is developed, based on methods of forming an integrated indicator composed of assessments of such subsystems as the industrial-technical subsystem, the financial-economic subsystem, and the subsystem of the main parameters of the market environment. This helps to estimate the economic stability of the enterprise under conditions of incomplete information, for the purpose of making effective management decisions. The combination of techniques for forming an integral index with a fractal analysis of its trend stability showed an effective result, which was confirmed by experiments.

Author(s):  
Roberto J. Santillán-Salgado ◽  
Marissa Martínez Preece ◽  
Francisco López Herrera

This paper analyzes the returns and variance behavior of the largest specialized private pension investment fund index in Mexico, the SIEFORE Básica 1 (or SB1). The analysis was carried out with time series techniques to model the returns and volatility of the SB1, using publicly available historical data. Like many standard financial time series, the SB1 returns show non-normality, volatility clustering and excess kurtosis. The econometric characteristics of the series were initially modeled using three GARCH family models: GARCH(1,1), TGARCH and IGARCH. However, due to the presence of highly persistent volatility, the modeling was extended using Fractionally Integrated GARCH (FIGARCH) methods. To that end, an extended specification combining an ARFIMA(p,d,q) model with a FIGARCH model was incorporated. The evidence obtained suggests the presence of long-memory effects in both the returns and the volatility of the SB1. The results of our analysis have important implications for the risk management of the SB1.
Keywords: Private Pension Funds, Time Series Modelling, GARCH Models, Long-Term Memory Series
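As a minimal illustration of the GARCH(1,1) dynamics that this and the following abstracts build on (a sketch, not the authors' code; parameter values are illustrative), the recursion σ²ₜ = ω + α·r²ₜ₋₁ + β·σ²ₜ₋₁ can be simulated directly, and the resulting squared returns display the volatility clustering the paper describes:

```python
import numpy as np

def simulate_garch11(n, omega=0.1, alpha=0.1, beta=0.85, seed=0):
    """Simulate returns r_t = sigma_t * z_t with GARCH(1,1) variance:
    sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n)
    r = np.empty(n)
    sigma2 = np.empty(n)
    # Start from the unconditional variance (requires alpha + beta < 1).
    sigma2[0] = omega / (1.0 - alpha - beta)
    r[0] = np.sqrt(sigma2[0]) * z[0]
    for t in range(1, n):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
        r[t] = np.sqrt(sigma2[t]) * z[t]
    return r, sigma2

r, sigma2 = simulate_garch11(5000)
# Volatility clustering: squared returns are positively autocorrelated
# even though the returns themselves are serially uncorrelated.
sq = r ** 2
acf1 = np.corrcoef(sq[:-1], sq[1:])[0, 1]
print(f"lag-1 autocorrelation of r^2: {acf1:.3f}")
```

With α + β = 0.95 the conditional variance is highly persistent, which is the regime that motivates the move to FIGARCH-type long-memory specifications in the paper.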


Fractals ◽  
2020 ◽  
Vol 28 (08) ◽  
pp. 2040032
Author(s):  
YELIZ KARACA ◽  
DUMITRU BALEANU

It has become vital to effectively characterize the self-similar and regular patterns in time series marked by short-term and long-term memory in various fields of the ever-changing and complex global landscape. Within this framework, attempting to find solutions with adaptive mathematical models emerges as a major endeavor in economics, whose complex systems and structures are generally volatile, vulnerable and vague. Thus, analyzing the dynamics of the time series accurately, efficiently and in a timely manner is at the forefront of forecasting the volatile states of an economic environment, which is a complex system in itself since it includes interrelated elements interacting with one another. To manage data selection effectively and attain robust prediction, characterizing complexity and self-similarity is critical in financial decision-making. Our study aims to obtain analyses based on two main approaches applied to seven recognized indexes belonging to prominent countries (DJI, FCHI, GDAXI, GSPC, GSTPE, N225 and the Bitcoin index). The first approach employs the Hurst exponent (HE), as calculated by Rescaled Range (R/S) fractal analysis, together with Wavelet Entropy (WE), in order to enhance the prediction accuracy of the long-term trend in the financial markets. The second approach applies Artificial Neural Network (ANN) algorithms, namely Feed Forward Back Propagation (FFBP), Cascade Forward Back Propagation (CFBP) and Learning Vector Quantization (LVQ), for forecasting purposes. The following steps were administered for the two aforementioned approaches: (i) HE and WE were applied, and new indicators were thus calculated for each index; from these indicators a new dataset was formed and normalized by the min-max normalization method; (ii) to form the forecasting model, the ANN algorithms were applied to the datasets.
Based on the experimental results, it has been demonstrated that the new dataset comprised of the HE and WE indicators had a critical and determining direction, enabling more accurate forecast modeling by the ANN algorithms. Consequently, the proposed novel method with its multifarious methodology illustrates a new frontier, which could be employed in the broad field of applied sciences to analyze pressing real-world problems and propose optimal solutions for critical decision-making processes in nonlinear, complex and dynamic environments.
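Step (i) above ends with min-max normalization of the indicator dataset before it is fed to the ANNs. A minimal sketch of that normalization (the feature values below are made up for illustration, not taken from the study):

```python
import numpy as np

def min_max_normalize(x):
    """Scale a feature vector to [0, 1]: (x - min) / (max - min)."""
    x = np.asarray(x, dtype=float)
    lo, hi = x.min(), x.max()
    return (x - lo) / (hi - lo)

# Hypothetical per-index indicator values (e.g. HE or WE for four indexes).
features = np.array([1.35, 0.62, 0.48, 0.71])
scaled = min_max_normalize(features)
print(scaled)  # all values lie in [0, 1]; the min maps to 0, the max to 1
```

Min-max scaling preserves the ordering of the indicators while putting every feature on the same [0, 1] range, which keeps no single indicator from dominating the ANN training.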


Author(s):  
Serhii Ternov ◽  
Vasyl Fortuna

Contemporary literature suggests that the efficient market hypothesis is not substantiated and proposes instead the Fractal Market Hypothesis (FMH). Fractal markets are characterized by long-term memory. The main feature of a fractal market is that the frequency distribution of an indicator looks the same across different investment horizons; in such cases the indicator is said to exhibit scale invariance. All daily changes are correlated with all future daily changes, and all weekly changes are correlated with all future weekly changes. There is no characteristic time scale, a key property of such time series. The presence of memory in a time series can be characterized by the Hurst exponent. This paper analyzes the hryvnia to US dollar exchange rate for the period 04.06.14-04.01.15. Computing the Hurst exponent made it possible to conclude whether long-term memory is present in this series; its presence indicates that the efficient market hypothesis is unjustified. The hypothesis was also tested that the Hurst exponent decreases as longer averaging intervals are taken into account in the model. The analysis does not have great predictive power; however, it allows one to identify the presence or absence of long-term memory in the process under study and thus to accept or reject the efficient market hypothesis. The series under study proved to be persistent, demonstrating the presence of long-term memory. Since persistence is revealed, the efficient market hypothesis is not confirmed for the exchange rate returns; instead, one can argue for the fractality of the hryvnia/dollar exchange rate returns. Thus, the application of the proposed approach made it possible to find the Hurst exponent for the hryvnia/dollar exchange rate, and the value found indicates that the efficient market hypothesis is not substantiated for at least this exchange rate.
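The rescaled-range (R/S) estimate of the Hurst exponent used in studies like this one can be sketched as follows (a minimal illustration, not the authors' implementation; run here on i.i.d. noise, for which H should come out near 0.5, with the classical estimator's known slight upward bias at small window sizes):

```python
import numpy as np

def hurst_rs(x, min_window=8):
    """Estimate the Hurst exponent via rescaled-range (R/S) analysis:
    average R/S over non-overlapping windows of each size, then fit the
    slope of log(R/S) against log(window size)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes = []
    w = min_window
    while w <= n // 2:
        sizes.append(w)
        w *= 2
    rs_means = []
    for w in sizes:
        rs = []
        for start in range(0, n - w + 1, w):
            chunk = x[start:start + w]
            dev = chunk - chunk.mean()       # mean-adjusted series
            z = np.cumsum(dev)               # cumulative deviate series
            r = z.max() - z.min()            # range R
            s = chunk.std()                  # standard deviation S
            if s > 0:
                rs.append(r / s)
        rs_means.append(np.mean(rs))
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_means), 1)
    return slope

rng = np.random.default_rng(1)
returns = rng.standard_normal(4096)  # memoryless benchmark series
print(f"H ≈ {hurst_rs(returns):.2f}")
```

A value of H significantly above 0.5 on a return series, as reported for the hryvnia/dollar rate, signals persistence and hence long-term memory.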


2017 ◽  
Vol 29 (3) ◽  
pp. 423-442 ◽  
Author(s):  
Geeta Duppati ◽  
Anoop S. Kumar ◽  
Frank Scrimgeour ◽  
Leon Li

Purpose: The purpose of this paper is to assess to what extent intraday data can explain and predict long-term memory.
Design/methodology/approach: This article analysed the presence of long-memory volatility in five Asian equity indices, namely SENSEX, CNIA, NIKKEI225, KO11 and FTSTI, using five-minute intraday return series from 05 January 2015 to 06 August 2015 and two approaches, i.e. conditional volatility and realized volatility, for forecasting long-term memory. For conditional volatility it employs generalized autoregressive conditional heteroscedasticity (GARCH)-type models, i.e. the autoregressive fractionally integrated moving average (ARFIMA)-FIGARCH and ARFIMA-asymmetric power autoregressive conditional heteroscedasticity (APARCH) models; for unconditional volatility it employs realized volatility with autoregressive integrated moving average (ARIMA) and ARFIMA in-sample forecasting models to estimate the persistence of long-term memory.
Findings: Within the GARCH framework, the ARFIMA-APARCH long-memory model gave the better forecast results, signifying the importance of accounting for asymmetric information when modelling volatility in a financial market. Using the unconditional realized volatility results from the Singapore and Indian markets, the ARIMA model outperforms the ARFIMA model in terms of forecast performance and provides reasonable forecasts.
Practical implications: The issue of long memory has important implications for the theory and practice of finance. It is well known that accurate volatility forecasts are important in a variety of settings, including option and other derivative pricing, and portfolio and risk management.
Social implications: Using long-memory augmented models would give better results to investors, so that they could analyse market trends in returns and volatility more accurately and reach an informed decision. This is useful for minimizing risks.
Originality/value: This research enhances the literature by estimating the influence of intraday variables on daily volatility. It is one of very few studies that use conditional GARCH framework models and unconditional realized volatility estimates for forecasting long-term memory. The authors find that the methods complement each other.


Entropy ◽  
2021 ◽  
Vol 23 (12) ◽  
pp. 1672
Author(s):  
Sebastian Raubitzek ◽  
Thomas Neubauer

Measures of signal complexity, such as the Hurst exponent, the fractal dimension, and the spectrum of Lyapunov exponents, are used in time series analysis to estimate the persistency, anti-persistency, fluctuations and predictability of the data under study. They have proven beneficial when doing time series prediction using machine and deep learning, and they indicate which features may be relevant for predicting a time series. Further, the performance of machine learning approaches can be improved by taking into account the complexity of the data under study, e.g., by adapting the employed algorithm to the inherent long-term memory of the data. In this article, we provide a review of complexity and entropy measures in combination with machine learning approaches. We give a comprehensive review of relevant publications that suggest using fractal or complexity-measure concepts to improve existing machine or deep learning approaches. Additionally, we evaluate applications of these concepts and examine whether they can be helpful in predicting and analyzing time series using machine and deep learning. Finally, we give a list of a total of six ways to combine machine learning and measures of signal complexity as found in the literature.


2020 ◽  
Author(s):  
Xiaofeng Ji ◽  
Zhou Tang ◽  
Kejian Wang ◽  
Xianbin Li ◽  
Houqiang Li

Summary
Background: The outbreak of the novel coronavirus infection began in Wuhan City, Hubei Province in December 2019 and has spread to 97 countries and regions around the world. Apart from China, there are currently three other severely affected areas, namely Italy, South Korea, and Iran. This poses a huge threat to China's and even global public health security, challenges scientific research work such as disease surveillance and tracking, clinical treatment, and vaccine development, and also brings huge uncertainty to the global economy. As of March 11, 2020, the epidemic in China is nearing its end, but the epidemic abroad is in its outbreak period. Italy has even taken measures to lock down cities nationwide, and there are a total of 118,020 cases of infection worldwide.
Method: This article selects data on newly confirmed cases of COVID-19 at home and abroad as the data sample. The data on newly confirmed cases abroad are represented by Italy and span February 13 to March 10. The data on newly confirmed cases at home are divided into two parts, Hubei Province and the other provinces except Hubei, spanning January 23 to March 3; with February 12 as the cutting point, they are divided into two periods, the growth period and the recession period. The rescaled range (R/S) analysis method and the dimensionless fractal Hurst exponent are used to measure the correlation of the time series, to determine whether the time series conforms to fractal Brownian motion, that is, a biased random process. The meaning of the H value at different stages, and the different overall H values within the same stage, are contrasted.
Results: Based on R/S analysis and the calculated Hurst values of newly confirmed cases in Hubei and the non-Hubei provinces, it was found that the H value of Hubei Province in the first stage was 0.574, which is greater than 0.5, indicating that the future time series has a positive correlation and fractal characteristics. The H value in the second stage is 1.368, which is greater than 1, indicating that the future epidemic situation is completely preventable and controllable; the second stage has a downward trend, which indicates a high probability that the future time series will decline. The H values of the first and second stages of the non-Hubei provinces are 0.223 and 0.387, respectively, both less than 0.5, indicating that the time series of confirmed cases is likely to return to historical points in the future; the H value in the second stage is greater than that in the first stage, indicating that the time series of confirmed cases in the second stage has more long-term memory than that in the first stage. The daily absolute number of newly confirmed cases in Italy was converted to the daily growth rate of confirmed cases to eliminate the volatility of the data. The H value was 1.853, which is greater than 1, indicating that the time series of future confirmed cases is similar to the trend of historical changes, and the daily rate of change in cases will continue to rise.
Conclusion: According to the different interpretations of the H value obtained by the R/S analysis method, hierarchical isolation measures are adopted accordingly. When the H value is greater than 0.5, it indicates that the development of the epidemic in the area has more long-term memory; that is, when the number of confirmed cases has grown rapidly in the past, the time series of confirmed cases will probably continue the historical trend in the future. Therefore, it is necessary to formulate strict anti-epidemic measures in accordance with the actual conditions of each country, and to detect, isolate, and treat early in order to reduce the base of infectious agents.


Fractals ◽  
2013 ◽  
Vol 21 (03n04) ◽  
pp. 1350018 ◽  
Author(s):  
BINGQIANG QIAO ◽  
SIMING LIU

To model a given time series F(t) with fractional Brownian motions (fBms), it is necessary to have an appropriate error assessment for the related quantities. Usually the fractal dimension D is derived from the Hurst exponent H via the relation D = 2 - H, and the Hurst exponent can be evaluated by analyzing the dependence of the rescaled range 〈|F(t + τ) - F(t)|〉 on the time span τ. For fBms, the error of the rescaled range not only depends on the data sampling but also varies with H due to the presence of long-term memory. This error for a given time series therefore cannot be assessed without knowing the fractal dimension. We carry out extensive numerical simulations to explore the error of the rescaled range of fBms and find that for 0 < H < 0.5, |F(t + τ) - F(t)| can be treated as independent for time spans without overlap, while for 0.5 < H < 1, the long-term memory makes |F(t + τ) - F(t)| correlated, and an approximate method is given to evaluate the error of 〈|F(t + τ) - F(t)|〉. The error and fractal dimension can then be determined self-consistently in the modeling of a time series with fBms.
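The scaling relation 〈|F(t + τ) - F(t)|〉 ∝ τ^H underlying this abstract is easy to verify numerically for the H = 0.5 special case, ordinary Brownian motion, where independent Gaussian increments make the simulation trivial (a sketch for illustration only; simulating fBms with H ≠ 0.5 requires a dedicated method such as circulant embedding and is not shown):

```python
import numpy as np

rng = np.random.default_rng(7)
# Ordinary Brownian motion = fBm with H = 0.5: cumulative sum of white noise.
F = np.cumsum(rng.standard_normal(2 ** 16))

taus = [2 ** k for k in range(1, 9)]  # time spans 2 .. 256
means = [np.mean(np.abs(F[tau:] - F[:-tau])) for tau in taus]

# <|F(t+tau) - F(t)|> ~ tau^H, so H is the slope in log-log coordinates,
# and the fractal dimension then follows from D = 2 - H.
H, _ = np.polyfit(np.log(taus), np.log(means), 1)
D = 2 - H
print(f"H ≈ {H:.2f}, D ≈ {D:.2f}")
```

For a persistent fBm (0.5 < H < 1) the increments entering the average above would be correlated, which is exactly why the paper argues the error of the rescaled range cannot be assessed without knowing H itself.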

