Forecasting and Technical Comparison of Inflation in Turkey With Box-Jenkins (ARIMA) Models and the Artificial Neural Network

2022 ◽  
pp. 1194-1216
Author(s):  
Erkan Işığıçok ◽  
Ramazan Öz ◽  
Savaş Tarkun

Inflation refers to an ongoing, economy-wide increase in the general price level of goods and services. Today, inflation, which central banks attempt to keep under control (that is, for which they pursue price stability), consists of continuous changes in the prices of all the goods and services that consumers use. Undoubtedly, in economic terms, inflation expectations are gaining importance alongside realized inflation. This makes forecasting future inflation rates necessary, and reliable forecasts of a country's future inflation shape the policies that economic decision-makers adopt. The aim of this study is to forecast next-period inflation from consumer price index (CPI) data with two alternative techniques and to compare their predictive performance. Thus, the first of the study's two main objectives is to forecast future inflation rates with two alternative techniques, while the second is to compare the techniques against statistical and econometric criteria and determine which performs better. In this context, the nine-month inflation for April-December 2019 was forecast with Box-Jenkins (ARIMA) models and Artificial Neural Networks (ANN), using 207 monthly CPI observations from January 2002 to March 2019, and the predictive performance of the two techniques was examined comparatively. The results obtained from the two techniques were observed to be close to each other.
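As an illustration only (not the authors' model, whose exact ARIMA order is not given in the abstract), the sketch below implements a minimal ARIMA(1,1,0)-style forecaster on a hypothetical CPI-like index: first-difference the series to remove the trend, fit the AR(1) coefficient on the differences by least squares, then integrate the forecasts back to the level of the index. The data and function name are invented for the example.

```python
import numpy as np

def forecast_arima_110(series, horizon):
    """Minimal ARIMA(1,1,0)-style forecast: difference once, fit AR(1)
    on the differences by least squares, integrate forecasts back."""
    y = np.asarray(series, dtype=float)
    d = np.diff(y)                      # first difference removes the trend
    x, target = d[:-1], d[1:]           # lag-1 regression design
    phi = (x @ target) / (x @ x)        # least-squares AR(1) coefficient
    forecasts, last_level, last_diff = [], y[-1], d[-1]
    for _ in range(horizon):
        last_diff = phi * last_diff     # propagate the AR(1) recursion
        last_level += last_diff         # undo the differencing
        forecasts.append(last_level)
    return np.array(forecasts)

# toy CPI-like index with a steady upward drift (hypothetical data)
cpi = np.array([100, 102, 105, 109, 114, 120, 127, 135, 144, 154], float)
print(forecast_arima_110(cpi, horizon=3))
```

A production model would instead select the ARIMA order from ACF/PACF diagnostics and information criteria, as the Box-Jenkins methodology prescribes.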

2020 ◽  
Vol 9 (4) ◽  
pp. 84-103 ◽  
Author(s):  
Erkan Işığıçok ◽  
Ramazan Öz ◽  
Savaş Tarkun



2019 ◽  
Author(s):  
Timo Walter ◽  
Leon Wansleben

The title of our contribution refers to Alexander Kluge's movie, "Der Angriff der Gegenwart auf die übrige Zeit" ("The Assault of the Present on the Rest of Time"). The question we ask is how financialized capitalism shapes and formats the politics of the future. Our central tenet is that, far from providing an engine for 'imagining' futures that substantively guide (collective) actions, finance 'consumes' forecasts, plans, or visions in its present coordination process. While the "oscillation" between present futures and future presents has been identified as a defining feature of modern conceptions of contingency, freedom, and choice (Luhmann; Esposito), these two temporal modalities are collapsed in contemporary financial markets in an ongoing 'pricing in' of various possible future states. Projected futures do not substantively shape collective paths towards them or instruct social learning, but are calculatively assimilated to improve coordination between present prices. Fatally, central banks have been at the forefront of "synchronist" (Langenohl) finance, believing that as long as the numeric calibration of their own and the markets' expectations, as expressed in prices, remains aligned, they have rendered capitalism governable. Under this regime, central banks do not really govern inflation, but inflation expectations as expressed in the "yield curve" and built into interest rate derivatives. We argue that financial techniques built on the efficient market hypothesis and the Black-Scholes-Merton formula, two theoretical articulations of this modern synchronist temporality of finance, allow central banks to ignore possible "random" fluctuations in actual inflation and to concentrate on the internal calibration of present futures as the sole criterion for monetary policy success. We show that the resultant "assault" on "future presents" was an important factor in the run-up to the crisis of 2007-09.
Central banks deliberately attempted to eliminate uncertainties in markets about the future course of monetary policies. For that purpose, shared fictions about the underlying logics of Western economies (real interest rates, NAIRU etc.) were rigidly built into the structures of asset prices. Moreover, since central banks and market actors aligned their expectations over real interest rates, market actors could act as if their uncertainties about future liquidity needs could be neglected, since current money market and official lending rates were supposed to already define the price of liquidity tomorrow. In the last part of the contribution, we will extend this argument to contemporary quantitative easing, to show how it reinforces the pitfalls of generating expectations of economic prosperity and stability via the contemporary financial system.


2012 ◽  
pp. 32-47
Author(s):  
S. Andryushin ◽  
V. Kuznetsova

The paper analyzes central banks' macroprudential policy and its instruments. The issues of their classification, selection, design, and adjustment are connected with the financial stability of the overall financial system and of its specific institutions. The effectiveness of macroprudential instruments is evaluated from two angles: how well they mitigate the build-up of systemic risk over time and across sectors (market, credit, and operational risk). Directions for future macroprudential policy research are noted: identifying the instruments that can limit the procyclicality of financial system development and mitigate the volatility of credit and financial cycles.


2020 ◽  
Vol 26 (1) ◽  
pp. 205-210
Author(s):  
Dumitru Iancu ◽  
Dorel Badea

Abstract: We communicate and decide every day, but the complexity of the context in which we do these things is increasing. Today, the cultural makeup of an organization's members, driven by the need for competent employees aligned with the established objectives, is somewhat heterogeneous and dynamic. Decision-makers must therefore necessarily take the cultural background of their subordinates into account when choosing the best alternative for solving an organizational problem. From this perspective, Hofstede's model can be one way of explaining an organization's cultural characteristics and a basis for identifying courses of action in that organization for the future.


2019 ◽  
Author(s):  
Chem Int

Process control in wastewater treatment plants (WWTPs) is currently accomplished mostly by examining the quality of the effluent water and adjusting the processes based on the operator's experience. This practice is inefficient, costly, and slow in control response. Better control of WWTPs can be achieved by developing a robust mathematical tool for performance prediction. Owing to their high accuracy and promising applications in engineering, Artificial Neural Networks (ANNs) are attracting attention in the domain of WWTP predictive performance modeling. This work applies an ANN with a feed-forward, back-propagation learning paradigm to predict the effluent water quality of the Habesha brewery WWTP. Influent and effluent water quality data covering approximately an 11-month period (May 2016 to March 2017) were used to develop, calibrate, and validate the models. The study shows that an ANN can predict the effluent water quality parameters, with a correlation coefficient (R) between the observed and predicted output values reaching up to 0.969. Model architectures of 3-21-3 for pH and TN and 1-76-1 for COD were selected as the optimum topologies for predicting the Habesha brewery WWTP performance. The linear correlations between predicted and target outputs for these optimal architectures were 0.9201 and 0.9692, respectively.
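As a hedged illustration of the feed-forward, back-propagation paradigm the abstract describes (not the authors' model, and with entirely synthetic data), the sketch below trains a small 3-21-3 network in plain NumPy, mirroring the reported pH/TN topology, and reports the correlation between predictions and targets. All variable names and the input/output relationship are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# synthetic influent features (imagine pH, COD, TN scaled to [0, 1])
# and a smooth synthetic influent-to-effluent relationship
X = rng.random((200, 3))
Y = 0.5 * X + 0.1 * np.sin(3 * X)

# one hidden layer of 21 units, like the 3-21-3 topology in the abstract
W1 = rng.normal(0, 0.5, (3, 21)); b1 = np.zeros(21)
W2 = rng.normal(0, 0.5, (21, 3)); b2 = np.zeros(3)
lr = 0.1

for _ in range(2000):
    H = np.tanh(X @ W1 + b1)            # forward pass, tanh hidden layer
    P = H @ W2 + b2                     # linear output layer
    err = P - Y                         # gradient of 0.5 * MSE w.r.t. P
    gW2 = H.T @ err / len(X); gb2 = err.mean(0)
    dH = (err @ W2.T) * (1 - H**2)      # back-propagate through tanh
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

pred = np.tanh(X @ W1 + b1) @ W2 + b2
r = np.corrcoef(pred.ravel(), Y.ravel())[0, 1]
print(round(r, 3))                      # correlation between predicted and target
```

A real study would, as the abstract implies, split the data into calibration and validation sets and tune the hidden-layer size rather than fixing it in advance.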


Politics ◽  
2021 ◽  
pp. 026339572199148
Author(s):  
Anthony Costello

On 25 March 2017, the leaders of the EU27 and of the European Union (EU) institutions ratified the Rome Declaration. They committed to inviting citizens to discuss Europe's future and to provide recommendations that would help their decision-makers shape national positions on Europe. In response, citizens' dialogues on the future of Europe were instituted across the Union to facilitate public participation in shaping Europe. This paper explores Ireland's set of dialogues, which took place during 2018. Although event organisers in Ireland applied a relatively atypical, more systematic, and participatory approach to their dialogues, the evidence suggests that Ireland's dialogues resembled a public relations exercise: they showcased the country's commitment to incorporating citizens into the debate on Europe while avoiding a deliberative design that could have strengthened the quality of public discourse and of the resulting public recommendations. Owing to an absence of elite political will for a deliberative process, as well as structural weaknesses in design, participants' recommendations lacked any clear and prescriptive direction that could shape Ireland's national position on the future of Europe in a constructive or meaningful way.


2021 ◽  
Vol 13 (7) ◽  
pp. 165
Author(s):  
Paulo Rupino Cunha ◽  
Paulo Melo ◽  
Helder Sebastião

We analyze the path from cryptocurrencies to official Central Bank Digital Currencies (CBDCs) to shed some light on the ultimate dematerialization of money. To that end, we conducted an extensive search that resulted in a review of more than 100 academic and grey literature references, including official positions from central banks. We present and discuss the characteristics of the different CBDC variants under consideration, namely wholesale and retail and, for the latter, account-based and token-based designs, as well as ongoing pilots, scenarios of interoperability, and open issues. Our contribution enables decision-makers and society at large to understand the potential advantages and risks of introducing CBDCs, and how these vary with the many technical and economic design choices. The practical implication is that a debate becomes possible about the trade-offs that stakeholders are willing to accept.


2021 ◽  
Vol 12 (1) ◽  
Author(s):  
Joseph Friedman ◽  
Patrick Liu ◽  
Christopher E. Troeger ◽  
Austin Carter ◽  
Robert C. Reiner ◽  
...  

Abstract: Forecasts and alternative scenarios of COVID-19 mortality have been critical inputs for pandemic response efforts, and decision-makers need information about predictive performance. We screen n = 386 public COVID-19 forecasting models, identifying n = 7 that are global in scope and provide public, date-versioned forecasts. We examine their predictive performance for mortality by weeks of extrapolation, world region, and estimation month. We additionally assess prediction of the timing of peak daily mortality. Globally, models released in October show a median absolute percent error (MAPE) of 7 to 13% at six weeks, reflecting surprisingly good performance despite the complexities of modelling human behavioural responses and government interventions. The median absolute error for peak timing increased from 8 days at one week of forecasting to 29 days at eight weeks and is similar for first and subsequent peaks. The framework and public codebase (https://github.com/pyliu47/covidcompare) can be used to compare predictions and evaluate predictive performance going forward.
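The headline MAPE metric takes only a few lines to compute. The sketch below is a hypothetical illustration (the forecast and observation numbers are invented, and the abstract's MAPE is the median, not the mean, of absolute percent errors):

```python
import numpy as np

def median_ape(pred, obs):
    """Median absolute percent error across forecast points."""
    pred, obs = np.asarray(pred, float), np.asarray(obs, float)
    return float(np.median(np.abs(pred - obs) / obs) * 100)

# invented cumulative-mortality forecasts vs. observations for one region
obs  = [1000, 1500, 2100, 2800]
pred = [950, 1600, 2000, 3100]
print(median_ape(pred, obs))   # ~5.83 (%)
```

Using the median rather than the mean keeps a single badly missed forecast point from dominating the error summary.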

