Index number theory
Recently Published Documents


TOTAL DOCUMENTS: 49 (FIVE YEARS: 9)
H-INDEX: 10 (FIVE YEARS: 1)

2021 ◽ Vol 10 (1) ◽ Author(s): Kirill Muradov

Abstract: A nontrivial case in input–output structural decomposition analysis is a decomposition of a product of variables, or factors, where one factor is an inverse (typically, the Leontief inverse) of a sum of other factors. There may be dozens or hundreds of such factors that describe the changes in subsets of technical coefficients. The existing literature offers ambiguous guidance in this case. The solution that is consistent with index number theory may be virtually infeasible. The simplified ad hoc solutions require the researcher to make arbitrary choices, lead to biased estimates and do not ensure the consistency-in-aggregation of factors. This paper reviews the ad hoc solutions to this problem and describes a numerical test to identify the best-performing solution. It is found that calculating the average of the two polar decomposition forms for each factor is superior to other approximations in terms of minimising the errors.
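For readers unfamiliar with the two polar decomposition forms, the following is a minimal two-factor Python sketch of the averaging idea the abstract refers to, for y = Lf with L = (I - A)^(-1). The matrices and vectors are hypothetical, and the paper's actual setting involves many disaggregate factors inside the inverse rather than a single A and f.

    # Stylised structural decomposition of the change in output y = L f,
    # averaging the two polar forms; data are hypothetical.
    import numpy as np

    A0 = np.array([[0.20, 0.30], [0.10, 0.25]])   # technical coefficients, period 0
    A1 = np.array([[0.22, 0.28], [0.12, 0.26]])   # technical coefficients, period 1
    f0 = np.array([100.0, 80.0])                  # final demand, period 0
    f1 = np.array([110.0, 85.0])                  # final demand, period 1

    I = np.eye(2)
    L0, L1 = np.linalg.inv(I - A0), np.linalg.inv(I - A1)
    dL, df = L1 - L0, f1 - f0

    # polar form 1: inverse change weighted by old demand, demand change by new inverse
    polar1_L, polar1_f = dL @ f0, L1 @ df
    # polar form 2: inverse change weighted by new demand, demand change by old inverse
    polar2_L, polar2_f = dL @ f1, L0 @ df

    # average of the two polar forms, one contribution per factor
    contrib_L = 0.5 * (polar1_L + polar2_L)
    contrib_f = 0.5 * (polar1_f + polar2_f)

    dy = L1 @ f1 - L0 @ f0
    print("total change        :", np.round(dy, 3))
    print("sum of contributions:", np.round(contrib_L + contrib_f, 3))  # decomposition is exact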


2021 ◽ Vol 14 (8) ◽ pp. 370 ◽ Author(s): William A. Barnett ◽ Van H. Nguyen

Since Barnett derived the user cost price of money, the economic theory of monetary services aggregation has been developed and extended into a field of its own with solid foundations in microeconomic theory. Divisia monetary aggregates have repeatedly been shown to be strictly preferable to their simple sum counterparts, which have no competent foundations in microeconomic aggregation or index number theory. However, most central banks in the world, including the Monetary Authority of Singapore (MAS), still report their monetary aggregates as simple summations. Recent macroeconomic research about Singapore tends to focus on exchange rates as a monetary policy target but ignores the aggregate quantity of money. Is that because quantities of money are irrelevant to economic activity? To examine the role of monetary quantities as potential monetary instruments, indicators, or targets and their relevance to predicting real economic activity in Singapore, this paper applies the user cost of money formula and the recently developed credit-card-augmented Divisia monetary aggregates formula to construct monetary services indexes for Singapore. We produce those state-of-the-art monetary services indexes from January 1991 to March 2021. We find that Divisia measures behave differently from simple sum measures in the period before the year 2000, when interest rates were high. Credit-card-augmented Divisia monetary services move closely with the conventional Divisia monetary aggregates, since the volume of credit card transactions in Singapore is relatively small compared with other monetary service assets. In future work, we plan to use our data to explore central bank policy in Singapore and to propose improvements in that policy. By making our data available to the public, we encourage others to do the same.
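As a rough illustration of the two formulas the abstract applies, the sketch below computes the real user cost price of a monetary asset, (R - r)/(1 + R), and the discrete-time Tornqvist-Theil approximation to the Divisia quantity index. The asset balances and interest rates are hypothetical placeholders, not Singaporean data.

    # Minimal sketch: user cost of monetary assets and Divisia quantity growth.
    import numpy as np

    def user_cost(R, r):
        """Real user cost of a monetary asset: (R - r) / (1 + R),
        with benchmark rate R and own rate of return r."""
        return (R - r) / (1.0 + R)

    def divisia_growth(m_prev, m_curr, pi_prev, pi_curr):
        """Log growth of the Divisia monetary aggregate between two periods."""
        s_prev = pi_prev * m_prev / np.sum(pi_prev * m_prev)   # expenditure shares, t-1
        s_curr = pi_curr * m_curr / np.sum(pi_curr * m_curr)   # expenditure shares, t
        s_bar = 0.5 * (s_prev + s_curr)
        return float(np.sum(s_bar * (np.log(m_curr) - np.log(m_prev))))

    # two monetary assets (e.g. currency and savings deposits), two periods
    m_prev, m_curr = np.array([100.0, 400.0]), np.array([102.0, 420.0])
    r_prev, r_curr = np.array([0.00, 0.020]), np.array([0.00, 0.021])
    R_prev, R_curr = 0.050, 0.048

    pi_prev, pi_curr = user_cost(R_prev, r_prev), user_cost(R_curr, r_curr)
    print(f"Divisia growth   : {100 * divisia_growth(m_prev, m_curr, pi_prev, pi_curr):.3f}%")
    print(f"Simple-sum growth: {100 * np.log(m_curr.sum() / m_prev.sum()):.3f}%")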


2020 ◽ Author(s): Kirill Muradov

Abstract: A nontrivial case in input–output structural decomposition analysis is a decomposition of a product of variables, or factors, where one factor is an inverse (typically, the Leontief inverse) of a sum of other factors. There may be dozens or hundreds of such factors that describe the changes in subsets of technical coefficients. The existing literature offers ambiguous guidance in this case. The solution that is consistent with index number theory may be virtually infeasible. The simplified ad hoc solutions require the researcher to make arbitrary choices, lead to biased estimates and do not ensure the consistency-in-aggregation of factors. This paper reviews the ad hoc solutions to this problem and describes a numerical test to identify the best-performing solution. It is found that calculating the average of the two polar decomposition forms for each factor is superior to other approximations in terms of minimising the errors.


Author(s): Syed Khaled Rahman

The purpose of the study was to measure and decompose total factor productivity efficiency (TFPE) into technical, scale and mix efficiency, in order to identify the weak efficiency dimensions of 22 leasing companies over 2013-2017, using data envelopment analysis (DEA) under a constant-returns-to-scale approach. The indices used satisfy all economically relevant axioms and tests from index number theory, permitting reliable multi-temporal and/or multi-lateral comparisons of efficiency. Three input and two output variables from the published reports of the sample firms were used to measure efficiency. The average TFPE of the leasing firms is only 31.86%, while the average OTE is 64.28%: using the same inputs, firms could increase their output by 47.1%. The IME of the firms is healthy (0.7821), which means that firms mix their inputs at a satisfactory level of efficiency. Firms could further increase efficiency by operating at the most productive scale size (MPSS) on an unrestricted frontier. Except for RISE, the efficiency measures do not differ significantly from one year to another; rather, they differ from one firm to another.
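As a hedged sketch of the kind of constant-returns-to-scale DEA model the study refers to (not its exact specification), the fragment below solves the standard output-oriented CRS programme for each firm and reports output-oriented technical efficiency. The firms, inputs and outputs are hypothetical placeholders rather than the study's three-input, two-output data.

    # Output-oriented, constant-returns-to-scale DEA via linear programming.
    import numpy as np
    from scipy.optimize import linprog

    X = np.array([[3.0, 5.0], [4.0, 2.0], [6.0, 6.0]])   # inputs  (firms x inputs)
    Y = np.array([[4.0], [3.0], [7.0]])                   # outputs (firms x outputs)
    n, m = X.shape
    s = Y.shape[1]

    def output_technical_efficiency(o):
        """max phi s.t. sum_j lam_j x_j <= x_o, sum_j lam_j y_j >= phi*y_o, lam >= 0.
        Returns OTE = 1 / phi for firm o."""
        c = np.zeros(1 + n)
        c[0] = -1.0                                       # maximise phi
        A_ub, b_ub = [], []
        for i in range(m):                                # input constraints
            A_ub.append(np.concatenate(([0.0], X[:, i])))
            b_ub.append(X[o, i])
        for r in range(s):                                # output constraints
            A_ub.append(np.concatenate(([Y[o, r]], -Y[:, r])))
            b_ub.append(0.0)
        bounds = [(None, None)] + [(0.0, None)] * n
        res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                      bounds=bounds, method="highs")
        return 1.0 / res.x[0]

    for o in range(n):
        print(f"firm {o}: OTE = {output_technical_efficiency(o):.3f}")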


2020 ◽ pp. 1-47 ◽ Author(s): Bert M. Balk ◽ Alicia N. Rambaldi ◽ D. S. Prasada Rao

This paper offers a framework for measuring global growth and inflation, built on standard index number theory, national accounts principles, and the concepts and methods for international macroeconomic comparisons. Our approach provides a sound basis for purchasing power parity (PPP)- and exchange rate (XR)-based global growth and inflation measures. The Sato–Vartia index number system advocated here yields results very similar to those of a Fisher system but has the added advantage of allowing a complete decomposition with PPP or XR effects. For illustrative purposes, we present estimates of global growth and inflation for 141 countries for the years 2005 and 2011. The contribution of movements in XRs and PPPs to global inflation is presented. The aggregation properties of the method are also discussed.
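For concreteness, here is a minimal sketch of a bilateral Sato–Vartia price index of the kind the paper builds on, using normalised logarithmic-mean expenditure-share weights. The prices and quantities are hypothetical, and the paper's multilateral PPP/XR machinery is not reproduced.

    # Bilateral Sato-Vartia price index with logarithmic-mean share weights.
    import numpy as np

    def log_mean(a, b):
        """Logarithmic mean L(a, b) = (a - b) / (ln a - ln b), with L(a, a) = a."""
        a, b = np.asarray(a, float), np.asarray(b, float)
        return np.where(np.isclose(a, b), a, (a - b) / (np.log(a) - np.log(b)))

    def sato_vartia_price_index(p0, q0, p1, q1):
        s0 = p0 * q0 / np.sum(p0 * q0)           # expenditure shares, period 0
        s1 = p1 * q1 / np.sum(p1 * q1)           # expenditure shares, period 1
        w = log_mean(s1, s0)
        w = w / w.sum()                          # normalised log-mean weights
        return np.prod((p1 / p0) ** w)           # weighted geometric mean of price relatives

    p0, q0 = np.array([1.0, 2.0, 4.0]), np.array([10.0, 5.0, 2.0])
    p1, q1 = np.array([1.2, 1.9, 5.0]), np.array([9.0, 6.0, 2.5])
    print(f"Sato-Vartia price index: {sato_vartia_price_index(p0, q0, p1, q1):.4f}")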


2020 ◽ Author(s): Erik Brynjolfsson ◽ Avinash Collis ◽ W. Erwin Diewert ◽ Felix Eggers ◽ Kevin J. Fox

A puzzling development over the past 15 years is the decline in Total Factor Productivity in many advanced economies. Part of this decline may be due to the rapid growth of free digital goods. Statistical agencies have no reliable way to measure the benefits of the introduction of free goods. This is true even when the provision of the goods is paid for via advertising. Yet these free goods are enormously popular and surely create substantial utility for households. In this paper, we suggest a methodology that will allow statistical agencies to form rough approximations to the benefits that flow to households from new free goods. The present paper draws heavily on the contributions of Brynjolfsson, Collis, Diewert, Eggers and Fox (2019) (subsequent references will be to BCDEF) and Diewert, Fox and Schreyer (2019). In section I, we outline how the reservation price methodology introduced by Hicks (1940, p. 114) can be used to measure the consumption benefits to households of new products that are provided at zero cost or at costs that are close to zero. This Hicksian approach relies on normal index number theory but requires the estimation of reservation prices. In section II, we show how choice experiments about compensation for product withdrawals can be used to estimate these reservation prices. Section III concludes with a summary and implications.
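A stylised numerical illustration of the Hicksian reservation-price idea referred to in section I (not the authors' full methodology): the new free good is assigned quantity zero and an estimated reservation price in the period before its introduction, and standard index number formulas are then applied. All numbers below are hypothetical.

    # Fisher price index with a reservation price imputed for a new free good.
    import numpy as np

    # goods: [bread, fuel, new free digital good]
    p0 = np.array([2.0, 3.0, 5.0])    # period-0 prices; 5.0 is the estimated reservation price
    q0 = np.array([10.0, 6.0, 0.0])   # the new good is not consumed in period 0
    p1 = np.array([2.1, 3.1, 0.0])    # in period 1 the new good is free
    q1 = np.array([10.0, 6.0, 8.0])

    laspeyres = (p1 @ q0) / (p0 @ q0)
    paasche = (p1 @ q1) / (p0 @ q1)
    fisher = np.sqrt(laspeyres * paasche)
    print(f"Laspeyres {laspeyres:.3f}, Paasche {paasche:.3f}, Fisher {fisher:.3f}")
    # The Paasche and Fisher indexes fall below the Laspeyres index because the new
    # good's actual price (zero) is far below its imputed reservation price,
    # reflecting the consumption benefit of the free good.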


Author(s): Tomasz Kijek ◽ Anna Matras-Bolibok

The aim of the paper is to assess the impact of knowledge-intensive specialisation on Total Factor Productivity (TFP) across EU regions. To calculate TFP, defined as the aggregate output-input ratio, we employ the multiplicatively complete Färe–Primont index, as it satisfies all economically relevant axioms and tests from index number theory. The knowledge-intensive specialisation of EU regions is captured by statistics on high-tech industry and knowledge-intensive services, i.e. employment in high-tech sectors as a percentage of total employment (HTS). The research sample consists of 248 EU regions at NUTS 2 level. The key findings of the study indicate that employment in high-tech manufacturing and knowledge-intensive services is not distributed uniformly across the EU regional space. Similarly, TFP also varies substantially across the EU regions. Moreover, the results of the research model estimation show that specialisation in high-tech manufacturing and knowledge-intensive services directly affects regional TFP. The main implication of our analysis for policymakers is to explore and support knowledge-intensive specialisation patterns that build upon existing regional technological competencies and human capital endowments, in line with the smart specialisation strategies approach.
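For orientation, "multiplicatively complete" means that TFP is measured as an aggregate output divided by an aggregate input, so the TFP index between any two observations is simply the ratio of those ratios; in the Färe–Primont case the aggregator functions are distance functions evaluated at a fixed reference point. A compact statement, with generic notation not taken from the paper:

    \[
    \mathrm{TFP}_{it} = \frac{Q_{it}}{X_{it}},
    \qquad
    \mathrm{TFPI}_{hs,it} = \frac{\mathrm{TFP}_{it}}{\mathrm{TFP}_{hs}}
                          = \frac{Q_{it}/Q_{hs}}{X_{it}/X_{hs}},
    \]
    \[
    Q_{it} = D_{O}(x_{0}, q_{it}, t_{0}),
    \qquad
    X_{it} = D_{I}(x_{it}, q_{0}, t_{0}),
    \]

where \(D_O\) and \(D_I\) are output and input distance functions and \((x_0, q_0, t_0)\) is a fixed reference input vector, output vector and technology.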


Author(s): Ashiq Mohd Ilyas ◽ S. Rajasekaran

Purpose: The purpose of this paper is to analyse the performance of the Indian non-life (general) insurance sector in terms of total factor productivity (TFP) over the period 2005–2016.
Design/methodology/approach: This study utilises the Färe–Primont index (FPI) to assess the change in TFP and its components: technical change, technical efficiency, and mix and scale efficiency over the observation period. Moreover, it employs the Mann–Whitney U-test to scrutinise the difference between the public and the private insurers in terms of growth in productivity.
Findings: The results reveal that the insurance sector possesses a very low level of TFP. The results also divulge an improvement of 11.98 per cent in the TFP of the insurance sector, at an annual average rate of 12.41 per cent, over the observation period. The growth in productivity is mainly attributable to the improvement of 10.81 per cent in scale–mix efficiency. The progress in scale–mix efficiency is mainly the result of improvements in residual scale and residual mix efficiency. The results also show that the privately owned insurers have experienced a higher productivity growth rate than the state-owned insurers.
Practical implications: The results hold practical implications for the regulators, policymakers and decision makers of the Indian non-life insurance companies.
Originality/value: This study is the first of its kind to use the FPI, which satisfies all economically relevant axioms and tests defined by index number theory, to comprehensively assess the change in TFP of the Indian non-life insurance sector.
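As a minimal sketch of the Mann–Whitney U-test used to compare the two ownership groups, the fragment below runs the test on hypothetical productivity growth figures, not the paper's data.

    # Mann-Whitney U-test on hypothetical annual TFP growth rates by ownership.
    from scipy.stats import mannwhitneyu

    public_growth = [0.02, 0.05, 0.01, 0.03, 0.04, 0.02]    # public insurers (hypothetical)
    private_growth = [0.06, 0.09, 0.05, 0.08, 0.07, 0.10]   # private insurers (hypothetical)

    stat, p_value = mannwhitneyu(private_growth, public_growth, alternative="two-sided")
    print(f"U = {stat:.1f}, p = {p_value:.4f}")
    # A small p-value suggests the two ownership groups differ in productivity growth.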


2019 ◽ Vol 47 (2) ◽ pp. 111-128 ◽ Author(s): Humberto A. Brea-Solís ◽ Emili Grifell-Tatjé

Purpose: The purpose of this paper is to understand how a major retailer like Kmart lost its dominant position in the American retail industry.
Design/methodology/approach: The paper decomposes profit change into meaningful economic drivers using a methodology that combines frontier analysis with index number theory. The empirical analysis is complemented with a description of Kmart's business model produced from corporate documents and other sources.
Findings: The paper provides a quantification of Kmart's business model performance expressed in monetary terms. This assessment is presented by CEO tenure, showing the contribution of different economic drivers to the evolution of profits.
Practical implications: The empirical results highlight the importance of correctly implementing all aspects of the business model in order to achieve success.
Originality/value: This paper presents a new empirical framework for assessing business model performance. Despite Kmart's important role in the history of American discount retailing, very few studies have analyzed its downfall. This paper contributes by filling that gap.
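A much-simplified, Bennet-type sketch of decomposing profit change into a quantity effect and a price effect, in the spirit of (but far short of) the paper's frontier-plus-index-number methodology; all figures are hypothetical.

    # Bennet decomposition of profit change: quantity effect + price effect.
    import numpy as np

    # period 0 and 1: output prices p, output quantities y, input prices w, input quantities x
    p0, y0, w0, x0 = np.array([10.0]), np.array([100.0]), np.array([4.0, 6.0]), np.array([120.0, 40.0])
    p1, y1, w1, x1 = np.array([9.5]), np.array([115.0]), np.array([4.2, 6.5]), np.array([125.0, 42.0])

    profit0 = p0 @ y0 - w0 @ x0
    profit1 = p1 @ y1 - w1 @ x1

    p_bar, y_bar = (p0 + p1) / 2, (y0 + y1) / 2
    w_bar, x_bar = (w0 + w1) / 2, (x0 + x1) / 2

    quantity_effect = p_bar @ (y1 - y0) - w_bar @ (x1 - x0)   # driven by quantities
    price_effect = y_bar @ (p1 - p0) - x_bar @ (w1 - w0)      # driven by prices

    print(f"profit change  : {profit1 - profit0:+.2f}")
    print(f"quantity effect: {quantity_effect:+.2f}")
    print(f"price effect   : {price_effect:+.2f}")
    # The two effects sum exactly to the profit change (Bennet identity).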


2018 ◽ pp. 1-17 ◽ Author(s): Makram El-Shagi

It has repeatedly been shown that properly constructed monetary aggregates based on index number theory (such as Divisia money) vastly outperform traditional measures of money (i.e. simple sum money) in empirical models. However, opponents of Divisia frequently claim that Divisia is "too complex" for little gain. And indeed, at first glance it looks as if simple sum and Divisia aggregates exhibit similar dynamics. In this paper, we aim to build a deeper understanding of how and when Divisia and simple sum differ empirically, using monthly US data from 1990M1 to 2007M12. In particular, we look at how they respond differently to monetary policy shocks, which seems to be the most essential aspect of those differences from the perspective of the policy maker. We use a very rich, fairly agnostic setup that allows us to identify many potential nonlinearities, building on a smoothed local projections approach with automatic selection of the relevant interaction terms. We find that, while the direction of change is often similar, the precise dynamics differ sharply. In particular, in times of economic uncertainty, when the proper assessment of monetary policy is most relevant, these differences are drastically amplified.
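As a hedged sketch of a plain local-projections exercise of the kind the paper builds on (without its smoothing or automatic interaction-term selection), the fragment below estimates the response of a simulated money aggregate to an identified policy shock at each horizon; all data are simulated purely for illustration.

    # Local projections: horizon-by-horizon OLS of money(t+h) on shock(t) and money(t-1).
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    T, H = 300, 12
    shock = rng.normal(size=T)                 # identified monetary policy shock (simulated)
    money = np.zeros(T)                        # log money aggregate (simulated)
    for t in range(1, T):
        money[t] = 0.9 * money[t - 1] - 0.3 * shock[t] + 0.1 * rng.normal()

    irf = []
    for h in range(H + 1):
        y = money[1 + h:]                                        # money at t + h, t = 1..T-1-h
        X = np.column_stack([shock[1:T - h], money[:T - 1 - h]]) # shock(t) and lagged money
        X = sm.add_constant(X)
        irf.append(sm.OLS(y, X).fit().params[1])                 # shock coefficient = response at h

    print(np.round(irf, 3))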

