Journal of CENTRUM Cathedra (JCC) The Business and Economics Research Journal
Latest Publications


TOTAL DOCUMENTS: 116 (five years: 0)

H-INDEX: 10 (five years: 0)

Published by Emerald (MCB UP)

ISSN: 1851-6599

Author(s):  
Nan Hu ◽  
Rong Huang ◽  
Xu Li ◽  
Ling Liu

Purpose: Existing literature in experimental accounting research suggests that accounting professionals and people with accounting backgrounds tend to have a lower level of moral reasoning and ethical development. Motivated by these findings, this paper aims to examine whether chief executive officers (CEOs) with accounting backgrounds have an impact on firms’ earnings management behavior and their level of accounting conservatism.

Design/methodology/approach: The authors classify CEOs into those with and without accounting backgrounds using BoardEx data. Using discretionary accruals from several different models, they do not find that CEOs with accounting backgrounds are more likely to engage in income-increasing accruals. However, the authors find that CEOs with accounting backgrounds exhibit lower levels of conservatism, proxied by C-scores and T-scores (Basu, 1997). This finding suggests that CEOs with accounting backgrounds recognize bad news more quickly than good news, consistent with the accounting principle of “anticipating all losses but anticipating no gains”.

Findings: The authors show that firms whose CEOs have accounting backgrounds exhibit lower levels of accounting conservatism. However, these firms do not exhibit higher levels of income-increasing discretionary accruals. This study documents the impact of CEOs’ educational backgrounds on firms’ accounting choices and confirms prior findings in experimental accounting research using large-sample archival data.

Originality/value: This paper is the first study to investigate the impact of CEOs’ accounting backgrounds on firms’ financial reporting policy. The findings may have policy implications: if the accounting backgrounds of CEOs make a significant difference to firms’ behavior, it is reasonable to hold CEOs accountable for the quality of financial reporting. This paper is also one of the first to empirically test inferences drawn by experimental accounting research. There has been a gap between archival and experimental accounting studies, and the authors propose that interesting research questions can be addressed by bridging it.
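For reference, the Basu (1997) asymmetric-timeliness framework underlying the C-score and T-score proxies mentioned above is typically estimated from a regression of this form (a standard textbook formulation, not reproduced from the paper itself):

```latex
\frac{E_{it}}{P_{it-1}} = \alpha_0 + \alpha_1 DR_{it} + \beta_0 R_{it} + \beta_1 \left( DR_{it} \times R_{it} \right) + \varepsilon_{it}
```

where E is earnings, P the lagged share price, R the annual stock return, and DR a dummy equal to 1 when R < 0. A positive incremental slope β1 indicates conservatism, i.e. earnings reflect bad news (negative returns) more quickly than good news; firm-year C-scores build on this incremental slope. Lower conservatism for accounting-background CEOs would correspond to a smaller asymmetric-timeliness coefficient.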


Author(s):  
Chenzhang Bao ◽  
Indranil Bardhan

Purpose: The purpose of this study is to evaluate the determinants of health outcomes of dialysis patients, focusing specifically on the role of dialysis process measures and dialysis practice characteristics. The dialysis industry in the USA is facing a major transition from a volume-based health care system to a value-based, cost-efficient care model. Under the bundled Prospective Payment System, the treatment-based payment model is subject to meeting quality thresholds defined by clinical process measures, including dialysis adequacy and anemia management. Few studies have examined these two processes and their association with the quality of patient health outcomes.

Design/methodology/approach: The authors focus on identifying the determinants of patient health outcomes among freestanding dialysis clinics, using a large cross-sectional data set of 4,571 dialysis clinics in the USA. They use econometric analyses to estimate the association between dialysis facility characteristics and practice patterns, and their association with dialysis process measures and hospitalization risk.

Findings: The authors find that reusing dialyzers and increasing the number of dialysis stations are associated with higher levels of clinical quality. The research indicates that deploying more nurses on-site allows patients to receive adequate dialysis, while increasing the supply of physicians can hurt the anemia control process. In addition, the authors report that offering peritoneal dialysis and late-night shifts are not beneficial practices in terms of their impact on hospitalization risk.

Research limitations/implications: While early studies of dialysis care mainly focused on the associations between practice patterns and patient outcomes, this research reveals the underlying mechanisms of these relationships by exploring the mediating effects of clinical dialysis processes on patient outcomes. The results indicate that dialysis process measures mediate the impact of the operational characteristics of dialysis centers on patient hospitalization rates.

Practical implications: This study offers several managerial insights for owners and operators of dialysis clinics with respect to the managerial and clinical practices they deploy and the impact of those practices on clinical quality measures and patients’ hospitalization risk. Managers can draw on this study to optimize staffing levels in their dialysis clinics and implement innovative clinical practices.

Social implications: Given the growth in health care expenditures in developing and developed countries, particularly for costly treatments such as dialysis, this study offers several insights into the inter-relationships between dialysis practice patterns and clinical quality measures.

Originality/value: This study makes several major contributions. First, the authors address the gap in the literature on the relationships between dialysis facility and practice characteristics and clinical outcomes, specifically highlighting the role of clinical process measures as antecedents of the patient hospitalization ratio, a key metric used to measure the performance of dialysis clinics. Second, the study sheds light on the underlying mechanisms that enable dialysis adequacy and anemia management. To the best of the authors’ knowledge, this is the first study to explore these relationships in the dialysis industry. The approach provides a new direction for future studies to explore the pathways that may impact clinical quality measures in the delivery of dialysis services.
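The mediation logic described here (clinic characteristics → clinical process measures → hospitalization) can be illustrated with a Baron–Kenny-style pair of regressions on synthetic data. Variable names, effect sizes and the two-step design below are illustrative assumptions, not the paper's actual specification.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
nurses = rng.normal(size=n)                          # clinic staffing (standardized, synthetic)
adequacy = 0.8 * nurses + 0.5 * rng.normal(size=n)   # mediator: dialysis adequacy
hosp = -0.8 * adequacy + 0.5 * rng.normal(size=n)    # outcome: hospitalization risk

def ols(y, *xs):
    """OLS coefficients, intercept first."""
    X = np.column_stack([np.ones_like(y), *xs])
    return np.linalg.lstsq(X, y, rcond=None)[0]

total = ols(hosp, nurses)             # total effect of staffing on hospitalization
direct = ols(hosp, nurses, adequacy)  # direct effect, holding the process measure fixed

# Mediation: the staffing coefficient shrinks toward zero once the
# clinical process measure (the mediator) is controlled for.
print(round(total[1], 2), round(direct[1], 2))
```

With full mediation, as simulated here, the direct effect is close to zero while the total effect is strongly negative; partial mediation, as the paper's results suggest, would leave a smaller but nonzero direct effect.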


Author(s):  
Vincent Charles ◽  
Rajiv D. Banker

Author(s):  
Guy D. Fernando ◽  
Alex Thevaranjan

Purpose: This paper aims to study the impact of audit quality on the components of executive cash compensation. It is predicted that as audit quality improves, greater emphasis will be placed on the incentive components of cash compensation and less on the fixed (salary) component. Specifically, it is predicted that as audit quality improves, greater emphasis will be placed on earnings and sales revenues in determining executive cash compensation.

Design/methodology/approach: Agency theory is used to develop the predictions, which are then tested empirically using auditor specialization as a proxy for audit quality. Empirical support is found for all of the predictions.

Findings: The paper develops the following hypotheses: H1 – in executive cash compensation, more weight is placed on earnings-based measures as auditor specialization improves; H2 – more weight is also placed on sales revenues as auditor specialization improves; H3 – salary levels decrease as auditor specialization improves; and H4 – the impact of auditor specialization on the weights on earnings and sales and on salary levels is lower in the post-Sarbanes–Oxley Act (SOX) period than in the pre-SOX period.

Research limitations/implications: First, the article limits itself to cash compensation, while current executive compensation consists largely of equity. Second, the measure of audit quality used, national-level auditor specialization, may not be as effective in the post-SOX era.

Practical implications: Compensation committees should pay attention to audit quality (however it may be proxied) in determining executive compensation.

Originality/value: This is the first paper to show that audit quality not only improves the earnings response coefficient in firm valuation but also enhances the weight placed on earnings (and sales revenues) in executive compensation.
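The "weights" referred to in H1–H3 can be read as coefficients in a stylized cash-compensation regression with specialization interactions. The specification below is an illustrative sketch, not the paper's exact model:

```latex
Comp_{it} = \beta_0 + \beta_1 Earn_{it} + \beta_2 Sales_{it} + \beta_3 Spec_{it}
          + \beta_4 \left( Spec_{it} \times Earn_{it} \right)
          + \beta_5 \left( Spec_{it} \times Sales_{it} \right) + \varepsilon_{it}
```

Under this reading, with Spec an indicator for an industry-specialist auditor, H1 predicts β4 > 0 (more weight on earnings), H2 predicts β5 > 0 (more weight on sales), and H3 predicts β3 < 0 (lower fixed salary); H4 predicts these effects attenuate post-SOX.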


Author(s):  
Hsihui Chang ◽  
Helen HL Choy

Purpose: This paper aims to examine the effect of the Sarbanes–Oxley Act (SOX), which was signed by President George W. Bush and came into effect on July 30, 2002, on firm productivity.

Design/methodology/approach: The authors use total factor productivity (TFP) as their measure of firm productivity.

Findings: Analyzing annual firm-level data from the Compustat database for the period 1991-2006, the authors find that firm productivity increases at a higher rate in the post-SOX period. The results indicate that, although firms incur significant costs in complying with the requirements of the SOX, they also benefit from these requirements, as evidenced by the improved productivity over time post-SOX. There is also a shift in the output elasticities from capital toward labor: the SOX has a positive effect on the output elasticity of labor but a negative impact on that of capital.

Research limitations/implications: The results have the following important implications. The SOX is a value-enhancing regulation in that it not only strengthens a firm’s corporate governance but also improves its productivity. However, compliance with the SOX can impose a long-term cost on firms: the decrease in capital investment leads to a decline in the output elasticity of capital. If this decline in capital investment continues, it can have an adverse effect on firm productivity in the long term.

Originality/value: This paper extends the literature on the actual operational effects of the SOX regulation by examining its effect on firm productivity.
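TFP is commonly recovered as the residual of a log-linear Cobb-Douglas production function, with the slopes giving the output elasticities of capital and labor. The sketch below uses synthetic firm data and a simple OLS fit; the paper's actual estimation on Compustat data may differ (e.g. with a post-SOX interaction to capture the elasticity shift).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
ln_k = rng.normal(size=n)                    # log capital
ln_l = rng.normal(size=n)                    # log labor
tfp_true = 0.2 * rng.normal(size=n)          # firm-level productivity shock
ln_y = 0.3 * ln_k + 0.6 * ln_l + tfp_true    # Cobb-Douglas output in logs

X = np.column_stack([np.ones(n), ln_k, ln_l])
coef, *_ = np.linalg.lstsq(X, ln_y, rcond=None)
tfp_hat = ln_y - X @ coef                    # estimated TFP: the regression residual

print(coef[1:].round(2))                     # recovered output elasticities of K and L
```

A SOX-era analysis would add a post-2002 dummy interacted with ln_k and ln_l; a positive labor interaction and negative capital interaction would reproduce the elasticity shift the paper reports.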


Author(s):  
Pierre Jinghong Liang ◽  
Madhav Rajan ◽  
Korok Ray

Purpose: This paper aims to explore the design of management teams when the critical task facing individual managers is monitoring the performance of worker teams and producing performance measures under uncertain information environments.

Design/methodology/approach: The authors use a multi-agent LEN framework – linear contract, exponential utility and normal density – to model incentive provision and organizational design.

Findings: The main lesson is that the use of performance measures under uncertainty is greatly affected by the potential for free-riding in the very monitoring activities that generate the measures in the first place. Accordingly, the value of having a management team (that is, the incremental benefit of having a second manager) depends on the monitoring technology. Of particular importance are the potential for free-riding in monitoring effort among multiple managers and the synergies gained from having more than one manager, such as correlation among the performance measures produced or improvement due to splitting the worker pool into separate groups for each manager to monitor.

Originality/value: The paper pushes this line of research further by explicitly modeling the endogenous process of signal generation within a rich economic environment. In this environment, the number of workers being evaluated and the number of managers who produce the signals are both endogenous. Furthermore, both workers and managers are subject to moral hazard; in particular, the managers face potential free-riding problems but may benefit from synergistic forces due to team monitoring.
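For readers unfamiliar with the acronym, the single-agent LEN building block combines a linear wage, CARA (exponential) utility with risk aversion r, and a normally distributed performance signal. A textbook sketch follows; the paper's multi-agent model with endogenous monitoring is considerably richer:

```latex
w = s + b\,y, \qquad y = e + \varepsilon, \quad \varepsilon \sim N(0, \sigma^2)
```

With quadratic effort cost, the agent's certainty equivalent and the standard solution are

```latex
CE = s + b\,e - \tfrac{c}{2}e^2 - \tfrac{r}{2}b^2\sigma^2
\;\Longrightarrow\; e^* = \frac{b}{c}, \qquad b^* = \frac{1}{1 + r\,c\,\sigma^2}.
```

Noisier performance measures (larger variance) receive lower contractual weight. In the paper's setting, the variance is itself endogenous because the measures are produced by managers who may free-ride on monitoring effort, which is precisely why the monitoring technology determines the value of a second manager.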


Author(s):  
Taylor Boyd ◽  
Grace Docken ◽  
John Ruggiero

Purpose: The purpose of this paper is to improve the estimation of the production frontier in cases where outliers exist. The focus is on the case in which outliers appear above the true frontier due to measurement error.

Design/methodology/approach: The authors use stochastic data envelopment analysis (SDEA) to allow observed points above the frontier. They supplement SDEA with distributional assumptions on the efficiency and show that the true frontier in the presence of outliers can be derived.

Findings: The paper finds that the authors’ maximum likelihood approach outperforms super-efficiency measures. Using simulations, it shows that SDEA is a useful model for outlier detection.

Originality/value: The model developed in this paper is original; the authors add distributional assumptions to derive the optimal quantile with SDEA to remove outliers. The authors believe the approach will be widely used, because real-world data are often subject to outliers.
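The core problem the paper addresses can be illustrated by simulation: when output is measured with two-sided noise on top of one-sided inefficiency, some observed points land above the true frontier, and a deterministic frontier estimator would wrongly envelop them. The frontier shape, distributions and parameters below are illustrative assumptions, not the paper's simulation design.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
x = rng.uniform(1.0, 10.0, size=n)       # input
frontier = x ** 0.5                      # true production frontier f(x) = sqrt(x)
ineff = rng.exponential(0.3, size=n)     # one-sided technical inefficiency
noise = rng.normal(0.0, 0.1, size=n)     # two-sided measurement error
y = frontier * np.exp(-ineff + noise)    # observed output

above = int(np.sum(y > frontier))        # outliers lying above the true frontier
print(above)
```

A point ends up above the frontier whenever its measurement error exceeds its inefficiency, so even modest noise produces a nontrivial number of such outliers; this is what motivates allowing observations above the frontier rather than treating the data as noise-free.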


Author(s):  
Vincent Charles ◽  
Rajiv D. Banker

Author(s):  
Juan Aparicio

Purpose: The purpose of this paper is to provide an outline of the major contributions in the literature on the determination of the least distance in data envelopment analysis (DEA). The focus herein is primarily on methodological developments. Specifically, attention is mainly paid to modeling aspects, computational features, the satisfaction of properties and duality. Finally, some promising avenues of future research on this topic are stated.

Design/methodology/approach: DEA is a methodology based on mathematical programming for assessing the relative efficiency of a set of decision-making units (DMUs) that use several inputs to produce several outputs. DEA is classified in the literature as a non-parametric method because it does not assume a particular functional form for the underlying production function, and it presents, in this sense, some outstanding properties: the efficiency of firms may be evaluated independently of the market prices of the inputs used and outputs produced; it may easily accommodate multiple inputs and outputs; it yields a single efficiency score for each assessed organization; it ranks organizations based on relative efficiency; and, finally, it yields benchmarking information. When applied to a dataset of observations and variables (inputs and outputs), DEA models provide both benchmarking information and efficiency scores for each of the evaluated units. Without a doubt, this benchmarking information gives DEA a distinct advantage over other efficiency methodologies, such as stochastic frontier analysis (SFA). Technical inefficiency is typically measured in DEA as the distance between the observed unit and a “benchmarking” target on the estimated piece-wise linear efficient frontier. The choice of this target is critical for assessing the potential performance of each DMU in the sample, as well as for providing information on how to improve its performance. However, traditional DEA models yield targets determined by the “furthest” efficient projection from the evaluated DMU. The point projected onto the efficient frontier in this way may not be a representative projection for the judged unit, and consequently, some authors in the literature have suggested determining the closest targets instead. The general argument behind this idea is that closer targets suggest directions of improvement for the inputs and outputs of inefficient units that may lead them to efficiency with less effort. Indeed, authors such as Aparicio et al. (2007) have shown, in an application to airlines, that substantial differences can be found between the targets provided by the criterion used in traditional DEA models and those obtained when the criterion of closeness is used to determine projection points on the efficient frontier. The determination of closest targets is connected to the calculation of the least distance from the evaluated unit to the efficient frontier of the reference technology. In fact, the former is usually computed by solving mathematical programming models associated with minimizing some type of distance (e.g. Euclidean). In this respect, the main contribution in the literature is the paper by Briec (1998) on Hölder distance functions, where technical inefficiency relative to the “weakly” efficient frontier is formally defined through mathematical distances.

Findings: All the interesting features of the determination of closest targets from a benchmarking point of view have generated, in recent times, increasing interest among researchers in the calculation of the least distance to evaluate technical inefficiency (Aparicio et al., 2014a). This paper therefore presents a general classification of published contributions, mainly from a methodological perspective, and additionally indicates avenues for further research on this topic. The approaches cited in this paper differ in the way the idea of similarity is made operative. Similarity is, in this sense, implemented as the closeness between the values of the inputs and/or outputs of the assessed units and those of the obtained projections on the frontier of the reference production possibility set. Similarity may be measured through multiple distances and efficiency measures; in turn, the aim is to globally minimize DEA model slacks to determine the closest efficient targets. However, as shown later in the text, minimizing a mathematical distance in DEA is not an easy task, as it is equivalent to minimizing the distance to the complement of a polyhedral set, which is not a convex set. This complexity justifies the existence of different alternatives for solving these types of models.

Originality/value: To the authors’ knowledge, this is the first survey on this topic.
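For concreteness, here is a minimal sketch of the traditional input-oriented CCR envelopment model that the least-distance literature contrasts with: it radially contracts the evaluated DMU's inputs toward a (generally "furthest") projection on the frontier. The toy data and helper name are illustrative, not drawn from the survey.

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: 4 DMUs, 2 inputs, 1 output (illustrative only).
X = np.array([[2.0, 3.0], [4.0, 2.0], [4.0, 4.0], [5.0, 6.0]])  # inputs
Y = np.array([[1.0], [1.0], [1.0], [1.0]])                       # outputs

def ccr_input_efficiency(o):
    """Input-oriented CCR efficiency of DMU o:
    min theta  s.t.  sum_j lam_j X_j <= theta * X_o,
                     sum_j lam_j Y_j >= Y_o,  lam >= 0."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]          # decision variables: [theta, lam_1..lam_n]
    A_ub = np.zeros((m + s, 1 + n))
    A_ub[:m, 0] = -X[o]                  # sum_j lam_j X_j - theta * X_o <= 0
    A_ub[:m, 1:] = X.T
    A_ub[m:, 1:] = -Y.T                  # -sum_j lam_j Y_j <= -Y_o
    b_ub = np.r_[np.zeros(m), -Y[o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (1 + n))
    return res.x[0]

effs = [round(ccr_input_efficiency(o), 3) for o in range(4)]
print(effs)  # DMUs 0 and 1 are efficient; 2 and 3 are projected onto the frontier
```

For an inefficient DMU, theta * X_o is the radial target; least-distance models instead search for the efficient point minimizing, say, a Euclidean or Hölder distance to (X_o, Y_o), which is harder precisely because the region outside the frontier is non-convex.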


Author(s):  
Jérôme Boutang ◽  
Michel De Lara

Purpose: In a modern world increasingly perceived as uncertain, the mere purchase of a household cleaning product, or a seemingly harmless bottle of milk, raises questions about potential hazards, from environmental to health impacts. The main purpose of this paper is to suggest that risk could be considered one of the major dimensions of choice for a wide range of concerns and markets, alongside aspiration/satisfaction, and tackled efficiently by mobilizing the recent findings of cognitive sciences, neurosciences and evolutionary psychology. It is felt that consumer research could benefit more widely from psychological and evolutionarily grounded risk theories.

Design/methodology/approach: In this study, some 50 years of marketing management literature, as well as risk-specialized literature, were examined in an attempt to grasp how risk is handled by the consumer sciences and whether they make use of the most recent academic work on mental biases, non-mainstream decision-making processes and the evolutionary roots of behavior. Several hypotheses regarding risk profiles and preferences in the insurance sector were then formulated and tested, as part of an Axa Research Fund–Paris School of Economics research project.

Findings: It is suggested that consumer profiles could be enriched by risk-taking attitudes, that risk could be part of the “reason why” of brand positioning, and that brand as well as public policy communication could benefit from a targeted use of risk perception biases.

Originality/value: This paper proposes to apply evolutionarily based psychological concepts to build perceptual maps describing people and consumers on both aspiration and risk-attitude axes, and to design communication tools according to psychological research on message framing and biases. Such an approach mobilizes not only the recent findings of cognitive sciences and neurosciences but also an understanding of the roots of risk attitudes and perception. These maps and framings could probably be applied to many sectors, markets and public issues, from commodities to personal products and services (food, luxury goods, electronics, financial products, tourism, design or insurance).

