A multidimensional longitudinal meta-analysis of quality costing research

2018 ◽  
Vol 35 (2) ◽  
pp. 405-429 ◽  
Author(s):  
Evrikleia Chatzipetrou ◽  
Odysseas Moschidis

Purpose The purpose of this paper is to examine the longitudinal evolution of quality cost measurement as depicted in 99 real-data studies from the last 30 years. A meta-analysis of these articles is conducted in order to highlight the evolution of the variables used to study quality costing, in relation to the date of publication, business sector and geographical origin of each paper. Design/methodology/approach The analysis of the cost components is conducted with multiple correspondence analysis, a useful tool for exploring the interrelations among all elements and identifying the dominant and most substantial tendencies in their structure. Findings The findings suggest that the level of analysis of quality costs is related to the date of publication, the business sector and the origin of each study. Furthermore, the most prominent prevention costs are related to suppliers’ assurance, internal audit and new product design and development. Appraisal costs are mostly defined by quality audits and procurement costs, while failure costs are defined by defect/failure analysis, low-quality losses, complaint investigation, concessions and warranty claims. Originality/value The present paper is a longitudinal meta-analysis of 99 quality cost papers published in the last 30 years. It explores the evolution of research in quality costing, not only in relation to the cost components in use, but also in terms of the date of publication, business sector and geographical origin of the studies.
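
To illustrate the technique, the following is a minimal sketch of multiple correspondence analysis applied to a hypothetical coding of studies by categorical variables (decade, sector, level of cost detail). The data, variable names and coding are invented for illustration and do not reproduce the authors' dataset or software.

```python
# Minimal sketch of multiple correspondence analysis (MCA): correspondence
# analysis of the complete disjunctive (one-hot) table of categorical codes.
# Hypothetical coding of studies; not the authors' data.
import numpy as np
import pandas as pd

studies = pd.DataFrame({
    "decade": ["1990s", "2000s", "2010s", "2010s", "2000s"],
    "sector": ["manufacturing", "services", "manufacturing", "health", "services"],
    "detail": ["low", "medium", "high", "high", "medium"],
})

Z = pd.get_dummies(studies).to_numpy(dtype=float)   # indicator matrix

P = Z / Z.sum()                       # correspondence matrix
r = P.sum(axis=1)                     # row masses (studies)
c = P.sum(axis=0)                     # column masses (categories)
S = np.diag(1 / np.sqrt(r)) @ (P - np.outer(r, c)) @ np.diag(1 / np.sqrt(c))

U, sv, Vt = np.linalg.svd(S, full_matrices=False)

row_coords = np.diag(1 / np.sqrt(r)) @ U * sv      # principal coordinates of studies
col_coords = np.diag(1 / np.sqrt(c)) @ Vt.T * sv   # principal coordinates of categories
inertia = sv**2 / (sv**2).sum()                     # share of variance per axis

print("explained inertia of first two axes:", inertia[:2])
```

Row coordinates place the studies and column coordinates place the categories in the same low-dimensional space, which is how the dominant tendencies in the structure can be read off.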

2014 ◽  
Vol 8 (1) ◽  
pp. 100-120 ◽  
Author(s):  
Yun Seng Lim ◽  
Siong Lee Koh ◽  
Stella Morris

Purpose – Biomass waste can be used as fuel in biomass power plants to generate electricity. It is a type of renewable energy widely available in Malaysia, where 12 million tons of biomass waste are produced every year. At present, only 5 per cent of the total biomass waste in Sabah, one of the states in Malaysia, is used to generate electricity for on-site consumption. The remaining 95 per cent has not been utilized because the cost of transporting the waste from the plantations to the power plants is substantial, making biomass-generated electricity expensive. Therefore, a methodology is developed and presented in this paper to determine the optimum geographic distribution and capacities of biomass power plants in a region so that the cost of biomass-generated electricity can be minimized. The paper aims to discuss these issues. Design/methodology/approach – The methodology can identify potential biomass power plant sites anywhere in a region, taking into account the operation and capital costs of the power plants as well as the cost of connecting the power plants to the national grid. The methodology is programmed in Fortran. Findings – The methodology is applied to Sabah using real data. The results show the best locations and capacities for biomass power plants in Sabah. There are 20 locations suitable for biomass power plants, with a total capacity of 4,996 MW and an annual generation of 35,013 GWh. This is sufficient to meet all the electricity demand in Sabah up to 2030. Originality/value – The methodology is an effective tool for determining the best geographic locations and sizes of biomass power plants in a region.
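
As a rough illustration of the siting idea (not the authors' Fortran program), the sketch below greedily selects candidate plant sites so that capital, grid-connection and waste-transport costs are jointly minimized. All figures and the greedy heuristic itself are assumptions made for the example.

```python
# Toy facility-location sketch: choose plant sites minimising capital,
# grid-connection and waste-transport cost. All numbers are invented.
import numpy as np

rng = np.random.default_rng(0)
n_plantations, n_sites = 40, 8

waste = rng.uniform(10, 100, n_plantations)            # kt/year of biomass waste
dist = rng.uniform(5, 200, (n_plantations, n_sites))   # km plantation -> candidate site

capital_cost = rng.uniform(50, 120, n_sites)           # $m per candidate plant
grid_cost = rng.uniform(5, 30, n_sites)                # $m to connect site to the grid
transport_rate = 0.02                                  # $m per kt per km

def total_cost(selected):
    """Capital + grid cost of chosen sites plus hauling every plantation's
    waste to its nearest chosen site."""
    if not selected:
        return np.inf
    idx = list(selected)
    haul = (dist[:, idx].min(axis=1) * waste * transport_rate).sum()
    return capital_cost[idx].sum() + grid_cost[idx].sum() + haul

# Greedy heuristic: keep adding the site that lowers total cost the most.
chosen = set()
while len(chosen) < n_sites:
    best_cost, best_site = min(
        (total_cost(chosen | {s}), s) for s in range(n_sites) if s not in chosen
    )
    if best_cost >= total_cost(chosen):
        break
    chosen.add(best_site)

print("chosen sites:", sorted(chosen), "total cost ($m):", round(total_cost(chosen), 1))
```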


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Behnam Malmir ◽  
Christopher W. Zobel

Purpose When a large-scale outbreak such as the COVID-19 pandemic happens, organizations that are responsible for delivering relief may face a lack of both provisions and human resources. Governments are the primary source for the humanitarian supplies required during such a crisis; however, coordination with humanitarian NGOs in handling such pandemics is a vital form of public-private partnership (PPP). Aid organizations have to consider not only the total degree of demand satisfaction in such cases but also the obligation that relief goods such as medicine and food should be distributed as equitably as possible within the affected areas (AAs). Design/methodology/approach Given the challenges of acquiring real data associated with procuring relief items during the COVID-19 outbreak, a comprehensive simulation-based plan is used to generate 243 small, medium and large-sized problems with uncertain demand, and these problems are solved to optimality using GAMS. Finally, post-optimality analyses are conducted, and some useful managerial insights are presented. Findings The results imply that, given a reasonable measure of deprivation costs, it can be important for managers to focus less on the logistical costs of delivering resources and more on the value associated with quickly and effectively reducing the overall suffering of the affected individuals. It is also important for managers to recognize that even though deprivation costs and transportation costs both increase as the time horizon lengthens, the growth rate of the deprivation costs decreases over time. Originality/value In this paper, a novel mathematical model is presented to minimize the total costs of delivering humanitarian aid for pandemic relief. With a focus on sustainability of operations, the model incorporates total transportation and delivery costs, the cost of utilizing the transportation fleet (transportation mode cost), and equity and deprivation costs. Taking social costs such as deprivation and equity costs into account, in addition to other important classic cost terms, enables managers to organize the best possible response when such outbreaks happen.
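
As a hedged illustration of the modelling idea (not the authors' GAMS formulation), the sketch below sets up a tiny transportation-style linear program in which unmet demand in each affected area incurs a linearized deprivation penalty, so the solver trades logistics cost against suffering. All data, and the simplification to a linear deprivation cost without equity or fleet terms, are assumptions.

```python
# Tiny transportation LP with a linearised deprivation penalty on unmet demand.
# Invented numbers; the paper's full model is richer and was solved in GAMS.
from scipy.optimize import linprog
import numpy as np

supply = np.array([100.0, 80.0])            # units available at two depots
demand = np.array([60.0, 70.0, 40.0])       # units needed in three affected areas
ship_cost = np.array([[4.0, 6.0, 9.0],      # cost per unit shipped depot -> area
                      [5.0, 3.0, 7.0]])
deprivation = np.array([20.0, 25.0, 18.0])  # penalty per unit of unmet demand

n_d, n_a = ship_cost.shape
# Variables: shipments x[d, a] (flattened), then one slack per area (unmet demand).
c = np.concatenate([ship_cost.ravel(), deprivation])

# Demand balance: shipments into area a + unmet_a = demand_a.
A_eq = np.zeros((n_a, n_d * n_a + n_a))
for a in range(n_a):
    for d in range(n_d):
        A_eq[a, d * n_a + a] = 1.0
    A_eq[a, n_d * n_a + a] = 1.0
b_eq = demand

# Supply limits: shipments out of depot d <= supply_d.
A_ub = np.zeros((n_d, n_d * n_a + n_a))
for d in range(n_d):
    A_ub[d, d * n_a:(d + 1) * n_a] = 1.0
b_ub = supply

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
ship = res.x[:n_d * n_a].reshape(n_d, n_a)
unmet = res.x[n_d * n_a:]
print("shipments:\n", ship.round(1), "\nunmet demand:", unmet.round(1))
```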


2019 ◽  
Vol 9 (3) ◽  
pp. 440-456
Author(s):  
Seyed Ehsan Zahed ◽  
Sirwan Shahooei ◽  
Ferika Farooghi ◽  
Mohsen Shahandashti ◽  
Siamak Ardekani

Purpose The purpose of this paper is to conduct a life-cycle cost analysis of a short-haul underground freight transportation (UFT) system for the Dallas Fort Worth International Airport. Design/methodology/approach The research approach includes: identifying the cost components of the proposed airport UFT system; estimating the life-cycle cost (LCC) of system components using various methods; determining life-cycle cash flows; evaluating the reliability of the results using sensitivity analysis; and assessing the validity of the results using analogous cases. Findings Although the capital cost of constructing an airport UFT system appears to be the largest cost of such innovative projects, the annual costs of running the system are more significant from a life-cycle perspective. System administrative cost, tunnel operation and maintenance, and tunnel construction cost are the principal cost components of the UFT system, representing approximately 46, 24 and 19 percent of the total LCC, respectively. The shipping cost is estimated to be $4.14 per ton-mile. Although this cost is higher than the cost of transporting cargo by truck, the implementation of UFT systems could be financially justified considering their numerous benefits. Originality/value This paper, for the first time, helps capital planners understand the LCC of an airport UFT system, for which there is little or no past experience, and consider such innovative solutions to address airport congestion issues.
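
A back-of-the-envelope sketch of the life-cycle cost logic follows: discount the annual administrative and operating costs over the study horizon, compare them with the up-front construction cost, and annualize the total to obtain a cost per ton-mile. Every figure below is invented; only the structure of the calculation is meant to mirror the approach described.

```python
# LCC sketch: NPV of recurring costs plus capital, shares of total LCC, and an
# annualised cost per ton-mile. All figures are invented for illustration.

def npv(annual_cost, rate, years):
    """Present value of a constant annual cost over `years` at discount `rate`."""
    return sum(annual_cost / (1 + rate) ** t for t in range(1, years + 1))

rate, years = 0.03, 50
capital = {"tunnel construction": 400e6, "vehicles and terminals": 150e6}
annual = {"administration": 60e6, "tunnel O&M": 30e6, "energy": 10e6}

lcc = sum(capital.values()) + sum(npv(a, rate, years) for a in annual.values())
for name, a in annual.items():
    share = npv(a, rate, years) / lcc
    print(f"{name:<25}{share:6.1%} of LCC")

ton_miles_per_year = 500e6
annuity_factor = npv(1, rate, years)                       # converts LCC to an annual cost
cost_per_ton_mile = lcc / annuity_factor / ton_miles_per_year
print(f"shipping cost: ${cost_per_ton_mile:.2f} per ton-mile")
```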


2015 ◽  
Vol 26 (6) ◽  
pp. 966-983 ◽  
Author(s):  
Benjamin Blair ◽  
Jenny Kehl ◽  
Rebecca Klaper

Purpose – Pharmaceutical and personal care products (PPCPs) and phosphorus are pollutants that can cause a wide array of negative environmental impacts. Phosphorus is a regulated pollutant in many industrial countries, while PPCPs are largely unregulated. Many technologies designed to remove phosphorus from wastewater can also remove PPCPs; therefore, the purpose of this paper is to explore the ability of these technologies to reduce the emission of unregulated PPCPs as well. Design/methodology/approach – Through meta-analysis, the authors use the PPCPs’ risk quotient (RQ) to measure and compare the effectiveness of different wastewater treatment technologies. The RQ data are then applied via a case study that uses phosphorus effluent regulations to determine the ability of the recommended technologies to also mitigate PPCPs. Findings – The tertiary membrane bioreactor and nanofiltration processes recommended to remove phosphorus can reduce the median RQ from PPCPs by 71 and 81 percent, respectively. The ultrafiltration technology was estimated to reduce the median RQ from PPCPs by 28 percent at no cost beyond the costs expected under the current phosphorus effluent regulations. Further RQ reduction is expected with a membrane bioreactor, and the cost of upgrading to this technology was found to be $11.76 per capita per year. Practical implications – The authors discuss the management implications, including watershed management, alternative PPCPs reduction strategies and water quality trading. Originality/value – The evaluation of the co-management of priority and emerging pollutants illuminates how the removal of regulated pollutants from wastewater could significantly reduce the emission of unregulated PPCPs.
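
The risk quotient is commonly computed as a measured environmental concentration divided by a predicted no-effect concentration; the sketch below compares the median RQ before and after treatment under assumed removal efficiencies for each technology. The concentrations, removal fractions and compound list are all hypothetical, not the study's data.

```python
# Sketch of the risk-quotient comparison: RQ = MEC / PNEC, with the median RQ
# recomputed after each technology's assumed removal efficiency is applied.
import numpy as np

mec_influent = np.array([1.2, 0.4, 5.0, 0.9, 2.3])   # ug/L for five hypothetical PPCPs
pnec = np.array([0.5, 0.1, 2.0, 1.5, 0.8])           # ug/L

removal = {                      # assumed fraction of each PPCP removed
    "conventional secondary": np.array([0.30, 0.10, 0.40, 0.20, 0.25]),
    "membrane bioreactor":    np.array([0.75, 0.60, 0.80, 0.65, 0.70]),
    "nanofiltration":         np.array([0.85, 0.80, 0.90, 0.75, 0.85]),
}

baseline = np.median(mec_influent / pnec)
for tech, frac in removal.items():
    rq = (mec_influent * (1 - frac)) / pnec
    reduction = 1 - np.median(rq) / baseline
    print(f"{tech:<24} median RQ cut by {reduction:5.1%}")
```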


2000 ◽  
Vol 6 (1_suppl) ◽  
pp. 4-6 ◽  
Author(s):  
Pamela Whitten ◽  
Charles Kingsley ◽  
Jim Grigsby

We attempted a meta-analysis of research studies of the costs associated with telemedicine. First, we performed a search of six well-known databases with a variety of relevant keywords. After discarding non-English publications, books and duplicate publications resulting from the same study, we were left with 551 articles for analysis. Our second step was to separate the articles into two groups: those with and those without quantitative cost data. Only 38 articles contained any type of real data. Because many of these 38 studies proved to be inadequately designed or conducted, we were unable to perform a traditional meta-analysis. Furthermore, there were a number of disturbing features common to these studies, including the omission of the number of consultations or patients, almost non-existent longitudinal data collection and a lack of uniformity in cost analyses. We conclude that it is premature for any statements to be made, either positive or negative, regarding the cost-effectiveness of telemedicine in general.
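
As a small illustration of the screening workflow described above (the records and field names are hypothetical; the actual review was conducted manually on the database exports):

```python
# Hypothetical screening sketch: drop non-English records, books and duplicates
# from the same study, then split by whether quantitative cost data are present.
import pandas as pd

records = pd.DataFrame([
    {"title": "Telepsychiatry costs", "language": "en", "type": "article",
     "study_id": "S1", "has_cost_data": True},
    {"title": "Telemedicina rural", "language": "es", "type": "article",
     "study_id": "S2", "has_cost_data": True},
    {"title": "Telehealth handbook", "language": "en", "type": "book",
     "study_id": "S3", "has_cost_data": False},
    {"title": "Telepsychiatry costs (reprint)", "language": "en", "type": "article",
     "study_id": "S1", "has_cost_data": True},
])

screened = (records[(records.language == "en") & (records.type == "article")]
            .drop_duplicates(subset="study_id"))          # one record per study
with_cost_data = screened[screened.has_cost_data]
print(len(screened), "articles retained,", len(with_cost_data), "with cost data")
```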


2016 ◽  
Vol 29 (5) ◽  
pp. 714-738 ◽  
Author(s):  
Melanie Roussy ◽  
Marion Brivot

Purpose – The purpose of this paper is to characterize how those who perform (internal auditors), mandate (audit committee (AC) members), use (AC members and external auditors) and normalize (the Institute of Internal Auditors (IIA)) internal audit work, respectively make sense of the notion of “internal audit quality” (IAQ). Design/methodology/approach – This study is predicated on the meta-analysis of extant literature on IAQ, 56 interviews with internal auditors and AC members of public or para-public sector organizations in Canada, and archival documents published by the IIA, analyzed in the light of framing theory. Findings – Four interpretative schemes (or frames) emerge from the analysis, called “manager,” “éminence grise,” “professional” and “watchdog.” They respectively correspond to internal auditors’, AC members’, the IIA’s and external auditors’ viewpoints and suggest radically different perspectives on how IAQ should be defined and controlled (via input, throughput, output or professional controls). Research limitations/implications – Empirically, the authors focus on rare research data. Theoretically, the authors delineate four previously undocumented competing frames of IAQ. Practical implications – Practically, the various governance actors involved in assessing IAQ can learn from the study that they should confront their views to better coordinate their quality control efforts. Originality/value – Highlighting the contrast between these frames is important because, so far, extant literature has predominantly focussed on only one perspective on IAQ, that of external auditors. The authors suggest that IAQ is more polysemous and complex than previously acknowledged, which justifies the qualitative and interpretive approach.


2015 ◽  
Vol 26 (2) ◽  
pp. 238-253 ◽  
Author(s):  
Ahren Johnston

Purpose – The purpose of this paper is to investigate the cost to a motor carrier of improving service in the intermodal market. The paper further seeks to validate the existence of two dimensions of service with differing impacts on costs. The physical capacity dimension is related to the traditional view that higher quality costs more, and the human performance dimension is related to the production management view that higher quality can actually save money by reducing the need for scrap and rework. Design/methodology/approach – To determine the cost of improved service, a translog cost function that included variables for each of the two dimensions of service quality was estimated. Because the data were centered prior to estimation, the first-order coefficients are interpretable as elasticities of cost with respect to quality. Findings – Results of the estimation show that improvements to the physical capacity dimension lead to higher costs, while improvements to the human performance dimension have no significant impact on costs. Research limitations/implications – The major limitation of this study is that it is restricted to a single carrier, and total costs were allocated according to transit time rather than to specific costs. Practical implications – Results of this study would help a carrier or other service provider determine which aspects of service to focus on in order to improve service with minimal impact on costs. Originality/value – The value of this paper lies in verifying the existence of two dimensions of service and estimating how they impact costs.
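
The following is a sketch of the estimation idea: fit a translog cost function by ordinary least squares on mean-centred logs, so that the first-order coefficients can be read directly as cost elasticities at the sample mean. The data are synthetic and the variable names (output, capacity quality, on-time quality) are stand-ins, not the carrier's actual measures.

```python
# Translog cost function on centred logs: first-order terms plus all squares and
# cross-products; centring makes the first-order coefficients elasticities at the mean.
import numpy as np

rng = np.random.default_rng(1)
n = 200
log_output   = rng.normal(size=n)          # e.g. log ton-miles
log_capacity = rng.normal(size=n)          # physical-capacity service dimension
log_ontime   = rng.normal(size=n)          # human-performance service dimension

# Synthetic "true" cost: capacity quality is costly, on-time performance is not.
log_cost = 0.9 * log_output + 0.3 * log_capacity + 0.0 * log_ontime + rng.normal(0, 0.1, n)

regressors = [log_output, log_capacity, log_ontime]
centered = [v - v.mean() for v in regressors]
y = log_cost - log_cost.mean()

cols = list(centered)
for i in range(len(centered)):
    for j in range(i, len(centered)):
        cols.append(centered[i] * centered[j])   # second-order translog terms
X = np.column_stack([np.ones(n)] + cols)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
elasticities = beta[1:1 + len(regressors)]       # readable directly because of centring
print("cost elasticities (output, capacity quality, on-time quality):",
      elasticities.round(3))
```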


2019 ◽  
Vol 32 (1) ◽  
pp. 114-135
Author(s):  
Andrea Garlatti ◽  
Paolo Fedele ◽  
Silvia Iacuzzi ◽  
Grazia Garlatti Costa

Purpose Coproduction is both a recurrent way of organizing public services and a maturing academic field. The academic debate has analyzed several of its facets, but one deserves further analysis: its impact on the cost efficiency of public services. The purpose of this paper is to systematize the findings on the relationship between coproduction and cost efficiency and to develop insights for future research. Design/methodology/approach This paper is based on a structured literature review (SLR), following the approach proposed by Massaro, Dumay and Guthrie. The SLR approach differs from traditional narrative reviews since, like other meta-analysis methods, it adopts a replicable and transparent process. At the same time, compared with most common meta-analysis or systematic review logics, it is better suited to incorporating evidence from case studies and ethnographies. This makes the method especially suited to public administration and management studies. Findings Results shed light on the nature of the academic literature relating coproduction to cost efficiency, on what types of costs are affected and how, and on the meaningfulness of productivity measures when public services are co-produced. Originality/value In times of fiscal distress for many governments, the paper contributes to research and practice by systematically re-assessing the effects of coproduction on public budgets.


2014 ◽  
Vol 114 (8) ◽  
pp. 1229-1245 ◽  
Author(s):  
Dilupa Nakandala ◽  
Henry Lau ◽  
Jingjing Zhang

Purpose – The purpose of this paper is to investigate the total cost function of an inventory system with a reorder point/order quantity policy in which the lead time is controllable based on the cost paid by the buyer for the service. Design/methodology/approach – Cost functions are presented to investigate how changes in lead time affect the different components of inventory cost in the presence of random demand. Two methods, an iterative technique and a simulated annealing (SA) algorithm, are presented to deal with the cost optimization problem. The application of the proposed model is illustrated using numerical case scenarios. Findings – The cost functions show that, besides ordering cost, the change in stochastic demand during the lead time is the major factor affecting the other cost components, such as holding and penalty costs. This finding is validated by the numerical study. Results also show that the performance of the SA algorithm is very similar to that of the iterative method, while the former is easier to apply. Practical implications – This paper develops less complex, more pragmatic methods that are easily adoptable by logistics managers for cost minimization. It also analyzes and highlights the unique characteristics and features of the two approaches, which can help practitioners make the right choice when faced with the identified logistics issue. Originality/value – This research explicitly investigates the impacts of changing lead time on inventory cost components, which enables informed decision making and inventory system planning for cost optimization by logistics practitioners. Two methodologies that can be used by practitioners without deep mathematical analysis, and that are cost-effective, are introduced to solve the optimization problem. Detailed roadmaps for implementing the proposed approaches are illustrated through different case scenarios.
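
As a hedged sketch of the optimization idea (a simplification, not the paper's exact cost function), the code below evaluates a textbook (Q, r) expected-cost expression with normally distributed lead-time demand plus a fee for shortening the lead time, and minimizes it with a plain simulated-annealing loop. All parameters and the lead-time fee schedule are invented.

```python
# (Q, r) expected cost with controllable lead time, minimised by simulated annealing.
# Ordering + holding + shortage-penalty costs plus a fee paid to shorten lead time.
import math
import random

D, A, h, p = 1200.0, 50.0, 2.0, 25.0     # annual demand, order cost, holding, penalty
sigma_year = 180.0                        # std dev of annual demand

def lead_time_fee(L):                     # $/year paid to achieve lead time L (weeks)
    return 400.0 / L                      # shorter lead time costs more

def expected_cost(Q, r, L):
    mu_L = D * L / 52.0                   # mean demand during lead time
    sig_L = sigma_year * math.sqrt(L / 52.0)
    z = (r - mu_L) / sig_L
    phi = math.exp(-z * z / 2) / math.sqrt(2 * math.pi)
    Phi = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    shortage = sig_L * (phi - z * (1 - Phi))          # expected units short per cycle
    return (D / Q) * A + h * (Q / 2 + r - mu_L) + (D / Q) * p * shortage + lead_time_fee(L)

random.seed(0)
state = (200.0, 120.0, 4.0)               # initial (Q, r, L)
best, best_cost = state, expected_cost(*state)
T = 100.0
while T > 0.01:
    Q, r, L = state
    cand = (max(10.0, Q + random.uniform(-20, 20)),
            max(1.0, r + random.uniform(-15, 15)),
            min(8.0, max(1.0, L + random.uniform(-1, 1))))
    delta = expected_cost(*cand) - expected_cost(*state)
    if delta < 0 or random.random() < math.exp(-delta / T):
        state = cand
        if expected_cost(*state) < best_cost:
            best, best_cost = state, expected_cost(*state)
    T *= 0.99                             # geometric cooling schedule

print("best (Q, r, L):", tuple(round(v, 1) for v in best), "cost:", round(best_cost, 1))
```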


Author(s):  
Rabab Hayek ◽  
Guillaume Raschia ◽  
Patrick Valduriez ◽  
Noureddine Mouaddib

Purpose The goal of this paper is to contribute to the development of both data localization and description techniques in P2P systems. Design/methodology/approach The approach consists of introducing a novel indexing technique that relies on linguistic data summarization into the context of P2P systems. Findings The cost model of the approach, as well as the simulation results, shows that the approach allows the efficient maintenance of data summaries without incurring high traffic overhead. In addition, the cost of query routing is significantly reduced when summaries are used. Research limitations/implications The paper considers a summary service defined on the APPA architecture. Future work should study extending this approach so that it is generally applicable to any P2P data management system. Practical implications This paper has mainly studied the quantitative gain in query processing that can be obtained from exploiting data summaries. Future work aims to implement the technique on real (rather than synthetic) data in order to study the qualitative gain that can be obtained from approximately answering a query. Originality/value The novelty of the approach lies in the double exploitation of summaries in P2P systems: data summaries allow for semantic-based query routing and also for approximate query answering, using their intensional descriptions.
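
A toy illustration of summary-based routing follows: each peer advertises a coarse, linguistic-style summary of the values it stores, and a query is forwarded only to the peers whose summaries could match it. The peers, descriptors and `summarize` mapping are hypothetical stand-ins, not the actual APPA summary service.

```python
# Toy summary-based query routing: forward a query only to peers whose coarse
# summaries are compatible with every query predicate.

# Peer summaries: attribute -> set of coarse descriptors covering the peer's data.
peer_summaries = {
    "peer_A": {"age": {"young", "middle-aged"}, "salary": {"low"}},
    "peer_B": {"age": {"old"}, "salary": {"medium", "high"}},
    "peer_C": {"age": {"young"}, "salary": {"high"}},
}

def summarize(value, attribute):
    """Map a crisp query value to the coarse descriptor used in summaries
    (a stand-in for the linguistic summarization step)."""
    if attribute == "age":
        return "young" if value < 35 else "middle-aged" if value < 60 else "old"
    if attribute == "salary":
        return "low" if value < 30_000 else "medium" if value < 80_000 else "high"
    raise KeyError(attribute)

def route(query):
    """Return the peers whose summaries are compatible with every query predicate."""
    wanted = {attr: summarize(val, attr) for attr, val in query.items()}
    return [peer for peer, summary in peer_summaries.items()
            if all(wanted[a] in summary.get(a, set()) for a in wanted)]

# A query for young, highly paid people is routed to peer_C only,
# instead of being flooded to every peer.
print(route({"age": 28, "salary": 95_000}))
```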

