DEA and Monte Carlo simulation approach towards green car selection

2017 ◽  
Vol 24 (5) ◽  
pp. 1234-1252 ◽  
Author(s):  
Anand Prakash ◽  
Rajendra P. Mohanty

Purpose Automakers manufacture both efficient and inefficient green cars. The purpose of this paper is to categorize green cars as efficient or inefficient and then to improve the efficiencies of the identified inefficient green cars for distribution fitting. Design/methodology/approach The authors have used the 2014 edition of secondary data published by the Automotive Research Centre of the Automobile Club of Southern California. The paper provides the methodology of applying data envelopment analysis (DEA) to 50 decision-making units (DMUs) of green cars with six input indices (emission, braking, ride quality, acceleration, turning circle, and luggage capacity) and two output indices (miles per gallon and torque), integrated with Monte Carlo simulation for drawing significant statistical inferences graphically. Findings The findings of this study showed that there are 27 efficient and 23 inefficient DMUs, along with an improvement matrix. Additionally, the study highlighted the best distribution fitting of the improved efficient green cars for the respective indices. Research limitations/implications This study is limited by the 2014 edition of the secondary data used in this research. Practical implications This study may be useful for motorists seeking an efficient listing of green cars, whereas automakers can benefit from the distribution fitting of improved efficient green cars using Monte Carlo simulation for calibration. Originality/value The paper uses DEA to empirically examine the classification of green cars and applies Monte Carlo simulation for distribution fitting to the improved efficient green cars to decide the appropriate range of their attributes for calibration.
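The DEA building block this abstract relies on can be sketched as a linear program. Below is a minimal, illustrative input-oriented CCR model solved with `scipy.optimize.linprog`; the toy data (one input, one output, four DMUs) and the solver choice are assumptions for illustration, not the study's actual 50-DMU, six-input, two-output dataset.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y, k):
    """Input-oriented CCR efficiency score of DMU k.
    X: (n_dmus, n_inputs) input matrix, Y: (n_dmus, n_outputs) output matrix."""
    n, m = X.shape
    s = Y.shape[1]
    # Decision vector: [theta, lambda_1, ..., lambda_n]; minimise theta.
    c = np.zeros(n + 1)
    c[0] = 1.0
    A_ub, b_ub = [], []
    for i in range(m):   # inputs: sum_j lambda_j * x_ij <= theta * x_ik
        A_ub.append(np.concatenate(([-X[k, i]], X[:, i])))
        b_ub.append(0.0)
    for r in range(s):   # outputs: sum_j lambda_j * y_rj >= y_rk
        A_ub.append(np.concatenate(([0.0], -Y[:, r])))
        b_ub.append(-Y[k, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]      # theta = 1 means the DMU is efficient

# Toy data: DMU 3 consumes twice the input of DMU 0 for the same output.
X = np.array([[2.0], [4.0], [8.0], [4.0]])
Y = np.array([[2.0], [4.0], [8.0], [2.0]])
scores = [dea_ccr_efficiency(X, Y, k) for k in range(4)]
```

An inefficient DMU's score tells how far its inputs must be radially contracted to reach the frontier, which is the "improvement matrix" idea in the findings.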

2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Abroon Qazi ◽  
Mecit Can Emre Simsekler

Purpose This paper aims to develop a process for prioritizing project risks that integrates the decision-maker's risk attitude, uncertainty about risks in terms of both the associated probability and impact ratings, and correlations across risk assessments. Design/methodology/approach This paper adopts a Monte Carlo simulation-based approach to capture the uncertainty associated with project risks. Risks are prioritized based on their relative expected utility values. The proposed process is operationalized through a real application in the construction industry. Findings The proposed process helped in identifying low-probability, high-impact risks that were overlooked in the conventional risk matrix-based prioritization scheme. While considering the expected risk exposure of individual risks, none of the risks were located in the high-risk exposure zone; however, the proposed Monte Carlo simulation-based approach revealed risks with a high probability of occurrence in the high-risk exposure zone. Using the expected utility-based approach alone in prioritizing risks may lead to ignoring a few critical risks, which can only be captured through a rigorous simulation-based approach. Originality/value Monte Carlo simulation has been used to aggregate the risk matrix-based data and to disaggregate and map the resulting risk profiles with underlying distributions. The proposed process supported risk prioritization based on the decision-maker's risk attitude and identified low-probability, high-impact risks and high-probability, high-impact risks.
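A minimal sketch of the simulation idea described above, assuming uniform uncertainty bands around each risk's probability and impact ratings and an exponential utility to encode the decision-maker's risk attitude; all figures are illustrative, not the paper's construction-industry data.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_risk(p_low, p_high, i_low, i_high, n=100_000, risk_aversion=2.0):
    """Propagate uncertainty in a risk's probability and impact ratings
    and return (mean exposure, mean utility-weighted exposure)."""
    p = rng.uniform(p_low, p_high, n)        # uncertain probability rating
    impact = rng.uniform(i_low, i_high, n)   # uncertain impact rating, scaled 0..1
    exposure = p * impact
    utility = 1.0 - np.exp(-risk_aversion * exposure)  # concave: risk-averse
    return exposure.mean(), utility.mean()

# A low-probability, high-impact risk vs a high-probability, low-impact one.
exp_a, util_a = simulate_risk(0.05, 0.15, 0.7, 0.9)
exp_b, util_b = simulate_risk(0.60, 0.80, 0.1, 0.2)
```

Ranking by the full simulated distribution, rather than a single point estimate on a risk matrix, is what lets the low-probability, high-impact risk surface.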


2021 ◽  
Vol 11 (3) ◽  
pp. 1-18
Author(s):  
Raj V. Amonkar ◽  
Tuhin Sengupta ◽  
Debasis Patnaik

Learning outcomes The learning outcomes of this paper are as follows: to understand the context of seaport logistics and supply chain design structure, to apply Monte Carlo simulation at the interface of the supply chain and to analyze the Monte Carlo simulation algorithm and statistical techniques for identifying the key seaport logistics factors. Case overview/synopsis It was 9:00 p.m. on November 10, 2020, and Nishadh Amonkar, the CEO of OCTO supply chain management (SCM), was glued to the television watching the final cricket match of the Indian Premier League, 2020. Amonkar's mobile phone rang; it was a call from Vinod Nair, a member of the Logistics Panel of the Ranji Industries Federation. Nair informed Amonkar that the call concerned the rejection of several export consignments of agricultural products from Ranji (in the western part of India). The rejection was due to the deterioration in the quality of the exported agricultural products during transit from Ranji to various locations in Europe. Complexity academic level This course is suitable at the MBA level for the following courses: Operations research (Focus/Session: Applications of Monte Carlo Simulation). SCM (Focus/Session: Global SCM, Logistics Planning, Distribution Network). Logistics management (Focus/Session: Transportation Planning). Business statistics (Focus/Session: Application of Hypothesis Testing). Supplementary materials Teaching Notes are available for educators only. Subject code CSS 9: Operations and logistics.


2018 ◽  
Vol 11 (5) ◽  
pp. 754-770 ◽  
Author(s):  
Cássio da Nóbrega Besarria ◽  
Nelson Leitão Paes ◽  
Marcelo Eduardo Alves Silva

Purpose Housing prices in Brazil have displayed an impressive growth in recent years, raising some concerns about the existence of a bubble in housing markets. In this paper, the authors implement an empirical methodology to identify whether or not there is a bubble in housing markets in Brazil. Design/methodology/approach The analysis is based on a theoretical model establishing that, in the absence of a bubble, a long-run equilibrium relationship should be observed between the market price of an asset and its dividends. The authors implement two methodologies. First, the authors assess whether there is a cointegration relationship between housing prices and housing rental prices. Second, the authors test whether the price-to-rent ratio is stationary. Findings The authors' results show that there is evidence of a bubble in housing prices in Brazil. However, given the short span of the data, the authors perform a Monte Carlo simulation and show that the cointegration tests may be biased in small samples. Therefore, the results should be assessed with caution. Research limitations/implications The results obtained from the cointegration analysis can be biased in small samples. Practical implications Information on the excessive increase of property prices relative to their fundamental value can help economic agents in their investment decision-making. Social implications These results corroborate the hypothesis that Brazil has experienced an excessive appreciation in housing prices and, as Silva and Besarria (2018) have suggested, this behavior explains, in part, the fact that the central bank has taken this issue into account when deciding on the stance of monetary policy in Brazil. Originality/value The originality is linked to the use of the Gregory-Hansen method of cointegration in the identification of bubbles and the discussion of the limitations of the research through Monte Carlo simulation.


2020 ◽  
Vol 27 (10) ◽  
pp. 3095-3113
Author(s):  
Lihui Zhang ◽  
Guyu Dai ◽  
Xin Zou ◽  
Jianxun Qi

Purpose Interrupting work continuity provides a way to improve some aspects of project performance, but unexpected and harmful interruptions may impede implementation. This paper aims to mitigate the negative impact caused by work continuity uncertainty based on the notion of robustness. Design/methodology/approach This paper develops a float-based robustness measurement method for work continuity uncertainty in repetitive projects. A multi-objective optimization model is formulated to generate a schedule that achieves a balance between crew numbers and robustness. This model is solved using two modules: an optimization module and a decision-making module. A Monte Carlo simulation is designed to validate the effectiveness of the generated schedule. Findings The results confirmed that it is necessary to consider robustness as an essential factor when scheduling a repetitive project under uncertainty. Project managers may develop a schedule that is subject to delays if they only make decisions according to the results of the deadline satisfaction problem. The Monte Carlo simulation validated that an appropriate way to measure robustness is conducive to generating a schedule that can avoid unnecessary delay, compared to the schedule generated by the traditional model. Originality/value Existing studies assume that work continuity is constant, but it cannot always be maintained under uncertainty. This paper regards work continuity as a new type of uncertainty factor and investigates how to mitigate its negative effects. The proposed float-based robustness measurement can quantify the ability of a schedule to absorb unpredictable and harmful interruptions, and the proposed multi-objective scheduling model provides a way to incorporate the uncertainty into a schedule.
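The Monte Carlo validation step can be sketched as follows. The triangular duration distributions, the activity counts and the float values are illustrative assumptions, not the paper's repetitive-project model; the idea is simply that a schedule is more robust when sampled delays are absorbed by floats rather than propagating as interruptions.

```python
import numpy as np

rng = np.random.default_rng(7)

def schedule_robustness(durations, floats, n=50_000):
    """Fraction of Monte Carlo runs in which every activity's delay is absorbed
    by its float, i.e. work continuity is never interrupted downstream."""
    sampled = rng.triangular(durations * 0.9, durations,
                             durations * 1.4, size=(n, len(durations)))
    delays = sampled - durations                  # positive = activity overran
    return (delays <= floats).all(axis=1).mean()

d = np.array([10.0, 10.0, 10.0])                  # planned activity durations
with_buffer = schedule_robustness(d, d * 0.5)     # schedule with generous floats
no_buffer = schedule_robustness(d, np.zeros(3))   # deadline-only schedule
```

Comparing the two numbers mirrors the paper's point: a schedule chosen purely for deadline satisfaction can look fine deterministically yet fail in almost every simulated run.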


2014 ◽  
Vol 12 (3) ◽  
pp. 307-315 ◽  
Author(s):  
Sekar Vinodh ◽  
Gopinath Rathod

Purpose – The purpose of this paper is to present an integrated technical and economic model to evaluate the reusability of products or components. Design/methodology/approach – Life cycle assessment (LCA) methodology is applied to obtain the product’s environmental performance. Monte Carlo simulation is utilized for enabling sustainable product design. Findings – The results show that the model is capable of assessing the potential reusability of used products, while the usage of simulation significantly increases the effectiveness of the model in addressing uncertainties. Research limitations/implications – The case study has been conducted in a single manufacturing organization. The implications derived from the study are found to be practical and useful to the organization. Practical implications – The paper reports a case study carried out for an Indian rotary switches manufacturing organization. Hence, the model is practically feasible. Originality/value – The article presents a study that investigates LCA and simulation as enablers of sustainable product design. Hence, the contributions of this article are original and valuable.
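A hedged sketch of how Monte Carlo simulation can support the reuse decision under uncertainty, in the spirit of the integrated technical-economic model above. The cost figures, distributions and the failure-penalty logic are invented for illustration and do not come from the rotary-switches case.

```python
import numpy as np

rng = np.random.default_rng(1)

def reuse_favourable_share(n=100_000, new_part_cost=100.0):
    """Share of simulated scenarios in which reusing a component beats buying
    new, given uncertain refurbishing cost and early-failure probability."""
    refurbish = rng.normal(40.0, 5.0, n)           # uncertain refurbishing cost
    failure_p = rng.beta(2, 18, n)                 # uncertain early-failure probability
    expected_cost = refurbish + failure_p * 120.0  # penalty if the reused part fails
    return (expected_cost < new_part_cost).mean()

share = reuse_favourable_share()
```

Sampling the uncertain inputs instead of fixing them at point estimates is what the abstract credits with "significantly increasing the effectiveness of the model in addressing uncertainties".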


2017 ◽  
Vol 23 (3) ◽  
pp. 537-554
Author(s):  
Anindya Chakrabarty ◽  
Zongwei Luo ◽  
Rameshwar Dubey ◽  
Shan Jiang

Purpose The purpose of this paper is to develop a theoretical model of a jump diffusion-mean reversion constant proportion portfolio insurance (CPPI) strategy in the presence of transaction costs and a stochastic floor, as opposed to the deterministic floor used in the previous literature. Design/methodology/approach The paper adopts Merton's jump diffusion (JD) model to simulate the price path followed by risky assets and the CIR mean reversion model to simulate the path followed by the short-term interest rate. The floor of the CPPI strategy is linked to the stochastic process driving the value of a fixed income instrument whose yield follows the CIR mean reversion model. The developed model is benchmarked against the CNX-NIFTY 50 and is back-tested during the extreme regimes in the Indian market using the scenario-based Monte Carlo simulation technique. Findings Back-testing the algorithm using Monte Carlo simulation across the crisis and recovery phases of the 2008 recession regime revealed that the portfolio performs better than the risky markets during the crisis by hedging the downside risk effectively and performs better than fixed income instruments during the growth phase by leveraging the upside potential. This makes it a value-enhancing proposition for risk-averse investors. Originality/value The study modifies the CPPI algorithm by redefining the floor of the algorithm as a stochastic mean reverting process guided by the movement of the short-term interest rate in the economy. This development is more relevant for two reasons: first, the short-term interest rate changes with time, and hence a constant yield during each rebalancing step is not practically feasible; second, the existing literature has revealed that the short-term interest rate tends to move opposite to the equity market. Thus, during a bear run the floor will increase at a higher rate, whereas its growth will stagnate during a bull phase, which helps the model capitalize on the upward potential during the growth phase and cut down on exposure during the crisis phase.
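A simplified sketch of the CPPI mechanics with a floor that accrues at a short rate. The paper's Merton jump-diffusion asset paths and CIR rate paths are replaced here by plain input arrays, and the multiplier, floor and return figures are illustrative assumptions.

```python
import numpy as np

def cppi_path(risky_returns, floor0, short_rates, multiplier=3.0, v0=100.0):
    """Constant proportion portfolio insurance with a floor that grows at the
    (possibly stochastic) short rate. Risky exposure = multiplier * cushion."""
    value, floor = v0, floor0
    for ret, r in zip(risky_returns, short_rates):
        cushion = max(value - floor, 0.0)
        risky = min(multiplier * cushion, value)   # cap exposure: no leverage
        value = risky * (1.0 + ret) + (value - risky) * (1.0 + r)
        floor *= (1.0 + r)                         # floor accrues at the short rate
    return value, floor

# Sustained 10% per-step losses: the cushion shrinks but the floor is preserved.
crash_value, crash_floor = cppi_path([-0.10] * 50, 80.0, [0.0005] * 50)
rally_value, _ = cppi_path([0.02] * 50, 80.0, [0.0005] * 50)
```

Feeding in rate paths that rise in bear markets and stagnate in bull markets, as the abstract describes, is what makes the stochastic floor tighten protection in a crisis and relax it in a rally.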


2014 ◽  
Vol 9 (4) ◽  
pp. 505-519 ◽  
Author(s):  
Dilip Kumar

Purpose – The purpose of this paper is to test the efficient market hypothesis for major Indian sectoral indices by means of a long memory approach in both the time domain and the frequency domain. This paper also tests the accuracy of the detrended fluctuation analysis (DFA) approach and the local Whittle (LW) approach by means of Monte Carlo simulation experiments. Design/methodology/approach – The author applies the DFA approach for the computation of the scaling exponent in the time domain. The robustness of the results is tested by the computation of the scaling exponent in the frequency domain by means of the LW estimator. The author applies a moving sub-sample approach on DFA to study the evolution of market efficiency in Indian sectoral indices. Findings – The Monte Carlo simulation experiments indicate that the DFA approach and the LW approach provide good estimates of the scaling exponent as the sample size increases. The author also finds that the efficiency characteristics of Indian sectoral indices and their stages of development are dynamic in nature. Originality/value – This paper has both methodological and empirical originality. On the methodological side, the author tests the small-sample properties of the DFA and LW approaches by using simulated series of fractional Gaussian noise and finds that both approaches possess superior properties in terms of capturing the scaling behavior of asset prices. On the empirical side, the author studies the evolution of long-range dependence characteristics in Indian sectoral indices.
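The DFA scaling exponent at the heart of this abstract can be sketched as follows, assuming the simple linear-detrending variant (DFA-1); the window scales and series lengths are illustrative choices, not the author's settings.

```python
import numpy as np

rng = np.random.default_rng(5)

def dfa_exponent(x, scales=(16, 32, 64, 128, 256)):
    """DFA-1 scaling exponent: slope of log F(s) against log s.
    ~0.5 for white noise (consistent with market efficiency),
    above 0.5 for persistent, long-memory series."""
    y = np.cumsum(x - x.mean())                      # integrated profile
    fluct = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        ms = [np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2)
              for seg in segs]                       # linear detrend per window
        fluct.append(np.sqrt(np.mean(ms)))
    return np.polyfit(np.log(scales), np.log(fluct), 1)[0]

alpha_noise = dfa_exponent(rng.normal(size=8192))            # i.i.d. "returns"
alpha_walk = dfa_exponent(np.cumsum(rng.normal(size=8192)))  # strongly persistent
```

Running this estimator on many simulated series of known exponent, at increasing sample sizes, is the shape of the Monte Carlo accuracy check the paper performs.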


2020 ◽  
Vol 15 (4) ◽  
pp. 1277-1300
Author(s):  
Ignacio Contreras

Purpose Data envelopment analysis (DEA) is a mathematical method for the evaluation of the relative efficiency of a set of alternatives, each of which produces multiple outputs by consuming multiple inputs. Each unit is evaluated on the basis of the ratio of weighted output over weighted input with a free selection of weights and is allowed to select its own weighting scheme for both inputs and outputs so that its individual evaluation is optimized. However, several situations can be found in which this variability between weighting profiles is unsuitable. In those cases, it seems more appropriate to consider a common vector of weights. The purpose of this paper is to provide a systematic review of the existing literature on procedures to determine a common set of weights (CSW) in the DEA context. The contributions are classified with respect to the methodology and to the main aim of the procedure. The discussion and findings of this paper provide insights into future research on the topic. Design/methodology/approach This paper includes a systematic review of the existing literature on procedures to determine a CSW in the DEA context. The contributions are classified with respect to the methodology and to the main aim of the procedure. Findings The discussion and findings of the literature review may provide insights into future research on the topic. Originality/value This paper reviews the state of the art on models with CSW in the DEA methodology and proposes a systematic classification of the contributions with respect to several criteria. The paper should be useful for both theoretical and practical future research on the topic.
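One common-set-of-weights formulation can be written as a single linear program: choose one weight vector for all DMUs that maximizes total virtual output, subject to every DMU's ratio staying at or below one. The sketch below is just one of the many CSW variants such a review covers, with toy data and `scipy.optimize.linprog` as assumptions for illustration.

```python
import numpy as np
from scipy.optimize import linprog

def common_weights_efficiency(X, Y, eps=1e-6):
    """Common set of weights: a single (u, v) for all DMUs, maximising total
    virtual output subject to u.y_j <= v.x_j for every DMU j, with a
    normalisation v.(sum of inputs) = 1 and epsilon lower bounds."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([-Y.sum(axis=0), np.zeros(m)])  # maximise sum_j u.y_j
    A_ub = np.hstack([Y, -X])                          # u.y_j - v.x_j <= 0
    A_eq = np.concatenate([np.zeros(s), X.sum(axis=0)])[None, :]
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n), A_eq=A_eq, b_eq=[1.0],
                  bounds=[(eps, None)] * (s + m), method="highs")
    u, v = res.x[:s], res.x[s:]
    return (Y @ u) / (X @ v)                           # common-weight efficiencies

X = np.array([[2.0], [4.0], [8.0], [4.0]])
Y = np.array([[2.0], [4.0], [8.0], [2.0]])
eff = common_weights_efficiency(X, Y)
```

Unlike classical DEA, every DMU is now scored on the same scale, which is exactly what makes the resulting efficiencies directly comparable and rankable.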


Author(s):  
Viviana Elizabeth Zárate-Mirón ◽  
Rosina Moreno Serrano

Purpose This paper aims to evaluate whether the integration of smart specialization strategies (S3) into clusters significantly impacts their efficiency in countries that do not yet implement this policy. This study tests three effects: whether the kind of policies envisaged through an S3 strategy impacts a cluster's efficiency; whether this impact changes with the technological intensity of the clusters; and which S3 is most suitable for sub-clusters at different levels of technological intensity. Design/methodology/approach The Mexican economy is taken as the case study because its industries are properly classified according to Porter's cluster definition, yet the country has not adopted the S3 policy. Through data envelopment analysis (DEA), this study evaluates the increment in cluster efficiency when variables representing the S3 elements are included. Findings The results show that strategies following the S3 had a significant impact in all clusters, but when clusters were classified by technological intensity, the impact on efficiency was higher in clusters in the medium low-tech group. Practical implications According to the DEA results, it can be concluded that these S3 strategies have the potential to increase the clusters' productivity significantly. These results support the adoption of the S3 policy by countries that already have a proper cluster classification. Originality/value These findings address the lack of studies analyzing the joint implementation of S3 and clusters.

