cost curve
Recently Published Documents


TOTAL DOCUMENTS

243
(FIVE YEARS 63)

H-INDEX

18
(FIVE YEARS 3)

Healthcare ◽  
2021 ◽  
Vol 9 (12) ◽  
pp. 1753
Author(s):  
Brad Beauvais ◽  
Clemens Scott Kruse ◽  
Lawrence Fulton ◽  
Matthew Brooks ◽  
Michael Mileski ◽  
...  

Background/Purpose: The purpose of this research is to determine if the tradeoffs that Kissick proposed among cost containment, quality, and access remain as rigidly interconnected as originally conceived in the contemporary health care context. Although many have relied on the Kissick model to advocate for health policy decisions, to our knowledge the model has never been empirically tested. Some have called for policy makers to come to terms with the premise of the Kissick model tradeoffs, while others have questioned the model, given the proliferation of quality-enhancing initiatives, automation, and information technology in the health care industry. One wonders whether these evolutionary changes alter or disrupt the original Kissick paradigms themselves. Methods: Structural equation modeling (SEM) was used to evaluate the hypothesized Kissick relationships among the unobserved constructs of cost, quality, and access in hospitals for the year 2018. Hospital data were obtained from Definitive Healthcare, a subscription site that contains Medicare data as well as non-Medicare data for networks, hospitals, and clinics (final n = 2766). Results: Reporting significant net effects as defined by our chosen study variables, we find that as quality increases, costs increase; as access increases, quality increases; and as access increases, costs increase. Policy and Practice Implications: Our findings lend continued relevance to a balanced approach to health care policy reform efforts. Simultaneously bending the health care cost curve, increasing access to care, and advancing quality of care is as challenging now as it was when the Kissick model was originally conceived.
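The sign pattern the abstract reports can be illustrated with a toy path analysis. This is not the authors' SEM or their Definitive Healthcare data: the synthetic data-generating coefficients below are assumptions chosen only to mirror the reported directions (access → quality, access → cost, quality → cost, all positive).

```python
import numpy as np

# Toy path-analysis sketch of the Kissick "iron triangle" relationships.
# The coefficients 0.5, 0.4, 0.3 are assumptions, not study estimates.
rng = np.random.default_rng(0)
n = 2766                                  # sample size matching the study's n

access = rng.normal(size=n)
quality = 0.5 * access + rng.normal(size=n)               # access -> quality (+)
cost = 0.4 * quality + 0.3 * access + rng.normal(size=n)  # quality, access -> cost (+)

def ols(y, xs):
    """Least-squares slope estimates with an intercept column."""
    X = np.column_stack([np.ones(len(y)), *xs])
    return np.linalg.lstsq(X, y, rcond=None)[0][1:]

b_quality = ols(quality, [access])        # path: access -> quality
b_cost = ols(cost, [quality, access])     # paths: quality -> cost, access -> cost
print(b_quality, b_cost)                  # all estimated paths come out positive
```

A full SEM would additionally model cost, quality, and access as latent constructs with multiple indicators; the sketch only shows how the three positive path estimates fit together.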




Author(s):  
Erik Wackers ◽  
Niek Stadhouders ◽  
Anthony Heil ◽  
Gert Westert ◽  
Simone van Dulmen ◽  
...  

Background: A lack of knowledge exists on real-world hospital strategies that seek to improve quality while reducing or containing costs. The aim of this study is to identify hospitals that have implemented such strategies and to determine factors influencing their implementation. Methods: We searched PubMed, EMBASE, Web of Science, the Cochrane Library, and EconLit for case studies on hospital-wide strategies aiming to increase quality and reduce costs. Additionally, grey literature databases, Google, and selected websites were searched. We used inductive coding to identify factors relating to the implementation of the strategies. Results: The literature search identified 4198 papers, of which 17 were included, describing 19 case studies from five countries, mostly the US. To accomplish their goals, hospitals use different management strategies, such as continuous quality improvement, clinical pathways, Lean, Six Sigma, and value-based healthcare. Reported effects on both quality and costs are predominantly positive. Factors identified as relevant for implementation were categorized into eleven themes: 1) strategy, 2) leadership, 3) engagement, 4) reorganization, 5) finances, 6) data and information technology (IT), 7) projects, 8) support, 9) skill development, 10) culture, and 11) communication. Recurring barriers to implementation are a lack of physician engagement, insufficient financial support, and poor data collection. Conclusion: Hospital strategies that explicitly aim to provide high-quality care at low costs may be a promising option for bending the cost curve while improving quality. We found only a limited number of studies, with varying contexts across the case studies. This underlines the importance of integrated evaluation research. When implementing a quality-enhancing, cost-reducing strategy, we recommend considering the eleven conditions for successful implementation that we derived from the literature.


2021 ◽  
Vol 24 (2) ◽  
pp. 27-30
Author(s):  
Mite Tomov ◽  
Cvetanka Velkoska ◽  
...  

This paper presents four approaches to the graphic interpretation of quality cost structure definition models: the classical, modern, modified, and visionary approaches. These give rise to theoretical graphic quality cost models and illustrate the relationships among the quality cost categories, as well as between the quality cost categories, the total quality costs, and the quality level. The paper comparatively analyzes the underlying assumptions, existing knowledge, and principles characteristic of each approach, which contributes to the shaping of the quality cost category curves and the overall quality cost curve in the theoretical models. The analysis enables forecasting of the development trend of the theoretical graphic models and identification of potential stakeholders that contribute to changes in the structure and behavior of the quality cost categories, and thus in the behavior of the overall quality costs.
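The classical model described above can be sketched numerically: prevention and appraisal costs rise steeply as the quality (conformance) level approaches 100%, failure costs fall with quality, and the total quality cost curve has an interior minimum. The functional forms and constants below are illustrative assumptions, not the paper's curves.

```python
import numpy as np

# Illustrative sketch of the classical quality-cost model; all functional
# forms here are assumptions chosen to reproduce its qualitative shape.
q = np.linspace(0.01, 0.99, 99)          # quality (conformance) level

prevention_appraisal = 1.0 / (1.0 - q)   # rises steeply as q -> 1 (classical view)
failure = 1.0 / q                        # internal/external failure costs fall with q
total = prevention_appraisal + failure   # overall quality cost curve

q_opt = q[np.argmin(total)]              # classical model: interior optimum below 100%
print(f"classical optimum near q = {q_opt:.2f}")
```

In the modern approach, by contrast, prevention and appraisal costs stay bounded (e.g., through automation), which pushes the cost-optimal quality level toward 100% conformance.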


2021 ◽  
Author(s):  
Carlos D Santos ◽  
Luís F Costa ◽  
Paulo B Brito

Abstract: Markup cyclicality has been central for debating policy effectiveness and understanding business-cycle fluctuations. However, measuring the cyclicality of markups is as important as understanding the microeconomic mechanisms underlying that cyclicality. The latter requires measurement of firm-level markups and separating supply from demand shocks. We construct a novel dataset with detailed (multi-)product-level prices for individual firms. By estimating a structural model of supply and demand, we evaluate how companies adjust prices and marginal costs as a response to shocks. We find that price markups respond positively to supply shocks and negatively to demand shocks. The mechanism explaining the observed markup behaviour is the same for both shocks: incomplete pass-through of changes along the marginal-cost curve to price adjustments. These observed price and output responses are consistent with dynamic demand considerations. Finally, we use our estimated shocks to show how aggregate markup fluctuations in the sample period are mostly explained by aggregate demand shocks.
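The incomplete pass-through mechanism admits a back-of-the-envelope illustration: if price is markup times marginal cost, then in logs the markup change equals the price change minus the cost change. The pass-through rate and shock size below are assumed values, not estimates from the paper.

```python
# Toy illustration of incomplete pass-through (all numbers are assumptions).
# Price p = markup m * marginal cost c, so log m = log p - log c. With a
# pass-through rate rho < 1, a 1% change in marginal cost moves price by
# only rho percent.

rho = 0.6                 # assumed pass-through rate
dlog_c = -0.10            # favourable supply shock: marginal cost falls 10%

dlog_p = rho * dlog_c     # price falls by only 6%
dlog_m = dlog_p - dlog_c  # markup change in logs

print(f"price change: {dlog_p:+.0%}, markup change: {dlog_m:+.0%}")
```

The same algebra gives the demand-shock result: a demand shock that pushes price up along the marginal-cost curve with incomplete pass-through moves price less than marginal cost, so the markup falls.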


2021 ◽  
Author(s):  
Christian Seifert

<p>With the increasing connectivity of and reliance on computers and networks, important aspects of computer systems are under constant threat. In particular, drive-by-download attacks have emerged as a new threat to the integrity of computer systems. Drive-by-download attacks are client-side attacks that originate from web servers visited by web browsers. As a vulnerable web browser retrieves a malicious web page, the malicious web server can push malware to a user's machine, where it can be executed without the user's notice or consent. Detecting the malicious web pages that exist on the Internet is prohibitively expensive: an estimated 150 million web pages that launch drive-by-download attacks exist today. So-called high-interaction client honeypots are devices that can detect these malicious web pages, but they are slow and known to miss attacks. Detecting malicious web pages in these quantities with client honeypots would cost millions of US dollars. Therefore, we have designed a more scalable system called a hybrid client honeypot. It consists of lightweight client honeypots, the so-called low-interaction client honeypots, and traditional high-interaction client honeypots. The lightweight low-interaction client honeypots inspect web pages at high speed and forward only likely malicious web pages to the high-interaction client honeypot for a final classification. For the comparison of client honeypots and the evaluation of the hybrid client honeypot system, we have chosen a cost-based evaluation method: the true positive cost curve (TPCC). It allows us to evaluate client honeypots against their primary purpose, the identification of malicious web pages. We show that the cost of identifying malicious web pages with the developed hybrid client honeypot system is reduced by a factor of nine compared to traditional high-interaction client honeypots.
The five main contributions of our work are:
1. High-Interaction Client Honeypot - The first main contribution is the design and implementation of a high-interaction client honeypot, Capture-HPC. It is an open-source, publicly available client honeypot research platform that allows researchers and security professionals to conduct research on malicious web pages and client honeypots. Based on our client honeypot implementation and an analysis of existing client honeypots, we developed a component model of client honeypots. This model allows researchers to agree on the object of study, allows focus on specific areas within it, and provides a framework for communicating client honeypot research.
2. True Positive Cost Curve - As mentioned above, we have chosen a cost-based evaluation method, the true positive cost curve, to compare and evaluate client honeypots against their primary purpose of identifying malicious web pages. It takes into account the unique characteristics of client honeypots (speed, detection accuracy, and resource cost) and provides a simple, cost-based mechanism to evaluate and compare client honeypots in an operating environment. As such, the TPCC provides a foundation for improving client honeypot technology. The TPCC is the second main contribution of our work.
3. Mitigation of Risks to the Experimental Design with HAZOP - Mitigation of risks to internal and external validity in the experimental design using a hazard and operability (HAZOP) study is the third main contribution. This methodology addresses risks to intent (internal validity) as well as generalizability of results beyond the experimental setting (external validity) in a systematic and thorough manner.
4. Low-Interaction Client Honeypots - Malicious web pages are usually part of a malware distribution network consisting of several servers involved in the drive-by-download attack. The development and evaluation of classification methods that assess whether a web page is part of a malware distribution network is the fourth main contribution.
5. Hybrid Client Honeypot System - The fifth main contribution is the hybrid client honeypot system. It incorporates the mentioned classification methods, in the form of a low-interaction client honeypot, together with a high-interaction client honeypot into a hybrid client honeypot system capable of identifying malicious web pages cost-effectively on a large scale. The hybrid client honeypot system outperforms a high-interaction client honeypot with identical resources and an identical false positive rate.</p>
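The economics behind the hybrid design can be sketched with a simple cost-per-detection model in the spirit of the TPCC. Every rate and per-page cost below is an invented illustrative assumption, not a measurement from the thesis; the point is only that a cheap pre-filter that forwards a small fraction of pages slashes the cost of each true positive.

```python
# Illustrative cost model for hybrid vs. pure high-interaction client
# honeypots; all rates and per-page costs are assumptions for the sketch.

def cost_per_true_positive(pages, malicious_rate, cost_per_page, recall):
    """Total operating cost divided by the number of malicious pages found."""
    found = pages * malicious_rate * recall
    return pages * cost_per_page / found

PAGES = 1_000_000
MAL_RATE = 0.001            # assumed fraction of pages that are malicious

# Pure high-interaction honeypot: slow, so each page costs more to inspect.
hi_only = cost_per_true_positive(PAGES, MAL_RATE, cost_per_page=0.09, recall=0.9)

# Hybrid: a cheap low-interaction filter scans everything and forwards
# ~2% of pages to the expensive high-interaction honeypot.
forward = 0.02
hybrid_total = PAGES * 0.005 + PAGES * forward * 0.09
hybrid = hybrid_total / (PAGES * MAL_RATE * 0.85)   # assume slightly lower recall

print(f"cost per detection: hi-only ${hi_only:.0f}, hybrid ${hybrid:.0f}")
```

The TPCC itself plots this cost per true positive across operating conditions, which is what lets the thesis compare honeypot configurations on a common axis.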




2021 ◽  
Vol 894 (1) ◽  
pp. 012011
Author(s):  
Z D Nurfajrin ◽  
B Satiyawira

Abstract: The Indonesian government has followed up the Paris Agreement with Law No. 16 of 2016 by setting an ambitious emission reduction target of 29% by 2030, a figure that could increase to 41% with international assistance. In line with this, mitigation efforts are carried out in the energy sector. The energy sector in particular can have a significant impact compared to other sectors, because rising energy demand, rapid economic growth, and rising living standards will push emission growth in the sector up to 6.7% per year. The bottom-up AIM/end-use energy model can select the technologies in the energy sector that are optimal in reducing emissions and costs as a long-term strategy for developing national low-carbon technology. This model can use the Marginal Abatement Cost (MAC) approach to evaluate the potential for GHG emission reductions by adding a certain amount of cost for each selected technology in the target year, compared to the reference technology in the baseline scenario. In this study, three scenarios were used as mitigation actions, namely CM1, CM2, and CM3. Using the abatement cost curve tools with an assumed optimum tax value of 100 USD/ton CO2eq, the highest GHG emission reduction potential is found in the CM3 scenario, which has the most significant reduction potential, while its mitigation costs are not much different from those of the other scenarios. For example, PLTU – supercritical can reduce a significant 37.39 Mtoe CO2eq of GHG at an emission reduction cost of -23.66 $/Mtoe CO2eq.
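A marginal abatement cost curve of the kind used above is built by ranking measures by cost per tonne abated, accumulating their abatement potential, and selecting every measure cheaper than the carbon-price threshold. A minimal sketch follows; the measures and figures are invented for illustration and are not taken from the study (only the 100 USD/t threshold mirrors its assumption).

```python
# Minimal marginal abatement cost curve (MACC) sketch; the measures and
# their costs/potentials below are invented for illustration only.

measures = [                   # (name, cost in USD/t CO2eq, abatement potential in Mt)
    ("efficiency retrofit", -24.0, 37.0),   # negative cost: saves money
    ("fuel switching",       15.0, 20.0),
    ("renewables",           60.0, 45.0),
    ("CCS",                 140.0, 30.0),
]

TAX = 100.0                    # assumed carbon price threshold, USD/t CO2eq

# Sort by marginal cost, then keep every measure cheaper than the tax.
curve = sorted(measures, key=lambda m: m[1])
selected = [m for m in curve if m[1] <= TAX]

cumulative = 0.0
for name, cost, potential in selected:
    cumulative += potential
    print(f"{name:20s} {cost:7.1f} $/t   cumulative {cumulative:5.1f} Mt")
```

Plotting abatement potential on the x-axis against marginal cost on the y-axis in this sorted order gives the familiar staircase-shaped MACC; everything left of the tax line is worth doing at that carbon price.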


Minerals ◽  
2021 ◽  
Vol 11 (11) ◽  
pp. 1170
Author(s):  
Sitong Ren ◽  
Yang Liu ◽  
Gaofeng Ren

China has committed to peaking its carbon emissions by 2030, which raises a new issue for underground metal mines: selecting a cleaner mining method that requires less energy and generates fewer carbon emissions. This paper proposes an enterprise-level model to estimate life-cycle energy consumption and carbon emissions, which takes more carbon sources (e.g., cement and carbon sink loss) into consideration to provide more comprehensive insights. Moreover, this model is integrated with the energy-conservation supply curve and the carbon abatement cost curve to involve production capacity utilization in the prediction of future performance. These two approaches are applied to 30 underground iron mines. The results show that (1) caving-based cases have lower energy consumption and carbon emissions, i.e., 673.64 GJ/kt ore, 52.21 GJ/kt ore (only considering electricity and fossil fuel), and 12.11 t CO2 eq/kt ore, compared with the backfilling-based cases, i.e., 710.08 GJ/kt ore, 63.70 GJ/kt ore, and 40.50 t CO2 eq/kt ore; (2) caving-based cases present higher carbon-abatement potential (more than 12.95%) than the backfilling-based cases (less than 9.68%); (3) improving capacity utilization facilitates unit cost reduction to mitigate energy consumption and carbon emissions, and the energy-conservation and carbon-abatement potentials will develop accordingly.


2021 ◽  
Author(s):  
Sam Jones ◽  
Adam Joyce ◽  
Nikhil Balasubramanian

Abstract: Objectives/Scope There are many different views on the Energy Transition. What is agreed is that to achieve current climate change targets, the journey to deep decarbonisation must start now. Scope 3 emissions are clearly the major contributor to total emissions and must be actively reduced. However, if oil and gas extraction is to continue, then operators must understand, measure, and reduce Scope 1 and 2 emissions. This paper examines the constituent parts of typical Scope 1 emissions for O&G assets and discusses a credible pathway and initial steps towards decarbonisation of operations. Methods, Procedures, Process Emissions from typical assets are investigated: data is examined to determine the overall and individual contributions of Scope 1 emissions. A three-tiered approach to emissions savings is presented: reduce overall energy usage; remove environmental losses; replace energy supply with low-carbon alternatives. A simple method used to assess carbon emissions, based on the abatement cost per tonne of CO2 averted, is described. This method, the Marginal Abatement Cost Curve (MACC), is based solely on cost efficiency; other criteria such as safety, weight, footprint, and reliability are not considered. A credible pathway for the reduction of Scope 1 emissions is presented. By taking the actions described in the pathway, emission contributors are eliminated in a strategic order, allowing operators to contribute to deep decarbonisation. Results, Observations, Conclusions A typical offshore installation was modelled with a number of carbon abatement measures implemented. Results are presented as cost-effective or non-cost-effective CO2 measures together with the residual CO2 emissions. Based on the data presented, many of the replace measures have a higher cost per tonne of CO2 abated than the reduce and remove measures.
These findings indicate that additional technological advancement may be needed to make alternative power solutions commercially viable. They also indicate that several CO2 abatement measures are cost-effective today. The pathway proposes actions offshore operators can take to implement carbon savings; it differentiates actions which can be taken today from those which require further technological advancement before they become commercially viable. The intent of this pathway is to demonstrate that the energy transition is not solely the preserve of the largest operators: every company can take positive steps towards supporting decarbonisation. Novel/Additive Information The world needs security of energy supply. Hydrocarbons are still integral; however, oil and gas operators must contribute to carbon reduction for society to meet the energy transition challenges. As government and societal appetite for decarbonisation heightens, demands are growing for traditional hydrocarbon assets to reduce their carbon footprint if they are to remain part of the energy mix. Society, and therefore regulators, will demand that more is done to address emissions during this transitional phase, consequently necessitating that direct emissions are reduced as much as possible. The pathway is accessible to all today; we need not wait for novel technologies to act.

