A cost-benefit analysis of GPU-based EC2 instances for a deep learning algorithm

2019 ◽  
Author(s):  
Eva Malta ◽  
Charles Rodamilans ◽  
Sandra Avila ◽  
Edson Borin

This paper analyzes the cost-benefit of using EC2 instances, specifically the p2 and p3 virtual machine types, which have GPU accelerators, to execute a machine learning algorithm. The analysis includes the runtime of convolutional neural network executions and takes into consideration the time necessary to stabilize the accuracy value with different batch sizes. We also measure the cost of using each machine type and define a relation between this cost and the execution time for each virtual machine. The results show that, although the price per hour of the p3 instance is three times higher, it is faster and costs almost the same as the p2 instance type to train the deep learning algorithm.
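Below is a minimal Python sketch of the kind of cost comparison the paper describes: total training cost as price per hour multiplied by the time needed to reach a stable accuracy. The hourly prices and training times are illustrative assumptions, not figures from the paper.

```python
# Sketch of the instance cost comparison; prices and runtimes are assumed
# placeholders, not values reported in the paper.

instances = {
    # instance type: (price_per_hour_usd, hours_to_reach_stable_accuracy)
    "p2.xlarge": (0.90, 30.0),   # assumed for illustration
    "p3.2xlarge": (3.06, 9.0),   # assumed for illustration
}

for name, (price_per_hour, hours) in instances.items():
    total_cost = price_per_hour * hours
    print(f"{name}: {hours:.1f} h at ${price_per_hour}/h -> ${total_cost:.2f}")

# Even though the p3 price per hour is roughly three times higher, its shorter
# training time can make the total training cost comparable to the p2.
```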

2007 ◽  
pp. 70-84 ◽  
Author(s):  
E. Demidova

This article analyzes definitions and the role of hostile takeovers in the Russian and European markets for corporate control. It develops a methodology for assessing the efficiency of anti-takeover defenses, adapted to the conditions of the Russian market. The paper uses cost-benefit analysis, in which the costs and benefits of pre-bid and post-bid defenses are compared.
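As a rough illustration of the comparison described above, the following Python sketch tallies assumed costs and benefits for hypothetical pre-bid and post-bid defenses; the defense names and figures are placeholders, not data from the article.

```python
# Hypothetical pre-bid vs post-bid comparison: each defense gets an assumed
# (cost, benefit) pair and the net effect per stage is compared.

defenses = {
    "pre_bid": {"staggered_board": (2.0, 5.0), "charter_amendment": (1.5, 4.0)},
    "post_bid": {"litigation": (3.0, 3.5), "white_knight_search": (4.0, 4.5)},
}

for stage, items in defenses.items():
    cost = sum(c for c, _ in items.values())
    benefit = sum(b for _, b in items.values())
    print(f"{stage}: cost={cost:.1f}, benefit={benefit:.1f}, net={benefit - cost:.1f}")
```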


1999 ◽  
Vol 40 (10) ◽  
pp. 153-159 ◽  
Author(s):  
D. H. Newsome ◽  
C. D. Stephen

Many countries are investing in measures to improve surface water quality, but the investment programmes for so doing are increasingly becoming subject to cost-benefit analysis. Whilst the cost of control measures can usually be determined for individual improvement schemes, there are currently no established procedures for valuing the benefits attributable to improved surface water quality. The paper describes a methodology, derived by the authors, that now makes this possible.


Animals ◽  
2021 ◽  
Vol 11 (5) ◽  
pp. 1297 ◽  
Author(s):  
Juntae Kim ◽  
Hyo-Dong Han ◽  
Wang Yeol Lee ◽  
Collins Wakholi ◽  
Jayoung Lee ◽  
...  

Currently, the pork industry is incorporating in-line automation with the aim of increasing slaughtered pork carcass throughput while monitoring quality and safety. In Korea, 21 parameters (such as back-fat thickness and carcass weight) are used for quality grading of pork carcasses. Recently, the VCS2000 system, an automatic meat yield grading machine, was introduced to enhance grading efficiency and thereby increase pork carcass production. The VCS2000 system predicts pork carcass yield based on image analysis. This study conducted an economic analysis of the system using a cost-benefit analysis. The items considered in the cost-benefit analysis were net present value (NPV), internal rate of return (IRR), and the benefit/cost ratio (B/C ratio), and each was verified through sensitivity analysis. For our analysis, the benefits were grouped into three categories: reduced labor costs, improved meat yield production, and reduced pig feed consumption through optimization. The cost-benefit analysis of the system resulted in an NPV of approximately 615.6 million Korean won, an IRR of 13.52%, and a B/C ratio of 1.65.
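The following Python sketch illustrates how NPV, IRR, and the B/C ratio can be computed from a stream of costs and benefits. The cash flows and discount rate are assumed for illustration; only the final figures quoted in the abstract come from the study.

```python
# Minimal NPV / IRR / benefit-cost sketch with assumed cash flows.

def npv(rate, cash_flows):
    """Net present value of cash flows indexed by year (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=1.0, tol=1e-6):
    """Internal rate of return via bisection (assumes a single sign change)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cash_flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def benefit_cost_ratio(rate, benefits, costs):
    """Ratio of discounted benefits to discounted (positive) costs."""
    return npv(rate, benefits) / npv(rate, costs)

# Hypothetical project: initial investment followed by yearly net benefits.
costs = [-500.0] + [-20.0] * 10      # million KRW, assumed
benefits = [0.0] + [120.0] * 10      # million KRW, assumed
net = [b + c for b, c in zip(benefits, costs)]

rate = 0.05                          # assumed discount rate
print("NPV:", round(npv(rate, net), 1))
print("IRR:", round(irr(net), 4))
print("B/C:", round(benefit_cost_ratio(rate, benefits, [-c for c in costs]), 2))
```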


2009 ◽  
Vol 68 (10) ◽  
pp. 2479-2484 ◽  
Author(s):  
Jean-Charles Hourcade ◽  
Philippe Ambrosi ◽  
Patrice Dumas

2004 ◽  
Vol 61 (7) ◽  
pp. 1269-1284 ◽  
Author(s):  
RIC Chris Francis ◽  
Steven E Campana

In 1985, Boehlert (Fish. Bull. 83: 103–117) suggested that fish age could be estimated from otolith measurements. Since that time, a number of inferential techniques have been proposed and tested in a range of species. A review of these techniques shows that all are subject to at least one of four types of bias. In addition, they all focus on assigning ages to individual fish, whereas the estimation of population parameters (particularly proportions at age) is usually the goal. We propose a new flexible method of inference based on mixture analysis, which avoids these biases and makes better use of the data. We argue that the most appropriate technique for evaluating the performance of these methods is a cost–benefit analysis that compares the cost of the estimated ages with that of the traditional annulus count method. A simulation experiment is used to illustrate both the new method and the cost–benefit analysis.
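A rough Python sketch of the mixture-analysis idea follows: a mixture model is fitted to simulated otolith measurements and the mixing proportions are read off as proportions at age, without assigning an age to each individual fish. This is an illustration under assumed age-class parameters, not the authors' implementation.

```python
# Estimate proportions at age from otolith measurements via a Gaussian mixture.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Simulated otolith weights for three age classes with assumed true
# proportions 0.5, 0.3, and 0.2.
true_props = [0.5, 0.3, 0.2]
means, sds = [10.0, 14.0, 18.0], [1.5, 1.8, 2.0]
n = 2000
ages = rng.choice(3, size=n, p=true_props)
weights = rng.normal([means[a] for a in ages], [sds[a] for a in ages]).reshape(-1, 1)

# Fit a three-component mixture and report the estimated mixing proportions.
gmm = GaussianMixture(n_components=3, random_state=0).fit(weights)
order = np.argsort(gmm.means_.ravel())
print("estimated proportions at age:", np.round(gmm.weights_[order], 3))
```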


1993 ◽  
Vol 31 (11) ◽  
pp. 41-44

The relationship between drug costs and treatment choices was the subject of the first annual Drug and Therapeutics Bulletin symposium held in March 1993.* In a time of severe financial constraints for the NHS, it is important that the money available is well spent. In the case of treatment, that means the benefits must be worth the cost. There is, however, no agreed way of deciding when a particular health benefit to an individual is worth the cost to the NHS. Drug prices are easier to measure and more consistent than the prices of other treatments, and may be more amenable to cost-benefit analysis. Treatment choices are made primarily by doctors but with critical input from patients, pharmacists, nurses and health service managers. In this article we give an overview of the symposium at which speakers described ways in which drug costs and treatment choices were tackled in general practice (Ann McPherson, John Howie), in hospital (Dorothy Anderson), in clinical research and audit (Iain Chalmers, Alison Frater), by consumers (Anna Bradley), by health economists (Mike Drummond) and by government (Joe Collier). We also take into account points raised in discussion by the participants.


2018 ◽  
Vol 10 (12) ◽  
pp. 4668 ◽  
Author(s):  
Antonio Nesticò ◽  
Shuquan He ◽  
Gianluigi De Mare ◽  
Renato Benintendi ◽  
Gabriella Maselli

The process of allocating financial resources is extremely complex, both because the selection of investments depends on multiple, interrelated variables and on constraints that limit the eligibility domain of the solutions, and because the feasibility of projects is influenced by risk factors. In this sense, it is essential to develop economic evaluations on a probabilistic basis. Nevertheless, for the civil engineering sector, the literature emphasizes the centrality of risk management in order to establish interventions for risk mitigation. On the other hand, few methodologies are available to systematically compare pre- and post-mitigation design risk, along with verifying the economic convenience of these actions. The aim of the paper is to demonstrate how these limits can be at least partially overcome by integrating the As Low As Reasonably Practicable (ALARP) logic into traditional Cost-Benefit Analysis schemes. According to this logic, a risk is tolerable only if it is impossible to reduce it further or if the costs of mitigating it are disproportionate to the benefits obtainable. The research outlines the phases of an innovative protocol for managing investment risks. On the basis of a case study dealing with a project for the recovery and transformation of an ancient medieval village into a widespread hotel, the novelty of the model consists of the characterization of acceptability and tolerability thresholds for the investment risk, as well as its ability to guarantee the triangular balance between the risks, costs, and benefits deriving from mitigation options.
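The following Python sketch illustrates the ALARP test described above: a residual risk is tolerable without further action only when the cost of reducing it is grossly disproportionate to the benefit. The thresholds, the disproportion factor, and the example figures are assumed for illustration, not taken from the case study.

```python
# Minimal ALARP decision rule with assumed acceptability/tolerability thresholds.

ACCEPTABLE = 1e-5      # below this risk level, no further action is required
INTOLERABLE = 1e-3     # above this risk level, the project cannot proceed as-is
DISPROPORTION = 3.0    # mitigation cost may exceed benefit by at most this factor

def alarp_decision(risk, mitigation_cost, mitigation_benefit):
    if risk <= ACCEPTABLE:
        return "acceptable: no mitigation required"
    if risk >= INTOLERABLE:
        return "intolerable: mitigation mandatory"
    # Tolerable region: mitigate unless the cost is grossly disproportionate.
    if mitigation_cost > DISPROPORTION * mitigation_benefit:
        return "tolerable as-is: further mitigation not reasonably practicable"
    return "mitigate: cost is proportionate to the risk reduction"

print(alarp_decision(risk=5e-4, mitigation_cost=400_000, mitigation_benefit=250_000))
```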


2021 ◽  
Vol 2021 ◽  
pp. 1-11 ◽  
Author(s):  
Bangtong Huang ◽  
Hongquan Zhang ◽  
Zihong Chen ◽  
Lingling Li ◽  
Lihua Shi

Deep learning algorithms face limitations in virtual reality applications due to memory cost, computation cost, and real-time computation requirements. Models with strong performance may have enormous numbers of parameters and large-scale structures, making them hard to port onto embedded devices. In this paper, inspired by GhostNet, we propose an efficient structure, ShuffleGhost, which exploits the redundancy in feature maps to reduce the cost of computation while addressing some drawbacks of GhostNet. Since GhostNet suffers from the high computational cost of convolution in the Ghost module and the shortcut, and since the restriction on downsampling makes it difficult to apply the Ghost module and Ghost bottleneck to other backbones, this paper proposes three new kinds of ShuffleGhost structure to tackle these drawbacks. The ShuffleGhost module and ShuffleGhost bottlenecks use the shuffle layer and group convolution from ShuffleNet, and they are designed to redistribute the feature maps formed by concatenating the Ghost feature maps and the primary feature maps, eliminating the gap between them while extracting features. An SENet layer is then adopted to reduce the computational cost of the group convolution and to evaluate the importance of the concatenated Ghost and primary feature maps, assigning proper weights to them. Experiments show that ShuffleGhostV3 has fewer trainable parameters and FLOPs while maintaining accuracy, and, with proper design, it can be more efficient on both the GPU and CPU sides.
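As a schematic illustration of the ideas described above, the following PyTorch sketch combines primary feature maps from a group convolution with cheap ghost feature maps, applies a channel shuffle, and reweights the result with an SE layer. The layer sizes and exact composition are assumptions for illustration and do not reproduce the authors' ShuffleGhost implementation.

```python
# Ghost-style module with channel shuffle, group convolution, and an SE layer.
import torch
import torch.nn as nn

def channel_shuffle(x, groups):
    # Reorder channels so group convolutions mix information across groups.
    n, c, h, w = x.size()
    x = x.view(n, groups, c // groups, h, w).transpose(1, 2).contiguous()
    return x.view(n, c, h, w)

class SqueezeExcite(nn.Module):
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.fc(self.pool(x))  # reweight channels by learned importance

class ShuffleGhostModule(nn.Module):
    """Primary maps from a cheap group conv, ghost maps from a depthwise conv;
    the two are concatenated, shuffled, and reweighted by an SE layer."""

    def __init__(self, in_ch, out_ch, groups=2):
        super().__init__()
        primary_ch = out_ch // 2
        self.primary = nn.Sequential(
            nn.Conv2d(in_ch, primary_ch, 1, groups=groups, bias=False),
            nn.BatchNorm2d(primary_ch),
            nn.ReLU(inplace=True),
        )
        self.ghost = nn.Sequential(  # cheap depthwise op generating "ghost" maps
            nn.Conv2d(primary_ch, out_ch - primary_ch, 3, padding=1,
                      groups=primary_ch, bias=False),
            nn.BatchNorm2d(out_ch - primary_ch),
            nn.ReLU(inplace=True),
        )
        self.se = SqueezeExcite(out_ch)
        self.groups = groups

    def forward(self, x):
        primary = self.primary(x)
        ghost = self.ghost(primary)
        out = torch.cat([primary, ghost], dim=1)
        out = channel_shuffle(out, self.groups)  # blend primary and ghost maps
        return self.se(out)

x = torch.randn(1, 16, 32, 32)
print(ShuffleGhostModule(16, 32)(x).shape)  # -> torch.Size([1, 32, 32, 32])
```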

