Does Congestion Always Hurt? Managing Discount Under Congestion in a Game-Theoretic Setting

Author(s):  
Rajib L. Saha ◽  
Sumanta Singha ◽  
Subodha Kumar

Many firms buy cloud services from vendors such as Amazon Web Services to serve their end users. One of the key factors affecting the quality of cloud services is congestion. Congestion leads to a potential loss of end users, resulting in lower demand for cloud services. Although a discount can stimulate demand, its effect under congestion is ambiguous: a higher discount raises demand, but higher demand raises congestion, which in turn lowers demand. We explore how congestion moderates both the cloud vendor's pricing and the buyer's fulfillment decisions. We ask how end users' congestion sensitivity and the cost of technology affect buyer profitability and the cloud vendor's choice of discount, and we examine how the cost of technology determines the buyer's willingness to pass savings on to end users. Our results show that the buyer is not necessarily worse off even when end users are more intolerant of congestion. In fact, when end users are more congestion sensitive, the demand for cloud services can sometimes increase and the discount offered by the vendor can decrease. We also observe that a lower cost of technology can sometimes hurt the buyer, and the buyer may pass on lower benefits to end users.
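As a stylized illustration of the tension described above (a sketch for intuition only, not the authors' actual model), end-user demand $D$ can be written as increasing in the discount $d$ but decreasing in congestion, which itself grows with demand:

$$D = a + b\,d - \gamma\,C(D), \qquad C'(D) > 0,$$

where $\gamma$ captures the end users' congestion sensitivity. With the simplest congestion term $C(D) = D$, the fixed point is $D = (a + b\,d)/(1 + \gamma)$, so the marginal effect of the discount, $\partial D/\partial d = b/(1 + \gamma)$, is dampened as congestion sensitivity rises. This is one way to see why a more congestion-sensitive user base need not translate into a larger vendor discount.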

2020 ◽  
Vol 2020 ◽  
pp. 1-21
Author(s):  
Jinglve Wang ◽  
Guohua Zhou

In contrast to the econometric models commonly used in much of the literature, we develop six game-theoretic models to analyze governmental subsidy strategies in different market environments and to investigate whether government subsidies crowd in or crowd out private investment in R&D activities. Based on realistic situations, we classify governmental subsidy strategies into three types: no subsidy, subsidies based on the price of the end products, and subsidies based on the cost of R&D. In addition, according to whether competition exists in the market, we classify markets as either monopolies or duopolies. Our research shows (a) that the relationship between government subsidies and private R&D investment depends strongly on the form of the subsidy used; (b) that the characteristic value of the R&D project and the competitive environment of the market are the two key factors governments should consider when deciding which form of subsidy to employ; and (c) the optimal amount for each type of subsidy.
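For intuition only (the paper's six formulations are not reproduced here, and the notation below is assumed), the two subsidy forms enter a monopolist's profit differently. With a subsidy at rate $\theta$ on the product price, the firm earns $(1+\theta)p$ per unit, whereas a subsidy at rate $\phi$ on R&D cost reduces the firm's share of the outlay $R$:

$$\pi_{\text{price}} = \bigl((1+\theta)\,p - c\bigr)\,q(p) - R, \qquad \pi_{\text{cost}} = (p - c)\,q(p) - (1 - \phi)\,R,$$

where $q(p)$ is demand and $c$ is unit production cost. A price-based subsidy scales with units sold, while a cost-based subsidy scales with how much the firm invests in R&D, which is why the two can have opposite crowding effects on private R&D spending.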


2018 ◽  
Vol 7 (4) ◽  
pp. 2457
Author(s):  
Rajeev Tiwari ◽  
Shuchi Upadhyay ◽  
Gunjan Lal ◽  
Varun Tanwar

Today's data workloads need to be managed efficiently. There are many ways to manage and schedule processes, each of which affects the performance and quality of the product, and highly available, scalable web hosting can be a complex and expensive proposition. Traditional web architectures do not offer this reliability. In this work, a Scrum Console is designed for managing such a process and is hosted on Amazon Web Services (AWS) [2], which provides a reliable, scalable, highly available, and high-performance infrastructure for web applications. The Scrum Console Platform lets the members of a team collaborate to manage projects together. The Platform has been developed using JSP, Hibernate, and Oracle 12c Enterprise Edition Database, and is deployed as a web application on AWS Elastic Beanstalk, which automates the deployment, management, and monitoring of the application while relying on underlying AWS resources such as EC2, S3, RDS, CloudWatch, and Auto Scaling.
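A minimal sketch of the kind of deployment step that Elastic Beanstalk automates, using boto3; the bucket, application, environment, and file names are placeholders, not details from the paper:

```python
import boto3

# Placeholder names -- the paper does not specify these.
APP_NAME = "scrum-console"
ENV_NAME = "scrum-console-env"
BUCKET = "scrum-console-artifacts"
VERSION = "v1"

s3 = boto3.client("s3")
eb = boto3.client("elasticbeanstalk")

# Upload the packaged web application (e.g. a WAR built from the JSP project) to S3.
s3.upload_file("scrum-console.war", BUCKET, f"{VERSION}/scrum-console.war")

# Register the uploaded bundle as a new application version.
eb.create_application_version(
    ApplicationName=APP_NAME,
    VersionLabel=VERSION,
    SourceBundle={"S3Bucket": BUCKET, "S3Key": f"{VERSION}/scrum-console.war"},
)

# Point the running environment at the new version; Elastic Beanstalk handles the rollout
# on the underlying EC2, S3, and Auto Scaling resources.
eb.update_environment(EnvironmentName=ENV_NAME, VersionLabel=VERSION)
```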


2018 ◽  
Vol 7 (2.21) ◽  
pp. 420 ◽  
Author(s):  
Devansh Sharma ◽  
R Anandan ◽  
A Manikandan ◽  
Kumar Narayanan ◽  
C Swaraj Paul

Today, bulk emails are typically sent manually through email providers such as Gmail, Yahoo, and Hotmail by copy-pasting the same message, which costs considerable human effort and time. The system being developed here can send emails to thousands of users without any copy-paste step, in far less time and with high accuracy, handling any errors that occur, by using Amazon Web Services and Google Cloud services. As a result, every user receives a unique message instead of the same message being sent to all. The email microservice handles all the essential needs, including verification and validation, automation, and clean code, and follows a proper SDLC in line with current industry standards. A tracking feature is also provided to trace all emails sent to users. The automation thus saves a great deal of human effort and time.
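A rough sketch of how such a microservice might personalise and dispatch messages through AWS; the paper does not disclose its implementation, so the use of Amazon SES, the sender address, region, and recipient structure below are all assumptions:

```python
import boto3

ses = boto3.client("ses", region_name="us-east-1")  # region is an assumption

# Hypothetical recipient list; in the described system this would come from a user store.
recipients = [
    {"email": "alice@example.com", "name": "Alice"},
    {"email": "bob@example.com", "name": "Bob"},
]

for user in recipients:
    # Each message is rendered per user, so no shared copy-pasted body is needed.
    body = f"Hello {user['name']},\n\nYour personalised update is ready."
    ses.send_email(
        Source="no-reply@example.com",  # must be a verified SES identity
        Destination={"ToAddresses": [user["email"]]},
        Message={
            "Subject": {"Data": "Your update"},
            "Body": {"Text": {"Data": body}},
        },
    )
```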


Author(s):  
Jon D Hill

Summary: Voice assistants have become increasingly embedded in consumer electronics as the quality of their interaction improves and the cost of hardware continues to drop. Despite their ubiquity, these assistants remain underutilized as a means of accessing biological research data. Gene Teller is a voice assistant service based on the Alexa Skills Kit and Amazon Lambda functions that enables scientists to query gene-centric information in an intuitive manner. It includes several features, such as synonym disambiguation and short-term memory, that enable a natural conversational interaction, and it is extensible to include new resources. The underlying architecture, based on Amazon Simple Storage Service (S3) and AWS Lambda, is cost-efficient and scalable.
Availability and implementation: A publicly accessible version of Gene Teller is available as an Alexa Skill from the Amazon Marketplace at https://www.amazon.com/dp/B08BRD8SS8. The source code is freely available on GitHub at https://github.com/solinvicta/geneTeller.
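A minimal sketch of the kind of Lambda handler an Alexa skill of this sort uses; the intent name, slot name, and in-memory lookup are illustrative assumptions, not Gene Teller's actual code:

```python
# Hypothetical lookup table; Gene Teller itself resolves synonyms and fetches
# gene-centric records from S3-backed resources.
GENE_INFO = {"TP53": "TP53 encodes the tumour suppressor protein p53."}

def lambda_handler(event, context):
    """Handle an Alexa Skills Kit request and return a spoken response."""
    request = event["request"]
    if request["type"] == "IntentRequest" and request["intent"]["name"] == "GeneInfoIntent":
        gene = request["intent"]["slots"]["gene"]["value"].upper()
        speech = GENE_INFO.get(gene, f"I could not find information on {gene}.")
    else:
        speech = "Ask me about a gene, for example TP53."
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }
```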


2019 ◽  
Vol 7 (1) ◽  
pp. 1-6
Author(s):  
Anne-Laure Mention ◽  
João José Pinto Ferreira ◽  
Marko Torkkeli

‘Our mind-set will be to avoid the moonshot’, said Boeing CEO James McNerney at a Wall Street analysts meeting in Seattle nearly 5 years ago (see Gates, 2014). The ambitious, exploratory and risky endeavour of the Boeing 787 Dreamliner, dubbed a moonshot project, had sunk billions of dollars in an industry where end users demanded more comfort and convenience for less cost. According to McNerney, moonshots do not work in a price-sensitive environment. It is argued that they also tend to take the focus away from more immediate value-capture opportunities, as seen in Google’s loss of ground on its core Cloud Platform to Amazon Web Services (AWS). Google’s parent company Alphabet, which oversees Google X (a semi-secret moonshot project lab), more recently reported a US$1.3 billion operating loss on moonshot projects, with a sizeable increase in compensation for employees and executives working on these projects (Alphabet, 2018). Notably, none of the Google X lab spin-outs (e.g. Loon, a balloon-based internet project; Waymo, a self-driving car project; Wing, a drone delivery project) has been identified as commercially viable. Despite the uncertainties and failures, the focus on moonshot innovations continues to proliferate in academia (Kaur, Kaur and Singh, 2016; Strong and Lynch, 2018) and practice (Martinez, 2018). Yourdon (1997) even wrote an interesting book on the perseverance and tenacity to keep going even after failed projects. Proponents of moonshot thinking claim that it can help solve society’s biggest challenges (e.g. curing cancer, see Kovarik, 2018), with some suggesting that such thinking be encouraged by paying failure bonuses (Figueroa, 2018). Yet others remain sceptical, positing that moonshot is ‘awesome and pointless’ (Haigh, 2019, p.4). A proverbial question thus emerges: are moonshot innovations simply wishful thinking, or can they be part of business-as-usual? In part, the answer may be two-fold: 1) understanding the value of moonshot thinking, and 2) understanding moonshot challenges.  (...)


Author(s):  
Anurag Choudhary

Abstract: Cloud services are provided by various large corporations, notably Amazon Web Services, Microsoft Azure, Google Cloud Platform, and others. In this review, we address the most prominent provider, Amazon Web Services, and in particular its Elastic Compute Cloud functionality. Amazon offers a comprehensive package of computing solutions that lets businesses establish dedicated virtual clouds while maintaining complete configuration control over their working environment. An organization needs to interact with several other technologies; however, instead of installing them, the company may simply buy the technology online as a service. Amazon's Elastic Compute Cloud web service delivers highly customizable computing capacity in the cloud, allowing developers to build applications with high scalability. Put simply, an Elastic Compute Cloud instance is a virtual platform that replicates a physical server on which you may host your applications. Instead of acquiring your own hardware and connecting it to a network, Amazon provides you with virtually unlimited virtual machines on which to deploy your applications while it manages the hardware. This review gives a quick overview of Amazon Web Services Elastic Compute Cloud, covering its features, pricing, and challenges. Finally, open obstacles and future research directions for Amazon Web Services Elastic Compute Cloud are addressed. Keywords: Cloud Computing, Cloud Service Provider, Amazon Web Services, Amazon Elastic Compute Cloud, AWS EC2
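A minimal boto3 sketch of provisioning such a virtual server; the region, AMI ID, and instance type below are placeholders, not recommendations from the review:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # region is an assumption

# Launch a single small instance; ImageId must be a valid AMI in the chosen region.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI ID
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Name", "Value": "demo-ec2-instance"}],
    }],
)
instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched {instance_id}; terminate it when finished to avoid charges.")
```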


2021 ◽  
Vol 7 ◽  
pp. e617
Author(s):  
Sundus Naseer ◽  
Qurratul-Ain Minhas ◽  
Khalid Saleem ◽  
Ghazanfar Farooq Siddiqui ◽  
Naeem Bhatti ◽  
...  

Wireless networks face challenges in the efficient utilization of bandwidth due to a paucity of resources and a lack of central management, which may result in undesired congestion. The cognitive radio (CR) paradigm can bring efficiency, better utilization of bandwidth, and appropriate management of limited resources. While the CR paradigm is an attractive choice, CRs selfishly compete to acquire and utilize the available bandwidth, which may ultimately result in inappropriate power levels and degrade the network's Quality of Service (QoS). A cooperative game-theoretic approach can ease the problem of spectrum sharing and power utilization in a hostile and selfish environment. We focus on the challenge of congestion control that arises from inadequate and uncontrolled channel access and resource utilization. The Nash equilibrium (NE) of a cooperative congestion game is examined by considering the cost basis, which is embedded in the utility function. The proposed algorithm inhibits the utility, which leads to a decrease in aggregate cost and to maximization of the global function. Cost dominance is a pivotal driver of cooperation among CRs and results in efficient power allocation. Simulation results show a reduction in power utilization due to improved management of cognitive radio resource allocation.
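One common way to formalise a cost-embedded utility of this kind (an illustrative textbook form under assumed notation, not the paper's exact utility function) is:

$$U_i(p_i, \mathbf{p}_{-i}) = \log\!\left(1 + \frac{h_i\,p_i}{\sigma^2 + \sum_{j \neq i} h_j\,p_j}\right) - c\,p_i,$$

where $p_i$ is the transmit power of CR $i$, $h_i$ its channel gain, $\sigma^2$ the noise power, and $c$ the per-unit power cost. The Nash equilibrium is the power profile at which no CR can raise its own utility unilaterally, and a larger cost term $c\,p_i$ pushes that equilibrium toward lower aggregate power, which mirrors the role the abstract attributes to cost dominance in driving efficient power allocation.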


2021 ◽  
Vol 60 (13) ◽  
pp. 1-24
Author(s):  
Hong Kim Duong ◽  
Marco Fasan ◽  
Giorgio Gotti

Purpose: Previous literature provides mixed evidence about the effectiveness of a code of ethics in limiting managerial opportunism. While some studies find that codes of ethics are merely window-dressing, others find that they do influence managers' behavior. The present study investigates whether the quality of a code of ethics decreases the cost of equity by limiting managerial opportunism.
Design/methodology/approach: To test the hypothesis, the authors perform an empirical analysis on a sample of US companies in the 2004–2012 period. The results are robust to a battery of robustness checks performed to address endogeneity.
Findings: Empirical results indicate that a higher-quality code of ethics is associated with a lower cost of equity. In other words, firms with a more comprehensive code of ethics and better-designed implementation procedures limit managerial opportunism and pay a lower cost of equity because investors perceive them to be less risky.
Originality/value: The authors contribute to the literature in two ways: first, by looking at the market reaction to the code of ethics, thus capturing all its possible indirect benefits; and second, by measuring not only the existence but also the quality of a code of ethics. Based on the results, policymakers may choose to further promote codes of ethics as an effective corporate governance mechanism.


2014 ◽  
pp. 1481-1497
Author(s):  
Salah Merad ◽  
Rogério de Lemos ◽  
Tom Anderson

This chapter considers the problem of optimally selecting services at run-time with respect to their non-functional attributes and costs. Commercial pressure to reduce the cost of managing complex software systems is changing the way in which systems are designed and built. The reason behind this shift is the need to deal with changes efficiently and effectively, which may include removing the human operator from the decision-making process. In service-oriented computing in particular, the run-time selection and integration of services may soon become a reality, since services are readily available. Assuming that each component service has a specific functional and non-functional profile, the challenge is to define a decision maker that can select services that satisfy the system requirements and optimise the quality of service under cost constraints. The approach presented in this chapter describes a game-theoretic solution that formulates the problem as a bargaining game.
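For intuition, a bargaining-style formulation of this selection problem (a sketch under assumed notation, not the chapter's exact model) can be written as:

$$\max_{s \in S} \; \prod_{i=1}^{n} \bigl(u_i(s) - d_i\bigr) \quad \text{subject to} \quad c(s) \le B,$$

where $S$ is the set of candidate service combinations, $u_i(s)$ is the utility of quality attribute $i$ (e.g. response time or availability) under selection $s$, $d_i$ is a disagreement point, $c(s)$ is the total cost, and $B$ is the budget. The decision maker acts as the arbitrator, picking the selection that maximises the Nash product of utility gains within the cost constraint.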

