Performance Evaluation of Cloud Services for Russian Companies

Author(s):
Alexey Bataev


Sensors ◽
2019 ◽
Vol 19 (20) ◽
pp. 4559
Author(s):
Park ◽
Park

The middleware framework for IoT collaboration services should provide efficient solutions to context awareness and uncertainty issues among multiple collaboration domains. However, existing middleware frameworks are mostly limited to a single system, and developing self-adaptive IoT collaboration services with them requires considerable time and effort from developers. Furthermore, the developed IoT collaboration services are often dependent on a particular domain and cannot easily be reused in other domains. This paper proposes a cloud-based middleware framework that provides a set of cloud services for self-adaptive IoT collaboration services. The proposed middleware framework is generic in the sense that it clearly separates domain-dependent components from the layers that leverage existing middleware frameworks. In addition, the proposed framework allows developers to upload domain-dependent components to the cloud, search for registered components, and launch a Virtual Machine (VM) running a new MAPE cycle via a convenient web-based interface. The feasibility of the proposed framework has been shown with a simulation of an IoT collaboration service that traces a criminal suspect. The performance evaluation shows that the proposed middleware framework runs with an overhead of only 6% compared to pure Java-based middleware and scales as the number of VMs increases up to 16.
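The MAPE cycle mentioned in the abstract can be illustrated with a minimal self-adaptation loop. This is a generic sketch of the Monitor-Analyze-Plan-Execute pattern, not the framework's actual API; all names (`monitor`, `scale_out`, the `vms` state key) are illustrative assumptions.

```python
# Minimal sketch of a MAPE (Monitor-Analyze-Plan-Execute) self-adaptation
# cycle; all names are illustrative, not the framework's actual API.

def monitor(sensor):
    """Collect a raw context reading from a (simulated) IoT sensor."""
    return sensor()

def analyze(reading, threshold):
    """Decide whether the reading indicates an adaptation is needed."""
    return reading > threshold

def plan(violated):
    """Choose an action for the detected condition."""
    return "scale_out" if violated else "no_op"

def execute(action, state):
    """Apply the planned action to the managed system state."""
    if action == "scale_out":
        state["vms"] += 1
    return state

def mape_cycle(sensor, state, threshold=0.8):
    reading = monitor(sensor)
    action = plan(analyze(reading, threshold))
    return execute(action, state)

state = {"vms": 1}
state = mape_cycle(lambda: 0.9, state)   # high load -> scale out
state = mape_cycle(lambda: 0.3, state)   # normal load -> no change
print(state["vms"])  # 2
```

In the paper's setting, each such cycle would run inside a cloud-hosted VM, with the domain-dependent parts (the analysis rules and planned actions) supplied as uploaded components.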


2015 ◽  
Vol 54 ◽  
pp. 24-30 ◽  
Author(s):  
M. Jaiganesh ◽  
B. Ramadoss ◽  
A. Vincent Antony Kumar ◽  
S. Mercy

2018 ◽  
Vol 8 (1) ◽  
pp. 80-96 ◽  
Author(s):  
Sanjay P. Ahuja ◽  
Niharika Deval

Infrastructure-as-a-service is a cloud service model that allows customers to outsource computing resources such as servers and storage. This article evaluates four IaaS cloud services - Amazon EC2, Microsoft Azure, Google Compute Engine, and Rackspace Cloud - in a vendor-neutral approach with regard to system parameter usage, including server, file I/O, and network utilization. System-level benchmarking thus provides an objective comparison of cloud providers from a performance standpoint. Unixbench, Dbench, and Iperf are the system-level benchmarks chosen to test the performance of the server, file I/O, and network, respectively. To capture variation in performance, the tests were performed at different times on weekdays and weekends. For each offering, the benchmarks were run on different configurations to give cloud users insight into selecting a provider and then sizing VMs appropriately for the workload requirement. In addition to the performance evaluation, the price-per-performance value of all the providers is also examined and compared.
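A price-per-performance comparison of the kind described above can be sketched as a simple ratio of benchmark score to hourly price. The offerings, scores, and prices below are invented for illustration; real values would come from Unixbench/Dbench/Iperf runs and provider price lists.

```python
# Sketch: turning system-level benchmark scores into a
# price-per-performance ranking. All numbers are made up for
# illustration, not measured results.

def price_per_performance(score, hourly_price):
    """Higher is better: benchmark score obtained per dollar per hour."""
    return score / hourly_price

offerings = {
    "provider_a_small": {"unixbench": 1200.0, "price": 0.05},
    "provider_b_small": {"unixbench": 1500.0, "price": 0.08},
}

ranked = sorted(
    offerings,
    key=lambda name: price_per_performance(
        offerings[name]["unixbench"], offerings[name]["price"]
    ),
    reverse=True,
)
print(ranked[0])  # provider_a_small (24000 vs 18750 points per $/hour)
```

Note that the raw-performance winner (the higher Unixbench score) is not necessarily the price-per-performance winner, which is exactly the trade-off such studies surface for VM sizing decisions.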


Author(s):  
Santoso Wibowo ◽  
Hepu Deng ◽  
Wei Xu

This paper formulates the performance evaluation of cloud services as a multicriteria group decision making problem and presents a fuzzy multicriteria group decision making method for evaluating the performance of cloud services. Interval-valued intuitionistic fuzzy numbers are used to model the inherent subjectiveness and imprecision of the performance evaluation process. An effective algorithm is developed, based on the technique for order preference by similarity to ideal solution (TOPSIS) and the Choquet integral operator, for adequately solving the performance evaluation problem. An example is presented to demonstrate the applicability of the proposed fuzzy multicriteria group decision making method for solving multicriteria group decision making problems in real-world situations.
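The core ranking step of TOPSIS can be sketched with ordinary crisp numbers. The paper extends this idea with interval-valued intuitionistic fuzzy numbers and a Choquet integral for interacting criteria; the plain version below, with fixed additive weights and invented scores, only illustrates the ideal/anti-ideal distance ranking.

```python
import math

# Minimal crisp TOPSIS sketch: rank alternatives by relative closeness
# to the ideal solution. Criteria are assumed to be benefit criteria
# (larger is better); scores and weights are illustrative.

def topsis(matrix, weights):
    """Return a closeness score in [0, 1] for each alternative (row)."""
    # Vector-normalize each column, then apply the criterion weight.
    cols = list(zip(*matrix))
    norms = [math.sqrt(sum(v * v for v in c)) for c in cols]
    weighted = [
        [w * v / n for v, w, n in zip(row, weights, norms)]
        for row in matrix
    ]
    # Ideal (best) and anti-ideal (worst) value per criterion.
    best = [max(c) for c in zip(*weighted)]
    worst = [min(c) for c in zip(*weighted)]
    scores = []
    for row in weighted:
        d_best = math.dist(row, best)
        d_worst = math.dist(row, worst)
        scores.append(d_worst / (d_best + d_worst))
    return scores

# Three cloud services scored on availability, throughput, usability.
matrix = [
    [0.9, 0.7, 0.8],
    [0.6, 0.9, 0.7],
    [0.8, 0.8, 0.9],
]
scores = topsis(matrix, weights=[0.5, 0.3, 0.2])
print(max(range(3), key=scores.__getitem__))  # index of the top service
```

The fuzzy variant replaces each matrix entry with an interval-valued intuitionistic fuzzy number and the weighted sum with a Choquet integral, but the closeness-to-ideal ranking logic stays the same.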


2019 ◽  
Vol 26 (1) ◽  
pp. 78
Author(s):  
Rajeev Ranjan Yadav ◽  
Gleidson A. S. Campos ◽  
Erica Teixeira Gomes Sousa ◽  
Fernando Aires Lins

On-demand services and reduced costs have made cloud computing a popular mechanism for providing scalable resources according to users' expectations. This paradigm plays an important role in business and academic organizations, supporting applications and services deployed on virtual machines and containers, two different virtualization technologies. Cloud environments can support workloads generated by large numbers of users who request the cloud environment to execute transactions, and their performance should be evaluated and estimated in order to satisfy clients when cloud services are offered. This work proposes a performance evaluation strategy composed of a performance model and a methodology for evaluating the performance of services configured in virtual machines and containers in cloud infrastructures. The performance model for the evaluation of virtual machines and containers in cloud infrastructures is based on stochastic Petri nets. A case study in a real public cloud is presented to illustrate the feasibility of the performance evaluation strategy. The case study experiments were performed with virtual machines and containers supporting workloads related to social network transactions.
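The stochastic Petri net formalism mentioned above can be sketched with a toy simulator: transitions fire after exponentially distributed delays and move tokens between places. The net below (a single "serve" transition draining a queue of requests) and its rate are invented for illustration and are not the paper's actual model.

```python
import random

# Tiny stochastic Petri net sketch: each enabled transition samples an
# exponential firing delay, and the fastest one fires, moving a token
# from its input place to its output place. Model and rate are
# illustrative only.

random.seed(42)  # reproducible run

marking = {"queued": 5, "done": 0}
transitions = [
    # (name, input place, output place, firing rate)
    ("serve", "queued", "done", 2.0),
]

clock = 0.0
while marking["queued"] > 0:
    # A transition is enabled when its input place holds a token.
    enabled = [t for t in transitions if marking[t[1]] > 0]
    # Race semantics: sample a delay per enabled transition and fire
    # the winner (the smallest delay).
    delay, (name, src, dst, rate) = min(
        (random.expovariate(t[3]), t) for t in enabled
    )
    clock += delay
    marking[src] -= 1
    marking[dst] += 1

print(marking)  # {'queued': 0, 'done': 5}
```

Real SPN tools derive metrics such as throughput and utilization analytically from the underlying Markov chain; this discrete-event simulation only shows the token-game semantics that such models formalize.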


2013 ◽  
Vol 5 (3) ◽  
pp. 75-93
Author(s):  
Zheng Li ◽  
Liam O’Brien ◽  
He Zhang ◽  
Rajiv Ranjan

Appropriate performance evaluations of commercial Cloud services are crucial and beneficial for both customers and providers to understand the service runtime, while suitable experimental design and analysis are vital for practical evaluation implementations. However, there seems to be a lack of effective methods for performance evaluation of Cloud services. For example, in most of the existing evaluation studies, experimental factors (also called parameters or variables) were considered randomly and intuitively, experimental sample sizes were determined on the fly, and few experimental results were comprehensively analyzed. To address these issues, the authors suggest applying Design of Experiments (DOE) to Cloud services evaluation. To facilitate applying DOE techniques, this paper introduces an experimental factor framework and a set of DOE application scenarios. As such, new evaluators can explore and conveniently adapt this work to their own experiments for performance evaluation of commercial Cloud services.
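The systematic treatment of experimental factors that DOE brings can be sketched by enumerating a full-factorial design: every combination of factor levels becomes one experimental run. The factors and levels below are illustrative, not the paper's factor framework.

```python
from itertools import product

# Sketch: a full-factorial experimental design over illustrative
# Cloud-evaluation factors. Each dict in `design` is one run to execute.

factors = {
    "vm_size": ["small", "medium", "large"],
    "region": ["us-east", "eu-west"],
    "workload": ["cpu", "io"],
}

names = list(factors)
design = [dict(zip(names, combo)) for combo in product(*factors.values())]

print(len(design))   # 3 * 2 * 2 = 12 runs
print(design[0])     # first run: smallest levels of every factor
```

Full factorials grow multiplicatively with the number of factors, which is precisely why DOE also offers fractional designs and explicit sample-size planning instead of choosing factors and repetitions on the fly.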

