QUALITY ASSURANCE OF SOFTWARE AND HARDWARE COMPLEXES FOR DATA STORAGE AND PROCESSING CENTERS

2018 ◽  
pp. 145-150
Author(s):  
Ya. A. Ivakin ◽  
S. A. Morozov ◽  
V. M. Balashov ◽  
M. S. Smirnova

The article presents an analysis of the structure of software and hardware complexes, reviews the relevant regulatory and technical documentation, and identifies the main performance indicators for software and hardware complexes of data storage and processing centers. A generalized representation of the software and hardware structure of such complexes is developed and presented in the form of a nested scheme. The article also identifies and structures the basic, or typical, services supported by modern software and hardware complexes of data centers. The role and place of data center software and hardware complexes in the information support of state and corporate governance bodies are determined. The main indicators of the quality of functioning of these complexes when hosting services are provided are presented. Finally, the problem of creating a normative-technical base and scientific-methodological tools for assessing and improving the quality of the corresponding software and hardware complexes is identified.

Nowadays, increasing workplace efficiency, reducing overhead costs, and maintaining availability are among the top priorities for data center administrators, and Data Center Infrastructure Management (DCIM) addresses all three. The data center market has evolved not only in response to the sheer quantity of data, but also to rising demands for connectivity uptime, security, and lower IT costs. Information technology is growing more rapidly than ever and new challenges appear every day; data storage and processing are chief among them, and data centers play a critical role in meeting them. As data centers get bigger, more energy is required and availability becomes paramount. This paper discusses the quality of DCIM, taking its requirements and the quality evaluation problem into account, and proposes a method for analyzing DCIM quality with respect to energy costs, availability, and new DCIM systems. The proposed evaluation method is intended to resolve these problems so that data center throughput increases efficiently.
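
The abstract does not specify the evaluation method itself, so the following is only a minimal sketch of one common approach: a weighted aggregate of quality indicators normalized to [0, 1]. The indicator names, values, and weights are assumptions for illustration, not the paper's method.

```python
def dcim_quality_score(indicators: dict[str, float],
                       weights: dict[str, float]) -> float:
    """Weighted mean of quality indicators already normalized to [0, 1]."""
    total_weight = sum(weights.values())
    return sum(indicators[name] * w for name, w in weights.items()) / total_weight

# Hypothetical indicator values for a single data center.
score = dcim_quality_score(
    indicators={"availability": 0.999, "energy_efficiency": 0.72,
                "capacity_headroom": 0.35},
    weights={"availability": 0.5, "energy_efficiency": 0.3,
             "capacity_headroom": 0.2},
)
print(f"DCIM quality score: {score:.3f}")
```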


2017 ◽  
Vol 27 (4) ◽  
Author(s):  
Hassan Hadi Saleh

The security of data storage in the cloud is a major challenge because the data are kept on resources that may be accessed by particular machines, and the management of these data and services may not be highly reliable. Data security is therefore highly challenging. To increase the security of data in cloud data centers, we introduce a method to ensure data security in cloud computing by hiding data inside color images, a technique known as steganography. The fundamental objective of this paper is to prevent data access by unauthorized or adversarial users. The scheme stores data at data centers within the edges of color images and retrieves it when needed.
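
The abstract describes the scheme only at a high level, so the sketch below shows the general least-significant-bit (LSB) embedding idea behind image steganography. The paper's scheme, which embeds within the edges of color images, is more selective than this whole-image variant, and the payload here is made up for illustration.

```python
import numpy as np

def embed(pixels: np.ndarray, payload: bytes) -> np.ndarray:
    """Hide payload bits in the least significant bit of each channel value."""
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = pixels.flatten()                  # flatten() returns a copy
    if bits.size > flat.size:
        raise ValueError("payload too large for this cover image")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits   # overwrite LSBs
    return flat.reshape(pixels.shape)

def extract(pixels: np.ndarray, n_bytes: int) -> bytes:
    """Recover n_bytes of hidden payload from the LSBs."""
    bits = pixels.flatten()[: n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

# A random stand-in for an RGB cover image stored at the data center.
cover = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
stego = embed(cover, b"secret record")
assert extract(stego, len(b"secret record")) == b"secret record"
```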


2021 ◽  
Vol 850 (1) ◽  
pp. 012018
Author(s):  
T Renugadevi ◽  
D Hari Prasanth ◽  
Appili Yaswanth ◽  
K Muthukumar ◽  
M Venkatesan

Abstract Data centers are large-scale data storage and processing systems. A data center is made up of a number of servers that must be capable of handling large amounts of data. As a result, data centers generate a significant quantity of heat, which must be removed to keep the equipment at an optimal temperature and avoid overheating. To address this problem, a thermal analysis of the data center is carried out using numerical methods. The CFD model consists of a micro data center in which conjugate heat transfer effects are studied. A micro data center consists of servers alternating with air gaps, and cooling air is passed through the gaps to remove heat. In the present work, the data center rack is designed so that the cold air is in close proximity to the servers. The temperature and airflow in the data center are estimated using the model, and the air gap is optimally sized for the cooling unit. The temperature distribution is studied for various load configurations. The objective of the study is to find a favorable loading configuration of the micro data center for various loads and an effective distribution of load among the servers.
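
As rough intuition for this kind of thermal analysis, the sketch below relaxes a 2-D steady-state heat-diffusion problem with a fixed cold-air boundary and a heated server block. It is a toy stand-in, not the paper's conjugate heat transfer CFD model; the grid size, source strength, and supply temperature are assumptions.

```python
import numpy as np

nx, ny = 60, 40                     # grid resolution (assumed)
T = np.full((ny, nx), 20.0)         # temperature field, deg C
source = np.zeros_like(T)
source[15:25, 20:40] = 5.0          # heat released by a server block (arbitrary units)

for _ in range(5000):               # Jacobi relaxation of the Poisson equation
    T[1:-1, 1:-1] = 0.25 * (T[2:, 1:-1] + T[:-2, 1:-1] +
                            T[1:-1, 2:] + T[1:-1, :-2] +
                            source[1:-1, 1:-1])
    T[:, 0] = 20.0                  # cold-aisle inlet held at supply temperature

print(f"hottest point: {T.max():.1f} deg C")
```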


Data center networks support heterogeneous kinds of applications, such as social networking, e-commerce, web search, video hosting, computation-intensive workloads and data storage. They have high-bandwidth links, low propagation delay and commodity switches with small buffers. In cluster-based storage environments, the data center supports a barrier-synchronized many-to-one communication pattern in which multiple worker nodes simultaneously transmit bulk data to a single aggregator node using the standard TCP protocol. This synchronized transmission may overload the aggregator's switch buffer, leading to severe packet loss and a collapse in overall throughput. This is known as the TCP Incast problem. This paper analyzes the TCP Incast issue and provides a detailed survey of solutions at the link, transport and application layers that mitigate its impact in data center networks. Each solution is described together with its procedural approach to alleviating Incast, and a comparative evaluation provides an understanding of their merits, demerits and applicability under various implementation circumstances.
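
To make the failure mode concrete, here is a deliberately simplified back-of-the-envelope model (all parameters assumed): N workers release one congestion window each at the same barrier instant, and a shallow tail-drop switch buffer absorbs what it can.

```python
def incast_drops(senders: int, window_pkts: int, buffer_pkts: int) -> int:
    """Packets lost when synchronized bursts exceed the shared switch buffer."""
    burst = senders * window_pkts        # all windows arrive in one burst
    return max(0, burst - buffer_pkts)   # tail drop beyond buffer capacity

# Scaling the fan-in: 10-packet windows into a 64-packet commodity buffer.
for n in (4, 16, 64):
    print(f"{n:3d} workers -> {incast_drops(n, 10, 64):4d} packets dropped")
```

Even this crude model shows why loss grows linearly with fan-in once the aggregate burst exceeds the buffer, which is the regime the surveyed solutions target.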


Author(s):  
Sean Cawley

Technological advancement and increasing data collection activities compelled the call for a National Data Center in 1965. In theory, the Center would increase efficiency and diminish costs as the inefficiencies of information transfer between agencies and organizations steadily mounted. However, the proposal met a firestorm of criticism from a number of sectors over perceived privacy risks, which eventually spelled the Center's demise. The abandonment of an explicit locale for data storage and retrieval, however, catalyzed the formation of numerous implicit data centers that jeopardized privacy to a far greater degree than it was originally feared the Center would. The history of the National Data Center's demise and the subsequent construction of implicit data centers constitutes a useful case study when considering the proper reaction to perceived privacy concerns regarding new technologies.


2019 ◽  
Vol 5 ◽  
pp. e211
Author(s):  
Hadi Khani ◽  
Hamed Khanmirza

Cloud computing technology has been a game changer in recent years. Cloud computing providers promise cost-effective, on-demand computing resources for their users, running users' workloads as virtual machines (VMs) in large-scale data centers consisting of a few thousand physical servers. Cloud data centers face highly dynamic workloads that vary over time and include many short tasks demanding quick resource management decisions. These data centers are large in scale and their workload behavior is unpredictable. Each incoming VM must be assigned to a suitable physical machine (PM) in order to keep a balance between power consumption and quality of service. The scale and agility of cloud computing data centers are unprecedented, rendering previous approaches ineffective. We propose an analytical model for cloud computing data centers in which the number of PMs is large. In particular, we focus on the assignment of VMs to PMs regardless of their current load. For exponentially distributed VM inter-arrival times and generally distributed sojourn times, the mean power consumption is calculated. We then show that the minimum power consumption under a quality-of-service constraint is achieved with randomized assignment of incoming VMs to PMs. Extensive simulation supports the validity of our analytical model.
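
A minimal Monte Carlo sketch of the randomized placement policy the paper analyzes is shown below. The power model (an idle draw for each powered-on PM plus a per-VM increment) and all numeric parameters are assumptions for illustration, not the paper's analytical model.

```python
import random

def mean_power(n_pms: int, n_vms: int, idle_w: float = 100.0,
               per_vm_w: float = 30.0, trials: int = 1000) -> float:
    """Average total power when each VM picks a PM uniformly at random."""
    total = 0.0
    for _ in range(trials):
        loads = [0] * n_pms
        for _ in range(n_vms):
            loads[random.randrange(n_pms)] += 1        # uniform random placement
        active = sum(1 for load in loads if load > 0)  # empty PMs are asleep
        total += active * idle_w + n_vms * per_vm_w
    return total / trials

print(f"mean power: {mean_power(n_pms=200, n_vms=500):.0f} W")
```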


2018 ◽  
Vol 7 (3.27) ◽  
pp. 220
Author(s):  
R Parameswari ◽  
K Vani

The advancement of telecommunication in the medical field speeds up the diagnosis and treatment of patients because patient information is stored and maintained on electronic storage devices. However, there are issues related to physical data storage, the privacy of user data access, and security. Cloud computing helps reduce these issues. A mobile healthcare system can improve the quality of patient care and reduce medical costs for both patients and hospitals. The CloudSim simulation tool is used to generate the data center details, provider details and processing times, and the CloudAnalyst tool is used to simulate the data centers for the analysis. The patient information in the Healthcare Information System (HIS) and Electronic Medical Records (EMRs) is stored in a data center or on a cloud platform in a secure manner.


2019 ◽  
Vol 9 (18) ◽  
pp. 3850 ◽  
Author(s):  
Diogo Macedo ◽  
Radu Godina ◽  
Pedro Dinis Gaspar ◽  
Pedro da Silva ◽  
Miguel Trigueiros Covas

In recent years, reducing energy consumption has been relentlessly pursued by researchers and policy makers with the purpose of achieving a more sustainable future. The demand for data storage in data centers has been steadily increasing, leading data centers to grow in size and therefore to consume more energy. Consequently, reducing the energy consumption of data center rooms is required, and it is from this perspective that this paper is proposed. Computational Fluid Dynamics (CFD) makes it possible to build a three-dimensional model of the heat transfer and air flow in data centers, which allows forecasting the air speed and temperature range under diverse operating conditions. In this paper, a CFD study of the thermal performance and airflow in a real data center processing room with 208 racks under different thermal loads and airflow velocities is proposed. The physical-mathematical model relies on the equations of mass, momentum and energy conservation. The fluid in this study is air, modeled as an ideal gas with constant properties. Turbulence is modeled with the standard k–ε model. The results indicate that it is possible to reduce the thermal load of the server racks by improving the thermal performance and airflow of the data center room, without affecting the correct operation of the server racks located in the sensitive regions of the room.
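
For reference, the standard k–ε closure named in the abstract transports the turbulent kinetic energy k and its dissipation rate ε alongside the conservation equations; the usual textbook form and constants are reproduced below (the abstract does not state the paper's exact coefficients).

\[
\frac{\partial(\rho k)}{\partial t} + \nabla\cdot(\rho k\,\mathbf{u}) = \nabla\cdot\left[\left(\mu + \frac{\mu_t}{\sigma_k}\right)\nabla k\right] + P_k - \rho\varepsilon
\]
\[
\frac{\partial(\rho\varepsilon)}{\partial t} + \nabla\cdot(\rho\varepsilon\,\mathbf{u}) = \nabla\cdot\left[\left(\mu + \frac{\mu_t}{\sigma_\varepsilon}\right)\nabla\varepsilon\right] + C_{1\varepsilon}\,\frac{\varepsilon}{k}\,P_k - C_{2\varepsilon}\,\rho\,\frac{\varepsilon^2}{k}
\]

with turbulent viscosity \(\mu_t = \rho\,C_\mu\,k^2/\varepsilon\) and constants \(C_\mu = 0.09\), \(C_{1\varepsilon} = 1.44\), \(C_{2\varepsilon} = 1.92\), \(\sigma_k = 1.0\), \(\sigma_\varepsilon = 1.3\), where \(P_k\) is the production of turbulent kinetic energy.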


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Elena Parmelli ◽  
Miranda Langendam ◽  
Thomas Piggott ◽  
Jan Adolfsson ◽  
Elie A. Akl ◽  
...  

Abstract Background In 2017, the European Commission’s Joint Research Centre (JRC) started developing a methodological framework for a guideline-based quality assurance (QA) scheme to improve the quality of cancer care. During the first phase of the work, inconsistency emerged in the use of terminology for the definition, the conceptual underpinnings and the way QA relates to the health questions answered in guidelines. The objective of this third and final article is to propose a conceptual framework for an integrated approach to guideline and QA development and to clarify terms and definitions for its key elements. This work will inform the upcoming European Commission Initiative on Colorectal Cancer (ECICC). Methods A multidisciplinary group of 23 experts from key organizations in the fields of guideline development, performance measurement and quality assurance participated in a mixed-methods approach including face-to-face dialogue and several rounds of virtual meetings. Informed by the results of a systematic literature review that indicated the absence of an existing framework and practical examples, we first identified the relations among the key elements of guideline-based QA and then developed appropriate concepts and terminology to provide guidance. Results Our framework connects the three key concepts of quality indicators, performance measures and performance indicators, integrated with guideline development. Quality indicators are constructs used as a guide to monitor, evaluate and improve the quality of the structure, process and outcomes of healthcare services; performance measures are tools that quantify or describe measurable elements of practice performance; and performance indicators are quantifiable and measurable units or scores of practice, which should be guided by guideline recommendations. Conclusions The inconsistency in the way key terms of QA are used and defined has confused the field. Our conceptual framework defines the role, meaning and interactions of the key elements for improving quality in healthcare. It builds directly on the questions asked in guidelines and answered through recommendations. These findings will be applied in the forthcoming ECICC and in future updates of ECIBC, large-scale integrated projects aimed at improving healthcare quality across Europe through the development of guideline-based QA schemes; this will help in implementing and improving our approach.


2020 ◽  
Vol 12 (19) ◽  
pp. 8030 ◽  
Author(s):  
Abdulrahman Housawi ◽  
Amal Al Amoudi ◽  
Basim Alsaywid ◽  
Miltiadis Lytras ◽  
Yara H. bin Moreba ◽  
...  

The Kingdom of Saudi Arabia is undergoing a major transformation in response to its revolutionary Vision 2030, in which healthcare reform is one of the top priorities. With the objective of improving healthcare and allied professional performance in the Kingdom to meet international standards, the Saudi Commission for Health Specialties (SCFHS) has recently developed a strategic plan that focuses on expanding the capacity of training programs to align with the increasing demand for the country’s healthcare workforce, providing comprehensive quality assurance and control to ensure training programs uphold high quality standards, and providing advanced training programs benchmarked against international standards. In this research paper, we describe our attempt at developing a general framework for key performance indicators (KPIs) and the related metrics, with the aim of contributing to new strategies for better, future-ready medical training. We present the results of a survey conducted in the Kingdom of Saudi Arabia (KSA) for the enhancement of the quality of postgraduate medical training. Recent developments in the field of learning analytics present an opportunity to utilize big data and artificial intelligence in the design and implementation of socio-technical systems with significant potential social impact. We summarize the key aspects of the Training Quality Assurance Initiative and suggest a new approach to designing a data and services ecosystem for personalized health professional training in the KSA. The study also contributes to theoretical knowledge on the integration of sustainability with medical training and education by proposing a framework that can enhance future initiatives from various health organizations.

