Power Usage Efficiency (PUE) Optimization with Counterpointing Machine Learning Techniques for Data Center Temperatures

Author(s):  
Rajendra Kumar ◽  
Sunil Kumar Khatri ◽  
Mario José Diván

The rapid growth of IT infrastructure has increased demand for Data Center space and power to host Information and Communication Technology (ICT) services. As a result, Data Centers consume more electrical power, and Data Center power and cooling management has become an important and challenging task. The aspects with the most direct impact on data center energy use are power draw and the commensurate cooling losses. Optimising the Power Usage Efficiency (PUE) of a Data Center with conventional methods is difficult, since these essentially require knowledge of each Data Center facility, its specific equipment and how that equipment operates. Hence, a novel optimization approach is necessary to optimise power and cooling in the data center. This research work varies the temperature in the data center through a machine learning-based linear regression optimization technique. The ideal temperature is identified with high accuracy by a prediction technique built from the available data. With the proposed model, the PUE of the data center can be readily analysed and predicted from the temperature maintained in the Data Center. As the temperature is raised from 19.73 °C to 21.17 °C, the cooling load decreases from 607 kW to 414 kW. The results show that maintaining the temperature at the optimum value significantly improves the Data Center PUE while saving power within permissible limits.
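
As a rough illustration of the temperature-to-PUE idea described above, the sketch below fits a linear regression to hypothetical (temperature, PUE) readings; the sample values and the 20.8 °C query point are illustrative assumptions, not the authors' data.

```python
# Sketch: predict data-center PUE from supply temperature with linear
# regression (illustrative values only; the paper's dataset is not public).
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical readings spanning the paper's 19.73-21.17 °C range
temps = np.array([[19.7], [20.1], [20.5], [20.9], [21.2]])  # °C
pue   = np.array([1.58, 1.55, 1.52, 1.50, 1.48])            # measured PUE

model = LinearRegression().fit(temps, pue)
print("Predicted PUE at 20.8 °C:", model.predict([[20.8]])[0])
```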

Author(s):  
Kamran Fouladi ◽  
Joseph Schaadt

The energy consumption for cooling electronic equipment in data centers using central systems is significant and will continue to rise. The present research study is motivated by the need to determine optimization strategies that improve the thermal efficiency of data centers using a simulation-based approach. Here, simulation is used to model and optimize a proposed research data center for use as an environment to test equipment and investigate best practices and strategies such as containment and hybrid cooling. The optimization technique used in this study finds the optimal operating conditions and containment strategies of the data center while meeting specific thermal conformance criteria. More specifically, the optimum supply airflow rate and temperature setpoint of the cooling units are sought under different containment configurations, including both hot aisle and cold aisle containment strategies in both full and partial setups. The results of the computational fluid dynamics (CFD) simulations indicated a lower probability of hot spots with a full hot aisle containment strategy in a data center operating at a lower supply airflow rate and a higher supply temperature setpoint. The optimization approach helped to determine a more efficient cooling system without the risk of under-provisioning. The study considered steady-state conditions with a static heat load and a fixed equipment layout. However, the generalized optimization process developed in the present study should add to the repertoire of tools presently used for the optimization of new air-cooled data centers.
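
A minimal sketch of the kind of constrained search described above, where a toy surrogate function stands in for a CFD run; the cost model, mixing model, bounds and 27 °C inlet limit are all illustrative assumptions, not values or methods from the study.

```python
# Sketch: search for a supply airflow rate and temperature setpoint that
# minimize cooling effort while meeting a thermal conformance limit.
# surrogate_max_inlet_temp() stands in for a CFD simulation; all
# coefficients are illustrative assumptions.
from scipy.optimize import minimize

def cooling_effort(x):
    airflow, supply_temp = x                    # m^3/s, °C
    return airflow * 1.2 - supply_temp * 0.05   # toy proxy for fan + chiller work

def surrogate_max_inlet_temp(x):
    airflow, supply_temp = x
    return supply_temp + 30.0 / airflow         # toy air-mixing model

LIMIT = 27.0  # assumed recommended max server inlet temperature, °C
cons = {"type": "ineq", "fun": lambda x: LIMIT - surrogate_max_inlet_temp(x)}
res = minimize(cooling_effort, x0=[8.0, 18.0],
               bounds=[(4.0, 15.0), (15.0, 27.0)], constraints=cons)
print("optimal airflow, setpoint:", res.x)
```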


Intrusion is a major threat in which an attacker gains unauthorized access to data or to a legitimate network by using a legitimate user identity or by exploiting back doors and vulnerabilities in the network. IDS mechanisms are developed to detect intrusions at various levels. The objective of this research work is to improve Intrusion Detection System performance by applying machine learning techniques based on decision trees for the detection and classification of attacks. The adopted methodology processes the datasets in three stages. Experimentation is conducted on the KDDCUP99 data sets with varying numbers of features. Three Bayesian modes are analyzed for different sized data sets based upon the total number of attacks. The time consumed by the classifier to build the model is analyzed and the accuracy is evaluated.
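
A minimal sketch of decision-tree attack classification in the spirit of the work above, using scikit-learn's bundled KDDCUP99 loader; the 10% subset, encoded feature columns and train/test split are assumptions standing in for the authors' three-stage pipeline.

```python
# Sketch: decision-tree classification of KDDCUP99 records, reporting
# model build time and accuracy as the abstract describes.
import time
from sklearn.datasets import fetch_kddcup99
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import OrdinalEncoder
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

data = fetch_kddcup99(percent10=True)        # 10% KDDCUP99 subset
X = data.data.copy()
cat_cols = [1, 2, 3]                         # protocol_type, service, flag
X[:, cat_cols] = OrdinalEncoder().fit_transform(X[:, cat_cols])
X = X.astype(float)
X_tr, X_te, y_tr, y_te = train_test_split(X, data.target, test_size=0.3)

start = time.time()
clf = DecisionTreeClassifier().fit(X_tr, y_tr)
print("build time (s):", round(time.time() - start, 2))
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```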


2017 ◽  
Author(s):  
Vinicius Da S. Segalin ◽  
Carina F. Dorneles ◽  
Mario A. R. Dantas

A well-known challenge with long-running queries in database environments is predicting how much time a query will take to execute. This prediction is relevant for several reasons. For instance, knowing that a query will take longer to execute than desired allows a resource reservation mechanism to be triggered, reserving more resources so that the query executes in a shorter time on a future request. This research work presents a proposal in which an advance reservation mechanism in a cloud database environment, built on machine learning techniques, provides resource recommendation. The proposed model is presented, together with experiments that evaluate the benefits and efficiency of this enhanced proposal.
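
A hedged sketch of the prediction-then-reservation idea: a regressor estimates runtime from coarse query features and a threshold decides whether to reserve extra resources. The feature names, sample data, model choice and SLA threshold are illustrative assumptions, not the paper's model.

```python
# Sketch: predict query execution time, then decide whether to request
# an advance resource reservation (all values are synthetic).
from sklearn.ensemble import RandomForestRegressor

# hypothetical training features: [num_joins, tables_scanned, estimated_rows]
X_train = [[0, 1, 1e4], [2, 3, 5e5], [4, 6, 2e6], [1, 2, 8e4]]
y_train = [0.4, 12.0, 95.0, 2.1]            # observed runtimes, seconds

model = RandomForestRegressor(n_estimators=50).fit(X_train, y_train)

predicted = model.predict([[3, 5, 1e6]])[0]
SLA_SECONDS = 30.0                          # assumed acceptable runtime
if predicted > SLA_SECONDS:
    print(f"~{predicted:.0f}s expected: reserve extra resources in advance")
```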


2019 ◽  
Vol 21 (3) ◽  
pp. 80-92
Author(s):  
Madhuri Gupta ◽  
Bharat Gupta

Cancer is a disease in which cells in the body grow and divide beyond control. Breast cancer is the second most common disease after lung cancer in women. Incredible advances in health sciences and biotechnology have produced a huge amount of gene expression and clinical data. Machine learning techniques are improving the early detection of breast cancer from this data. The research work carried out focuses on the application of machine learning methods, data analytic techniques, tools and frameworks in the field of breast cancer research with respect to cancer survivability, recurrence, prediction and detection. Among the machine learning techniques widely used for the detection of breast cancer are the support vector machine and the artificial neural network. The Apache Spark data processing engine is found to be compatible with most of the machine learning frameworks.
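
As a small illustration of the support vector machine approach named above, the sketch below trains an SVM on scikit-learn's bundled Wisconsin diagnostic breast cancer dataset, which stands in for the gene expression and clinical data discussed in the survey.

```python
# Sketch: SVM-based breast cancer classification on a public dataset.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))  # scale, then SVM
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```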


Author(s):  
Magesh S. ◽  
Niveditha V.R. ◽  
Rajakumar P.S. ◽  
Radha RamMohan S. ◽  
Natrayan L.

Purpose The current and ongoing coronavirus (COVID-19) pandemic has disrupted human lives all over the world, and confronting this global crisis is very difficult because the infection is transmitted by physical contact. As no vaccine or medical treatment is available to date, the only solution is to detect COVID-19 cases, block transmission, isolate the infected and protect the susceptible population. In this scenario, pervasive computing becomes essential, as it is environment-centric, and data acquisition via smart devices provides a better way of analysing diseases with various parameters. Design/methodology/approach For data collection, an infrared thermometer, Hikvision's thermographic camera and an acoustic device are deployed. Data imputation is carried out by principal component analysis. A mathematical susceptible-infected-recovered (SIR) model is implemented for classifying COVID-19 cases. A recurrent neural network (RNN) with Long Short-Term Memory (LSTM) is enacted to predict the COVID-19 disease. Findings Machine learning models are very efficient in predicting diseases. In the proposed research work, besides the contribution of smart devices, an Artificial Intelligence detector is deployed to reduce false alarms. The mathematical SIR model is integrated with machine learning techniques for better classification. Implementing the RNN with LSTM model furnishes better prediction by retaining the previous history. Originality/value The proposed research collected COVID-19 data using three types of sensors for temperature sensing and detecting the respiratory rate. After pre-processing, 300 instances were taken for the experimental results, considering the demographic features: sex, patient age, temperature, finding and clinical trials. Classification is performed using the SIR model, and 188 confirmed cases are finally predicted using the RNN with LSTM model.
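
A minimal sketch of the classical SIR dynamics underlying the classification step above; the transmission rate, recovery rate and population size are illustrative assumptions, not the paper's fitted values.

```python
# Sketch: classical SIR compartment model integrated with odeint.
import numpy as np
from scipy.integrate import odeint

def sir(y, t, beta, gamma):
    S, I, R = y
    N = S + I + R
    return [-beta * S * I / N,             # newly infected leave S
            beta * S * I / N - gamma * I,  # infections minus recoveries
            gamma * I]                     # recoveries accumulate in R

t = np.linspace(0, 120, 121)               # days
traj = odeint(sir, y0=[9990, 10, 0], t=t, args=(0.3, 0.1))  # assumed rates
print("peak simultaneous infections:", int(traj[:, 1].max()))
```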


2020 ◽  
Vol 8 (5) ◽  
pp. 4624-4627

In recent years, a lot of data has been generated about students, which can be utilized for deciding a student's career path. This paper discusses some of the machine learning techniques that can be used to predict the performance of a student and help decide his/her career path. Some of the key Machine Learning (ML) algorithms applied in our research work are Linear Regression, Logistic Regression, Support Vector Machine, Naïve Bayes Classifier and K-means Clustering. The aim of this paper is to predict the student career path using Machine Learning algorithms. We compare the efficiencies of different ML classification algorithms on a real dataset obtained from university students.
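
A short sketch comparing some of the classifiers listed above under cross-validation; the synthetic dataset stands in for the real university data, and Linear Regression and K-means are omitted here since they are not classification algorithms.

```python
# Sketch: cross-validated comparison of several classifiers on
# synthetic stand-in data for the student dataset.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=300, n_features=8, random_state=1)
models = {"LogisticRegression": LogisticRegression(max_iter=1000),
          "SVM": SVC(),
          "NaiveBayes": GaussianNB()}
for name, m in models.items():
    print(name, cross_val_score(m, X, y, cv=5).mean().round(3))
```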


2019 ◽  
Vol 8 (4) ◽  
pp. 11704-11707

Cardiac Arrhythmia is a condition in which a human being suffers from an abnormal heart rhythm. It is caused by the malfunctioning of the electrical impulses that coordinate the heartbeat; when this happens, the heart beats too slowly, too fast or irregularly. The rhythm of the heart is controlled by a major node called the sinus node, present at the top of the heart, which triggers the electrical pulses that make the heart beat and pump blood to the body. Some of the symptoms of Cardiac Arrhythmia are fainting, unconsciousness, shortness of breath and unexpected functioning of the heart, and it can lead to death within minutes if medical attention is not provided. To diagnose it, doctors need to study heart recordings and evaluate heartbeats from different parts of the body accurately, which takes a lot of time. Based on the research work contributed to this field, we try to propose a different approach. In this paper, we compare the machine learning techniques and algorithms proposed by different authors, all of whom used the same ECG recordings from the same MIT-BIH database, and examine the advantages and disadvantages of each system in order to put a new system in place of the existing ones. Our initial research found that Phonocardiogram (PCG) recordings provide higher fidelity and accuracy than ECG recordings. As the initial stage of this work, we take a PCG recordings dataset, convert each recording to a spectrogram image and apply a convolutional neural network to predict whether the heartbeat is normal or abnormal.
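
A hedged sketch of the proposed pipeline: convert a PCG recording to a spectrogram and pass it through a small convolutional neural network. The sampling rate, random stand-in signal and network shape are illustrative assumptions; a real run would load labelled PCG recordings and train the network.

```python
# Sketch: PCG signal -> spectrogram -> CNN normal/abnormal score.
import numpy as np
import tensorflow as tf
from scipy.signal import spectrogram

fs = 2000                                    # assumed PCG sampling rate, Hz
pcg = np.random.randn(10 * fs)               # stand-in for a loaded recording
_, _, spec = spectrogram(pcg, fs=fs, nperseg=256)
x = np.log1p(spec)[np.newaxis, ..., np.newaxis]  # shape (1, freq, time, 1)

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=x.shape[1:]),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # normal vs abnormal
])
model.compile(optimizer="adam", loss="binary_crossentropy")
print("abnormal probability (untrained demo):", float(model(x)[0, 0]))
```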


Author(s):  
Deepika T. ◽  
Prakash P.

The flourishing development of the cloud computing paradigm provides several services in the industrial business world. Power consumption by cloud data centers is one of the crucial issues for service providers in the domain of cloud computing. Given the rapid technology enhancements in cloud environments and the ongoing augmentation of data centers, power utilization in data centers is expected to grow unabated. A diverse set of numerous connected devices, engaged with the ubiquitous cloud, results in unprecedented power utilization by the data centers, accompanied by increased carbon footprints. Nearly a million physical machines (PMs) are running across data centers, along with 5 to 6 million virtual machines (VMs). In the next five years, the power needs of this domain are expected to spiral up to 5% of global power production. Reducing virtual machine power consumption diminishes the PMs' power draw; however, data center power consumption still changes year by year, so prediction methods are needed to aid cloud vendors. Sudden fluctuations in power utilization can cause power outages in cloud data centers. This paper aims to forecast VM power consumption with the help of regressive predictive analysis, one of the Machine Learning (ML) techniques. The approach makes predictions of future values using a Multi-layer Perceptron (MLP) regressor, which provides 91% accuracy during the prediction process.
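
A minimal sketch of regressive prediction with an MLP in the spirit of the approach above; the utilization features, synthetic power readings and network size are assumptions, not the authors' traces, so the resulting score will not match the reported 91%.

```python
# Sketch: forecast VM power draw from utilization features with an
# MLP regressor (synthetic data).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
util = rng.uniform(0, 1, size=(500, 3))      # [cpu, memory, disk] utilization
power = 120 + 80 * util[:, 0] + 15 * util[:, 1] + rng.normal(0, 3, 500)  # watts

X_tr, X_te, y_tr, y_te = train_test_split(util, power, random_state=0)
mlp = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000).fit(X_tr, y_tr)
print("R^2 on held-out VMs:", round(mlp.score(X_te, y_te), 3))
```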

