Process Optimization of Big-Data Cloud Centre Using Nature Inspired Firefly Algorithm and K-Means Clustering

Over the last decade, big data has grown immensely in information technology. Big data can inform the decisions a company or business must make, but it also brings many challenges: because of its sheer size and volume, it is very difficult to store, process, and mine. Cloud computing is a boon here, offering vast storage capacity for big data together with tremendous processing power. Still, continuously processing large amounts of data across the thousands of interconnected servers of a big-data cloud centre is challenging, and the day-by-day growth of big data forces such centres to improve Quality of Service (QoS) metrics such as throughput, latency, and response time. Developing an optimal data-processing method is therefore a current research problem. The main aim of this paper is to develop an application that maximizes throughput and minimizes latency and response time. Toward this, we develop an optimization technique that combines the nature-inspired firefly optimization algorithm with k-means clustering (FA-KMeans). The method is evaluated against state-of-the-art algorithms, and the experimental results show that it improves throughput and reduces latency and response time.
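A minimal sketch of the core idea, assuming the firefly algorithm searches over candidate k-means centroid sets and brightness corresponds to lower within-cluster scatter; all function names and parameter values below are illustrative assumptions, not the authors' implementation:

```python
# Firefly search over candidate centroid sets, minimizing within-cluster
# sum of squares; a sketch, not the paper's FA-KMeans code.
import numpy as np

def inertia(centroids, data):
    # Sum of squared distances from each point to its nearest centroid.
    d = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
    return (d.min(axis=1) ** 2).sum()

def firefly_kmeans(data, k, n_fireflies=20, n_iter=100,
                   beta0=1.0, gamma=1.0, alpha=0.1, seed=0):
    rng = np.random.default_rng(seed)
    n, dim = data.shape
    # Each firefly encodes k centroids, flattened into one vector.
    flies = data[rng.integers(0, n, size=(n_fireflies, k))].reshape(n_fireflies, -1)
    light = np.array([inertia(f.reshape(k, dim), data) for f in flies])
    for _ in range(n_iter):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if light[j] < light[i]:  # j is brighter (lower inertia)
                    r2 = ((flies[i] - flies[j]) ** 2).sum()
                    beta = beta0 * np.exp(-gamma * r2)
                    flies[i] += (beta * (flies[j] - flies[i])
                                 + alpha * rng.standard_normal(flies[i].shape))
                    light[i] = inertia(flies[i].reshape(k, dim), data)
    return flies[light.argmin()].reshape(k, dim)

data = np.random.default_rng(1).random((200, 2))
print(firefly_kmeans(data, k=3))
```

Seeding clustering this way trades extra objective evaluations for a lower risk of converging to a poor local minimum of the k-means objective.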

2018 ◽  
Vol 18 (2) ◽  
pp. 98-111
Author(s):  
M. Senthilkumar

Abstract In modern times there is an increasing number of applications that handle Big Data, yet working with Big Data remains extremely difficult, and the MapReduce framework has consequently received serious attention. The aim of this study is task scheduling over Big Data using Hadoop. First, tasks are prioritized with the k-means clustering algorithm; the MapReduce framework is then employed, and in the map phase the available resources are selected optimally using an optimization technique. The proposed method combines the Firefly Algorithm and the Bat Algorithm (FFABAT) to choose the optimal resource with the minimum cost value. The bat-inspired algorithm is a meta-heuristic optimization method developed by Xin-She Yang (2010), based on the echolocation behaviour of microbats with varying pulse rates of emission and loudness. Finally, the tasks are scheduled on the optimal resource in the reduce phase and stored in the cloud. The performance of the algorithm is analysed in terms of total cost, time, and memory utilization.
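The paper's FFABAT internals are not given here; the following is a hedged sketch of the bat-algorithm half (Yang, 2010) minimizing a toy resource-cost function, with the cost model and all coefficients assumed for illustration:

```python
# Bat algorithm sketch: frequency-tuned velocities, a loudness-gated
# acceptance rule, and occasional local walks around the best bat.
import numpy as np

def cost(x):
    # Toy cost: weighted sum of normalized CPU price and expected latency.
    return 0.6 * x[0] ** 2 + 0.4 * (x[1] - 0.5) ** 2

def bat_search(dim=2, n_bats=15, n_iter=200, fmin=0.0, fmax=2.0,
               loudness=0.9, pulse_rate=0.5, seed=0):
    rng = np.random.default_rng(seed)
    pos = rng.random((n_bats, dim))
    vel = np.zeros((n_bats, dim))
    fit = np.apply_along_axis(cost, 1, pos)
    best = pos[fit.argmin()].copy()
    for _ in range(n_iter):
        for i in range(n_bats):
            freq = fmin + (fmax - fmin) * rng.random()
            vel[i] += (pos[i] - best) * freq
            cand = pos[i] + vel[i]
            if rng.random() > pulse_rate:      # local walk near the best bat
                cand = best + 0.01 * rng.standard_normal(dim)
            if cost(cand) < fit[i] and rng.random() < loudness:
                pos[i], fit[i] = cand, cost(cand)
                if fit[i] < cost(best):
                    best = pos[i].copy()
    return best, cost(best)

print(bat_search())
```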


Author(s):  
Shalin Eliabeth S. ◽  
Sarju S.

Big data privacy preservation is one of the most troubling issues in today's industry. Data privacy problems can go unnoticed when input data is published in a cloud environment; privacy preservation in Hadoop involves hiding the input dataset while publishing it to the distributed environment. This paper investigates the problem of big data anonymization for privacy preservation from the perspectives of scalability, processing time, and related factors. At present, many cloud applications that anonymize big data face the same kinds of problems. To address them, we introduce a data anonymization algorithm, Two-Phase Top-Down Specialization (TPTDS), implemented in Hadoop. For the anonymization, 45,222 adult records with 15 attribute values were taken as the input big data. Using multidimensional anonymization in the MapReduce framework, the proposed TPTDS algorithm was implemented in Hadoop, increasing the efficiency of the big data processing system. Experiments on Hadoop in both one-dimensional and multidimensional MapReduce frameworks showed that multidimensional anonymization of the input adult dataset gives the better result, as reflected in the better IGPL (information gain per privacy loss) values generated by the algorithm; datasets are generalized in a top-down manner. The anonymization is performed by specialization operations on a taxonomy tree. The experiments show that the solution improves the IGPL values and the anonymity parameter, and decreases the execution time of big data privacy preservation compared to the existing algorithm, which should make it well suited to distributed environments.
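To make the specialization loop concrete, here is a deliberately tiny single-attribute, single-machine sketch; the real TPTDS runs two MapReduce phases over Hadoop and ranks candidate specializations by IGPL, while the taxonomy, records, and selection rule below are simplified assumptions:

```python
# Top-down specialization: start fully generalized, specialize a node only
# while k-anonymity still holds. One attribute, in-memory, for illustration.
from collections import Counter

taxonomy = {"Any": ["Europe", "Asia"],
            "Europe": ["France", "Italy"],
            "Asia": ["India", "Japan"]}
records = ["France", "France", "Italy", "India", "India", "Japan"]
K = 2  # k-anonymity threshold

def generalize(value, allowed):
    # Map a leaf value up to whichever currently-allowed value covers it.
    for v in allowed:
        stack = [v]
        while stack:
            node = stack.pop()
            if node == value:
                return v
            stack.extend(taxonomy.get(node, []))
    return value

def anonymity(allowed):
    counts = Counter(generalize(r, allowed) for r in records)
    return min(counts.values())

allowed = ["Any"]
while True:
    candidates = []
    for v in list(allowed):
        if v in taxonomy:  # v can still be specialized
            trial = [x for x in allowed if x != v] + taxonomy[v]
            if anonymity(trial) >= K:
                candidates.append(trial)
    if not candidates:
        break
    # A full TPTDS would rank candidates by IGPL; here we take the first.
    allowed = candidates[0]

print(allowed, [generalize(r, allowed) for r in records])
```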


Author(s):  
Matteo Zavatteri ◽  
Carlo Combi ◽  
Luca Viganò

Abstract A current research problem in the area of business process management deals with the specification and checking of constraints on the resources (e.g., users, agents, autonomous systems) allowed to be committed to the execution of specific tasks. Indeed, in many real-world situations, role assignments are not enough to assign tasks to suitable resources; further requirements may need to be specified and satisfied. For example, one might want to avoid assigning employees who are relatives to a set of critical tasks in the same process, in order to prevent fraud. The formal specification of a business process and its related access-control constraints is obtained by decorating a classic business process with roles, users, and constraints on their commitment. Such a process thus specifies a set of tasks that must be executed by authorized users with respect to some partial order, in a way that satisfies all authorization constraints. Controllability refers here to the capability of executing the process while satisfying all these constraints, even when some process components, e.g., gateway conditions, can only be observed, not decided, by the process engine responsible for the execution. In this paper, we propose conditional constraint networks with decisions (CCNDs) as a model to encode business processes that involve access control and conditional branches that may be both controllable and uncontrollable. We define weak, strong, and dynamic controllability of CCNDs as two-player games, classify their computational complexity, and discuss strategy-synthesis algorithms. We provide an encoding of the business processes considered here into CCNDs, so that their strategy-synthesis algorithms can be exploited off the shelf. We introduce Zeta, a tool for checking controllability of CCNDs, synthesizing execution strategies, and executing controllable CCNDs, with support for user interactivity. We use Zeta to compare against previous research, provide a new experimental evaluation for CCNDs, and discuss limitations.
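As a concrete illustration of the access-control side only, the sketch below brute-forces a user-task assignment under role and separation-of-duty constraints; real CCND controllability is a game against uncontrollable observations and conditional branches, and all names and data here are invented:

```python
# Brute-force a user-task assignment satisfying authorization constraints.
from itertools import product

tasks = ["approve", "pay", "audit"]
authorized = {"approve": {"ann", "bob"},   # role-based authorizations
              "pay": {"bob", "carl"},
              "audit": {"ann", "carl"}}
relatives = {frozenset({"bob", "carl"})}   # pairs that must be separated

def satisfies(assign):
    # Separation of duty: no user does both "approve" and "pay",
    # and relatives must not share the critical pair ("pay", "audit").
    if assign["approve"] == assign["pay"]:
        return False
    if frozenset({assign["pay"], assign["audit"]}) in relatives:
        return False
    return True

for combo in product(*(authorized[t] for t in tasks)):
    assign = dict(zip(tasks, combo))
    if satisfies(assign):
        print("feasible:", assign)
        break
else:
    print("no feasible assignment")
```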


Energies ◽  
2021 ◽  
Vol 14 (15) ◽  
pp. 4649
Author(s):  
İsmail Hakkı ÇAVDAR ◽  
Vahit FERYAD

One of the basic conditions for the successful implementation of energy demand-side management (EDM) in smart grids is the monitoring of different loads with an electrical load monitoring system. Energy and sustainability concerns present a multitude of issues that can be addressed using data mining and machine learning approaches; however, the lack of publicly available datasets makes such problems cumbersome to resolve. In this study, we first designed an efficient energy disaggregation (ED) model and evaluated it on publicly available benchmark data from the Residential Energy Disaggregation Dataset (REDD), and then aimed to advance ED research in smart grids using the Turkey Electrical Appliances Dataset (TEAD), which contains household electricity usage data. The TEAD was, in addition, evaluated using the proposed ED model tested on the benchmark REDD data. An Internet of Things (IoT) architecture with sensors and Node-RED software installations was established to collect the data for this research. In the context of smart metering, a nonintrusive load monitoring (NILM) model was designed to classify household appliances according to the TEAD data. We introduce a highly accurate supervised ED, designed to raise customer awareness and generate feedback on demand without the need for smart sensors. It is also cost-effective, maintainable, and easy to install; it does not require much space, and it can be trained to monitor multiple devices. We propose an efficient BERT-NILM tuned by a new adaptive gradient descent with exponential long-term memory (Adax), using a deep learning (DL) architecture based on bidirectional encoder representations from transformers (BERT). In this paper, an improved training function was designed specifically for tuning NILM neural networks. We adapted the Adax optimization technique to the ED field and learned sequence-to-sequence patterns. With the updated training function, BERT-NILM outperformed state-of-the-art adaptive moment estimation (Adam) optimization across various metrics on the REDD dataset; lastly, we evaluated the TEAD dataset using BERT-NILM training.
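As a hedged sketch of what a transformer-based NILM regressor can look like (this is not the paper's BERT-NILM or its Adax optimizer; shapes and hyperparameters are assumptions), consider:

```python
# Transformer-encoder NILM sketch: aggregate power in, per-appliance
# power out, framed as sequence-to-sequence regression.
import torch
import torch.nn as nn

class NilmTransformer(nn.Module):
    def __init__(self, seq_len=480, d_model=64, n_heads=4, n_layers=2,
                 n_appliances=5):
        super().__init__()
        self.embed = nn.Linear(1, d_model)            # scalar watts -> d_model
        self.pos = nn.Parameter(torch.zeros(1, seq_len, d_model))
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, n_appliances)  # per-step regression

    def forward(self, x):                  # x: (batch, seq_len, 1)
        h = self.embed(x) + self.pos
        return self.head(self.encoder(h))  # (batch, seq_len, n_appliances)

model = NilmTransformer()
aggregate = torch.randn(8, 480, 1)          # a fake mains-power batch
loss = nn.functional.mse_loss(model(aggregate), torch.randn(8, 480, 5))
loss.backward()   # in practice, step an optimizer (Adam here, Adax in the paper)
```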


Author(s):  
Patrick Nwafor ◽  
Kelani Bello

Well placement is a well-known technique in the oil and gas industry for production optimization, and placement methods are generally classified into local and global methods; simulation software is often deployed under the direct optimization technique known as the global method. The focus of this work was the production optimization of the L-X field, which is at the primary recovery stage with five producing wells, and the attempt was to optimize the L-X field using a well placement technique. Local methods are generally very efficient and require only a few forward simulations, but they can get stuck in a local optimum. Global methods avoid this problem but require many forward simulations; with simulator software available, this burden can be reduced, so the direct optimization method was used. After optimization, an increase in recovery factor of over 20% was achieved. The results provide an improvement when compared with other existing methods from the literature.
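A sketch of the global-method loop under stated assumptions: the expensive reservoir-simulator run is faked with a smooth recovery-factor surface, and all coordinates and counts are illustrative:

```python
# Global well-placement search against a stand-in objective. Each trial
# corresponds to one forward simulation, the costly step in practice.
import random

def recovery_factor(x, y):
    # Placeholder for a forward simulation run.
    return 1.0 - ((x - 30.0) ** 2 + (y - 70.0) ** 2) / 10_000.0

def global_search(n_trials=500, grid=100, seed=42):
    rng = random.Random(seed)
    best_xy, best_rf = None, float("-inf")
    for _ in range(n_trials):
        x, y = rng.uniform(0, grid), rng.uniform(0, grid)
        rf = recovery_factor(x, y)
        if rf > best_rf:
            best_xy, best_rf = (x, y), rf
    return best_xy, best_rf

print(global_search())
```

Unlike a gradient-based local method, this loop cannot get trapped near its starting point, at the price of many more simulator calls.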


Author(s):  
Dhaval Desai ◽  
Jiang Zhou

In a world where the growing demand for energy-efficient systems is probably the most stringent design constraint, the trend in engineering research in recent years has been to optimize existing technologies rather than implement new ones. The present work addresses a robust axial-fan design technique developed using an optimization approach. A fan is indispensable equipment for primary and local ventilation in the mining industry, where high working efficiency and low noise are always pursued. In this paper, an optimization method based on blade element theory is developed to improve the aerodynamic properties of the fan. A new type of fan for local ventilation is designed with the aid of a computer, and it is shown that the new design achieves an efficiency of up to 88%. Numerical analysis is also conducted to validate the optimized design.
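A toy blade-element sweep, assuming a linear-lift aerodynamic model and invented constants (not the paper's design code), illustrates how element forces integrate into an efficiency objective that can then be searched:

```python
# Integrate thrust and torque over blade elements, then grid-search a
# twist angle for peak propulsive efficiency. Purely illustrative physics.
import numpy as np

rho, V, omega = 1.2, 10.0, 150.0          # air density, axial inflow, rad/s
r = np.linspace(0.05, 0.3, 30)            # element radii (m)
chord, dr = 0.04, r[1] - r[0]

def efficiency(twist_deg):
    twist = np.radians(twist_deg)
    phi = np.arctan2(V, omega * r)        # inflow angle per element
    alpha = twist - phi                   # angle of attack
    w2 = V**2 + (omega * r) ** 2          # relative speed squared
    cl, cd = 2 * np.pi * alpha, 0.01 + 0.02 * alpha**2
    dL = 0.5 * rho * w2 * chord * cl * dr
    dD = 0.5 * rho * w2 * chord * cd * dr
    thrust = np.sum(dL * np.cos(phi) - dD * np.sin(phi))
    torque = np.sum((dL * np.sin(phi) + dD * np.cos(phi)) * r)
    return thrust * V / (torque * omega)  # propulsive efficiency

twists = np.arange(5.0, 40.0, 0.5)
best = max(twists, key=efficiency)
print(best, efficiency(best))
```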


2021 ◽  
Vol 30 (2) ◽  
pp. 354-364
Author(s):  
Firas Al-Mashhadani ◽  
Ibrahim Al-Jadir ◽  
Qusay Alsaffar

In this paper, the proposed method is intended to improve optimization for the classification problem in machine learning. The EKH is a global-search optimization method that maintains the best representation of the solution (the krill individual), while using simulated annealing (SA) to modify the generated krill individuals (each individual represents a set of bits). The test results show that the EKH outperforms other methods on both external and internal evaluation measures.
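A rough sketch of the hybrid idea, assuming bit-string krill and a Metropolis acceptance rule; the one-max fitness and cooling schedule are illustrative stand-ins for the paper's clustering objective:

```python
# Population of bit-string "krill", with simulated annealing accepting or
# rejecting single-bit-flip modifications as the temperature cools.
import math
import random

rng = random.Random(0)
N_BITS, N_KRILL, N_ITER = 32, 10, 300

def fitness(bits):                  # toy objective: count of set bits
    return sum(bits)

krill = [[rng.randint(0, 1) for _ in range(N_BITS)] for _ in range(N_KRILL)]
temp = 2.0
for step in range(N_ITER):
    for ind in krill:
        i = rng.randrange(N_BITS)
        old = fitness(ind)
        ind[i] ^= 1                 # SA move: flip one bit
        delta = fitness(ind) - old
        # Metropolis rule: keep improvements, sometimes keep worse moves.
        if delta < 0 and rng.random() >= math.exp(delta / temp):
            ind[i] ^= 1             # revert
    temp *= 0.99                    # cool down

print(max(fitness(ind) for ind in krill))
```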


2019 ◽  
Vol 2 ◽  
pp. 1-6
Author(s):  
Wenjuan Lu ◽  
Aiguo Liu ◽  
Chengcheng Zhang

Abstract. With the development of geographic information technology, the ways of acquiring geographical information keep multiplying, spatiotemporal data are exploding, and more and more scholars are entering the field of data processing and spatiotemporal analysis. Traditional data visualization techniques are popular, simple, and easy to understand: simple pie charts and histograms can reveal and analyse the characteristics of the data themselves, but they still cannot be combined well with maps to display the hidden temporal and spatial information and realize its application value. How to fully explore the spatiotemporal information contained in massive data and accurately reveal the spatial distribution and variation rules of geographical things and phenomena is a key research problem at present. On this basis, this paper designs and builds a universal thematic-data visual analysis system supporting the full range of functions: data warehousing, data management, data analysis, and data visualization. Taking Weifang city as the research area and starting from rainfall interpolation analysis and comprehensive population analysis of Weifang, the system achieves fast and efficient display over big data sets and fully presents the characteristics of spatiotemporal data through thematic visualizations. The research also adopts the Cassandra distributed database, which can store, manage, and analyse big data; to a certain extent this reduces the pressure of front-end map drawing and offers good query efficiency and fast processing.
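On the storage side, a hedged sketch of pulling one station's rainfall series from Cassandra with the DataStax Python driver; the keyspace, table, and column names are invented for illustration, and a reachable local node is assumed:

```python
# Query a rainfall time series for one station from Cassandra.
from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])            # assumes a local Cassandra node
session = cluster.connect("weifang_thematic")  # hypothetical keyspace

rows = session.execute(
    "SELECT obs_time, rainfall_mm FROM rainfall "
    "WHERE station_id = %s AND obs_time >= %s",
    ("WF-012", "2019-01-01"))                  # hypothetical station id
series = [(row.obs_time, row.rainfall_mm) for row in rows]

cluster.shutdown()
```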


2021 ◽  
Vol 2113 (1) ◽  
pp. 012022
Author(s):  
Chao Sun

Abstract In this paper, the feeding process is treated as impulsive and the time delay in the fermentation process is taken into account. A robust model is established with the time-delay system serving as both the control variable and the constraint. To solve this optimal control problem, we propose a particle swarm optimization method. Numerical results show that the 1,3-PD yield at the terminal time increases compared with the experimental result.
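A generic particle swarm sketch under stated assumptions: the sphere objective stands in for the fermentation cost, since the paper's time-delay model is not reproduced here:

```python
# Standard PSO: inertia plus cognitive and social pulls toward the
# personal and global bests.
import numpy as np

def objective(x):
    return np.sum(x ** 2)          # stand-in for the fermentation cost

def pso(dim=4, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.apply_along_axis(objective, 1, x)
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        f = np.apply_along_axis(objective, 1, x)
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, objective(gbest)

print(pso())
```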


2021 ◽  
Author(s):  
Ramyar Rashed Mohassel

With the introduction of new technologies, concepts, and approaches in power transmission, distribution, and utilization, such as Smart Grids (SG), Advanced Metering Infrastructures (AMI), Distributed Energy Resources (DER), and Demand Side Management (DSM), new capabilities have emerged that enable efficient use and management of power consumption. These capabilities apply at the micro level, in households and building complexes, as well as at the macro level, in the form of resource- and revenue-management initiatives for utility providers. Meanwhile, the integration of Information Technology (IT) and instrumentation has brought Building Management Systems (BMS) into our homes and has made it possible for ordinary users to take advantage of more complex and sophisticated energy- and cost-management features as an integral part of their BMS. The idea of combining the capabilities and advantages offered by SG, AMI, DER, DSM, and BMS is the backbone of this thesis and has resulted in a unique two-level optimization method for effective deployment of DSM in households and residential neighbourhoods. The work consists of an optimization algorithm for households that maximizes the utilization of DER, as the lower level of the envisioned two-level optimization technique, while using a customized game-theoretic optimization of utility providers' revenue in a residential neighbourhood as the upper level. This work also introduces a power management unit, called the Load Moderation Center (LMC), to host the developed optimization algorithms as an integrated part of a BMS. The LMC, upon successful completion, will be able to plan consumption automatically, effectively utilize available sources including the grid, renewable energies, and storage, and eliminate the need for residents to manually program their BMS for different market scenarios.
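A minimal sketch of the lower (household) level only, assuming an illustrative tariff, solar profile, and one shiftable load; the upper game-theoretic level is not modeled here:

```python
# Shift a flexible load (e.g., an EV charge) to the cheapest hours after
# local solar generation is netted against the inflexible base load.
HOURS = 24
tariff = [0.10] * 7 + [0.20] * 4 + [0.15] * 6 + [0.25] * 4 + [0.10] * 3
solar = [0.0] * 6 + [0.5, 1.0, 1.5, 2.0, 2.0, 2.0,
                     2.0, 1.5, 1.0, 0.5] + [0.0] * 8   # kW per hour
base_load = [1.0] * HOURS                              # kW, inflexible
flexible_kwh, flexible_hours = 6.0, 3                  # shiftable block

def marginal_cost(h):
    # Grid price for one extra kW in hour h, zero if spare solar covers it.
    spare_solar = max(0.0, solar[h] - base_load[h])
    return 0.0 if spare_solar >= flexible_kwh / flexible_hours else tariff[h]

best_hours = sorted(range(HOURS), key=marginal_cost)[:flexible_hours]
cost = sum(marginal_cost(h) * flexible_kwh / flexible_hours
           for h in best_hours)
print(sorted(best_hours), round(cost, 2))
```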

