FLNL: Fuzzy entropy and lion neural learner for EDoS attack mitigation in cloud computing

Author(s):  
Sukhada Bhingarkar ◽  
Deven Shah

Cloud computing is a technology that allows end-users to access the network through a shared pool of resources. As the demand for cloud computing increases, vulnerabilities in service provision also increase. Economic Denial of Sustainability (EDoS) is one such attack: it exploits the provider's usage-based billing, financially affecting the organizations that use the cloud. This paper utilizes fuzzy entropy and a lion neural learner (FLNL) to classify cloud users and thereby mitigate EDoS attacks in the cloud. The technique includes a training phase that creates a log file from various parameters and then transforms the features into a database based on certain key features. There are two important stages in this classification approach: feature selection and classification. Here, the fuzzy entropy function is utilized for feature selection, which effectively selects useful features without information loss. Classification is performed using the lion neural learner (LNL), which incorporates the Lion algorithm (LA) into a neural network and uses the Levenberg–Marquardt (LM) algorithm. The experimental results show that the proposed FLNL is effective, with 89% precision, 78% recall, and 83.13% f-measure, compared with the existing Naïve Bayes (NB), Neural [Formula: see text] Propagation [Formula: see text], and Neural [Formula: see text]–Marquardt [Formula: see text] classifiers.
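As a rough illustration of the feature-selection stage described above, the sketch below ranks log-file features by a De Luca–Termini-style fuzzy entropy and keeps the most informative ones. The membership function, ranking direction, number of retained features, and synthetic data are assumptions for illustration only, not the authors' implementation.

```python
import numpy as np

def fuzzy_membership(x):
    """Map one feature column to [0, 1] with a simple min-max membership (assumption)."""
    x = np.asarray(x, dtype=float)
    rng = x.max() - x.min()
    return (x - x.min()) / rng if rng > 0 else np.zeros_like(x)

def fuzzy_entropy(x, eps=1e-12):
    """De Luca-Termini style fuzzy entropy of one feature column."""
    mu = fuzzy_membership(x)
    h = -(mu * np.log(mu + eps) + (1 - mu) * np.log(1 - mu + eps))
    return h.mean()

def select_features(X, keep=5):
    """Rank features by fuzzy entropy and keep the `keep` most informative ones.
    Here lower entropy is treated as crisper, more useful; the direction is an assumption."""
    scores = np.array([fuzzy_entropy(X[:, j]) for j in range(X.shape[1])])
    return np.argsort(scores)[:keep]

# Example: feature vectors extracted from cloud access log records (synthetic data)
X = np.random.rand(200, 12)
selected = select_features(X, keep=5)
print("selected feature indices:", selected)
```

The selected columns would then be fed to the classifier (the LNL stage), which is outside the scope of this sketch.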

2014 ◽  
Vol 13 (9) ◽  
pp. 5011-5014
Author(s):  
Vikas Lonare ◽  
Prof. J.N. Nandimath

Cloud computing is used to provide scalable services that are easily consumed over the Internet on demand. A major feature of cloud services is that users' data are processed remotely on machines that users neither own nor operate. While using these services, users fear losing control of their own data, which may be financial, health-related, or personal. To resolve this problem, we use decentralized information accountability to keep track of the usage of users' data over the cloud. It is an object-oriented approach that encloses our logging mechanism together with users' data and applies access policies with respect to those data. We use JAR programming, which provides dynamic and traveling-object functionality, so that any access to a user's data triggers authentication and automated logging within the JAR files. Each access to a user's cloud data is recorded in a separate log file. To strengthen user control, distributed auditing functionality is also provided to track the usage of data. The proposed system also provides an authentication mechanism using external channels and logs the details of the user from which the cloud data are accessed. Only the data owner can retrieve the detailed log of his data as required. The data owner specifies the type of access to his data and, based on that, an authorized data user can access the data in the cloud environment.
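The JAR-based mechanism described above is Java-specific; the Python sketch below only illustrates the underlying idea of automated, per-access logging bound to the data. The log file path, digesting, and object name are illustrative assumptions, and the access-policy enforcement and traveling-JAR packaging from the paper are not reproduced.

```python
import hashlib
import json
import time

def logged_access(owner_log_path):
    """Decorator sketch of automated accountability logging: every access to the
    wrapped data object is appended as a separate log entry for the data owner."""
    def decorator(read_fn):
        def wrapper(user_id, *args, **kwargs):
            data = read_fn(user_id, *args, **kwargs)
            entry = {
                "user": user_id,
                "time": time.time(),
                "digest": hashlib.sha256(repr(data).encode()).hexdigest(),
            }
            with open(owner_log_path, "a") as log:
                log.write(json.dumps(entry) + "\n")
            return data
        return wrapper
    return decorator

@logged_access("owner_access.log")
def read_cloud_object(user_id, object_name="records.csv"):
    # Placeholder for the actual cloud read; object_name is illustrative.
    return f"contents of {object_name}"

print(read_cloud_object("alice"))
```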


2014 ◽  
Vol 13 (7) ◽  
pp. 4625-4632
Author(s):  
Jyh-Shyan Lin ◽  
Kuo-Hsiung Liao ◽  
Chao-Hsing Hsu

Cloud computing and cloud data storage have become important applications on the Internet. An important trend in cloud computing and cloud data storage is group collaboration since it is a great inducement for an entity to use a cloud service, especially for an international enterprise. In this paper we propose a cloud data storage scheme with some protocols to support group collaboration. A group of users can operate on a set of data collaboratively with dynamic data update supported. Every member of the group can access, update and verify the data independently. The verification can also be authorized to a third-party auditor for convenience.
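A minimal sketch of block-level verification of the kind a group member or a delegated third-party auditor might run over shared cloud data; the HMAC-tag construction, key handling, and block layout are assumptions and not the paper's actual protocols.

```python
import hmac
import hashlib

def make_tags(blocks, key):
    """Owner side: compute one HMAC tag per data block before uploading (sketch only)."""
    return [hmac.new(key, b, hashlib.sha256).hexdigest() for b in blocks]

def audit(blocks, tags, key):
    """Verifier side: any group member, or a third-party auditor holding the
    verification key, recomputes the tags and checks them against the stored ones."""
    return all(
        hmac.compare_digest(hmac.new(key, b, hashlib.sha256).hexdigest(), t)
        for b, t in zip(blocks, tags)
    )

key = b"group-verification-key"          # shared with the auditor (assumption)
blocks = [b"block-0", b"block-1", b"block-2"]
tags = make_tags(blocks, key)

print(audit(blocks, tags, key))          # True: data intact
blocks[1] = b"tampered"                  # simulate corruption in the cloud
print(audit(blocks, tags, key))          # False: verification fails
```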


2014 ◽  
Vol 1008-1009 ◽  
pp. 1513-1516
Author(s):  
Hai Na Song ◽  
Xiao Qing Zhang ◽  
Zhong Tang He

The cloud computing environment is regarded as a kind of multi-tenant computing mode. With virtualization as a supporting technology, cloud computing realizes the integration of multiple workloads on one server through the packaging and separation of virtual machines. Aiming at the contradiction between heterogeneous applications and the uniform shared resource pool, and using the idea of bin packing, the multidimensional resource scheduling problem is analyzed in this paper. We carry out example analyses of one-dimensional, two-dimensional, and three-dimensional resource scheduling. The results show that the resource utilization of cloud data centers is greatly improved when resource scheduling is conducted after rationally reorganizing the heterogeneous demands.
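A minimal sketch of the bin-packing view of multidimensional resource scheduling discussed above: workloads with (CPU, memory, bandwidth) demands are packed onto identical servers with a first-fit-decreasing heuristic. The heuristic, the dimensions, and the capacities are illustrative assumptions, not the scheduling method evaluated in the paper.

```python
def first_fit_decreasing(demands, capacity):
    """Vector bin-packing sketch: each demand is a (cpu, memory, bandwidth) tuple,
    each server ("bin") has the same capacity vector."""
    # Sort workloads by their largest normalized dimension, biggest first.
    order = sorted(demands,
                   key=lambda d: max(x / c for x, c in zip(d, capacity)),
                   reverse=True)
    servers = []     # remaining capacity of each opened server
    placement = []   # (demand, server index) pairs
    for d in order:
        for i, free in enumerate(servers):
            if all(x <= f for x, f in zip(d, free)):
                servers[i] = tuple(f - x for f, x in zip(free, d))
                placement.append((d, i))
                break
        else:
            servers.append(tuple(c - x for c, x in zip(capacity, d)))
            placement.append((d, len(servers) - 1))
    return placement, len(servers)

# Three-dimensional example: (CPU cores, GB RAM, Gbit/s)
capacity = (16, 64, 10)
demands = [(8, 16, 2), (4, 32, 1), (8, 8, 4), (2, 16, 3), (6, 24, 2)]
placement, used = first_fit_decreasing(demands, capacity)
print(f"{used} servers used")
```

Reordering the heterogeneous demands before packing, as in this sketch, is what lets consolidation raise utilization compared with scheduling requests in arrival order.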


2016 ◽  
Vol 2016 ◽  
pp. 1-15 ◽  
Author(s):  
Franco Callegati ◽  
Walter Cerroni ◽  
Chiara Contoli

The emerging Network Function Virtualization (NFV) paradigm, coupled with the highly flexible and programmatic control of network devices offered by Software Defined Networking solutions, enables unprecedented levels of network virtualization that will change the shape of future network architectures, where legacy telco central offices will be replaced by cloud data centers located at the edge. On the one hand, this software-centric evolution of telecommunications will allow network operators to take advantage of the increased flexibility and reduced deployment costs typical of cloud computing. On the other hand, it will pose a number of challenges in terms of virtual network performance and customer isolation. This paper provides insights into how an open-source cloud computing platform such as OpenStack implements multitenant network virtualization and how it can be used to deploy NFV, focusing in particular on packet forwarding performance issues. To this end, a set of experiments is presented covering a number of single-tenant and multitenant scenarios inspired by the cloud computing and NFV paradigms. The results of the evaluation highlight the potential and limitations of running NFV on OpenStack.
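For readers who want to reproduce a multitenant setting of this kind, the sketch below shows how per-tenant isolated networks might be created with the openstacksdk client. It assumes a reachable OpenStack deployment configured as "mycloud" in clouds.yaml; the network names and CIDRs are illustrative, and this is not the paper's actual test configuration.

```python
import openstack

# Connect using credentials defined under the "mycloud" entry in clouds.yaml (assumption).
conn = openstack.connect(cloud="mycloud")

def create_tenant_network(name, cidr):
    """Create an isolated tenant network plus subnet, the basic building block
    Neutron uses for multitenant network virtualization (sketch only)."""
    net = conn.network.create_network(name=name)
    conn.network.create_subnet(
        network_id=net.id,
        name=f"{name}-subnet",
        ip_version=4,
        cidr=cidr,
    )
    return net

# Two tenants, each with its own overlay network, as in the multitenant scenarios.
create_tenant_network("tenant-a-net", "10.10.1.0/24")
create_tenant_network("tenant-b-net", "10.10.2.0/24")
```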


Author(s):  
VINITHA S P ◽  
GURUPRASAD E

Cloud computing has been envisioned as the next-generation architecture of the IT enterprise. It moves application software and databases to centralized large data centers, where the management of data and services may not be fully trustworthy. This unique paradigm brings out many new security challenges, such as maintaining the correctness and integrity of data in the cloud. The integrity of cloud data may be lost due to unauthorized access, modification, or deletion of data. Availability of data may also suffer because of the cloud service provider (CSP): in order to increase its profit margin by reducing cost, a CSP may discard rarely accessed data without this being detected in a timely fashion. To overcome these issues, flexible distributed storage, token utilization, and signature creation are used to ensure the integrity of data; an auditing mechanism assists in maintaining the correctness of data and in locating and identifying the server where data has been corrupted; and dependability and availability of data are achieved through distributed storage of data in the cloud. Further, in order to ensure authorized access to cloud data, an admin module was proposed in our previous conference paper, which prevents unauthorized users from accessing data, along with a selective storage scheme based on different parameters of the cloud servers to provide efficient storage of data in the cloud. To provide more efficiency, in this paper dynamic data operations such as update, deletion, and addition of data are supported.
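A simplified sketch of the token-based auditing idea described above: the owner precomputes per-server tokens over randomly sampled blocks and later challenges the servers to locate a corrupted one. The token construction, sampling, and data are assumptions rather than the scheme used in the paper.

```python
import hashlib
import random

def precompute_tokens(server_blocks, rounds=3, seed=42):
    """Owner side: for each audit round pick a random subset of block indices and
    store one token (hash over those blocks) per server."""
    rng = random.Random(seed)
    challenges, tokens = [], []
    for _ in range(rounds):
        idx = sorted(rng.sample(range(len(server_blocks[0])), 2))
        challenges.append(idx)
        tokens.append([
            hashlib.sha256(b"".join(blocks[i] for i in idx)).hexdigest()
            for blocks in server_blocks
        ])
    return challenges, tokens

def audit_round(server_blocks, challenge, expected):
    """Auditing: recompute each server's response and report which server (if any)
    holds corrupted data, mirroring the 'locate the misbehaving server' idea."""
    corrupted = []
    for sid, blocks in enumerate(server_blocks):
        resp = hashlib.sha256(b"".join(blocks[i] for i in challenge)).hexdigest()
        if resp != expected[sid]:
            corrupted.append(sid)
    return corrupted

servers = [[b"s0b0", b"s0b1", b"s0b2"], [b"s1b0", b"s1b1", b"s1b2"]]
challenges, tokens = precompute_tokens(servers)
servers[1][1] = b"corrupted"                     # simulate data loss on server 1
for challenge, expected in zip(challenges, tokens):
    bad = audit_round(servers, challenge, expected)
    if bad:
        print("corrupted server(s):", bad)
        break
else:
    print("sampled blocks verified (corruption not covered by this spot check)")
```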


2021 ◽  
Vol 2021 ◽  
pp. 1-11
Author(s):  
Amr M. Sauber ◽  
Passent M. El-Kafrawy ◽  
Amr F. Shawish ◽  
Mohamed A. Amin ◽  
Ismail M. Hagag

The main goal of any data storage model on the cloud is to access data easily without risking its security. Security is a major consideration in any cloud data storage model, providing both safety and efficiency. In this paper, we propose a secure data protection model over the cloud. The proposed model presents a solution to some security issues of the cloud, such as protecting data from violations and from a fake authorized-identity user, which adversely affects the security of the cloud. This paper covers multiple issues and challenges of cloud computing that impair the security and privacy of data, and it presents the threats and attacks that affect data residing in the cloud. Our proposed model provides the benefits and effectiveness of security in cloud computing, such as enhanced encryption of data in the cloud, and it provides security and scalability of data sharing for users of cloud computing. The model achieves the key security functions over cloud computing, namely identification and authentication, authorization, and encryption. It also protects the system from any fake data owner who enters malicious information that may destroy the main goal of cloud services. We employ a one-time password (OTP) as a login and uploading technique to protect users and data owners from any fake unauthorized access to the cloud. We implement our model using a simulation called Next Generation Secure Cloud Server (NG-Cloud). The results strengthen the security protection of end users and data owners against fake users and fake data owners in the cloud.
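A minimal sketch of the OTP step in the proposed model, using an RFC 6238-style time-based one-time password built from the standard library; the shared secret, interval, and digit count are illustrative assumptions, and the paper's full NG-Cloud workflow is not reproduced.

```python
import hmac
import hashlib
import struct
import time

def totp(secret, interval=30, digits=6):
    """Time-based one-time password (RFC 6238-style) as a sketch of the OTP
    login/upload step; parameters are illustrative."""
    counter = int(time.time()) // interval
    msg = struct.pack(">Q", counter)
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return f"{code:0{digits}d}"

def verify(secret, submitted):
    """Server side: the cloud compares the code delivered over an external channel
    with its own computation before granting access."""
    return hmac.compare_digest(totp(secret), submitted)

shared_secret = b"per-user-secret-provisioned-at-signup"   # assumption
code = totp(shared_secret)
print("OTP sent to user:", code)
print("login accepted:", verify(shared_secret, code))
```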


Author(s):  
Robert Vrbić

Cloud computing provides a powerful, scalable, and flexible infrastructure into which previously known techniques and methods of data mining can be integrated. The result of such integration should be a robust, high-capacity platform able to deal with the increasing production of data, that is, one that creates the conditions for efficient mining of massive amounts of data from various data warehouses with the aim of producing (useful) information or new knowledge. This paper discusses such a technology, the technology of big data mining known as Cloud Data Mining (CDM).


2021 ◽  
Vol 9 (1) ◽  
pp. 41-50
Author(s):  
Ruhul Amin ◽  
Siddhartha Vadlamudi

Cloud data migration is the process of moving data, local applications, and services to a distributed cloud processing framework. The success of this migration depends on several aspects, such as planning and impact analysis of existing enterprise systems. Possibly the most widely recognized process is moving locally stored data into a public cloud computing environment. Cloud migration comes with both challenges and advantages, so a range of academic research and technical applications on data migration to the cloud is discussed throughout this paper. By analyzing research achievements and application status, we divide the existing migration techniques into three strategies according to the cloud service models. Different processes should be considered for different migration techniques, and different tasks are involved accordingly. The similarities and differences between the migration strategies are examined, and the challenges and future work on data migration to the cloud are outlined. Through a research survey, this paper identifies the key benefits and challenges of migrating data into the cloud. Different cloud migration procedures and models are recommended to assess performance, identify security requirements, choose a cloud provider, calculate the cost, and make any essential organizational changes. The results of this paper can serve as a roadmap for data migration and can help decision-makers toward a secure and productive migration to a cloud computing environment.

