DDoS Detection System: Utilizing Gradient Boosting Algorithm and Apache Spark

Author(s):  
Amjad Alsirhani ◽  
Srinivas Sampalli ◽  
Peter Bodorik
2020 ◽  
Vol 38 (5) ◽  
pp. 6527-6535
Author(s):  
Nilesh Vishwasrao Patil ◽  
C. Rama Krishna ◽  
Krishan Kumar

2021 ◽  
Vol 5 (1) ◽  
Author(s):  
Osman Mamun ◽  
Madison Wenzlick ◽  
Arun Sathanur ◽  
Jeffrey Hawk ◽  
Ram Devanathan

Abstract The Larson–Miller parameter (LMP) offers an efficient and fast scheme to estimate the creep rupture life of alloy materials for high-temperature applications; however, poor generalizability and dependence on the constant C often result in sub-optimal performance. In this work, we show that direct parameterization of rupture life, without the intermediate LMP step, can be used to train gradient boosting models that predict rupture life very accurately across a variety of alloys (Pearson correlation coefficient >0.9 for 9–12% Cr steels and >0.8 for austenitic stainless steels). In addition, Shapley values were used to quantify feature importance, making the model interpretable by identifying the effect of each feature on model performance. Finally, a variational autoencoder-based generative model, conditioned on the experimental dataset, was built to sample hypothetical synthetic candidate alloys from the learnt joint distribution that appear in neither the 9–12% Cr ferritic–martensitic alloy nor the austenitic stainless steel dataset.
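As an illustration of this modelling approach, the sketch below pairs a gradient boosting regressor with Shapley-value attribution. It is not the authors' pipeline: the data file, column names, log-life target and hyperparameters are hypothetical placeholders.

```python
# Minimal sketch (not the authors' code): direct rupture-life regression with
# gradient boosting plus Shapley-value feature attribution.
import numpy as np
import pandas as pd
import shap  # assumed available: pip install shap
from scipy.stats import pearsonr
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

# Hypothetical dataset: alloy composition (wt%), test temperature, applied stress,
# and measured rupture life in hours.
df = pd.read_csv("creep_rupture_data.csv")                    # placeholder file name
features = ["Cr", "Mo", "W", "temperature_K", "stress_MPa"]   # assumed columns
X = df[features]
y = np.log10(df["rupture_life_h"])   # log-life target is a choice of this sketch

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

model = GradientBoostingRegressor(n_estimators=500, learning_rate=0.05, max_depth=3)
model.fit(X_tr, y_tr)

# Pearson correlation between predicted and measured log rupture life
r, _ = pearsonr(model.predict(X_te), y_te)
print(f"Pearson r on held-out alloys: {r:.2f}")

# Shapley values quantify how much each feature moves an individual prediction
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_te)
shap.summary_plot(shap_values, X_te, feature_names=features)
```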


Author(s):  
Hai Tao ◽  
Maria Habib ◽  
Ibrahim Aljarah ◽  
Hossam Faris ◽  
Haitham Abdulmohsin Afan ◽  
...  

2021 ◽  
Vol 0 (0) ◽  
Author(s):  
Colin Griesbach ◽  
Benjamin Säfken ◽  
Elisabeth Waldmann

Abstract Gradient boosting from the field of statistical learning is widely known as a powerful framework for estimation and selection of predictor effects in various regression models by adapting concepts from classification theory. Current boosting approaches also offer methods accounting for random effects and thus enable prediction of mixed models for longitudinal and clustered data. However, these approaches suffer from several flaws: unbalanced effect selection with falsely induced shrinkage and a low convergence rate on the one hand, and biased estimates of the random effects on the other. We therefore propose a new boosting algorithm which explicitly accounts for the random structure by excluding it from the selection procedure, properly correcting the random-effects estimates and, in addition, providing likelihood-based estimation of the random-effects variance structure. The new algorithm offers an organic and unbiased fitting approach, as shown via simulations and data examples.
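The following stylized sketch (not the authors' algorithm) illustrates the core idea of keeping the random structure out of the component-wise selection step: fixed effects are updated by boosting, while cluster-specific random intercepts receive a separate ridge-type update. All data and tuning constants are simulated and illustrative, and the likelihood-based variance estimation of the proposed method is omitted.

```python
# Stylized sketch: component-wise L2-boosting for a random-intercept model
# y_ij = x_ij' beta + gamma_i + e_ij, with the random intercepts gamma_i kept
# out of the selection step and updated separately with shrinkage.
import numpy as np

rng = np.random.default_rng(0)

# Simulated longitudinal data: n_clusters subjects, n_obs observations each
n_clusters, n_obs, p = 50, 10, 5
cluster = np.repeat(np.arange(n_clusters), n_obs)
X = rng.normal(size=(n_clusters * n_obs, p))
beta_true = np.array([1.5, -2.0, 0.0, 0.0, 0.5])
gamma_true = rng.normal(scale=0.8, size=n_clusters)
y = X @ beta_true + gamma_true[cluster] + rng.normal(scale=1.0, size=len(cluster))

nu, lam, n_iter = 0.1, 5.0, 300   # step length, random-effect shrinkage, iterations
beta = np.zeros(p)
gamma = np.zeros(n_clusters)

for _ in range(n_iter):
    resid = y - X @ beta - gamma[cluster]   # negative gradient under squared error

    # Selection step over fixed effects only: pick the covariate whose
    # univariate least-squares fit reduces the residual sum of squares most.
    coefs = (X * resid[:, None]).sum(axis=0) / (X ** 2).sum(axis=0)
    rss = ((resid[:, None] - X * coefs) ** 2).sum(axis=0)
    j = int(np.argmin(rss))
    beta[j] += nu * coefs[j]

    # Separate random-effects update (excluded from the boosting selection):
    # ridge-type closed form gamma_i = sum_j(y_ij - x_ij' beta) / (n_i + lam),
    # i.e. shrunken cluster means of the fixed-effect residuals.
    resid_fixed = y - X @ beta
    cluster_sums = np.bincount(cluster, weights=resid_fixed, minlength=n_clusters)
    cluster_sizes = np.bincount(cluster, minlength=n_clusters)
    gamma = cluster_sums / (cluster_sizes + lam)

print("estimated fixed effects:", np.round(beta, 2))
print("true fixed effects:     ", beta_true)
```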


2021 ◽  
Vol 9 (2) ◽  
pp. 85-100
Author(s):  
Md Saikat Hosen ◽  
Ruhul Amin

In gradient boosting machines, the learning process successively fits new base models to provide an increasingly accurate approximation of the response variable. The principal idea behind the algorithm is that each new base learner is constructed to be maximally correlated with the negative gradient of the loss function evaluated on the current ensemble. The choice of loss function is essentially arbitrary; for intuition, if the loss is squared error, the learning process reduces to sequential fitting of residuals. This study aims to delineate the significance of the gradient boosting algorithm in data management systems. The article dwells on the role of gradient boosting in text classification as well as the limitations of the model. The basic methodology and the base-learning algorithm of gradient boosting, as originally formulated by Friedman, are presented, so the study may serve as an introduction to gradient boosting algorithms. Both the theoretical framework and the design choices are described and outlined, and we examine the basic stages of designing a gradient boosting model for one's experimental needs. Interpretation issues are addressed as an essential part of the investigation. Finally, the capabilities of gradient boosting algorithms are examined on a set of real-world practical applications such as text classification.
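A minimal, self-contained sketch of this residual-fitting view (illustrative only, not code from the article) is shown below, using squared-error loss and depth-one regression trees as base learners on simulated data.

```python
# Minimal illustration of Friedman-style gradient boosting with squared-error
# loss: each new base learner (here a depth-1 regression tree) is fitted to the
# current residuals, i.e. the negative gradient of the loss. Purely didactic.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(42)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.2, size=200)

learning_rate, n_rounds = 0.1, 100
prediction = np.full_like(y, y.mean())   # F_0: constant initial model
learners = []

for _ in range(n_rounds):
    residuals = y - prediction                      # negative gradient of 1/2 (y - F)^2
    stump = DecisionTreeRegressor(max_depth=1).fit(X, residuals)
    learners.append(stump)
    prediction += learning_rate * stump.predict(X)  # F_m = F_{m-1} + nu * h_m

print("training MSE:", round(float(np.mean((y - prediction) ** 2)), 4))
```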


2019 ◽  
pp. 1952-1983
Author(s):  
Pourya Shamsolmoali ◽  
Masoumeh Zareapoor ◽  
M.Afshar Alam

Distributed Denial of Service (DDoS) attacks have become a serious threat to internet security and Cloud Computing environments. This kind of attack is the most complex form of DoS (Denial of Service) attack. Such an attack can easily spoof its source address, disguising the real location of the attacker in a way that defense methods cannot resolve. DDoS attacks are therefore among the most significant challenges for network security. In this chapter we present different aspects of security in Cloud Computing, concentrating mostly on DDoS attacks. The authors illustrate all types of DoS attacks and discuss the most effective detection methods.


2019 ◽  
Vol 47 (W1) ◽  
pp. W136-W141 ◽  
Author(s):  
Emidio Capriotti ◽  
Ludovica Montanucci ◽  
Giuseppe Profiti ◽  
Ivan Rossi ◽  
Diana Giannuzzi ◽  
...  

Abstract As the amount of genomic variation data increases, tools that are able to score the functional impact of single nucleotide variants become more and more necessary. While there are several prediction servers available for interpreting the effects of variants in the human genome, only a few have been developed for other species, and none were specifically designed for species of veterinary interest such as the dog. Here, we present Fido-SNP, the first predictor able to discriminate between pathogenic and benign single-nucleotide variants in the dog genome. Fido-SNP is a binary classifier based on the gradient boosting algorithm. It is able to classify and score the impact of variants in both coding and non-coding regions based on sequence features within seconds. When validated on a previously unseen set of annotated variants from the OMIA database, Fido-SNP reaches 88% overall accuracy, 0.77 Matthews correlation coefficient and 0.91 Area Under the ROC Curve.
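For context, the sketch below shows how a gradient boosting binary classifier is typically evaluated with the metrics reported above (accuracy, Matthews correlation coefficient, ROC AUC). The synthetic data merely stands in for sequence-derived variant features and is unrelated to Fido-SNP's actual training set or pipeline.

```python
# Hedged sketch: evaluating a gradient boosting binary classifier with the
# metrics reported for Fido-SNP, on synthetic stand-in data.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score, matthews_corrcoef, roc_auc_score
from sklearn.model_selection import train_test_split

# Stand-in for sequence-derived features of annotated variants (not real data)
X, y = make_classification(n_samples=2000, n_features=20, weights=[0.6, 0.4],
                           random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)

clf = GradientBoostingClassifier(n_estimators=300, learning_rate=0.05)
clf.fit(X_tr, y_tr)

pred = clf.predict(X_te)            # hard class labels for accuracy and MCC
prob = clf.predict_proba(X_te)[:, 1]  # class probabilities for ROC AUC

print(f"accuracy: {accuracy_score(y_te, pred):.2f}")
print(f"MCC:      {matthews_corrcoef(y_te, pred):.2f}")
print(f"ROC AUC:  {roc_auc_score(y_te, prob):.2f}")
```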

