Hybrid vs Ensemble Classification Models for Phishing Websites

2020, pp. 3387-3396
Author(s): Sakinat Oluwabukonla Folorunso, Femi Emmanuel Ayo, Khadijah-Kuburah Adebisi Abdullah, Peter Ibikunle Ogunyinka

Phishing is an internet crime in which a host's legitimate website is imitated in order to steal confidential information. Many researchers have developed phishing classification models, but these are often limited in real-time performance and computational efficiency. This paper presents an ensemble learning model that combines DTree and NBayes via the STACKING method, with DTree as the base learner. The aim is to combine the simplicity and effectiveness of DTree with the lower time complexity of NBayes. The models were trained and evaluated independently, and during testing the class probabilities of each model were averaged, weighted by its accuracy on the training data. The results of the empirical study on a phishing website dataset suggest that the ensemble model significantly outperformed the hybrid model on the measures used. Overall, the DTree and STACKING methods showed superior performance compared to the other models.
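As an illustration of the probability-averaging step this abstract describes, here is a minimal Python sketch that weights the class probabilities of a decision tree and a naive Bayes classifier by their accuracy on the training data; the scikit-learn estimators, synthetic dataset, and weighting details are illustrative assumptions, not the authors' exact pipeline.

```python
# Minimal sketch: accuracy-weighted averaging of class probabilities from a
# decision tree and a naive Bayes model (stand-ins for DTree and NBayes).
# Synthetic data and weighting details are illustrative assumptions only.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=30, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = [DecisionTreeClassifier(random_state=0), GaussianNB()]
weights = []
for m in models:
    m.fit(X_tr, y_tr)                                       # train each model independently
    weights.append(accuracy_score(y_tr, m.predict(X_tr)))   # training accuracy as weight

weights = np.array(weights) / np.sum(weights)                # normalise the weights
proba = sum(w * m.predict_proba(X_te) for w, m in zip(weights, models))
y_pred = proba.argmax(axis=1)                                # highest averaged probability wins
print("ensemble accuracy:", accuracy_score(y_te, y_pred))
```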

2018, Vol 8 (2), pp. 35-48
Author(s): Jiří Rybička, Petra Čačková

One of the tools for determining the recommended order in which courses are taught is to set prerequisites, that is, the conditions that have to be fulfilled before commencing the study of a course. The recommended sequence of courses should follow the logical links between their units, since the basic aim is to provide students with a coherent system of knowledge in accordance with Comenius' principle of continuity. Declared continuity may, on the other hand, create organizational complications during the course of study, as failure to complete one course may force a whole sequence of deviations from the recommended curriculum and ultimately extend the study period. This empirical study quantitatively evaluates how the level of initial knowledge gained in previous study influences the overall results in a particular follow-up course. The data obtained in this evaluation may slightly change the approach to determining prerequisites for higher education courses.


Author(s): Amos Golan

In this chapter I provide additional rationalization for using the info-metrics framework. This time the justifications are in terms of the statistical, mathematical, and information-theoretic properties of the formalism. Specifically, in this chapter I discuss optimality, statistical and computational efficiency, sufficiency, the concentration theorem, the conditional limit theorem, and the concept of information compression. These properties, together with the other properties and measures developed in earlier chapters, provide logical, mathematical, and statistical justifications for employing the info-metrics framework.


Religions, 2021, Vol 12 (3), pp. 199
Author(s): Maria Ledstam

This article engages with how religion and economy relate to each other in faith-based businesses. It also elaborates on a recurrent idea in theological literature that reflections on different visions of time can advance theological analyses of the relationship between Christianity and capitalism. More specifically, this article brings results from an ethnographic study of two faith-based businesses into conversation with the ethicist Luke Bretherton’s presentation of different understandings of the relationship between Christianity and capitalism. Using Theodore Schatzki’s theory of timespace, the article examines how time and space are constituted in two small faith-based businesses that are part of the networks Business as Mission (evangelical) and Economy of Communion (Catholic), how the different timespaces affect the religious-economic configurations in the two cases, and with what moral implications. The overall findings suggest that the timespace in the Catholic business was characterized by struggle, caused by a tension between certain ideals of how religion and economy should relate to each other on the one hand and how the practice actually evolved on the other. The timespace in the evangelical business, by contrast, was characterized by confidence, since the business had a rather distinct and achievable goal for how it wanted to be different and for how religion should relate to economy. There are, however, nuances and important resemblances between the cases that cannot be explained by the businesses’ confessional and theological affiliations. Rather, there seems to be something about the phenomenon of tension-filled and confident faith-based businesses that drives the practices towards the common good. After mapping the results of the empirical study, I discuss some contributions that I argue this study brings to Bretherton’s presentation of the relationship between Christianity and capitalism.


Energies, 2021, Vol 14 (11), pp. 3322
Author(s): Sara Alonso, Jesús Lázaro, Jaime Jiménez, Unai Bidarte, Leire Muguira

Smart grid endpoints need to use two environments within a processing system (PS): one with a Linux-type operating system (OS) running on the Arm Cortex-A53 cores for management tasks, and the other with standalone execution or a real-time OS running on the Arm Cortex-R5 cores. The Xen hypervisor and the OpenAMP framework allow this, but they may introduce a delay into the system, and some messages in the smart grid require a latency lower than 3 ms. In this paper, the Linux thread latencies are characterized with the Cyclictest tool. It is shown that when the Xen hypervisor is used, this scenario is not suitable for the smart grid, as it does not meet the 3 ms timing constraint. Then, standalone execution as the real-time part is evaluated by measuring the delay in handling an interrupt created in the programmable logic (PL). The standalone application was run on the A53 and R5 cores, with the Xen hypervisor and the OpenAMP framework. All of these scenarios met the 3 ms constraint. The main contribution of the present work is the detailed characterization of each real-time execution, in order to facilitate selecting the most suitable one for each application.
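As a small illustration of the latency check described above, the sketch below pulls the worst-case ("Max") latencies out of Cyclictest's per-thread summary lines and tests them against the 3 ms bound; the output format and the example values are assumptions for illustration, not measurements from the paper.

```python
# Illustrative sketch (not from the paper): check Cyclictest worst-case latencies
# against the 3 ms smart-grid bound. Assumes Cyclictest's usual per-thread summary
# lines ("... Min: x Act: x Avg: x Max: x"), with values in microseconds.
import re

LIMIT_US = 3000  # 3 ms expressed in microseconds


def worst_case_ok(cyclictest_output: str, limit_us: int = LIMIT_US) -> bool:
    """Return True if every reported Max latency stays below the limit."""
    max_latencies = [int(v) for v in re.findall(r"Max:\s*(\d+)", cyclictest_output)]
    return bool(max_latencies) and max(max_latencies) < limit_us


sample = "T: 0 ( 1234) P:80 I:1000 C: 100000 Min: 12 Act: 45 Avg: 38 Max: 2875"
print(worst_case_ok(sample))  # True: 2875 us < 3000 us
```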


2020, Vol 2020, pp. 1-6
Author(s): Jian-ye Yuan, Xin-yuan Nan, Cheng-rong Li, Le-le Sun

Given the urgency of garbage classification, this paper designs a 23-layer convolutional neural network (CNN) model, with an emphasis on real-time garbage classification, to address the low accuracy of garbage classification and recycling and the difficulty of manual recycling. Firstly, depthwise separable convolution was used to reduce the Params of the model. Then, an attention mechanism was used to improve the accuracy of the garbage classification model. Finally, model fine-tuning was used to further improve its performance. Besides, we compared the model with classic image classification models, including AlexNet, VGG16, and ResNet18, and with lightweight classification models, including MobileNetV2 and ShuffleNetV2, and found that the proposed model, GAF_dense, has a higher accuracy rate and fewer Params and FLOPs. To further check the performance of the model, we tested it on the CIFAR-10 data set and found that the accuracy rates of GAF_dense are 0.018 and 0.03 higher than those of ResNet18 and ShuffleNetV2, respectively. On the ImageNet data set, the accuracy rates of GAF_dense are 0.225 and 0.146 higher than those of ResNet18 and ShuffleNetV2, respectively. Therefore, the garbage classification model proposed in this paper is suitable for garbage classification and other classification tasks that protect the ecological environment, and can be applied in areas such as environmental science, children’s education, and environmental protection.
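The GAF_dense architecture itself is not given in the abstract; the PyTorch sketch below only illustrates the two building blocks it names, a depthwise separable convolution and a channel-attention (squeeze-and-excitation style) block, with all channel counts and the reduction ratio chosen arbitrarily.

```python
# Generic sketch (not the authors' GAF_dense network): a depthwise separable
# convolution followed by a squeeze-and-excitation style channel-attention block.
# All channel counts and the reduction ratio are arbitrary illustrative choices.
import torch
import torch.nn as nn


class DepthwiseSeparableConv(nn.Module):
    def __init__(self, in_ch, out_ch):
        super().__init__()
        # Depthwise: one 3x3 filter per input channel; pointwise: 1x1 channel mixing.
        self.depthwise = nn.Conv2d(in_ch, in_ch, 3, padding=1, groups=in_ch, bias=False)
        self.pointwise = nn.Conv2d(in_ch, out_ch, 1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.bn(self.pointwise(self.depthwise(x))))


class ChannelAttention(nn.Module):
    def __init__(self, channels, reduction=4):
        super().__init__()
        # Squeeze: global average pooling; excite: two-layer bottleneck MLP.
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels), nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w  # reweight feature maps channel-wise


block = nn.Sequential(DepthwiseSeparableConv(3, 32), ChannelAttention(32))
print(block(torch.randn(1, 3, 224, 224)).shape)  # torch.Size([1, 32, 224, 224])
```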


Author(s): N. Bosso, A. Gugliotta, N. Zampieri

Determination of the contact forces exchanged between wheel and rail is one of the most important topics in railway dynamics. Recent studies aim, on the one hand, to improve the computational efficiency of existing contact methods and, on the other, to develop more complex and precise representations of the contact problem. This work shows some new results of the contact code developed at Politecnico di Torino, identified as RTCONTACT. This code, which is an improvement of the CONPOL algorithm, is the result of long-term activities; early versions were used in conjunction with MBS codes or in the Matlab® environment to simulate vehicle behaviour. The code has also been improved using experimental tests performed on a scaled roller-rig. More recently, the contact model was improved in order to obtain the higher computational efficiency required for use inside a real-time process. The benefit of a real-time contact algorithm is the possibility of using complex simulation models in diagnostic or control systems in order to improve their performance. This work shows several comparisons of the RTCONTACT contact code with commercial codes, standards, and benchmark results.


2006, Vol 55 (9), pp. 1229-1235
Author(s): Catharina F. M. Linssen, Jan A. Jacobs, Pieter Beckers, Kate E. Templeton, Judith Bakkers, ...

Pneumocystis jiroveci pneumonia (PCP) is an opportunistic infection affecting immunocompromised patients. While conventional diagnosis of PCP by microscopy is cumbersome, the use of PCR to diagnose PCP has great potential. Nevertheless, inter-laboratory validation and standardization of PCR assays is lacking. The aim of this study was to evaluate the inter-laboratory agreement of three independently developed real-time PCR assays for the detection of P. jiroveci in bronchoalveolar lavage fluid samples. Therefore, 124 samples were collected in three tertiary care laboratories (Leiden University Medical Center, Maastricht Infection Center and Radboud University Nijmegen Medical Centre) and were tested by both microscopy and real-time PCR. Of 41 samples positive for P. jiroveci by microscopy, 40 were positive in all three PCR assays. The remaining sample was positive in a single assay only. Out of 83 microscopy-negative samples, 69 were negative in all three PCR assays. The other 14 samples were found positive, either in all three assays (n=5), in two (n=2) or in one of the assays (n=7). The data demonstrate high inter-laboratory agreement among real-time PCR assays for the detection of P. jiroveci.
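As a quick check of the arithmetic reported in this abstract, the sketch below tallies the stated counts; the derived agreement percentage is illustrative arithmetic, not a figure reported by the authors.

```python
# Tally based solely on the counts stated in the abstract; the derived percentage
# is illustrative arithmetic, not a figure reported by the authors.
pos_all_three_pcr_pos = 40   # microscopy-positive, positive in all three PCR assays
pos_one_assay_pos = 1        # microscopy-positive, positive in a single assay only
neg_all_three_pcr_neg = 69   # microscopy-negative, negative in all three PCR assays
neg_all_three_pcr_pos = 5    # microscopy-negative, positive in all three assays
neg_two_assays_pos = 2       # microscopy-negative, positive in two assays
neg_one_assay_pos = 7        # microscopy-negative, positive in one assay

assert pos_all_three_pcr_pos + pos_one_assay_pos == 41
assert (neg_all_three_pcr_neg + neg_all_three_pcr_pos
        + neg_two_assays_pos + neg_one_assay_pos) == 83

total = 41 + 83                                                     # 124 samples
all_assays_agree = pos_all_three_pcr_pos + neg_all_three_pcr_neg + neg_all_three_pcr_pos
print(f"samples on which the three PCR assays agree: {all_assays_agree}/{total} "
      f"({all_assays_agree / total:.1%})")
```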


2017
Author(s): Andrew B. Hall, Connor Huff, Shiro Kuriwaki

How did personal wealth and slaveownership affect the likelihood southerners fought for the Confederate Army in the American Civil War? On the one hand, wealthy southerners had incentives to free-ride on poorer southerners and avoid fighting; on the other hand, wealthy southerners were disproportionately slaveowners, and thus had more at stake in the outcome of the war. We assemble a dataset on roughly 3.9 million free citizens in the Confederacy, and show that slaveowners were more likely to fight than non-slaveowners. We then exploit a randomized land lottery held in 1832 in Georgia. Households of lottery winners owned more slaves in 1850 and were more likely to have sons who fought in the Confederate Army. We conclude that slaveownership, in contrast to some other kinds of wealth, compelled southerners to fight despite free-rider incentives because it raised their stakes in the war’s outcome.


2019, Vol 3
Author(s): Charlotte Olivia Brand, James Patrick Ounsley, Daniel Job Van der Post, Thomas Joshua Henry Morgan

This paper introduces a statistical technique known as “posterior passing” in which the results of past studies can be used to inform the analyses carried out by subsequent studies. We first describe the technique in detail and show how it can be implemented by individual researchers on an experiment-by-experiment basis. We then use a simulation to explore its success in identifying true parameter values compared to current statistical norms (ANOVAs and GLMMs). We find that posterior passing allows the true effect in the population to be found with greater accuracy and consistency than the other analysis types considered. Furthermore, posterior passing performs almost identically to an analysis in which all data from all simulated studies are combined and analysed as one dataset. On this basis, we suggest that posterior passing is a viable means of implementing cumulative science. Moreover, because it prevents the accumulation of large bodies of conflicting literature, it alleviates the need for traditional meta-analyses. Instead, posterior passing cumulatively and collaboratively provides clarity in real time as each new study is produced, and it is thus a strong candidate for a new, cumulative approach to scientific analyses and publishing.
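The paper's own analyses (benchmarked against ANOVAs and GLMMs) are not reproduced here; the beta-binomial sketch below only illustrates the core idea that each study's posterior becomes the next study's prior, with all data simulated for the purpose of illustration.

```python
# Minimal sketch of posterior passing with a beta-binomial model: the posterior
# from each simulated study becomes the prior for the next one. Purely
# illustrative; not the paper's actual simulation or models.
import numpy as np

rng = np.random.default_rng(0)
true_rate = 0.65          # assumed "true effect" in the simulated population
alpha, beta = 1.0, 1.0    # flat Beta(1, 1) prior for the very first study

for study in range(1, 11):                          # ten sequential studies
    successes = rng.binomial(n=50, p=true_rate)     # each study runs 50 trials
    alpha += successes                              # conjugate update: this study's
    beta += 50 - successes                          # posterior is the next prior
    estimate = alpha / (alpha + beta)               # posterior mean after this study
    print(f"study {study:2d}: posterior mean = {estimate:.3f}")
```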

