Data mining in web personalization using the blended deep learning model

Author(s):  
Qusay Abdullah Abed ◽  
Osamah Mohammed Fadhil ◽  
Wathiq Laftah Al-Yaseen

In general, multidimensional data (from mobile applications, for example) contain a large amount of unnecessary information. Web application users find it difficult to get the information they need quickly and effectively due to the sheer volume of data (big data produced every second); web personalization is one of the effective solutions to this problem. In this paper, we study data mining in web personalization using a blended deep learning model and explore how this model helps to analyze and estimate huge volumes of operations. Providing personalized recommendations that improve reliability depends on the web application exploiting the useful information it holds. The research centers on the training and testing of large data sets with a blended deep learning model based on a back-propagation neural network. The HADOOP framework was used to perform a number of experiments in different environments, with a learning rate between -1 and +1. Several techniques were used to evaluate the model's parameters; for example, true positive cases (predicted positives that fall into the actual positive class) were counted to evaluate the proposed model.
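As a concrete illustration of the back-propagation component, here is a minimal sketch, not the authors' implementation: a tiny two-layer network updated by back-propagation, reading the abstract's [-1, +1] range as the weight-initialization interval (an assumption, since learning rates are normally positive). The layer sizes and sigmoid activation are likewise illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy dimensions: 8 input features, 4 hidden units, 1 output.
# Weights are drawn uniformly from [-1, +1] (our reading of the abstract).
W1 = rng.uniform(-1.0, 1.0, size=(8, 4))
W2 = rng.uniform(-1.0, 1.0, size=(4, 1))

def train_step(X, y, lr=0.1):
    """One back-propagation update on a batch (X: n x 8, y: n x 1)."""
    global W1, W2
    h = sigmoid(X @ W1)                       # forward pass, hidden layer
    out = sigmoid(h @ W2)                     # forward pass, output layer
    err = out - y                             # output error
    dW2 = h.T @ (err * out * (1 - out))       # gradient w.r.t. W2
    dh = (err * out * (1 - out)) @ W2.T       # error propagated back to hidden layer
    dW1 = X.T @ (dh * h * (1 - h))            # gradient w.r.t. W1
    W1 -= lr * dW1                            # gradient-descent updates
    W2 -= lr * dW2
    return float(np.mean(err ** 2))

X = rng.random((16, 8))
y = rng.integers(0, 2, (16, 1)).astype(float)
print(train_step(X, y))                       # mean squared error for the batch
```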

Cancers ◽  
2021 ◽  
Vol 14 (1) ◽  
pp. 12
Author(s):  
Jose M. Castillo T. ◽  
Muhammad Arif ◽  
Martijn P. A. Starmans ◽  
Wiro J. Niessen ◽  
Chris H. Bangma ◽  
...  

The computer-aided analysis of prostate multiparametric MRI (mpMRI) could improve significant-prostate-cancer (PCa) detection. Various deep-learning- and radiomics-based methods for significant-PCa segmentation or classification have been reported in the literature. To assess the generalizability of these methods' performance, using various external data sets is crucial. While deep-learning and radiomics approaches have been compared on the same single-center data set, a comparison of the two approaches on data sets from different centers and different scanners is lacking. The goal of this study was to compare the performance of a deep-learning model with that of a radiomics model for significant-PCa diagnosis across various patient cohorts. We included the data from two consecutive patient cohorts from our own center (n = 371 patients) and two external sets, of which one was a publicly available patient cohort (n = 195 patients) and the other contained data from patients from two hospitals (n = 79 patients). For all patients, the mpMRI data, radiologist tumor delineations and pathology reports were collected. During training, one of our patient cohorts (n = 271 patients) was used for both the deep-learning- and radiomics-model development, and the three remaining cohorts (n = 374 patients) were kept as unseen test sets. The performances of the models were assessed in terms of their area under the receiver-operating-characteristic curve (AUC). Whereas internal cross-validation showed a higher AUC for the deep-learning approach, the radiomics model obtained AUCs of 0.88, 0.91 and 0.65 on the independent test sets, compared to AUCs of 0.70, 0.73 and 0.44 for the deep-learning model. Our radiomics model, which was based on delineated regions, thus proved the more accurate tool for significant-PCa classification in the three unseen test sets when compared to a fully automated deep-learning model.
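The evaluation protocol above reduces to scoring two fitted models on each unseen cohort by AUC. The following is a minimal sketch using scikit-learn; the model objects and cohort arrays are placeholders, not the study's pipelines.

```python
from sklearn.metrics import roc_auc_score

def evaluate_on_cohorts(model, cohorts):
    """Return one AUC per unseen test cohort.

    `cohorts` is a list of (features, labels) pairs; `model` is any
    fitted estimator exposing predict_proba.
    """
    return [
        roc_auc_score(y, model.predict_proba(X)[:, 1])
        for X, y in cohorts
    ]

# Hypothetical usage, mirroring the study's comparison on its three
# unseen external cohorts (names are placeholders):
# radiomics_aucs = evaluate_on_cohorts(radiomics_model, test_cohorts)
# deep_aucs      = evaluate_on_cohorts(deep_model, test_cohorts)
```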


2017 ◽  
Vol 7 (1.1) ◽  
pp. 286
Author(s):  
B. Sekhar Babu ◽  
P. Lakshmi Prasanna ◽  
P. Vidyullatha

In current days, the World Wide Web has grown into a familiar medium for investigating new information, business trends, trading strategies and so on. Several organizations and companies also use the web to present their products or services across the world. E-commerce is a kind of business or commercial transaction that involves the transfer of information across the web or internet. In this setting, a huge amount of data is generated and dumped into web services. This data overload makes it difficult to determine accurate and valuable information, so web data mining is used as a tool to discover and mine knowledge from the web. E-commerce organizations can apply web data mining technology to offer personalized E-commerce solutions and better meet the desires of customers. A data mining algorithm such as ontology-based association rule mining using the Apriori algorithm extracts various useful information from large data sets. We implement the above data mining technique in Java; the data sets are generated dynamically while transactions are processed, and various patterns are extracted.
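For reference, the frequent-itemset step at the heart of Apriori can be sketched as follows. This is a minimal pure-Python illustration of the standard algorithm, not the paper's Java implementation; the transaction data is invented.

```python
from itertools import combinations

def apriori(transactions, min_support=0.5):
    """Return frequent itemsets with support >= min_support."""
    n = len(transactions)
    # Start from all single-item candidates.
    current = {frozenset([i]) for t in transactions for i in t}
    frequent = {}
    while current:
        # Count the support of each candidate itemset.
        counts = {c: sum(1 for t in transactions if c <= t) for c in current}
        kept = {c for c, k in counts.items() if k / n >= min_support}
        frequent.update({c: counts[c] / n for c in kept})
        # Join step: build (k+1)-itemsets from surviving k-itemsets.
        current = {a | b for a, b in combinations(kept, 2)
                   if len(a | b) == len(a) + 1}
    return frequent

baskets = [frozenset(t) for t in [{"milk", "bread"}, {"milk", "eggs"},
                                  {"milk", "bread", "eggs"}, {"bread"}]]
print(apriori(baskets, min_support=0.5))
```

Association rules are then read off the frequent itemsets by comparing the support of an itemset with the support of its antecedent.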


2021 ◽  
Vol 7 ◽  
pp. e551
Author(s):  
Nihad Karim Chowdhury ◽  
Muhammad Ashad Kabir ◽  
Md. Muhtadir Rahman ◽  
Noortaz Rezoana

The goal of this research is to develop and implement a highly effective deep learning model for detecting COVID-19. To achieve this goal, in this paper we propose an ensemble of Convolutional Neural Networks (CNNs) based on EfficientNet, named ECOVNet, to detect COVID-19 from chest X-rays. To make the proposed model more robust, we have used one of the largest open-access chest X-ray data sets, named COVIDx, containing three classes: COVID-19, normal, and pneumonia. For feature extraction, we apply an effective CNN structure, namely EfficientNet, with ImageNet pre-trained weights. The generated features are passed into custom fine-tuned top layers, followed by a set of model snapshots. The predictions of the model snapshots (which are created during a single training run) are consolidated through two ensemble strategies, i.e., hard ensemble and soft ensemble, to enhance classification performance. In addition, a visualization technique is incorporated to highlight the areas that distinguish the classes, thereby enhancing the understanding of the primal components related to COVID-19. The results of our empirical evaluations show that the proposed ECOVNet model outperforms state-of-the-art approaches and significantly improves detection performance, with 100% recall for COVID-19 and an overall accuracy of 96.07%. We believe that ECOVNet can enhance the detection of COVID-19 disease and thus underpin a fully automated and efficacious COVID-19 detection system.
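The two snapshot-ensembling strategies named above are standard constructions: soft ensembling averages the snapshots' class probabilities, while hard ensembling takes a majority vote over each snapshot's predicted labels. A minimal numpy sketch, with a hypothetical `snapshot_probs` array standing in for the snapshots' outputs:

```python
import numpy as np

def soft_ensemble(snapshot_probs):
    """snapshot_probs: (n_snapshots, n_samples, n_classes) -> predicted labels."""
    return np.mean(snapshot_probs, axis=0).argmax(axis=1)

def hard_ensemble(snapshot_probs):
    """Majority vote over each snapshot's argmax predictions."""
    votes = snapshot_probs.argmax(axis=2)            # (n_snapshots, n_samples)
    n_classes = snapshot_probs.shape[2]
    counts = np.apply_along_axis(
        lambda v: np.bincount(v, minlength=n_classes), 0, votes)
    return counts.argmax(axis=0)                     # (n_samples,)

# Toy check: 3 snapshots, 2 samples, 3 classes.
probs = np.random.default_rng(0).dirichlet(np.ones(3), size=(3, 2))
print(soft_ensemble(probs), hard_ensemble(probs))
```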


Electronics ◽  
2021 ◽  
Vol 10 (7) ◽  
pp. 850
Author(s):  
Pablo Zinemanas ◽  
Martín Rocamora ◽  
Marius Miron ◽  
Frederic Font ◽  
Xavier Serra

Deep learning models have improved cutting-edge technologies in many research areas, but their black-box structure makes it difficult to understand their inner workings and the rationale behind their predictions. This may lead to unintended effects, such as susceptibility to adversarial attacks or the reinforcement of biases. Despite the increasing interest in developing deep learning models that explain their decisions, research in the audio domain is still lacking. To reduce this gap, we propose a novel interpretable deep learning model for automatic sound classification, which explains its predictions based on the similarity of the input to a set of learned prototypes in a latent space. We leverage domain knowledge by designing a frequency-dependent similarity measure and by considering different time-frequency resolutions in the feature space. The proposed model achieves results comparable to those of state-of-the-art methods in three different sound classification tasks involving speech, music, and environmental audio. In addition, we present two automatic methods for pruning the proposed model that exploit its interpretability. Our system is open source and is accompanied by a web application for the manual editing of the model, which allows for a human-in-the-loop debugging approach.
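The core mechanism, classifying by similarity to learned prototypes in a latent space, can be sketched as below. The Gaussian similarity and all shapes are illustrative assumptions; the paper's frequency-dependent measure is more elaborate.

```python
import numpy as np

def prototype_logits(z, prototypes, W):
    """z: (d,) latent embedding; prototypes: (p, d); W: (n_classes, p).

    Each class score is a weighted sum of prototype similarities, so a
    prediction can be explained by which prototypes the input resembles.
    """
    d2 = ((prototypes - z) ** 2).sum(axis=1)   # squared distance to each prototype
    sim = np.exp(-d2)                          # similarity in (0, 1]
    return W @ sim                             # class scores

# Hypothetical usage with invented sizes: 16-d latent space, 5 prototypes, 3 classes.
rng = np.random.default_rng(0)
z, protos, W = rng.normal(size=16), rng.normal(size=(5, 16)), rng.normal(size=(3, 5))
print(prototype_logits(z, protos, W))
```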


2019 ◽  
Vol 8 (4) ◽  
pp. 8123-8127

Health is one of the rising subjects used for assessing the condition of patients who suffer from specific sicknesses or infections. Health seekers have numerous online and offline ways to obtain the information they request. However, community-based health services have some inherent limitations; for example, they are time-consuming for health seekers, and automating them would also alleviate the specialists' workload. Automatic disease inference is therefore critical for overcoming the difficulties faced by online health seekers. This work aims to build a disease-inference scheme that can automatically infer the potential diseases behind the queries posed in community-based health services. We propose a novel deep learning scheme to infer the possible disease given a health seeker's question. Our sparsely connected deep learning model contains five layers, including the input and output layers. The nodes in the input layer represent raw features, and the nodes in the output layer denote the inference results used to approximate the true disease types. The model first analyzes the information needs of health seekers with respect to a query and then selects those queries that ask about the potential diseases of the presented symptoms for further analysis. The user then expresses their needs as a query. Next, the query is preprocessed to locate the medical attributes, and the preprocessed attributes are used to identify the corresponding disease concept. Extensive experiments on a real-world dataset labeled by online specialists show the significant performance gains of our scheme.
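The sparsely connected five-layer network described above can be sketched with masked weight matrices: a binary mask zeroes most connections so that each hidden node links to only a few inputs. All sizes and the sparsity level below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
sizes = [200, 64, 32, 16, 10]      # input, three hidden layers, output (disease types)
weights, masks = [], []
for fan_in, fan_out in zip(sizes, sizes[1:]):
    weights.append(rng.normal(0, 0.1, (fan_in, fan_out)))
    masks.append(rng.random((fan_in, fan_out)) < 0.1)   # keep ~10% of the edges

def forward(x):
    """Forward pass; the masked weights realize the sparse connectivity."""
    for W, M in zip(weights[:-1], masks[:-1]):
        x = np.maximum(x @ (W * M), 0.0)                # ReLU hidden layers
    return x @ (weights[-1] * masks[-1])                # one score per disease type

print(forward(rng.normal(size=200)))
```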


2018 ◽  
Author(s):  
Yu Li ◽  
Zhongxiao Li ◽  
Lizhong Ding ◽  
Yuhui Hu ◽  
Wei Chen ◽  
...  

Motivation: In most biological data sets, the amount of data is regularly growing and the number of classes is continuously increasing. To deal with the new data from the new classes, one approach is to train a classification model, e.g., a deep learning model, from scratch based on both old and new data. This approach is highly computationally costly, and the extracted features are likely very different from the ones extracted by the model trained on the old data alone, which leads to poor model robustness. Another approach is to fine-tune the model trained on the old data using the new data. However, this approach often cannot learn new knowledge without forgetting the previously learned knowledge, which is known as the catastrophic forgetting problem. To our knowledge, this problem has not been studied in the field of bioinformatics despite its existence in many bioinformatic problems.

Results: Here we propose a novel method, SupportNet, to solve the catastrophic forgetting problem efficiently and effectively. SupportNet combines the strengths of deep learning and the support vector machine (SVM): the SVM is used to identify the support data from the old data, which are fed to the deep learning model together with the new data for further training, so that the model can review the essential information of the old data when learning the new information. Two powerful consolidation regularizers are applied to ensure the robustness of the learned model. Comprehensive experiments on various tasks, including enzyme function prediction, subcellular structure classification and breast tumor classification, show that SupportNet drastically outperforms the state-of-the-art incremental learning methods and reaches performance similar to that of the deep learning model trained from scratch on both old and new data.

Availability: Our program is accessible at: https://github.com/lykaust15/SupportNet.
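The support-data selection step stated in the abstract, fitting an SVM on the old classes and keeping only its support vectors as a compact replay set, can be sketched as follows. This is a minimal scikit-learn illustration on synthetic data, not the SupportNet code itself.

```python
from sklearn.datasets import make_classification
from sklearn.svm import SVC

def select_support_data(old_features, old_labels):
    """Return the old examples an SVM deems essential (its support vectors)."""
    svm = SVC(kernel="linear").fit(old_features, old_labels)
    idx = svm.support_                       # indices of the support vectors
    return old_features[idx], old_labels[idx]

# Tiny demonstration on synthetic "old class" data; in SupportNet the
# inputs would be deep features, and the selected examples are mixed
# with new-class data for continued training.
X_old, y_old = make_classification(n_samples=200, n_features=10, random_state=0)
sup_X, sup_y = select_support_data(X_old, y_old)
print(f"kept {len(sup_X)} of {len(X_old)} old examples as support data")
```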


2021 ◽  
Author(s):  
Mohammed Y. Alzahrani ◽  
Alwi M Bamhdi

In recent years, the use of the internet of things (IoT) has increased dramatically, and cybersecurity concerns have grown in tandem. Cybersecurity has become a major challenge for institutions and companies of all sizes, with threats growing in number and developing at a rapid pace. Artificial intelligence (AI) in cybersecurity can to a large extent help meet this challenge, since it provides a powerful framework that allows organisations to stay one step ahead of sophisticated cyber threats. AI provides real-time feedback, helping daily alerts to be investigated and analysed, effective decisions to be made and quick responses to be mounted. AI-based capabilities make attack detection, security and mitigation more accurate for intelligence gathering and analysis, and they enable proactive protective countermeasures to be taken to overwhelm attacks. In this study, we propose a robust system specifically designed to help detect botnet attacks on IoT devices. This was done by innovatively combining a convolutional neural network model with a long short-term memory mechanism to detect two common and serious IoT attacks (BASHLITE and Mirai) on four types of security camera. The data sets, which contained normal and malicious network packets, were collected from real-time lab-connected camera devices in IoT environments. The results of the experiment showed that the proposed system achieved optimal performance according to the evaluation metrics. For detecting the botnet on the Provision PT-737E camera, the proposed system gave the following weighted average results: precision 88%, recall 87% and F1 score 83%. For classifying botnet attacks and normal packets on the Provision PT-838 camera, the results were precision 94%, recall 89% and F1 score 85%. The intelligent security system using the advanced deep learning model was thus successful at detecting botnet attacks that infected camera devices connected to IoT applications.
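A minimal PyTorch sketch of the CNN + LSTM combination described above: 1-D convolutions summarize each window of packet features and an LSTM models the sequence before a binary (normal/botnet) head. All sizes are illustrative assumptions, not the authors' settings.

```python
import torch
import torch.nn as nn

class CNNLSTMDetector(nn.Module):
    def __init__(self, n_features=115, hidden=64):
        super().__init__()
        # 1-D convolution over time, treating packet features as channels.
        self.conv = nn.Sequential(
            nn.Conv1d(n_features, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        self.lstm = nn.LSTM(input_size=32, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)     # normal vs. botnet

    def forward(self, x):                    # x: (batch, time, n_features)
        z = self.conv(x.transpose(1, 2))     # -> (batch, 32, time/2)
        out, _ = self.lstm(z.transpose(1, 2))
        return self.head(out[:, -1])         # logits from the last time step

logits = CNNLSTMDetector()(torch.randn(4, 20, 115))   # (4, 2)
```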


2019 ◽  
Vol 38 (12) ◽  
pp. 934-942 ◽  
Author(s):  
Xing Zhao ◽  
Ping Lu ◽  
Yanyan Zhang ◽  
Jianxiong Chen ◽  
Xiaoyang Li

Noise attenuation for ordinary images using machine learning technology has achieved great success in the computer vision field. However, directly applying these models to seismic data would not be effective, since the evaluation criteria of the geophysical domain require a high-quality visualized image and the preservation of the original seismic signals within the contaminated wavelets. This paper introduces an approach, equipped with a specially designed deep learning model, that can effectively attenuate swell noise of different intensities and characteristics from shot gathers, with a relatively simple workflow applicable to marine seismic data sets. The proposed deep learning model brings three significant benefits. First, it does not require a pure swell-noise model: a contaminated swell-noise model derived from field data sets (which may contain other noise or primary signals) can be used for training. Second, inspired by conventional algorithms for coherent noise attenuation, the neural network is designed to learn and detect the swell noise itself rather than to infer the attenuated seismic data. Third, several comparisons (signal-to-noise ratio, mean squared error, and the intensities of residual swell noise) indicate that the deep learning approach can remove swell noise without harming the primary signals. The proposed deep-learning-based approach can be considered an alternative that combines and takes advantage of both conventional and data-driven methods to better serve swell-noise attenuation. The comparative results also indicate that the deep learning method has strong potential for other coherent-noise-attenuation tasks in seismic data.
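The second design choice, predicting the noise rather than the clean data, implies that denoising reduces to a subtraction, which is our reading of the abstract rather than the authors' stated pipeline. A minimal sketch, where `model` stands in for any image-to-image network and the training-pair construction follows the abstract's idea of reusing a (possibly contaminated) noise model at varying intensities:

```python
import numpy as np

def denoise(model, shot_gather):
    """Attenuate swell noise by subtracting the network's noise estimate."""
    predicted_noise = model(shot_gather)     # the network infers the noise itself
    return shot_gather - predicted_noise     # primaries are left in the residual

def training_pair(clean_gather, noise_patch, rng=np.random.default_rng()):
    """Synthesize (input, target): input = gather + scaled noise, target = noise."""
    scale = rng.uniform(0.5, 2.0)            # vary the swell-noise intensity
    noisy = clean_gather + scale * noise_patch
    return noisy, scale * noise_patch

# Hypothetical usage once a network has been trained:
# cleaned = denoise(trained_model, field_shot_gather)
```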

