Post Thoracic Surgery Life Expectancy Prediction Using Machine Learning

Author(s):  
Akshaya Ravichandran ◽  
Krutika Mahulikar ◽  
Shreya Agarwal ◽  
Suresh Sankaranarayanan

The lung cancer survival rate post-surgery is very limited, irrespective of whether the cancer is small cell or non-small cell. A great deal of work has been carried out employing machine learning to predict life expectancy after thoracic surgery for patients with lung cancer. Many machine learning models, such as the multi-layer perceptron (MLP), SVM, naïve Bayes, decision tree, random forest, and logistic regression, have been applied to post-thoracic-surgery life expectancy prediction based on datasets from the UCI repository. Work has also been carried out on attribute ranking and selection to improve the prediction accuracy of machine learning algorithms. Accordingly, we have developed a deep neural network based approach, the most advanced form of neural network, for predicting post-thoracic-surgery life expectancy. It is based on a dataset obtained from the Wroclaw Thoracic Surgery Centre in the machine learning repository, which contains 470 instances. On comparing accuracies, the results indicate that the deep neural network can be used efficiently to predict life expectancy.
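The workflow the abstract describes can be sketched as below. This is a minimal illustration, not the authors' implementation: the synthetic features stand in for the 470-instance Wroclaw Thoracic Surgery dataset, and the network shape is an assumption.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# Placeholder data standing in for the 470-instance Thoracic Surgery dataset;
# X holds preprocessed clinical features, y the survival label.
rng = np.random.default_rng(0)
X = rng.normal(size=(470, 16))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=470) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y)
scaler = StandardScaler().fit(X_train)

# A small multi-layer network; real work would tune depth, width, and regularization.
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0)
clf.fit(scaler.transform(X_train), y_train)
acc = clf.score(scaler.transform(X_test), y_test)
print(f"held-out accuracy: {acc:.3f}")
```

The same train/test split can be reused to compare the DNN against the MLP, SVM, and tree-based baselines the abstract mentions.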

Author(s):  
Syed Khurram Jah Rizvi ◽  
Warda Aslam ◽  
Muhammad Shahzad ◽  
Shahzad Saleem ◽  
Muhammad Moazam Fraz

Enterprises are striving to remain protected against malware-based cyber-attacks on their infrastructure, facilities, networks, and systems. Static analysis is an effective approach to detect malware, i.e., malicious Portable Executables (PE). It performs an in-depth analysis of PE files without executing them, which is highly useful for minimizing the risk of a malicious PE contaminating the system. Yet instant detection using static analysis has become very difficult due to the exponential rise in the volume and variety of malware. The compelling need for early-stage detection of malware-based attacks significantly motivates research into automated malware detection. Recent machine learning aided malware detection approaches using static analysis are mostly supervised. Supervised malware detection using static analysis requires manual labelling and human feedback; it is therefore less effective in a rapidly evolving and dynamic threat space. To this end, we propose a progressive deep unsupervised framework with a feature attention block for static analysis-based malware detection (PROUD-MAL). The framework is based on cascaded blocks of unsupervised clustering and a feature attention-based deep neural network. The proposed deep neural network, embedded with the feature attention block, is trained on pseudo labels. To evaluate the proposed unsupervised framework, we collected a real-time malware dataset by deploying low- and high-interaction honeypots on an enterprise organizational network. Moreover, an endpoint security solution was also deployed on the network to collect malware samples. After post-processing and cleaning, the novel dataset consists of 15,457 PE samples comprising 8775 malicious and 6681 benign ones.
The proposed PROUD-MAL framework achieved an accuracy of more than 98.09%, with better quantitative performance on standard evaluation parameters for the collected dataset, and outperformed other conventional machine learning algorithms. The implementation and dataset are available at https://bit.ly/35Sne3a.
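The core idea of the cascade, clustering to produce pseudo labels and then supervised training on them, can be sketched as follows. This is a simplified illustration under stated assumptions: the features are synthetic stand-ins for static PE features, and the paper's feature attention block is replaced by a plain MLP for brevity.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for static PE features (a real pipeline would extract
# header fields, section statistics, import tables, etc.).
rng = np.random.default_rng(1)
benign = rng.normal(loc=0.0, size=(300, 10))
malicious = rng.normal(loc=2.0, size=(300, 10))
X = StandardScaler().fit_transform(np.vstack([benign, malicious]))

# Step 1: unsupervised clustering produces pseudo labels (no manual labelling).
pseudo = KMeans(n_clusters=2, n_init=10, random_state=1).fit_predict(X)

# Step 2: a neural classifier is trained on those pseudo labels.
# (The paper embeds a feature attention block; a plain MLP is used here.)
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=1)
clf.fit(X, pseudo)
agreement = clf.score(X, pseudo)
print(f"training agreement with pseudo labels: {agreement:.3f}")
```

The practical point is that the downstream network never sees human-assigned labels, so the pipeline can keep up with a fast-moving threat space.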


2021 ◽  
Vol 30 (04) ◽  
pp. 2150020
Author(s):  
Luke Holbrook ◽  
Miltiadis Alamaniotis

With the increase of cyber-attacks on millions of Internet of Things (IoT) devices, the poor network security measures on those devices are the main source of the problem. This article studies several machine learning algorithms for their effectiveness in detecting malware on consumer IoT devices. In particular, the Support Vector Machine (SVM), Random Forest, and Deep Neural Network (DNN) algorithms are benchmarked with a set of test data and compared as tools for safeguarding IoT deployments. Test results on a set of four IoT devices showed that all three tested algorithms detect network anomalies with high accuracy. However, the deep neural network provides the highest coefficient of determination R², and hence it is identified as the most precise of the tested algorithms for IoT device security on the datasets we examined.
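A three-way benchmark scored by the coefficient of determination, as the abstract describes, might look like the sketch below. The traffic-style features here are synthetic placeholders, not the authors' IoT captures, and the model hyperparameters are assumptions.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Synthetic traffic-style features standing in for the IoT device captures.
rng = np.random.default_rng(2)
X = rng.normal(size=(600, 8))
y = X @ rng.normal(size=8) + 0.1 * rng.normal(size=600)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=2)

models = {
    "SVM": SVR(),
    "Random Forest": RandomForestRegressor(n_estimators=100, random_state=2),
    "DNN": MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=2),
}
# Score every model on the held-out split with R^2.
scores = {name: r2_score(y_te, m.fit(X_tr, y_tr).predict(X_te))
          for name, m in models.items()}
for name, r2 in scores.items():
    print(f"{name}: R^2 = {r2:.3f}")
```

Ranking the models by the resulting R² values mirrors how the article identifies the most precise detector.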


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Yoo Jin Choo ◽  
Jeoung Kun Kim ◽  
Jang Hwan Kim ◽  
Min Cheol Chang ◽  
Donghwi Park

We investigated the potential of machine learning techniques, at an early stage after stroke, to predict the need for ankle–foot orthosis (AFO) in stroke patients. We retrospectively recruited 474 consecutive stroke patients. The need for AFO during ambulation (output variable) was classified according to the Medical Research Council (MRC) score for the ankle dorsiflexor of the affected limb. Patients with an MRC score of < 3 for the ankle dorsiflexor of the affected side were considered to require AFO, while those with scores ≥ 3 were considered not to require AFO. The following demographic and clinical data collected when patients were transferred to the rehabilitation unit (16.20 ± 6.02 days) and 6 months after stroke onset were used as input data: age, sex, type of stroke (ischemic/hemorrhagic), motor evoked potential data on the tibialis anterior muscle of the affected side, modified Brunnstrom classification, functional ambulation category, MRC score for muscle strength for shoulder abduction, elbow flexion, finger flexion, finger extension, hip flexion, knee extension, and ankle dorsiflexion of the affected side. For the deep neural network model, the area under the curve (AUC) was 0.887. For the random forest and logistic regression models, the AUC was 0.855 and 0.845, respectively. Our findings demonstrate that machine learning algorithms, particularly the deep neural network, are useful for predicting the need for AFO in stroke patients during the recovery phase.
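An AUC comparison of the three model families named in the abstract can be sketched as below. The clinical inputs are replaced with synthetic placeholders (the real study used age, sex, MRC scores, etc.), so the resulting AUC values are illustrative only.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for the clinical inputs; y stands in for the
# binary "needs AFO" label derived from the MRC dorsiflexor score.
rng = np.random.default_rng(3)
X = rng.normal(size=(474, 12))
y = (X[:, 0] + X[:, 1] + rng.normal(scale=0.8, size=474) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=3, stratify=y)
models = {
    "deep neural network": MLPClassifier((32, 16), max_iter=1000, random_state=3),
    "random forest": RandomForestClassifier(random_state=3),
    "logistic regression": LogisticRegression(max_iter=1000),
}
aucs = {}
for name, m in models.items():
    m.fit(X_tr, y_tr)
    # AUC is computed from predicted probabilities for the positive class.
    aucs[name] = roc_auc_score(y_te, m.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {aucs[name]:.3f}")
```

AUC is the appropriate metric here because the clinical decision threshold for prescribing an AFO may differ between units.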


Author(s):  
Akshay Rajendra Naik ◽  
A. V. Deorankar ◽  
P. B. Ambhore

Rainfall prediction is useful for decision-making in many fields, such as outdoor gaming, farming, traveling, and factory operations, among other activities. We studied various methods for rainfall prediction, including machine learning and neural networks. Various machine learning algorithms have been used in existing methods, such as naïve Bayes, support vector machines, random forests, decision trees, and ensemble learning methods. We used a deep neural network for rainfall prediction, with the Adam optimizer used to set the model parameters; as a result, our method gives better results compared to other machine learning methods.
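A deep network trained with the Adam optimizer, as the abstract describes, can be sketched with scikit-learn (where `solver="adam"` selects Adam for the weight updates). The weather-style features are synthetic stand-ins, not the authors' dataset.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# Synthetic weather-style features (e.g., humidity, pressure, temperature)
# standing in for a real rainfall dataset; rainfall is non-negative.
rng = np.random.default_rng(4)
X = rng.normal(size=(800, 6))
rain = np.maximum(0, 2 * X[:, 0] + X[:, 1] + rng.normal(scale=0.3, size=800))

X_tr, X_te, y_tr, y_te = train_test_split(X, rain, test_size=0.2, random_state=4)
scaler = StandardScaler().fit(X_tr)

# solver="adam" selects the Adam optimizer for training, as in the paper.
model = MLPRegressor(hidden_layer_sizes=(64, 32), solver="adam",
                     max_iter=2000, random_state=4)
model.fit(scaler.transform(X_tr), y_tr)
r2 = model.score(scaler.transform(X_te), y_te)
print(f"test R^2: {r2:.3f}")
```

Adam adapts the learning rate per parameter from running moment estimates, which is why it is a common default for training deep networks on tabular data like this.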


Breast cancer is the most life-threatening disease among women. Early prediction improves patients' chances of survival. In this work, deep neural network classifiers with different numbers of hidden layers and nodes are first used to explore anthropometric information and blood analysis parameters and to predict the disease. Machine learning algorithms such as SVM and decision tree are then trained with the same data. Finally, the performance of each classifier is evaluated. The pre-processed data of admitted patients with a breast cancer diagnosis are used to train and test the classifiers. This article sheds light on performance estimation based on correct and erroneous data classification.
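The DNN-versus-SVM-versus-decision-tree comparison can be sketched as below. As a stand-in for the anthropometric and blood-marker data described above (which is not bundled here), the sketch uses the Wisconsin breast cancer dataset shipped with scikit-learn; the layer sizes are assumptions.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Wisconsin dataset as a stand-in for the anthropometric/blood-marker data.
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=5, stratify=y)
scaler = StandardScaler().fit(X_tr)
X_tr, X_te = scaler.transform(X_tr), scaler.transform(X_te)

classifiers = {
    "DNN (2 hidden layers)": MLPClassifier((32, 16), max_iter=1000, random_state=5),
    "SVM": SVC(random_state=5),
    "Decision tree": DecisionTreeClassifier(random_state=5),
}
results = {name: c.fit(X_tr, y_tr).score(X_te, y_te)
           for name, c in classifiers.items()}
for name, acc in results.items():
    print(f"{name}: accuracy = {acc:.3f}")
```

Varying `hidden_layer_sizes` in the MLP reproduces the "different hidden layers with different nodes" exploration the abstract describes.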

