A Survey on Epilepsy Seizure Detection Using Machine Learning Technique

Author(s):  
Saranya N ◽  
Karthika Renuka D

Epilepsy is one of the most prevalent neurological disorders. It is a chronic condition characterized by involuntary, unpredictable, and recurrent seizures, and it affects millions of individuals worldwide. Each seizure is a brief alteration in normal brain function that compromises the patient's health, so detecting epileptic seizures before onset is beneficial. Recent studies have proposed machine learning approaches that automate these diagnostic tasks by integrating statistics and computer science. Machine learning, an application of artificial intelligence (AI), allows a machine to learn from meaningful data automatically and thereby improve its output. Machine learning techniques and computational methods are used to predict epileptic seizures from electroencephalogram (EEG) signals. A vast amount of medical data is available today about the disease, its symptoms, its causes, and its effects, but this data is not analyzed thoroughly enough to predict or study the disorder. The objective of this paper is to provide a detailed account of machine learning predictive models for epileptic seizure detection and to describe several types of predictive models and their applications in healthcare. If seizures can be predicted before they occur, epilepsy patients can improve their safety and quality of life.
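To make the kind of pipeline this survey covers concrete, here is a minimal sketch of EEG-based seizure classification: band-power features extracted with Welch's method feed a Random Forest classifier. The sampling rate, window length, feature choices, and placeholder data are illustrative assumptions, not the surveyed authors' method.

# Sketch of EEG seizure classification (illustrative assumptions only).
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

FS = 256  # assumed sampling rate in Hz

def band_power_features(window):
    """Mean spectral power in standard EEG bands for one 1-D window."""
    freqs, psd = welch(window, fs=FS, nperseg=FS)
    bands = [(0.5, 4), (4, 8), (8, 13), (13, 30), (30, 70)]  # delta..gamma
    return [psd[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in bands]

# Placeholder data: 200 one-second windows, label 1 = seizure, 0 = normal.
rng = np.random.default_rng(0)
windows = rng.standard_normal((200, FS))
labels = rng.integers(0, 2, 200)

X = np.array([band_power_features(w) for w in windows])
clf = RandomForestClassifier(n_estimators=100, random_state=0)
print(cross_val_score(clf, X, labels, cv=5).mean())  # chance-level on random noise

With real annotated EEG in place of the random placeholders, the same feature-then-classify structure underlies many of the predictive models the survey describes.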


Author(s):  
Can Eyupoglu

Epilepsy is a brain disorder that can be defined as a short, temporary occurrence of symptoms caused by abnormally excessive or synchronous neuronal activity in the brain. Almost one percent of the world's population is struggling with epilepsy. Because of the unpredictable and complex nature of the disease, epileptic seizures are mainly detected by medical doctors reading electroencephalogram (EEG) recordings. This process takes much time and depends on the expert's experience. For this reason, automatic seizure detection from EEG recordings is necessary and of great importance for the comfort of both medical doctors and patients. Machine learning techniques from the field of computer science are used to detect epileptic seizures automatically. This chapter deals with the methods, approaches, models, and techniques that are utilized to detect epileptic seizures.
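A recurring preprocessing step behind any such automatic detector is segmenting the continuous EEG recording into fixed-length epochs before feature extraction. A brief sketch follows; the epoch length, overlap, and sampling rate are illustrative assumptions, not values from the chapter.

# Sketch of epoching a continuous EEG signal into overlapping windows
# (parameters are illustrative assumptions).
import numpy as np

def epoch_signal(signal, fs=256, win_sec=2.0, overlap=0.5):
    """Split a 1-D signal into overlapping windows of win_sec seconds."""
    win = int(win_sec * fs)
    step = int(win * (1 - overlap))
    starts = range(0, len(signal) - win + 1, step)
    return np.stack([signal[s:s + win] for s in starts])

recording = np.random.randn(60 * 256)   # one minute of placeholder EEG
epochs = epoch_signal(recording)
print(epochs.shape)                     # (59, 512): windows ready for feature extraction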


2020 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Kumash Kapadia ◽  
Hussein Abdel-Jaber ◽  
Fadi Thabtah ◽  
Wael Hadi

The Indian Premier League (IPL) is one of the more popular cricket tournaments in the world: its financial value increases each season, its viewership has grown markedly, and its betting market expands significantly every year. With cricket being a very dynamic game that changes ball by ball, bettors and bookies are incentivised to bet on match results. This paper investigates machine learning technology for predicting cricket match results from historical IPL match data. Influential features of the dataset were identified using filter-based methods, including Correlation-based Feature Selection, Information Gain (IG), ReliefF, and Wrapper. Machine learning techniques including Naïve Bayes, Random Forest, K-Nearest Neighbour (KNN), and Model Trees (classification via regression) were then adopted to generate predictive models from the distinctive feature sets derived by the filter-based methods. Two feature subsets were formulated, one based on home-team advantage and the other based on the toss decision, and the selected machine learning techniques were applied to both to determine a predictive model. Experimental tests show that tree-based models, particularly Random Forest, performed better in terms of accuracy, precision, and recall than probabilistic and statistical models. However, on the toss feature subset, none of the considered machine learning algorithms produced accurate predictive models.
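A hedged sketch of the workflow the paper describes: a filter-based feature score (information gain, approximated here with scikit-learn's mutual information) selects features, then Random Forest and Naïve Bayes are compared on the retained subset. The feature count, data, and labels are invented placeholders, not the IPL dataset.

# Sketch of filter-based feature selection + classification, loosely
# mirroring the paper's workflow (data and features are placeholders).
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.ensemble import RandomForestClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.random((400, 10))            # 10 hypothetical match features
y = rng.integers(0, 2, 400)          # 1 = home team wins

# Information-gain-style filter: keep the 5 most informative features.
X_sel = SelectKBest(mutual_info_classif, k=5).fit_transform(X, y)

for model in (RandomForestClassifier(random_state=0), GaussianNB()):
    acc = cross_val_score(model, X_sel, y, cv=5).mean()
    print(type(model).__name__, round(acc, 3))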


2021 ◽  
Author(s):  
Asad Mustafa Elmgerbi ◽  
Clemens Peter Ettinger ◽  
Peter Mbah Tekum ◽  
Gerhard Thonhauser ◽  
Andreas Nascimento

Over the past decade, several models have been generated to predict the rate of penetration (ROP) in real time. In general, these models fall into two categories: model-driven (analytical) models and data-driven models based on machine learning techniques, the latter considered cutting-edge in terms of predictive accuracy and minimal human intervention. Nevertheless, most existing machine learning models are used mainly for prediction, not optimization: the ROP ahead of the bit for a certain formation layer can be predicted with such methods, but they are limited when it comes to finding an optimum set of operating parameters that maximizes ROP. In this regard, two data-driven models for ROP prediction have been developed and subsequently merged into an optimizer model. The purpose of the optimization process is to seek the ideal combination of drilling parameters that improves ROP in real time for a given formation. This paper focuses on describing the development of smart data-driven models (built in the MATLAB software environment) for real-time ROP prediction and optimization within a sufficient time span and without disturbing the drilling process, as a drill-off test typically requires. The models used here fall into two groups: two predictive models, an Artificial Neural Network (ANN) and a Random Forest (RF), plus one optimizer, a genetic algorithm (GA). The process started with developing, optimizing, and validating the predictive models, which were subsequently linked to the GA for real-time optimization. Automated optimization algorithms were integrated into the development of the predictive models to improve model efficiency and reduce errors. To validate the functionality of the developed ROP optimization model, two cases were studied. In the first case, historical drilling data from different wells were used, and the results confirmed that of the three known controllable surface drilling parameters, weight on bit (WOB) has the highest impact on ROP, followed by flow rate (FR) and finally rotations per minute (RPM), which has the least impact. In the second case, a laboratory-scale drilling rig, the "CDC miniRig", was utilized to validate the developed model; during validation only the previously named parameters were used. Several meters were drilled through sandstone cubes at different weights on bit, rotations per minute, and flow rates to develop the predictive models; the optimizer was then activated to propose the set of parameters likely to maximize ROP. The proposed parameters were implemented, and the results showed that ROP improved as expected.
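A hedged sketch of the predict-then-optimize loop the abstract describes: a Random Forest stands in for the predictive model (the paper pairs it with an ANN, in MATLAB), and a small hand-rolled genetic algorithm searches the (WOB, RPM, flow rate) space for the combination maximizing predicted ROP. Parameter ranges, units, training data, and GA settings are assumptions for illustration.

# Sketch of the predict-then-optimize loop (illustrative assumptions only):
# an RF model maps (WOB, RPM, flow rate) to ROP, and a tiny genetic
# algorithm searches for the parameter set that maximizes predicted ROP.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)
BOUNDS = np.array([[5, 30], [60, 180], [200, 600]])  # WOB, RPM, FR (assumed ranges)

# Placeholder training data: ROP responding most strongly to WOB.
X = rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(500, 3))
y = 0.6 * X[:, 0] + 0.1 * X[:, 1] + 0.02 * X[:, 2] + rng.normal(0, 1, 500)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

def ga_maximize(fitness, bounds, pop=40, gens=30, mut=0.1):
    """Minimal real-coded GA: tournament selection, blend crossover, mutation."""
    P = rng.uniform(bounds[:, 0], bounds[:, 1], size=(pop, len(bounds)))
    for _ in range(gens):
        f = fitness(P)
        idx = rng.integers(0, pop, (pop, 2))           # random tournament pairs
        parents = P[np.where(f[idx[:, 0]] > f[idx[:, 1]], idx[:, 0], idx[:, 1])]
        children = (parents + parents[rng.permutation(pop)]) / 2   # blend crossover
        children += rng.normal(0, mut, children.shape) * (bounds[:, 1] - bounds[:, 0])
        P = np.clip(children, bounds[:, 0], bounds[:, 1])
    f = fitness(P)
    return P[np.argmax(f)], f.max()

best, rop = ga_maximize(lambda pts: model.predict(pts), BOUNDS)
print("Suggested WOB, RPM, FR:", best.round(1), "predicted ROP:", round(rop, 2))

On this synthetic objective the GA pushes WOB toward its upper bound, matching the abstract's finding that WOB dominates ROP; with a model trained on real drilling data, the same loop would propose field parameters instead.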


PLoS ONE ◽  
2018 ◽  
Vol 13 (10) ◽  
pp. e0203928 ◽  
Author(s):  
Leily Farrokhvar ◽  
Azadeh Ansari ◽  
Behrooz Kamali

IEEE Access ◽  
2019 ◽  
Vol 7 ◽  
pp. 182238-182258 ◽  
Author(s):  
Waqar Hussain ◽  
Muhammad Shahid Iqbal ◽  
Jie Xiang ◽  
Bin Wang ◽  
Yan Niu ◽  
...  
