Introduction and Literature Review of the Application of Machine Learning/Deep Learning to Load Forecasting in Power System

2021 ◽  
pp. 119-135
Author(s):  
Arash Moradzadeh ◽  
Amin Mansour-Saatloo ◽  
Morteza Nazari-Heris ◽  
Behnam Mohammadi-Ivatloo ◽  
Somayeh Asadi
2021 ◽  
Vol 23 (Supplement_6) ◽  
pp. vi139-vi139
Author(s):  
Jan Lost ◽  
Tej Verma ◽  
Niklas Tillmanns ◽  
W R Brim ◽  
Harry Subramanian ◽  
...  

Abstract PURPOSE Identifying molecular subtypes in gliomas has prognostic and therapeutic value, but has traditionally required invasive neurosurgical tumor resection or biopsy. Recent advances using artificial intelligence (AI) show promise in using pre-therapy imaging to predict molecular subtype. We performed a systematic review of recent literature on AI methods used to predict molecular subtypes of gliomas. METHODS A literature review conforming to PRISMA guidelines was performed for publications prior to February 2021 using 4 databases: Ovid Embase, Ovid MEDLINE, Cochrane trials (CENTRAL), and Web of Science Core Collection. Keywords included: artificial intelligence, machine learning, deep learning, radiomics, magnetic resonance imaging, glioma, and glioblastoma. Non-machine learning and non-human studies were excluded. Screening was performed using Covidence software. Bias analysis was done using TRIPOD guidelines. RESULTS 11,727 abstracts were retrieved. After applying initial screening exclusion criteria, 1,135 full-text reviews were performed, with 82 papers remaining for data extraction. 57% used retrospective single-center hospital data, 31.6% used TCIA and BRATS, and 11.4% analyzed multicenter hospital data. An average of 146 patients (range 34-462 patients) was included per study. Algorithms predicting IDH status comprised 51.8% of studies, MGMT 18.1%, and 1p19q 6.0%. Machine learning methods were used in 71.4% of studies, deep learning in 27.4%, and 1.2% directly compared both methods. The most common machine learning algorithm was the support vector machine (43.3%), and the most common deep learning algorithm was the convolutional neural network (68.4%). Mean prediction accuracy was 76.6%. CONCLUSION Machine learning is the predominant method for image-based prediction of glioma molecular subtypes. Major limitations include small datasets (60.2% with under 150 patients) and thus limited generalizability of findings. We recommend using larger annotated datasets for AI network training and testing in order to create more robust AI algorithms, which will provide better prediction accuracy on real-world clinical datasets and yield tools that can be translated to clinical practice.
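
For illustration only, the sketch below shows the most commonly reported setup in the reviewed studies: an SVM classifier trained on radiomic features extracted from pre-therapy MRI. The feature matrix, labels, and hyperparameters are placeholders rather than values taken from any reviewed paper.

```python
# Minimal sketch (not from any specific reviewed study): an SVM classifier on
# pre-extracted radiomic features predicting a binary molecular subtype label.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# X: (n_patients, n_radiomic_features) extracted from pre-therapy MRI
# y: binary labels, e.g. IDH-mutant (1) vs IDH wild-type (0)
rng = np.random.default_rng(0)
X = rng.normal(size=(150, 40))      # placeholder for real radiomic features
y = rng.integers(0, 2, size=150)    # placeholder labels

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
print(f"Cross-validated accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```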


Cataract is a degenerative condition whose prevalence is estimated to rise globally. Although various approaches to its diagnosis have been proposed, several problems remain to be solved. This paper aims to identify the current state of recent research on cataract diagnosis by using a framework to conduct the literature review, with the intention of answering the following research questions: RQ1) Which are the existing methods for cataract diagnosis? RQ2) Which features are considered for the diagnosis of cataracts? RQ3) Which classifications are used when diagnosing cataracts? RQ4) Which obstacles arise when diagnosing cataracts? Additionally, a cross-analysis of the results was performed. The results showed that new research is required in: (1) the classification of “congenital cataract” and (2) portable solutions, which are needed to make cataract diagnosis easy and low-cost.


2020 ◽  
Vol 8 (6) ◽  
pp. 3034-3039

Healthcare is currently a very active area of research. One of the diseases whose incidence is rising significantly worldwide is Diabetes Mellitus (DM). This paper presents a literature review on diabetes prediction using machine learning and deep learning techniques. Various ML algorithms have been applied to the Pima Indian Diabetes Dataset (PIDD); among them, an improved k-means combined with logistic regression achieved the highest accuracy. DL algorithms such as CNN and LSTM have been used on diabetic retinopathy images.
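
As a hedged illustration of that reported pipeline (interpreting "improved k-means" as label-consistent outlier removal is an assumption), the sketch below clusters PIDD samples with k-means, discards rows whose cluster disagrees with the diabetes label, and then fits logistic regression. The file name and column layout assume the common public PIDD CSV.

```python
# Hedged sketch of an "improved k-means + logistic regression" pipeline on PIDD.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("pima_diabetes.csv")              # assumed local copy of PIDD
X, y = df.drop(columns=["Outcome"]).values, df["Outcome"].values

X_scaled = StandardScaler().fit_transform(X)
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X_scaled)

# Align cluster ids with class labels (clusters are unordered), keep agreeing rows
if (clusters == y).mean() < 0.5:
    clusters = 1 - clusters
mask = clusters == y
X_clean, y_clean = X_scaled[mask], y[mask]

X_tr, X_te, y_tr, y_te = train_test_split(X_clean, y_clean, test_size=0.2, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print(f"Accuracy on the cleaned split: {clf.score(X_te, y_te):.3f}")
```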


Diagnostics ◽  
2020 ◽  
Vol 10 (8) ◽  
pp. 518 ◽  
Author(s):  
Hafsa Khalid ◽  
Muzammil Hussain ◽  
Mohammed A. Al Ghamdi ◽  
Tayyaba Khalid ◽  
Khadija Khalid ◽  
...  

The purpose of this research was to provide a systematic literature review of knee bone studies based on MRI, CT scan, and X-ray images analyzed with deep learning and machine learning techniques, comparing the different approaches used to diagnose knee bone diseases from these imaging modalities. This study will help researchers who want to work in the knee bone field. A comparative systematic literature review was conducted, covering a total of 32 papers. Six papers applied deep learning to X-rays of the knee bone, five applied deep learning to MRI, and another five applied deep learning to CT scans; the remaining 16 papers covered machine learning techniques for evaluating CT scans, X-rays, and MRIs of the knee bone. This research compares the deep learning methodologies for CT scan, MRI, and X-ray reports on the knee bone, including the accuracy of each technique, which can inform future development. In the future, this research will be extended by comparing X-ray, CT scan, and MRI reports of the knee bone with information retrieval and big data techniques. The results show that deep learning techniques are the most effective for diagnosing knee bone diseases from X-ray, MRI, and CT scan images.
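
As a purely illustrative sketch of the kind of model those deep learning studies use, the following small convolutional network classifies a single grayscale knee image; the input size, number of disease classes, and random input tensor are placeholders, not details taken from any reviewed paper.

```python
# Illustrative only: a small CNN classifier for knee X-ray / MRI / CT slices.
import torch
import torch.nn as nn

class KneeCNN(nn.Module):
    def __init__(self, num_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(), nn.Linear(32 * 56 * 56, 64), nn.ReLU(),
            nn.Linear(64, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# One grayscale 224x224 knee image (random placeholder tensor)
logits = KneeCNN()(torch.randn(1, 1, 224, 224))
print(logits.shape)  # torch.Size([1, 3])
```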


Energies ◽  
2020 ◽  
Vol 13 (2) ◽  
pp. 391 ◽  
Author(s):  
Salah Bouktif ◽  
Ali Fiaz ◽  
Ali Ouni ◽  
Mohamed Adel Serhani

Short-term electric load forecasting plays a crucial role for utility companies, as it allows for the efficient operation and management of power grid networks, optimal balancing between production and demand, as well as reduced production costs. As the volume and variety of energy data provided by building automation systems, smart meters, and other sources are continuously increasing, long short-term memory (LSTM) deep learning models have become an attractive approach for energy load forecasting. These models are characterized by their ability to learn long-term dependencies in collected electric data, which leads to accurate predictions that outperform several alternative statistical and machine learning approaches. Unfortunately, applying LSTM models may not produce acceptable forecasting results, not only because of the noisy electric data but also due to the naive selection of hyperparameter values. Therefore, an optimal configuration of an LSTM model is necessary to describe electric consumption patterns and discover the time-series dynamics in the energy domain. Finding such an optimal configuration is, on the one hand, a combinatorial problem where selection is done from a very large space of choices; on the other hand, it is a learning problem where the hyperparameters should reflect energy consumption domain knowledge, such as the influential time lags, seasonality, periodicity, and other temporal attributes. To handle this problem, in this paper we use metaheuristic-search-based algorithms, known for their ability to alleviate search complexity as well as their capacity to learn from the domain where they are applied, to find optimal or near-optimal values for the set of tunable LSTM hyperparameters in the electrical energy consumption domain. We tailor both a genetic algorithm (GA) and particle swarm optimization (PSO) to learn hyperparameters for load forecasting in the context of big energy consumption data. The statistical analysis of the obtained results shows that the multi-sequence deep learning model tuned by the metaheuristic search algorithms provides more accurate results than the benchmark machine learning models and than an LSTM model whose inputs and hyperparameters were established through limited experience and a small number of experiments.
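
A minimal sketch of the idea follows: a genetic algorithm searches over a few LSTM hyperparameters (influential time lags, hidden units, learning rate) by training a small Keras LSTM on a toy load series and ranking chromosomes by validation error. The search ranges, fitness budget, and synthetic data are simplifying assumptions rather than the authors' configuration, and the PSO variant is omitted.

```python
# Hedged sketch: GA search over LSTM hyperparameters for load forecasting.
import random
import numpy as np
import tensorflow as tf

def make_windows(series, lags):
    X = np.array([series[i:i + lags] for i in range(len(series) - lags)])
    return X[..., None], series[lags:]

def fitness(genes, series):
    """Validation MSE of an LSTM built from one chromosome (lower is better)."""
    lags, units, lr = genes
    X, y = make_windows(series, lags)
    split = int(0.8 * len(X))
    model = tf.keras.Sequential([
        tf.keras.layers.LSTM(units, input_shape=(lags, 1)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer=tf.keras.optimizers.Adam(lr), loss="mse")
    model.fit(X[:split], y[:split], epochs=3, verbose=0)
    return model.evaluate(X[split:], y[split:], verbose=0)

def random_genes():
    return [random.choice([12, 24, 48]),        # influential time lags
            random.choice([16, 32, 64]),        # LSTM units
            random.choice([1e-3, 5e-4, 1e-4])]  # learning rate

series = np.sin(np.linspace(0, 60, 2000)) + 0.1 * np.random.randn(2000)  # toy "load"
population = [random_genes() for _ in range(6)]
for generation in range(3):
    scored = sorted(population, key=lambda g: fitness(g, series))
    parents = scored[:3]                                        # selection
    children = [[random.choice(pair) for pair in zip(*random.sample(parents, 2))]
                for _ in range(2)]                              # uniform crossover
    population = parents + children + [random_genes()]          # mutation / fresh genes

best = sorted(population, key=lambda g: fitness(g, series))[0]
print("Best hyperparameters (lags, units, lr):", best)
```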


Designs ◽  
2021 ◽  
Vol 5 (2) ◽  
pp. 27
Author(s):  
Navid Shirzadi ◽  
Ameer Nizami ◽  
Mohammadali Khazen ◽  
Mazdak Nik-Bakht

Due to the severe impact of climate change on electricity consumption, as well as new trends in smart grids (such as the use of renewable resources and the advent of prosumers and energy commons), medium-term and long-term electricity load forecasting has become a crucial need. Such forecasts are necessary to support plans and decisions related to the capacity evaluation of centralized and decentralized power generation systems, demand response strategies, and operational control. To address this problem, the main objective of this study is to develop and compare precise district-level models for predicting electrical load demand based on machine learning techniques, including support vector machine (SVM) and Random Forest (RF), and deep learning methods such as the non-linear auto-regressive exogenous (NARX) neural network and recurrent neural networks (Long Short-Term Memory, LSTM). A dataset including nine years of historical load demand for Bruce County, Ontario, Canada, fused with climatic information (temperature and wind speed), is used to train the models after the preprocessing and cleaning stages. The results show that, by employing deep learning, the model can predict the load demand more accurately than SVM and RF, with an R-squared of about 0.93-0.96 and a Mean Absolute Percentage Error (MAPE) of about 4-10%. The model can be used not only by municipalities, utility companies, and power distributors in the management and expansion of electricity grids, but also by households to make decisions on the adoption of home- and district-scale renewable energy technologies.
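
The sketch below illustrates this kind of comparison in spirit only: Random Forest and SVR regressors predict hourly load from lagged load plus temperature and wind speed, and are scored with R-squared and MAPE. The synthetic series stands in for the Bruce County dataset, which is not reproduced here, and the LSTM/NARX models are omitted for brevity.

```python
# Illustrative model comparison on a synthetic district-load series.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import SVR
from sklearn.metrics import r2_score, mean_absolute_percentage_error
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
hours = 5000
temp = 10 + 15 * np.sin(np.arange(hours) * 2 * np.pi / 24) + rng.normal(0, 2, hours)
wind = np.abs(rng.normal(5, 2, hours))
load = 200 + 3 * np.maximum(18 - temp, 0) + 0.5 * wind + rng.normal(0, 5, hours)

lag = 24
X = np.column_stack([load[:-lag], temp[lag:], wind[lag:]])  # load 24 h earlier + weather
y = load[lag:]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, shuffle=False, test_size=0.2)

for name, model in [("Random Forest", RandomForestRegressor(n_estimators=200, random_state=0)),
                    ("SVR", make_pipeline(StandardScaler(), SVR(C=10.0)))]:
    pred = model.fit(X_tr, y_tr).predict(X_te)
    print(f"{name}: R2={r2_score(y_te, pred):.3f}, "
          f"MAPE={100 * mean_absolute_percentage_error(y_te, pred):.1f}%")
```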


Information ◽  
2020 ◽  
Vol 11 (7) ◽  
pp. 357
Author(s):  
Dabeeruddin Syed ◽  
Shady S. Refaat ◽  
Othmane Bouhali

Deep learning models have been applied to varied electrical applications in smart grids with a high degree of reliability and accuracy. The development of deep learning models requires historical data collected from several electric utilities for training. Given security and privacy policy restrictions, the lack of historical data for training and testing developed models is considered one of the greatest challenges to machine learning-based techniques. This paper proposes the use of homomorphic encryption, which makes it possible to train deep learning and classical machine learning models while preserving the privacy and security of the data. The proposed methodology is tested on fault identification and localization, and on load forecasting, in smart grids. The results for fault localization show that the classification accuracy of the proposed privacy-preserving deep learning model using homomorphic encryption is 97-98%, which is close to the 98-99% classification accuracy of the model on plain data. Additionally, for the load forecasting application, the results show that the RMSE with the homomorphic encryption model is 0.0352 MWh, while the RMSE without encryption is around 0.0248 MWh.
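
A minimal sketch of the privacy-preserving idea, not the paper's exact pipeline: a pre-trained linear load forecaster is evaluated on features encrypted with the Paillier scheme via the python-paillier (phe) library. Paillier supports ciphertext addition and multiplication by plaintext scalars, which suffices for a linear model; the deep models discussed in the paper require richer homomorphic schemes. The weights and feature values below are hypothetical.

```python
# Hedged sketch: evaluating a linear load forecaster on encrypted features.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# Hypothetical model weights (learned on plain data elsewhere) and one
# feature vector [lagged load in MWh, temperature in C, hour of day].
weights = [0.82, -0.05, 0.01]
bias = 0.15
features = [1.30, 21.5, 14.0]

# The data owner encrypts the features; the model owner computes on ciphertexts
enc_features = [public_key.encrypt(x) for x in features]
enc_forecast = sum(w * f for w, f in zip(weights, enc_features)) + bias

# Only the data owner can decrypt the resulting forecast
print(f"Forecast: {private_key.decrypt(enc_forecast):.4f} MWh")
```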

