Analysis of Deep Learning Tools and Applications in e-Healthcare

2018, pp. 68-90
Author(s): Rojalina Priyadarshini, Rabindra K. Barik, Brojo Kishore Mishra
2021, Vol 11 (13), pp. 5880
Author(s): Paloma Tirado-Martin, Raul Sanchez-Reillo

Nowadays, deep learning tools are widely applied in biometrics, and electrocardiogram (ECG) biometrics is no exception. However, algorithm performance relies heavily on a representative dataset for training. ECGs undergo constant temporal variation, which makes it all the more important to collect databases that represent these conditions; nonetheless, restrictions on database publication obstruct further research on this topic. This work was developed with the help of a database that represents potential scenarios in biometric recognition, as data were acquired on different days, during different physical activities, and in different positions. Classification was implemented with a deep learning network, BioECG, avoiding complex and time-consuming signal transformations. An exhaustive tuning was completed, including variations in enrollment length, improving ECG verification under more complex and realistic biometric conditions. Finally, this work studied one-day and two-day enrollments and their effects. Two-day enrollments resulted in large overall improvements even when verification was performed with more unstable signals. The EER improved by 63% when a change of position was included, by up to almost 99% when visits took place on a different day, and by up to 91% when the user experienced an increased heart rate after exercise.
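For readers unfamiliar with the metric, the sketch below is illustrative only and is not taken from the paper: it shows one common way to compute an equal error rate (EER) of the kind reported above from genuine and impostor similarity scores. The score distributions here are synthetic.

```python
# Hypothetical illustration: EER from genuine vs. impostor similarity scores.
import numpy as np

def compute_eer(genuine_scores, impostor_scores):
    """Return the EER, the operating point where FAR equals FRR."""
    thresholds = np.sort(np.concatenate([genuine_scores, impostor_scores]))
    far = np.array([(impostor_scores >= t).mean() for t in thresholds])  # false accept rate
    frr = np.array([(genuine_scores < t).mean() for t in thresholds])    # false reject rate
    idx = np.argmin(np.abs(far - frr))
    return (far[idx] + frr[idx]) / 2.0

# Toy example with synthetic score distributions
rng = np.random.default_rng(0)
genuine = rng.normal(0.8, 0.10, 500)    # same-person comparisons
impostor = rng.normal(0.4, 0.15, 500)   # different-person comparisons
print(f"EER: {compute_eer(genuine, impostor):.3f}")
```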


2018, Vol 68 (1), pp. 161-181
Author(s): Dan Guest, Kyle Cranmer, Daniel Whiteson

Machine learning has played an important role in the analysis of high-energy physics data for decades. The emergence of deep learning in 2012 allowed for machine learning tools that could adeptly handle higher-dimensional and more complex problems than previously feasible. This review is aimed at the reader who is familiar with high-energy physics but not machine learning. The connections between machine learning and high-energy physics data analysis are explored, followed by an introduction to the core concepts of neural networks, examples of key results demonstrating the power of deep learning for the analysis of LHC data, and a discussion of future prospects and concerns.


Geophysics, 2021, pp. 1-45
Author(s): Runhai Feng, Dario Grana, Niels Balling

Segmentation of faults based on seismic images is an important step in reservoir characterization. With recent developments in deep-learning methods and the availability of massive computing power, automatic interpretation of seismic faults has become possible. The likelihood of occurrence of a fault can be quantified using a sigmoid function. Our goal is to quantify the fault-model uncertainty that is generally not captured by deep-learning tools. We propose to use the dropout approach, a regularization technique that prevents overfitting and co-adaptation in hidden units, to approximate Bayesian inference and estimate principled uncertainty over functions. In particular, the variance of the learned model is decomposed into aleatoric and epistemic parts. The proposed method is applied to a real dataset from the Netherlands F3 block with two different dropout ratios in convolutional neural networks. The aleatoric uncertainty is irreducible, since it relates to the stochastic dependency within the input observations. As the number of Monte Carlo realizations increases, the epistemic uncertainty asymptotically converges and the model standard deviation decreases, because the variability of the model parameters is better simulated or explained with a larger sample size. This analysis can quantify the confidence with which fault predictions of lower uncertainty can be used. Additionally, it suggests where more training data are needed to reduce the uncertainty in low-confidence regions.
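The following is a minimal sketch of the Monte Carlo dropout idea the abstract describes, assuming PyTorch and a small placeholder convolutional network rather than the authors' architecture: dropout stays active at test time, repeated stochastic forward passes give the epistemic term (variance across passes), and the Bernoulli variance of the mean sigmoid output stands in for the aleatoric term.

```python
# Sketch only: MC dropout uncertainty for a per-pixel fault probability map.
import torch
import torch.nn as nn

class TinyFaultNet(nn.Module):  # placeholder CNN, not the paper's model
    def __init__(self, p_drop=0.5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.Dropout2d(p_drop),
            nn.Conv2d(8, 1, 3, padding=1), nn.Sigmoid(),
        )
    def forward(self, x):
        return self.net(x)

def mc_dropout_predict(model, x, n_samples=50):
    model.train()  # keep dropout layers stochastic at inference time
    with torch.no_grad():
        preds = torch.stack([model(x) for _ in range(n_samples)])  # (N, B, 1, H, W)
    mean = preds.mean(0)
    epistemic = preds.var(0)         # spread across stochastic passes
    aleatoric = mean * (1.0 - mean)  # Bernoulli variance of the mean probability
    return mean, epistemic, aleatoric

seismic_patch = torch.randn(1, 1, 64, 64)  # dummy seismic image patch
mean, epi, ale = mc_dropout_predict(TinyFaultNet(), seismic_patch)
print(mean.shape, epi.mean().item(), ale.mean().item())
```

Increasing n_samples mimics the effect the abstract reports: the Monte Carlo estimate of the epistemic variance stabilizes as more stochastic realizations are averaged.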


Author(s): Nur Farhana Hordri, Siti Sophiayati Yuhaniz, Siti Mariyam Shamsuddin, Nurulhuda Firdaus Mohd Azmi

Author(s): Shradha Verma, Anuradha Chug, Amit Prakash Singh, Shubham Sharma, Puranjay Rajvanshi

With increasing computational power, areas such as machine learning, image processing, and deep learning have been extensively applied in agriculture. This chapter investigates the applications of these areas and of various prediction models in plant pathology for the accurate classification, identification, and quantification of plant diseases. The authors aim to automate the plant disease identification process. To accomplish this objective, CNNs have been utilized for image classification. Research shows that deep learning architectures significantly outperform other machine learning tools. To this effect, the authors have implemented and trained five CNN models, namely Inception ResNet v2, VGG16, VGG19, ResNet50, and Xception, on the PlantVillage dataset of tomato leaf images. The authors analyzed 18,160 tomato leaf images spread across 10 class labels. After comparing their performance measures, ResNet50 proved to be the most accurate prediction tool. It was employed to create a mobile application that classifies and identifies tomato plant diseases successfully.
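A minimal sketch of a transfer-learning setup of the kind the chapter describes, assuming PyTorch/torchvision and ImageNet-pretrained ResNet50 weights; the chapter's own training configuration and data pipeline are not specified here, so the batch below is a random stand-in for PlantVillage tomato images.

```python
# Sketch only: ResNet50 with a new 10-class head for tomato leaf disease classification.
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 10  # tomato leaf class labels reported in the chapter

model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
for param in model.parameters():           # freeze the pretrained backbone
    param.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)  # new classification head

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch (stand-in for real leaf images)
images = torch.randn(4, 3, 224, 224)
labels = torch.randint(0, NUM_CLASSES, (4,))
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
print(f"loss: {loss.item():.3f}")
```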


Author(s): Pawan Kumar Chaurasia

This chapter conducts a critical review of ML and deep learning tools and techniques in the field of heart disease, covering disease complexity, prediction, and diagnosis. Only selected papers were included in the study to extract useful information, which stimulated new hypotheses for further investigation of heart disease patients.


2020, pp. 1826-1838
Author(s): Rojalina Priyadarshini, Rabindra K. Barik, Chhabi Panigrahi, Harishchandra Dubey, Brojo Kishore Mishra

This article describes how machine learning (ML) algorithms are very useful for analyzing data and extracting meaningful information from it, which can then be used in various other applications. In the last few years, explosive growth has been seen in the dimension and structure of data, and conventional ML algorithms face several difficulties when dealing with such highly voluminous and unstructured big data. Modern ML tools are designed to handle these complexities. Deep learning (DL) is one such modern ML tool, commonly used to find the hidden structure and cohesion within large datasets by training on parallel platforms with intelligent optimization techniques, so that the data can be further analyzed and interpreted for prediction and classification. This article focuses on the DL tools and software that have been used over the past couple of years in various areas, especially in healthcare applications.

