Evaluation of the COVID-19 Era by Using Machine Learning and Interpretation of Confidential Dataset

Electronics ◽  
2021 ◽  
Vol 10 (23) ◽  
pp. 2910
Author(s):  
Andreas Andreou ◽  
Constandinos X. Mavromoustakis ◽  
George Mastorakis ◽  
Jordi Mongay Batalla ◽  
Evangelos Pallis

Various research approaches to COVID-19 are currently being developed with machine learning (ML) techniques and edge computing, either to identify virus molecules or to estimate the risk of further spread of COVID-19. These efforts draw on datasets that originate either from the WHO, through its website and research portals, or from data generated in real time by healthcare systems. Data analysis, modelling, and prediction are performed through multiple algorithmic techniques. The limited accuracy of the predictions these techniques produce motivates this research study, which modifies an existing machine learning technique to obtain more reliable forecasts. More specifically, the study adapts the Levenberg–Marquardt algorithm, which is commonly used to approximate solutions to nonlinear least squares problems, supports the acquisition of data from IoT devices, and analyses these data via cloud computing to forecast the progress of the outbreak in real-time environments. In this way, we improve the fit of the trend line that interprets the data. We introduce this framework together with a novel encryption process that we propose for the datasets and with the implementation of mortality predictions.
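The paper's own modification of the algorithm and its encryption scheme are not reproduced here. As a point of reference, the following is a minimal sketch of how the unmodified Levenberg–Marquardt algorithm is typically used to fit a trend line to cumulative case counts, via SciPy's curve_fit (which applies Levenberg–Marquardt for unconstrained problems). The logistic trend function and the synthetic data are illustrative assumptions, not the authors' dataset or their modified algorithm.

```python
import numpy as np
from scipy.optimize import curve_fit  # uses Levenberg-Marquardt when method="lm"

def logistic(t, L, k, t0):
    """Logistic trend: L = final size, k = growth rate, t0 = inflection day."""
    return L / (1.0 + np.exp(-k * (t - t0)))

# Synthetic daily cumulative case counts (illustrative only).
rng = np.random.default_rng(0)
days = np.arange(0, 60, dtype=float)
true_curve = logistic(days, L=100_000, k=0.2, t0=30)
observed = true_curve + rng.normal(scale=1_500, size=days.size)

# Nonlinear least squares fit with the Levenberg-Marquardt algorithm.
params, covariance = curve_fit(
    logistic, days, observed,
    p0=[observed.max(), 0.1, days.mean()],  # rough initial guess
    method="lm",
)

L_hat, k_hat, t0_hat = params
forecast_day_70 = logistic(70, L_hat, k_hat, t0_hat)
print(f"Fitted parameters: L={L_hat:.0f}, k={k_hat:.3f}, t0={t0_hat:.1f}")
print(f"Forecast cumulative cases at day 70: {forecast_day_70:.0f}")
```

The fitted parameters then extrapolate the trend line forward, which is the role the enhanced optimization plays in the framework described above.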

2021 ◽  
Vol 40 (4) ◽  
pp. 694-702
Author(s):  
O.E. Aru ◽  
K.C. Adimora ◽  
F.J. Nwankwo

The advent of 5G has greatly improved the speed of data transmission in wireless mobile technology. At the same time, it has raised public concern about ailments attributed to its deployment. Many have pointed to 5G radiation as a principal cause of cancer today, and that concern motivated this article. The study employed a machine learning technique based on an artificial neural network to model 5G wireless technology, and MATLAB Simulink was used to analyze the absorption and penetration of 5G electromagnetic energy into biological tissue, specifically deoxyribonucleic acid (DNA). The results revealed that the energy carried by 5G radiation, which lies in the non-ionizing region of the electromagnetic spectrum, is too small to break the chemical bonds of DNA or to cause cellular changes that would result in either cancer or viral disease.
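The underlying comparison is simple photon-energy arithmetic: the energy of a photon at 5G frequencies is orders of magnitude below typical covalent bond energies in DNA. A minimal sketch of that calculation follows; the 28 GHz carrier frequency and the ~3.6 eV C–C bond energy are illustrative assumptions, not values taken from the paper.

```python
# Photon energy at a representative 5G millimetre-wave frequency vs. a DNA bond energy.
PLANCK_H = 6.626e-34        # Planck constant, J*s
EV_PER_JOULE = 1 / 1.602e-19

freq_5g_hz = 28e9           # assumed 28 GHz mmWave carrier (illustrative)
photon_energy_ev = PLANCK_H * freq_5g_hz * EV_PER_JOULE

cc_bond_energy_ev = 3.6     # approximate C-C covalent bond energy (illustrative)

print(f"5G photon energy   : {photon_energy_ev:.2e} eV")
print(f"C-C bond energy    : {cc_bond_energy_ev:.1f} eV")
print(f"Ratio (bond/photon): {cc_bond_energy_ev / photon_energy_ev:.1e}")
# A single 28 GHz photon carries roughly 1e-4 eV, about four orders of
# magnitude below the energy needed to break a covalent bond in DNA.
```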


2018 ◽  
Vol 42 (2) ◽  
pp. 35-51 ◽  
Author(s):  
Michael Krzyzaniak

This article presents a machine-learning technique for analyzing and producing statistical patterns in rhythm through real-time observation of human musicians. Here, timbre is considered an integral part of rhythm, as exemplified by hand-drum music. The article also considers challenges, such as mechanical timing delays that are negligible in digitally synthesized music, that arise when the algorithm runs on percussion robots. The algorithm's performance is analyzed in a variety of contexts: learning specific rhythms, learning a corpus of rhythms, responding to rhythms that signal musical transitions, improvising in different ways with a human partner, and matching the meter and the "syncopicity" of improvised music.
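The article's own algorithm is not reproduced here; as a generic illustration of learning statistical rhythm patterns from observation, the sketch below fits a first-order Markov model over timbre-labelled drum strokes and samples a response in the learned style. The bass/tone/slap vocabulary and the toy performance are assumptions for illustration only.

```python
from collections import Counter, defaultdict
import random

# Toy first-order Markov model over timbre-labelled drum strokes.
# Strokes: "B" = bass, "T" = tone, "S" = slap (hand-drum vocabulary).
observed_performance = list("BTSTBTSTBSSTBTST")

# Count stroke-to-stroke transitions heard from the human player.
transitions = defaultdict(Counter)
for current, nxt in zip(observed_performance, observed_performance[1:]):
    transitions[current][nxt] += 1

def next_stroke(current: str) -> str:
    """Sample the next stroke from the learned transition distribution."""
    counts = transitions[current]
    strokes, weights = zip(*counts.items())
    return random.choices(strokes, weights=weights)[0]

# Generate an eight-stroke response in the learned style.
stroke = observed_performance[-1]
response = []
for _ in range(8):
    stroke = next_stroke(stroke)
    response.append(stroke)
print("".join(response))
```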


2022 ◽  
pp. 131-142
Author(s):  
Jeya Mala D. ◽  
Pradeep Reynold A.

Edge analytics refers to tools and algorithms deployed on IoT devices or IoT gateways that collect, process, and analyze data locally rather than transmitting it to the cloud for analysis. It is applied in a wide range of applications in which immediate decision making is required. With general cloud-based IoT analytics, data must be collected from the IoT devices and sent to the cloud for further processing and decision making. In life-critical applications such as healthcare, the round-trip time of sending the data to the cloud and receiving the processed results back before a decision can be taken is not acceptable. Hence, in such medical IoT (MIoT) applications, analytics must be performed on the edge to avoid these delays, as illustrated in the sketch below. This chapter therefore provides an abstract view of the application of machine learning in MIoT so that data analytics delivers fruitful results to the stakeholders.
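As one hedged illustration of this edge-versus-cloud split, the sketch below checks a vital-sign reading locally on an edge gateway and only forwards a compact summary to the cloud; the alarm threshold, the raise_local_alarm helper, and the cloud endpoint are hypothetical placeholders, not part of the chapter.

```python
import json
import statistics
from urllib import request

CLOUD_ENDPOINT = "https://example.com/miot/ingest"  # hypothetical endpoint
HEART_RATE_ALARM_BPM = 140                          # illustrative threshold

def raise_local_alarm(reading_bpm: int) -> None:
    """Hypothetical local actuator: alert bedside staff without a cloud round trip."""
    print(f"LOCAL ALARM: heart rate {reading_bpm} bpm exceeds threshold")

def process_on_edge(readings_bpm: list[int]) -> None:
    # Time-critical decision is taken locally, on the gateway itself.
    latest = readings_bpm[-1]
    if latest > HEART_RATE_ALARM_BPM:
        raise_local_alarm(latest)

    # Only a compact summary is forwarded to the cloud for long-term analytics.
    summary = {
        "mean_bpm": statistics.mean(readings_bpm),
        "max_bpm": max(readings_bpm),
        "samples": len(readings_bpm),
    }
    req = request.Request(
        CLOUD_ENDPOINT,
        data=json.dumps(summary).encode(),
        headers={"Content-Type": "application/json"},
    )
    try:
        request.urlopen(req, timeout=5)  # non-critical path: tolerate delays/failures
    except OSError:
        pass  # cloud unreachable; the local decision has already been made

process_on_edge([92, 95, 101, 150])
```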


2021 ◽  
Vol 42 ◽  
pp. 103012
Author(s):  
Van Minh Duong ◽  
Thanh Nhan Tran ◽  
Akhil Garg ◽  
Thinh Gia Phung ◽  
Van Man Tran ◽  
...  

Author(s):  
Ryan Jackson ◽  
Michael Jump ◽  
Peter Green

Physics-based models are widely used in the aerospace industry, one such use being to provide flight dynamics models for flight simulators. For human-in-the-loop use, such simulators must run in real time. Because of the complex physics of rotorcraft flight, meeting this real-time requirement sometimes means simplifying the underlying physics of the model, leading to errors in the predicted response compared with the real vehicle. This study investigated whether a machine-learning technique could be employed to provide rotorcraft dynamic response predictions, with the ultimate aim of this model taking over when the physics-based model's accuracy degrades. In the current work, a machine-learning technique was employed to train a model to predict the dynamic response of a rotorcraft. Machine learning was facilitated using a Gaussian process (GP) nonlinear autoregressive model, which predicted the on-axis pitch rate, roll rate, yaw rate, and heave responses of a Bo105 rotorcraft. A variational sparse GP model was then developed to reduce the computational cost of applying the approach to large data sets. Both GP models were able to provide accurate on-axis response predictions, particularly when the input contained all four control inceptors and one lagged on-axis response term, and the predictions improved on those of a corresponding physics-based model. Reducing the training data to one-third (rotational axes) or one-half (heave axis) resulted in only minor degradation of the GP model predictions.
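A minimal sketch of the general idea of a nonlinear autoregressive GP model, predicting the next on-axis response from the four control inceptors plus one lagged response term, is given below. It uses scikit-learn's GaussianProcessRegressor rather than the variational sparse GP developed in the paper, and the synthetic data stand in for flight-test records, so it only illustrates the input/output structure, not the authors' implementation.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)

# Synthetic stand-in for flight-test data: four control inceptors and a
# pitch-rate response q (illustrative only, not Bo105 data).
n = 500
controls = rng.normal(size=(n, 4))  # e.g. longitudinal/lateral cyclic, collective, pedal
q = np.zeros(n)
for k in range(1, n):
    q[k] = (0.9 * q[k - 1] + 0.3 * controls[k - 1, 0]
            - 0.1 * controls[k - 1, 2] + 0.01 * rng.normal())

# NARX-style design: inputs are the four inceptors plus one lagged on-axis term.
X = np.column_stack([controls[:-1], q[:-1]])
y = q[1:]

kernel = RBF(length_scale=np.ones(5)) + WhiteKernel(noise_level=1e-4)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(X[:400], y[:400])

mean, std = gp.predict(X[400:], return_std=True)
rmse = np.sqrt(np.mean((mean - y[400:]) ** 2))
print(f"Held-out pitch-rate RMSE: {rmse:.4f} (mean predictive std ~ {std.mean():.4f})")
```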


2021 ◽  
Author(s):  
MONALISHA PATTNAIK ◽  
ARYAN PATTNAIK

COVID-19 was declared a public health emergency of international concern by the World Health Organisation (WHO), affecting a total of 201 countries across the globe between December 2019 and January 2021. As of January 25, 2021, the pandemic had caused more than 99 million confirmed cases and more than 2 million deaths worldwide. The crux of this paper is to estimate the global risk of the COVID-19 pandemic, in terms of the case fatality rate (CFR), for seventy deeply affected countries. An optimal regression tree algorithm, a machine learning technique, is applied; out of fifteen input features, it identifies four significant ones: diabetes prevalence, total deaths in thousands, total confirmed cases in thousands, and hospital beds per 1,000 population. This real-time estimation provides deep insights into the early detection of CFR for the countries under study.
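A hedged sketch of a regression-tree CFR model with the four features named above is shown below, using scikit-learn's DecisionTreeRegressor on made-up country records; the feature values, tree depth, and the hypothetical new-country vector are illustrative assumptions, not the paper's data or its tuned optimal tree.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(42)

# Made-up country-level records (illustrative only): the four features the
# paper reports as significant, and an observed case fatality rate (CFR, %).
n_countries = 70
X = np.column_stack([
    rng.uniform(3, 15, n_countries),       # diabetes prevalence (%)
    rng.uniform(1, 500, n_countries),      # total deaths (thousands)
    rng.uniform(10, 30_000, n_countries),  # total confirmed cases (thousands)
    rng.uniform(0.5, 8, n_countries),      # hospital beds per 1,000 population
])
cfr = 100 * X[:, 1] / X[:, 2] + rng.normal(scale=0.2, size=n_countries)

# Shallow tree to keep the fitted rules interpretable (depth is an assumption).
tree = DecisionTreeRegressor(max_depth=3, random_state=0)
tree.fit(X, cfr)

new_country = [[7.0, 120.0, 4_500.0, 2.5]]  # hypothetical feature vector
print(f"Predicted CFR: {tree.predict(new_country)[0]:.2f}%")
print("Feature importances:", np.round(tree.feature_importances_, 3))
```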

