Machine learning and artificial intelligence: applications in healthcare epidemiology

Author(s):  
Alisa J. Hamilton ◽  
Alexandra T. Strauss ◽  
Diego A. Martinez ◽  
Jeremiah S. Hinson ◽  
Scott Levin ◽  
...  

Abstract Artificial intelligence (AI) refers to the performance of tasks by machines ordinarily associated with human intelligence. Machine learning (ML) is a subtype of AI; it refers to the ability of computers to draw conclusions (ie, learn) from data without being directly programmed. ML builds from traditional statistical methods and has drawn significant interest in healthcare epidemiology due to its potential for improving disease prediction and patient care. This review provides an overview of ML in healthcare epidemiology and practical examples of ML tools used to support healthcare decision making at 4 stages of hospital-based care: triage, diagnosis, treatment, and discharge. Examples include model-building efforts to assist emergency department triage, predicting time before septic shock onset, detecting community-acquired pneumonia, and classifying COVID-19 disposition risk level. Increasing availability and quality of electronic health record (EHR) data, as well as growing computing power, provide opportunities for ML to increase patient safety, improve the efficiency of clinical management, and reduce healthcare costs.
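
A minimal sketch of the kind of risk-stratification model the review describes (e.g., emergency department triage). The vital-sign features, synthetic labels, and logistic-regression choice are illustrative assumptions, not the authors' actual models or data.

```python
# Illustrative sketch only: a simple risk classifier trained on synthetic
# vital-sign data, in the spirit of the ED-triage examples in the review.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 2000
# Hypothetical EHR-derived features: heart rate, respiratory rate, temperature, age
X = np.column_stack([
    rng.normal(85, 15, n),    # heart rate (bpm)
    rng.normal(18, 4, n),     # respiratory rate (breaths/min)
    rng.normal(37.0, 0.7, n), # temperature (deg C)
    rng.normal(55, 18, n),    # age (years)
])
# Synthetic "high-acuity" label loosely tied to abnormal vitals (assumption)
logit = 0.04*(X[:, 0]-85) + 0.15*(X[:, 1]-18) + 0.8*(X[:, 2]-37) + 0.02*(X[:, 3]-55)
y = (logit + rng.normal(0, 1, n) > 0.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
```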

2021 ◽  
Author(s):  
S. H. Al Gharbi ◽  
A. A. Al-Majed ◽  
A. Abdulraheem ◽  
S. Patil ◽  
S. M. Elkatatny

Abstract Due to the high demand for energy, oil and gas companies have started to drill wells in remote areas and unconventional environments. This has raised the complexity of drilling operations, which were already challenging and complex. To adapt, drilling companies expanded their use of the real-time operation center (RTOC) concept, in which real-time drilling data are transmitted from remote sites to company headquarters. In the RTOC, groups of subject-matter experts monitor drilling live and provide real-time advice to improve operations. As drilling operations increase, the volume of generated data exceeds human processing capability, limiting the RTOC's impact on certain components of drilling operations. To overcome this limitation, artificial intelligence and machine learning (AI/ML) technologies were introduced to monitor and analyze real-time drilling data, discover hidden patterns, and provide fast decision-support responses. AI/ML technologies are data driven, and the quality of their output depends on the quality of the input data: good input data yield good output, while poor input data yield poor output. Unfortunately, because of the harsh environments of drilling sites and the transmission setups, not all drilling data are of good quality, which negatively affects AI/ML results. The objective of this paper is to use AI/ML technologies to improve the quality of real-time drilling data. A large real-time drilling dataset, consisting of over 150,000 raw data points, was fed into Artificial Neural Network (ANN), Support Vector Machine (SVM), and Decision Tree (DT) models. The models were trained on data points labeled as valid or invalid, and confusion matrices were used to evaluate the different AI/ML models, including different internal architectures. Despite its slower training, the ANN achieved the best result with an accuracy of 78%, compared with 73% and 41% for DT and SVM, respectively. The paper concludes by presenting a process for using AI technology to improve real-time drilling data quality. To the authors' knowledge, based on the literature in the public domain, this paper is one of the first to compare multiple AI/ML techniques for quality improvement of real-time drilling data, and it provides a guide for improving the quality of real-time drilling data.
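
A hedged sketch of the model comparison the abstract describes: ANN, SVM, and decision-tree classifiers flagging drilling data points as valid or invalid, evaluated with confusion matrices. The sensor channels, synthetic labels, and model hyperparameters below are assumptions for illustration, not the paper's dataset or architectures.

```python
# Sketch: compare ANN, SVM, and DT classifiers on a synthetic valid/invalid
# labeling task, scored with accuracy and a confusion matrix.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import confusion_matrix, accuracy_score

rng = np.random.default_rng(1)
n = 5000
# Hypothetical sensor channels: hook load, standpipe pressure, RPM, flow rate
X = rng.normal(size=(n, 4))
# Synthetic "invalid" label: spikes/dropouts produce extreme readings (assumption)
y = (np.abs(X).max(axis=1) > 2.0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
models = {
    "ANN": make_pipeline(StandardScaler(),
                         MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=1)),
    "SVM": make_pipeline(StandardScaler(), SVC()),
    "DT": DecisionTreeClassifier(max_depth=6, random_state=1),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    print(name, "accuracy:", round(accuracy_score(y_te, pred), 3))
    print(confusion_matrix(y_te, pred))
```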


2020 ◽  
Vol 18 (2) ◽  
Author(s):  
Nedeljko Šikanjić ◽  
Zoran Ž. Avramović ◽  
Esad Jakupović

In today's world, devices capable of communicating are emerging and multiplying daily. This advanced technology is generating ideas about how to use these devices to gain financial benefits for enterprises, businesses, and the economy in general. The purpose of the research in this paper is to discover the trends in connecting these devices, known as the Internet of Things (IoT), the financial aspects of implementing IoT solutions, and how leaders in cloud computing and IoT are implementing additional advanced technologies such as machine learning and artificial intelligence to improve processes and increase revenue while bringing automation to end users. The development of the information society not only brings innovation to everyday life but also affects the economy. This effect is reflected across various business platforms, companies, and organizations while increasing the quality of the end product or service being provided.


2021 ◽  
Vol 18 (1) ◽  
pp. 27-35
Author(s):  
Roman B. Kupriyanov ◽  
Dmitry L. Agranat ◽  
Ruslan S. Suleymanov

Problem and goal. This study developed and tested solutions for building individual educational trajectories for students, aimed at improving the educational process by forming a personalized set of recommendations from the optional disciplines. Methodology. Data mining and machine learning methods were used to process both numeric and textual data, and approaches based on collaborative and content filtering were used to generate recommendations for students. Results. The developed system was tested across several elective-course selection periods in which 4,769 first- and second-year students took part. A set of recommendations was automatically generated for each student, and the quality of the recommendations was then evaluated based on the percentage of students who used them. According to the test results, the recommendations were used by 1,976 students, or 41.43% of the total number of participants. Conclusion. The study produced a recommendation system that automatically ranks elective disciplines and forms a personalized set of recommendations for each student based on their interests, supporting the building of individual educational trajectories.
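
A minimal sketch of the content-filtering half of such a recommender: rank elective courses by textual similarity to a student's interest profile. The course titles, descriptions, and interest text are hypothetical, and the collaborative-filtering component the study also uses is not shown.

```python
# Content-based ranking of electives with TF-IDF and cosine similarity.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

electives = {  # hypothetical catalogue of elective descriptions
    "Intro to Data Mining": "data mining clustering classification databases",
    "Modern Poetry": "literature poetry analysis writing",
    "Applied Machine Learning": "machine learning regression neural networks python",
    "Urban Sociology": "society cities social structures research methods",
}
student_profile = "python programming machine learning data analysis"  # hypothetical

vectorizer = TfidfVectorizer()
course_matrix = vectorizer.fit_transform(electives.values())
student_vec = vectorizer.transform([student_profile])

# Higher cosine similarity = stronger recommendation for this student
scores = cosine_similarity(student_vec, course_matrix).ravel()
for title, score in sorted(zip(electives, scores), key=lambda p: p[1], reverse=True):
    print(f"{score:.2f}  {title}")
```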


Author(s):  
Venkat Narayana Rao T. ◽  
Manogna Thumukunta ◽  
Muralidhar Kurni ◽  
Saritha K.

Artificial intelligence and automation are believed by many to mark a new industrial revolution. Machine learning is a branch of artificial intelligence that recognizes patterns in vast amounts of data and extracts useful information. Prediction, as an application of machine learning, has been sought after by industries of all kinds. Predictive models with higher efficiency have proven effective in reducing market risks, predicting natural disasters, indicating health risks, and forecasting stock values. The quality of decision making enabled by these algorithms has left a lasting impression on many businesses and is bound to alter how the world looks at analytics. This chapter includes an introduction to machine learning and to prediction using machine learning, and it sheds light on the approach and its applications.


2021 ◽  
Author(s):  
Andrei Popa ◽  
Ben Amaba ◽  
Jeff Daniels

Abstract A practical framework that outlines the critical steps of a successful process using data, machine learning (ML), and artificial intelligence (AI) is presented in this study, together with a practical case study demonstrating the process. The use of artificial intelligence and machine learning has not only enhanced but also sped up problem-solving approaches in many domains, including the oil and gas industry. Moreover, these technologies are revolutionizing all key aspects of engineering, including framing approaches, techniques, and outcomes. The proposed framework includes key components to ensure the integrity, quality, and accuracy of data, and governance centered on principles such as responsibility, equitability, and reliability. Industry documentation shows that technology coupled with process advances can improve productivity by 20%. A clear work-breakdown structure (WBS) for creating value with an engineering framework has measurable outcomes. AI and ML technologies enable the use of large amounts of information, combining static and dynamic data, observations, historical events, and behaviors. The Job Task Analysis (JTA) model is a proven framework for managing processes, people, and platforms. JTA is a modern, data-focused approach that prioritizes, in order: problem framing, analytics framing, data, methodology, model building, deployment, and lifecycle management. The case study exemplifies how the JTA model optimizes an oilfield production plant, similar to a manufacturing facility. A data-driven approach was employed to analyze and evaluate the impact on production fluids during planned or unplanned facility system disruptions. The workflows include data analytics tools such as ML/AI for pattern recognition and clustering for prompt event mitigation and optimization. The paper demonstrates how an integrated framework leads to significant business value. The study integrates surface and subsurface information to characterize and understand the production impact of planned and unplanned plant events. The findings led to designing a relief system to divert back pressure during plant shutdowns. The study avoided the cost of a new plant, saving millions of dollars, as well as environmental impact, safety risks, and unnecessary operating and maintenance costs. Moreover, tens of millions of dollars of value per year were created by avoiding production losses from plant upsets or shutdowns. The study required no additional spending, only about two months of part-time effort by a team of five engineers and data scientists. The work provided critical steps in creating a "trusted" model and ensuring "explainability". The methodology was implemented using existing available data and tools; it was the process and engineering knowledge that led to the successful outcome. A systematic WBS has become vital in data analytics projects that use AI and ML technologies. An effective governance system creates a 25% productivity improvement and a 70% capital improvement, while poor requirements can consume more than 40% of a development budget. The process, models, and tools should be used on engineering projects where data and physics are present. The proposed framework demonstrates the business impact and value creation generated by integrating models, data, AI, and ML technologies for modeling and optimization.
It reflects the collective knowledge and perspectives of diverse professionals from IBM, Lockheed Martin, and Chevron, who joined forces to document a standard framework for achieving success in data analytics/AI projects.
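
A hedged illustration of one workflow element mentioned above, clustering for pattern recognition: grouping facility sensor readings so that upset or shutdown patterns separate from normal operation. The two-feature representation, value ranges, and event sizes are assumptions, not the study's data.

```python
# Cluster synthetic (throughput, back-pressure) readings and flag the minority
# cluster as candidate upset events for prompt mitigation.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
normal = rng.normal([100.0, 20.0], [5.0, 2.0], size=(400, 2))  # normal operation
upsets = rng.normal([40.0, 45.0], [10.0, 5.0], size=(40, 2))   # upset/shutdown events
X = np.vstack([normal, upsets])

scaled = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=2, n_init=10, random_state=2).fit_predict(scaled)

# The smaller cluster is treated as the anomalous operating regime
minority = np.argmin(np.bincount(labels))
print("Flagged readings:", int((labels == minority).sum()), "of", len(X))
```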


2020 ◽  
pp. 1-12
Author(s):  
Suhua Bu

In the era of the Internet of Things, smart logistics has become an important means of improving the pace and quality of everyday life. At present, problems in logistics engineering keep logistics efficiency from meeting expectations. To address this, this paper proposes a logistics engineering optimization system based on machine learning and artificial intelligence technology. Building on the classifier chain and the combined classifier chain, the paper proposes an improved multi-label chain learning method for high-dimensional data. In addition, the study combines the actual needs of logistics transportation with the constraints of the transportation process, applies multi-objective optimization to logistics engineering, and outputs the optimal solution through an artificial intelligence model. To verify the effectiveness of the model, the performance of the proposed method is evaluated in a designed control experiment. The results show that the proposed logistics engineering optimization based on machine learning and artificial intelligence technology has practical value.
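
A sketch of the baseline classifier-chain idea the paper builds on (not its improved method): each label's classifier also sees the predictions for earlier labels in the chain. The synthetic multi-label data and the logistic-regression base learner are assumptions for illustration.

```python
# Classifier chain on synthetic multi-label data, scored with a sample-wise
# Jaccard index.
from sklearn.datasets import make_multilabel_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.multioutput import ClassifierChain
from sklearn.metrics import jaccard_score

X, Y = make_multilabel_classification(n_samples=1000, n_features=20,
                                       n_classes=5, n_labels=3, random_state=3)
X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.3, random_state=3)

# Each link in the chain predicts one label, conditioned on previous labels
chain = ClassifierChain(LogisticRegression(max_iter=1000), order="random", random_state=3)
chain.fit(X_tr, Y_tr)
pred = chain.predict(X_te)
print("Jaccard score:", round(jaccard_score(Y_te, pred, average="samples"), 3))
```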


2020 ◽  
Vol 5 (17) ◽  
pp. 1-5
Author(s):  
Jitendrea Kumar Saha ◽  
Kailash Patidar ◽  
Rishi Kushwah ◽  
Gaurav Saxena

Software quality estimation is important because it helps eliminate design and code defects. Prediction based on object-oriented quality metrics can help estimate software quality, defects, and the likelihood of errors. In this paper, a survey and case analysis are presented for object-oriented quality prediction, showing the analytical and experimental aspects of previous methodologies. The survey also discusses the different object-oriented parameters that are useful for this problem, as well as its open aspects and limitations as directions for future work. Machine learning and artificial intelligence methods are the main focus of this survey. The parameters considered include inheritance, dynamic behavior, encapsulation, and objects.
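
An illustrative sketch of the kind of prediction surveyed above: a classifier estimating defect-proneness of classes from object-oriented metrics. The metric set, value ranges, and defect labels below are synthetic assumptions, not drawn from any surveyed dataset.

```python
# Predict defect-prone classes from synthetic object-oriented metrics.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n = 1500
# Hypothetical per-class metrics: inheritance depth, coupling, methods, size
X = np.column_stack([
    rng.integers(1, 8, n),      # depth of inheritance tree
    rng.integers(0, 20, n),     # coupling between objects
    rng.integers(1, 40, n),     # number of methods
    rng.integers(20, 2000, n),  # lines of code
])
# Synthetic label: larger, more coupled classes are more defect-prone (assumption)
risk = 0.1*X[:, 0] + 0.08*X[:, 1] + 0.02*X[:, 2] + 0.001*X[:, 3]
y = (risk + rng.normal(0, 0.8, n) > 3.0).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=4)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean().round(3))
```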


This is the era of Machine Learning and Artificial Intelligence. Machine Learning is a field of scientific study that uses statistical models to predict the answers to questions never asked before. Machine Learning algorithms use a large quantity of sample data, which is then used to generate a model. A larger and higher-quality training set leads to higher accuracy in the approximated results. ML is a highly active research field and is also helpful in pattern finding, artificial intelligence, and data analysis. In this paper we explain the basic concepts of Machine Learning and its various types of methods, which can be applied according to the user's requirements. Machine Learning tasks are divided into various categories, and these tasks are accomplished by computer systems without being explicitly programmed.


2021 ◽  
Author(s):  
Yew Kee Wong

In the information era, enormous amounts of data have become available to decision makers. Big data refers to datasets that are not only large, but also high in variety and velocity, which makes them difficult to handle using traditional tools and techniques. Due to the rapid growth of such data, solutions need to be studied and provided in order to handle these datasets and extract value and knowledge from them. Machine learning is a method of data analysis that automates analytical model building. It is a branch of artificial intelligence based on the idea that systems can learn from data, identify patterns, and make decisions with minimal human intervention. Deep learning, the application of advanced neural-network techniques to big data, extends this capability further. This paper analyses some of the different machine learning and deep learning algorithms and methods, as well as the opportunities provided by AI applications in various decision-making domains.


2020 ◽  
pp. 30-37
Author(s):  
Anandakumar Haldorai ◽  
Shrinand Anandakumar

The notion of explainability in Artificial Intelligence (AI) is a prevailing issue which requires attention in the healthcare sector. The issue of explainability is as old as AI itself: early AI systems were understandable and retraceable, but their weakness lay in handling the uncertainties of the real world. With the advent of probabilistic learning, applications have become highly successful but considerably more opaque. Comprehensive AI addresses the traceability and transparency of statistical black-box Machine Learning (ML) techniques, particularly Deep Learning (DL). Based on the approach of this paper, it can be argued that researchers need to go beyond comprehensive AI: to achieve the dimension of explainability in the healthcare sector, causability aspects have to be incorporated. In the same way that usability incorporates measurements for the quality of use, causability incorporates the evaluation of the quality of explanations. In this research, we provide a number of fundamental definitions to effectively discriminate between causability and explainability, including an application case of DL and human comprehensibility in the field of histopathology. The fundamental contribution of this paper is the notion of causability, differentiated from the notion of explainability: causability is a property of the person, whereas explainability is a property of the system.
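
A small, generic explainability example, not the causability framework the paper proposes: permutation importance applied to a black-box classifier to show which inputs drive its predictions. The dataset, model choice, and feature indices are assumptions for illustration.

```python
# Permutation importance: shuffle each feature and measure the drop in score.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.inspection import permutation_importance

X, y = make_classification(n_samples=1000, n_features=6, n_informative=3, random_state=5)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=5)

model = GradientBoostingClassifier(random_state=5).fit(X_tr, y_tr)
result = permutation_importance(model, X_te, y_te, n_repeats=20, random_state=5)
for i, imp in enumerate(result.importances_mean):
    print(f"feature_{i}: importance = {imp:.3f}")
```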

